[Binary archive content omitted: tar archive containing var/home/core/zuul-output/, var/home/core/zuul-output/logs/, and var/home/core/zuul-output/logs/kubelet.log.gz (a gzip-compressed kubelet log). The compressed payload is binary data and is not recoverable as text.]
\`_".<<`X0֥;'Y1 @ TQI/"P^#ׁ ~ x0}GâS*rpcQBx\b@*hZX :13Ef%4>BB{kOH"[ܳP]e(6<-tM d0 p;^y$k^ P ˹FZ"@Z([qԋB!WZ DF9tE[ B#?ejO^Sh((q˴ Ņ.Iɑn9 bZƬ y\JY-j b!wѱiVMgc{EC] b4rM!UL*"tsD :/uΘeb+m뽸n8LNqL PfڧՄMrdiF_o9{ qzCɐ}(4,⟏j"<_O{Dck0 vg#pVcI~?=֮4{!&۵`@RS3Wm{-O&zƋ5Yˀ46/m+MglZ =@ƻVOD?HG<ǰ!CnxHOu2:@)uFYȨCF2Q:d!uȨCF2Q:d!uȨCF2Q:d!uȨCF2Q:d!uȨCF2Q:d!uȨCF2Q:db:"yHFQ'c)QhEuRr2|Fi>WHF2Q:d!uȨCF2Q:d!uȨCF2Q:d!uȨCF2Q:d!uȨCF2Q:d!uȨCF2Q:d!ΟfQ"g!uXǨ:@koQ:_QԖ2Q:d!uȨCF2Q:d!uȨCF2Q:d!uȨCF2Q:d!uȨCF2Q:d!uȨCF2Q:d!uXN=$NQp~0F]P5u:_Q]CF2Q:d!uȨCF2Q:d!uȨCF2Q:d!uȨCF2Q:d!uȨCF2Q:d!uȨCF2Q:dr:|/:+/i'hj{}_4cO`Gysp|EU-x_PrMWKh,#P++]%J=tѦUBi(]}6H ;:]ॲbvSOGͫ֌Dn%InOp ֝7>͊FM핍_m4z_Mؘۘ6;hr6S>`y|15&2B-ߊ0n;~r֟ÓTџq$z@_ݟeooGUV]h66"Gd _nh+R4~g6zhJUn^Yxxj|{)}=Nk/mۋt?, qN f٪]k6fsu]ߝd2`{&Ͷӏ9cܴ7N7#@,:S9w6LM3?he퟉WxG ?^b|}yl4#Ex֋Ѝe뻎'B~TOW+yZnN[RiI 8e<jv~7\V>?n]F;۽w۽qt {Yqqj_%؝4F;SyKtdvz'M'_nm$7U.ү^ g'q't7_dU^M3mU*h:r~5;k4roeHG EsW`H;Uɐn']v֭pw^nܯOOwSnvvvnƺ[u[>>=[GjQJȞ6p&ェKyõd( EP`ڔ\(M^ зv`BGX`pPt5ZOv탟;{Z(5&yzIti#GF+ҳX!K["Jesy !܊]]wv7 nv0gt~zjAjc{I~6xՎ |yXͭnMSjYgsp}?|hmV =\\"pǕpEHFYW0)P}c@yp[> Ov=t탇nt_:˧>z^/7L;g`4Ybƒ  fiːV|[n=|렎*ْ¥q\JXƥT@ڧL(r@05>%pvl#6PFըDˋFV)ɢp&@+mIE0C#@M2/$v}Bdkɓd濿jRm&bT 'Fa`>pi/op*7c\Op%Y|zd1D s77=7J)Ffv/L9  (IΫ7oN&K^ޮͧK9 vWtzP2>gaேsf NpjcD0CЛ)CӪ^"w^R2Hgo9tUBTs Zz>tBGAt|Vpj ]%D;]%t JAuߗ0S1tp9MVBdKWϐ49 .aMc<# kݰiQfoX v赎^Z} %FG4nAW]Sn]`Ec CwZTRt 0t +CW)tE PIZztEBGa+buUKqS*GOW"3+ SAthN*U]ZA>vP6v,R׃3s˴'IN{~q~%;cB)7j<۔GVGEdQ0A[R.k%#RHD|e\w.tG5C|_o?=_yIAex)=PKC 0!E.A_פhMr8g#D5;qXH]ymCHX,8Y68}m>IMx1X/ϫ%7ih 5d(§Y'Ɇ1\4=ȹ+GNt`En"{8lUй?o<*ɡ+~(:J^<8]F6cw9NME~_&;s}>g%OoI 駳lCO%/ס]Eh@#ꑼ>>@cO|ZY͗>r/u+M|;s2*v;yqxc)§[bSo5ܵY"w*)e!/,̋HjvsP0vY4a#A2?9HysQ b`޿)7R%aS۬ЂZS7@a^0V0Ҿ`tΗK*\QǧFNg9wvo}`e0)07ĥ%su01kBSo-OޥP28NLj9+r4*'uW@dˁ f0&WZ>ݴNf-?ڤi%Vaq+A(}SIhD%Ng 7jwfΊ k\/[d@.g[1Ap] fL0|ͬ@3/ /݄r܍@/0>¨uH|}ߺڳ M* ayo͆`zH QI9qe`+:t+]nf^lqa1].[{}#j 3={:VeP6PcJX.b\W<U^IٽAozG4 L;p9-Zlb)˩E`I#:P*`- WbYzXU1U人equ4^ѯ/hΠXB-kX`.9 р_̩N݆pVDZUd V>j&cAM@(iI!có<6qxv,JV**NCŒ#Ƃ/&Kk9[ÃN()ya6h m|cB$I`!)fϙoo%=&Gҋ0fKڃ|3߻Vj乯LrY]>UYӭkx4(}6Ta!B {¤P [BH0vjSjnmUs?9 
nT:B QD$ɵVPOU4scipҝ3vp=ŵAPI((> 3i82 RjDs$rŐ*"izV ,׫<5uxNPc&*@R3G"$z|!eZ@s!>h 4,rbU:Uϧ"ހ,Йr<04HL GQ3A شVmIe'~;,焧 VDDũaQ4 ͂Ic+Y/0A`"Z⨅8j66hlVioo1)\5l HdcQIG@%"`V_Dly3!6Ux7]C<0k"J⃂PK'5RhX'I2hVQzXߎE;j]CԬ9AoGi`ujI-d#䑧\$7$ڐ還v-ʵ KkAӦ0Y_ZO r3̽ɀuf &8d,{D^y(8j-p_<L|NBt2NOE^G*#RIE$s2Ja@;$"_ rm1(܋0n(қa\zpp"xdiNXR눁~"I)5Gɴ6DÒ=ηOI~ڡ鱆M[b3وLX-c`PAc,#cdHo[EWv Mև II0og mi 9sE}l!|.b"a~Os1wqJCIFƅHQQsr72A}2<(b:Kd"Eg'}h*F"ҽF )(uhl[w4ꆾ&+a [iC(Xe-,IV%8#Ay~Mz43]r\UJZ/YUkŻr \a!b&Gx;>]Gta1(TbP2{QXoP%>7ަWo?c_w`8t\^OUSIg{Y=W}ޫ~t.Nu.ͨ&*fDZ/YC_ {WŁ>QMlUW@+L*A_HYg_L`z+a^f|N! ,̀pyf3x67wtm6[2a"&2N Nї_ Uc޵6nc.pݑ#@qow>.(xDZ=y,s$?N$[X4HdH~!U.H 5NzOIscLznvx7@SS'Ӛ16TB{SjqRI2Ő7񉧩is::dų|;1=8OQdĺZ\+f exZJa Yjvll'g)*O*: gSvۡn }!&Χ]DR7/kPBduDB޸(VxFS&m ?;!5v #*C]W+ O6XI_I0WyMe;,|8{bǫr`)dv''6|fC1f;ŜHQ4tgR-!XWcU}M$GqӷJThX+ bILj!9SY b*O2'=NVOU{Ϗn"l)LRefմV׼7_=X_dSbE!υCUwøh)q\O;Nz*r "8(??)"VTrJRDD+[Qj%E%=;^R^Z5Q;z%n[y ^ɹ̼0SE"!6elahK'ڛ\-_f㡏)622"KѻzW2*O:ÌrbWjmUFzTqE4ݭ[FuYmm`fԠ~xqOGO%(XW$tRdb͚LF,$DZ0niy)ﹳ}wHJ'/Jbxz|e6ܰv `䍡Ww_)}i85ltC_=f_ LK־]_2GP Ӯ6]IYCυF:)żkLfŝP_opeTА[Do }*[}X2vI7Ɔ{V^3(oQ ` QບޭwLKƫZ3^i9Qu KfT2n[lPYtu7ĭ7ɚk3w+*>I75ʽrd ^YzC̫yJ #T/0Ͷy[Hc9y:vp#(%fǻOnyj_nhuqQ/zMq8L~Kn|>}W`E>/*7P &JB8 `d (yǫȫB Xl f њ{trn ,CWW"ZNWճ+UUpmlcUxqZ!CWPjЕj߮dZXR ]!\C+D{,(]`l0tpE0tho;]!JݩS+.b2 B:up&t(JHF0#$BLBWV]"]iKK`ΨyA(L-ٶygjYozIRx5iwrZEw&31!`04`<DU=DiyќG5DW* ]!\R֯#JŋObRuDBWV~5Q* ҕQm8 ſ+D[ "Jm:E­--xO`E0tpE0nj<4gCWb>\睩ذcU=>]ZKL]t;ڷ֪6lI0tp5 0m+DٶUGWOBW+e@t P vBFttutō DWXCW׈P ^v=%JE e4B +i%: B2P;$BΠ:ERQ]`QWWK ]ZKxJYNJ?UQR:֊ӕtbw- g#=vzheCRwCWbכ>!U-zpU=H)-njЕj߮1j+lá+ ]CTGW'HWJJl@t e%] ]cE[}`&i0tp-'WWNJ_©W ]!ZfNWRNزІ]*_%'oW(͙ g^:9("፟F_>I^:8G}jCJ+S }2Os{K?롃+0^6_[&гM8yI.z!&r/sMǽ{7G'% xuK^O6;.G^;3qO~G>>Jkմ8}/{ף|*\E–(գqK-g orTmyal"r>\|=0CRs>N&эz/1.lWxg7\r~ g<*b:wIgRPXq*?ȨvI1W+])W?oT~iV꨷g+ijT ^GnX8S^lC>r<~ˡ=x$,h[,{=Gc)".z?P(r>.)}ՂCz %3,j7͓Io<L@L9@BJJ=Z|rqoUty跭VrzuSwͮ{^a8&ʓmD̅ղKb Zrvge\dCYLmgJc.w/+Zh ]9%㫫XŨTh?`µt}F7+ e_h]Nȥތ .3xN\*MF#H,l8U_/ؖUe=Z.tO5`CwF\pp*DM܊O9Aoޏp$/{8nj08]"X%ؠ]XnI&jBȌKI%:LXg 
h-2a$1IXL+Mxƺف!ۆtAۍ`zcQx{nC{)=ϧɼG77n:Z{P\.psW4ngۺ,µ2-ri1l&jbK7k|0yq|袇[5|(5>e^H\'Ւ] ~5* WBKk|e:2D<YylSzYFxj%TIn??6>ofNmWbLXEzeQ2IE˃ytz(^YNd0fRwHI3SЊ 32٦wy&L qj;5傩2;=*!ƹ؝yũ[?w&ylG\ K7+ [1u(_@&;R]gO$5aD0Y8 9<*Q<} x(۟QJuYiZ,9bh,Br`]wv$tBI ʭ*P T$) $eFHse5pݧG}G&I|OV^TCk6)/oE:uY>]2\eB(.4_T-V BDK+L n  #YbaZLy'In^8&$5&DƦZQB=U8rIƄ;g s=ŭʙAPI(>3i82 RjDs$rŐ*"ir8 ,o@rFŨpyDQJbTD#LPD<RღfENX2t"Ц\=+z)(gaJ(M`cL[˔P*p5c.D@>PMjj`_vqo}8Aъ8՞90FY0ɶ2s0$h)V^fpeDLnP a̹`,"08 c0Z[xNˈlqoWːO)|c,|e%J(L<0k"Jt(륓J)4NZȃNqe[ģ=X8c]^t!ѲtMaufpbޓZYn[-ϡ\$w$RmHYՂTׂH!U Rh^ů-ՇD|*0&uҙ1dX 0w!RO7xyv9c Q'u<@$2XJDQiJ;"D+9BF#O]YXGDb&va(h7N<-pGibϟ+ԐeeE3|^q$Gj! CF,vsb>E(8j-p_>\| fԙgM%HwDta2MIv2JawZIA"A}E30Y[]M$O6a-G2$:%NJ A<जYF_n!q].ΓTָ/$KKy%ow{UG[ '{$B8`)"ǗL J1uf2 vp}<8ͯߕ@0!,PY ˥CRjp .2X9wvóω0RO#ミOZK]\̱}U6_/E'Ӌ˓2d:W=X؊s0!y?ҕ*zO'+n]Q\y_x=?xU# gT`v)Q60歱ޜ0|I"kwJӫvۆod6ˬЇ(QdGu͞vW޵|Ⱥ]k#.$u 9F| l+RިhuK?XݱgXX\9_~߼}cϿ?~o>|_|qDo ?6 |6M%<껅ϩJއٟmFuyt;7^:Q't|'9EЧ?Qm!P"*րRI94W6f߿ ݅Uqn@]^xюI+9?Fz>[rQ= &eX 諻 ?ucyGJQ q4] _vPǤxKQ8D0M=֩VNMԒꔒ ~B NE]PH -2Ę['n0rc uX꙲y  ԭAkYC!T+P{UNO"z3?ȸy:WilX̴Q A3ɬHy<,s 7zC[P0vY4a#!nZ{9z\Tknl7A7M;ȋ<.FVoG_~MndMUB[FAIQcX D *? \gJw|@ >C4 XF 'Epm:+:zl\zR|ˌ/ 2DVt}Nnr1Dx+48_Fd -!Mn!7x,qܼ x1v()_+|1OxtTG'GNeWH0p0o>g3gR3M֘@]<zǏgָ,goux?7w6#Fևh̏ܗ0v,#p+6Se @'LO'n:`0:OA..#Fc.5w[r8c>ul@ oYmA6X+$-sI f V4L8jx1BЮ(+ Ю(+BЮ(+ B{GəG".|O?qG>zKG?Z.Ť[" 90rZ+!229BQrǢsT1(EGjaor)5]*Ok+~y7BHqEe 7.Xuw94ūkhyaR/1!Jd!R&R1тFQ4`pHctP ZMqjG^7QՆ&gf6=/Ywϯ_`4*kbjM̨3hQ9b0XJBԀe`Jsbdŵeσ;\.جFi&bE# ~z:MGT"Fb4. V:ЖCe)q[A|xNhnP5+[;@i҂-fكPhRN D9d06Ae#&ly5UD&4 Ȗ*AN+O{HFqVB0ڱ^}͓[Q8js;Rv5.僬C K2nL&f=ri$N_:d.6rNAhwΆy^-ϔb+etsOMayC1ZѤ56(UoLIvey'vh}񃑞])]|5{WQto._S>Q?jd>1tx6Lí8K z ^a<!V8zUAY:Inms0T:knfCGdw)0EM3xig]f@eU.z0R/C wc+?xCXSu'Wjy$Q8 )P"}2TqYw4eCfʂ? 
'fߵPwi4n}7;fEk~̒5kV ˋ:x0&uOљ1dX 0bTyQm:"vm@D7ֹiZ~$6[q&X4_@r\~־z:+.[~.+n|3UNVY`<{n~p,ה] ~ϊr'gf$02{ƞR@]OMݐnlfU>0: (,x4Zzz|0j애{]|dS + gba!!cxR)h>ٸ bШFįecQy;s3`ǿ򯏧ߝ|7/N)&'}V/@Y~4{ =p=s*5^-}1~~;Tud|~V(33Bŵ"Wm@!#+0r4sUQm Ul͹`~#H~WU~ݹ/Tq?^[HXNH`*ǘ1h4)e?mZkHV;p=Rrߚ׽,HO1nI#%2pȉn z"^MԒ B Lީ{B慨P<=0 Ge8O`"봱3e%+&,:\oB@ZYbOm[FZ㉵2,"7VKGv6l,EUte#eukxՐ);PPieQty2\0s,M.m$k/9v3|%gSԄ'÷ E;g4=gϓy._r ʉ#{Ō%jJ{Sx?f^6LrJYꪍy:xDc9/2=AP,Xv=&Dp0ѪPbEBjNӷj4!IO|"Py x J/݊#Õwfq ҏ|R,&jq7+K;d4-u1t!+W˭ Y n.p 90n̮]b2t>O=Lj .f^Ms3 ̆#~ւ%7́F![cI"!t="-||69]sL(QmGs &T5( -J)~`#=MF{UnR7 \=]-Re }%_ZÇt+;tI^ðWKU\qXt">x LqY̙C:<Ncʠz7ه MHgEh.PHx3)Ij|NKY^}jShM{u6in Wጠ߇)0t?_T*=k@bD*cU[և)[&&?ܲ+_m#+} {!$9{ͺ:rTdj ħᮉz@?&YWC&DLD^ŻͤLjuO+ɢ3?+ٟCt|e㢛~(Lc$s}ҙ5.IZՆ_ tngR[&~1δRՁU6\H `#*weG,T4 7Yߕ[xKsʀߑB7'ב V鍋@J8f^dӹs!~RڞiyRlv'-5}M2s,dd3JwPGODDW%iWd~U鈻]wFW m]UxGd~;u/?a7B]'AR$`-L mlOO`2SN Of;BZ)Qze;\%_U¶ׇ!:=g1-uRwhBVP^jpJ.s$0&OE#==B;ZC~[}&BiԵ]H]WEw'+[w$Eᄉ5 &q^ʲo4yr&ε@Y`o9̢؆eueiߢc2Q1W_{@c u[㩆׍V#)U:!#0G; EF)`,1=qM}(76n"TQ\6j0-V BDK+L n \"#YbaZLy/Inf2pӗ'IG1!0(hS9΍$ÄӌXGe*gEC% #D8hQ\1 A*!lhڲ6-嬐?^:|퇐mD$fHEHX9BNt(!>h jY*)C/mKJQoa2y"3L) airkJfE( VkP͆Oe'=.~{+ OS hl+Y/0#a OVhEqg=2}yxma·"Wobw2ޙ'iY9/:.g7- Ct7fW`C/AmN~sIk~4=*{0MP(L,Zq L{gH%s&@£ڬ̶5k̨-S:JKiS ~<[(yz#5\R":eQ*q8(_O~B+c:I{^$Sp)Pʍ"VFp ӎȢ`;X+\$J!RHD#itwpouv`堦<'ie'| NhzrqnAݢ+}NFh2z; mR?"#:KTѻO0XoA 9W\YlRix`,(per+ҞK$Ť5pp[MW΄o}>(_pK1F: lp65 !,[;)oԗshuOcm2n6X!ivf.@UH$ "YiG.9Jz, 5j%(aH.YJ]5V]׶F5S0i@U^|Q =Flw wa_=\Dۆ}Ið@}Utۜ+n9u<2tny$XLjA%nネ&c={'=h\(wp)B!cF6<0@S!WL#,8ZI}A79n7HZ/Afl%9~/aVb%KDJVUd[`[X#1% ݾk*8=aEqQ<)L0Y'rf+/Aa p}(]q};O;R2J-;?6MLG eZ30{-#DMFSignwSA.%yn8gl֓ʺCR5jxT8!,$S6ՓH6ь=k6s[@0"1:Yr~amc5&Mk&GkQ |lGm7TcL)lE^qkKZ(}[yoæ~omvz5uՂZ=r3YEE#Kc>OP fh/#Et$LEt!"'U:`!@oe_E.qVq1 A0 ('ւc}9@Aӆ2 10BB23uS/INwmI 2RɞM|AE~Z)RPr~P$!iqHQOuOUDXҠ8KfkCuI.%%lB1MBm(vWU6*=v.σNsFJeuY$LVi S>)\SƘcxto#,jy3RSr1}0`;WCǫv=y(bN<'xb^4v\Xcf+ZLY~}-d[WgYm[0{cI*E>g̅.1P:q׿9#qbUmŨsoRAI%`32;o~Y"I"-8}fxףűj4ԫ{Q!2X@F)J0D+OGh)΃-7QYKlBSc*5(K)µDX/5j~br6<;:֞O]{^}0W-%Jz.lrm'7;GZg6fdz2鸁w:dm9*HFDK<89.X .0A4pf2- h$:|ME.(:oZ?,^HaRYQ 
qXD' Lx}< 1 Bb;/Ѱ?,aO-Q#HsĒZG 'BRyL jOjX'붦}mMxga/ZZ! ±3eǒYF,,vԮ 5oĄeN牘5gu6S3>NBZ3{B$9{.Ocgɉ< 1q⤲}J7%q( y0ECK}F@_"yy3Ň{ƌC ?zRq}ۭ%hPlG %eQ,=%+-r#g eɫ*I |PiG|@I`JENni-߫E.7%=M^e𢡊ʿ~]Y5ɕU_ưE_Ϫܛ<;+=ps{7fw`,MIM)䓝}-^==Au cxMV0#)6g|6>)yDP]󨜾uYϫYv]{?:2볛mg-B wE{Ena?- /apQB6ƒEjCvrxaM39 C9DŽRn&x92JSȢ`;X+\$JO.qy[ h]fqVn^|ڊ*^p5//IyraLjCgz ~:lË ;ȧd[qIBoDϡ3HQ>'E/ A%b?n7 DZ|i҉i+Li+wCaO2=s~3ģ;++e i>T3.$f8tLbF ]r-/4o湍R`Y059VC|w>N}v)ЎaY'7 L;p`h!bLYN@`,"LցN[o'?W5vlBݝ&On&ʧ]ͷdp, r3-}D9 >Ѯr<(Z^CI* Oacº4Ή Z,9bh,B9u^r]wv $tBI  ( e5BIAI0ˌ4ecgilx+C2pT+5`rIM"h uVJWe+ ZX&V@n `Ƒ`,10-$7T3JN;9;PcBIal2%Sc* ǹoL8 Fsn.U93(* G |8hQ\1 A*!lhڲFΆrvx+v+.Jb9"!cc9-՞@sB45,rbʜNE~b)(gaJ(M`cL[˔P*p5c.D@>PMZZn?$O'6Z3F(4 &*f6 ]HHЪS(:j'IŨf07H)aH\0IuSK`BI@k'myDTk0`\zJkE+|e#Q6`kSfTDI|PzX'-A'I25[щG{1c^t!Ѳt)L[|rs)D XX{US ϳw ]go$8Y %rp_ yrJTZMj <~0%dJ?>aEq!0闕TV?,W7R}r6E]JmE3`\p ܙԿe+%a{ZIWԫଝR?}<eXF.~̯??j׹O7MqV}/guZ={ؓlC7ŊWV6R`tcj?l'A/ʡqd Aĸ@cN kTnh`1bMp]H|ޠnDt>/f(YftWڴ +gxzJR$,|^70e+WpK@jmxy5!zMH x$ )@b;cƌL"^ˈiDk45["(~j(ѩ$N5T_=b?[)h̐$5jxT~8!,$S{+%R >nƎ}>i jtfeF"\]GJaQ3\?w7ɚP2zEx#d+8>jK ^Ȍ ^,93Y{.C']Jה4M:Qh.N]yr?{ŸUq1E# |4 j-$* %^F@HHLBDN6tAn}FY1 )fe  D9{mHMᑱ>r iCX:dɘV!!: $H[XA`Ir,a2:ֆ58-;I$]JQKb!REmq.σA[H x wɩkhcOT(O0tS$3c̱|< ;kfL(JU8˝R 3 (xC+^ _Bxj0;hTkmS>ځ.#kڰvЭB<."E}0 X\'@| ٿK~m e^LN>pݽ/aRhqE xJO(֪9'1A-R!ޕq$2] #}Y'll5}X XCS3E*$D^oġ(qD0%4kzZG##HpiW[F55ٿ|,Xqe|bbvӬSfRY٠e"zsJ:9<%Ǟ{>;Gn8RD[XJIi;w`*Ԙ["q,<ۘ;h+v{KTh[J'FtrrAc\h-LYʶf1gŒSxXT.wqj2A0ɵ r B($]b ,cH%Ϙ >HbaJ;jK^3Qvmwg1+-" `㔁@f/'yn yqGgVb>^|!zEwM:.u) y.+&ˠr8΀hJhx jsU_xHpl+ *dvWB/Dh6WHCsKFDomfAU53҆]*mI!D%T8 y Қ)KI S"IgZ,-,xGI4,UVH\!q9Epny*_m]+܅4uaPLGl8p۔qڤ8p$ sf),X"UQufk]^rj֖@W ;:2V˩8p"rPfN'{01d Y_tDrsk ]xV pUDe6RrRyGE0|KȻ`إ$r/*EVJRV`$كiy90bQY+냉KM-q(H8 qZl=?>n?ryUUU|nn&e9зK,w9mK1VYEik6ʌ ;6`#*$;QvjJs0bg{ˀ_{ j8^4CgZ3)9~/a5q-K &uD9,kcBv&zGv0.2[tQM26]Wf ˹_сU7Z 4L?F{EaqnaNraly s{R5(§Y;z~' kxC2)B)7j<[i'( -J))$K,3Z7Fj[^5[}їW}DԀja"Lh/!-/6Xy`yD>giߪsNzΐ : -E|AOWW{ Zz=]G&;[Y-yFVc>lb $'\J9<%֚`u Cτ?9Hg]]bY*/rp~{ {]bn„@;sm#i˙ 8.q3&ReAS̟lgIO"%_4E}O5h{s6.SR< s8Yt0m(C 
+q\<>'2j?mhVoXBbvy֫NN 'N:4AN $gr+d1dRȴt؊63FofU)Bjqg4o)O 4{J eur{ (뼠45O*ۄ;hOıuOVͷp, r3-}D9 >Q9<+P?ϓh*JRpaº4Ή `\-1|4|!:/9Xs]I+PR2pDBJAY&DyaP2#$ ;2jVҡ6/+$B{`N*x繯b7#Oշ"mk>z]+ꎒQvT UhzmE0)B7pK+4cgi!0&kkwrv"Ƅ("ZhS9΍$`ipҝ3vV̠h$a`dpD8hQ\1 A*!lhڲFΆrV՗W fuFM;MD$fHEHX9BNKj10MA:X"eDM}-_) ?LA9c SDiCcDZRs JD`l:r>o8Aъ8՞90FY0ɶ2s0$h)V^f5#""&E7m:aH\0IuS'0Z[ [ ""@()ezX[:e/L<0k"J2!IxL'-A'I2hVtўxLXw"qSC,k ތauRmļ4ɋ&ӽ ])7NHrAVL찛*i<=y B$a|p U}O_8*>ώ=\ykob\)\ Gw,{@tي\û﷣祶.d0.(eiJ8E0 LS$T&GVp( @!` Vmᩚ.{4-M{9i f@h\Rge՝ޠ~Q캸 ߙ܌R7ٛ2fd uɠŚ?9΀;0|SRG[P29cW/;nPR:[9ޤstn &98]:ۡߡ[Ը_0UyQ!2XJDQiJ̻< !Zڮ/\.%6!`B`&aҁǤZ{.Am_-r64W ' eGsQg*HFDK<<\&O5ymix( u8P5HwDaRɜRaf(ƝVc   $@2A X00E/baMG㑁 h9NbI#NJ I"ci:;?>3 %-r9eצP%,̞Udru}2)E) cI#}Ճ1R_oV\ *OUyvkg䲺f Fse~ bwqY,ճ  ~+4!z$n#]5 CaqyVb`X(4ɒzdz1{WɺqT֏Z?dӨ s gJyZBoÑK]ߌA{,c Z_t*Zөޠ%7oCw?| ؠ: ᢉZRBAo()Ǭsv5n:{spT6 F.r,NK=SV2ObZa1p}B@ZYګ+𔥨\zi=E eoLRKz:T=gӲrlav9TX̵Q*g er+RI?$Ť9?@W~1at2aLͯ0d`S~yz*ףzY)y<.=_IHz8 P5*Cz"H@6eZ6".P~(]4)󌵑)W: %S[b&DZǡF35sf vVk/;^s%@c} llWuXǿėF 4zvi*z0T1m~;9S.W9O!ޔmVxŭ ѵP?)FI4w37QmfгwUަJ^r^}E}MEɨ*]=Š5>-8Y>!IZg[B~ga/f^=SQ6^r~)GW|]?ЮQXkU, @!A:mi߻πhI5@p8Zp 41ݧܭ#r|$;|b+U}@gօQ1ljc^KQ*K5zǽgU#Oz\$9\V!+u ͒N"[}Z&JO&WlrCq1FeQ陖6D%dx|ޅLjZ*&3JeqB({-9&zkhyfO sYtgih,>ì}/r7(g2RN~~5l%y jVO: e׽<㌔V2ˢ=r9'RHRҽ\-|HZ=5 cIt$A"Rϫ,*,8m$woe]WelFg"XFg":ׇ@DLT2 kt2$:4`8œB#! (Ҩ0Jc8EiyRFP 1P|;9Ћkۋa񇘮1c-3}Ԏb<#??(HD;l0 {܏o BP]p5ٺ3?8^w$d.sP h~E>/ eyWp(`;pe]׆[ǰ_.?:,rI"Nf7cXNJ.ma8s0lp?1^.?Lǰu~Cϟy0DN1 6laN&a94.w$6)A\]Kvf?k}#خdl{aꊿF<3Urw`zn\D_mJ1T*-үI?>/xP]w ]o=V}X*ހ]ID8 aa,AȻ`pށL~s|> A ΨP`DS+OB9q@In!)N;}S!`;p6DĈ2N8q$F 9l1H"x"Sk7~AL%LbU8pXDZXI a R8?L<_T/Ktrf_b\ .J"Ȭ8Y3$L9l aaJB.]Z!9"="EꘁV էiE?H7Yx:1,?NDĘhr 4(JusZqP8 0F=K**x}7qY]u4aь (ʔS*^O cflx Y8_o? 1.il易 X *3vOW<|D805 XM 1 f\!BFm\|6: I)쓸uێ(>UMu"KO%Zۻ;F>K#7W^{,$T L `(b‚g'R#qmUucrG@Tun)¢N.5~qeiQjqoR훸z97y=DM|r7? 
P#0ACT]?n2iOס?nԔ ԥF!.| hm`rq+̧y]iΨ{rSSmEViUL[))FI4wᛴߛ4r ]Ͼ+ߧ,_kߤLqm+t~T*>5`_I[ }Zw7osc'x]:,{6RZLz{3j)6R+qnb; jiHT eC>\M2 jL&)tF/],+z[5^I,mI?v~uy@_i%)=+h: Fm6-#0i ÔSݦB5A:itJӌx(txR2%)zhAEǸ>C 78 Ō93˭ĉuXy ȳSzkLLry"8y3jg>MWL#Tk%i j'?T@;7{NN7ZaߪiO2-VaGkFb\Z]سVqOE^ǬȫyMmXaA>s@YEMonEì̫Y*cfknG5UY`qMGeeaAZx>_ *{z~ҧgrY?2itFΓχe&c:i!x0k=2b@iÜhnD<#7C= Nteɯ'yo3hL?&X LƫV:x;!*:ΐ$^5jxT~8!,$S63H6x4k6{,ʹh;8J -Fnw@O)֚Η YlM(֢JH1 N9ťoW12#XPvǚV؊})`]GZ׊n]ʜ]6T}XU2?v8w-D̯df2>LbF|4 j-$* %^F@HHLBDN6|V@8a,,r!c0&S:$ rZ0Bx`#6Hcڰ')lL+ jbKFb2$S, A9p0I̥ m؎`i ֥:0U *MFUHydF:ͻat/6tWi!\T־+9ۜ3ſY~(!@nA ]ne37Ojs M#Ϙ.cLq4G^0czw]l0͋/(eNtE}NȫhSEJUGGH2(g9sp.a~G|G>%k=׾ )۵l)Cݧ+µrl(Mٰl4Dha"pO ~!ZZa%X`GPôr[lq?g]JG1!06֊ꩊ1xN_ٻ6WH}`ݼ6vBcrHy ")IICIǀ)i]SSUipҝ3vN̠h$a`dpY-pD)5}H!UR aEFӎ3rvR=1+u|QC."@Ren>+#GiqyRღfENX2"Х\>z^1Or)e4!1 "Rn-SBQԌh@SN:lzrSbIߞnsSF+"T{B(f$ʨb HB@ēU8:QOj738ݬ9ގwZ1iu#jJF08̹`,"08 jjmeizinEhMBR}0{LE*|ck"Qv)Y*$>(SRNc*8i!:I ƍ)ٰ؊^<9k+(U:D!EY 3+$/y(8j-Ұm= p!wY (&x^,R"o&bTD2'T q_$"ZD d@#ItKj1uRJMpQ2->aIÞE՞MC][b3وLX-cp AYd12K u+HwY.-Mw\KfY ~lߣ.dk+UyC_-I>?]50_Ygvam~6-935ChH!:G:od0 ЇѸQ0i%WDcMUmyȶQ[#.N2FBK] ^1K g* -J6t*'Qu6U378vï?:͏oJ~8_ߞӧLɛ9:ꇵ@I^0tn o޷ }JzͿ5psu_UJ?gg0? l]5[gٙ)QqU$$tg Pw^:hՄ,4B]H廨`q[`ߝR(@wm*r뽹>2x| L3 &el 6`(nNIK@2Skilp,=:[cR;ĬW :Xn (˖4z҂U;wїp$-Qlf[KS߆Iݯ՜TZC[vc@<' EnaV,p:@@^rlQO+L#' kxC2sL(QmGs+:u, Ƹ#x,N]$JOF{UF{ݕWٙ ]4aTgf%~f /PR fatznxy/QHeխ1Rmݼp҆?pٞ+u'[!n)zcNa OҳT%;I_O|p "]Qp)>b\zǓO10ᗏS{1Qps/6~Ta b]R&(`I(Ұm9}xUa/kJxY( ½Wt/[5iy٘Jw@='=^eK_9:U8F͋jfmt6n|do>˪FVB-*{Gpcv5M^LƗpX?=^wQq\0L!g~T JӁd[qIr63$[? 
E}Ux #O~3{c5m[?7WD_RMѻk1FBJ9_ß%ۗS?@w 9:PПd.D=O݁`ԜXNi^F3V-fSv%Y>?iR߸Tw2=z׃&'I G Q(\>73iP46S2Hrk-O^Me׮g-0 ^nmN8 yS7S/%%DM*h%Ѱ IUxOovKݭ-h$ ݼ`C.[gtC㓁E/OG.KKϺ:~d⁅bcX&J^^6L-,$S[Ԯan'Y.t\Y} ^=%-kur[Gn_mb}Bg&{ (jsz{lTlsa6ngh65KZ&n籽[:6~7{L+E&(Hqh6 Ԙz[NPiX'"@'=+gqpzTg~aIk߁\>~Z9tl{}`7Qt}>Gn|alPv,ouM DH9XG\ON ^<.0p6 I7RPsJ^?@qu25wf°e&#QHYu2LwZ `ZFL&Z i,t4Xax矏#TX%!LIũ3E>HneaLCRTx ]QQ@%vYI(J1,,VJ\;#gI(Ifr'TRw'wJ }麭gjK ^Ȍ ^`AMA\^n?:RRǖźf lRqUhAn᠘:JUhd@aD%$@tY\j/#`{t$LEt!"0]}FY1 3 r[RY0(rɃ\S&a7zeɘV!!: $H[XA`Ir,a#ֱ.de^zIۧA(ĤVqUhWm^;c!?Bh3 Ƙctr1FGͱ6sFϵ,wJÃ&>4y\y~x:x`_gk!>K+!bJ{ijIРz%]he,:u~꿈 <`s-O!2<7+wHLLryå <3@S(7+j$AVqlI2`߫ϣ|Bf/*@Ez$FdpWu&Ep:EErJ+)VCu{;r7l)X <2F*8(DȂnPH#gk! -; j G&VN>o&pP߷{}˲ϴV-]Vx۰&r#iU.D]~a&a4ݞ* b`A5YRjaF[vM{]Ykz^h&sR|]$u[:&G )-f /ُRINfat b:Ki6\oȖd_F]&ܬ,Mz|dY=gT8pr]%bKa][CFܐ5Jܵ$I%1 ijfؔxzM^iB<#"g>%H] R$#Ij$@4)`T(C03'E(OB9q@In!)N;i$*?29;FLq9_?{WƑx&@F2?7 ` X!)H|}W=Pš(r(8dÙ:~U]s&AׯTl9kuCi'c6L>&]=Tf'A/&b(J8&g)eo艧.YgG+_]ú<>4ٯՎӊ?E{ d8)| >GL"%ܕK qGgUmF4[cEUifu ,Ţj \r7 4}&ƥQ;h%,x=LkYJ𧓞80j ,š6 2΅H©P `5Zks 2%y,3ZFj\;Q}]ؚj1rѵ7{[t1 jP !\j#G)J N'Z [Ea.s\8U>JB BiFc2QV H܃#\O&iwsoɹ5tyKC?܅ T\p/n>C&_py3a]-!+ZP+Mh4K]<+˥J9`.y֝6 V',7N&mқZ6N6 6l ;oNdKֶ ezvS;$-4v_`0nxH(qE{ugcqynyǞfI2ALl@I=Ua6cX𘰙Θ3VS6 om8*>Ńy]`3m}m[-NH7Mx7b=ԡ4W_N7D?, sr $:9_d4> Vi3 C.ݒ=M EP x-vZhTr )Bi"% (>Csx8ó-/,p4\:ӂ(SJu !%кw:i,F`"0PR?QNFX6 MRNr[5G.*ϸ1D.֚3Nփ a`HoIޢۍ2N[3Kq|BDpRp=KBIǨ.rb oYZ#gC9g fufFu KZ+ )X T!(L)RJf#^:#S*v"Ц|~.?S'| IRʣL\8ZjI'K8$DAVkPOn2p%h&Ti5O@ٶ>FFwqPYf!xэQidG@ ȄlNڋ mףX :O)|ȶ'Z:e/MV"q QjԹ oSY IJFE#3N<9+ۡHGktȖC 8Vo'Cyi3HD \չ 9A Z+r5Z_L 9!"C?V\ (U ŭ;]L/>>u~swl11K7HteYم@2.`8x"4^ +]6]5_TeYaap*fK> =f7YwsB7^Wo * g[yZB?#O}=F핊>'N{}bxREJ?' 
wupݏ???=/|3ٻ;[:Dᤉfv&_;d\w5\3U+/z,~v~}n:+.7]^FY B {ӆ>qQm\ւ* 5l,jY`j-7\Ew2T*D#4ŅͲwZ>m,|˘-rR$ BuҬ Y6?kj+ @£C}c~dWGv@YXpLH/).ɹrRdQt/}Ҝ9u5VO'i< KOJ7)3ܭ NJC#ۄje҉[gNVѧ>-Am إCLYmUټOЮ2ƗBOR Q\!HZD.j[:{d8eQ}?^(ʄg0mR6fd ?VgmB_sSճ2ܛͭ?ɝ양tO(Rg-Ͻ=!PKGFR  F8X݃'(4A[֜rʞϓ]C8pYxE<./v\X2.ԦG?}6 > _ !Ӿ*ڰh/ ~AI.4)@9GI+gnG"3_wUyf/I^I8myjf=XIGɌ—0Fw3λVto532}DC-` @rPLQ>tC-^P Pc;nun6MdUBMtE,[`8>k!~]N*kVj Ly>W;8% s|3س L @J'ƃ2d*SjXs42T)ٿ4=iDg.'B[-FEcZxS = "H Mo&X\7[C??ssAY /os*=nevnPcMQ*8%wkW9N$ ; GZqNy8rU}TQ AߖsFpy+ Ƿl:m`5 5'7^I]Xdpzu[Jէ8tb|-9 ŅUtpf^Eo/979 ή%Ԙ%ڊ34!0:RnSq뉦:\̇ķl1 1s{oӳr{3 "dOYNߟ|j0x'Y}'Mmtt|o*Q # ̳-pпMtusyoi+#:{Ȧ6<$R-(*ƁKd G!_B핊>GNbxQM͚*L.LNwq~x>|~??PfNߢ&ݚ_"@-szSíSow|߿zUȵgKgřPqUe$u qy|ڶWaz 1mMlB |S6྆ }08 1 Ų#~ɷ~2*)dA/lapF߭z4-'Ā}$pСKc~䪳Ë"Ou;uB;Y\ؘT A%tSo~0eJC?ptA~uYގ\Mp~rrmӅrN0F)S#e.Jjuy.ܻf޼5fb<-@%Պ -LտCo4X <ߓIO73_ M_.bnmPwe-&;=K7`fg4~g|9ToPbR}r]ԑ~̛ȍ5:KsR/Fë^N-^9S{O{zH)ߣemrrs_ gUi]Le>z(BT@(S|pُkS7̋0QmufYs-7zD2&ĕ4cM|,smuycP7@ :8:!h8Qy "$jt7sܢ[ z2O8]~M?]cmR<ygިzb2J=Bog ~wmhzw?ʴg6={<1~lGv;b_ 5j+ m5o[ 8 ;.ƙa?g- q3#"γ$U ^vrpaMOr@KInD:ik*@=ƘgZi̖@fўo1x+]MB&אup"Ӹ|rU!¸rK !"1R㼭iƽ>?o  n_ :L EiAu\WNb吳?V:t!>kdUzh)YFo18pr}qm~m~dS#iyV\[1tE(! 
QCr@$k Iӑ( Z)m3sJjޛfT5㩪IO |AN˘FDl= \aR_'APJy:Nm,q!Ri1,=Wz;IМLoer'V*+ȃ|6i[>aHnqjqqzh_z)w/`jا%:ϐ6$sW!Dxto٨%_&^R2d^F[ >{ `l$gbkrLixV#<O\}O^iNhߝ@h TRbK/E$y v^ j 5ۤ[M(+x'?2}bi[aXDbUH'J'0MPU.ny1iKtVu)(u`JAdyuyha`Vm!&tïy\`bR dB>M-jIǫzϗ^>Zbҙmu `pJ&fZ>罥BehuKY*0 oյvII{U\'B''.$mʎEFǂʝ!tN&\Wy&9n_ !HY qa,AF"U`;@.F4)`T(C03'E(OB9q@In!)N;5Ξ0b@rn,DDꥦ0+Cՙ쳑U5c%dP^5]Mv{fw -gt07ib|q9IuJg8j"ZbD'8#SAcI8=<uP`7ӘGBkMetHP 5 T!Y]W h["^|u77)~Ew8Q8Ån38v¤ۂ$UfB9 p+Hw49vcwAxd$Tgqp%F Gtۈ$8~35RNYa <39q~٪τ_k3 u9I+G2LuŵÕ*O b`uŴa9Ngm6;ݼUв=ݻݴz7yC -7Cknsy&x֝ov-VNoaqk6onTa)1o󔮵<ֹrs:y,dLW ԣbD[[W0Ap]#/1pWDo1?fsOuEX=iiEݏZv\rUTxt($a82x#s0?P0;fV؝pℳҩGϴA-L{/ 8T~ă҂i' TZ C0T V"b0Ȥi(0ars},i^i@Yδ}pA?Zgf:QlCݝO'RFvOVgrxx**`ZȃXFEOC8<{M8<> +Ed!pNdj)棱 0yFy$tBI *E+T4hiB$I%,3BRR'OVRPMA-7Oz^vܧY OS hl+Y/0#a OVȢ84+OӬގw1iw#jJF08̹`,"08 TFjmU>j:i˷#݌Cꬦ zN" ,|M$.^6%y`8KEeIA 򠓤`hzZ#xQG!J,[)auf~UٕW.R$t cAb2zHJٯC>{Sօ1X ҍUO[}./6 ^o"NyO<e<A@ҔR%w)%Z2yyrމ&L$,7Rk 0A,7xEΎf|ikYΞ~A9Tző3l qs]i]`YkKö$0ܔzBmbg")f"IE$s2JawZIA"A}H&V < *C㞇aV`Iۓh08.a}vp"xd iNXR눁DRꔋ%J'5,ipڱSy7tiz*Ӵ qKw&  ՘@~ $(=2by>Ffdw5I8qs/gU9sI< iUՃ?go?l!|.H{"a}*tIϱޤ]FIk_ަ]77]'U+~Z%c֑G[q'{w`)"'\ J1u',ÕܝPW'aY(8!zr fbF黳*,QNOyd|=Ͻe]Jԫ+EoߞNO?*tu7s]|vi6+58q"Ղ͡T7L~-ͨnݛ6ݚZm>}z"8[C=#|~1]Y+Q6&) 3- 1֏$ҭ#1~aH0 kYd > GL,h8kٿO6MQ */rBk?$GlIQA)K8H~MϘpڧH1҃}2[lpg`^ c3IlJa^i 68R` vVk/;^s%cr?(f{-UE{8o9- RƶBBY$1@YA 6 KnC-! ,ʄrVš3y @r 8M(rneZEw$Z)"V_"AH: H hwR㊲wanC|w!-aV=3%~FCC9Lm`Jn {g! 
UgVHquҍ6*/\(?fC[Rt1HM-Jv0v7@VWtfT?߽^KM޽>`ʏ71]>%\w#X7P)Mň>RQI< ĺL Q8PPal93+hQA(>0y,J€l%U7]Ƌm tg]c*'zNzW+~~T}]e{O~ܜ͋jxXNu[ןz7*ڮk{w趟k7WHPewA<{ b㐋,4qt!YVE\pM s9]#|-F F$H&`V!ivHg:f҅^)T0RXЯ5m [>gDW [O"a]>DpR)Z E_  Ё҄}SZwXX83ܓ'Â>H4K7(V%1aR]7^~4:XV5m3(_h v7aA߮ǝM1Inő~({Z3mue^8> fAurQ N߄>E_3ERRA(ܤK9 Q KsM2𝀬 |紽zz˷%0w[3n}ohA$ah&5mH}N.G.+IϺ:}f≅bcoX&*^^6L-,av)FԃLjnܷ i@d&kC%fUetZxm*!}a6J*I-%3mX4Am>7PT{frKm7ӆZ̺Ir~*iYvn ݄1S6QeE2saV][=l$%n1Yx$T $e4 کbP28<10 .f`rqb#VAWU>cm,>|yc}E1V5Rn\tJ\$U4Fl-<'tTcH;F1Ҟ2Ҟu=APXVjT g1E jZܣ*XZo;.~[?26٬,BY~k%ccʅ˧ Bxr[<Ǜ<"Oڱ6#:6vϸvIvHJxlG݌翴ƦV &n^և1Rdȭ;Q?ԏf^gp9DŽS"JEg #M&y88 |}=mÔ꾒)2D"]q!7PVom3)Jբeb'3ћ?;Lry͆xAܱ32;,\lۮ|ٺ>^-_ͽbjUcH뱖dE[n ը eFdBxRraET\4ئ\Tjx͜g}qj f/O[t௛4{֛,|[Wmwm^{oy\|=ۿvqb}{|o`wɃm{;__<>;߸r? m6}{n>qss7H;#qO8ٍ\sKڷv?lr:_#g)])t-E6ok=sr?/hZdS-^DY_/8gw) gq{zr>GXݲ=@}OZ-'}FxDX,#axʉ4jvV9W~ߕZ|rY_|8*kŁDKxM8zKokץ@8u#_5w*$B68qzy_u:Y/l;j{d]||sƗxɿɚ[ n:0I7x'ƓW:Pd#=ˑ3apy>psɤWZWtVso7gW \ s=֦հ2#\}pE&P\ Kx6pE \ kjX;]o31ڏլuqͦj㻓OS1CI2U qt&(~ӡK{/?}wufc^_/^ng!D=3ǫ]o 1_NkZ߲i.]9翜N?8Sw%ƶydfwc; c!3iEn;Ef_z7c~/VqRZBu=6l0R(͘^ b\=| "؝# oQm4UVI jZ.]eLI58x3Ƿz[^V'^E1VMxwlĞL4pV#X|5PMe ţ[ı2\Ս/2e,qEYgE9iE4v]]vKNsN-ؚX 6ELPX\@680)dBCh¥i, x R3R}3m X&\}+;c$Ic }+Lh;$fP57T*G f AX'm̀@C;D`PgEGI2VSQݘq ʚ-{nGUPnFxphPuiV& :i]+U]4-6ɪ˚8Tx݇m_>˕E=JzX}.e+v(`щ*."`is'4\UF$di`MR؆g.1!wfJ@)'@\ڔzBZ\9?$IO@a`ݳfm,8I5Imodx"Œd̊ʊ"3H0 ZJ+D彲_rElӡ~uHsR%YWRCfRn+W|EJ/[bcEu: !ꀸ;DV_?Q4HyDf%HAuD$ Q:HAuD$ Q:HAuD$ Q:HAuD$ Q:HAuD$ Q:HAuD$\-Q3]":Ľu@Zz)B[$"AuD$ Q:HAuD$ Q:HAuD$ Q:HAuD$ Q:HAuD$ Q:HAuD$ Qj:`pv# Q: HyD䑨D$ Q:HAuD$ Q:HAuD$ Q:HAuD$ Q:HAuD$ Q:HAuD$ Q:HԹ^NYG"]"Bv#K_g: WO) &: ^EuD$ Q:HAuD$ Q:HAuD$ Q:HAuD$ Q:HAuD$ Q:HAuD$ Qj:O35Q{35ݯ Z~O˹M 7鏐ROAuW ';"@ZnWRJ+Z{ZSByUXwYĵ+JX])Dwݕw ]Fθ"lVkwWEJ-+K y. Y_Ϳ??4Y~ߕ_zXM>{u/3;KmT6q^ AcrJSk-i^eS^'+?+eiP&jCrTR ڹKB6Q!.e"c! 
CUB|cq-ٟOtoJ@ !xEHlS a8K8QCj-dcIRExQ&(GeT܂f?:KC|g' !gr ?&r^`_piroߥ(ß15;_t%(qqT'HfmAJ`{,D)S%/ J @Qi/!i*'-3C"Go}A!E-z;rq U]UMڸú:4O޲.<'g.c5ىe<.xs=4>mb&x2-ik\˰v :Vto);\?0~svi4i^c?Vö"$_[ʎo&we{#3p?$ AG!*{.Aq;|m+xZu@6 àV;9B .'2wskxB7K +M/+vOC|6AFµ$+E]psKnk_s'7wqF8QV*$^H.[1W1ĕ1A:,ͧa4=-_njp4fjn8Ӣre2)S)jjEw|gViЛ|C0dGj6Jŕ78d \X% TJ겖xASέOp8M4},kU=ʼF_2n%<06.vӓY$,4a` 1E.+0=T}-fY1wI esmOOe|86\?ɿ}Q^w:D S:TZ)\) R 8R\::-wRU:(ТglMq7Jy_ؚf B%*6yҽ,ba*FI} REK7%6b}Zb C\d{8+n+J-NF!#f&lVGLղnMq)ۖvi^Fۆʀ`\FR+6* ' n &K|w$@ Xz%)1K jc[oii3'|Iw4ؓl\zdA @*h]AKjzw0Nmv0{~hB Zrkv>yxMqz^j[FfWw/g>Dʅ_1Ϧ=G>մ.q>՜=K6jKT/j9aR:QGn:olS둂#-2ݕzEZAir#}CHamunz}ݷ-.>i}&j>}H3Y].ۢ.p{3dږ"MOjiMooJ!@UB!Są;0Fa+ptyM?IɺdIiUyF $1$o`^ǘI$ךIOI+W1 ̲t˹ґj D|[y 4sI L6n15q$y*GIMQc[5nMrnf ]ݍ\Jr;֧,Քp!d,LTP)Ćs s(ɭZpTfo nKą)dp`3995c2(r e nMg;.$yK>Zg v1l) 7?%Ӥ#ZZ8L8fRq}% r L1߫"h*[gi\X jnQ;,`c,'[BՆj|w?) 0\?ʑ.?w- p3"γdU ^XD@-!u=.iIdq.]֥H^gmmن"+!d`eZcL{ ?L+MxƜS̺aaijF~^64sG[oXY mtLZVo2?Knc+>/W^XaW7VcI!\̅!)B N K4l D;iC`:EA8hK}2mg/,}OՊ~% 4`BD4 >/ j=]{ >nFHZGR̈́yv="+v}e AWZl61XXpÂGw1,bNh؇ׅ~~XZ]mfO>,_f0MG3.fy!/rq_nl)'ӧIԧMuY zuoҽE{cӮ,M9j;̒/CAJ8zSzh ߹^O1pw@9ݖRŴ%*5\vy 7ZDD|9\B Ņb06DW'-*}ƴS}B=kJm6~gEٶ Wli TG/wM^nm =q8_{zY+Hm-=Gn3?Lv)5 ,RsF`-tI"L3 Ì$jҜRǾzi%.;Bɺ:Au,Zv8p{9.t Eq|I ko.ϟ&n$:'y!3jV=c\kʴqTT,bTVx^XBh>MOLT6D/+dT7Ioz}}kBIz[2su VWȫI(˳C%&U I8Q|iל6~|gLԵPe;-q:2qP4s|90f"тJBe4E(})!8/6B 1sS8.GACma{aӹ9ɈL:d Vi "0^rtO`6gam 58*BsmFh|CZ|?6ԍ1H+b?&XfR#.Ri/I%՚T$x<pq=QabʇL<%)N0z%Fg4=/}Tym)(D-ݗKk|2);:c*WIF&2]\b~* ( 9Urۚ`v܂nu`ܹU| gq7O9j lXCQKl`ME`0#51gIH{%D* rp<5i4ע>OT\ 2 c\. 
B)Ie.,mI@8`#a8dmo\  ,Q-LZ,߯z(_#ưej]]U]U]/2$IV\)R׊ZZa'׭~f"gA1a*mIY.ᣢ Ga5-hqmVc8qYSxEeFVըBV x ~[ l= -5gAخPfI|ȶ'&':ZmFKU,qUe4NX̅"$tV[UK͑T+Cf_1 ٖ:Tԡ^j'aj2 E*A"LZ_% F ; 湞!D38 B5B1\QQ9R>.63+p]v>㻝Մ>.׻AIgɺLQ \U:H3uzʡulrj{ ?2$bJ~fzqӻGzj gv^_GշTQwzY7`Qe35o|ob7JmOio lsP>r˚/m60h0ڐ;[s@Iet-5غJߡ">kdHXRTs}b̆5@#3w@-?%oz=hZg.'B[-)Hy <$<3 ZBjIilܤ0)N+4{?U6?cjs F++W1^f1;Cf騌<Bj+~Ӣ_ŰSM*ĉ?:u:爎߾}߽pJ9}׷~O p fwᷝd ۟ }O5J)vW_waȩ0o&7Ź^ɒ@!XWrĽ¢3,¸ bZ\ɂp# ܋11MNyɸ[b!%ĀevjSW:]Ybd|d2[~J N331L7O.ƾRC`@Hb@>c X?p˿"Oe+uB;Y\ؘT A%TS7z%p"Ө3 Q z~wLmc<C4+M&h'k }PΪ*m΋ aF %]!aW\~01Z1JݲȮ8jO 4Ǯ`GaWZ]e(-m dW9᪹VFnZm{1t_JyplϪhqjcszwpz)P;`g0*D.a.#&ۨ% ޗ+'Xe09 6pމ-)-e3z ,ުH/D B,~Rg;a6&w}L t*R)"6K`^}DF`4 dhwdT hUUFl<vec *C)[vٕVC#J8k+VP0Ֆ]veH 9 v+u83`ZM^Pz%+KQ,8!CaW{/]!vfM[vܞT<'eH}ן$w`2KA/S:"%*S&;Cb\z!`lHRj'l V@<霢1D'B `{lyO>Ɍnz!炘.BMˢ/'5an:Yˠ'd#1[WG5 GyMxu΍\]t\ZRhѝ[ܶ5mPA@"}m狈|R&Si[nԢP!*eƘ"LzVD4w„ T#^sfAeUHp,iGьKn&&41pV( tx npSu_V7'[e+Zힵzi{MuU,B-nr 4;3|w+z m8Aŝ-x4 +Wt S:TZ)g0(8"N`K㝴TGrFet1yOU] Y1YϢhRB!*$RlӜ1pVsƇgᅍaV&I^8兏 幍1d1HԸnfs/~ݠtns3}; / QJRBZs#T҈@%JQ`e47ʱI2dEF6A%(uBu BAa89r7lSر-צ smrV}wm&\FR2JT8h5HP A <\J$ T#Aac懕Rt:2c~lYM0+k(!J T[AgҔG-X,Gj&Tri*BokjM-vD1 #\f8EXU3^/q+w5Z?~^FHΚ,|Z^o rM4DR̕S6^^8ۧ, 9!BAVׅ"qb4 0d,Qk$̜)KSg k=w fbz#{׋E+|QNTkVd_*lzՉ)(O_<8dNȅs$1N =.O}-FMj+LRP<#F<ڀ6`h}74g{m`NS}WUcə%!3LZeZz4&.i6z<*ĩ\Q[.JYŜ` j"Q­m1p֌xJG?m;'Inc7O.1]M!>llwbFu)esԱ^ 4YIF)B2KITP(ꆁsɂa(Ke;pXugf85c2(R M;4j ^(g}tdS&o6y_+Jy;_w%m#Y34"4 :mmda(QdQ,% d|,{U, SXMY?~=thrq!WX"֮vTL-E`ѹ% %UwKnO(EKf-D[eu3Ub2nb@'99&r6ȹQj"1HZ)"V"D4A ]f6w;qn9t8MXnW'Y=p_=\jb՘ 7Z=pV|wo%\},]NM\`yDd)>תKNzΐ Z n["6Q $H&`V!ivHg:f`MyH_[" &дfpXCáw҉vE+z֓E ۛ[Be VS U&,KxYJ5@p nܑa'&qJ.xEӸn_^&}FR٣C P`t勞o>4[2|+{y=z_']ȮfnǮOݟ;|TmS-;&֮ʡ4 w*0sVnRn^'hUj, |9*!0wEt^2]u-9ƱB6a*ryw++$V0_N0+p, rT2-}D%2Q:<+Pv<N|%iYaº4Ή `\-1|4|!:/9Xs6ÃN()ya`"ZA۠FH"I0( fb]lٌ=&Gpptbh,J]]y:xsnru:ҵVJFqvm:+* ZX&_v0 ,0 #YbaZLy+$wMvKp+gjL"")MzqL87 Iw΀ 2zʙAPI(>DqdDԈ H!UR aEfy#r-嬒/_Ձ%k5@b9"!cc9-՞@sB45,rb"pTQo_e D(gR&J "2% E͘Q*$cBlPˍ^nsSF+"T{B(f$ʨb HB@ēU8(5V+"&n$W [fG9eRF' L(`m-ߊ62/xI" ,|u$6^6%y`8KEӡNc*8i!:I ƍAso5|1c֮ 
8dWC|a2t;2ˬ E* bCꀄ6du,FRXl0 )>$S@n7R.:+ƇJ@<}Fe M_S`o"NyO<e<A@ҔR%w)W|z7 (WHlB]c0Qʃp-8z Ϲ21`_]JI/^s#="V g4`Ӟ+CQhזM9IsJS%Z,#)f"IE$s2JawZIA"A}H&6pqONJq/C/Q}}} a-G2$:%xO:)&8(F԰aOܢjd}ϵPiAL@6"VK17D0*;#HP{,e@}9W,dw4I8ߺNٲQ};?s>yҪʿNO[$K)ҚeR~b%şt) J01.uZ{xѥ>Sf "e0&hyŒ;Yi)*}*F"rv.Swfpmg\S  ފN[lB[rTa531#ÄtEUXwy(F|7>pW$-r)7R%Y}S5^^>]ߜ*u959s W_`L#l0xR0Xxb25]Zmr}uxt5=n Fsmbsy5Yfvn8)+ӛMm70VI";]7݆4߆̳CzJG YL=|g3Zp\5ޕw s gjZBҍ9.=EZ>gOoeuLWoҏ?K뭳 QU$$tHPw]pEgX%'DzBb[ )s)lq_Es>r}|_0|}B `@Ӻi)Ef{s}E,/1[Se?Ƃ$E27k0fT`'ic UXT`#H)8i\O_m;uL)iCNDpSlcbpD-NKAo()hu:Rk#<spT6 F.r,NK=SV2ObZaU1p}B@j7lBU8q6J‰wي:ЧYĀm8Ԫp(({8T=g0=ֽ$^exMrFQ+R[2Ee*l"|dslR}|eԓ'Zɖ C1ғ}V2Slpe๞ƴ9f( C!flp`r^ wE%K1*NM8c`*WY*<8۸j 7*Bd̰gEѓ.W?y7}z #Ѿjb=nѮ-:Ps;W1a{k>+ mm V!L8L_>PreJ/G1"@ J/mPęARLZ0 ,s9OblifUsbxƍ)W6Tw^^QiT{.|6 MQz)ċ"V6R8_}M2|pjQ-:fC!BE荰} 2foZC"=$' [pOnaW/)}UͫI_/p~q8sGV{'>_A5J"#yG-A4ϤR?*ab]Q&]8Ul93+n @@BB &aC8 U]đc'v dg]mB~MD1UFerP;/>[T37p/?I>GQE]Y3RoNI|;z9U"XQW@.ATUcWWJ[u + QW\NE]%j;vuUWߠb)OH]zѱD.gJT*ժoQ]iF֮roY.E+Zo p|򣼲w6c' V:ZTԁ!J*WS­ `صݱ:v1PԬEYM|$[%٦lfxl,6ꪲ}kq[׷p <d5g¯0uczr1l~r⾄Y~aذf# w1A0:y=zEa}׮f5!adpӮ[;3/orb#Cͺk,yG`BT&n\ݜ|^z5c>~;t"{ ŘtZ?k]kʓ jיPMHK}0GNXӺc$* #h%ѓx,mn::x$4K9q## FT8G"į A*!lhڳFΖr}dzWaT/QJbTD#䴤V{3C !|Ԭ U9S^ 7z =/SPΘ'B9Ô2QƘ))T(j\4R| jlO/?y~f9'<%Nl""*Ng.Մ6FY0ɶ2s0$h5(^^f/}{7H8̹`,"08 kjml-AmӐ"`͞Szc YLm80mJq(TJ)4NZȃNqe*Cl5bc'A:z㐇JY:k ^ɰ:3>Cj HD i\չ ziOoIS~Hsor:7KD:`r( %ꗹ,/6}u:m'9xy2^pxx,,1*D*K- |uSD-< (WHlB]c0Qʃp-8z "g˃s>qG2^;9=x-=\WZD d@#ItKj1uRJJGEɴ6% {oUJ/ 72Mxga/ZZ! 
ǀQAc,#cdHoAW Mη N[fYU`bſ1#VU9 gr4uHT'DN,|9tIq_Ҡ$q/oS]nХR "e0&hyْ;e E/;KSD.ѩbLYs;>*gϏX͵_%P0 x+:mq oY ˥CRĜ\ 2r'Uͥ$Q(f|01>dEUoTY%Xw_|.zxv~q|\Wg:RGnApvU4ucKg`Bd*asv뉚Y=<2c R_X]x9;k>p Fsn~q<:=W?^&E^~83|iǡxZ}'ٔYy'W66Lv&|1T: V0bb|.t|4[p\uޕwm7NʆVBҍ9'S,A{,&u];Z &XTOlTԗ378vӯ?|:÷K???~ ؠ: ᢉZRBAo()icp::j=<<spT6 F.r,NK=SV2ObZaU1p}B@6zلګtxⅤ?d+Sz^ 6ҡVCԵWk:Tf0Ueܫj 4߽&}FQ+R8HԐD0ɡgЊm[;kcY{d1!49C} 1T)b+G V,jRz\TkncH0 rgp-_i.Ǿ6ؐTcQ, %Ք>8ʲN=iAzg|-:G}Ub==]>v䃀Q}*o> t5X~ogPj ;B飋Yb270M 6.YKnC!9 , ,š> JqTQ( Ȣ`;`crX+"D4<,34Ua:% EYiM CY1p%@7Bq|n^ HUս1R纭]ݼtSls75=JUN7B[WÇӭtIN{t(J_<7K灂tOVEsЛw1`zO;_ 簑.†) ؅bH##YtBN  }fr6Qoiz*8'lT짆C}N [q+uܻ > Ęd#UqS1Y E_rm AkMt:<`]`AX=w0|G4n'I_-w;Jbv}VL3۬L?Z]d /ޝu{DȢ nDՂkԶ3ոPq3/ɳikr衄&^} .f) =!"}Jgd#।(Q"Im,s$J!7, -_v%`ճU#O"77XPUWy:9Pr?rYA2WH `>`.yփv3 O,kbxa`mWf'MDˤ Gv300=L7V)'K ZnR|t6}ie.M )򺜅sу_Š;iRfqs{ E91z0(7y{e6c\1J}I Ԍ000 o3cJ)k6Qe5E2csF9nnXMqo*0։>ɀkmh7}SHwSLՙ w_b+[.t(% smCe#7~ |4~3rGO?;4" 5 &cŸE{<@rFϵ,wJet (OCf̼ gٚ"H8aJމx fDI TB]I-67ҫov7* $u1X`d?mMZհri\[(Yb6Jz͕ǗA M۫Eu3egoهꔑ˺C{V}iSGyNtr5J]F7WOBdg`7Z&*P׫+nq/c`bҹV:F j@/]:(~?t$*8ʝ7%p(9~W=*t9z^<44k!60ceɥDP&&10DALKM^cA?9`m6mXkVkv|BԁpX=5"qMPLAyiL.OXFaBdӂzȈ )yTkB$^4(=Lt"G6#g>i5E#6jDְFdFl5bhJTnkǙ$AKY༆(G=: mMDkێ:TJZ+i/iG-a{5#51+6bqKu}xJ y}ZRކn9V`˳inNT~'L`~at:OƓO4_6I㺺[N (O2cGGk/quӻrzYN:Ot0#~! 'Kg`e td\P'Kh DkeI87k Y\3]#R\V;qzE}:bIrD;3Fu$y,;3L%'>W 8"u >Bܹ\]ej@*ikTW)jH~ |?W;aκ] E<,9eAzƲL0PReDF 6x#M`9iB.D#ur.J :/9ƒdJ3݀]vGr ixş>hƛ\4ݛ]1=Ie C ==j<;m4cwv]x*8&g',ѠLǂ2Z:zThkDRޜ_o<]mq3wc#uZ#]@pBR/'l&X IcxM408b*RjW;N3T[ns"$u"99-68 <"23,z).S h~C*Ww/P-J7" OHF ,@ hֻM-uK)[Pm[R)v_|U5:V0"6WFÈ7RiŝH> n$jeҤ ~huhqxU8<,9V.OicQ%h:rXh]S޻fR+@2I9E !1ŲEQI+|ۅ#gR gvpt)ڗnBk\LhI =݊SSdzy!r]ʺ1&)]QuF?eHQJjO>UFaQE+In19Ip+g3nmL2%ˮ#B XtҽvLF̒dbPddt!WCpRp=KBIǨ.rbY#gC9 fW1*^~ YDպ{4c)PRxrpB!!j;⥣:2b+MK@O?3˔\&MF Re9R(H2>ٜJFB(j5wHZ?yafy/C.5ɤ7^EL0xdJ"E%sȓ6s-!x;f4kHfˌ<Em1NphFƹ2D&B{ h<7EΆf|evp~ZZh>*UЂsPk xQo(vdC@rq]<! >E{e>iANsg,3%f&f E*h"JD(/D X12nUڳsGAwX~F'[$G)W6;,'ƽR0DB}ۏg?}>L}ۇ?GQhNH`fg~݉ ;`>R3wN*}V9[|tsaN^Su3:$ނ#]NuˎtQ;! 
c-{ql3]JK"Mz:RRIkoBMQ3wDi1L"4kSQ U ԹbN6'URM-=t /%/ɊF[1lcm3R02r(!:Xe ꈃ As߸,i)TUSaWNO"., l=:T,gKc7=)SuTVp:ņTZo찙ő\6\ENl͖!pO%حW"-R zCJrdĬ|ނ)1"g1TڌF\)PMh(3ZEF&$ssgUn/zF_Pp%Q{ˆ75.Χ=0+Wn CLON6 ) Ж\*KŸ0A8lfV/:Bʍ6 e2"k۴>A% UH ɋpU 챻sƳ)Q̽AcW;{m=z] KZХllUs!C% hK֚Sj mVWX&i,Pb9!V6Q@íz)$R!b?&pN cM?vG4G=ݵ-cfєPp#:@jXP]FR1SZ)@1;!-0$,سr,E`sPZ[t*Ĺ#Ql2}%E//~.VpZɸNL`!--ѴObB]YA/?Cw.ٻޟFs$D|]_H3w7jtZ=6M>~7!$4 !85NuUyrUźYӯӋ_# |{s( -g~4 m؛t7yA e1WN3P9TRPq}ű<褦;ʷ@y,(HVsԚ؝ JO-V"RHIB1P̔0E/lvo87="W ǜx*EQΰTJUoe$GY#IS9Iކ2 2cf+RsɦX\f<@_h0C2̙1Y({9:f/\ y0} Sz :Ӂ0&aG+*6\T:ĺ[ڠ ([tlJZ])XSs ׅ;]_||z3n6SC+Jwp>߆ac6P{(t礼o,cq2")jp;1gSU. OV]2#ӆU@-rB[Id΄"Ғ<Ț[e)=MfJ; )_g]8L=4ԧo2 Ej$ /B%zQ&^hJ7r zk]}&OUM;+Įؕ\wہ<udKNx[Ģ.[O!u%^Hi.ɮ4JLsD{!қ̿: }@!H%a "Uu5]aK 7@3]:`D=q^uG?aYKmO>^΅~`ϕzC+}ԅ/'6[iN&'WO}4i\tV @"U %JU$3 $c u씒4`򒙧Xi2dRh .3HV)MU!f3q;q+x|Q֝ /XzdSBY>tR8wBߓx.0Gj;NsP9TQx1l*0A)IDM$2",3ˢ1H/Ym ̽s5x鬱M~ۘ} ϗ{wꍐES#:PkTwg}u?ں=γ=.lf2w-;nybn7zi|K-z^jY*[͛:7cFubx]n%Ҩ7E/TzM^szAj_8pg3%7wh_.>VF 48}1j–ލ~γЫ(h|[VY|}="#\<_M #=,RR?`!yT?ޠ^m<~x~?{]UO3ZuF?~ՓmzzżW8Wӻ~:-ŗt湎Fl2 < };2 Qk%;hNni S?aJePfpOd#jVH;rN 5ɨ0Y_Ƶt54D ޞY劣Zm/&<tur+tz|;Q:o?t<ʗut]8@?УYџalӥHkHV^Qw+3g_5bCAz׶XLPCjC5v*PV6ĎP] c]JC7yr\!\d\Q}$豤tVdJ,a Ġd+p&E BT(kYqهqǺ*/2䡖HڀΊHg !&x͖`-SPIb(蔭"\2 &7(xʺ8ﵬߥRJ  Ρ4dQcCm&Tś0,͓wBD3=٩B**y|C= ?̢R)cShclP a1o|@nbо㚦8[R*1mKbדɑ6U 9RXƜ4I36~44cׂqp1½• _*2^|pˤӯSgtt|uqz=vˆEQ:FC #' eNє LVqض8(ȯ6X2'[ BT=v3qq|( Vڱז`7i Sׯοy)8~|P 9 OڑLkNL}M@mE*!R-v/Pi W`Ug)oMЖ%tp-%e[CXW9KO\-w}&jP ՜3>u:cEY *{}P9TR B8܎~8UsԚ(f)2-0K*@c-"ԅbԡ` 9d 6\(X;}V%|1"f9ɒHo%\xӃ \|dS—Ywۃ1@ De1P+H.{R XB`->;(M-db?V 0 "8R UJFUHN&C'y RnXҊdp O3]^T=4SɆ&`d Lk! 
&,ĵ &ACﲅ^ԪM 0*|~ԫm5Njzg Y^h2t@66A9Q kʎypGB y/դU:$nB2{( zGږ?L2p"+JOqg_"SģHIzeyL ƇhBtt̤ИAoe(r1eQG!8f@7) 1 #c=q)0L]<>nkߌՠ`Ҋ0g|8Γt3=.&7J xeɰ nQ&Q*[>{8{3dXK7yp XFDq,43sBaޗlmnGWwox~!sf~*ws]CZ8l} sl{,|Jhxv6鵫ɰJ1}!AѠ8YA!wÛtW 7_}XJODk)֞[|k #bO(Vm |Tl]=v~buZ\upP+N&>"2Mbs;Φ'<l<1 Sn={}抜I F!hqIZd,S0v,A9/Pk4(MBT,x4u[΅r'+ ~ !*)rLddKTEƢɈ9ϊkVޒ9/{Fc8ˋt U(+Pa%xG'nqIʝZz #ri)*_+7%M_\xGk\bJ-\ 3Oap٧(Q=YD&䅳s9 sH}7aɄL2x9pR:J;YICCcCV J:ӡ} d+ I {jInӳU8aj7Y|9pNX#.AB)bY+)/z n{ǺNyJeKZqBڶ"]?r3Uh,g*m2hV10ϸI3ZvꌫOuE5ϸVc=^;{Pj( uF3]HauvXXɢK5K֠ձZER'\zt,*ݱ嫰/7vn|YRQ%׉voZ24dlq>D-i1ZTL{RI*{ed\b) /n8X]ѡ]L a&wR!!I<kLno.mLCᡗ`_nlqe֫$#G }S"ʔKu82;*B[Mr;WrvΥ2bE&& /ECpdv[b˱1rrLx&%mI0w?9Y+W|bè !"10pm8LY2F2,0JJ 5< ʃI\ԋ@MxBQe?21.kH* =eL,[ĐBIV5f.o>. AE*)l)gDV+{+ S3MAHȓ5⨢8:u\V'^q ?(x sC,'f8Al"Fg"!F'[i7aTWI] .1D[-H @G@Bp'cIĜsVgvǥLSgI㽎 8F`v@vTIضϔ6ÁaϮمVBgn:MW~l:#5xuFY|jKr\`,yzVy&m'-0k뎌/>JYcb7Guk0v*Y_?-kuѫW糫_5|6C. o{0 Y=MKBͻPoOz0M].~zm!F^s~hxq9_Ya?욃m-ҍWK;J"cgҤw΄rLWủAsuVyCbx(Zyk'чBO?9=4;glˇ5gL53-sRBĒM(CNI{A &:h1T3"|x /?޼߿wͷo7^7CΟif)pn?>dA'SvL;#V0}Vx;oi(u?,ϤZW|hT$Pt8Rk{X sALUX.0xкXQwyi**8T6ģ$LClª>tyBwKU=tgR]ic9n%Hhy8jY# :bnJ S HBG0'ptI FZuL)#Y6nDBGcyYbE-VAL5ްL{GCJ먨C(}̄`>C2p#"5zGH)B1SiE#X %.tAX"RzAe ehdyBߨ8/MWiB89 ZY,Ya&C#y918TuN{yot+;IّGp!)CICv Bְ,m E@W˴RF {Xn64tg`u6/lMx;if&)!3AL%{ƷI64R^rLڛbn3'O5^2iG/ѦѦfĤ_i=\hD]d!116Q.!-5Ovy萣M-_[=i}r[u{ps3#MVGGl/7澸Pp p7P`אN(?oG^8ݝ>u$}'>uE'>u-!b)LI2LAJkimJ \WS*|j6;n##KB/SR-6m{z4ҍx^AcIJS)D97mX|,x-2l`&Y]WQ8$e%)*+Cj"rញ41!-Vq`q>PNznb?iw%:[bW͙txi*hk)[iЦ@7qG/4aqmqO{?vet rlq7٢KKo[30-oAڦ| JB@C{HK(x͙[jsQ_5^ùxo<ڻhwᤥ%/d&i ]HlGP/ƣ~I1ͫge2^ s8wt%wޟ $Jnڠ%Ga}qU"Fosm(U{ӂ6ûaeu^JQ3gLLmZ?)e, 7.G8cdi րf/{#[ ޝ?hs\6 GWi1j?R(bf:*8ZC2FF,f3>cػQ6*nd4|Dy?o9X)QΨK&䖗-yv۰=qC >|0%vaɽIq.FBJ(o4u-N&M:h#ș(IKG=Psuo"jK8fFϒbQP@h41GOkjB훈ZΨ9FXncPNJ^7Vh4Azk6'C6n]Ξmɜ%{_Ub!dpJp[Q8RN$TcP (zR(`:fyt)q)*+,2rlmXm8%c=RMVd!%ZS.ddbDen˂Rx&'avK~  F׃>Ď!LA *^P~hѦrhfde4 JlaD\bȩ ^ĦP:(|9 QgM`.STSvGødb EJmVYjNjwvr!YAziHaRj6 kэ*3$ BFd FY 'npR˜F$ hT#Cpƨ/r=6DFJD^Y"N"vqKO*4T]RqgUރJHg|4uT](U6NEf"j}0TRLdB@KN3܇a gD>O̘:QɾrQT\XH Qy%BEJ 1$oy)!N 
R!jԱkgbuhwp2m{?՞Eֻa"lJJO!MZS R !sW*\FϳO2wEq;B2vl{؁!K}GHQ#m9/>{-J'RKԬ\A %",P1)KWq^DP(gyR\FHkp;TjVFՆ#ػ@.ƣ?#cz_|蟡>[NFs1G#`<@li4EZIPY"ڒ #[=B.ߌL4dBSE^p'j-xJ:$չ6W0r%ID;PA  Kx01h:\u9u"uNsR,( O*GQ6 0ggYcu(W\51%0uZn,35޽z78Iɋ;-#]^?)\0 ؚ`f5i[os8fOAo^ͦClI̵ꋾiO6 L`󜢙CcHޠbwA] (*RҊ*8J .UDAU&Vt:--7.,`du2 , / N, ~t HmSJ||~ux d X>efT3d0Z@sH&9*4YeZ\p28l `79952(ᐩrKjUvg}|\ųg_&mև}]YOdYr Y#R4%Gkx hfJ@ G.fR\tlBֳO)Tq^].Юy֐] 3K縬l;DH 8å^/& p]Ҙ`(+@#bQOqܫ@=bUXL2d3)[3hS> ǘHXǨ\9U !]֥H^gmmʐ1k {JSk;7TR蘼~ZkqdyC* $&Ϯ8|VNWy"#:DXcLo&IdMB NAͬ6y8aSzߣoUsp[GGy2,R^y/m*1#ٿ2ЗM;T՗W'DL\NW=!E,fGLOwuw]]cRWB(>Fo%vͺe>z]wciw{$JW,sQ"pomD,p$U I*.dJHh}'cl Lr#d\x i>P/uP* (1zB"L 6&2jU nn,yK?QiT˼[]b=D@=<2 jMH `4Q#!2ƣTH4JF+&*43v`49{ֳ9y28D픂*PFk(2%}bQ,Aj &CΓu{ϮDY5qĜ㋧zk^xWh)J{idQk( o8֨~3pWĺZM6 Խ-@/3׍w/꒐_J;)Q,cNx|>Xg hp*sll| P&SDUABqYqtVNtÔ숒'Ubz_Qq}lp"Ƈ`tT̎-t Hԋ 8 R4UE0 [qn7 2 y` EcT6x1d{lqpt6^o۽ڤ{ucC2ݯͺmmܥ]ߣ m+%Q[' --`XDnPC?r궛@idRȥ9QEկuFy ua5sX-XS׫߷\gWV5]fKzH Sߛg7~>-~OhO%pJ+qMJD`2.޾s#ُvp;n?4}vM_/4t)ȴY^7E{Wج|_]U4jzy"ׇJj\*^_ '^4Oy'9cwx/M_ŰCOBH^t{,aڋNɏRkG;-+ilr)nң{޽B-ܴۛ=k,郶iao a+U1й<#:҆F# D,+TU|;T]<& U{TlH*i- +miyx^P,DńYnn;g3>ʝҋq’sm|r7T)HGi׎ #@T;FZ;6sʹA[fꆏ wK7U!,PnƆy;ΰM،x3D.g/oq,V]wu/3m۞#,9V=]lź6B#8N"w!sA6PJH cI2%T؟r;ޱUůB 7xv` ;'+F$q̎+Er)DMY9߉ZխVW>VòO'GTr-{De!zKX֛mAŷé5Vv8yq2Oݙ9wʖX U۴-,5.+Q +OaQ h'DX#RI9`+Sg\Vg\H׼mlVi{뉌.on5JÐfhɢTr )Bi"% NppnOEaw,`&,*r Gͥ8-GTD4d[X(S޻v-*R+@ 29#ѸaLlQdJ)?o[VZn.]t!ʗnkkd@! Bs ͥ>eu542J]7`%/o( yGZd4deFuQIQ141Jc[0RR.\(xOg3nmL2%˶#B X`{v=x *-I+Ƒ EDTrLrRp=KBIǨ.rb?ZgC:2kboA|-3&$֊gJ3!%A7;*@RJf#^:#S*IMͨ7O tNb#J)2qhu$,\9RvjjIo_v{8Qk4I nx'i ,[ݘ  ,g0:.k '㲹 |GfF6ըS(PFh({j!"8B#jη\h lQ,d_k(-񉖉 uĊCѦw\&BԸNS\؇f.') P'y̕Baȶ![-:A,X֩wMʪ1R$#K!aţQ$?h\iByw BɥH hMP` H5M8J자%m|~1%Br 4@J *p<<+>?&9 Jh7W~gk_v&FW?Z. u#NIQȅ(sݬT{s~SR}/}\s64o*KB W1QMq'Qc*7Ξ1 "@vF us$]vd3e%7ZOHtF:kxbx4խn·޸7i.,:10j=(wEH.T o@+3[ٽVn7oϧe`$s`.~ug+be$h*s۟/m/!7- #]aQ8$Aţpgl{Bcv/&'Q}fF]3WA\b8:'6>? F>w=#JE-:׸vS8묳n9?~yw|7/oN>>L|s:Df ‡\/\3=U|sv_*'ŏ?ëڗ͙T劯j9jݬ dEh>)f+.}m-B \{T{Qj׌qQ-_h SZ*D^wHĽ$x@YXpLH/j.q]RdQd/oh]KsNv. 
Nx-"dA/6e*Q9MϺR @,|gRLN,tY15fy^ hKO?IU31&v|2i#38ʄ*aP&~lb;_uXSJsVմ[j~yVɽb&(7?W$/ju-~3:DYnQUd^JSAFq(3u8|f'd-M]ճ"heJĔN8S&&Al r!Ew;˟H9w%X#KcC*sY.PYM(>ή!M6A CzNr5ךYM.?c8'zuR |[n0_:(f>;@p췫<|OOѸ7n!"r=4f8vaܕ3P"T/Z0UܵU h_V g 'jc+Zɕ&rCQ0.QupLRI=<;R8b)X2{!,QH~!gSsҰ=k:@% 0M9U[Ƒ נ6p ja~9^u+΃Y͟#;> JٻF$W> G%`c€glFٔLR-,")EQV;/"8yJ ͕9b”[LӶ r _tg0wKk}. L)]'jD͟5O ՚?QskD͟_'jD͟5O?Q'jD͟5O?Q'jD͟5OO $-pg'ߨ5w 8>H=B_vƢgHBG9ם#H.2݌Ul'?oeYtkż#yJ=_6z<ΧTY˵pA4q  ,Θ |,yk5[#B^huu7Ɋ9u4h_A(|m8\*͜k}ap`7:[·%{W[au/iRD(y4p &sAt Ñ,<+5GE\kHŸNzTUbսop2х`ر同L( E`u^PZAp!#R>,CQ72΅H©I@e$Z⩣FksN7SRXa7T1rwCmj8Td(yK;o g^OwJdof #b@QX[}煠Z'&g H7U›'GO)rVlN,-Bcs~~Ďav -QDgPBP}&9YH1 q6g))&UXn4to`S Rɨl &Mcx[eY1c zTzd{>*|3>% 6P1&;'`@>|b) ϠͻDrN8Z݁#vO"1D(fڂЪyh ׭f؈*8{aޚd$^::Dr6^Z*F,~;j cUI} eH[a]2/(2z'Җ=N  DK%xSfrzf#Ɵ),Q='}YX3ս| E_}Yԏ䧟|IR$ʧ qMJD}长.O_>?;{]2粴8_X-a^CyV|pJQ );?˖INmV,Ok18z:9_!;$ѠqѢZz- DŽw>c^&36ѓv!ڻZx$lx2+2%4uj+g\^xf. MF@)8iK`]F\ tXLI tFp&#~gO |*W P<.mGCDYDcK%"( W2->QE (́>]4*]7*RK/ {mvyoJ9?&|P77Bw7C>/NUEd#W~-ItɧrUr3svW/MGFew%8iJF &%0DŽhq͍N6p/CH|]rv9\: ] VL!M/Q{57*^*Q/WfÓ@$I6 H`mbGi}rNE <H`HKY*~OuXu\GE!0VvG&CHD 8)ZiS;Thc D-V~ FDFhF*s n$jeҤDΪ^q;v V[Pw5^J2]0e3FBiHC|\VT,*哰8ˑ7Wn.Oi3KSWoG !$BLcJޕrK>H+$䤎G[Ć1ŲiGQI+ޖ__]֡}t חAM#2DWa=g^95w"ȹ|Y~%֦χza򢸜hXk,F`"0PR9L]G!Q܂rQ%l1 rv<$QҺlkx:Y:/ _ 漷hr/[̒dbPdH5($02KYN:Ftkl9+FΞr-bw0jι0Ҍ@uHIB!!j;⥣:2b"cCO0˔\&sePJy @K d 璄Q"(jZ~:j',~v2p%h&Ti5O@ymsYL*pHX`8(^m׿]#<"6{7FF4B2udNBdBōqFD{Dqlr2I_*|-B]piJDp;.b!j|ަ6#Z(NR5*NV('K0뻾HG19T:da鐟fVogM'!M;dY#GiD>Y$` 6),^2Fh^%e|#4Hʢ>  q2ڟ՜-17vR([kJqn!&ji.f̍Ƭ3&H|2 Y]ahWS=%]\ )mJbGE.:{wnߥbۭ>v.:bB.Dx1Lwg0q]9t p60svˑj^HQKDёʥDKyE 1!!¨y J0#"Ȩc,XsppeLjG*TG#u 5.E<IF00D,F>a['dx`ƾݾlẐ_h71i1w޺VŮ]1^͎.87׬0.y-7\ F yoE<U=m"Ԁ'q™&1Tw^8)2W9^^C<%(at\`Rx ! Hh!T Vzc]ڶ6x߻ V˻x7B'qt2qyu5!vdl|ȭs7WW?{7?ގnGۨt{6r>[vvIkkiN 3o֎)bK^3Vy%9l=V ^ITA3'T@["824pvVkjFU !9G[j UOySd 6EزHڄ[Z3#qXY3҅HI]ci_v؞ddܑ=;غM< rß hpxu;|}8H!  
~hЦgZx8_(AVֽWr.Z K֚ab gӒr%-8Bj/"*H SےXrQ$9B&bvt!9BnEKZ|^F\reBzCP4_`FJ @( i׿+v;-eG^nwQguL޿7y)؝^}EWwAZѶ4AhORI!2 d]SQynT Bc C oJTBL$.}b"F hL gR\Q娀ݡGQ?o(yp4)NpSqkŇ,°c(h~=?s=|wp튃dJ{sE.*ʩ1}Iŀ:nL= mt%>% Z(NX5Sv9r(4urSC~BveCʤh>+>>.|ek넓q1L"CVaK2zP^!Rk>fKaMpP~ed^scv|br4 ]MG\lV!̇2-<4͖(8ejˎNƙ^8[6; eB/'PQ1qhx7 "7'e.Wfo:$Pr%ajFxA6Ӽ3h1W3B凙:!r@1ו ~toЖ2kz6x#)YB:w6ݙzL![lR> y|Vv^N*<˸gsP8޵f<"pVTQDKNn:ղp_(O)|_q{8(@qQ eA\1ŭt7Bdx\u`/'nm CS6oW٣NOf[zəkUrDLGK&C(y,K8@j#o7HL(EmAwxpV[8Q*VQ4!24CB.I}aʷ7.)S%%Jpqɑ G+u.0FmJz'|]q(1;_u,h8V "}$V({i[uҗm EJ;L*aڏ9fz3{[}`F)ҁ ,?1R:4|tZХ1&Ә kۑV"9QI"KmEqf D]"Y=4p* Zd7]ΫyBቯ"w.F)] v}Tn -ubڷOGW_/t8 "x4FQ!;xh(P( 4zm%qAs- V|7J,5Oiю%ύk6on꽶"5ėm&vדBݟ%c:lL\ _Nz9A@~X Jz;. %+c= u\l8UHY&VήOOFv)(;Bq(U[3awqM 5kSs/oi5z_UʵFR'm<_NiQ/͡c|@Z_*uYnU/4a1V w?A4SL)2e$pFݤbO3FG>h) $G3PpqB(>ƈ6.Kk 1] =g`PI1( !;'ld28bmRQ$Vgڴٟ8:g_pJhbB,HdO7@bE.`qKOUNkꍲVGBcHAic ׄHt٣t KcytA{!-<hY1, 9凴^KZ%eP"QH]˞G.zIZAom3dAle|jr RY-SID"ˈF'Ret>"`uW3Z1rLs}lUIi+JJsW^I-f%?p<ӝ?I vr(GK-F_~w-!+54JǙ-%%L\wzJ9V'֚!ùmkG&=YR[^礴404E i.뀭"8E;/P'W{Ubեh ̔73n1;"^ ՖgT .Bu"99nC&dȸE/%ZoAvOGS]M^oZ|Xa]ȄľM 'bǞ0'~\8>OqR~G {at>OqyBh\eOX!┏@~kZV$”™$UTvܻ[]43NO䜻rp!)xneX-$r"E2RH%xbJ'A=%* &bs}h?akgY6dH׶ݣmQ#Fd6T&)be߸ŧ˿ǧQ+nmiTTq2*%Ixe4&Y4dɿk7S7|4o[#ʷre>k*y+ݏ1ɥ^k9ȋ[> 9b8lOf%9Z 𿹲yh&ػu|l'\?Sk]w:_/>x҂HEa]Oh{wOwQWȊJ-s*%EVrʇ'fqbZV~2*W%Og׹SY'KCT;:W^+32I{9:E*5V΀$z)ԖJO-+U>SGޗ7/ ?C$ĤrjGSo(S]Weo\%K=!& DBAg,(%8Dp04ds˵up0gP(N!sbCK+\`>pzǀ_ W~ɦqdd?RjhJ&md[J-Npx,# ॔1.UBOH2ܡ%%#" IƠL"kqdw  3.cbLRf1p|PI&y|d~ $h gUaŐiP<&pՆH|RC80raQ.jɐאG\h*mHIMR1K .?#:ZjȅHX3b`K}nGgqrѷ>.h #M^R8rZ8+vO<(wqK+ >-> Dp͗8ZdcdX tgWgeY]O񷏛&֫Ī!Rqxmz Uc'g_0VGׂ{7f<c}zZX4v qOp9j^L25eOhC%Xe0'sCFkĒn:@C6**ZHL.H[U!Sc2Jg r6V[_jTcQBG1 A1@J# IĄ"9-+d^P>X\8Lt~Xeqƍmxq `|QHVǃVH "3hG[鐱Jr@# VP''iVpۂj2P$ a|쾹m׶Vkk\wy2u1qWCj,׆XkCWsi,{%!z=| \ZȋQ MޠB-e}t7@QKDhQA0!5K>cUmQ5Hw,Vr[Ҩy N6(6.63jAqX(&ɀC jw72Zdp% ~XT*ÜP>F/~ )X_N$Jjƚva0Q[Y ̢jTRpTf)4ͫFiYUۣ ^"BZz7j[ 3]$/q6$oYBeڃ!K-hL6>Hcn=[=贘t%uilbiImu`ڢ+Z`8FL-FTah XiwW 5$ў繨)8) #xwC{o>5kc8)QC^"o*$9Q%rãlFm娝Y.eBAt %0(HnGdi'O JHvXoܬ׊aݢi@RW/+lE^H"\x>WP' W`!пFu/ϸ 
bS!'B(badjҨZ1⓪֠cg b"Zq܏!s6,gɈSZ]FʄVj F5jփ*\#|r9XJx>j9ksڌ!$bU.u#bBv"J$섊$"0Pbɱ.klB9K\רtECPީED8:BA&B7\8.R%)x{'C~+w)gKy(T5w%e)xnY] -Q꤫qmuw\oӎy ܁ֽoc\Hnj0̽'`UD}$H ""H ""H ""H ""H ""H ""H ""H ""H ""H ""H ""H ">fHCbD`c s!4L{V,@ %@D D$@D D$@D D$@D D$@D D$@D D$@D D$@D D$@D D$@D D$@D D$@D}$cܩD9pH kԃ!`rI H ""H ""H ""H ""H ""H ""H ""H ""H ""H ""H ""H ">b6HfrjjPHf$Pr^##"$[C$@D D$@D D$@D D$@D D$@D D$@D D$@D D$@D D$@D D$@D D$@D D$@ t}j7F[蓟gmi~z}ٝ\P \w*%/bv~\yKD3o8L}"r{(68b~({ϳ< Gl) W0|]|֥oNwo] p_YV( o Yj}>?ZӪדGW%G <} *⢕]qM(׵),B~M2V!5-AvGmTʷ!αrkW< /fQ妓^vh=#݊fA+!1m-EUjQv4S¤vm~f*J-_1d{z6nk] -p@D?J w]Ob:D̺E+ N_]rGuͯM?ߞ w][\o>x.Mo_laAwhPd"&3lm t9de?.xv1NNJ>g!{ڛ2;:~-AM$˿F#Qb6CҼ/ϳo h,_SvS/qsWnl%w{e[&LW0tm_+_ϧʿiwU$lu8{-hQb'7Omπ'@߯zw X,tzU~M s~vnuiD*4'ߴعoο'Z\M*8{-Up{Q;V*wPZߛZ?O}49Ehr֩L A%+ᜄb!U?BۓIbݹ!$a- !llYt#Ȏw?gnRNŠ,x{U1@)bAJYc%%f)gG6f{6犢#*F=2b'[^Q욊{EDm?KCxg)1xI/[3Ow>/G[OƑ}B,QhBFNٟI| #eztg~o1ɜ'fM*EueS]iDNNACRgyr6e6]T Nnư]![lP t^b,zsTwL1OPL*`K+QT "tw)B;(Ű`[t߭~{cw`]jd_ʋcn1ͩ{]>=N82й;pA۟w[A ^]$1?4]b_Ϻ|jbb6>E&6sK pLqcM,6ѾMl lyTo ס􌪪$qDNRc416-TpvZMZ-*ǘ9͜uEg11OJi^$׽v/'xZ5Q3xez߻C{Mjo(]VgB(PfY=+]we . QD3"4_^1  04@Sgfkd- (%YD |*&0#|JG/CL c,9b`͜,H`o?mOgP:h>v`Çi}"|ۭNpfg.gg\c qR"fZ '",^smpXE+׎Q'O2Cy)5qjHFX'h+Ij_V h"m@ -)y\iŕG8r<^|{fVLҢaFuK&p._sI)AoDA~^XN?Y/K>{[1phb&$Ʈ&fck;9^#!E>^b[܊nM):/rI\b.*V dܧ>so8v EVCſ>ɯ1Ϗ#.T;%M\SQBOi>>?|x6>SW`7N~7U/koc/Αۏ>g]WmOrn6Ln.dǖ.e;MP!cغpxgq~^{i/Nw6 Xy_;l̂84d_mSpXcs>_}x,~dsHΎX0sCg珝RrDzOl2u@dOރ ?g'X%e{ZrϏV*{č}3qrg-vr0ۚdwr\jbdQj{2w #{+uԁ5_Xj.! V`ڦήgҮ%׍r^Oo3Zgaq|~,{4ƒe,yvʖK՟MZӵ-޳M\Uл3Xkk6<7UAQT/QNQ4@mL.i.%1{.%WK,9e=3ݍ׮~xtl:{`kwK|^wi_ X4tq2hryZ\z݌w1b]fs5q~!8 LH,&ji-ZWT!Z$hԾ œ[err|3h|0A@;=B>{JѨ@B+,FZ 5k%Hj0X 5`e<=0eu!hUj? vMܲ ]uQS*` y?=U L夛>5tڌ}3؋3g8fuv31@ypjǼǼoJC*k]ytsq;s1^ S2l;P'sbP<YoJ7)):HvuӦ_I(F^ge7nw硷I#ǶzLŨ%,$(*^Fp:D+P3iTPQ =ۓpu>`Zs0A#2 '? 
#bW6BN-fye)Χb:Ԃ Yph_=d~=; gPXe{J*NmtکpE)c%I"NhICHDbM0L '"Zh;֬l׮^fݙzMeE* Z 7M묊 V(dI_ z jNE-W}vmoF} uA3]̓.^ .(`TBt6jѡykE'֦&2cDu|vhDT{]DTJ%+21sWV tH%Dӄj^ bۈJhq"BNd6hJ4.+}̮n{)X5a@W}"(":eR֞U #sZ⸿um?Q-,?~{G׏zEq5Niɚ$ZZgB 0P$@2Pzf;8f<ٱMR)škT\ ]I@Z/hGS rYH% eb08.(tZ/IIPrΘKmgz0щ{R<-b28gNB-DVEO" iƐ^MQ$eimM |VwϬ2҈$K]% l13e-GL%"Atu:$cme)R2 ;i&pC][Q* )g!IfFq jN~ 5QC5% @!YJ9D&JZLVB2#ۘ783",ۧ#gC1f*Ʒ,u%Y1)VRv8\$,ߦT/D쓬A^Shcƺt~q=h!Zil{L~Յ5՗%]ɊGD->Y$&Tlb5/6Yl`c3Ւ4J&@v`!i,@y%5)wQB>sܛ9͒D%edTdUWJ9*Oq^*%&&~ !ggA1RJgCYjoy!cp Qi+qv,_yoE&BO'^t*Q,9-l!!BA&$J'2 x,GJR̓:[xZWUKEJqim-d:yg9cȁ `9?eE4AR02H#ʨR0JFӻZL!uO[;+r /_ɣ?7աyAZ~I:]/3?EUs]u>nZ|lX~g] [`fHqxJ#]F*\CjZFH|u!^X*f,^]<{O}ln#$D&a][\ hV5?ILO>T=Ӟ*Y|q`S1I$Su]Ƥ,ov]QZ`0IOpX:8YK_obxDh0#CT.Z$xR7!G!!$}&:*}mH>;/LMb6aX<;^ ,_ʊ/:bi}rU(tF,uBw1ESt njGr]gD+r}&l@o%e+QԎ҄^hH,Uzz RhڞCPؑ@xu9@ޓ%L:eqR3qvg_ 9env ׳s"YiF }v[=:,YvqnR)R_4AIDWCl JNu9v@mĠU{ItdiΔ (0'NZAfK^0WW x (0=EKtpM( ט3,ZLF8;@Ĺ_2ܳmUK}| P7L^vnW/`Szkzn9]1#^}&Cu S8t͛κE Ы )Kb=W!;K]BN#z=zl=H#zͰ4_V~py=(fzW&*y|j_r[n)dzmN^_mm-TR_J[^`줱d4X50|A{d㚦8[J@ɩ&,%L1MuI36a}J3 DK_o+]}x þuYZt'ccF,29i4zP{9*'hʖc*da\v۲( c^ )TR6ʓ- DT=v3q|Jk!`l+kCc L 1K#}*]#:r K ڼMDжooaRt#dFE'-"c!dH8flLa׃=6 ~ec(G8zyOZJS[ĐMGM1JCdH,lH9ڲa{Bg/Ƃ"T?)&dVeG }SZOk&׏_ud85\g3-/~Q~qi|Qk hT(2^fS'`,9z0-CE]B`,R2R!Pa3bc :;mU>ȝc >zG_8/w(Q{aTn2eA"ǯ>LsgvG164(tI[+,uHÀӅN0挲Ĭ !s !|AD!d@Ri_e"hA3t&A B݀q*s(Z eE*2*b\š+)+6< I…Poc!e ̲(<>먿̏0 ϣ(Y'=YӪ]'i$OdOϿ x8p_ ^VXY^9/ڔ^=3qUs9oH_#yd0tՅ (Kv_OC(?>>X6I ) Ki*c)~ WE/7+F5hMKq? (89ƋaV~fPѺ1r20Q}~+dp/|VbHk$uD&! 
P"z6p " 5.r0w*X|!hɲ ;u^c(,g&㙙k%ig5*ųlzz1+D'<ԧirK+u1M :B+Sd~,@D9 _?t[H ^oJṆO.!2<ލt<04ԮXg[$=$K Ȉ%O] }amȆd}vuF ڥ ̿Eqk raoU/yni4PLc_qm:H'h]۸j2/͜58avjj8g%64H;kޥJaA\*00ݩVJEǸ?Oc>:yy,![n5X<%+}S]u:g ZGK!S1f&՞:X^:bS}Rǵ:Ggrty.fzx~(^Eb]{rbR^|8aRUwGsb(WN{E<% {[D+GЅ( ;e6,.b(1=`7F$ 35RN6G \ bf_Pg5 ]Wp+t=ȵC& :?7Zu0.&o5:Ρu1:/[9nn>uw8Yt-sh9/ؾuu͝'+h-d<^zmdU YucoxHsXؘkhΧ?o:Ɋ\ojO'Ku"rc{͹'}Ztp)q9#{s!H8lѢhSW6GJ^S.$&bBccD ĜmCHJˠB}NwI;56NÁP\o'$FL c ^8E"@% '*Ja1*0pK !!Gʍ^ywۨޅl8*Hts)+b^bΚ1hMC^)J(5W&Kʣer3Ϟ70qA41Db4` PCd : N:*_'_̒C{r^Ρ It& ;!IԹ!I4rIuPT\=ox:H~ <_m~9w7lsw#"S]ZtcjĶhsnވSi")h-RQ>gT #Vi <ũ5 kFkwSld|yPžw.smNn|TPu$]>¼̸SX#*=_Zd.?RyGE00 sw*p>i@!HBp"$H)Z(Gq:u[m$u(`5zLme5.&φ-.r٦TNE ƽ GmC^K(K1$brFc@FbC䩬^@QpgrD`p'n ] g/d~`򘢼þ:#.#K2dLj.fa')p85r4goCIB2/ :fgb2oMFHQOF!\aUzIGͳlip~UA{5I Bޖ_ċCV8TTɪ+YYd"!\LFRO˷o\LJ/. ;~}"S5 (|(ͫIb%_IbwuHһ_aijKQN)9KU:1O&zGwR'ݥߜmJ-r4L\J uqA92y0>k{rߪc*XW_n;GW -m.-?hvP@˿ݼbz.^XQ׿o+]$|.v\n \>lgV[3Aف@cj.va|p3MTn]Ɠw*i[fڗ.\"ͼ8O@˃w4GwF/[sNBu  _&qenF&eO[*  JkQJ' ޼:ii;^/C"z}ur@RS34_T?{A=/xGzd~_G5zd~Afv, Na.|AzDAĈev{JVJkgKq{k0WrCp[Sj+m%smivxM0)F"_ba4;dp F/ed&]bUK5AoY6 SHy7oMm1>N6Y2?4MlyQajn;K%:{k. 
NXΥP5ڼsGڊw=A(3sMh7;^R+ԚU@ױ]}j=n-MjTȗ zdV֥hγ̢g3nkئ~_Uq|MDqsG~Zo=U=t:qOktmpn&)Nexyӹ.u%ꭆ]W췮Yd3@~*SA[V>v^9x}]k[Fг㼻2'۰oӆrbC G'rپVtզWReYR(npɏ wvNyUDV;rZ=ǪRiFwUmH?C FHtuԥYt>+rw݇kd\1Nyn8ؕ|/JBӑ}ù6]t#G R7Gr[Gfw.wdViՕ,3`.n1$}pɐr]y ЍMù#9ӥɭ8+)9o`ܷ)PVO̖n fpj G'%rѳ k7E&C%٭Bo&궥!/\QEIxHڳӆ=$\zi*y6\γj0!V:`vonPV!4J765Y1ؙY@F>.LvήOar3̧4 B0խϊM(a&.>rܸsś%*nX%X/^sCW#6ͿF|ڢ86Y9\(9 2KVU5)[RR,F`y>\f;4kT$pJTT0+,q01V\vzVjbeXH@.8OI (V9$%¢ľ?ץe r2H(QHYuDw  4豉hj4BZ"VH(Dl[AN7/F t,ȹP!#G#<<1<)$<`fBQkI- n|h!Ke6FBi3r'4Іڄw-xt 3 JGgܕR0 Afd{22jKc s`6xTD Măt b6 6S0).A85>F%Q&+qH!5GuA&v7FN!9rt$,Ѱ,C_U}{}vLur>:x!1iF&tmW,HUFO{q#Y /Ȓ1'B&4O/Ƨ\|9i*ҋqZJ9PJJ5ODwIG~ uľU2SSV۬VhOip xDD_xH,aZjS6z\ FQͤ'>i],>Y0F.\>E=uk,R rSȘgE¥:ittd](TGoOMw_KͲnGdfɣ^f*)c5.dSk|FmxJLv (*6 :ـ-wX q99ݦGDI dr(j]v H2Ɛg-K75[]J2֓ hـ 5Z0+3\i!/ +a ݂4VXʼnC5,(bE(kGU (|N0¤T)SƢ]$)'-ljUSAJJ#LeFŀv4l\FSd|2&d]g MW%e:|=cYb2i2"zbJ˲f@57J 7 c:Yom@ 14VH(,@6{$HMwآbQC(]e֜G41%‘FӖ dUt4ҕjI#J2a1'8;্˜XP p΀4D. iՠ(}ڄLkptqX1G@^BB&>y _V 5ȤuՅvTl>XMfX P? "|qgE3#md-ŀb$S*R v;j˴L?%di"YrBUF+pm e,{Х)G;ɺ'9 (T—%|̡cLBP=n;x  ua,C)y*a3m>%0ZuIBvp ԁA0]Zx _3@[&˽e;VAa:(%Zk?m,4Id(J+12o xP*ZE"78*8\Ja, 'a!dE e#@ DUb|؀?Hc%td10Mh%J"-A*5.-z!u4fPNJw a:J؀$`>WAK0A7lmRi6}%@2@b%+X J V@b%+X J V@b%+X J V@b%+X J V@b%+X J V@b%+X *Pw@0o7J @a?%WJkY %*B^@b%+X J V@b%+X J V@b%+X J V@b%+X J V@b%+X J V@b%+X *"@08G @3PJX *$I,+X J V@b%+X J V@b%+X J V@b%+X J V@b%+X J V@b%+X J V@_HJC?%gE Dz+`@_HX J V@b%+X J V@b%+X J V@b%+X J V@b%+X J V@b%+X J V@@o߹ޫýՔz=no76Կy.vPV7͇@=裰f3_}Za5^u"Y)7z6mlg9aٞԇa{yu e=]=zK?,4岐ozytJʑ {`-dg75eŒ̐'l+D J*rHGY~V',3Ak -ڊD ߣ-޵6*m.H~h'_pLǐ ? 
;~4A/da^ܠ ̯aV/ }-GW0MZ>jX_(H y֮I>Yíu;odڋV}׫b^^[0m!1-{.-ts/^N5kpwt^"}l%2.a\ g8RG"΍8Ak32'D;P cgm= m -:YʫP0ϷTgiY"gA?_f1??7mեч>[ra0Z0־'oZPK?-Cf[dbx|=%3BɍV TZ!Ϯ϶#Q){v/0˳ݱ_EKUH39E]E o7 חpyyB=Eǟm${1OK`!_ڛѩ%Vl||0vok"υ=VB_{rڤ}D5–ruq]|?1m<74m\*)$Ѫ˺e𯿵uZoðL7 :x6@h%KyĞ];Fw曗ÏtX]pTv䮷!-$L8ڟWvx*n_|7@o[^UݶsׯpJge ]6KC;^2(ó:FKO4OM?5~Jjs.MZyzQ@;Rj"qN誾Iܾ2a~[5;*Ƈh<[Nun?)u~^bxk|d޵q$B8;R/aM$AfcdHJc?UrD7m~nr|ݣ?a͆b}ᦞenH-xg Xw1o[39^!n ƹl3)t:- 6=4 ~\"L~(RAՀh -z"HEs hp<T̳ -c1K^\fT<HjY,CjX)U+0T|UKz8$*8S,iS1ftʧҞ6#,TԊTk%f_j5 FSkg> Vw1,%Ո/6o ҫn:kwP@P%S?fPaRi;n˂ab m* ,ASo<~rXA"ޖ<#e9;H:τ NTYV,c%Rif(("W .֜Dٟ @8F)K B#5\c\3es3*|a68PB9}r.%J6I|H7myS7Lo|@Q s=,9'X:B~Q 1" a 'n*܆OP&ҥ.y 2Bl)0UfMێ~-l ͥzmkkwvƻ`]2nNAbATaebnw:1RJ:1"@79IV&g "d@%h$TO%H $w&AP i2leP*#&}clG̦zD##vqs@ЈRaMƐzƵ^P-@"i$,B`=RJ# #fG -iQW:iɱ~f_ϝFxcu! "XSȑX><X //fӎc!8 [kN^w1xk_DWNYga"EYK3~OE0i\v~湄v@/ -a/skY֔&`݌YYŮOc+0161O+xZ߶y۪WϯfRVL}9 X4]ED`_l'|'t{%g'FJUMP*& )9F=YMdȘDD+ 1gΈql{H; 42%^@`LCb_"`N[f[­Rii !',u!|\MBBX1D(W;mTkyP[_#xUK$0, )MMN\ ~2ٍrv67l *Rb T+| >Q;h͹cI'N9TR(-[NΨ(1F,yQ, kFΫWOn[u@}|)H!9WzJ/+C5֑}}ycyZQBSҰiO(& A;bz{ 1|u5&̻~}%|A fc'oZ.㛴8tpq_z޵`!μk#@ů&gE Ds2On^7;Pҋ\?5:yڦPV" F*::bk/*\mo\B] ʯ{|EGzAOBq(+oYfKP{כ͇]Ff{ORr.)U1/"$W֜AVTiF_Tjѫs=*%Z䣽Q}UYYQ'̕6fISB`l;@/0/Ku"7+r$^* w6УdYyҚ fnֶell20Fv>n{?ʥ>߼qW+՞h_SՉ;];u!r&p (*S^O t]Em #c$"8(1=(F$!rR(tޤeY=jNpǥ+LZfL]ͧǿrhӑ{|bl?uńvN_&:FןF9t9f$3wF0r|}̫'j̼6r>~Ż7uVD5y0j/63)NzT,UsqMOjrNf󘱻ƺI7ȭ͟(6clψ\ܐ Ё҂1 YH5@pSsu]+6G9 .D? 
H/[|O0yW_KkTw#NuOC{p !f9 3bÑhD l]LY8 ù Q\hQw}3rQjSz܃ `iq<[ۏLZ.Ϛc;lTԒ{ <CNF\&`(h0XyFRyx˕ws O{RzWR<'[OAXq 1cDz 3[yNɻ;4J#K+/xn<ˏxSF)̴` yd=Z8I1(D+GO3̮|zj]Oy^A"h#(KR@K  D  `"0ce5Qj 0Hz~Թ]=dzz#sZ*!Abr`ʋhFcfpްlVBN}iТf!CB=BfK9x2j)*W)+K!p+4a+M#84Ռ(L.QoF߹cHP= Tܸ4'BV?תK^sbTX(U-t`TB9k f)DɃZ}h6r37^o96_ q))hry?חsj] W״ZV 1nmΝXbx6/ Y"}u=}˯nC}H`m}?-ǃ _Bdf['(d9ךόӈj01G76_fBJ{ Du> z~cj{>)l^=n^_ &)1:#X3Mf1&S/^/% uy/*5xS҄ҿï Ds3=fK g2JNpu=@4>O,Zj{_b4UߔRg B6K|$T^};E;<G.;^2scVN,BAgY꥙DWP%IjE/57QQҵd0qR6LjRhаi1Ńu'n~E"Mug7 %[z5c1KHD 昐jN 6L[D_^Sӿ  iw2 yp.$KpvTJL2,E,qd Vy^@/ih[M7j[Ym>1#.ljs_('uOT_SB& |c#h{OR)уwRoVցr\mx49BpU(ߤ3jQrsis+\^ݿˡ^+k)L@yM>jb wQIHQ$6(4S[d ΡV[qC=}n_.;Q-[Y{뉌.Q˫]lJZzd1AI35,: 0Ϛ[Tgv͢;V|%F-uR% Q ,[i|(Zs13DmJA[]@[Fӎ^Ɂus7+:t,([Âڧe5T!tHa6+Gihڳ|I..ߧ[iKwzi/*bbCB; U`%L{Q5'x R[Mrw.l5 lD!Ǭ33ڇb[)d6YT'%с1ziIlYI812r  8&u"+ fh%c-gPZe-Z?,m"[k8Da9qrVEgdpI@aHuB'Iܢ0{)_( ?M m[2x9T!֢b왔%$jUZq6%7mbI .AioeЗ-i1EDl8(._vYFyD|nSYXd1`4J`Ph!1Ƒm+t} r&z+WE>UY.XqYAlDBKaC2P7+!xBtѠ3T ݋G=XWp~l+t z>- k閦 ģ@,`' 61J2W=dyE`k0↫K6DxE Ga|!※] @ْ"%9$E2jEBI)K& %#X1KP7{!Je  8~"#KhHJ:PBF JkaHM`:9_3-Wi"G4*frb"8ȨI9b%eItJ H.HQBK15EC}m,) 2E9@4XH!2H"Г PE Wq_*-CP4d]06aQe p)F(<͓*pZ6#OԳb鱖M)"4$Ɯ9dd&JNhe$y Vf|/Hui §葅xҠ3q% (hCJ@ʔ3aV^ 6Q!u=P&S~0 ߝ<-s&@x[&͝ ˆw+$p6h 0.5/?BGA>L/k|lX^i.$2:y vyQ}t:()- -g\2w˸S< \i5C[9}d ͲAȌśtGn6gm=u'L?G/~xv9J4ϗ-ꢃWkOu?M&7Q`w v;VgP0Wz{0Mʛ‹⋗b4apGãJw9+b6c?v-6[:Y׌X ͬ[,_ƓVGQ,+d;#]O ɺJNzlR0x+o+؀d(Y̊Hdt>6HȺq ȚAkQFT٩j<񍄾nՇ\t-*Cw&t9tН夤?e9nr?5xd`\f24nO)g|&L`R7D*m:}z[HSۤ3"hQځ~} )掠G!YBgI184I1[ P/N-\\PFuu}r;2|ZpPncft_Y ?ъ {էi[-H^_M|n~O> Boɫ~3}Y~uWFţ: XpuB{%miV|x:+6mƅ5JFy  ja:X VQݛ'3[)Pz+\٩(H` xB ^' Fn &:ϙ BƔ1NUPR).T>J1T0l \ge1 F&sȹ u] Ru-Sҭ_*}Wxkoo?TK?sD@W߼牠qC4RFőxS d4:TN&% I =ysoW+޹7kwۖ# '̶> Ԅ󼉨g{9f G~lG~4GF2A , eJ1*"2 )2Q%.!-9lټ*r!G~|_l6#VO%^GO'nw.=:z"tdʲ}9b)BVA viMtTLVy9‘S[٧聲OmEK}&̄krFsV&fp $S-D5LB e/\F 0Z*-:y] <DvJ0LZ9@Ĺ^%de"Lxĕ\;rs`iߗuKqcCf#ގ>[bKx&^ *qB]p4OA `)L4q mO ~ 0Z`ֲ՜Ώƪ#(^3=RXU%5Dhfgz}_d`X:,g`:؞NE^螿ZXPdv:X&=򾕛~Bm`P|v.!E`>l< 50xˏyy&A~8e rGdY,8mM31& d6RNYA^=2yr\ϱ˱WOy-tZYgn6v26A\TpD0hϲPv%Uꁕݎnd|x=xY|z/>d:N 
F69cj@m q.GU2$&Ѥ_#ZԓFHyV/3׻w8O*e#&Yci}oT"IpIa{{ k+5 Թ_ ^U[kgL&?xhļ7Bjc%+VΉUjS+^>%$*yDRut(0vW +qOr/ b,V㓭]E%OvEtMMRצēց!]\;M+1.Ypk9[U+ߝԛZ A+m##'XϴZ$iW#ᚪQ[S!֦, Ճ^]c94$fFݚV qV taGQ I_,4Ӌi\g<?M>;A,bR u(%4߃d2hȦ3J2d)mnQFieb$ AHQԦ&jjAd2v̺,!8MeFyZܭdVL̝Ac[+kmk^,LfiBFkT+e<6'9aJuF5KawE#Vm5E{x{@u9^tR@"䓩!1kurXΘ6@w-mI 8Av~?|v^"Np9]m1HCIW=!M %J%NOzDƢMJS`)<UTThВfBs{v.G[ܭ#jz*guLָd_[֋Ӌ^Zo#Qy! )}?J@ QdoY넒L!PHӋЋǢ[}hY*l9!c5y7I?zY0:n .n[ҝ̤ݣՖ֣Wr)eփ_R RB@J-BZJTN8NrYQ#ЁGPv܁[ ^+v@Ƚhi?҉6ˈO^\T@’(50&8jrTK%W1<}yIĩ!rT*HM q.2n!A9?|LM&NOUDCi{H)$I P{bX@JT Dx:wV!4tzTohbE])r"1BIt102Jmn?N!&Z9iDiI|ADjJ=PJX¤9 )fl/|q]u^%nQ̛BQB'Ez7.rޛW?}7 ׃ LZ*[uzظa #Ypo9ah Φ/KJ׭[i R(iroLKS=L.jrQ -%'R(TB8l!LV"`r1%Fξg}6qqmټ'=@Nv >L=E˕ǂf˶v;8P!9p/}1.: %vW8DlRӐ"e~??@=[u=wypbUd NBn5!-^OK)rgūǘP# znyYO14~.4>}['0V۱ %$NX$ }-cmWFU<+f2(Z,XnB*D!x^;Oπ/x]3DW1CMQCL*ZbH_/IVqCF&fˑ3طhJɍ ڦ9)zLڽfr򵓩<6m'Sϯu6S/4SWmAVҪ!JBS\6Og}g{= )[k48tq9l/YO#K}m{k'a楑a<n%x]x~!+9NvD-=9k6 5mCu cז_+/hO.W:51^k)Z]T$.l>L{$ǜ BedӵWO} 94nԗ(S w̕ O"ƒj/O1^ N+dIœoe@ `wi<>Z muxxuTo^'\98^ۮ`1T+|V!8nrB [ÅMʍb-q-ᝡ hϞfZt=/y _283iŮ䞯}H䚵xpDˠ|ZI_IJ)m-q4sH j_,.M8B,XH{"1jE8J$d] P ptAKo@  E#qqd9fD~aI0u+? ާcvZ>;_]?h24Y VSEn,%aD'G1JRmx,Exg~-U:"G$3):5cw(T nȹ[g 9o m:_{Յo Y_oSLc7eHBG(6N69h&Rr})r B*ΥLϪ~*#zEZ|blWlJfخ+ {J~ .oz$e0Q;{I氪t}\8.E [XAK'3#"γ$+ ^bd:2kVRBDb+&* O;g8NlT=ynh!7J" ," CLDvhIj.YiV4OވT-qȚKQ$$@y5:Dq66AqkJz,>(@Y'IV>"uZl]G!,PԶewt۝ȖRJ< l)¬%sWo %Ex+B*-@p1*4bʩ uЩWLo=,'/ /ioůחͯ$E4RpJޗsKZFd W0? 
7z]!gh~4N㷧z|>T\sx=X<֥f)#c9 97qY\fhR䧿9a2K,-M>"͇N%N>c\Npo/^^Ͽ4C~~sjn !A#}ؑQ`ۉ 8 b3~J4D؞ $ΰfW8ҠA9#D!?ٞ;>Իe#.q?btyhN{P߰ D]&u5JvueL[0/o [Wϯ\/!)9`TFɑ ʞ t<"Rg /SOTsfا]l2uh 57C9r8pg~&X4IA{V7fR{,X;>RG[-vԆ5VRgjRqo6SG!Dס76w>&P ;>}7#yxzgP $.Q)LS r/gt3vw=:6E={9l ݸg_pNl m lļ7lC0W&{$fC|15YZpƳMuPSP N4]F\ tXLI tFp&:GRqT@wxx ƳZR BeǕLi u&A Qz}:qR9Zot9gMN>^^W9lN^17I]#%| #R%//jUs=fϧ)F@^C t;T&'Oz XR˧|SB'Qr)ڹ(Vrվ]|rڿ#NfS٥z΄y$FP)ԃuu]/#_z3W OAkJ#*\iG"ٖj;Ҳ,EߍX;u\GE!0V@ T G$"\RJ )ZiWqS[s޸.;!"="q Ze#"j#xi4x#V9j7K5qY2iRc"gVq㉝θPܡb]l/-6zCݝHF2`IM-Nz?0џev%dp{7{ao<{< c?'a Lu>ɓ{=bWi\TBH@  }s\_Z"Ay:U%g{꟟_3[}*2z;jtDX MզzMٕ_굜*fRw&Ts Q\ÃTWEo?{ Zce5q-;h*A#"T'r™*1Q*3UޯSE6A ӣNBe\ q7XqGs1+ŷ Jn#!rTjKP"-DT윮ҧtN|NWrp.GKqZJuDaGEхӕ(J胴Y[&'u<u"3)8Rq }>ȹuh~kJeB=e91r㫹YQ[L2|)Kϗz2ʛbұ\Lq2@Isi#K`pJEx,`\t[Lr[Wwrv<$QҺלut$^?@z6ܩ;^%r8 >ȑjPI a:*e޳$4tJK"'v5rV-嬖b6O,6"֊gJ3!%A7;D A Rh8YȔ?)-#?. 0RʣL\8ZjI'K8$DAV1'6o펨^μdd҂j"I6uMn ,)"U5x돨Y@VTN@ib!:P2E'rdBōqFDn{DˋX2%_1UDaGnc֦w\&B: +|Lmr(P@$kT4"8Cģx,XGWmbt!G'8ogoT-@j $r|< G}xTA%ɯ9Q6r @I&uTQl.l*k)QR4E4[0J('(9 Tgq`ٵO-IqzBU6 vr 7\ 48QN0*?.λ 1R9zq>5EGGG |867͆qi[`K+Nq PP{KO47=Mʛ‹샗Mpv 113g6$hß/'uvx8|>Bv&tn~Pkiʇ G7bGE'N赳2yCuͳ "dCsu8<@qtQ'J^Q{^Mb8zày";w_|uzۛ?S_Û $z֑̽I^7k_omeB׫ϗWϭ7gR+>\+r4 d2 0 Rw6MF{~6K| 5lJ\l1%yͼ?΋5Pb7P%>FZ_d'~[Nebq0!IOL]4K:C7خdءGBG*_EVe|quX88ikUAhui\יqOIwoQ;Qh)!&5D@Bc4D fbDUP \_RN8Y&Yfkax:"zuh$I#AT"#g 8mB;-b)|l<ϊ߰qyIOy_&a=|h76.Afa ë%:8N]' F:1]x;^c?:6`֕2A:g|ʩx皵^o OjߍF罧{ӳ^#/қd=yѓmߟ@neW^!<+$Z"-xr0mt:Xn ў9w6v~iJM\ksvε۹v;ޠ2˵X\ksvε۹v;n\-ū\Q^ڲ'Ԇ)zFܫYB+e26jR";Wkѫ$3j&jPFC̹Z ƐHm+Ihc"13j98o3q4e`T<5dnR#gܵ*A߰t >xAH.V28).([lT^i%G@O&DJX+m6h*r%$HBD* y\k| Z$9[>KE.l/ZU씠JWhyk2Q,z7;n2$7^-eZ|| Q3fN `3< DZ)9pd|i,Ya #)V)&C┡>;THƱ`c)!1/=M(5c1r֌J1]X3v҅HI]Ӆ҅ M.32^lz1pMw(_?$ ^&\MqO-PTd,۔h([0k_C6l`;8jK%NLP m;Mb`"%.Fۍab]6-i; R?{ƎdJX n|?.1f&k"k"KZ,oeKli vd5E٬bUAH$ViZ[P>'e^R#(\t3$4Wd!2䊬hɒI8DddaɐjȮi:5KɠG7W#55bfܔP \(Md#5Yd|2uS9 a"8i2}6hEτЩdV&HPV-dץ:,ѣz.\lezQz׋V|kL`$&g'M9a!N˛,GQj䓩3a%Wzxx,ձ>T=3@HQll6t;1;?=qF9G?>SVw3շӟA(Vk߮A Cm-&Ej H8Ԍ0g9$my4.;Gī+"Ti_e,`% :j "ezsRuٿV7y)q8zPgK–yɚ/ 
Pzv"`hv*ļ]QٔYQ0wRHa4mZRdPPt,ĝ a!^9Gսa.q SF鍮wByTEN[. -LڙmO|x6O6_H6:\m2&jװv1y!06[Gj-iRƮ;nϱlv2<3:r6*!VM$M`I 3 ̉HbT;EȲ2+=&@ac)tH i:i0hv6xdɄ.Eֆ$AY(䓅-<6̫K5d+y^SYU/ .R7g}U,STMa4X??Ux2~Mc?/)2OtQz޴lROoy7e$oToN/ p1 j1VUJڞ{imxoIvreS~wHo;j'RKaaMBEiY_תٛJU\Wᶍ>ɚϊ1~Om*=Qc-:v\5o'GQqB/U+LQ\eMWޮ%oQd/?IgU  OIG-^KI閕 -Τ^>_h<& hAj&2fMΥ0a})YWW7wu9LI"nc&),ē6<ז\5gv5OPLgZ-#.Ű6c5YKRxKZ_NH@c??^OW#{\kW ʒL,<\]e.gvF\VFzr(O8'iEIfn5HSҳR%Ȅ| lbC*㼑m;%I IRA@Wd%- Q9#ɇ<^Ou; &ںeBDp–HYH,g3@"EdZ^w@&[څQ7_ \Ƅ4t.zz1UgZQFȴ6*Yk<@{`^ƎH&d.Ȁm-vt`獲`9qG4l9E]l7F)N.]h5's"ZIko9:y@9)]O4`zV]mUgk)b՜nNXtt9i]Nsbѣ&yv,LѫalzHp\~ҨR?^My[UIn}>Ϟ_GFbZ:[orF{5 +qF-nf#ㄡ%\[.[lٹ0lJ{H:\lHͦҚ0F? iIZh{_ h)M5U'䤙b*&giX9ݫ3tryq1})2bRͬ2΢(je5il &Viǣ?(hM5 X6"v|Ru|9k_pL22а9+5@7(nf=d`1v6p#e$kmF8(|tf.#iRlzT?h2RU f%cK֡5 7/5ۻ.~r$n`.F*ۚZԺҺ=MyeA>mԲªwmz|΋=j}=L--nww.k~'Fw@dzVA;yn\a*OL^m'/=,[?Li{&OhΠɦ P<j)V"(k B }AwЂOACY}?X13JF3! \ʺO.oML RXHN:-KJ14A"0S:^:ƌ .sΡFCv];cO;҄FT\X")8/MZRdI*xJCˑ™>|`۾<-Ai!,J ҂ l։ $.Ѥ -RrS ^>RXxa:[ {,+?t s+I$گ[3csYPN־[_M}ȸQ2P;f\A gҫ¢}O-?^]96$6оDp:*%Ҝ yKzj8J*W漣?JO]Eyrs* S67t4܁,%FLtI*F (-\/#{Dx5.U\o7y:ʍZ>볂bW޾y&;Y)>gG%bYbЪoJMH6Xy7 `d氈)8Ҿ[8} qX# (R )YÌHcEV.3Eɸ7%/ghu`%rѣ_5Lx1\qۛhj0ƺOى{w69~: tDx!eLBk"g犋BbEA!+$B e |:XLrO,2){Š(8 BY/4wYC)ĜlcM|wRk.ԀC,klޗQF[lܗRZY`GXc]7t6][w1E9Zoi ̋Jql) |E@irx@OM>}Ru}9iH6$g>.IɗTBXPrj- ȉ"! 
}±(0,I3fHS)fA{$*KT@-B)Qo= W4+ R(lKů.*o}h3'8e=s=|&i!௖]ƒH c;&Ips}X-rO'{yٖ>>b l>s!?7E2{ ɟQC*+#L#2Lg<`֛ hRJZ-ρ񿇈m7crB>=GkX] |iFN<`Sof*;~mR|'OOLƶ( 21+Ʌ.R .TtizpP|W ϠTJGEk# Ȝs:%-d1FW:x )3 LjI(k>s~yW !y<ہatj0&*Ʌc@?T?lz4O~h|_џiBMnbQ{;V~zHyy ro$ ,_Ya5Sk9ra!p՝-yi*mŠ˪MN{׷R{[b8E`;XO #N=Rm"iUPΫg;$Z#;f0>GNi<<U_mqǗHy=HubLq|0ڂ^4r{fJ/6pc PAnؚ[ݡl-a=f0D.g/oH]w-+2m8hmP|7 MNO5Jp H $RBNK) yx_  J[p4D4E4 O JE2;eJ[9U4A̡P;L/cO8NJ?^9m 7*r|gk"]oQ/{^׼׼W ]Vؼ$te,':O:cnys>Մ:ovΠHP2W77?{wH1CQh\ ũ3a:O7nzoJ6M9D9MFoGЧ%T~7=˧;ۼ d:+?:/޾Ttd}NNg+Dt45Duj~?[mP/>ɵSH)"y(L3RkFϤ wArOJ+ I!:2)uV.O>_<\{5jE]ɾ^ϒ~r}Ծ]rxvTh#a dcSF nA284uh$3xE=?CRڃM06 beWUQw%I/@2Zd F+nUp>9b3m =*ܗ[²l *Nt:щs3*^Фb=j"%%`LNv3R-Խ9ql嘥NrB"@F{j#"j#xi4x#V9j7K5qY4iRc"glе50V^l߮oٻM50 &Q%>ZJF^DH1Ј-="t|)7z/lK|ekB]6Pi%"8I5.vAy&( IJFE#3yG3ePGk-uȖC~N8 Vo{u.al2K7ģD>i&lRd&͖| &ց?[ @I&R\PlNi k)+QR:y8<~vB;l 7LD7YҁHQrg2 \K>6rh4!֐ ws"y.4)c8 (\8xT4jC(Cd(dmkmaH~u:8?^-K=) ;9J Za0FMj-O>:M. !4Hq<(AQ/:>(+R|5ܠI\gJe`@i4 @xaH"! pwqc?Vn>\Hp/ bK^Qŝgq2&] += p@A hڄ|D{Gctr)rD&j! p9&D!KHc6xK  iA,jU9yH S |~Lf.H*3 Lg3_CjbtyG2Er20>,69NY3)"w Q&I>}87ץWzm+ybx0nǓo\,O:10j=(k#8:R%ހYQݺS55tSRyUx6|XTήF=~S{ZYn+ AD4e^|b)rH ڑ@\?źaa@fU\?(y`^,e݋ vTFnQU!8Ή ϏQJs:F:|LhgPֹŰSE Jĉߪκuwg/^:v׿:{v2} $z G@`fg~ ;pe迾iWT9VV}4^v>'zlΤJW|>3r4պY<:!Gܻҕv_wvAt[ ,װ^(^h95:. 9Pvj@K%viY[B~v( IzeBE59KJl2#`i08I_0=qӞҲ$va)IPт:@;Z7FSS7ZIihuZ9٫p+ }tчLz-fY:7 ѹ\% ^4[ [o"Gab2n5) Ta92fء.5=Y޹" $'+PxS;a,00|0gmuuxGm-h ;A784,Wzt֛fpzen6DHV"EgӼ@ﵛnrz 8ېLzM^6yϵ;&WJ~לrXYu=bKQz)*Dn%Xp)u}jr7v;n_LGq7p˝5잡mu7(a'ɤJRk u$o3[Y8K3@2kÓ*œrث-czݕXnTb}ۊU<[]+e|OW7Th1MeҜM^/9&pV3i/^ R<©_f)Hnjk~[y"*rI8#(Jy#Nnd{x2-*ɍ ۈyj&bvB;Orl5ghj]u{Z 2\CK$>(_,~K&VokX[ƷgSͦcGTlv̚a\#Ue)Yij֢td ,X5t a;yWqm?4_e3@U QF/wD@'-LJAZ\ʁA:2]:Xi65hlD\ ̑EUWfԖ5#%h~GBY'o& JUu2R" B՚vmQcspPZiۓ^kyoq/ncirQEn"磊1͋B$% !`BDE7fͰ,͵ olK7!+J$W{HV*U4(yCs ~XQ|QE * <@E&rZ5ȼ P>xmBV58 e8lw8@= KH@Ge@YLZ[]xiϿm@}W7ΏdAX? 
y E*ΊfmZA]֊N"mpXs)JD>"iu 2XkxhDPHc ޣ.O9gڃ^Z@mD%|uj 9ttIճE(vo05K r[O uIBwp 䁀:t kT([|`C"m!Q9MH={Nw8=_ (DhP]w؊D4cҠNn B/":|WtF-LB)* j#XQTbw ;ցi,+~(]ڈJ5(Z58)pcMDnfHkԬV)K;e2 d2 D ᪥hs:'/[i%Vh@V~O X)xn e#eZȝav^qWӓ|4vƕ6S]?RC:<3T-d'72aBSX& 4<&*ilG'U?wMplnN90 {rʫ e;VfA+!1p>* RuL'&E71J`B_12٢9uW[][@[]s͏w0-\g|nS!JogGUoO:_ {ᡭ<;:y立]nq!#td Dx:WCDW|D8}z:?ͮGm/{%*g!{Dl7Gm~p(A֨DU+Ĺ?Hvo~/ОY1Env[*n[z޿;//w4=pzR.'ivi3ۣ(V6bBРX"Orj":]AbD$a.66-|hd\UUForSH.b%RdZ[M %ҥVGv2frl;R. ]C#;,B\ac!W)Mxo*1H7>d <9ۯw^h`d>r|{ϣ[6I{>GG>u\ qMgD6'^Ffg3lB|dmdo8skj $outS4(Evz.gKRd.tкߴ˞Ũz#̹\4B"ޮ҇4*Ư1yaMp;ՊHCkߌU |~+"=z[uIUM+{kw^[;Bwsb$$bDȗ7n1_eqz_aӽh\B "bS¤1F3v`Gsx+U^ɗCuxIj+7_}":xkNO-?rq&AkCڔ-Zi]QA4HBd6qgqzvTzcCZ$w/R҆;ެЊa:Z-+8> 2$CK"IFQz*3̠g9}ԧH[0Xz7= 43CqIt LQ\])|#+Ѣb`~ X/S+8ͨ{+i@H#GKc8栫墼jgxUȴ,WnMk6j+rԔi@QAjmIJ^Cz̿.>|^ ~yf_=e |>tч$ _hW'<1YRS.e1۝~VN.~z7|zN~Zˆ˟=UX>h_u;sf~LS&lb6-]v$`yߟ-n`c/?ol#,=_~M[._~ѩkϐ<C^ZmwJ=?tv {p>k˧?g}֞ۯNl> t9m{=]}5`=~:`gA>f&[߿Ⱦ۳nS8SsUz. ͳ%޵6r#ۿbM.6,ȇlr lb;QƖN[l=,?$kdj3AR,uWOAKX \{WZz_5~]M@#O?Q&TN_'+lϋbe)n웵I;wn_,9anc5J>}eQ] w6?xшbk@>ZC;/HZKgKCR·o7w~KK./ex8tYbmi1akT6^ˋ,z/zOΗMgOy}ӥ\\N~]wi ,RN8vyr-zZW`O ݌w1b]fs5q~!8 7[&Zf$C|+-4cj-W2yw}_4>pʞZAɕq%ƬѨȮ bi &4Dc"~l`O+|jנ p=fl(ǎtH]&kJGZ>6ྼiz\6MB14u>MOL߷M] M[(4Fe:G)CU(FLPB$:\z2[?9C#2k-ۇ0!g[lu'WHzy Y^*SFBS1j @Aoa?@kOY"32=ezJS"ݽv*\QQL$)[-@' Rƒ#5 ;Fh-?HXS].{w+lk->.VaRhrޠO6y)7Y#E:Ql jNE-W}vmoG} uA ]^Y0e374HF-:6wZщl&2cD|vhDT{EDTJ%+21ZZ;"[iBLj^KN)Ŷ&EJmДh)E>[^J+kh$0b0V)4tyU:ʵNx}xFt-b28gNʒR[} f iE2Iki4& G~aiFҸUb@(@)JLj8GZJJIEj5ZtHo8e۰R2Yd$)N+1&8UG umE*(h^$$"!$Fq jN|e+4)#jvj-J@"C(DP%-)ڙJhQ1g_xFep ȾƧ߲e(6g5a )*Svm5j@B}5uͣyX/nb4d_0|IY]XS8=Y0_=/KVXA=J$jɲ a,6yb&ΖC՟ P\K;9T2I߅ d@0&c'Q/_Jʞ|(ǽ!. 
Լ颯4*nJnCVZ 1pq]筁m.4)_2WE]b}Ǯ BsF”UR S W)x'RɹI!t) QH#S̑EDsQB XV6TK6 x<M<0̴Ӓ0tC$`EIyB-b9SXɘZaS똢Ɣ+d xq$Yk]iW| 32W0gb h<طdt~% 5! G~„N2c,# YBh< : ."MuҤ)K8؆JV)8WKbq]%!Z(7:aJR)xՃM]s۫|J!1QM:ˆ|atMK~bxMo~}s)勻'F+.ٍ֯ܞDDX{Z~8zP7)~gF~6.o&><:oS|DTBq4 y]S6.{$3PApH*ư\Ber|?" &&KDjVV{&.y%SHtUE9W}\DT.zބh05{qݚG3~%96[^6Z,gĞ` @D?]ۨǐ1=WjY!U LL(ItךY*d}KlvA@ǀDž^kG5mkX .a9gd{Ҽ}j0ul]n[#%m E[ݝ#S3l% șCLܞn>2vU9v~fb<4R x p[o<gK΀LZ:WbŦ;C֟^_Aik2&d-!ZLxy͉*~0'{[i%hܹ=A3Jo ?ZۦwjvI0ocXv8z5,:%6XC;Gk00)1O0.Sps%6J!ydZ;hQ%8ۑQ<YɨN5:8z~;1}6=ې۾36v'zrVqx6h݇C؛jIa龺7 a?ً廷~2QeT,b;jSX }V#TIpŷ\ٮAXlRf!ţȷ#v}E2e +}䐳HsRJ<F`O TTiJ)xJ҄hE4[\1mR21?ھnYM`suXD1 +2Qʗ&X_Ba6?eZ7Y~i8'@KZ+(L-;Ѝp#>{FLJ2lI2kmFhGmy>X?GȻ&]W\gdZZ1[pĖ~ ܪ8;mH1cq\d[㸢\o'خжZQ#+8. Ma)T'ZQL`w˫F 8p(6[u( 1`  @ 1@[f63Mplvfa%FrE.q`xFWMS8%VFR&CFI$3bWtjn62?Oz~Ϻ$ٗ05FY\gd<[, e} E%ӬZΌM+<5~ uX=#!RֆK+uŌEWF rʼnV Ex$;! S=TθyMBwW!Ajm 9bO/qax@@6J/O\7v`P1<ufik٧a63y(E3n~}yvnLϗ?\NŽ_tWV:Z#₁mXKBvvpz~7_z^GN_Խ񮻪?#$vfpݏ̃tc{. n B_O·t99Xn3џ94 +\hd2@t{ ;6Np@dDTD2+SJ3nKpDyA9I&7=ܠ [70@WOg%_.7`lVttcss%6yŤaܜO1%ӟM\OsN6}ROa?uמ{_{4!!^)w9QrEqݰt؞g~\6fkC->]_"frZ˟_]wfgT̈~9y[ӓ Vb <'/gˉTrnR-uB*CRgHN1Gmp`V铯u*tJ_4NO׿x(1WI;%Ȥ<1) }eLs-u2) La^Bi^GZQڀa%?=5'ΰ%<9 2AALX/Qv,`l=[6ሞxz6o;yZK鮈IX3 %V !vkmKrJDJZSqN5ZѦM11ǡDU%=`G&GJ0_8i_u@٪m)he_l}cօ$2aCA@҄'AQF#9KSL4f^KiqU" 5B1F?{6 uuwS+x?TdoR{3J"[RJ&w?%ZINJEݍFq9ۻ@ƬAFB %h YQUR 49Bn,QLB[I&,6\NQPrL^s&\z[fz!%MF4GOthʂ(0"# BB" jŐ1ZlLS9gYЛ+w[mNYt z˹T=d:͏8 ޜ`N.x7o\_vjj}x73kinĦ2.ߏvv?voʾ}TJUI㛃@< x0x~ x6ne"k';ذU+Ϧֿǻ,E~"Yn'S{w>˦u_\ۺ0iO^LlWԟg_=ٜ}AI0ǀ0Ђ; wZEMYⅅd!`i[P ``q(9@-µh3=F0wA kYS< #΍:C n6U({붩DI6mSb ʽ=#}[b%ie.dM,w7A-r0;ZNcx@[=p)lw XtRx Vl 1*ZY=$݊uBbF''noVRJUF9=& G{+r ]8Îgί4 AJ|(Y]R *dI>dPԚ7zmcZlOmckǁ m t)>@{zBģ'8I}뮫c}q]98(r]Q!J]ѳ*$ ';߮@1[!ѳ{!6XZHCQP6\^ Yv{M^Nz^kA'40 S)d@!"` "kMTĘ:H]vz-vcTt 6X ,h V6 KG圱/Tz4qW771"x󾏓Mw u8ffs.z4-?˜ ɝ!V'pv1]z;[ Rsh5!_9)yFm8d}nu1(:uۨ"ݺ=M >ۖnZV5!_9)dGaLP[] ʰN6Hי5-jtkCrSw+$ʴ.eXnU[(GB趥[큖nMpW΢5x"{ܑnI VRSJҭ; K--EkԩNg ّn^nu1(:uۨ"ݺߢI-+ݚExJ)yr[%?̳˩U{`o} ;hX!ph];Z5h K)ѠWo3|ԓq*x(')]3=g5rqc;uv j?)&<"Ð vCB *Y !X9#5~V0("a52FR" 
%Vjó4CΨrۋ|ـ_@z23X@ihGE2hXjLǢV6_NMQjXӍwE!"UTŨ$%`-KMBFr>KQ@s΁x.іVTi@iە|O^B+[j_3 Xbvay%`,jkp ןg5j4)M[\mcg 62Z:W#t'6mF~y3Z5cHh01swŧV ZVV x0ˑ#b}[Gjb~]2o8rU'/+Wl&A,UpxߨU0"C{uWw Q6?9Kճr6AlzӁs6ۊ>)E/)qm!(-;5CN]T=)9?_;m_}vFC+QI ͼW'ݼCgJ=eUG@{ԡSZzNՎt ~}97ixoMBAȾgg h&г.s٧)-8uU<~{lGg&6#ڨGZsCz9kgaVYSm# lyV  z;*~(n-ۙgn#.6ђH8k6Uy L?(K];С (1.]9G(w) }-0 e0X*TlRh֌b8e(ldq<4JSXe6|ɻ : Ν@ C9 w}MpWC6$:]I ى?$1rO'9~c{/u!jNӁ^韛Ah5ۨT;焣)=_iEx )n9Xg+QH;Ը]߃ݸ];ԹKڱ =d[d%̯w*ZOdaWq~>y{'x-s7+gi*k9j__ߟ| vƟfr&r I)PF:U%0?/"~ew~ު?_|?z}a6`Mq'i_Ƈ2>>đNFP L"hF~M|d2;5>`έ\mGinZ6MݼoLɭ菥&}ɖPO}~dkau>K+Q13qf羚 c. Q$P-Z?șVR '2R|{rO[J߸~Pgv>sǫ}wnV=$]փBMm-z)>,틿`/:GyK?#J0Kntſ{$+~ĦcʣenzLv';eV˵\7B[l 3ZRK$ )QodnOY΍e0uLhֲ}]f^YdcYq 6zWS/ HU2@a)daz(Iwl-Up z@{rfhvin)^&Q]jR5 :(BcQq2vuJ?x psf9`)a9Bb,W&+ &rk^,Kwb(\aVjU|K%+[ fbm5[kq>H soTQh@9I.Їy#_+N^emm}+'pغ1l\]EYH߮_9եKnS5Mi՗FkNLXA7:_\lk5dI֖CI#9"ZH1c(8B Tc9A3["Uh4@q\hJ"BY ϕ_LPn uQ̎u} $Yas^|hJ:D&"+Cc* ƈHnv4w[wSN8 ph -iTUNQUePLJEgo Q Q, U(FHcgs;P\$ۼ &s$n (RHp&iLZvK" 1q0κN s$%0@ UĐۛ@B+54J$:Xa@|qM `%|cLA7b#(4Z8[Jiv -|HlW}*Ƀfla7/.>]\)j{P a8:ƐZ%n&ZʰzΊ.~* ^>_K 8=~Q8ZDZul[-5 #NmWX$-u3haHHgo&$[%nX~ߒ(GQ<ꪍk q)>U:_[%{ }X ./FV@'0_2~Ihc6s1VĹ%g8EQV:!3y̱Y I,SdI IZ P} J#@,\oM^/.jѪu̗L-0):Pi#S~4tح6##OIakkYztGE94 ݬ̈8'iͅcdS@K(yww+yKWSf,n.n>՟=ʹqitOg=±9ئN;}D1 ۧG!~b0[J]2tHFA :c)~AdZ6,I̖ ୁ)R@/y_G,q; _Q]gxpNQ]^ٷw֖ά&%!a6 q$56N$Nb${Z4YHJAD(EOc y {gri2uR$_]vׂQ0/ L|,-;v]՟V/}\B4x^c:MÑS2!8 ,CTшh#½,y߇"ӸI桺盋2 W?{O6_1egvrGMvЍt UƲrg}_Quţhl"U{ƬқɴXLqy$P$.MnFĢqYxKA:xG@ F2iUR}?QI1dV?MsqŸ=ʟի.ގW*֊:@a{MC=Ȃ_*R^,IEy%ّ[MXʓ8"a7e,泱0/ 剗᪪8 iqŪXdŝkeY1-[%pplæ oKmtmgW p(}>.M+Bz}x(Ta.?*CC ӮPxFu1#>TW>CR9oݚoH^x:%Pk Of{ Ih*8UϰF%gǫMBbYbL{?!ELU,Ohmͪ,Vߺ`SҾ4(7rTa1GegllO~kuޮvuծ[=G3A9F)DXNL.N#KfLKҽcŲ{6[\FH1Xjh8å9GA-$|7 )IDLa ^-=vVq:ɮ[+8e,aKe^9RemZUZJ.1pɛt%Cyu s@(ɰFk-Ax=% $GnIcJ`C: SLpȅTF,In]z?^$wvW,EspfLNaSetǕl) zˣ~qlO/+A\UehZQ\|޿|;M[^ r)VݶN}7=gTPC٩Τ ͓ɷP]LL\'nSҽ; ] mfo`Ĩ7_\NxٓAQ`b^ٻG3/ z,-P8d GK5gӔ38-J%{:iiNlQrGvqʈ8 (VrJꡣ34BJT8Bh$I>%Fttq-.Sʻe {9Qݻ@}RiOͭw|$^BΐVЍ -&thZ{ڋƪGxcP\s{U="B u߫UUO$M;5|3f鸕XweWd=pRHLJ31/-FP_?sI # Ȼ7 wCdѝ?ɽ] vhR7.~bx 
kj@MS8-DLW${?#wWw߈x,9k:>S`L(qR|G{nGF\,ZkUUQVg4a&;*!sj>wݲ!8Pim3l&Egh oLdxr҄f)IT '6i9ǙIk(4=rdD8VzӔ\-=֧nljF1mӿƋxXhaS>d|l_?Ϸ J~nCz6|7XՍId!X" &:,Q)e9evseߐ'\B51t|XǐyBkBTY#db|&P 2(ڜ! Sm8o,oc>lkyjA!T3UuB^{ڻޅթ tƐ@(S-As@79ʙ$#c5SU۾Tն/o.B##wZgI8 KH+~ zWzdYt<~6K0Xc7-ⵏ)^ⵏ)^WcUb3 X NIq&"RZ)rZhlniV& vUT|mjMwzTASݿo֎pJzK^emST"_$"GWXn5٨';xXR 2d5S,*ŋn6Em{ 6uZ>pahU\TMqe$\ 熆iIYD܏@˵dV`h+uF%bQ7.]%amVt՘FIY`=lزW_] 9Nr )beL!cql”c24Nq*FZcbʎSa&1WzGxK؜_h9SߠR50TK:>"b kַ]wۛQ(gv^:$O⌺(7N,Y%wk *gD"Om,yS B4(k,¨à|Kꛯ Ú&@GyڇrF/ssOQj ,ըvJO͏@G:ݞ-73]4=d}>6ި[<ġ:8 wx24TxFPpm*\LRpf`\_>킾xV-W눕KGWUaqo./1,>F1F/?tG~$}6>؂g6*&(Xy4x*[~+] ^*`x'Sd L}pj9 ,ZZ|@j~A1'-o ؎+&^2U1PIm[~DsdHRO:`Jfn1(aNqJzJa&{v0Z`V3%,fUh=GTH@F_;*%>F$#2 'ug xfIK"Iaqmfך;g8;:H~$~…aYS=pSNG]` #`X=cs[Z` nYkY`ƶ2!<[ϛ |d3XО݋# yȒSPGʞ%RV_PΑ)VmXU-#n7-T!hs딻@A):u7?U2j7R+c3nlU}i tQ#}_) P#}:zoҶt& l|- j´xܚcjdsh9 $ \&}m/Wfj~}8QhZƵCoh!' ʢL"-RޗD.qZ,|*/G>UV'r;o"U V-@נ{,j[? RB,v%0W:Бmĥp ~ ,6D&<׏ 8`@XZMwf=P j VV:HK Pʚ>t[Ts3M;;/.])ZJ@:s"~+SS`+ݠpme0,;A^=P-#0rUVN%*f4ݿṭV|@aegF~+6m&*g%x#.SYR"fM&IEj"dir F֦9\d!4M#)Y 8ֱ.:_O~&m$FEFE/[^)0tsL4faOoS@ͬ\RɍvMfٞ@G|)׀.6YM8L0 K]\rl,qĒQ'ZrHʭ&,GI>&v;e{MBht2M32Bs*m,sX1ZH2;h:T['+1]8W )FɱZ)&KAzn!'%(ܯT!偀!BH;̋t<^5^bEsر/z )aRYs)V"F&̉FM p ??b-$ꦽCsvdLŚƪ'b ol|s{ FaXd8.§aӻ?5:vMn-P Z%wT 5B,B SqLB#=ֳ4L>0~o;+QӳtB)GUM:b֧A`?~ݤ-Ir ڠvHQALTٲFԺ ]fSz_U^&-n{}u A|F@W=$H#|!Wb("JKg IK_ `(T蘲YbuPhBna>Q6KYb6R&=k%} -8$@ [ȗ =FT~ϼEm<9 VDC665(P DgTwۖڴ06_ιET9*K RodjlT' bGgɝ'MQaSױXT\?ID0pC>Z܂s:P>ÅD£_0%@gOA"iSAv%*Y{L[hDm$zbtBKdNL82VC(jw6Ľۛ듏 *ҫlpz: SWՊa݉bAșCwȍM➼R53Ȱ44j23?X^N3_;0w'OgHp@M$3vsJWz̪.\wE;c-d^x|5+f,?Zp.Զ*9Q1[;Yt0xUkIw=Q{G--hbuKSL T{8I{'%`λ6gMk K޼{& S y"|RDSR,zVI4Iד;3ߝ?kWn ֡?~@Pmi _t!e Zw-)+'V~naJ,H jz'Yo֣НשVKVqQl16W#*7-7Kj6xDˡ  *';ؾ+PѫװDYd͓OoRqM uS;{䥋eikp s5Я>n Ph5 m]drRYb|7s2NYqsp^W]so Ξ~k'*v+NjP/HQRbL,Rۏʳٗ(ԲY>u>A'_0 OH\Z%Y :6NE׉F{ ޜKjߎg7Շh==V?DdlM)^ ҨT[ N7zi;ymtާ#;zNxx;rABR{9h^5ACs/5@CߨG[/DkM *(fA1P_vj 5s*օʊOZVnߑF׍thp0mT2~'k<|5yj ޏw~|]i8}vrе?Q1B2ܳTjwvgBvR 7XykO+9[3w6{M )@hOG$t:qP{7toS(-o?l&5APX$f_bHة 8!0޽W k\ܧ{E7j^fԣ.|]k֑|{tRىVW J},D:"R)q)=a)CNP_J  
-8B"P{!?*luTk-ݘHC,xeOK BJh̝r%5;]POB<U+B\!索TjhE0![kEg], u# .qM:VYQRϳ_NE@#~.FjuϪ&L6` r6Pz#NgZ soaK'-Qf *m^q*Bez>С6C;K|Q#RR^g vs@< V$L!Y9e 8(ǰޚ:fa R3+y0c]&եLJeP<0E˭_{J%Y1eb yJܰV{`lgqkp?:n>ԅ^~2ƨLjsuu㗉a`j*T2He:4Xyb8ŧ6C]&\*Qfo r4*j8McG6_qw"$puw4ѩPH2t 31)0ֈS5vچH|%˞,-* 1ʆ8c@@{T Aфsp;{ng0*B¨5y]8 n.m(d,N7 7n 696`*2i 37v3'qpFJRkpuFb_]0\!̱#(r0w:Po ah"ˎܝBD .ܽ*ar=C AE 0Li}g&֣xyntk`?Ǔ 6?jr'_.:Y{g흞wzgmom@Kmaz:2$NR C0!&tCs6}m@4Tdj;е7#'SQB}Vi-eNr9:;ZE9wo[Q4w+#TSh,t#|a WtHr©lPBDՉne.B"GɓbZ)OJPIjroGVﯸQޚfр T lEڸ|`ieDzDwZy_ f5u_ܴX+Wzʫ?Og۟~Y󆳔P$ڼpn__),yD[4 l\0@z۫і7D)7#[hX\fy㯺3%j6Skh!j2TҚs0 F#7גP])FJԜPb+V btm(Uqda6Ԋϼd+?K[_8Bv+/RbŠhK5:޽a}`:_)~˚)u,^8JSڲߵSNݥ5_%Ur3!H>IWWx >p&ɽn8oaH;Uhj-zꏲoC۝_nZBix~oR~f8 ٶƠ1.|h2(v\l \B :4k8@VSKifEr Ps cW*/'t>"IwY] /guՙB>D j\DP0I.Q@8-gu5'k:+.0C6+H(1 >}29%njCF QH2[QaGB΋w٣VMp꧒;e (NYDp,御D GHxQ%D HCXi0!pH{j¹D5|Vk J?]&?]h D4&;HCR" S#BPBT  "/Ai^n5LM&S N٩bH`@E0$8pZbA/W(BR&rrUAydSVZScS%3㑝/XdQ& BEz?%X)#QLvjD`fDɽ ڥ Or@hOHhE1$T(HfRUOmj Q{+uʹ jmz:nGI@C-N?@684b0DM0D=cTs8Zrgi?!V1;Njb>s9|x/Y 5<4m/O Ys\xJڔK҉4mQnd)UCYFi:S؁ 䗘=ݚț r=Ӌ%5#T3Pd](E ҭ''>\98_,`$ 3gqEp246UDѿNAV@x+ŃkuCB !>u!j n?UA3o͡Ʈ^r̴ 6=>zr:Xxxik!YɺP`n=6ـy*6XcCX3!&23 ͗`6(QII$k4'qXDQiN8ilogcJO%Y$n9J0zgAQ G3z,[o|z6*&ElVw}5\yY+Igݺp>hg缣KYeg,obMj~(DoVdyX1Fl+m(D3?x{s z tJb{Ӛ)yu8ڍ||=YŜ^Ó%BM'j0gX. ZtYIvEfK ;Wr]DoҲD!Qlݤ֑Dۊ<֗[yH+>-`^Zu;4}zn)O/+J 69 fo/|8B%, (x:#Kdz*nFF|.a30rQtZ a$tDaG`,Z04Z2b%.r@2uipv!κmKP0kFYkgP@9F"לI:PㄲsB40ֈ qc#FZ?*g,6H)ukx,92seŭ]|zWc :(ٹаd4*ܨfRAGsco:7V_w @x+!vM rl6BlJE)% {-K Q8xȢQZP4W(UŘv\Bs%X`}-X,mHAJ -WKaD\0kHԘ@kLr2F.o_5uh=PppAm$XY8ƔxC 38 6Qa3Pt6\:֝7rz2jp_wqy%6|?^Y"ΈCqn<ȴ ۙ ZΖ>8~RspKAާGD!&+YRrU'aQj#% y*ZSDoY7Eu+AꔎQǺ %VAnńZ64䙫hN!'qj#;q{o;s؜@w6FrƀީTs~uӀkd*Nńz64䙫hN);(̚cAƄӯ Q#{ճYLƇ Smߏ /e3]$G1j͖#D;9V_s]aslo'=c fYpv 1&fi֎Qkugכhh y>n<8?lS~B# oԚjir0.PAY.&8cT"GHJM VksfbK)rJǨz7`Tw) y*ZS?ضnJK)rTt:mN7VLhukCCTcuㄲ1XRNuېYh֭\ֆk- mq@\8T a(rTSyAaTBrxt~+P'XX{oobŠ'My AÔ2E+%JíTYƖ(g! ! &)zJms0VC&o, -2Lzm9 ԴQSH:C"!J|z=!s8lAYWь,! 
'?+N a)ɴr B%64䙫hN51߲n Sƃz+'BB5䙫hNq5϶uj?<+々W,ȨV!"SH0(.X>h.UOΔ37s3snN2bfc- ') pwLUwV§ܸE(nNpt (۷?q- C,3-%z]Pea©أP2C,ĒA<@3c pTfB附"Ql4f۴H$[ꓰ-\ x!\APt",7\ qџjU$\ Wr! WVEiԅEPz'bXܟRŸR*@CVEMTotŷcC:vt4>ȋWX&V0UPvHN(KVb/KPզ->S3xp $w${!Y.&WۋOԇĆUg䇰X.L<|\h;R@!z`< nBumAY]^arstaJt^울p'S6'$ԞnY?=|7vΜ>Ĕ ]R0?Rb#ROWD>S2rnT0bx!*r^Nqs^6V.gyo++(_v\ O,&'-"IscW0䢍}uu]\}ǃYj$|Q{ڜdc{XԷԽDוb#ټlhڑ ^oy@r';,tCXr!94ge9ؤD| ë&>| ohx"ibC缡 )mC缡xUU#nTOVӂ⪆  [ƻ/BtSo_[8uQ|9:D#: e+Wb6%)C^APiG<3M[@7gGݬU\SPVC4Ύ 3 IJR4nk/6,[*ic}OpLWJqYu[ 0X5I'½_k_o^r-Kqxu::sӞ¯7k]Z+ZyʻZYgSb8*1G"psQI0! B8"LiYy>IPdK{'&AbnI+g nx6fx Y}8s{w,-#Mm߭>5hw|ϮއU[nd tAa-bAys҇JrKp<8` Pshƪ}Ÿm~d0oefEh= !D2 E12(4e i޵5q,翂K*9g_TQ^,%OY%'/NXs` SYFbw\VQb%11:g].h.F1!U}Ŷڇ[PeE提9sP\l!|-ףS QabJA!Xb|2FLrC\@4>Qp .9gf .ˑ>>?Ũcvir֙,m+}li-j5.8BH)WV7JOW\Jywg m]^mC-!YhHH?>&dȀ^xE:Eͫ$WI4h^UE" ,ʈG˕a6 bUJ \4`U6#N5QK3֚pQw4zsUNҀa/q" saXٝ i=ScuԞZ7Xzm|ho=|060R9ݝ6oXG ۜ^=ZLmkt/#߈&3DN3m \ЦFÄҦԤw!z_W\C 8j7Y]vL"rt&/A;xRbuv߫>n0!')/HWMU3 ~.WC)Ϡ#TG$3 ))>hT]6͊gV6@yWg[VF/] a6FJC" V{͝1BUT"g=R@%j',dw)-#R) r#Ҟ9%&ħ}(/ъ2 R|\#!cD'  NeRsA[O[hRIMq=% Då';j"F4^9W!1?jL9/}`\K gU#a`F)^~:/ BIdKfmGO,Lp(O?x[W,їn_|B#5aʏ~\AD[O0tjLQ( -i4&aTM\s,׍ `8QѲ!n{@JrQI%ܱ2 ;dN0!uGt`I tH$2iMJpP5 08"ސ8!#-| lY>)}9%֧2؉|U*f<0l8Ր 6<<0^ Oڦ`N L7 -ںqpm>bq? 
!|=JQ ~EqlL?N[Ke"O~N !"bRjn@Ί#L8M#SYkbh- &p{+Hr ?WDZ[pK ,yF8C666p sij HׁHc}|&6ĉP Dlbv;A"DɼKd%ҮJ%YJZ F0>mkI5cq&\ߌ?=.z#qd*+O.|7f0n\i׋/KoK0_/ {Sw[},FMMV&(WUq {d+Kx1JC9X[^,زBk7;q!TaQ0 KPa).X%w)UX9nsi =g>rc2܍f5Xr9MA\U"R W}483y\w @PºZf!LWG 6YjR*2%GvU "vZ W_{0%m8L*g&)*"kPڰsЃa[]3u ɽs+FVΘM`0두G}xzZ ƽP"FD؈&#rFcpQEFpĹR;{|gߦXk0RTċ ,lT7(Gzz)A>QKs7)d\cX7hΉ֪U­/jRK Qm,Ls?v_ҝzVi{?oBZBG7i=p`{LߍftJty e|s3mN,I$VwpZe/"<)օ\(ZZcqhsF9m=gԵ tm݌_R<g] ms \*EN"O՚C Zj"v:0)4ױ&uo8=+F?Fs0ܛ$\Uϫ$WI>Ym/-S)GD!8DKh܂.0ay``n/tF%&8SӍ&@e@79:+ǿiSGbS>a~p~mZx])=f'^oP1co9]I`ÓBq͎jr\wB'(>qA񂲹3 3 }*bW[gHc63t֧zByR'+wT!ǚqnwAWYpy0 #B\ G T= 6Z?Ttr4U;9Vh+ Zmپ$餻ƠzO3J /gˋ9a}rGJ!0+fa=jOw06öuxw3;kt"0<-ق_{ 9V{; ׊-TY aj9Nw!>կLpXn8?eYo*DI-)g+4w:W>Ptt3Co0J_r4Yl:''֘xi ![FJ]3 }8~fJtR+"-hmf|9i=?/iZGFE0LxRT ,[Po/.r8/9o&^83dخi:K%>4$sEH "G VTQG+DPBk4 vwDx_zHw"(o8H4ZAjl(p("LW):ծt=kzSY]]Z Ք$ ĽMa`0uS·OS?R>}|k~Y823i:0cgldgfVo=|C8g9^u[;iYwq֐ ޽QFzg pك~=+nbRً<qZw/8{ɠSwd%E &0z+Dʾ2o A#.5І"xgPPJv*ߎ27s;5G9m3FTv2nnmS0K:wt&̞“nd.sq n'[*#%_4vԸ_z0s]P0:E9N18GFv nm7Dc8N Kf!7}(%UP1t?Q#K^ |Z 6lj+jol\ԄjžCmP)$9>Nm'7#ӈo۰H9oTr)zr1k?~5Ayb?|A*УGhך(o_*i/שոrU[3'˥ǍoVٗܯ`p+UR%2/@bd2<To^6oV~:iP}3>,P_:xCru[UtiI{/@4ED 6Vz gN X1~wPK]an0d)k?ac\T+#\3zbdc[.ERS3\ֺm6BuM^7(`ª G$~3<c=kµѶr(q5-'89J87˥grnbڋ_8RҎ%x]]-I^R[<銏WO`-jGna{K"!Z:(-*tWZx#A}t߆jJ[r^`&C"Lyh(&Zj/>8@KH1`͟z0>3| T,a8L':libj_/Hs f- Y=aF#xaE$֦@ wnxP9wvg{(6FY'j`7Ø(H1*w(p +JDaF*$qW*9cygVi-:ǥ@#"R. cR!d<)Lq I&υ=۟?=hN]gzǍ_i9HLRsRKni}}H9+Cs\lݺww}J8՝UJΞۗOʊi_W!4 :`V0{ *7i] |׉3%;٥{(11W^|:`B&Z:Rp!Y\JcVm:]25{2'F`4R4튵 $?T!+dezcƍ8@Yw#Z:7Pz],4Qxqy-8]4QURrshGjp x;MB`UxzA\Ec$0*ں'(ΘRɎ<9uH /lApt=Ѐ*O)O8Iɪ&dԨSs*O٪1V  PDe>s]F[f Zŕ!Qf% ?0H1JGX & V0s;W_jEjV|[k"1C1&",l9x{幊`^J(FYS >oʹĘz `*7*:40`:MPsaiX8U^^*edF2@`k$* @j1f/'j%}T #1-$IXȦ䴟X.ϧ]_ I,J6"9W]OmԲ޶σ[eվpH h],h9զ4Զl]vT|1cRa$y+c[ (!+ ۰T1<° ;v}fpNk+ya"T5CΓ`PiaBSX=hiwAgKCATht^g&.H/&n. 
+gשo68xF@V*aN>@ 5"]` 뽲b)mm"r˅"HbDkF@#ލw9]O rBD)bYH.Cq/εAE[uvSiסyDA3壈CH#cn^|r=&m^ bz6i3I4n6S''mILt#F%Ƙ6\,MbK@sͥzc֣k58=4}(`1{Ž[}y`Z40Lb:;gLIMi1jh(.AWXT!ۈ4cSp4&$ARApD~9TL &yGiQQ2C L 1"aPWWn[[=<vJ&u)oDk]OxFO:LF O~<vtל吃`@p!We+$o 'Wp_ulFr 뾂{IK /.#tn|4dztmv`(zElU<Jw~*I wyKVASPXPVHylJq03e1D€`J%a`܇XI7cyyXVW g jww}k=~ZC7Ke4ʂHS#[e4s-<)A3;`OcP܊^^ P(*?W;s? DUh..Ct"BJažnBbVϻ&Th0F0@bHyو 9WFP5Z .KU#0m ɭj}BE j<yc(1!՜%yQ3Q vBy 2 3H<6' BHL q 0DČ!S%r5H *rcXLk^[ }G Rh-1Xv4W:nWe}-kwe^ܯ"FG!)1beMVC48l$t3B6Lύ vK F; q=Nb "`](oTr.qN{(?Bj G"h#e"A &{ E=,XfM,=e]A9Wg5CΑ@DS$Hc(bRvބdqQa u%ܘ seo|ZXÇ7!ki>|\[u(8DC w``wlzZWB6 *}Vx\< MТHSVtᠣJuߝJ}H˰!qH8w"ge"!h#S98G4VTTn鑊V8B Ȫb J1é= .J48IdZMzY#0d\՗?K |/J "*0Df4^_׎Q>xaqH؋O')fI0b0*N0~{ I&I{5~y"!Ƹ2Vȝ^&ìY >Vq܊XdxWb=e?o7x٣@5Jv #j,.'y}򘗇2ҳ;Ρ>zss]6n "\ݶ{,g)'[F ;.˗I׼cԐ|ew޳ 1Fvǭ+F wFLxv>xa^9^ LR>vǎXi\L 6%mMޒVJv_S:=%򱅃hzꝹT1l+G9 پ}X%ar죖fSA%(XsG̵Zcd!r5={^kk׿޲ׁa i0up4#g.=а٥p0eqe10\T.O[OM-Q zgPu.K 7_##D^uRʩ:HSU8H3j _^I5t4OM"0G&"IԼ9rEYlY3MlkDbNȱÿ\J,Ekaz+({ls'0/"Ș?VEHuKƆ!̀fEFbG*ؗ ~ ZZ/#8a*evw-pMrΪgYPJrB,{Jb`Wӳ3:gʴ:_9gdCd7f]Sy0u7'CLdLyiMn% < ~"Sκ{ZS07&2XtyS5%(4mSH{*"ou_?W-Vބ ࿬~)2=*30pL1`fL+5@Io6MGͩ 1s=Jr-R1ӑrC 9ft|v:}:BnJ!kG$Й3hsKk5ϤOQU0L^r\I+|t6@îDwrrSU:;ওzdShI1 wS҈B&a5sˊ//2{<-狏7q~ #޽;tzO?/?(~_aт- (*S^O ը_#6p\*jj]96J O_~O g3^ͮ?6P~q.FȝWQCJ;ˈ *J4ۛ=w raS> Bx 畠 1UZo_y?]Rc﷈}?/ߺ<qk JK0._ç=Q0Hp"V6{WƑJ/  m\ǙQ jZ".^0ԢjI"[dשuNUFa&*oQ M"(RSl:ÕmkQ vZTTiܷ?B9:,:ֈ2FN1U/S?B")#;;ӊnOTՈx{s$P$C) y$10%`UΛ@ Qa xz`mA9"n>P*Z9x n c9&CTQ6;]z !8iLwA jq 1ch ՚"L:#!@gZ  |*hpD NO1 BA[a{Á2yNFSph *P  Hڔ-C`<7ֲ q# ` \G'uK-l%K-j HSar@YPVJETCLNA"cw3 +ìW f@IkrATADFr` Up@qKS࿊Lh%!ܗ,*O^φ <0ԌpfTo#?(^W,fŞW6dGYYR sC<_v|S{_vp_/;ïrp)q~bX` 7N硘]pt.Nػh~dKۍ-]1Œf,FcưFȉBGB1.E81r[PI%ޮ$M}o֦^6ɜl﹫%ysŹk >0Fxp5.$)My 0iwp؆}Z)6.&b7.T-.l\L1=ظÍivhq1m5'h7.Ŕ}_/nWv:ظvsVbŌEW:BYk6Cl\QxN$8 *$|dP+ZӊC("1<3S^p?> ~}Ŝ9<и!L]"-v|!P)7Z1fTgS OP9ϧx>qgPLV{J'P\Oy[RxsM. l7ZEQnN5܇ͨ&7!*='kQnN5~!TC 6!١TsFhC?4mN5LUr8gZ/Ғ`F!ͯ| ?m3^h9⬗Z,Θ<NW<:$.Pfffi'b2͆inaa -^&ڥIyCq\~H!q{7_b4 ?,^ud c,(Ozn R4v&N<5ʗIz'擤%#x? 
+ kFB\DdJɝOX7ΰ떋Ac^x֬[~ukBB\DdJ_hxhݤ@;a21%:ceZ[ܬ&0-Ƭ՘Ƭ}BWc>_Y-w՘ƬYc՘Մ۬1kEyWcj̍jzj!.sWcnP@6}C5fft5%MP5eJ SX3  a8N/sf2M?vWfǽfr@GB0WK`W~T!V:'F>CSϽ"1q+?J/}DpyJP΍<_߽gϋ!“XqouNG9_Y,0՘!RӪ,BˉToX"JǭDm!xTYVU-Ffhޟƅ&wA ]n! @p;P, +e傣0.A8'bK2d$(qBB2 @8p4 (hNnFt0JaT ۴n"Ep{(E'êe#g~~0IWYe0m5 CpZ)@kBD@*B p HIl<$hND6 a 0o`1Hm)ɂP" @CS" .i9Vab@\($_44"]X`΄P}1XԐrC%Js 630.zHB&RTcs +Ժ45&@gƃ1R E<1 0Bq8  U3%>nl(p1Ck3!Pd} jfh@&w&c!shR*׃TU (8W p] ^PFR'=k &#aT  5!ƄtB~r+ą㞹MӃxbޅ~z]~A^fߒ7 /˴e/Ԍah̨aaQ&)Fc1%ۿLTsL]3 2"qa t\/ B98|^<)olد;B9g w7e*#Nk`%& nfEmUؽ5R=i6KQ= c w09w㫫hiPA倡3Gi&9l&_#ԇťtڿ|ގKs/?ăz2NDqӧ>cmQPYi&z*|IO'i:c}}[aG|uj4?.Y?7mBJ#qeqޝB*E]QF1JqVi z o&W»7O*moО M k^9(ytYSƹ'ЖJ);$kWW5NS$P:scN@p,2bm[< y;z?4@iq->9Va92JZn6XJRGId{.=Hc=ªTgϋѸxbaM *38 3wTY\tA?fZ|'30^5/Ys1qQ&. vېyPM)ѣL2#dKEɔ.L;'&RdZr^xVL?Cz0z>b~x} z}gde'iia3_e:Bξhs9{#^mN{zL|UX (VXo,֩ILeh "Lh1x!Pb 5ar~kAX}RB=A lx+0鴷>mh#Lֻ԰y*ĤSiQj|eSZ}-?]zo"R 6Ǫ$pۿӓ*(9W0ADu8+p2-Qj,E_=/߹4MsAB2'QjPoWPxZjc\]b+2cFPC.}ަr>U=m`- 5ݑ|t>]Z3Է[Nč$sMz[N(52)]":,ꀷteⷹۆjjſ<[Jv[kO|@ Dtg5gJXwXw"l]jvߧZcșzyq/&H*.OӯeT ߾0=>IϓFW5zy7ʫ #&ro#sO;+=! +Dѫ>ŝ)'?-m7Onl@ϋ7 R0m}vV}KL2Ԝzr42^srhlH$)2B$ej5RYakl8sAP1۲sx6~SCS(OӕVKqG{%1׏J]Lwj~ۛr%T}5`ܰV"'A^Csk0u>ڛ׌X0Y :gnakM?@އ Fա+#חwXǡ\yi4SvHxY;mnǣ"]X4$k\:ˑ:8_@4!GOZ?u8eGnQi֎e9|l{gngJY!ZQl ZuvkS.ـ.igkIsUY|]=1U(K-DLW;?{ȑB_ 0~Tl>$v b0)iW(wzDeggzE~j-RL]T)SRN9)ߛJE@NMN'c?Xߑh )'.qf +$VʀRDJ E9z>LBfTpz.EX=IA$#$|gdFq,y! +]۲@m03ϻeyd>9YD&㬜+zf'ghxWs) S”ffE$Ϋg~X/aSUT+hv;.u`s2]q_VYe/ RD=psʠ=WU/|k%u ::> KcĨ;6RjgyVΎ9N( ׃u"uX+$@} G:S9*zA}NVjlQ}`ԣ*DTN8VUo,mX4o4CPMAnȋ,Y"/D^4%["MZRc$Z`7VhD&* DX-|qwunu5鐅,|Y;)Oe6pEUG铌)p400CbmlAIV@b_;!ZIpf%@@)IƓ(:i>0 EPP-c8hJ0Cu5'/aܾgVv! 
pz <6x yxAv.)pf(',j[?Q*zb¢Õ՝j(ah#a JBZgn]֊l(1zgN]i=ݤ# ayI#^GHYKj ^"KIBAt֊<"lTqY#lv^Z2L&P[~#e\FHRGV&GsĢ`Θ<6."L W= æ3eIk-5O9}Ol~'L+yYç Îb63SQ>$++{Hf"oAжxHf)ϼ6&Bh;!P|CXoڎS_Tcݾ$ S 5 "ˁ 92BA;gNjkeޢZ/깶[־.t_ )p4ozx(nzGp;4=l(AvYUxճ()􏉔s[ݢ6._5Ձ9=4l0-؉GMmKaұ}_k\J:hΫ[[\Zʦ85ۺe޻ş_sY^7[& V.=4Vly ׯ6]Uto2~u?]g+o~։7@}N_hK/M)u4wgAu҇;8 3( wJ2 ;8D:]g4D x0ʘ-S.&65N V bs.r,ŜFE[.fMv.3LU[E/p!PJuib %J*7x.&Vu YmL!5\Vl36ů :JٵG]kADu%fviE.Jⱝ]8|CLlK Sҵl yJi@3WKA'Q;ms/27~\EQS"ԋx7uhC.ǧ W_0f@/cuBςUcIq]dZy=­Fʩd䢔Sp̴E͙2xo4OJGI*sIIv̱}lS0[X.O;[7wlZ|>I P+gIJV]Gh~o@T2KDNF+at@9 9Aے}ʀhˇ? O2;8a1r Ύ93Ch>PBxw>*uOƱ[; k-j$(3pg^CD9d2di؅dŴ8Bw%Xme.2ܖ.T3ƂbDP0@${=w EDS+T0~QE6-Xn&jW2k~p9о=Cq~ ٱlRw?{8.~Ow;ߺ۷zgo9 e/>l|o̿\W$9∮KO|Te^s:0/[wydw BXr@'ʌ+fF-jZ9g,4o&p,׺sPRZ?CT@A?/=8 /'pQe\L.{ oJfA@S [\a~:w4 0Gr(Rfci\ǹymst˞c(_dpQFFOa-fR FSc^/kڧUȲ%0&8u^/EaG\?MD?q: JՁ|vOK;>)D57<&c(}cX A3Ɛt cZa1&Z lP-U'p}tZc@{S;(}gT?rV| dR1 L c ʰyN EZ>%JmI9&l ý˵KUriU.K+Ej&Fkނ (7B1EtRG*@q6^jHaȀch . R…93IH^k*,ؐ^YI"K NDف}x$DAGI|mdU 5m̆ (^}7(mRjyqo흽\(I}__p.e.~dsN&}'Oo}=͐&¿.c]"pn1]W,4Bh[‰׸;љ@/JL:j- h䷼_j染?_욤NîO~ѝ|W [z6sNW1i~ur:O~hg-!2&1E .rLpь VZ(]`B;(X߈&9s}1 ZK.L27Q tlj thppx;)A <] E" !:puFsN95ZIHE*ݒ?ШSg$H2e w7[dQ۾NvR8 T}*KҾE } Ў!vR0hxet &֤_HoB }!?Uw>;eXHE& ֆ;_1c'GkPzĔkEgt/gg 9qsG8b9r?8|`Jv(lG%_RȎW~d^IbDDndv.J3;vNA2TzB`jCca$n/d\OQ8b<&$2^YP`FXN*'Cھrn3l4' nyJ- k"a!a29^\7T ZbӉPMқA0赝/6&eI}Ah )tph:aaa cZ'A3Sw۰LwF/m-ZEe,hݐ1& )XG g$Tf@s%NR^-ek,WZ!g|Ϯr@W~ޟ qe pWp^ Wɝ̟ӤY /[5>Cuu ԁ229t!R$w&8cEƜ8^O"߳.,k?vFMo"N5W-=tSN``CϧLH) b2DJfQd8g#|Ԡ?§޵6r#"ee/if,{`2$3E)$Ւ,Ҍ2cͮX,Z`žB/6'EoT^zз  wT1%ĭ\8A5ĦI%1bD\M&Ha'ߢ.׉6[T )Ն)#UPt3554*?R@pG<DDSڌޚAfƓaܛZo3w3YW51]7$)ם\OVS bSiaoe{*56|۞ps?]{omX$Y,G=h ~(];8`1w7fO&HKQhjÚ{v2^*JC}:>E#]"=P·j>5 j u {J4,t]9>oU}r~w0A)Ɩ94[|}Gl: ~%=N?2(P{9o pw7kT(JP/@kО*59*hNUpZRm`Pb87 =b< *Xy;~G~ۖpX51 z3:ks_2+ "Y(GĨŔ97!]Q̤k<7LH JsBKJ8o.8(*D[ !zpAhaRkA+pKZrJNߠZV2b)=p2W= tr;X0 C=pzo K_r<7 Wv-. 
E Ͼ̯:|S?ǻ5xtMM|t '>Qv傹ZlJ]Y1k& W(zAw_%8~]k)Al]ݳz6JR3-g{<9Tasg'=QЎvO 2dz}=|CHh튷() H%B,d=dkX3rvp*i^0PЃRY^WhKXjrdsrOz+,WX{rU?fϛGkې0a~9\]-8t?Z],B1֟(>D~d"'e=û8do[~.+X$ɄB*c1 39AYN>W:1I$G?{i a9XrYi$DVl0 堸 E5ȞڰW;UY\ 47$z߲p0j i]C#l;V~Zg,Y gdˢ̚}8/ ̭ù:I˻w5,4 2wg=&a~*' 6^,B?eᝡ=>C0Y\4fosŎS8?e{hyK\eDz \ûUR'k(06Zu1NCo7N|w-$LyUiHq)-.-4E2n*X“Y+NíT$m Rv;uwWkMkT/H-uF7f_<ݻ4Rce1#:s,,}7s UQIR퓪,=~:C֯9,Kp8"J;ᢙ ΔR_IOz @54SZn͟[k%}u$Ӝ[3^X׺ y6/$% *r买N垪9Uh4jf,éDv _Lq=W4E+Fx`_:̥ZQ1)|6TY,C s%4nnH]"Q#󿒐+FzkqsHh&h,;G{Jr43;!E=Dz1B}=9溩t.M}GG\c43ust*nU4A8HMPJsЕxsQwbWu+<'M3xiFE M}Ama RP~:%:K#CN-u*^ ׈J?1pMN [esNIDW{vAtN&tb5ôŎe[1/5˶Fn~?qgl2K)}&"Ba-=َw|YIyMxYF&O8 U_">?5"++B6) obWBARˋ\w+ y*UU?#$;'scӎ !(-{ B2V6OR o_ 2ذp"EoC ec~AZ \tIYG*z8:fńSQ\G>8G{M{|ceQ<q[jH#۰kl.Ms[p+4 6iy;H3jޭٽ0VBS+5e0AB$YKʱTx ŷi6@"ed8SHVSz>O@J֍D QɾYو9ʙ+cqAF)\8TP/eMs T9 #|AAy5ʖy]X$lDdHw~3!CGm+7!+M:2:**8oYeplтVITkB/ u4I}y$8|[73Hjs.f~ˬ$s\[NY!++]5m/E U >=ߐMieq,y2o*an*Xbwse^E_|ܶ EY^ Wæ8*uS7!x\ez6˙&V4b%;ڞp3m+g8n?ly ÉvDG#AFj8,F \m-1ׇc# e` p~?mc05e$G8T߆e:&ݯH%eNɭj=hvfZ}$$iuܗuF\L+5&5`pFNaڬ?j-)!Q];y0aT׮=9j0{ +{];(?~+WMaC_HLpSfyD7fqytƏ7㑝]pnz|58u O~:k[m yd0sb;f ]ݣK4V<-Kn]\E]ua[gݹ޸i}N_Hb+Lg}a&=ayם/g'7.ud"j(#jNwTnGw%j&$䍋h-§oi7J#jNwTnGܥO5L? ݚ7.hG|vZzNDqz7&JӨ8;X52f|5WW_yR`j8y*1ӫߚ+y됖g t qIMif9oQ嶣ҵ A4zXrExXY>]fF ]>%>"3-z>ϯdҿ>y#Ma%㣳d.KtIuȒç%tVɪSX6#f&A* ?9̊1ŴOg(*h֍ $ oGނLyJɎII6SXp?!ľ+%л8QLŴl84(V:B6`AWq}h-\oߛS w_B5ݬ}aX./ӌ3WɥWP)T<˙B#g*Vh.2jEQ(u SCPתC~h7j޶),sHq*\xY'.^-6%R^}&nU$zaI h9 Y`0qν5/|věszwCCJh9K"tÇSH{ 8;6೭87*/cƑn2b1y 0 .Nq a'OیQ->RXEkDXE U.>xYJ*" \WQ`MܨF#9nr2J81?A mF_ت5(̊lm8*ْd`=@6{^hmܙ)@@x#.7Al;Ś*y)c(2]v:$$Ą=!_N_HTlR̡ 2A 9 oEzj9U; -\eBR1!y46dP"ʴ7lJ v6[s@(ҧ_MF}).Maݳ} }wI_H^|aYPx:N/ŝ.H\+;aĘv4jxĠKTtF(6! |;{f? 
g^ϮE嫠&!yMW?`.5FZ"H vhk ( `pcCQj!z o!]%X=.}iq6ɬt?Cӡ~7N]qc 1p,l2n]Bи"z4/")ERsjEV /ekӶ\0)?mh0!9 L}DKu =6|(eEԤ0nuYsb؄(]O[$;ږ} iHgc6MI'r}:D.{LDl5ESU??=rƺ:kޗR(wE߿rKh7H DAh ?[,"yȃ֪i T2˜^Is`4f.|EFDRrD<]* &?3u% QyKl!_kK&b -SGn~n.U5.gFq9_5˟0|~_LLLL-3q>3$NP:0Ese 2Br5N޿l~-5v=󦋾kzu_tݛ8fmtRy/w_~8SA-gkBgZ#MyR`>Ē"AsQ̂>X․n9 U0JІ1{6`R:9A#BXXzWzRKA 3B1L-@hāVA_^sV|x 繹oqVCL<4]EV0 ZSPAZHt1AIt1`抓#/fENpcZ8wTmǰH 1rQjVK.rrkULb1:zD0PFK꠰(w27Ec0y),جsۭ[`Q:<|7ϖF)s.j3Bizw`}v/˧>-(~#:jlL]f,]ܕ(`;'qTgS8_mV@5弴ӚX`݉Z#_R!8g6ʐ `)10 AX<L'׆!,kjIFw6%ۨ+Ehd1J#ekE;FGE*D '+#:䧣Aݛ2:2L)8ç@TnEg+Q~tdiY,prkP&dTV\bwYiE>J)C!` V]6=%5xp ʯal*D(9eO߇gش/YPŻjwL5hgΖ&~gRLP 20(y!-k?[h9/撣ܐ)K&Pu*ʰ"0"Ey #:n PL.40acqV&lE)Ԟ]S\qPq H=py=tΝ%FZG29M4J KncK*n_Ľ2 VxkCWCG^ܒWvKʩ@T]LCRYe/;wTsu把ogIx4ں_KN$KeZ̜Î$ᠪeRRsPA rU2p2K$z7wg6:@cqsS<#&]u@be$z&!ѳ 4=[o1dS\ qIsFBy9QMR*+\,CfdFՠ dAV%A=) f-)EB}͙gz#ɍ_zd0xX3xB֌.ǀJKb3JLRI2Hҍm8ˡ\'&Ƃ""mdE BPkTB*T L__2bMjڹHȹʡuԌ[¥w .)3%h3^x$\H:H d\l0I=C'Nm>d$ޫGHĔ~{"d5t< Hڍ 6mt.Wa+V$qVJ!FGL*RQ3Lު;B #m%})4Zx r1[X`pQ{4. 9I3$́nýEJJ1Co)*{gJ"MKK{f- wƂlP'VGƜnF*>\WkYА-m:2_gBEl"1@҃|zU>V(#m47!1l0yNW- T$OA%a4LFV1IzL|HkaQx/\U4C>i^*%Zg t6 BKe6ϧ?+8ZzWQKN/ҀP|9}Y~SK:pRS4%1>2^j<+|̔ f60d6{b bYIIN P2I$JȷLDV kFH-hGbRx2̈IGJ:}-nAb"jH4Wfvr2"|AƻSvOWZZk5dF#Է;Hk#s<@*^h%JN|]??D?e }9]WWcpj {\<Гh >V6?=:_/Jzbb(N`|?+IؼXXAds+EfӠ9Ѻ8A`l5Fw#0 oR| .בvF~Y+p[ѮL̶gXJ{?}O3l]&vz:qDw5y5U>%omdeI>0'KKx{KnӍAf'}6~y_?\1^?its3W`13 ]~yw?r/3; Ya3!̘+kBmsO)޷;=q7΢UTK$xڭ%S;FG i…ڭYZ ‡qk7!hZN]>wRx[}B|,3OKw"\M(ۊ"cЄ4YUv0RJ;89ZJ;^XDF zQ(zx[bܾb:$P0^4^4zt7Oh.k6jp:t"`|!n9^`G+vz/`Ѝҭ[< kl=oOί/n@zuUSo~ ׫s3[k]5nQYzM>z9b>f$g&sWO'O) AY9yY|.Fx>7gR/G?7o'K90vL5(h8#xv/M0sp?I?]]bGZ%/{`l:Z4U`zeK+sl_qpqQa@|k.58lGݗ n!罁BxJ¡Tp%`$M6*d7 l=^޼@J鳺m.{1K_'UOjn[;m\֞sK2Vc#L}vx{}}ǔa)oꢵmlfb]NՑ]Y0hb~MAUV`Hؤq^MMFneםܺ-G(AMv7S$?QLuqM]@x ][sTLڷ-,:-Ĕ1ZI eYzVz9y(O|kd`Gǵ}A1Vץ4׸mp;R. N#HgO,b0zb/62q^ml7{emޱf\DjqX4u*vEl-"!"cmtvjQbvGVel j<@X1]pZ^FGG[ @Ƀܤ5&]`BYv0GOyY,Yfͳ6kRsK%@\) mcys)@c! 
()>J(vYa5Kj862.R9LJZf)@[e#"wI%&k@q{en$b|Z\DW;R?54$}4\;q*2Z\l"f 0hT^b`HH  :AH0 yfPZTCҊ 2k\q,^_r/ 3ɲ""iOܧ~OCSz`?cOb^ 8痑py4o;O~wp/asZHϿ}r ;*Ч'f3VȮ`* WsOOޕ}uV[s:{/~u7Ten)Q3UFk2A8͂$itNE4KTz-gIMQ3Q`W`l! - \Y{4Ycro<%1dȆqѐMô@K`dފbt&&0Ӊ6kL)@` `Hc,Y jqPA;eF[ 4r ^*66rV m GJLލ/Gj\m.@p[e䩩UprCJ*>T]:*p),n3L d[D< $D|8baE $헌ā'k DVNjjf*䅕T,cdkRYEx_̝._CC)l&pbL<ñɖ16o]Kɘ[6`r<\]3%^ ˿F t82Lg2 nkڅr31VIu֗1v2جd"6(e OKx3t[z\x5Z=CCMn%8Z wJP 7\uTj5Iι|G| ¡棧b$?Ve]s`,Cf6G {+ʰ7uI9 }Ny9Z+hN+}ԆC#! :Mj]PiDtR`W%:Pj#c}e҆ cǻk#YVTgYQeEuVTH& QBT7#_qMAEk,5B3#k|?aC5zxD+R\Sy@$כQ%Zӓ߹NsCHwnL~<ϱk?k(lr 8e q뺆jhӝf}YAaSn5ÿ?~(|5+|܀jx*;j-M9=8zͥޗ#g ]u|?~Arα=VGb~/_@nA hzٸU5j0),25Cr!\^ɢnT‹-qO,YE_ܢ_;渝"L׍zw/F=zt7On ߊ؊s:%^ޮzTԭiqGhK=Aj,DG FH)J@A?^(7eTlevN(LmS ݗ`$esĎܧ=IܷoqLIQ1FsbEsŔs9 w\cWF,]j4D"ґLt!r- Z%^Ĩdc⳺߶g) MGhSZM\l@"qaQIuDӰ־YW4 ReA)-xndB61& Jej%o_:19(o|օKu{AO]2l|{"K<5) D=mU1W5ZlneV/dz_C[\I#/2/UEry:;X`o8{^`4o3r%%9nKnvĘ`+rwHV'gee Fﰟ O ñ9aK _Sn éN>kg$K]g+.iB,ov)EfʉnG0V+mv!I}F  'M6\)r@yGGˍĻF]5NG %Og CIF&4  <9̰b"F2i *o918a֎{Ed >wCIgU`:`]y0՚gpzrs>Y9>ĻS/1.z!Pc4ry} rX sqSri{WŚrQaoܺ0n>~1ˉw./1{ WԾcMyfpVnj<ˬӭ);yӡb `2ѧ1NvIMTfr<"J^Qo`YVe`YVu@#M~q-Ƚq.bRyB d)ީ`mŽj_TM9Dldl ٹ[ /ߜ\޼_=7_<=!7N?tl3EUw_NJN DYO K0FE퐂x ӖB C" ȃA L'YY]˯F4ҜF2U)ĹWEUQ ۬0Ow^+ BΨ˵19{V ߡe!~`ON=ZHYUNYxqrYm9lp==ė G0brG01P1ﳒڵoo+ 7̺R✳HՕNtT@=&GF5)ʕ@7x,.Ƚe{T8>c!`\T,a:2wϺbrMyJ,TVmF(͕q#Ɋ/b־|A@zɐ944y\g$k4yO 3koSB& `tT2L0#oub6`c"Fc\0 hZbt)Zۋ  rٰ "YlcEcB䆴;\7زq-r)0vM(^=FκXc{ 'GER 'W?!v?_ҬR{\\_Gn=殆9%^q`/lZݟnw7CkrH2zdA?kրcrY,Ѥ2J Q4&IQz9<jdŮ-JIG9=m;URy\]ߴ|4-s7;HoG\:L7ͥ;Ys>%qo^q]6ϥ T RFx: 28,/2BS^xAd\Io#F}.-zfYY M}Q4[֡ -^'FJ6^YZF`зD4Hs$"9>Q^:Fk2S."TKj,Ir֮QBQ G[ -e6ZiӖ5I ("&9Qe%Ye`_XD@Y25^lNGMM:e)mLEfM3*%Z @yS'01jo/3_HeHnQq =y^,W T/ϟx" |*0fAR{ 8߀8ޗ=$8c}۳p?#W(".{y#Yqv}:#E ݢ~-=Y7WL)Tp&v5z_Xjł)Hl:vƭKz;WFUsW\@ 3,s{uߑŲ!Ji|~%KL2IˍP3;.X950TY2ݻ\]7ue0{ҽGy~X//1?3h\Pƚ<*xCdR] =z}y8?81m>Uccz55kqIb*<& p'ZBA-܁`7mNM@ZAiqE` FmxIƵ`i;+QFB&ǚhؐ<8 B:hB|. 
0Г&W!emtA NO|=F%IhE:@RJj$JE1*HѻPq(zW8J?b9S3h[TscxqNR+Xt*9j}p.rߓMZ-%Ogec0QF%HJ)#XXYZ&+yyrf"-Tf%}E8lge1- 3;^g۞ ]d4 wdh_-\\-mҜID~Lakb)/ߦ>Cgo\%u7/Ҏ+^Vׯ(cPOMϺiB&_2VEJ*U kns&};.cej~<)>_W Ѡbc=7M8F L q 1뒎CQنc} uЮ.;}S,ȡ6I[?Ku**zyg(Φ&YK&3~q{}u^V5#,IAO 3[%6+pb/.GEbBݡaG=7;{1?T7Qjaԓ,)\4&\Y\H] GFF)ҽ&JH -2ir{޶%-I߇L ^&ƙ<Fٴ %"%Yt[ %ut]]8ٸ{/'S็JD8EYzՁά*e1MM ` au^t^fT /e*\Nr+%F%[NW^"$q 4{ouʌqz}&tgjٲ6{E[ ]UPZu$8Q2Bqbd8lc2Q1C{ƸAou\ef :Ch ~E@ZQbV`MdӔu:mwp}get?#?)!dk]<^&/mS\{'@07/|i<9HEO XRqQg0eG`Bb`5u4p| OW5j(E5|E?)W8g(Aӳ_3D㵠.6 Bre Z`IH3W=G cXcz\Ҝ)٠\ j N: ɚPY5Zˆp,*w "0_j6D7X޲ڀ:ke0r-JFgkQDAZ! frUj G%1;4N0&N&.]b>BX..1Ef@fh*8B㓴8iѠ1|A7E6?\"ɧZ&t R pAcyXNDguv%. nvcRZmJT6!.av>cٚOJT(a;t NO[ဇ9e( T5"iIy 6u@2oH왒'szoԯae Lg=7%51$6(GYAhd yo8]slXSš:+:Qξ^uհiXT`T#C0ĕ6t%1 +r|qFȬ%Jo )Q[(.h(s 1Kīj35Ui >UbE '>R ޑt7\a~+u^P%2Ŷ\ -s10; US*1,V~xzySrZ+ 6 WA-idˁbk:雳@A[2)aAP ; X ΏM(CnS:i=:=:BA,ZFh-CG [/y8/;3tٴ OQ8SP{ŵbLB!͕sZ ):1 DK-SZ IMQTTEUXHiDD)jebt~JPF |~s9p1`wr{ mVO&7(:MҨ%He"C?48]_%`)$Bؘ{ :V'&>RY5m`L(%s}Avh$L4=t8!| w|>P?/ zޛwzq&}`TA7u]||]| B;Hp85Yv]wV0,i@\[!jV4x{hs\Eܘ(rH $a* cA wo+ T9"eQt42XisԷCWE|2vG>+A!yl`Xc~yvߏ/ziwwͲd_\6/ثg{/x7wm/h nyzዽg/^|7z} {jϿy3O|~p|c]|z?<3xrr۫dzƧ!ûsy]o`/?`^wf i/o23=Rs&o{6_\g?Ͷ|n(\^?w: 6~3_vo߷3>oE_nPYyxUi?5A:ۣ|16ub t;7#/  dUo(Iŋ̼̟ݻ_3|=cƪW9uq'ϻ&;fo-gl%W:{֚v|/i_rs\㲟}վQ?+pvhs_uѝ1U" :\hp;x;poWm") hr/%%T- aF uHjQE!(Z}ŷ+Rx'fkJPȮ`IL|m o}gb-sg2Ö[g2Ö̰e--Jv–U5Gת9Vѵjna`Q;XweaZ]4Z+Kb kg8`NNl1vqe}}EGfl]7y5zwk/ݪas}mt-H{K%>Y|jEn;od-H{ku| Yk>DmR5\6}1nz 7gKLt܅T0BX# `X2 bmѢKU..`X8Q=~/ڊHSvM;e)vm]gr/>uYs.7 pKlܢ|DD(bDyCY~R%U)dCWFI)w j2كXꔲ[$n1j[Դ$y2E~XTC-3kR `fol_#L_̩uyPۛ' ORSːٵ*6nK$Ǝ®Tq`jUt*C1ЂGZ˺!&PהYy#r 9F˦ %q UxkC]u)$SтXsm-&ZHu^)P9rZ(2$0l[0[3(ct~T.b5 Hr}nC}bؐD0]5k|V9Y{E*k!yjLljϠLN9^ԦBm :v:3B n9ӏkkYgAXqҹe+1v)\!M;)$t7dVԆMmԆMm֧69땝u qӇ7hcL!E9l(q'*vRVy[FbVyD q b3G BMAhg9KEvC^jt 2\;f~f+xdzRDb UJY0SG߇dJ9[\CAVqWM (K.kҘ|,*'HO)|ڠn>EƸ-rh~w0GTGwKxThh_cOs>ӑ#=ƨG E5S}Eܶ UJOz}n˜]_Hb+,b=X 65~0(Ϊɓ!˦șr갖/epsS=o?XhkʘpAY;_vT_m's8T:^jN--YMI!iCJ HN\SU(~eG=N60FObxxwk[tzy~:B(ⶠ$It~9qR..~ 9m3Y0ƮzQ D- 
ԂE"&hFH%?Pu‰*gm,s47Ww@,o*Q;vH[ Rd bX,jQbFECovE| :ktZB+S)@iw :i1d` b]4t]DRV#?C+Ŀweq {[}<{B}3Ɏm1C~dϼrPoYi?h?6.C pzk~hzYxC*[9DX\]ک;^jY z=Q>=5 8[f<=(?Rһ x[?vYd寭,YxC%_n (| tb.of91}%c'bʞ[_x F;/';0>!Hp8pc 8wmc;6TPuꔰEEh?~ SXhOͥ90K`/2` A{& 9{ޒMWQ|$~ZH ܸa}j#y"q: ?~" tke8ߞr}ܸ1 +,AcmEbbef $ɫH^@; 9'`f_-qK vhox;KkPǓOÏ>phjaP@5'WdvS;sa┮Y~,q}`),7[UsejHoQfQqr= # TmwαaE O1PI= ,%jsYz%s#z#Ģ*Ѿ+ցú!*h h4, k ͚3x!WgKYD9[벒Y}}zwQ]W)<{6l'7IW״޶[,{ۯ~g-VOU9,+YLKkP5#6 `Z[P/ɐ358 FcbD'lWgb y>{w߉|Dz?~3`xM>s318>dx=Z-["~l(u'c=R{V*}&+*_X!*mxQV Tm(vQ&BS1C%5F#Eb*9^>sbW+ 3CxNHI@%g嶏1k98"X:6X@5| ojppkCI(3<%ό~f3lx';`)Yl_dmv!.Ġm4;JPk#}zzz8rS%=eqĐVkך'lHX/(ړ TB牳<(:sbX^y)[,5\yEkkɵ^baԱ[$^S@6.cMmAk q=NZ]Q]~~zhNQs7OJ[l""5Tc+g՗g?mI %o8l2.C8Wd4SNh*iQ.?{Ǒ CǍ( $2Ǝg/㉙ݘ΄"m!<Y"EGW۳.6_+tՓ3!\^Ei]"#! f:9>/)g7>`ľtC y-8 3쫖'/b SP67eW|%֡)jt́l&-2I3ӕ@(Ǩ8o%o~!Albmwa)VZw$JbV+$F4U(($PIXRsA܌EEFV`,bF|Y63@. vFsusrdm$޼ Urg+ ,%dwU(`0RĈoIYfpӎJ N6U"CoqImϗJkb) 0܂w?owC: [Pwu>$d@1 cꨠ0kU|"@j9@4`N^4dmH>i l4Z6Խ@,Eͦco^H"kr6*aiѥ ZZh0$\%'SYGPYr|H', 1VOo] 6@*O5o/[j1臷7z4* 5P@]@054,G@#ΤB0$Q o;K6JGɭBAHX8WV&,&fe#b&]lZgsbx'^,'Oڀ/h";Ƕt|(]`Ƅ Aנ I39vUTlfNS So{[5ַ_ˮO[o.%5٤loD&.h=\Ǿ֚o=T[z?oXm<;~](bn5MWYk"zԺSXH^n?~s P[7|aI.^]/6[톢Bm8csήMFjr /'h.dBqa}?~{.JڔeOYaTPHRBQ٩VTj(nJ{,AX*` , CjJ%YaEd< 3j` I= KL2wlL RDv&9KPmE8 , ݁JKbwpQ'=朝QIRķDŮyq&a\#a'Kn总$+!t*Cxh**?wpQXW$aJ·B܎{֬ֈ]Y΃ M-YxCfs͉ǜ=|gXd@Fk݀; ݞ[^lIzR!oғѫU{1z (~l$V:V QGաkp =%xC/+"I3wY [?<ғ;˿%-'~X֚pw Qn}/n ;ă h7evd29:yח!2fdm|J?mw*+[yECEJ&+~NˏWU]L&S"GN>9-TEX;ߖeb7THNѰCu`3=_.y_DLs4sʃv5}/[{$CJ QkV\\k6 [3N]%.%`KΆzuViO?;~RٺǽK4q'v&fٻmy 1==D~^7Q lˍ|靈}Pދ!p{yom^ֿn7~}0.W? ;^m}©v{oe' |.϶g\mJ~oJm ]IX?@Vѣ?WIx#w6<s*{4'.ȊsYuP|.ȩDgSc-HW?ۻC0bW0͗0z-;hJ77S01K6-r&t$ʨm·;S:&ʨTMZnX)T6>dZVOn^*AiP>r9  9ٹd`ٕTl΁6ȣ h4/u2mDc1*f=cDopn#Yڞ/Whryc(w?n'ί. 
f߷^8_NL_p;bM3atc{<#l`ȑ2LB8>{ex;ch솊};mF=Qū85Ә^'UƧ'_cʝV=W8p3pjX]I=Թ+tKZ)նA`hK1qh/3F*ζV6:C{סF٠!L/ q>?o7SÜ{Vu$籑|H$V7΢Ef3n[ y,SkaY^K֕v5?0ZSn)0+Ns+]|} o߿sp~\S⺛*UrQsPyۭ!67Yy"4j@G@L}d$QsC<ކ}F5 R}Je7URQX72)/L;)'ݗ:?ϬWr߳5xsg:)߇-[L-VլAڴ4,jT!._=J+=I:Uf&Ed IN#Dm_Ûװ^?uQ(ַ RT޾pV`s%zzqQgETٙG| 6o[cRZAoC9jTpVKoGHG[@fJ\t* Fj\^@* iT#?L\MpO("@BHKDlVr[(a Erdir/5{ftwyBy;O}%zEI$U4e^JW鼯Û`(N!*pSS?|'^(iHŔI[p#P.򭃻"`6 =}8"VFQv}_v_f9o.߳>y28S'i>8S&0yϋQxo'ɿ|>?=6gL, 膽( &x92˃;a7PtGmyZ ߐqA䂈ï(wAdx?9(`uυburz Ȏ\kL^mMTYL/``^)AQa2 ;cchdH .E(lGGCz t⭻CO(1SLquT><+1@~h:Tbaacl ư4Chk(Cuky t 2"@|N$b V13p)6*ZF{߫M9LMWxx =ۤ9"R E.H=)U7nL7n1{@6Xsd$ΌB&ňxJY(bu uyF nHi›M;OsN=wWT(||Wb;:˖wBb.0&zzɣzh4Z(h\HDC0Np|yI\Sup$ް,rì@sf 3F%t1֫7RP,jkZ}m~ =.i)Rr6tA?RA +aQ31XkʶFƱ,b@m(IqofG:3LN:zMQ);8]&;RAB$brIY ]MkBJmk~?.1DQ|'TǣlI)!PyQ#c)Em'=&B=%TqX eN1K$d3H۷Qi L]# G zӤ#jy#v3on'vnO/2].t G#\AFj O;s^2 UQ?P` 0 UR.=6zP)tnam2t*]]>:FKFFնQy7u/' o7*~?}_?(re2>]ΖXge8Wz?cXRG_sJZ'XVMfHRHwH<9')Ν1^Rىx/ uy ӻ=ƻ+29_TWIƵǾvّ2n/{YTa*`\?f`96; Lr 2 5ˑT|\~@7ۚK:a8"+ݎt(i^ӌ#t1z=4u)sY9 3Hs߇wD LZ4MoyB!j:\IXPb_ߢ&ܻ)!ՃPqux͔0&J wF<Ơ*̐F쾕V=g[ \s Y%QUSLÝh:rd;z@&Ї9e)&TꈧvgIu><^POԚP+/XGn O<jB&ae."FրꝶBД'щՃo<(>%4a'eK:)K#N; ,qL377b:hsN,1s$^X,sbV0Sd3ISӼ1ࡃ|0Ox ;:0û67w0v)|Y9çOK%0,ۙyk6+$iYf隇Nzɇ+ߋeBW1K Zvrh Χ5* W9 $Cy%˯]-r)C1PyǛv7U€JڵS P1HY?/.%@ϝ] Pɚê8u׷Wg($ANUG*jWTs]u85=VLo( 6H[F{ %f٫*X,d"@ $&vKwjS(R4:٠6W[L4)(Q_"ʜb#`c 끠"t5臩ŷ X\>1ˆ"?#t,9Ѥ4,(4I mn]EtvZ B:7 :D1(ͺ*2Fr.p0Y@GO XSqQY !XŤ W(xZTUs'`"@ĝ+ XIE+P!&MJW0@;b, 0b&+Ek!Y"]m\f5栲X$AWNy' ;#5]1ls:Feul7Pҽ; VB׽M9^9C>;Q Ah(U/-I,a%40IyiHb PbeWm)YQwVhOB&*Z!ItFZ$*kJ52vMi'YoQCnhj;ؑ%RTv 8s! ";-6:m `GWZj"xՐr%d]V. 
B'?%'9 5 UoPԖ-]rbV$M!j5}tZiaBc4L9V@#IHO`2>/L-yozr$&v2?~H`rC̎b'oܛثmdV'fŶ-_|19Q =x}avPtgɟNa~b[Cx%Zga6aa]v֜5f/it:MEEt*Pd|Lq!eGg+K̘c)rOZr/Zrppiop_ vN]Oa[- Ptpwe.qηfWPIz&»::I].؋ϗ>-XggonY+u0vl */E;d`~z0odm7Zy0:!3`;_c;cz< {J,]"0E;DSͭ2:+t`47V%r]5V\,;Ɂ'\TvcG9R#7x](<`Lq#&RBՐu&_I}jN%ZMVP&hBk%ͥDG?e!IRTF[(i ^ҋ+UCP9JP8TC(d3f &"DCLHetsAYYu }{mB r D-fxq"U mԤ|JW1(ƶ}_!H*֠!/NLDȖ?0#ѺJs V&n5َdMӲ;\ELC_>gr&PVLwi;=ȅ,"T58h2{(`[_`(-ZmNXBKsD)!ˇOVAA單Me+ XU֫TV:BJlRnDLehYC?M#"g+-~i[jLwx)r\AƖ=&n KdZO˯>%$17*qv`/u)*g\(UkHR_f@=Tb"pUJ+& Jk@'|b4 sD}Gۼ -cǏ .mƒudpIc7@yV#@R%5x@=MLlneU,ij,-(=Cd'mEN\=:/fa݆AX0\r:eCJ͕ZǙKG#gl:8-ؚ!'N-8LurT@I -WΎPDϤUu+JD+wE|B%\3}'OzƝ~_uFٮu6Եc-8w!Z?A4ϾU-?T[~AckՖI[gVN*N!9' ,${CZtrq~ O{iZ&E!,_}}r>fE8%. cΏgc J_1%? _,wՓ!_L^"MySgb\;;)Jo˃ćlW.qܛ?L"rќ]>>;/r\>({X,2ooG7Jz6Z?|^x^i˱h}] }FqyqVNwaoΛoxۯ|W^ˣQ|/%m_}rW;;~|+_/o~1ײ34Cs:~lbe< urMt#6-p##Z>Ύ61XQ wG[;Y~H=k2ovXVC|I_bOwO~blԵKEgl,4o-[x K+-кɿҿM1䧍L#y<<GxgTdm~*<"Gdl#fY^#&Z-s aa`(D~ss;Om+γ:ٚB,9VJt2ւ10T]Yy6τk ͎\gzU(eb ih=sQbIjЊČVcJ6_p2q2%(}(qly-%ٔRi1H }D>hQP:>~;0J} N(764"Xl!VDxc!_x$vv7M?T7o^mlwI]!*Q7J5Ez><Z |o9 @wQ?=kim-F+Y` ovRP@tƟ iEڔ.(rgS 0|K>R9#H^X WcROE\S1-ʕ ]TWyS,QRD=AC=BWzn)DczP-4ovk7(g)n֠Iũ)~\g]۷!G7;xQOk)C^+Ogy*/9rY)OQIhr|&?pXbo`4 7 Im'f*519vL7bX9S[-49K%x|~AMBG5LŃ " qܲ*tjesȪ_~0z$O.Z[\6h&]cɦƀ8p\DRLmrXЃ쬦ɖad\I(0_nTi]i{ey2Z]Wlu&{Mj-nY.iuyQGֺH/w "+YS\8_v#&?eA8`\AXC?-[~\^^F2h(T[e,8G6nV$Sܓ` RQ(г˱66Hy?NDd~Vq˾"Ezj ~jmA/Na@[`Lt  =[~?CؒzqPtcǵH ^P$ IVCPR8$jmZdšU<`oqep{çԍvs Rs{8dn>{a,<'6E7O:߭ם7\;~;L!zu~n _dY>Y:] YͦeZٻeÓн}[o_ln}{fɄN_W=?g{e8اStt +$ * gL:.0tm>wڮ=ɍ$`+R*;W@i/|"k={}~h82'*=OBy[ 0'<5ToɆU:?Ũ_JwA._MF?} c~Ct=U'+)I0_}3#,WgyZE7e\͉T*fOgx+Է&; R~W 5/޾7HwڃנGS58;~Ofp64f7O3 [?ae?(oB;oBe]l_Ay3ꠤl/Qþ [Wݬ=[ֽ:8߀~?{N9a[Oo;-ڧw{"! ŗuL4A>_8KT'jZz`ls9ycx#erv>oƍfx|8Ã^߼/%2yhP?i/>_ +`UOåǓcs2e 61=])D9N}qko:LݲU֖j>|nx]JB[u:{`|z_`iiT`Z> Kžqx6׳Wgާz_wl^ڝb`w}f7n?Wz7 #Oϡs1 Wbv[@lAo//:WJys؛4Y\r!,V4`W֮SEOcns~bU+f7םbUgs3Q0%g{H?WN$+L+lӾoc:TV3fpn`(a}=+bɩlp3쿒O/O'cu1 )_`RuI g6vWcA!:jk9UTGkcʘ̣Σ+Mw/@=j-vH1~5q;? 
b3%Gv'M,k͌@i>d6T 5U{ٔPnv\}Вun5/m( ;emR,h4W" 3=TYDJ"E$1j#&Qo`-s,hUD#Q* @WjaGGE1 t2Q<(.`LC`ޥT:e(:,%@2J#p#ez-m }oTRQr4M0,PD[,%c!Js## ; i-X5cbFdw!H5(ҶQ*bл Ki9^^ :iAoSH-R aCf|&a\sݥHwA%Rk. 0A7Z5F]߾Q\/7X [zZ%5]'k>l֠Q$.n\V1Ã;N,!gmwϝ(NhZH3ES8˓[3-De< f͊(nL0G0=7^}qJ3MQ/ًnw7םvj[Ym_iZMHeki]x!Y8QËa\ST]Q4e^N;~[\^'fA$h0fi۶+\m6ӻ[quI:m]dI%''HJeI$b:6xxx_x K 6.v V: E587RD|G8b\8+u8 0A5118'Uئk].pr 331Z\ڣJq慌S`LIwf8s/LFgcEލ6s/5݂Uhvq@U/}`eN 5{t+ Y&uŘ}ri%6I+srFtG 7F67U'B]Det.C]sh7Efc1Ol5AR ˎSY+EE%HJN+Gy"G &*F*@9[?=TW8E?tXH\0]b/1bˬΨi7 zO8Ġk"E$HQB_e9ކ,ro"Ԭ%U6@Sip9j_T4+s,[[iu\)$1J27@"#5 Cs_m-\~In%2TljkeHv%wZXMNk1a;[6JTy`.u/VCr&#јaF"WY[v Vh*yPHX䬚8!Y5 h܅܏"Lu#cH>RX>wB"b71V-uis%ԿFnr8!Y" ̫KZ2sD ^T/7l}?mܰuF"MHXqܰt)۫ʟ2UUo͛V+H(W-ֵt4BFOZ+ta YA]ؚySk$oGê7W !Tɧ@b|s#?r͕]iU|WֲOjȆ( @\jK\Qq ٴee D#4Ǒt$I$ [iňOUs$`ܲ_/m@RL%}JP >V 0c"TyL)$KM(T' Gr/%ΑU&ظ%:+]- o!%PJi)47nܹDjWRw{vR&ًQk™TMubVcC6 n`ލ9vڎ4Ma1AkCoV|Z&t74no{Uۦ;yg;ͨk(f~{Oao4w'>a:g|ۘU2ƘH# #M}Z )58(i޲o{vZnmX$y\c?M6&'(O뛃֙!@[4L&sfQgxв3 LJG3ߦ{{,v.ܦQk;kbZNgG-肎2y b@&xWҧ_޳Tگ dzhg2wƳQ<"DH=!ȕ #:F B1EtY.DN$ ]LJeDH)%dNX_XHT4ЫHLJOثn)WOwyX8~\$,hިv_ +3̾~΃j:Wٟ~m˱-)ɤU*,H aB Ԯ&VolUB}JFJa0si }\χ CcpL:+򣄌I/lguK푤f]Di("AQt#.)u[Uy.1C?A܏W/Hω/zd,wIZKfb)u.-=J=1wf aHKED<ĵܙ c7HSh=J ( MhlBe"7_Q?6й(-̭l0"cZa֩czзGoA o#}dv6C^ݬך0/qI!pG )#)m h2,MwRCm}׎8SXo~9˳7Ǘ>=8hYٻՠT*][r /hx[\~VK=36t ƒog_w] 'ҝmZjRހVq9<[uLuD&_ Mwgo/._^~x~|x *wbS4Qٹ%l:7sftc6}v=s1u9vP^YO;~V3m5Q {:kdtu%0]O/wu촎Ύ/3qkGlna$:#yof,Cd?^vqr6><8>M}/1"qDv8{9@9΋bH<5&x3H)Li)o/^z}7{:]>_^8/Rpxr;{ϝӷ7ݿ9ïLku$d }AイvŵIwɄTW=-[3Xk9NNaC`n}@kmџNbbȥo{ q:CO61v+r[{ǵp`ItLb v Xd [_RZ߃0vu#6N"\uppuF ^;@/'}qe{ o?}Ob.~+iĦ6Dvg̤ރGGv<éovC `n`L'sF$Η1<} ӝd_'o܀ߟow'61GݾHO^qR[n r PF򎍻v;w$-mBz|'-ƃ^__%ǟ(g& Ѵfг F3 g& ˘xiSI@.c,`5}uҎLptOMFzI'۽;[h"u:z``&6;+0r3 k_N,6ڷOahWKz=}+@nоmw l)O甉Q009=rŏ} z⡹I:VdIG? z~P$TI>`ZE 7ȿs>bG./D$B>eY6kp3Oyx{Wp^N! 
]N  J}Nst2b*X3=Gvf00u~ z,DG3H3CئJ1X8vFE3jsDZꨅVZB?|0Z/ZDۑ}ܽ t^+e-S]cC]cS|Oq91.ح6 w v5~M)~r;ubcG?|tOQ)^`G-1T{kqtH.{(iG Uv v#ZZ4]c\ۭ -^;qkY/R2KՇ,5q-U$ʕ'ൡR 1a ^y"U'm]ƶͿuZ!(gvö|QP&,Njcr rW؏MO?j5p\xZb\?_EhV+O)%曪e`_7G)6o~M?^PKuzV[5f?ugm6riߝya֡Vh)4bUxiea |`ᆰgg7oǹW $JGIFrD!lR@21 ͝&ʧUg<΢@ǩϭc$o)ul;-'3̈́6Jo.d󝣑ӷwݨGoN^_^-?.[/4"H"I9/@C q1ɢ@<8},10 fl\yLKQKȢQiS6?BO>5ݠ KCLu9֙O 5{.JgxJ%laV+M.Y˜Q#QJsD1KfJ:JgId?Y?Jz(6Ǡ|L̃E$KIY\+4)mUG֣n9I)k1O ݭMV]&qh/b[x(D\{ʘHoKڀ9:K1ئ~>yM{ pF-Ȫ ҌVkj pv4=V"<H[+*e`L:2Q YuPd!}Bi37mpx"Vpqa#[!FvpiȐi< MGQЏo00 [mϛ#۫'ny {>qϷr3xb?Bd|lmx>'8[Zpl9 M9 re&3¶7M}4V\4.ɨ~8N5D!U e0'oC, d3g'-`tS#l yxRܰeݵ?AU6۲T{&>fB#< j/shTǨ,Wi7AoH#Cޓ[\4 I(8M5Q,ԯ h'::nfGNF-Gy =G"fW@. w cj{]fp &C@pscb9j 31,8d#uҥ|[ C> qUHLvx MFpӋzj<[糙{͙V}I___[rCpjMK&Y-qBPK;`D* y*]l3j;e!G;CeҋV}.GH۞q[ gBw-{zG Dz-uzӱ"mkQ^Fd=me5>7OӬO}ODfjBPM pZF>`wtl` aEv$/aK&e`@!G}nۃQYYݓ/[\t3*enc=IWLV+C`,0zv\IB+2?ďg ~ '"V9h?ۿ_)f =ߝ+Лeˆ!uqvixg<ڼe%V_9:5xsګ/,R(#y)^Ĭ C`)3֩,t`8K"?Y0^KU霏EjWq 6F=穀VJzq#4&DEG&QSG4;K(}" Gx¢v;-U6cM:l$O{gR" 19 ,i+/X!'EN&Y#֨}XBڝ?  &36L/B#ɁsZr%l1`֌>VVi Aݳf+m0>' "WBWX9 1D5mX=+3ٱerV g@q%uCäf)3n0Er_ &/EWo$3=F Zi7흥}T'+*&Փ;{a6y hY3%>:x,f),b%d'?a T|hDٻ6$rԏ/Kr`A͗u bLQZJ!E_C҈ yt=W Z# a'. ÓHB j$%E) 7DōnqR6*>yFtLʡt@vq8? 
ۗjs%qҐGz4iZ.2^ F,י3ɞSCRX_ҧo6rk0F5 (Y_aji|RoVFdvXl;AB0@c(n"jq-\}?~2 H4ZUM>4¥wJqR9-[S0ӬhU8p8pJP!(=YIAȁOҩ`CC0фWnr]i)"wp#9SR&Tr,߲4 >N ssv70ɠt28n[D8vPDmHh;vBX)b[g^pA:,N TNmohq:7M[֨hd8DH}ƙd&0z'aHd9(J4;Hex1,5B21(.h#ߚj)I"a89W D./&%ʑwQ vLƜ~$!]O1ZqC6kD"(@ Dnfr2,~_x:8QtIG}"akRx_f_,=i0Lbu-i1dG $]U$虉j p+5[e|r10)i KpƦN,0\38wX+ZhŶ}E .FGd0STLLb)sP:3J&4BK0T4ەgOY}r+wWT5/&OOz-(}w];X1^UW}5lz gǔ  F} M3g#ӻ#lu1ݴ1O(EyUzlԱIAV,@0<7]BPR7[a+.rÃ8Ix*LDS] ' P m8.i")y|A F+URpQg_:KJNA"l;qɊ-{>ރDڦ ʁ&ft̘_ mno0]-kM;||?;,h<݆!^/ڃ`RtnyL}gPjGZtS_Aގ[{%\Fnիy-Rhӷj/pS,H"htwA L\MFN,Ɨ.lsŚ%aJ]v;CaL)bTo-2ۡYvԖ$/`'y}*'%Ax600rƞG%ח9Ϩ/G%;KNrW&NLzeX$Bze΄(DŽo6 j q=T#.]Y☡mV#kW#̗Ru(K>@MƎg?͘';=lQocGd/oD ]=!\e@Gt-['Wsj R3pՁ[12C3hFC )J\'jtRVO[՜i BS(e2Pk\/wI^V*MEIcZ^x+`kЛouĦ)Paloc ؆Ul3:؀њTi5ёg1iЮ׆24d" 5S4:^ TYּ^<8Oy:OsKgOj!a9z-ᐛϓY?lMuO>;}W8?犕ٖ`hVؕ߬+__-_t}r3M~xxq0^LeL[$-Fv_IByw]n r{Y rLr&3+!pSOB6bZ]wʃ6x%(Sހۻu&w+!p=^0JT\h0G.,%=_{"Y -zkPFc)n,6S&rS~)*8u4`5pyislp:3qz|>=Axw'SrèU5OW_qZ+GE80}{% _~pb `ڽ1^uK%HM7Ak끑/=[YWs]m͖X޼\p}Rd|D,U2xUNYUQi!* aNͲ= IgvXHc8E`βU Q}x>.k[2oqv> 0LPIP!Yv-PxJ(cW޸dBը+"y6O(A<# fhQsU\5bb!'=j %h7Qt %7;Ff!3%,+d6ZG]`{}'!^Ai2!AUR*N30GmPxN4|D ySfq Ibp_'JϥF LcVڐWzJU0DȢ"6O(@ψ Gg 7R~b08hWE502[`43g^" ՟q!>#,ԛts)-e@ד$mv,t21?Ʒ r&UFpɄZ*ɤ8{̎{IR,$,0èQZZBh#:;4;kY-n.cneoz%acv0DC5,k*1ĵA3\62aWd\71ߎl^z&5|w]/{d"o&CY+to*̩4$qN;E,.RIO@7 r=!.7VEm O5!p rCIPb 1L&2pSXE GG| )zY7O`t1R%}C.u<ԣ^ֈQ8J,pFb׊F&wFbADFK0*H:wbEzw{e a+<wPZ'$<ǡC޵5m#뿢ԆTeԩ}T.8 prB6Wt}]t+į9B \3ek V׾@,X\FKƌQQ3NJ7RiYgY#jc1rO.g E=nC',߰ɧ$Tt0ڷ |x|~fwv{q]9W't9y;뛝gdZĝ^؇[Vm(iׇ"d;p(V*+fsY KYA,LE d ~WsպٞoJƛ|a onز0jjiVot'uJO|ɾR _}Æ./{Hgonz!n9$;QM}*UWcV CG8ĺ&&EYcG*b$eRbTYV@Z9Z9v \ k'2 s#0 A5Q_BcVƅwa\xqt,G/XyF߭Ǣ`4vo /m(4yÈBqߐDxqpY"Z**EU*9ًU5Хi e]TTHWβ2inob 8 $C#qWq(Rk]Hu)Je1к,Lw Ps۬Ĝ f~W7`q/#b6je.e6 e@sABPRXEs\+P$*S1 dk (_j&y9*d|7Wd !tQiـ-,lp a`*HHd \(}{0ݝ(֪n 0Yf5֕,d5VxC3%qE Ǽ[XСF@_ /{zĸ!85T#-B{Ch@6|(4 ~yR_2Wdy#^06+6_yHx\=!s4 >rhFR;[vLKο!WE;р+spI2 mW@ 1%~~tf}*hKyA~d])]ȰP]Ƞk8~f͟n_u,cO0ry0e#Ony~ yt<NRuIJߓ-hpN 0KBBT0XtſA { ΚT6Z9D"aN'َNNiq)Dn\lEhF\ƅ.Ljզ8}Te@fNG`nG'1E )?#ɕϘ*LQ- ߆-.zk^ 
x=Dko1 n?gG4=2AC!-D(|LntbYf-) 9rB9aȾ-ȷLGp}<%<]Ji BYC1_P0E$=C۬e"4wd \I TGXPށizS3lŦ1]Gv wf~f8NH1:8I3cq.sE]=Fl g NBB`H"-D]:~UK|лXLHDD~@%:0,)kv6:]wuv#A!ϰ!̨uz~b㴑J^H^{dd%_+5%n j ;&s'0!y#v1.q &  oG'W?#}0JYL[3SB"VfCMQ.\})QPI5uq}.TۛCno<*p x_+(%<ؚn5!O6 >[ݡ5!єָ0Q 5B:Q)WtPQ+`fE!+;jp?$>=g4/?s9"ͱOES]A}a2Zt}8d*X Y4 ;UNE]^xkAxw#kșȩgRn@t-*ҙ eO_uP}hKs0*6 Y󦖙$K_PTadZgd( 7,Wڅ`ܘ7khe.7գ#ZhWzڮ%?L:aѴK5D " 8DިK# imTpMrmrls2kdG(*RbĮ qNx |shtE;dTlTv֭wy=~/gݜ ,d  ­Enq{аV~ ԇ{ noK I06/*eUjQ4,/ OI*5 =7s3  !wk vx7DDhq ̻O?~5̩|%Hi:X8#^;Y._ξmD?=>E|ZK&ߌ >mEa{ x L3Od :aRu6igH}!؀y |~u0N$83j7IFr@a0g& D yc9:8;)24y2I7$O&3;\D,F$3Cf;+hpn2, {x;;%QlwaBn;ɱ B~QeGH GeQC#OiK=%̟ {[}z*=<9_ ~n]޻ŗ[z{t55 MqxHȨFhw O[X(&0[XI8>g >9);k /JC 'di\ZJIuERK3SEtIBgH-O0i\5Ԛ$(9TX)jG P$Ч7|T(\{DI@=0ݘbf=yp?[zOl5"~3scI?5 /[UWY ށIgy9 _; G0?'>ܭo,@a=<_h#kjd,Z N-L`$9WVsIte 2́8)ٍ_i[|x 7?,~3\3 "E`Q/RLKF%ԍ[ 1 w[2|?~|towI8&7g 3Xl3 )tX)A^iOL5 XrF8O5}ĹԜ6d9mI–#-ߐ !qNM+w7'DlYW&@!dO"r!%j+]leMY[Gx]U Wb#)EW Wނ~+{JD/bBR!bZR(605M$-K" a$x8QAK,CH='3fpjM峾k| IQƸ,hQ^, >DǹDž /A1/T:_1zn=͸H2UK8QQPxM,aTʬ~.ńt.E %7i,ڧ0әċe93ꀅ 9=4{jߩh G#aSGEPZ %هWE=5 iji}7y7zc?Vfaˇ ,fۆqFogfpN~uy_GV ('eu grChݥ'OF&eN?!5JH‚>$41k& s!J!)J'!Wr䳓'g|AjiS߻j"5y3[ "yka5Httac%LR!Y0q27pDg}|sM~1֬怺_^ʔLj -wR0WX4v< {Ǿ^kRaÇr_jؚ{-ܳVhкuU܃cqwۇպx4_0%8o[lAE(ZDYO9aÇT4UMW5͖Ii3vlĚ'53I qzi|zr㈪p q;ޑ5xYA|/;cOD =0c=#-.cU<}8S8;gL.p>"͚I1c#}]q x A%J\ƩN]W#hw R W$<)}peGG9VZ SkzťUzE9GBzp}bARrCEmℝ1%C^qxly];NΗJٺB\|C-^qR qǪk-]GDw5t`&%n].*'C Y}y<>hpdSVZ_T5NYٌ(rEp#x?}Ƿ=t,T7)*,΃*"lTv7mTg3pyI$<BrNC@ LE*]{fyd9\{msƚ5lĉ.֧rCtĀ:w /+Z4ָ@2_7?|՛(~}G]2 Uګ*}K=<|ǯ5b/7G9,I=ǯ9j̩e DH1 -*0Qx"&GG! wKKhXe+um[#񯯯|4J,p"pFwFb8oB[9(Ω7aRROQXM'Uwz~M611nT2B\yvRK?WX*eqz9X9xd5H= ʙ#>EŃ%A ʋW1Tsϝw J܉[)O^ésOQz ˟FkP"*G=HFϑl%^D11ʘU@9R92oy>99%Xl! 
Յ}ޠORp\dr ۳uvemVnUq{7?;DG֢~X( l'5nLj]6|g9OcbL톳Ƃ~2sMD (HiHx:us>AHe1Iz#Z;@4Z|޳ĥn039컙Zc+"@PhV?8S'9!9Jv~%5M1ܠ^)>QC"(Mkfj"[7Wr~; Y_ۛgb>' ze4~x46tl0/'7Ӛ_ƿnk1'x(]H5BVUcԐVɚ:E`d7ie C8O,% CmPxԬV%Pxa\@.&IO{G؁C`VEN@pV3ȠXfYy9e8p&jJnd)) 7SI[ic,hMHNf9Kb \K^u7zt1’DR$gtɃM&]4ޥEFƮ['͟(+qdΛ sxFP ܷGq(}}LF6Gy#r,Y,dѼ&m#֮O|lT=Y-~hU>q c@Lz~eTzw^Nrdl3S8Z&o*4|#Y$TҜ19[ тR2WSvxQ i6Gx+Fgf? GG"k^Dq0dpb.K0뽹Od$" 7Xn$SI;s2T$긓xuZzoEԸfw[/ EQS1E7?eE%iFQI^%So/hѵ`H%KׂSH(9ʐ$Тd )XDԍϕf`5 N37:ѸfKtiSeOHiP6xí[1Rtr0iiTKKw{ىʨKL ;e|2N[Ђ=/vK{Fζv#@#1K)9MR6F4AZ ` %U]24y.o$\)R nX$r]ϡe=Uyw|n<6swV^66..]n\)| {D2֛=6:21Ǹe &UR,A6'+>&VMm:7ݥm:n:%c#p6,t@*Tߟ=8dR3*Ar_hLe%TЏkjVjv8`O*'4 QOfD^'#m-߭vYHNFN8~H#km?t}ǃ'x2MkOqoB6d&VxJ}?0fnpv}Fք]ӯ~P^|] m=w]Vuw-7WCYO-9wFohȝQuASۀ4}̉|0t+!,ZNvޢ:7Y]yC\~aG` ClS$RKUFXHh+QТ2BGW=WaM8fJHd*U5^=R{|1kX̻߈0/i%~mdc$]WZ]*#Z ("P]i/(r]t SvUL>;~\;s1{Qɲ`O~J 3S-?!ug XOYGpo\z\`-bv\JZ(Q\R$ƈ! ً8V] ǨG4mXKk4U8Ҡ@z~\=e.1c ̶:O_k<Ƙ@|;Tͮ/xc:t{uEAz1Ëu?ΧSUte;;Ui ,yRNsD,u}1ÍaHhSA@H(T!|D4yzG~'-]kۖVDZT|>s&~}ѫ!b6[y]NvzԆ1ʘ0) }BهNCswM*Iˬ5SVc^Oix^F6Ȏ zut!yHAF˃d's̮ P!ĭ[n#p3s>JJ0!dK -w`lLn*o\a/@8MH8}zx1{YIX08ģJl$8I9Jcw^t-}1nQQFh }OOjh<*ENB@F64·pk}')ۣ֟Jg01F%Ϩoʼ5*h`!騖}P<(w(Cّ5qrݷ?7`N@Lg+:[e: q^,sd)"E:Ė@oYTOtR@1yWsw fVg1)ͮ#2V՞ Bow 8OMmwL?6z޿Ms*_Ϳ MnugCYhV]4|z} i8B|ߗ{|W@2𓩺{;Ǐw,˛򉾬g ⩅rf&THW0i52i?dM@d:{8^pwׇY,ԵcWVN O>M'/^^={_L*Lb2[icB߼/_~pB94&aCkmƿveZw4)a^^X4H4!%d0Xq@KY3ElAMdDJ`y׳EMTe\&ˠM!d)#sr竭w~_u(%_##}QC.ѻY!AF wCy}q'ֽa>#ё$ݷt4q1vM6nxI- ~F6#`׎r` ĸ8=!8Fx9s䒪^Qf7PgMцԆ%~Z-؏C` K EIqIK䖲&F=WG))?--q/f"8D8_qdM8'wFxpO'bR$~ @xT:DYbPKuIP4?ɚ.)8T-doK 97%B1#ZR(䀓n@@FY/0K `+Y3w%EG'sl%dˆDƼU)8# {;&}Uxĉ 0Q(G&G'*| J"G]n7uѪtvE#?u 5Lie2A@Azs1AARL2=}if>m4B+9+if@|й*6q6 yIȷ\9|UqW[tԺ""rGgbdgo}pB+#[}˧awq&Xt`2-`(*S4&51>L C>z'?2K p!kݎp4;F##=!mۋ^()瞚g(!8dyϷxA/{nH )uC !,T6FJ) cb&~󏞶xY :R??R@MJT605TvP=CK&LWeiW`& 4:Gg N vGDJ,L_u6R٤4~2Zd9_'j23ex}ȥCA׳[?.,ɒP7[~nVo4?r W{ŋOtŷn-~~{?`Rh]s-*u) @*+!KDD*EY2k(Mlem"АYy;l=KA;#qKuJjrɕH ˚*$8nZ׸Vv$F8>zŕmڼ!HxlKh R8IShDjQ*RdR(lC@*dD4 2$ ,0npm+40 1@ )Jka. 
],#7Xdk.f1zo2^83`LFxdN6^P̉"‚f(1VR@uRQʠBZ IX TdINe52fv;QT}!Qt~tڞ'nr?e ?YY]_Ѻx)^:'^K7Lac&l΁ ! є;X&`&6k؍ߎPNv (.+hAPʮ1-?idJ( s1+18N]PeSx*"B6J(~a#ȟ5 `SҮJJkWeY4 3ȵcL*YlI UjѼqegƺqv/5݄y+7K_L:/Nڹ.n+9B<|9_~bEeSm㗥.)^<_6EVh+],],Gtںzd !k꼮Ciކz>Mr'1ǿ){m-U.VBs ) PcR ўOvfa۳W^]LX6e|3soW2٧_<)9lff4ʡ^hb|x.~[m94[ GeFC Sa*#;P{㌨;ͶCSo--nu )vl˦ 'sy-'}AJF$X]4P߯z_vټ GO&W+o۵<;K6 {Mw6H6f!;nN+h5:me z 7>9($yءSv-g|.(٩5X iK䥑h ( ͝d۷NrkW  @ 0N42(WAWRÃ!xB1Y@egթh-Swa!﹉5?1xP |L'.-N_ꔀl--&dS}炯"HMHLjl_ǐMJJ6_> $Ȝtjgb[8FXw +eח5UT¥{؛rj=SFCD<0Q Ć)EKTB@ǰ|>5D<$vk`DӂpJ AxYԶR(tޣ6y ؊^2QX ) dA Fwާǘ؝h=jy{:SCSb;<"х](e]Lr agO Q:/y1OEpWc;bg֖Fƨ!({WftcFtK^8RKVB ɞZ h|G5AB5adHSOgCM5Th '<f|s4IyI@"pB`Z`XBC7|j&װP`nEiUZcʸa8sSC#L2f0N*}#?E(u4DzޣQC sG%UU1)kQ(Y*Ռ3G>5D<4r5V?V&9+%5 %:sxSCSb"JUv)k բ*K.3!F~ZJkNRVB*b -91{bt#"w& hqު6(Dg}jyAGn~ԈvޣǜK)giêG!sR!`L*8^6=~@$ n Ѱ"Ql8 1B+MkĨ ֦ H9xV昏9^(>oBoGHNe{N/&dS'x7!Z7 nNh]ېv[pG}[ yM4ȦdOZ_jJ`_BU0BUN*Ҭ2TcƘfRZZh]J u1Vg8)O(>4.3ɀG J(z& IbX{nA6}wsWB116x!Y a>RzM!OBO¢'Ri? x}_Ys=AtPnyrnG~jKfn#6򄏑ݑERyeBBwV!X bL'uSQ7[z>-:wQB׷ mrL{ZCQ"=N~Uz_U_^io{OȫUw(Lkkn6cx$/Cc[ĩ-Hi /#gfCJ^f5Ƒ9eSamuYss aѡCΏx}q,)ZlĮ["zX~JS,N;N,(E:M͝9d$cΉO-K1&螒'Dg^O{LE''yyB~-qHK{Q(W]GH}■!d<=|j"ri{u9ϱJ" N _>\JcC Q*FbE`?/@P ojurf'uy_gu}6}4ؽ4(#-]/]iJ,2iQI$e>㐳%bQ ~FD;˹"Jrª_ZXHm!kHT5lm˙Tʫ eRr_Z*#+f5;5D{ {RnG SΧW5K+F}V$i1L؀%"- fw$%L-9p #KͰ$)K$nhϲ4,g[Ehk0 ̤Ad_ ܘW% Z6"ϱ*-9qOfo)L }L;h%, ia#, Qi@"$\XF♧1M/s^eqPM=!̂˲5h(m=a јUB2 X+A2AMiX805pE&=6;gD>'Ky2t符+Ll/Aown]MY㛝 L޴t}ޞDdeInuguc8DR PlJ!P0s0c|KZ&I2W/qlwIUs1+7,'~I%~4DŽu7 <9vx hL#g JK(V (̍S^*MDRR+0x~b8ہ#ev!:^ 3/u_F3s?9qR'K?_K̡]پ䤍^hXZTzmrw0㝓_5Wdv~s ~Sr<.Pr) P;pHTw[2(ƙ<7RIKI{ "|wj8B*mr8$}_>>u*ж;kܾvX "9ȃVo+Lkm'O'׶Mg߹&Աt&7%/Ka$̋Ӭv ]F}zخGtR gobD=&'K5xKwjAQdʃ6NR7^S)+IAo琐߹Ȕ(>H%j1;223Wza%c l&q0 ,2+ͬ`X_xLOB c/\ =gi x*tU $”|TUPUÓ)p1De?DWX#\{akb}W7?], “{Ѓ,ˋ Kz7.nrj/"> Z΃5_c`+(7@W&:a yq;/`Ӡ*Vsg㢩{f+D4J/n6~4T?8lj :W6oX};xƥL|9ۼ!ķj(xVu2nsķyCoׅmސN"4;Wu;RHFS^؀prag6o*vpG Pb:Z, .x(SYc+;1rwet퀏y%(«'`oM6oFsōQFz '1ov<%rGjC'tV_- 1w %~,ylKsŽ'og7]LK&Z 7{!C녗׼?ð cV5( oEj. Ip7օy"{BX`\ʺ=_OMջ;3)7I̿^3һ{;WrT? 
rYD< 7m_z$ˤwIUw,ON; ,G{kxX+N~nxXIe+zga4>Za`z.qEmoUE o_pQ?@=Y-r=Vu*cT>o3G+.vQ) jsgyz^r7Vq#Øu_0?R ü55Qzվ:UGwn ѣDDh"]I2y1悜 1”7ÒxeZZ~~-k*RYn]Y b6/޽e>nUڼ^vڭH횘^yz70!(o+Vgm5Kzrk%0;':VI_|2I n+=g' 9CϯzoVG96ǪP4VjOu/bKxU_:qćךcͿwYeB9z5H㷓;z$ xʒ*E}ZQPtׂ X lA=٠ (Us0 T#7wLS਍DZ-ۇC[0"Ea"+QyWݑ".׿]< A*+XPL7ye !<N˪o%+A pQƶ ֌)*6Zg6!Ot9A>jٻ[},6xH(JvPe@ RxWliTU= HuҠ]QBIEY=~p_<- zqۤ[ Y!T-9Q,=o3NUXݬ|TNp$6A$I[f^pv "^a쪤E ,p8UoJP\y|ҎڂF P ny-Z㧦*I6]  \ |Byo XQrqR{"y*U1 . Ϳ~=W>*ܿ#k(ҋϵs?';/ FH' Ϫ۽`SD"~)j,oFbf&9h MkNja{ɴ}4FYS,ƈ4mǁy25- ns4˜+< S ٤8)RCVxsQͼNNKQ`B)Y3s!wY屴4:Ϡ.P˜*RM=\GؙI֩r"5Xh(H` 0c h(©_FSDE nt˚l5@? j]̥{ Ngrƒ΁%E[6v/NT8[}^riN98ihԆtXr< ЬIcLUMXUh|QjyAK~:Y_ud1["‚Rʚ$EI6X~Jd`]]Ҁh:}H&)Y:[| 4a0CJDAB$+OY$RaBSO#\l@]^6pkW!B 2yk7pr> ~"92U>Wj;=}cA9kQVsؔZ /> 󁒌}^x>bS=A9(m}fyf~ɶm$GIo+3 )ŪǬ$P#*pF\MSѲ Q=<,#)Q3:gm#ǒp 0vK:!j0^e21؀\emy2aC+oC)[I3:봠2Y ( 9(O;As}X}Bbr0T4=HI4_R3#AᾃT匳t`(A2ϒ*i 8Z~ѷId:2L%':rۊxhn&~\"/eUGx+sFf4F.i9}?f>^TxKLaY՝摞~_ThO?ݥ],]t(9w͠eMNY]cѪ2蟗D|f[b17E9u̅~>aB_oP[$^z_HbgXha1ؖ7~du|s Kٻ߸n$EY4'`fdfeUsy2)>\f޸\3_GWj<>`˵N_T s`2xA`g0OK""odCdE a,]8_rGrWoۻDB[s sgqw:92895}4oYѝқ-9*OyB>,eu]JL%=R c!FMI%| GZfUk^3D@E]Ƴ{,@zOC>hvQ~.{2xtCe o:m?DѼꑰqy׷# VgW"[lCh!IoݏSZԉP4KHw`fl{ ENWw3SYB/dv9`4hrT㊅h˪e((r(y6Oo]<%< (n?\C/^őitw+Q+x$ҿq툟3XvOcr ڥH'S4B<X13BPr$yR4[D_-B:DU}|/?kS6T|vD{o2(&[>|g9B~*~]:m#jw+Fm~X1jQ'Q~=?l }3vuPo SZkDw܎h]Ƨ5VM(O<9i\]m}֌OgfS\i&u5r:ATbli'|3Jgfۋ(킩fV/݌2'c(8j&h)t#s=}aȸxz[.Ilx,GYBΙgu#'[6oͫ;gvB( q9\b*W^}BîY({,9ΙL9 (=xg73wÎ{)L gcኲ,8p9 >8)&92b"`Lv눆n>Q@^ҹw1g ʙ`< NIQ'E*[= 6Ѩ}:) 8;Jm!zC!9G_b K^h@!Fިյ u;A>xE Cl$͔ N7MK͉Z09vԶ ^mZذ Z֭6)/j|'wypLRN8(ݾ)p<N࡬~?Ghh*;%̀HdqYrD;/ k9I_JFJ/!e hy&õ'6rH Ӂ+Tcc;tRnMW@Z-?&.!>~ռ]\K.q]j~ѻO.|/*:B䂞^_%r.9S_nH}bd^Sez~Zx>m~7gBޚ/ _^Ii F]I9Yt_zwReC{[i\{٢e~D3 m 3^;쮮 (ʶf K hF-Ɏ:3cQh8ˊcsFYGSJ3=}xM33":XRI3^bK" Df0rE5_1EÄFӟxe7aj[Oiqܤ+M!-IVyVxoc!N5g)?\^T1}5[kJd.O:h>`ouavZi|?^sX֫ۋ `[gXq(;ޞ5}|{gW-݅y<]i5tu~>ɽ\.u.W+C6 v#ƣ.wG\ڥG\Z]ێ܏-d8·+SP/RU,>,"Zz>[(K?b^/ļ<1/?ftJhd}IameA:/e=,3ݲ,sXd89nw:`8L`b&M L 7'h1yn4Uwwn:~SNͧ>ɘ# [^En(GQsFwΟgazj[ȡ;7twoX f*൮7{<\OHqTɋX=ڬc%|3B6476{إZoxJ=}fWLXjy%f#rci L,:A^k 
O%PoKNiٸP5|ka8X8&_ۘR(L 1)n"vtRp;0Oy#\2k1S`c#?vUv^w rp">_9'0&k%7e~2:_!J.p%0kKF(ҋڒ6dWqY<%Ʌ[HA{&0$$(@2K2W:!gMۥo¦]/c#fmo 0xl9x<@(^E'wA1:V!c!7WJS]u coRΗW7#o m"4J+'+EPB񮐢y5 ӼkgY|ϻ;jOG1T yKT2OȯjM>zjtvV:;+Κ\B+`|af=ꡈ~[|uUY+?XVb[)Nņ=}6th{ocg*]=ԎJےJ{B^c_J|NlmZ0w qSAQyo͝#/xdT"uRٲ&}M54~R 鹐A.z}DsVu u @TxD#A+<2WQvjtyZ"n նq4 /QJ9R߼o2vd)FeOyR*x;)D~HTpyҗ--p"Pjic0.蘵H!)LeVR`"&m1MD6TKm@ )FfU<沟J֤\HdL?1:b E `P-=73 e q\H^bFGK M9'et@5I# (ʲ Ae& e=).V U$&Obg&`f`:!FnbV:ǘ`U= v1HT 4I wbՂq&.;KqP IR:yYL%qeHb{HyT_!d/ "jFf ͡7$&rIDzh)‚ kU!u",H.e +'xʶ=\|A&$2Ys2S: NB,n!<XgmWf'N! ؈:jsGr̿&%(Jnv-GH`'}#Ey P-Q PC1pOIƖ^ D[!8X%WZV# KDjJ" ʓmD0yjcBPc_ ".:  ;?_P`oaRh/J͒}1N~b|-5(: ?;)WhUc*-j:$by\1\^Y'^ΫvE8qE>D%aZQ+=- azUp]|"6oa02]_Qi[#-bP2۬DZwI{A:qV;U"pg1ZnO"Ғ ŋt٥a=.ܺ@.:\ݹUpc}Zw^ ֆaLXN~]60(PK&4֗kȺj 2r(> 9ML%%-;ˌ t kss. JTɁu: LahN|T 뙩فknY]s[|O@%OI{9,/~q/[7&m[W]gxlEWB-0LXCh\>$-EYT|}{wA7'[EXnE̩=+a jl\}i.|x#󎯁]%_b[Ͷg)x^_yma:zYO&.^l_Elv} u`w>H X|xEMi.,NQ6?Mê#D_+OYuF,7Xxɾ1Z0:"z(I$dtۊ>"IaXGNl1w>ˇ?vm|bQtm2e.WmD'oG糏&^qԧݧ%_/+Ub>-N|Yl^|ĩ@+]!u٣.7ivαFf삇q^l02W=/Q:k򺭭6A{yQ;1 ?r7xi}i!)^t6+^?zs3YJ9ުhqE#2j\,cB-I^i,Y>OYs&TXa0VpZR)"܂[S6GMŅ#Ѫz{hJFpK4!cm(YҪ2 /ߐP1ƍPoج&R*.vJGnw_53U5VUvS)1L*W;7jr\(jxvtxXUX!lPR&DTMʧTՌަv$1 ]=8foʕ]Av;fdHΉtO [x#xu;\Dbҍ1ZSZѵٗ~T/rr MՇШs~w{Mn7DOT ƌzo6{+kg}p=yhV}_<4r] 4(d-*ˎ-QD Mw""yoI=P;,nF|Z]r S1!\N϶i|^ N_d1sݦsä:TR21J zBz8l{O00ny 5=oGI)oAHCPlB(IK)oxlK Ö>)xҧ- => T-lJF`$FJʥSA0Ιf)hp2Z")0.` ,PK ep)TXL.'@$70LHʣ1YLPʶGGIJ{ODrtK"9"p2AI8iF h x[17wBP@$\$* EtH: JDKҀs0 lL#b̑4F9R:@e;< I1B0"3 jX^EWA$XE^brMm54"J .I-SDJ1KM"LP>b@S_K҆ Z?iEn Q4wxn w5 !)FɜB[Ƃ)Z#QD[B!FN>q,,av< /m6M[ .Gi)h&PURfRZAMz4=m)e RƚIi`YJihYG0 R(r G*t *XLu1v2dH&',+3A[֚0fbE+F42%LVYM. 
`nBhY#Us( `si Rkk?4pg7f S{ZB h5rp1'I7D`<LH c-ǤQȵߖuyJBE2:mn i^-~VGΆyp S_rlԨꦥ6iIVl>M/AEFA\;c*dm$Il4ЍrxnV6{{-1gF{uu8Y[uu(ڀ-!5xpe#/&lB->--Υ j_b^^ܲsL-\Djf%J,Xـ\(jh32k?5$*<&6%]0# o2DCΊ?cUJd>"Mj6ȮIR Uф.2( qVL4I1Q`Q 4 (KI`c[T: ;K3J/< VvI.ň&ԍ]3>Pc DTk EOsdj!<]L1N2&@Nϲzh Ӡl.ɂ'eUa<Ԭ0ؚf+#YQ:[k|" .!AϦ=0Dd60.rl-k:l /œ:S)Z; Į@#0Sns: x%a)d/\ -WDHT.ިD Kn}d2# YYUrrr1N3ixe`b` ZXgQ}Xd)"A0SL9 *$:>;KˇA% 65 KS0d&*"DSR2l AQ52C9juQF &ueeo m $8˻Z(3qƉe,rQP j48h. [d%x;t;9kwW}''_Yrw_+f)u6X#?{5t{S-X1zלw\qejUvp 8O^S/,2 :-`xٳO LW b2<4Q-P&Joº DwDM4hZPlQDm/ @:p6OXQ,&  ,иyu[wVBGR.+S(3;,;{b%9Y'l6 =_vdCLނV=eh4FΚ^nNE AGnXGֈ#R6N H둌lIH!BZʖTɡGW\l!)46fNbꭤa4acEȤJmQRR5 m- ,Riʧ08NVZMF&A#aVK{aƍ ۗJ;{K8mӥUx糥1i˻Wim[* 4%[IDQJY4 ː4Aፊ$~ g$k14ɫ!Nq)8:8a;d/C&0uaىimG01X;);<;O ;yAjϯUMX=t6HJM^zeiQ=\1f:2Uur;0=3oSdWk}&ª{46$1Sf.ͯjxmQo^35ꋵ]bT吺Le\, 54ng@-"o WВm|~lC1J_FM(q(hl&K2N%CEx%j:':P,(!ʗS}3dόLu!V:Olx_mEV%߭CbF&rP+t͜~DC/| O20Vl_NzM!c;cM?00J{谑imC'~^3:N.]T9rtї;&U;JIImBv\d>V*R\$}L}ѤyS_>S$-i|w,iZΊhR[2 j벲DG+SICQfm@'64m  'Nߞ(H1S=9l):"Uh(xLNƂac"z 2aKH̞'lu6jBs8x?h@c U /pIAEƩp*d"{%S1pH8w|}7VB p `ɝ8w|$]';u tQGTAD:}! G}CLc"pB$@¦&8B'qB${K6IA\g9k"`- Ild-rad kFQ/()l$' 7Лf(K9tp#4HqhPƾ@m\Í1RZy9_^T|yQϗH` `:R%XMd\QN#OCALS ɒ&o9R8E"S_>Wsp!0xcRCpHwF kBrze :#$ZΚնFJ)5`VEp[lDN#49.i\f}(|?kĢh%l8F (aD6D%_=1F#!Z끽;o?]9fk_G-0mP'Z>KCXx?=b+Twa?j!뼸AOϕ`'A𯳿sޙl[QEǾScӈ޿B\>?7?v/<Μu<̔ ?~Ϫ {yG(! -BR8d(}JUP˝ =#j KzL;>T?dl2@7{6_nw%aK<;7Af%N|#?N.jJ)Z);?(XU*j636D0#`(Qz̊8(Y vĔ/@Y<YDIƐ 1,Nb ~V9WT J+0؂B*m2Z/={|L5h%&lw"!&i3-Z#Y".hg Gv1Ѽ!&CXg _ӑr?$gҎn#4j Il= IE"~ĵ >EtusWÛvcI,x#*iIGfS[B&bVἡVUR$2DmsAgr{|x>]omWCK6&\gF&hghGx=,Aeo6vDAɘz--kSXx k]\'(^r+!3z}XSsՏO/aTcd${c@@JPms+V\jsŻ;?w+~js87юҽw)n')i՚qv`_|eޥ$7BdS{)S fB>y QHe3dА{orڵv,$vd40{Eu3k )F/22RXD)-. }me÷{:DHҤvz TK,[? @' N9 W&h @D[~ !,&9V$ S RDb"(r?ȱ$Ĩ)}m7"Nr$^dˠz)lnyOQRWsp6FO}X{P.+X9g}$Q{ >$? 
;Z,W75.}ZuR+؋#a4::_ \\<1ٸ62KDJŢ7HId7a <q"Ŵ42/KJ%Y?@IIhz32VB4e `+=D\9lTedSC;D" ZY1=摒t5!ֹ{&I Dy*]^ݻQ7A +3N&=qS*)̖UZ.!Aa^ qy^1Gr*<۹  sP軮]Nb!#t'M" kIlɷZcAZb1>dPMH5\X=iZnh٭S߿"AB^{ Mr:KʬTn6;=n!^t?׋n^jթ>̓G',]F!AP"KÂNk7~|q06dk rC]9ַ{l\ӿޯJ_׫g~ذԞ'r;HCsҩک  Furb݆Lk[~kHօ|*S|кV[.bT')m]طuDiݺА\E7uE\Y@ .r:u~q(We|Hݪqh wGb0J(&&V*=VbaaكR[ƤtPhgX=5|YKl`Ծ\=|Sz` 1hLJiXAM%vPys`9q1(QKI&uKpVS^%LP% S2!KURXV0B `x ["U0胹j)Th5XŘ{FvIlN9Tɽ%7Zy7p*J ԙ }JΚTkZh4dr=Qg$%ׇZl!6ށ|*S~h݄A6I0a?떽hwhN{jкi|[.bT')m]7떽Xօ|*ЩH-c`dYcȘ;lF5=\7Yh7qo:3slnb>sgfc9W_6>I-V12 7ѝ;> ,7)ПzQCdzh]޷]Zr݇xeFl;. G^:{%^Pz^Hz0"MMH@()Fv(DfG~ Eq>MNI" Xo %{4_)Wn`Ň:p z0ƫ ;@ `: ჋KEB$].9=V lTR-Vkm%W3tkꕽ<%>Quҋn~tDϮޚI1[Qbv npxnuLVO"d1j\*9JO e1o-]o!0GXBS6/@{ނ-ǚ"3fYO몱Yy\U1ַ b' ]*SkS_eNFQj"*gM&`wnʦ#p !"~q:@g&Z֎kA,B2. O,/OШR*!1O{8np1A?σ~subvq=Js4c@F9m1)m-c a2DLes%;C>6o}h`5Ap/>fV~xC?pBI@GG*fG͊{:#PJ: F̸ ;|T(o5BH#ఽt^`!+gE)њ vՄ%@J\v6E2M(SXW&,E+po\bЬ5 kОWʮL: K2:i$QfݷWeI)ĆEH Ba$7Z\iRWˇ[5Q;;`;LiǨ[S|( -k ւY/J0Ga=܄]pqq6a _W<ŧqi8?Vojtg~%W˓OF&*z]CW׳ay99>DNO3//t(nFW}aSȀ֬z~loL+wg2kZ޺ȣkpGkl闰e^5]*3,s?{d)* UŃ;}Ų$sԇh6eh,`֮ȀU!ϬV a0)$Eyg#[ TZWUV|33)~pbh[?~/|w8{]F<"'b? |=rdb[,dN.9o'FYj-}Z"!EQ/j\~%֬Dr_{p(RՔCPi HakXF _1J4FxbTbM-&L#\+ha0)xscTq6&1qaS;dxƓuE+n1kSWm(^eNFas]ޫlZɆ)^<M@9(򠿌;!x%WWL>Z:on_F*Vk# yĔbRȂqT ${!B+r Y\W\ãഔ1?*h$0s2:Q (#T2/sy cTpD*m&q#D1!Rya;LvRO +d}܍)xra<ЩgtVF95 te:B MMv?rk<^7j+>fCfno\!5su0_Z>ܹ_U˫FKt Kk]׳CT\}zwKt<u pJҤʧ܆m*Q+r[ WC]m$!:SB_Y7bu* }GG+TNg݆GeZ1 L~sּޣu* }G]~26<%Ӻ?8D{c k)wZ'l)'2?A"0Ӵ w <+ɠ4G*1[`":e➱hj 1Pyl)>fScfSuiF 1P-jus2㠩. tR, JuQA#FAiUש>FRңF)PJpRר>oQ'9JCI-:5,xB12W|ɖA)eTGR\ &in:mjk5feQR@H? ,s?1 UTN,S%D}? U"|!JT8LHoEԈGwe[-[_ }S"Y7*Q? 
7k2s6gY0C+ _uk4Tur@Grnk`\rom 6%t [8H5H{B*.($~3%u( F ^K_$աݯ Qa"m/އ]TT>_kU Y`@FTX.P2RJ[ɷnC9{Qv"ڴө6Ե7Zg /Om˛<͵k&j$uZFT 5NkYo]:-\c#b2Uj<5fJH-NK=Z)hxMױc7 4:F]'n##Tc)rn2M$38Ã<1Lo\]7nd5?ߝ êź,ٿR{6PX-.]xuv~P_lE_$[Ѵ낌$$JDž3gHі %J`iL{*/nVuz:Fl&/9uSh^/ T܈Q7O⒡cDv 2/,U %RYR_zK"cJ07e 2~E* YgvlQ{kfrsh"8`um@pAAxpʁ.EC(7FV\)GHEE#Sm˒Ʀ!Xf=IB {q6M(z/I Gt$5 /v#s~Xu,AF%f͚i\.?=ܵ f2sLN2)ݎv` 3^J>I3^0!7!v}`8aD.93;ߖbl5z۲cm5q:W;d ǰn&F$DSf0wØ3P&+sf=,.G_VT p5ޠըOFYj FX Z ^Iv45/_opte'*46*( rQT.0'T񺶷ҧmR0Q}ަ:u*?QAmuzר>oSYD GKs:\񁘍bt%W*2W8ƬyZ6jޟճn~UIGޠ1;^80)ɤ:i+S1LU/UMQR&G5՝raaHYԠd")?)T'5JP uB`x(CiMu GRaP*Ꝫp(&5՜-QJR_#Kr(%C)k8J5J!vULi:@>?7o 'ĚZը=!:&w\07*#cU ajh1맆Ѵfp+`6F^=Lh.[t%VHqMD:noҍhM͚ ]h R_|?!)WE@VZ.hif[f4a9Sm Ҁ2IGtd$4awS-^!ƛm1JmQ.6<`>dAo`2rP\ 6^q=WsTVO V"%n )RPz6!le luYhݾCPk\i+/-,A pt6gQ> *rSWm)Jf ՇB"H2步ݗ"Ry3댼y+2FQGțtt$ WKXqF\: -Mv7A5o9LVh B3YF,~A֢ Jq3YFpQk#NaG3*|e@4jAB*NOx1'2%3RTL2WATLJ!3}J9n̓Q#5}TLŜ E/mYZ8^ z+9b9RK2q%!"Ni+Ltuͽm3SRIy%kjIU`_ (8q``>6Hž쁁;g" GD{JAr$`#fMJž|埋F+uYZ,m=5&]t[Z ls[" Q :夛^T7Nb4yEq{9E/pr'>/>/>/>/lv4q6Ud3=/y,)9XTJRZxT7ZgwA(?fKg߅6D&Q%Oc5?ף]?!rY.CU/_ʓvYLI&i[ЇyyTcZj*{3ҸIe?(!\Yh) JJ(IWH`QL@A@A ),C* ~-4)ZF#"AuE%+M=湄="3]W!GCL:+kpvQB:iTDJ0~)'KN1A#W=nB$i`1/ (BtᴳVbHy g@sqۻqiX4]HR2ܽu>,-T6zϥ*,"ЫB]rAde4!>H,d΃_%zB _r dòG(tre4KQ9:S9z40QAmX'Û6l ?jGZ9Ynh/@xJoKْypAIlDMi=ՆFvp|PF;X 8erNt8ikJ]1gUj룵U@͘gqL{UAYn*1bftBl>T nx|;w Ou$nxB3Q/o QZT-ZYqFrѶHfdK 'LVH}iBkϖ@4\zÌ,Ҹ,$,H _xǓ4cʛy]l$BLxQ?qκ(K5_?~א[wS_ >`zo1|o1|#,i#|:ݵD: =5ACBvì1r_xv;οwJM@M4˦Zw3z\ bL')mu&d-?һ a!/Dlsǻ1x\ bL')m]i"MMI2[٧(>E1OۧHGvқ1t:L? )Ό!o;.;=w? 4e1:Se vܹ;ñ*k1TרuIN;xp9 =Sߓ@ԭ1df.gpB#*r(j% çj)OWj^$A`[d $@A&ʫ_eR˒ X\_x),ȜX pY~L[H:<'r6hnYb^ׇ!' 
Ci"zHr/a꬛^Q,: l~ l \ϋSiۚln&/آ1l; cusܺkO;|]w>ߵs D]Q* )RCRiQXU1)F  9=M6T.7d'7*6YBnfϥ~y>xA3+)3;@5*3MUVL ՉQhlFU M Bk$K[R+G+Sk]Jg*r!CՉs} g׎d$r*@qd d3,ZK+ZFO btӛ8׸I plZAȒ4*\@6dII KЯms8\@TYD0Te+389@`3E`e݌@/nhVbqK)_fH*ü<@ dG:Ks*Vj*V"Z)F#Á䟕ʩ i~UVDž} Iߛ7R+o_ RѺN~}9-{Y8.mڑxi.yG4)EZ)Զ}m栉xΑ(i[KώRсw<uqөw(Ew*"GJq2:l&琒CޜT=j`=S F.?;%5EV Hi:MzHP0&=^^ŁCay;SS5eɥBץq 6鄑1kGJݥvLQyNFde>w EJuYu.Jcu~gq0rR:jFq\>LNsBv <0 K{﮾vp'Dm>uOZW_nm;#&g{MapQ9jRTI^J^-TTZX]7mC$HF#h}\7 Z/ZO5Jy8%A mpg'"1+gЬ+E1Oi⠲\ /*WdW@9X' FDg]`T]ph3w[m>_}~b9>H x_ ?@hX'sc~ epYo|hho_=^K1IDzBHpc*nn8G+2dB[$[șۯ7`OYa`4)q| WhAzJ>x>* rț{Є7G-ֽH,A}; vrQꮝ\̇=1?-'7M9֟\)Io 4iHtJ8g>ocZ/A@s:{ C,a$$]ifq6gkJVT9&Btw}(nl$D>j|]ChM忮>kzfcu:g{vqS|#liCz 5O396Ql]M=718޵ȵ6gY 7,sd׻C3z\ bL')mՁשy̒H6pͲVSc]qx7ew trߑE(j5ۻ姄Gz!,䅛hon:+& 0%r6#vO$"s[A.ǖWcsǸcJG^2jU:gcRّ[I6S,^s.JqLs)` !KɧO08[ʳt VL(PgN>]Vr\j馿(EK5RaW%3Sּ9*i0OUe3m}. 3_ƴf{|~J3͎'BE|dͳ_JIh&=̊S8&=^^mVj\A .F (vD 4+d-kfL¢ !c\G @6AmWDaTK6`HO6(k92ԕQ1.%e !*GJQH) aY*i?^TҚv`"ZiD=lei1Ɣ*Rk * *Rȩ0,%9Ix1͎HQV1z];KW$֍AHT# .t\2%_J zI0@ϯûCȵ1/xr!<=y)$J5{'l3ycSC%d:}HR7ڠ60IJj~y$6˘(bӫEd6 {=zcД^mgՏjגb>\j{^g"AG5) GsW[MTes' 6UHG" zDݨ ec"h\o}5Z0 ;ͨ?gR nn@ǁE\2 JXmJpHIMOu J~ akmO 蠘nL|3nCmP:CevJ>鰽[fncD7bJ8nS݆ڠ$t~6*%f͹ g1"VLQaLQF췁跲]}跒7@Q!1DZ*6g5#`~ۯ+V\~ xo_*p_ާ :i.b{fcmi5:~i߂}Y)nCl`5 ʽ?|뜴/wnC]}δwKXL* nH4yNPZ | '}(Xʇ[lE|ax$ 5@VWiY)4& Ea,2)e\fXdF^%޸<:(6v}R_L" ei̥nIDso&JjntB1.(%pv:UWIxHA )\}dZRSaB9Bآ:e%nGZ}Di~.g|OamKw+RXRᚍR`eQ K*\szsRˢT%$R_gEs(U$tmӋ%J:JIE)er('u[je $@ Ӑߡ#Z?lA!3n%$jLf\G⊶n%9V` hf2LsP.i\4 IJ(@cNoUpa~v[Gi5`T5#>~XZH@ ^⮲!`XGVr8HpY~s}.G0Tz!đ b 1>yu#lb /MѱѫOoWKb iFäp(ek3{!Ġ $,j(t靲y :&Ӕ]T{H 2$ʤ&4A"Q6R+.u9-As pQ_ԹN;J)R3>H'ܔERjywQoĚz#N᮴R_f.4R(EQDnK]H ʢp$TY.Mr%J/JEK0RLD꾮Z۰zNvDmt3xRA jQoP@5s@^hƻ: +$D&4@N1NDb>w%QXiBbueUȥ$Eԁ5Œ WƢ\|GBc|7| ::D A(&@NwΓՒ,x`"P(P+ἮO7VHZxØiJQ\B x'qf*cK^C'5sCzY"zYr{U{7]M>PB6g;Ј)`w̅TQ6=.f[K&׀FOGob5lj\So.ެzɣB)ԛIKM'!%אqF\:Q1yT`) g^:BUh!86QotۏҢbz3\`Ԋz#8+ BriPpגrzSԛ)T~ hjVv(yF:(:,XfXtZ|T%#Ҩ%EL޷629A3N id,S(X-HcR]JB; iI{W4F0D6Rm6B w2h^K0#>9Zyc?Vi{x篏 pi`F80y w} (MG;0&(fHS-%q yl[|h(i8?">PfZ|_AʀU 
jꉈʪA؜*s9kIs-2^M>,*o| Ѩa\e棃[cR.CkD#R&$P#C.=M;璘 jt? r׳dB' 7hiCg%Qw1820(-т mWlzk ZURk]&h YPAA߃ +1sNe{eyF8zb'SXG%gB" LLon-WONw4ԕXY߇Ws>rj$Mx4ϰ;Ab!ۉ[|jW-xբwU7?YC8n(#yLmOMsɱ} B1Z|n.(hN536骫.Ζ_Y1itTƙbh Qh @R׺{0 a j7H9!l^V$l]ƪ?ֿ=D?# *$%hG&%P&P1.JSI]9JbϵOJ(yVR"s %S=nJ4X)0]QAuqRqU#sua=bd 6 ru78SuKǻ?fBZZC}ys=?م ĠϯcHJFۮuV{uZ}1 =>h++JV:4gYkԶ{$.\I L/2SA:/>#Jco(w3C 9dcFv̯We,g >$Yj"gOzT<թ8~}˲DZ%?/dD캹W2tJ-TimMH.:-pYID;2u{,]eFVde?SR kbY"XrP[dׂ}Tm1a"  lsMz옲;b60]+髃߽B`r|F,o|z}]65.ncNE#ڔ),<&- qrdDj49Ɖ`)i*АU@g+r,j6ZF#:ځVT,DS!sJAN<Ռfc:$o\.v9Hww2c/կ_m7qYvvTr>?W\[[u4䍶]N携 |>7 X% :x]cP sƩ,B|օ.V/@\VML(U!W^GtBzRZِTLƼᠿ6O wh)=4=vS @x__3]dלPXɗ/)ػHrW LՐ+@5ƾ٭MFRcwn$e]LVU͌+#H]ei%U@{䋯:9ܜO=' ;|W\UIغwgqqRZwu8=xݏ?l釾Cǵx_˳[wu}j56d_Y[\o[6%{o=b/z򧼤x~'a'k7/&69C5u ~;MQ\r]&t_/nic.z.4QYk:|xb-Iyٹ%gH7SP*mh}RAY7\||Ey ż|Ny{MNYt]|cяustN2]w}WGa2껪(*+ w5jp뻪P% 38#g*2u"g*>eT]C\+ɕjX'yV5|S-ew]TkBuf $-.Ռv^ϴZGvyu<_9Q;ی? ϯ\G>u9?o\ pg_w&_o"fǝü|/ZS. 7R[t ~UZ1>t_׹_vmqfV)"T{t^[ڔ:Mv` y&bSG}惡wkA>ƻM::E=wk*nCX+7DŹ8gjޭU%6.w7O MtǦ +E]r#3T$\7zcrW5S"쬆XԼsx;L exP3!&zʕ8=Jq%ьo]I|<kUtc啴* V,||EPȴbgdqOBni-y\Hf)ʑ&g騟^'ھ馟n#/75eqSCZtq̌5̨(d,5UѨaׄgK5œb7s#jT"LR6V1,/o+B \J-,$i@+j{H’H%w8XJe~;;cށFpC"ĴpvKްӯ`;B@%s,Eߑ\}.Q ,p}{. ^mz;*IʭA:y'lMAmNM#TtHrⷕ2z)Y$)*2,0LE3CPY-hfTH9G>sQ%07m<%S&.tS8q03'%Cf=hBqzXW8B4,oJ'i{7<!$USoi!yN)vKEY{bt] jD +!r/dJ BnbZ͑(ODT.EO2|\e9"=kM2TYpTay]Da6&zme^ 4Fu8F5Fnb6c/C WRH$4RPSbTP>GOJY"L *#+#KbS9fƏ;:7#^(Cgs}0AX=N# 9H)D"FhBĨ͏p3@rE7g KWYJMUk`LʐWO|!ziYpIJRܼ$` !:y1 dFp>(44wPX乀kx *h|ڶJ:#0[Tl_`KJS!hYwٹ]1_sB{=G~LpR=C+ #j9mYx]<m@(u7Nգk0tTiB2 k}ԨȩwT~20(AKUg5r}y,QQ)E^@b:v'=cz!E;nU}ۃl$Xm#zew8C44S-?n*:%\WI?ը{c+K>Q݅QEÍϨŞ.>Lݼ*B^6ٔ䳏|Mg*i}Fwp5 X⩽[{)ZwB^6ٔٯ8F&n2(1gx wI|<\B6rݱ)QhSb#` MeGϏMh* MH܄ hhV YvOP4Eԅz: ૄ,e5xAʆ]ݨ(n,~Mvuvժ3Umqv)7H_+FfJLG"JiaZZ+JigA0Hȉ] ]W<v}@i`q'B&Zj X@ɫ%<X:}F; |?/ײb>E(\zĴp&(G՝4@AyQo:#SD=7Mz{@YCh!t)VqX0(dɤ8EH+jQ"D2\[6h <[&L2v{`J+#[f0p[43uB,HT (>5 (v"(HbT̳!*o?6LZH)mb7s׎DŒD@grVx'e>9:}z B, ]_}V0J.. 
Mar 10 21:50:11 crc systemd[1]: Starting Kubernetes Kubelet...
Mar 10 21:50:11 crc restorecon[4757]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 10
21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726 Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499 Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499 Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499 Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009 Mar 10 21:50:11 crc restorecon[4757]: 
/var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726 Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Mar 10 21:50:11 crc restorecon[4757]: 
/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c138,c778 Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c138,c778 Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991 Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c476,c820 Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818 Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991 Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820 Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818 Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991 Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Mar 10 21:50:11 crc restorecon[4757]: 
/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850 Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c440,c975 Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850 Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 10 21:50:11 crc restorecon[4757]: 
/var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 10 21:50:11 crc restorecon[4757]: 
/var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 10 21:50:11 crc restorecon[4757]: 
/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 10 21:50:11 crc restorecon[4757]: 
/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 10 21:50:11 crc restorecon[4757]: 
/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 10 21:50:11 crc restorecon[4757]: 
/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 10 21:50:11 crc restorecon[4757]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 10 21:50:11 crc restorecon[4757]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 10 21:50:11 crc restorecon[4757]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c268,c620 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 10 21:50:12 crc 
restorecon[4757]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 10 21:50:12 crc restorecon[4757]: 
/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12
crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Mar 10 21:50:12 crc restorecon[4757]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 10 21:50:12 crc restorecon[4757]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 10 21:50:12 crc restorecon[4757]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 21:50:12 crc restorecon[4757]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc 
restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 10 21:50:12 crc restorecon[4757]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 10 21:50:12 crc restorecon[4757]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 10 21:50:12 crc restorecon[4757]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 10 21:50:12 crc restorecon[4757]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 10 21:50:12 crc restorecon[4757]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 
crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc 
restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc 
restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 21:50:12 crc restorecon[4757]:
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 21:50:12 crc restorecon[4757]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 21:50:12 crc 
restorecon[4757]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 21:50:12 crc restorecon[4757]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 10 21:50:12 crc restorecon[4757]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 10 21:50:12 crc restorecon[4757]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Mar 10 21:50:13 crc kubenswrapper[4919]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 10 21:50:13 crc kubenswrapper[4919]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Mar 10 21:50:13 crc kubenswrapper[4919]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 10 21:50:13 crc kubenswrapper[4919]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Mar 10 21:50:13 crc kubenswrapper[4919]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI.
Mar 10 21:50:13 crc kubenswrapper[4919]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.197538 4919 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.208926 4919 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.208961 4919 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.208971 4919 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.208980 4919 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.208989 4919 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.208999 4919 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.209008 4919 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.209019 4919 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.209030 4919 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.209038 4919 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.209049 4919 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.209058 4919 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.209066 4919 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.209075 4919 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.209083 4919 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.209092 4919 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.209100 4919 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.209107 4919 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.209115 4919 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.209122 4919 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.209131 4919 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.209139 4919 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.209147 4919 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.209156 4919 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.209164 4919 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.209172 4919 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.209186 4919 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.209197 4919 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.209207 4919 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.209215 4919 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.209225 4919 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.209236 4919 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.209245 4919 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.209253 4919 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.209261 4919 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.209270 4919 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.209278 4919 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.209286 4919 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.209295 4919 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.209304 4919 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.209313 4919 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.209321 4919 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.209329 4919 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.209337 4919 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.209345 4919 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.209352 4919 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.209360 4919 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.209368 4919 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.209378 4919 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.209386 4919 feature_gate.go:330] unrecognized feature gate: Example
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.209418 4919 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.209427 4919 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.209435 4919 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.209443 4919 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.209451 4919 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.209459 4919 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.209467 4919 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.209477 4919 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.209485 4919 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.209493 4919 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.209500 4919 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.209508 4919 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.209516 4919 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.209523 4919 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.209531 4919 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.209539 4919 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.209546 4919 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.209554 4919 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.209562 4919 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.209569 4919 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.209577 4919 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.210459 4919 flags.go:64] FLAG: --address="0.0.0.0"
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.210480 4919 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.210496 4919 flags.go:64] FLAG: --anonymous-auth="true"
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.210507 4919 flags.go:64] FLAG: 
--application-metrics-count-limit="100" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.210518 4919 flags.go:64] FLAG: --authentication-token-webhook="false" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.210527 4919 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.210539 4919 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.210551 4919 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.210561 4919 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.210570 4919 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.210580 4919 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.210591 4919 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.210601 4919 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.210610 4919 flags.go:64] FLAG: --cgroup-root="" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.210619 4919 flags.go:64] FLAG: --cgroups-per-qos="true" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.210628 4919 flags.go:64] FLAG: --client-ca-file="" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.210637 4919 flags.go:64] FLAG: --cloud-config="" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.210646 4919 flags.go:64] FLAG: --cloud-provider="" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.210654 4919 flags.go:64] FLAG: --cluster-dns="[]" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.210664 4919 flags.go:64] FLAG: --cluster-domain="" Mar 10 21:50:13 crc 
kubenswrapper[4919]: I0310 21:50:13.210673 4919 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.210682 4919 flags.go:64] FLAG: --config-dir="" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.210691 4919 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.210702 4919 flags.go:64] FLAG: --container-log-max-files="5" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.210713 4919 flags.go:64] FLAG: --container-log-max-size="10Mi" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.210721 4919 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.210731 4919 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.210742 4919 flags.go:64] FLAG: --containerd-namespace="k8s.io" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.210751 4919 flags.go:64] FLAG: --contention-profiling="false" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.210760 4919 flags.go:64] FLAG: --cpu-cfs-quota="true" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.210769 4919 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.210778 4919 flags.go:64] FLAG: --cpu-manager-policy="none" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.210787 4919 flags.go:64] FLAG: --cpu-manager-policy-options="" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.210798 4919 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.210807 4919 flags.go:64] FLAG: --enable-controller-attach-detach="true" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.210816 4919 flags.go:64] FLAG: --enable-debugging-handlers="true" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.210825 
4919 flags.go:64] FLAG: --enable-load-reader="false" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.210834 4919 flags.go:64] FLAG: --enable-server="true" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.210843 4919 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.210855 4919 flags.go:64] FLAG: --event-burst="100" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.210864 4919 flags.go:64] FLAG: --event-qps="50" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.210873 4919 flags.go:64] FLAG: --event-storage-age-limit="default=0" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.210882 4919 flags.go:64] FLAG: --event-storage-event-limit="default=0" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.210891 4919 flags.go:64] FLAG: --eviction-hard="" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.210902 4919 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.210911 4919 flags.go:64] FLAG: --eviction-minimum-reclaim="" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.210920 4919 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.210930 4919 flags.go:64] FLAG: --eviction-soft="" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.210939 4919 flags.go:64] FLAG: --eviction-soft-grace-period="" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.210948 4919 flags.go:64] FLAG: --exit-on-lock-contention="false" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.210957 4919 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.210966 4919 flags.go:64] FLAG: --experimental-mounter-path="" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.210974 4919 flags.go:64] FLAG: --fail-cgroupv1="false" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 
21:50:13.210983 4919 flags.go:64] FLAG: --fail-swap-on="true" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.210992 4919 flags.go:64] FLAG: --feature-gates="" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.211003 4919 flags.go:64] FLAG: --file-check-frequency="20s" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.211012 4919 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.211021 4919 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.211031 4919 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.211046 4919 flags.go:64] FLAG: --healthz-port="10248" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.211055 4919 flags.go:64] FLAG: --help="false" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.211064 4919 flags.go:64] FLAG: --hostname-override="" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.211073 4919 flags.go:64] FLAG: --housekeeping-interval="10s" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.211082 4919 flags.go:64] FLAG: --http-check-frequency="20s" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.211091 4919 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.211100 4919 flags.go:64] FLAG: --image-credential-provider-config="" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.211108 4919 flags.go:64] FLAG: --image-gc-high-threshold="85" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.211117 4919 flags.go:64] FLAG: --image-gc-low-threshold="80" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.211127 4919 flags.go:64] FLAG: --image-service-endpoint="" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.211136 4919 flags.go:64] FLAG: --kernel-memcg-notification="false" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.211146 
4919 flags.go:64] FLAG: --kube-api-burst="100" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.211155 4919 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.211164 4919 flags.go:64] FLAG: --kube-api-qps="50" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.211173 4919 flags.go:64] FLAG: --kube-reserved="" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.211181 4919 flags.go:64] FLAG: --kube-reserved-cgroup="" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.211191 4919 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.211200 4919 flags.go:64] FLAG: --kubelet-cgroups="" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.211209 4919 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.211218 4919 flags.go:64] FLAG: --lock-file="" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.211226 4919 flags.go:64] FLAG: --log-cadvisor-usage="false" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.211235 4919 flags.go:64] FLAG: --log-flush-frequency="5s" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.211244 4919 flags.go:64] FLAG: --log-json-info-buffer-size="0" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.211266 4919 flags.go:64] FLAG: --log-json-split-stream="false" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.211276 4919 flags.go:64] FLAG: --log-text-info-buffer-size="0" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.211284 4919 flags.go:64] FLAG: --log-text-split-stream="false" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.211294 4919 flags.go:64] FLAG: --logging-format="text" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.211302 4919 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Mar 10 21:50:13 crc kubenswrapper[4919]: 
I0310 21:50:13.211312 4919 flags.go:64] FLAG: --make-iptables-util-chains="true" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.211321 4919 flags.go:64] FLAG: --manifest-url="" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.211329 4919 flags.go:64] FLAG: --manifest-url-header="" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.211341 4919 flags.go:64] FLAG: --max-housekeeping-interval="15s" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.211354 4919 flags.go:64] FLAG: --max-open-files="1000000" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.211365 4919 flags.go:64] FLAG: --max-pods="110" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.211374 4919 flags.go:64] FLAG: --maximum-dead-containers="-1" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.211383 4919 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.211418 4919 flags.go:64] FLAG: --memory-manager-policy="None" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.211428 4919 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.211437 4919 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.211446 4919 flags.go:64] FLAG: --node-ip="192.168.126.11" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.211456 4919 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.211475 4919 flags.go:64] FLAG: --node-status-max-images="50" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.211484 4919 flags.go:64] FLAG: --node-status-update-frequency="10s" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.211493 4919 flags.go:64] FLAG: --oom-score-adj="-999" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.211502 4919 
flags.go:64] FLAG: --pod-cidr="" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.211511 4919 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.211525 4919 flags.go:64] FLAG: --pod-manifest-path="" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.211534 4919 flags.go:64] FLAG: --pod-max-pids="-1" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.211543 4919 flags.go:64] FLAG: --pods-per-core="0" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.211581 4919 flags.go:64] FLAG: --port="10250" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.211590 4919 flags.go:64] FLAG: --protect-kernel-defaults="false" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.211599 4919 flags.go:64] FLAG: --provider-id="" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.211609 4919 flags.go:64] FLAG: --qos-reserved="" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.211617 4919 flags.go:64] FLAG: --read-only-port="10255" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.211626 4919 flags.go:64] FLAG: --register-node="true" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.211635 4919 flags.go:64] FLAG: --register-schedulable="true" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.211644 4919 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.211659 4919 flags.go:64] FLAG: --registry-burst="10" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.211667 4919 flags.go:64] FLAG: --registry-qps="5" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.211676 4919 flags.go:64] FLAG: --reserved-cpus="" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.211686 4919 flags.go:64] FLAG: --reserved-memory="" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 
21:50:13.211697 4919 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.211706 4919 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.211715 4919 flags.go:64] FLAG: --rotate-certificates="false" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.211727 4919 flags.go:64] FLAG: --rotate-server-certificates="false" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.211736 4919 flags.go:64] FLAG: --runonce="false" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.211745 4919 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.211754 4919 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.211763 4919 flags.go:64] FLAG: --seccomp-default="false" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.211772 4919 flags.go:64] FLAG: --serialize-image-pulls="true" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.211781 4919 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.211790 4919 flags.go:64] FLAG: --storage-driver-db="cadvisor" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.211799 4919 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.211808 4919 flags.go:64] FLAG: --storage-driver-password="root" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.211817 4919 flags.go:64] FLAG: --storage-driver-secure="false" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.211826 4919 flags.go:64] FLAG: --storage-driver-table="stats" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.211834 4919 flags.go:64] FLAG: --storage-driver-user="root" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.211843 4919 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" 
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.211853 4919 flags.go:64] FLAG: --sync-frequency="1m0s" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.211862 4919 flags.go:64] FLAG: --system-cgroups="" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.211871 4919 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.211884 4919 flags.go:64] FLAG: --system-reserved-cgroup="" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.211893 4919 flags.go:64] FLAG: --tls-cert-file="" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.211902 4919 flags.go:64] FLAG: --tls-cipher-suites="[]" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.211913 4919 flags.go:64] FLAG: --tls-min-version="" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.211922 4919 flags.go:64] FLAG: --tls-private-key-file="" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.211931 4919 flags.go:64] FLAG: --topology-manager-policy="none" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.211940 4919 flags.go:64] FLAG: --topology-manager-policy-options="" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.211949 4919 flags.go:64] FLAG: --topology-manager-scope="container" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.211958 4919 flags.go:64] FLAG: --v="2" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.211974 4919 flags.go:64] FLAG: --version="false" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.211985 4919 flags.go:64] FLAG: --vmodule="" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.211996 4919 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.212005 4919 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.212240 4919 feature_gate.go:330] unrecognized feature gate: 
NodeDisruptionPolicy Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.212252 4919 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.212264 4919 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.212274 4919 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.212283 4919 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.212292 4919 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.212300 4919 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.212308 4919 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.212316 4919 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.212324 4919 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.212332 4919 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.212339 4919 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.212347 4919 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.212355 4919 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.212365 4919 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.212376 4919 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.212385 4919 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.212421 4919 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.212430 4919 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.212438 4919 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.212446 4919 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.212455 4919 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.212464 4919 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.212472 4919 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.212481 4919 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.212489 4919 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.212499 4919 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.212508 4919 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.212516 4919 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 
21:50:13.212525 4919 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.212533 4919 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.212544 4919 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.212553 4919 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.212562 4919 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.212572 4919 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.212580 4919 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.212589 4919 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.212598 4919 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.212607 4919 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.212615 4919 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.212623 4919 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.212631 4919 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.212639 4919 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 10 21:50:13 crc 
kubenswrapper[4919]: W0310 21:50:13.212647 4919 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.212655 4919 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.212662 4919 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.212670 4919 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.212678 4919 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.212685 4919 feature_gate.go:330] unrecognized feature gate: Example Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.212693 4919 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.212703 4919 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.212713 4919 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.212724 4919 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.212732 4919 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.212740 4919 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.212749 4919 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.212757 4919 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.212766 4919 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.212775 4919 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.212783 4919 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.212792 4919 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.212801 4919 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.212809 4919 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.212818 4919 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.212826 4919 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.212842 4919 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.212850 4919 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.212858 4919 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.212866 4919 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.212874 4919 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.212881 4919 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.212894 4919 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.224638 4919 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.224669 4919 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.224801 4919 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.224815 4919 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.224824 4919 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.224835 4919 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.224844 4919 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.224855 4919 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.224865 4919 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.224875 4919 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.224883 4919 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.224893 4919 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.224903 4919 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.224911 4919 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.224920 4919 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.224928 4919 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.224936 4919 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.224944 4919 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.224952 4919 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.224959 4919 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.224967 4919 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.224975 4919 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.224983 4919 feature_gate.go:330] unrecognized feature gate: Example
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.224991 4919 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.224999 4919 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.225007 4919 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.225016 4919 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.225023 4919 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.225031 4919 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.225041 4919 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.225051 4919 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.225062 4919 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.225072 4919 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.225085 4919 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.225098 4919 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.225110 4919 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.225123 4919 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.225135 4919 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.225146 4919 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.225156 4919 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.225168 4919 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.225185 4919 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.225196 4919 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.225207 4919 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.225217 4919 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.225225 4919 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.225234 4919 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.225242 4919 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.225250 4919 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.225258 4919 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.225265 4919 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.225273 4919 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.225280 4919 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.225288 4919 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.225296 4919 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.225303 4919 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.225311 4919 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.225319 4919 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.225330 4919 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.225339 4919 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.225348 4919 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.225356 4919 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.225363 4919 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.225372 4919 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.225382 4919 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.225427 4919 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.225437 4919 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.225445 4919 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.225453 4919 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.225461 4919 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.225469 4919 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.225478 4919 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.225487 4919 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.225500 4919 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.225741 4919 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.225753 4919 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.225761 4919 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.225771 4919 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.225784 4919 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.225792 4919 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.225800 4919 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.225807 4919 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.225815 4919 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.225823 4919 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.225831 4919 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.225839 4919 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.225847 4919 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.225855 4919 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.225863 4919 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.225874 4919 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.225883 4919 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.225892 4919 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.225902 4919 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.225914 4919 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.225925 4919 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.225936 4919 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.225946 4919 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.225953 4919 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.225961 4919 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.225970 4919 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.225980 4919 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.225990 4919 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.226000 4919 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.226009 4919 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.226018 4919 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.226025 4919 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.226033 4919 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.226041 4919 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.226050 4919 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.226059 4919 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.226067 4919 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.226075 4919 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.226082 4919 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.226090 4919 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.226100 4919 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.226110 4919 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.226120 4919 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.226130 4919 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.226140 4919 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.226150 4919 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.226163 4919 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.226171 4919 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.226179 4919 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.226187 4919 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.226195 4919 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.226203 4919 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.226210 4919 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.226218 4919 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.226226 4919 feature_gate.go:330] unrecognized feature gate: Example
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.226233 4919 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.226242 4919 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.226249 4919 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.226257 4919 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.226265 4919 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.226272 4919 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.226281 4919 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.226288 4919 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.226296 4919 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.226303 4919 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.226311 4919 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.226319 4919 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.226327 4919 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.226336 4919 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.226343 4919 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.226355 4919 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.226366 4919 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.226601 4919 server.go:940] "Client rotation is on, will bootstrap in background"
Mar 10 21:50:13 crc kubenswrapper[4919]: E0310 21:50:13.231271 4919 bootstrap.go:266] "Unhandled Error" err="part of the existing bootstrap client certificate in /var/lib/kubelet/kubeconfig is expired: 2026-02-24 05:52:08 +0000 UTC" logger="UnhandledError"
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.235840 4919 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.235983 4919 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.240267 4919 server.go:997] "Starting client certificate rotation"
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.240312 4919 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.240516 4919 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.269500 4919 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.272717 4919 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 10 21:50:13 crc kubenswrapper[4919]: E0310 21:50:13.273480 4919 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.80:6443: connect: connection refused" logger="UnhandledError"
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.292720 4919 log.go:25] "Validated CRI v1 runtime API"
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.337778 4919 log.go:25] "Validated CRI v1 image API"
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.339892 4919 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.345304 4919 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-03-10-21-45-25-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3]
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.345347 4919 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}]
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.372743 4919 manager.go:217] Machine: {Timestamp:2026-03-10 21:50:13.36916536 +0000 UTC m=+0.611046038 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2799998 MemoryCapacity:33654124544 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:eb24d1fd-ecd7-423c-90f7-cacacceb5386 BootID:c22d31cd-a51d-4524-bb69-0b454ae09e98 Filesystems:[{Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365408768 Type:vfs Inodes:821633 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108169 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827060224 Type:vfs Inodes:4108169 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:c6:a4:29 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:c6:a4:29 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:96:2f:22 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:98:0f:9f Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:26:da:02 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:5b:17:60 Speed:-1 Mtu:1496} {Name:ens7.23 MacAddress:52:54:00:31:b7:94 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:26:12:89:a0:e8:05 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:02:1e:eb:f5:39:1c Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654124544 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.373109 4919 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.373266 4919 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.374899 4919 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.375234 4919 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.375284 4919 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.375645 4919 topology_manager.go:138] "Creating topology manager with none policy"
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.375665 4919 container_manager_linux.go:303] "Creating device plugin manager"
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.376364 4919 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.376439 4919 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.376647 4919 state_mem.go:36] "Initialized new in-memory state store"
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.376793 4919 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.381534 4919 kubelet.go:418] "Attempting to sync node with API server"
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.381572 4919 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.381598 4919 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.381619 4919 kubelet.go:324] "Adding apiserver pod source"
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.381636 4919 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.387093 4919 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.388617 4919 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.389224 4919 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.80:6443: connect: connection refused
Mar 10 21:50:13 crc kubenswrapper[4919]: E0310 21:50:13.389345 4919 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.80:6443: connect: connection refused" logger="UnhandledError"
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.389452 4919 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.80:6443: connect: connection refused
Mar 10 21:50:13 crc kubenswrapper[4919]: E0310 21:50:13.389522 4919 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.80:6443: connect: connection refused" logger="UnhandledError"
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.391735 4919 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Mar 10 21:50:13 crc 
kubenswrapper[4919]: I0310 21:50:13.393502 4919 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.393547 4919 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.393564 4919 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.393579 4919 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.393603 4919 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.393617 4919 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.393632 4919 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.393659 4919 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.393681 4919 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.393704 4919 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.393730 4919 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.393750 4919 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.395809 4919 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.396707 4919 server.go:1280] "Started kubelet" Mar 10 21:50:13 crc systemd[1]: Started Kubernetes Kubelet. 
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.401311 4919 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.401466 4919 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.402055 4919 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.80:6443: connect: connection refused Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.402556 4919 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.405678 4919 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.405743 4919 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.405824 4919 volume_manager.go:287] "The desired_state_of_world populator starts" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.405849 4919 volume_manager.go:289] "Starting Kubelet Volume Manager" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.406053 4919 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Mar 10 21:50:13 crc kubenswrapper[4919]: E0310 21:50:13.406513 4919 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.407248 4919 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.80:6443: connect: connection refused Mar 10 21:50:13 crc 
kubenswrapper[4919]: E0310 21:50:13.407374 4919 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.80:6443: connect: connection refused" logger="UnhandledError" Mar 10 21:50:13 crc kubenswrapper[4919]: E0310 21:50:13.407942 4919 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.80:6443: connect: connection refused" interval="200ms" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.408457 4919 factory.go:55] Registering systemd factory Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.408492 4919 factory.go:221] Registration of the systemd container factory successfully Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.418810 4919 server.go:460] "Adding debug handlers to kubelet server" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.419591 4919 factory.go:153] Registering CRI-O factory Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.419639 4919 factory.go:221] Registration of the crio container factory successfully Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.419775 4919 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.419807 4919 factory.go:103] Registering Raw factory Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.419832 4919 manager.go:1196] Started watching for new ooms in manager Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.420729 4919 manager.go:319] Starting recovery of all containers 
Mar 10 21:50:13 crc kubenswrapper[4919]: E0310 21:50:13.419011 4919 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.80:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.189b9948228b0d69 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 21:50:13.396655465 +0000 UTC m=+0.638536113,LastTimestamp:2026-03-10 21:50:13.396655465 +0000 UTC m=+0.638536113,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.437145 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.437204 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.437225 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.437240 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.437253 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.437265 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.437276 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.437287 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.437303 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.437318 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" 
volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.437331 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.437342 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.437353 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.437368 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.437380 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.437416 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" 
volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.437427 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.437439 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.437452 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.437465 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.437513 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.437525 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" 
seLinuxMountContext="" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.437539 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.437552 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.437565 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.437581 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.437602 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.437784 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Mar 10 
21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.437801 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.437814 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.437826 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.437839 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.437850 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.437863 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.437875 4919 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.437888 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.437900 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.437912 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.437924 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.437938 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.437950 4919 reconstruct.go:130] "Volume is marked as uncertain and added into 
the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.437963 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.437974 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.437987 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.437999 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.438011 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.438023 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.438036 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.438048 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.438061 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.438073 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.438084 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.438101 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" 
volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.438132 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.442068 4919 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.442938 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.443038 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.443090 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.443118 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" 
pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.443151 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.443177 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.443200 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.443230 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.443251 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.443279 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" 
volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext=""
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.443302 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext=""
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.443323 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext=""
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.443351 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext=""
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.443373 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext=""
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.443428 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext=""
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.446062 4919 manager.go:324] Recovery completed
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.443451 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext=""
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.448155 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext=""
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.448182 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext=""
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.448214 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext=""
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.448236 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext=""
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.448257 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext=""
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.448767 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext=""
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.448794 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext=""
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.448949 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext=""
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.448970 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext=""
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.448991 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext=""
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.449016 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext=""
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.449037 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext=""
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.449063 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext=""
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.449083 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext=""
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.449106 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext=""
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.449141 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext=""
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.449167 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext=""
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.449201 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext=""
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.449228 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext=""
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.449249 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext=""
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.449275 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext=""
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.449298 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext=""
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.449323 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext=""
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.449343 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext=""
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.449363 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext=""
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.449419 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext=""
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.449440 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext=""
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.449469 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext=""
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.449528 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext=""
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.449547 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext=""
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.449572 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext=""
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.449591 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext=""
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.449611 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext=""
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.449637 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext=""
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.449674 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext=""
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.449698 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext=""
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.449727 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext=""
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.449758 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext=""
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.449781 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext=""
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.449806 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext=""
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.449828 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext=""
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.449856 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext=""
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.449882 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext=""
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.449903 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext=""
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.449940 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext=""
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.449962 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext=""
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.449987 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext=""
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.450143 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext=""
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.450307 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext=""
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.450341 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext=""
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.450368 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext=""
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.450427 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext=""
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.450460 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext=""
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.450491 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext=""
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.450512 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext=""
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.450530 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext=""
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.450555 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext=""
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.450577 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext=""
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.450604 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext=""
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.450621 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext=""
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.450643 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext=""
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.450668 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext=""
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.450688 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext=""
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.450714 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext=""
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.450737 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext=""
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.450765 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext=""
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.450799 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext=""
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.450825 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext=""
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.450966 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext=""
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.451035 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext=""
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.451053 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext=""
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.451076 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext=""
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.451100 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext=""
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.451121 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext=""
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.451138 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext=""
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.451153 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext=""
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.451172 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext=""
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.451186 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext=""
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.451202 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext=""
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.451279 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext=""
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.451295 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext=""
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.451309 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext=""
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.451329 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext=""
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.451344 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext=""
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.451364 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext=""
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.451378 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext=""
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.451507 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext=""
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.451601 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext=""
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.451677 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext=""
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.451708 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext=""
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.451746 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext=""
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.451775 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext=""
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.451801 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext=""
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.451837 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext=""
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.451865 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext=""
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.451894 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext=""
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.451932 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext=""
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.451961 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext=""
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.452139 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext=""
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.452174 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext=""
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.452215 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext=""
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.452242 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext=""
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.452278 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext=""
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.452365 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext=""
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.452423 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext=""
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.452454 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext=""
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.452492 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext=""
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.452854 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext=""
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.452882 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext=""
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.452905 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext=""
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.452922 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext=""
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.452940 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext=""
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.452962 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext=""
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.452976 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext=""
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.453020 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext=""
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.453036 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext=""
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.453053 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext=""
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.453069 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext=""
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.453082 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext=""
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.453101 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext=""
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.453117 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext=""
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.453134 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext=""
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.453148 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls"
seLinuxMountContext="" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.453162 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.453179 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.453192 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.453212 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.453227 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.453250 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Mar 10 21:50:13 crc kubenswrapper[4919]: 
I0310 21:50:13.453267 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.453282 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.453299 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.453313 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.453325 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.453344 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.453360 4919 reconstruct.go:130] "Volume is marked as uncertain and 
added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.453379 4919 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.453408 4919 reconstruct.go:97] "Volume reconstruction finished" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.453418 4919 reconciler.go:26] "Reconciler: start to sync state" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.459990 4919 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.461158 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.461216 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.461236 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.461861 4919 cpu_manager.go:225] "Starting CPU manager" policy="none" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.461885 4919 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.461913 4919 state_mem.go:36] "Initialized new in-memory state store" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.476783 4919 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv4" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.478544 4919 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.478612 4919 status_manager.go:217] "Starting to sync pod status with apiserver" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.478710 4919 kubelet.go:2335] "Starting kubelet main sync loop" Mar 10 21:50:13 crc kubenswrapper[4919]: E0310 21:50:13.478765 4919 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.483060 4919 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.80:6443: connect: connection refused Mar 10 21:50:13 crc kubenswrapper[4919]: E0310 21:50:13.483219 4919 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.80:6443: connect: connection refused" logger="UnhandledError" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.488676 4919 policy_none.go:49] "None policy: Start" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.489843 4919 memory_manager.go:170] "Starting memorymanager" policy="None" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.489888 4919 state_mem.go:35] "Initializing new in-memory state store" Mar 10 21:50:13 crc kubenswrapper[4919]: E0310 21:50:13.506921 4919 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.548505 4919 manager.go:334] 
"Starting Device Plugin manager" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.548789 4919 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.548820 4919 server.go:79] "Starting device plugin registration server" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.549313 4919 eviction_manager.go:189] "Eviction manager: starting control loop" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.549344 4919 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.550051 4919 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.550253 4919 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.550277 4919 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 10 21:50:13 crc kubenswrapper[4919]: E0310 21:50:13.564833 4919 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.579054 4919 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc"] Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.579124 4919 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.580274 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" 
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.580324 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.580342 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.580544 4919 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.580783 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.580815 4919 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.581640 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.581676 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.581691 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.581799 4919 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.581923 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.581948 4919 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.582011 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.582062 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.582085 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.582973 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.582993 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.583013 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.583020 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.583031 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.583037 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.583275 4919 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 21:50:13 crc 
kubenswrapper[4919]: I0310 21:50:13.583511 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.583605 4919 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.585055 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.585101 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.585120 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.585307 4919 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.585582 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.585648 4919 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.586263 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.586298 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.586311 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.587084 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.587115 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.587083 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.587153 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.587166 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.587131 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.587363 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.587426 4919 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.588377 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.588414 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.588424 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:50:13 crc kubenswrapper[4919]: E0310 21:50:13.610188 4919 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.80:6443: connect: connection refused" interval="400ms" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.649597 4919 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.654162 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.654223 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.654240 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.654277 4919 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.654730 4919 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.654884 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.654999 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.655280 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.655431 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.655570 4919 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.655692 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.655792 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.655898 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.655989 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 10 21:50:13 crc kubenswrapper[4919]: E0310 21:50:13.655055 4919 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.80:6443: 
connect: connection refused" node="crc" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.656361 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.656496 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.656553 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.656593 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.656629 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " 
pod="openshift-etcd/etcd-crc"
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.758319 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.758421 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.758459 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.758490 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.758520 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.758548 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.758575 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.758602 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.758634 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.758667 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.758693 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.758723 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.758754 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.758782 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.758810 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.759302 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.759380 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.759413 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.759445 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.759481 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.759486 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.759513 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.759524 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.759582 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.759598 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.759622 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.759637 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.759655 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.759673 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.760368 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.857441 4919 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.859821 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.859871 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.859889 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.859925 4919 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 10 21:50:13 crc kubenswrapper[4919]: E0310 21:50:13.860838 4919 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.80:6443: connect: connection refused" node="crc"
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.911824 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.928559 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.950079 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc"
Mar 10 21:50:13 crc kubenswrapper[4919]: W0310 21:50:13.971705 4919 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-062caa15664dfd56bd4807463fff2ae53297c7c2a5810e58ab08ded0cfb64317 WatchSource:0}: Error finding container 062caa15664dfd56bd4807463fff2ae53297c7c2a5810e58ab08ded0cfb64317: Status 404 returned error can't find the container with id 062caa15664dfd56bd4807463fff2ae53297c7c2a5810e58ab08ded0cfb64317
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.977252 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 10 21:50:13 crc kubenswrapper[4919]: I0310 21:50:13.984905 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 10 21:50:14 crc kubenswrapper[4919]: W0310 21:50:14.000440 4919 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-f32f3f599ef0bffdfa785123e31ffd36cf44b4dc94972709691af991bbaaf0b5 WatchSource:0}: Error finding container f32f3f599ef0bffdfa785123e31ffd36cf44b4dc94972709691af991bbaaf0b5: Status 404 returned error can't find the container with id f32f3f599ef0bffdfa785123e31ffd36cf44b4dc94972709691af991bbaaf0b5
Mar 10 21:50:14 crc kubenswrapper[4919]: E0310 21:50:14.011188 4919 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.80:6443: connect: connection refused" interval="800ms"
Mar 10 21:50:14 crc kubenswrapper[4919]: W0310 21:50:14.014999 4919 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-b595387e30128a7be277d519952d3479d821541b5a92f8864820b6d0bd68a113 WatchSource:0}: Error finding container b595387e30128a7be277d519952d3479d821541b5a92f8864820b6d0bd68a113: Status 404 returned error can't find the container with id b595387e30128a7be277d519952d3479d821541b5a92f8864820b6d0bd68a113
Mar 10 21:50:14 crc kubenswrapper[4919]: I0310 21:50:14.261569 4919 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 10 21:50:14 crc kubenswrapper[4919]: I0310 21:50:14.263481 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 21:50:14 crc kubenswrapper[4919]: I0310 21:50:14.263513 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 21:50:14 crc kubenswrapper[4919]: I0310 21:50:14.263524 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 21:50:14 crc kubenswrapper[4919]: I0310 21:50:14.263571 4919 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 10 21:50:14 crc kubenswrapper[4919]: E0310 21:50:14.263977 4919 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.80:6443: connect: connection refused" node="crc"
Mar 10 21:50:14 crc kubenswrapper[4919]: I0310 21:50:14.403525 4919 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.80:6443: connect: connection refused
Mar 10 21:50:14 crc kubenswrapper[4919]: W0310 21:50:14.437659 4919 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.80:6443: connect: connection refused
Mar 10 21:50:14 crc kubenswrapper[4919]: E0310 21:50:14.437739 4919 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.80:6443: connect: connection refused" logger="UnhandledError"
Mar 10 21:50:14 crc kubenswrapper[4919]: I0310 21:50:14.483362 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"b595387e30128a7be277d519952d3479d821541b5a92f8864820b6d0bd68a113"}
Mar 10 21:50:14 crc kubenswrapper[4919]: I0310 21:50:14.484787 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"f32f3f599ef0bffdfa785123e31ffd36cf44b4dc94972709691af991bbaaf0b5"}
Mar 10 21:50:14 crc kubenswrapper[4919]: I0310 21:50:14.486607 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"a5e2dd9c762ea428b6047916b832ae73d64ececa0cd9b143db03ac5424180171"}
Mar 10 21:50:14 crc kubenswrapper[4919]: I0310 21:50:14.487645 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"fd16c0ce7eedc74ed3a4f0446496660cd1eb8a2f149d7d0298c95ad3a516be4e"}
Mar 10 21:50:14 crc kubenswrapper[4919]: I0310 21:50:14.488710 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"062caa15664dfd56bd4807463fff2ae53297c7c2a5810e58ab08ded0cfb64317"}
Mar 10 21:50:14 crc kubenswrapper[4919]: W0310 21:50:14.697906 4919 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.80:6443: connect: connection refused
Mar 10 21:50:14 crc kubenswrapper[4919]: E0310 21:50:14.698004 4919 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.80:6443: connect: connection refused" logger="UnhandledError"
Mar 10 21:50:14 crc kubenswrapper[4919]: W0310 21:50:14.731034 4919 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.80:6443: connect: connection refused
Mar 10 21:50:14 crc kubenswrapper[4919]: E0310 21:50:14.731118 4919 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.80:6443: connect: connection refused" logger="UnhandledError"
Mar 10 21:50:14 crc kubenswrapper[4919]: E0310 21:50:14.812918 4919 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.80:6443: connect: connection refused" interval="1.6s"
Mar 10 21:50:14 crc kubenswrapper[4919]: W0310 21:50:14.814725 4919 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.80:6443: connect: connection refused
Mar 10 21:50:14 crc kubenswrapper[4919]: E0310 21:50:14.814831 4919 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.80:6443: connect: connection refused" logger="UnhandledError"
Mar 10 21:50:15 crc kubenswrapper[4919]: I0310 21:50:15.064596 4919 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 10 21:50:15 crc kubenswrapper[4919]: I0310 21:50:15.067724 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 21:50:15 crc kubenswrapper[4919]: I0310 21:50:15.067785 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 21:50:15 crc kubenswrapper[4919]: I0310 21:50:15.067803 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 21:50:15 crc kubenswrapper[4919]: I0310 21:50:15.067842 4919 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 10 21:50:15 crc kubenswrapper[4919]: E0310 21:50:15.068502 4919 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.80:6443: connect: connection refused" node="crc"
Mar 10 21:50:15 crc kubenswrapper[4919]: I0310 21:50:15.403560 4919 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.80:6443: connect: connection refused
Mar 10 21:50:15 crc kubenswrapper[4919]: I0310 21:50:15.406535 4919 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Mar 10 21:50:15 crc kubenswrapper[4919]: E0310 21:50:15.407572 4919 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.80:6443: connect: connection refused" logger="UnhandledError"
Mar 10 21:50:15 crc kubenswrapper[4919]: I0310 21:50:15.494528 4919 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="a5205e89649c7c1e920c967873bb1a404548b539605a180b02a1c249ff499e32" exitCode=0
Mar 10 21:50:15 crc kubenswrapper[4919]: I0310 21:50:15.494608 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"a5205e89649c7c1e920c967873bb1a404548b539605a180b02a1c249ff499e32"}
Mar 10 21:50:15 crc kubenswrapper[4919]: I0310 21:50:15.494665 4919 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 10 21:50:15 crc kubenswrapper[4919]: I0310 21:50:15.496199 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 21:50:15 crc kubenswrapper[4919]: I0310 21:50:15.496228 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 21:50:15 crc kubenswrapper[4919]: I0310 21:50:15.496237 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 21:50:15 crc kubenswrapper[4919]: I0310 21:50:15.497281 4919 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="29a0037afef2a95ea444616a0317a23a27f5e093c2083b8e13c6ebcec7cb26f1" exitCode=0
Mar 10 21:50:15 crc kubenswrapper[4919]: I0310 21:50:15.497422 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"29a0037afef2a95ea444616a0317a23a27f5e093c2083b8e13c6ebcec7cb26f1"}
Mar 10 21:50:15 crc kubenswrapper[4919]: I0310 21:50:15.497473 4919 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 10 21:50:15 crc kubenswrapper[4919]: I0310 21:50:15.498719 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 21:50:15 crc kubenswrapper[4919]: I0310 21:50:15.498739 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 21:50:15 crc kubenswrapper[4919]: I0310 21:50:15.498747 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 21:50:15 crc kubenswrapper[4919]: I0310 21:50:15.504040 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"22b6b89388d9f9288049474f1f88faad36bcbc05564e7769c9fca8c220847efd"}
Mar 10 21:50:15 crc kubenswrapper[4919]: I0310 21:50:15.504070 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"038d23b7c75ae61b55b3b70b5b70de0ca4f3243d0b0a68f8bd221aff91c2c032"}
Mar 10 21:50:15 crc kubenswrapper[4919]: I0310 21:50:15.504081 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"aa0e153307a5d1fb56a85cc525ad6ffe2d83bf4e5799981cdeae69f97cfd741e"}
Mar 10 21:50:15 crc kubenswrapper[4919]: I0310 21:50:15.504091 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"e3630a7f175a3275eff39088c20eafd059b205f0ccb36cbba2f09b77468963cd"}
Mar 10 21:50:15 crc kubenswrapper[4919]: I0310 21:50:15.504151 4919 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 10 21:50:15 crc kubenswrapper[4919]: I0310 21:50:15.505130 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 21:50:15 crc kubenswrapper[4919]: I0310 21:50:15.505225 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 21:50:15 crc kubenswrapper[4919]: I0310 21:50:15.505260 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 21:50:15 crc kubenswrapper[4919]: I0310 21:50:15.506695 4919 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="15b81f8be7635a10cf516cd17c9ab3d48b74d3421098124fcc47dbab6691e3c0" exitCode=0
Mar 10 21:50:15 crc kubenswrapper[4919]: I0310 21:50:15.506798 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"15b81f8be7635a10cf516cd17c9ab3d48b74d3421098124fcc47dbab6691e3c0"}
Mar 10 21:50:15 crc kubenswrapper[4919]: I0310 21:50:15.506854 4919 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 10 21:50:15 crc kubenswrapper[4919]: I0310 21:50:15.508476 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 21:50:15 crc kubenswrapper[4919]: I0310 21:50:15.508511 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 21:50:15 crc kubenswrapper[4919]: I0310 21:50:15.508527 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 21:50:15 crc kubenswrapper[4919]: I0310 21:50:15.512613 4919 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="cc6d32c621a9826f8539d4c1806a83f694eaa8874384bd615c2fa1cee02d2e98" exitCode=0
Mar 10 21:50:15 crc kubenswrapper[4919]: I0310 21:50:15.512666 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"cc6d32c621a9826f8539d4c1806a83f694eaa8874384bd615c2fa1cee02d2e98"}
Mar 10 21:50:15 crc kubenswrapper[4919]: I0310 21:50:15.512781 4919 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 10 21:50:15 crc kubenswrapper[4919]: I0310 21:50:15.513891 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 21:50:15 crc kubenswrapper[4919]: I0310 21:50:15.513936 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 21:50:15 crc kubenswrapper[4919]: I0310 21:50:15.513949 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 21:50:15 crc kubenswrapper[4919]: I0310 21:50:15.519351 4919 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 10 21:50:15 crc kubenswrapper[4919]: I0310 21:50:15.533655 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 21:50:15 crc kubenswrapper[4919]: I0310 21:50:15.533713 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 21:50:15 crc kubenswrapper[4919]: I0310 21:50:15.533736 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 21:50:16 crc kubenswrapper[4919]: I0310 21:50:16.403592 4919 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.80:6443: connect: connection refused
Mar 10 21:50:16 crc kubenswrapper[4919]: E0310 21:50:16.413939 4919 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.80:6443: connect: connection refused" interval="3.2s"
Mar 10 21:50:16 crc kubenswrapper[4919]: W0310 21:50:16.441547 4919 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.80:6443: connect: connection refused
Mar 10 21:50:16 crc kubenswrapper[4919]: E0310 21:50:16.441628 4919 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.80:6443: connect: connection refused" logger="UnhandledError"
Mar 10 21:50:16 crc kubenswrapper[4919]: W0310 21:50:16.474637 4919 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.80:6443: connect: connection refused
Mar 10 21:50:16 crc kubenswrapper[4919]: E0310 21:50:16.474725 4919 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.80:6443: connect: connection refused" logger="UnhandledError"
Mar 10 21:50:16 crc kubenswrapper[4919]: I0310 21:50:16.517517 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"47a772db349df6c0c6fe27be93d19e02d66cfaf9739ee12e89730ece1da11473"}
Mar 10 21:50:16 crc kubenswrapper[4919]: I0310 21:50:16.517556 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"37d8507fd02b92972ed41aa2c4d53fceb1c9d58864e46ddc7991f94fb4d9b3e0"}
Mar 10 21:50:16 crc kubenswrapper[4919]: I0310 21:50:16.517566 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"dfb03c5f450790952fc7173bc2a6d723c777921f5f74963bfdbc3573ec1d21cd"}
Mar 10 21:50:16 crc kubenswrapper[4919]: I0310 21:50:16.517575 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"d9e6a8efa1e2d16b45fe6362b326e3f89333864dc74f3b298d2e500a90d303b3"}
Mar 10 21:50:16 crc kubenswrapper[4919]: I0310 21:50:16.518876 4919 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="8aa3a4aef61b7421411f28beecceab476e43c52eb51ac2a257399b01e4c2bcb0" exitCode=0
Mar 10 21:50:16 crc kubenswrapper[4919]: I0310 21:50:16.518934 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"8aa3a4aef61b7421411f28beecceab476e43c52eb51ac2a257399b01e4c2bcb0"}
Mar 10 21:50:16 crc kubenswrapper[4919]: I0310 21:50:16.519039 4919 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 10 21:50:16 crc kubenswrapper[4919]: I0310 21:50:16.520306 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 21:50:16 crc kubenswrapper[4919]: I0310 21:50:16.520328 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 21:50:16 crc kubenswrapper[4919]: I0310 21:50:16.520351 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 21:50:16 crc kubenswrapper[4919]: I0310 21:50:16.522699 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"8f338ebc5fc228d07415015c51f7ed4fcc24d5bf76a644e491b5c4b9dc51b71f"}
Mar 10 21:50:16 crc kubenswrapper[4919]: I0310 21:50:16.522726 4919 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 10 21:50:16 crc kubenswrapper[4919]: I0310 21:50:16.523504 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 21:50:16 crc kubenswrapper[4919]: I0310 21:50:16.523536 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 21:50:16 crc kubenswrapper[4919]: I0310 21:50:16.523559 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 21:50:16 crc kubenswrapper[4919]: I0310 21:50:16.524772 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"eed85c384d715caea3fe992e40d62b467c4893c865d792f798254701b15735fa"}
Mar 10 21:50:16 crc kubenswrapper[4919]: I0310 21:50:16.524807 4919 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 10 21:50:16 crc kubenswrapper[4919]: I0310 21:50:16.524858 4919 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 10 21:50:16 crc kubenswrapper[4919]: I0310 21:50:16.524811 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"41e31e0bed022ff2ce3f4869359dfe2d6a25c0039704f26ef5c9b4be0da5b9fc"}
Mar 10 21:50:16 crc kubenswrapper[4919]: I0310 21:50:16.524973 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"ff32e19d9357f72af1234677cdf04c43d15fcbb5af4faeae6db0aa9fca7e8ef5"}
Mar 10 21:50:16 crc kubenswrapper[4919]: I0310 21:50:16.525531 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 21:50:16 crc kubenswrapper[4919]: I0310 21:50:16.525564 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 21:50:16 crc kubenswrapper[4919]: I0310 21:50:16.525573 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 21:50:16 crc kubenswrapper[4919]: I0310 21:50:16.525968 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 21:50:16 crc kubenswrapper[4919]: I0310 21:50:16.525993 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 21:50:16 crc kubenswrapper[4919]: I0310 21:50:16.526005 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 21:50:16 crc kubenswrapper[4919]: I0310 21:50:16.670087 4919 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 10 21:50:16 crc kubenswrapper[4919]: I0310 21:50:16.672200 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 21:50:16 crc kubenswrapper[4919]: I0310 21:50:16.672279 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 21:50:16 crc kubenswrapper[4919]: I0310 21:50:16.672344 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 21:50:16 crc kubenswrapper[4919]: I0310 21:50:16.672440 4919 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 10 21:50:16 crc kubenswrapper[4919]: E0310 21:50:16.673293 4919 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.80:6443: connect: connection refused" node="crc"
Mar 10 21:50:16 crc kubenswrapper[4919]: W0310 21:50:16.778901 4919 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.80:6443: connect: connection refused
Mar 10 21:50:16 crc kubenswrapper[4919]: E0310 21:50:16.779014 4919 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.80:6443: connect: connection refused" logger="UnhandledError"
Mar 10 21:50:17 crc kubenswrapper[4919]: I0310 21:50:17.123676 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 10 21:50:17 crc kubenswrapper[4919]: I0310 21:50:17.531565 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"85649c065d26b91ffab0783c520e6cd3e5ac4b8af013b736f7e6cfb7a7c4079c"}
Mar 10 21:50:17 crc kubenswrapper[4919]: I0310 21:50:17.531697 4919 kubelet_node_status.go:401] "Setting node annotation
to enable volume controller attach/detach" Mar 10 21:50:17 crc kubenswrapper[4919]: I0310 21:50:17.533018 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:50:17 crc kubenswrapper[4919]: I0310 21:50:17.533075 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:50:17 crc kubenswrapper[4919]: I0310 21:50:17.533095 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:50:17 crc kubenswrapper[4919]: I0310 21:50:17.535051 4919 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="2126875e4bacb36f72de8a3a8bd7c799b89e7691a83fd39d7645a97e95db91b6" exitCode=0 Mar 10 21:50:17 crc kubenswrapper[4919]: I0310 21:50:17.535143 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"2126875e4bacb36f72de8a3a8bd7c799b89e7691a83fd39d7645a97e95db91b6"} Mar 10 21:50:17 crc kubenswrapper[4919]: I0310 21:50:17.535207 4919 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 21:50:17 crc kubenswrapper[4919]: I0310 21:50:17.535292 4919 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 10 21:50:17 crc kubenswrapper[4919]: I0310 21:50:17.535293 4919 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 21:50:17 crc kubenswrapper[4919]: I0310 21:50:17.535303 4919 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 21:50:17 crc kubenswrapper[4919]: I0310 21:50:17.535429 4919 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 21:50:17 crc kubenswrapper[4919]: I0310 21:50:17.536445 4919 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:50:17 crc kubenswrapper[4919]: I0310 21:50:17.536486 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:50:17 crc kubenswrapper[4919]: I0310 21:50:17.536503 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:50:17 crc kubenswrapper[4919]: I0310 21:50:17.537355 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:50:17 crc kubenswrapper[4919]: I0310 21:50:17.537374 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:50:17 crc kubenswrapper[4919]: I0310 21:50:17.537426 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:50:17 crc kubenswrapper[4919]: I0310 21:50:17.537440 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:50:17 crc kubenswrapper[4919]: I0310 21:50:17.537370 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:50:17 crc kubenswrapper[4919]: I0310 21:50:17.537458 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:50:17 crc kubenswrapper[4919]: I0310 21:50:17.537465 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:50:17 crc kubenswrapper[4919]: I0310 21:50:17.537444 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:50:17 crc kubenswrapper[4919]: I0310 21:50:17.537481 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:50:18 crc 
kubenswrapper[4919]: I0310 21:50:18.301455 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 21:50:18 crc kubenswrapper[4919]: I0310 21:50:18.422365 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 21:50:18 crc kubenswrapper[4919]: I0310 21:50:18.542888 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"8bfa2696bd9b6e5d247686e5297b6ae2f49e5b216174391f211cb2a3a4966135"} Mar 10 21:50:18 crc kubenswrapper[4919]: I0310 21:50:18.542933 4919 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 10 21:50:18 crc kubenswrapper[4919]: I0310 21:50:18.542944 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"322af039c736bcf0b853ee5527ebb1b1750484dfab074745abcd75c24fdcccbf"} Mar 10 21:50:18 crc kubenswrapper[4919]: I0310 21:50:18.542970 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"67063544d268ca488af7ae401113e6f35bb48688e50f944cfa03360de376611a"} Mar 10 21:50:18 crc kubenswrapper[4919]: I0310 21:50:18.542978 4919 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 21:50:18 crc kubenswrapper[4919]: I0310 21:50:18.543898 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:50:18 crc kubenswrapper[4919]: I0310 21:50:18.543931 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:50:18 crc kubenswrapper[4919]: I0310 21:50:18.543943 4919 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:50:19 crc kubenswrapper[4919]: I0310 21:50:19.078604 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 21:50:19 crc kubenswrapper[4919]: I0310 21:50:19.404044 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 10 21:50:19 crc kubenswrapper[4919]: I0310 21:50:19.404279 4919 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 21:50:19 crc kubenswrapper[4919]: I0310 21:50:19.410271 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:50:19 crc kubenswrapper[4919]: I0310 21:50:19.410326 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:50:19 crc kubenswrapper[4919]: I0310 21:50:19.410345 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:50:19 crc kubenswrapper[4919]: I0310 21:50:19.551657 4919 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 21:50:19 crc kubenswrapper[4919]: I0310 21:50:19.552453 4919 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 21:50:19 crc kubenswrapper[4919]: I0310 21:50:19.552969 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"7c23aaef6f076ab2a428323d38fac48e0c55ad52c55a46c942bccad06474fd31"} Mar 10 21:50:19 crc kubenswrapper[4919]: I0310 21:50:19.553020 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"2a88714c27822fd18dff500c973b9d548414d59c7666de938e3cb0c6b18e277e"} Mar 10 21:50:19 crc kubenswrapper[4919]: I0310 21:50:19.553703 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:50:19 crc kubenswrapper[4919]: I0310 21:50:19.553751 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:50:19 crc kubenswrapper[4919]: I0310 21:50:19.553768 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:50:19 crc kubenswrapper[4919]: I0310 21:50:19.554708 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:50:19 crc kubenswrapper[4919]: I0310 21:50:19.554791 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:50:19 crc kubenswrapper[4919]: I0310 21:50:19.554810 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:50:19 crc kubenswrapper[4919]: I0310 21:50:19.762857 4919 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 10 21:50:19 crc kubenswrapper[4919]: I0310 21:50:19.874042 4919 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 21:50:19 crc kubenswrapper[4919]: I0310 21:50:19.875914 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:50:19 crc kubenswrapper[4919]: I0310 21:50:19.875980 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:50:19 crc kubenswrapper[4919]: I0310 21:50:19.876005 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Mar 10 21:50:19 crc kubenswrapper[4919]: I0310 21:50:19.876050 4919 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 10 21:50:20 crc kubenswrapper[4919]: I0310 21:50:20.124719 4919 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 10 21:50:20 crc kubenswrapper[4919]: I0310 21:50:20.124862 4919 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 10 21:50:20 crc kubenswrapper[4919]: I0310 21:50:20.554667 4919 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 21:50:20 crc kubenswrapper[4919]: I0310 21:50:20.554695 4919 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 21:50:20 crc kubenswrapper[4919]: I0310 21:50:20.556302 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:50:20 crc kubenswrapper[4919]: I0310 21:50:20.556357 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:50:20 crc kubenswrapper[4919]: I0310 21:50:20.556376 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:50:20 crc kubenswrapper[4919]: I0310 21:50:20.556737 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 
21:50:20 crc kubenswrapper[4919]: I0310 21:50:20.556790 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:50:20 crc kubenswrapper[4919]: I0310 21:50:20.556812 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:50:20 crc kubenswrapper[4919]: I0310 21:50:20.814496 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 21:50:20 crc kubenswrapper[4919]: I0310 21:50:20.814779 4919 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 21:50:20 crc kubenswrapper[4919]: I0310 21:50:20.816307 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:50:20 crc kubenswrapper[4919]: I0310 21:50:20.816366 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:50:20 crc kubenswrapper[4919]: I0310 21:50:20.816384 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:50:21 crc kubenswrapper[4919]: I0310 21:50:21.963337 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Mar 10 21:50:21 crc kubenswrapper[4919]: I0310 21:50:21.963675 4919 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 21:50:21 crc kubenswrapper[4919]: I0310 21:50:21.965429 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:50:21 crc kubenswrapper[4919]: I0310 21:50:21.965496 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:50:21 crc kubenswrapper[4919]: I0310 21:50:21.965517 4919 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:50:23 crc kubenswrapper[4919]: E0310 21:50:23.565866 4919 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 10 21:50:23 crc kubenswrapper[4919]: I0310 21:50:23.677657 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 21:50:23 crc kubenswrapper[4919]: I0310 21:50:23.677881 4919 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 21:50:23 crc kubenswrapper[4919]: I0310 21:50:23.679584 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:50:23 crc kubenswrapper[4919]: I0310 21:50:23.679642 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:50:23 crc kubenswrapper[4919]: I0310 21:50:23.679660 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:50:23 crc kubenswrapper[4919]: I0310 21:50:23.688267 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 21:50:24 crc kubenswrapper[4919]: I0310 21:50:24.289242 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Mar 10 21:50:24 crc kubenswrapper[4919]: I0310 21:50:24.289509 4919 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 21:50:24 crc kubenswrapper[4919]: I0310 21:50:24.291299 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:50:24 crc kubenswrapper[4919]: I0310 21:50:24.291354 4919 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasNoDiskPressure" Mar 10 21:50:24 crc kubenswrapper[4919]: I0310 21:50:24.291372 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:50:24 crc kubenswrapper[4919]: I0310 21:50:24.566795 4919 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 21:50:24 crc kubenswrapper[4919]: I0310 21:50:24.567089 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 21:50:24 crc kubenswrapper[4919]: I0310 21:50:24.568534 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:50:24 crc kubenswrapper[4919]: I0310 21:50:24.568613 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:50:24 crc kubenswrapper[4919]: I0310 21:50:24.568633 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:50:24 crc kubenswrapper[4919]: I0310 21:50:24.575559 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 21:50:25 crc kubenswrapper[4919]: I0310 21:50:25.570223 4919 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 21:50:25 crc kubenswrapper[4919]: I0310 21:50:25.571664 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:50:25 crc kubenswrapper[4919]: I0310 21:50:25.571750 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:50:25 crc kubenswrapper[4919]: I0310 21:50:25.571769 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:50:26 crc 
kubenswrapper[4919]: I0310 21:50:26.572436 4919 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 21:50:26 crc kubenswrapper[4919]: I0310 21:50:26.573814 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:50:26 crc kubenswrapper[4919]: I0310 21:50:26.573868 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:50:26 crc kubenswrapper[4919]: I0310 21:50:26.573887 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:50:27 crc kubenswrapper[4919]: W0310 21:50:27.140422 4919 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout Mar 10 21:50:27 crc kubenswrapper[4919]: I0310 21:50:27.140540 4919 trace.go:236] Trace[620556153]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (10-Mar-2026 21:50:17.138) (total time: 10001ms): Mar 10 21:50:27 crc kubenswrapper[4919]: Trace[620556153]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (21:50:27.140) Mar 10 21:50:27 crc kubenswrapper[4919]: Trace[620556153]: [10.001822185s] [10.001822185s] END Mar 10 21:50:27 crc kubenswrapper[4919]: E0310 21:50:27.140570 4919 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Mar 10 21:50:27 crc kubenswrapper[4919]: I0310 21:50:27.404853 4919 csi_plugin.go:884] Failed to contact API 
server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Mar 10 21:50:28 crc kubenswrapper[4919]: E0310 21:50:28.018275 4919 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:50:28Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189b9948228b0d69 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 21:50:13.396655465 +0000 UTC m=+0.638536113,LastTimestamp:2026-03-10 21:50:13.396655465 +0000 UTC m=+0.638536113,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 21:50:28 crc kubenswrapper[4919]: W0310 21:50:28.018886 4919 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:50:28Z is after 2026-02-23T05:33:13Z Mar 10 21:50:28 crc kubenswrapper[4919]: E0310 21:50:28.018990 4919 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:50:28Z is after 
2026-02-23T05:33:13Z" logger="UnhandledError" Mar 10 21:50:28 crc kubenswrapper[4919]: E0310 21:50:28.020355 4919 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:50:28Z is after 2026-02-23T05:33:13Z" interval="6.4s" Mar 10 21:50:28 crc kubenswrapper[4919]: W0310 21:50:28.021535 4919 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:50:28Z is after 2026-02-23T05:33:13Z Mar 10 21:50:28 crc kubenswrapper[4919]: E0310 21:50:28.021609 4919 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:50:28Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 10 21:50:28 crc kubenswrapper[4919]: E0310 21:50:28.023177 4919 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:50:28Z is after 2026-02-23T05:33:13Z" node="crc" Mar 10 21:50:28 crc kubenswrapper[4919]: W0310 21:50:28.026453 4919 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get 
"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:50:28Z is after 2026-02-23T05:33:13Z Mar 10 21:50:28 crc kubenswrapper[4919]: E0310 21:50:28.026544 4919 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:50:28Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 10 21:50:28 crc kubenswrapper[4919]: I0310 21:50:28.027773 4919 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 10 21:50:28 crc kubenswrapper[4919]: I0310 21:50:28.027850 4919 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Mar 10 21:50:28 crc kubenswrapper[4919]: E0310 21:50:28.032121 4919 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is 
not yet valid: current time 2026-03-10T21:50:28Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 10 21:50:28 crc kubenswrapper[4919]: I0310 21:50:28.039067 4919 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 10 21:50:28 crc kubenswrapper[4919]: I0310 21:50:28.039133 4919 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Mar 10 21:50:28 crc kubenswrapper[4919]: I0310 21:50:28.407452 4919 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:50:28Z is after 2026-02-23T05:33:13Z Mar 10 21:50:28 crc kubenswrapper[4919]: I0310 21:50:28.432440 4919 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Mar 10 21:50:28 crc kubenswrapper[4919]: [+]log ok Mar 10 21:50:28 crc kubenswrapper[4919]: [+]etcd ok Mar 10 21:50:28 crc kubenswrapper[4919]: [+]poststarthook/openshift.io-startkubeinformers ok Mar 10 21:50:28 crc kubenswrapper[4919]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok Mar 10 21:50:28 crc kubenswrapper[4919]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok Mar 10 21:50:28 crc 
kubenswrapper[4919]: [+]poststarthook/start-apiserver-admission-initializer ok Mar 10 21:50:28 crc kubenswrapper[4919]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Mar 10 21:50:28 crc kubenswrapper[4919]: [+]poststarthook/openshift.io-api-request-count-filter ok Mar 10 21:50:28 crc kubenswrapper[4919]: [+]poststarthook/generic-apiserver-start-informers ok Mar 10 21:50:28 crc kubenswrapper[4919]: [+]poststarthook/priority-and-fairness-config-consumer ok Mar 10 21:50:28 crc kubenswrapper[4919]: [+]poststarthook/priority-and-fairness-filter ok Mar 10 21:50:28 crc kubenswrapper[4919]: [+]poststarthook/storage-object-count-tracker-hook ok Mar 10 21:50:28 crc kubenswrapper[4919]: [+]poststarthook/start-apiextensions-informers ok Mar 10 21:50:28 crc kubenswrapper[4919]: [+]poststarthook/start-apiextensions-controllers ok Mar 10 21:50:28 crc kubenswrapper[4919]: [+]poststarthook/crd-informer-synced ok Mar 10 21:50:28 crc kubenswrapper[4919]: [+]poststarthook/start-system-namespaces-controller ok Mar 10 21:50:28 crc kubenswrapper[4919]: [+]poststarthook/start-cluster-authentication-info-controller ok Mar 10 21:50:28 crc kubenswrapper[4919]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok Mar 10 21:50:28 crc kubenswrapper[4919]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok Mar 10 21:50:28 crc kubenswrapper[4919]: [+]poststarthook/start-legacy-token-tracking-controller ok Mar 10 21:50:28 crc kubenswrapper[4919]: [+]poststarthook/start-service-ip-repair-controllers ok Mar 10 21:50:28 crc kubenswrapper[4919]: [-]poststarthook/rbac/bootstrap-roles failed: reason withheld Mar 10 21:50:28 crc kubenswrapper[4919]: [-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld Mar 10 21:50:28 crc kubenswrapper[4919]: [+]poststarthook/priority-and-fairness-config-producer ok Mar 10 21:50:28 crc kubenswrapper[4919]: [+]poststarthook/bootstrap-controller ok Mar 10 21:50:28 crc kubenswrapper[4919]: 
[+]poststarthook/aggregator-reload-proxy-client-cert ok Mar 10 21:50:28 crc kubenswrapper[4919]: [+]poststarthook/start-kube-aggregator-informers ok Mar 10 21:50:28 crc kubenswrapper[4919]: [+]poststarthook/apiservice-status-local-available-controller ok Mar 10 21:50:28 crc kubenswrapper[4919]: [+]poststarthook/apiservice-status-remote-available-controller ok Mar 10 21:50:28 crc kubenswrapper[4919]: [+]poststarthook/apiservice-registration-controller ok Mar 10 21:50:28 crc kubenswrapper[4919]: [+]poststarthook/apiservice-wait-for-first-sync ok Mar 10 21:50:28 crc kubenswrapper[4919]: [+]poststarthook/apiservice-discovery-controller ok Mar 10 21:50:28 crc kubenswrapper[4919]: [+]poststarthook/kube-apiserver-autoregistration ok Mar 10 21:50:28 crc kubenswrapper[4919]: [+]autoregister-completion ok Mar 10 21:50:28 crc kubenswrapper[4919]: [+]poststarthook/apiservice-openapi-controller ok Mar 10 21:50:28 crc kubenswrapper[4919]: [+]poststarthook/apiservice-openapiv3-controller ok Mar 10 21:50:28 crc kubenswrapper[4919]: livez check failed Mar 10 21:50:28 crc kubenswrapper[4919]: I0310 21:50:28.432573 4919 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 10 21:50:28 crc kubenswrapper[4919]: I0310 21:50:28.579953 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 10 21:50:28 crc kubenswrapper[4919]: I0310 21:50:28.582440 4919 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="85649c065d26b91ffab0783c520e6cd3e5ac4b8af013b736f7e6cfb7a7c4079c" exitCode=255 Mar 10 21:50:28 crc kubenswrapper[4919]: I0310 21:50:28.582535 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"85649c065d26b91ffab0783c520e6cd3e5ac4b8af013b736f7e6cfb7a7c4079c"} Mar 10 21:50:28 crc kubenswrapper[4919]: I0310 21:50:28.582808 4919 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 21:50:28 crc kubenswrapper[4919]: I0310 21:50:28.584025 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:50:28 crc kubenswrapper[4919]: I0310 21:50:28.584108 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:50:28 crc kubenswrapper[4919]: I0310 21:50:28.584133 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:50:28 crc kubenswrapper[4919]: I0310 21:50:28.585298 4919 scope.go:117] "RemoveContainer" containerID="85649c065d26b91ffab0783c520e6cd3e5ac4b8af013b736f7e6cfb7a7c4079c" Mar 10 21:50:29 crc kubenswrapper[4919]: I0310 21:50:29.410149 4919 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:50:29Z is after 2026-02-23T05:33:13Z Mar 10 21:50:29 crc kubenswrapper[4919]: I0310 21:50:29.587855 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 10 21:50:29 crc kubenswrapper[4919]: I0310 21:50:29.590173 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"79a0a667a93c835771975a1105ed442dff0a1d3b4a8c9e0558435f164fb57bf9"} Mar 10 21:50:29 crc kubenswrapper[4919]: I0310 21:50:29.590523 4919 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 21:50:29 crc kubenswrapper[4919]: I0310 21:50:29.591376 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:50:29 crc kubenswrapper[4919]: I0310 21:50:29.591460 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:50:29 crc kubenswrapper[4919]: I0310 21:50:29.591479 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:50:30 crc kubenswrapper[4919]: I0310 21:50:30.124288 4919 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 10 21:50:30 crc kubenswrapper[4919]: I0310 21:50:30.124469 4919 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 10 21:50:30 crc kubenswrapper[4919]: I0310 21:50:30.408069 4919 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate 
has expired or is not yet valid: current time 2026-03-10T21:50:30Z is after 2026-02-23T05:33:13Z Mar 10 21:50:30 crc kubenswrapper[4919]: I0310 21:50:30.595340 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 10 21:50:30 crc kubenswrapper[4919]: I0310 21:50:30.596157 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 10 21:50:30 crc kubenswrapper[4919]: I0310 21:50:30.598335 4919 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="79a0a667a93c835771975a1105ed442dff0a1d3b4a8c9e0558435f164fb57bf9" exitCode=255 Mar 10 21:50:30 crc kubenswrapper[4919]: I0310 21:50:30.598419 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"79a0a667a93c835771975a1105ed442dff0a1d3b4a8c9e0558435f164fb57bf9"} Mar 10 21:50:30 crc kubenswrapper[4919]: I0310 21:50:30.598485 4919 scope.go:117] "RemoveContainer" containerID="85649c065d26b91ffab0783c520e6cd3e5ac4b8af013b736f7e6cfb7a7c4079c" Mar 10 21:50:30 crc kubenswrapper[4919]: I0310 21:50:30.598676 4919 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 21:50:30 crc kubenswrapper[4919]: I0310 21:50:30.599795 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:50:30 crc kubenswrapper[4919]: I0310 21:50:30.599843 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:50:30 crc kubenswrapper[4919]: I0310 21:50:30.599861 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Mar 10 21:50:30 crc kubenswrapper[4919]: I0310 21:50:30.600685 4919 scope.go:117] "RemoveContainer" containerID="79a0a667a93c835771975a1105ed442dff0a1d3b4a8c9e0558435f164fb57bf9" Mar 10 21:50:30 crc kubenswrapper[4919]: E0310 21:50:30.600976 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 10 21:50:31 crc kubenswrapper[4919]: I0310 21:50:31.407948 4919 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:50:31Z is after 2026-02-23T05:33:13Z Mar 10 21:50:31 crc kubenswrapper[4919]: I0310 21:50:31.603001 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 10 21:50:32 crc kubenswrapper[4919]: I0310 21:50:32.405889 4919 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:50:32Z is after 2026-02-23T05:33:13Z Mar 10 21:50:32 crc kubenswrapper[4919]: W0310 21:50:32.524657 4919 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:50:32Z is after 2026-02-23T05:33:13Z Mar 10 21:50:32 crc kubenswrapper[4919]: E0310 21:50:32.524763 4919 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:50:32Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 10 21:50:32 crc kubenswrapper[4919]: I0310 21:50:32.849982 4919 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 21:50:32 crc kubenswrapper[4919]: I0310 21:50:32.850191 4919 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 21:50:32 crc kubenswrapper[4919]: I0310 21:50:32.851794 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:50:32 crc kubenswrapper[4919]: I0310 21:50:32.851825 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:50:32 crc kubenswrapper[4919]: I0310 21:50:32.851834 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:50:32 crc kubenswrapper[4919]: I0310 21:50:32.852281 4919 scope.go:117] "RemoveContainer" containerID="79a0a667a93c835771975a1105ed442dff0a1d3b4a8c9e0558435f164fb57bf9" Mar 10 21:50:32 crc kubenswrapper[4919]: E0310 21:50:32.852439 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 10 21:50:33 crc kubenswrapper[4919]: I0310 21:50:33.409538 4919 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:50:33Z is after 2026-02-23T05:33:13Z Mar 10 21:50:33 crc kubenswrapper[4919]: I0310 21:50:33.428058 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 21:50:33 crc kubenswrapper[4919]: E0310 21:50:33.566226 4919 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 10 21:50:33 crc kubenswrapper[4919]: I0310 21:50:33.612070 4919 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 21:50:33 crc kubenswrapper[4919]: I0310 21:50:33.618352 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:50:33 crc kubenswrapper[4919]: I0310 21:50:33.618465 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:50:33 crc kubenswrapper[4919]: I0310 21:50:33.618491 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:50:33 crc kubenswrapper[4919]: I0310 21:50:33.619495 4919 scope.go:117] "RemoveContainer" containerID="79a0a667a93c835771975a1105ed442dff0a1d3b4a8c9e0558435f164fb57bf9" Mar 10 21:50:33 crc kubenswrapper[4919]: E0310 21:50:33.619844 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 10 21:50:33 crc kubenswrapper[4919]: I0310 21:50:33.623672 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 21:50:34 crc kubenswrapper[4919]: I0310 21:50:34.321991 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Mar 10 21:50:34 crc kubenswrapper[4919]: I0310 21:50:34.323748 4919 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 21:50:34 crc kubenswrapper[4919]: I0310 21:50:34.325958 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:50:34 crc kubenswrapper[4919]: I0310 21:50:34.326008 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:50:34 crc kubenswrapper[4919]: I0310 21:50:34.326350 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:50:34 crc kubenswrapper[4919]: I0310 21:50:34.338833 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Mar 10 21:50:34 crc kubenswrapper[4919]: I0310 21:50:34.408028 4919 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:50:34Z is after 2026-02-23T05:33:13Z Mar 10 21:50:34 crc kubenswrapper[4919]: I0310 21:50:34.424018 4919 
kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 21:50:34 crc kubenswrapper[4919]: E0310 21:50:34.426082 4919 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:50:34Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 10 21:50:34 crc kubenswrapper[4919]: I0310 21:50:34.426323 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:50:34 crc kubenswrapper[4919]: I0310 21:50:34.426370 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:50:34 crc kubenswrapper[4919]: I0310 21:50:34.426435 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:50:34 crc kubenswrapper[4919]: I0310 21:50:34.426473 4919 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 10 21:50:34 crc kubenswrapper[4919]: E0310 21:50:34.429807 4919 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:50:34Z is after 2026-02-23T05:33:13Z" node="crc" Mar 10 21:50:34 crc kubenswrapper[4919]: I0310 21:50:34.614581 4919 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 21:50:34 crc kubenswrapper[4919]: I0310 21:50:34.615110 4919 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 21:50:34 crc kubenswrapper[4919]: I0310 21:50:34.616042 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Mar 10 21:50:34 crc kubenswrapper[4919]: I0310 21:50:34.616248 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:50:34 crc kubenswrapper[4919]: I0310 21:50:34.616441 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:50:34 crc kubenswrapper[4919]: I0310 21:50:34.617168 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:50:34 crc kubenswrapper[4919]: I0310 21:50:34.617237 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:50:34 crc kubenswrapper[4919]: I0310 21:50:34.617262 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:50:34 crc kubenswrapper[4919]: I0310 21:50:34.618250 4919 scope.go:117] "RemoveContainer" containerID="79a0a667a93c835771975a1105ed442dff0a1d3b4a8c9e0558435f164fb57bf9" Mar 10 21:50:34 crc kubenswrapper[4919]: E0310 21:50:34.618543 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 10 21:50:35 crc kubenswrapper[4919]: W0310 21:50:35.036360 4919 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:50:35Z is after 2026-02-23T05:33:13Z Mar 10 21:50:35 crc kubenswrapper[4919]: E0310 
21:50:35.036492 4919 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:50:35Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 10 21:50:35 crc kubenswrapper[4919]: I0310 21:50:35.408381 4919 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:50:35Z is after 2026-02-23T05:33:13Z Mar 10 21:50:35 crc kubenswrapper[4919]: W0310 21:50:35.954310 4919 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:50:35Z is after 2026-02-23T05:33:13Z Mar 10 21:50:35 crc kubenswrapper[4919]: E0310 21:50:35.954466 4919 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:50:35Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 10 21:50:36 crc kubenswrapper[4919]: I0310 21:50:36.405983 4919 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:50:36Z is after 2026-02-23T05:33:13Z Mar 10 21:50:36 crc kubenswrapper[4919]: I0310 21:50:36.827630 4919 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 10 21:50:36 crc kubenswrapper[4919]: E0310 21:50:36.832548 4919 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:50:36Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 10 21:50:36 crc kubenswrapper[4919]: W0310 21:50:36.843068 4919 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:50:36Z is after 2026-02-23T05:33:13Z Mar 10 21:50:36 crc kubenswrapper[4919]: E0310 21:50:36.843163 4919 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:50:36Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 10 21:50:37 crc kubenswrapper[4919]: I0310 21:50:37.406312 4919 csi_plugin.go:884] Failed to contact API 
server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:50:37Z is after 2026-02-23T05:33:13Z Mar 10 21:50:38 crc kubenswrapper[4919]: E0310 21:50:38.022675 4919 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:50:38Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189b9948228b0d69 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 21:50:13.396655465 +0000 UTC m=+0.638536113,LastTimestamp:2026-03-10 21:50:13.396655465 +0000 UTC m=+0.638536113,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 21:50:38 crc kubenswrapper[4919]: I0310 21:50:38.406058 4919 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:50:38Z is after 2026-02-23T05:33:13Z Mar 10 21:50:39 crc kubenswrapper[4919]: I0310 21:50:39.079312 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 21:50:39 crc kubenswrapper[4919]: I0310 21:50:39.079621 4919 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 
21:50:39 crc kubenswrapper[4919]: I0310 21:50:39.081122 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:50:39 crc kubenswrapper[4919]: I0310 21:50:39.081187 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:50:39 crc kubenswrapper[4919]: I0310 21:50:39.081211 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:50:39 crc kubenswrapper[4919]: I0310 21:50:39.082231 4919 scope.go:117] "RemoveContainer" containerID="79a0a667a93c835771975a1105ed442dff0a1d3b4a8c9e0558435f164fb57bf9" Mar 10 21:50:39 crc kubenswrapper[4919]: E0310 21:50:39.082676 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 10 21:50:39 crc kubenswrapper[4919]: I0310 21:50:39.405137 4919 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:50:39Z is after 2026-02-23T05:33:13Z Mar 10 21:50:40 crc kubenswrapper[4919]: I0310 21:50:40.124945 4919 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 10 21:50:40 crc 
kubenswrapper[4919]: I0310 21:50:40.125030 4919 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 10 21:50:40 crc kubenswrapper[4919]: I0310 21:50:40.125094 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 21:50:40 crc kubenswrapper[4919]: I0310 21:50:40.125236 4919 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 21:50:40 crc kubenswrapper[4919]: I0310 21:50:40.126280 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:50:40 crc kubenswrapper[4919]: I0310 21:50:40.126310 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:50:40 crc kubenswrapper[4919]: I0310 21:50:40.126322 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:50:40 crc kubenswrapper[4919]: I0310 21:50:40.126760 4919 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"aa0e153307a5d1fb56a85cc525ad6ffe2d83bf4e5799981cdeae69f97cfd741e"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted" Mar 10 21:50:40 crc kubenswrapper[4919]: I0310 21:50:40.126908 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://aa0e153307a5d1fb56a85cc525ad6ffe2d83bf4e5799981cdeae69f97cfd741e" gracePeriod=30 Mar 10 21:50:40 crc kubenswrapper[4919]: I0310 21:50:40.405158 4919 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:50:40Z is after 2026-02-23T05:33:13Z Mar 10 21:50:40 crc kubenswrapper[4919]: I0310 21:50:40.631678 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 10 21:50:40 crc kubenswrapper[4919]: I0310 21:50:40.632225 4919 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="aa0e153307a5d1fb56a85cc525ad6ffe2d83bf4e5799981cdeae69f97cfd741e" exitCode=255 Mar 10 21:50:40 crc kubenswrapper[4919]: I0310 21:50:40.632418 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"aa0e153307a5d1fb56a85cc525ad6ffe2d83bf4e5799981cdeae69f97cfd741e"} Mar 10 21:50:40 crc kubenswrapper[4919]: I0310 21:50:40.632527 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"f30803ec8ed4cbd053df2777bfe3077a7637972562508205711b357011e453dc"} Mar 10 21:50:40 crc kubenswrapper[4919]: I0310 21:50:40.632669 4919 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 21:50:40 crc kubenswrapper[4919]: I0310 21:50:40.634279 4919 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:50:40 crc kubenswrapper[4919]: I0310 21:50:40.634469 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:50:40 crc kubenswrapper[4919]: I0310 21:50:40.634612 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:50:40 crc kubenswrapper[4919]: I0310 21:50:40.815113 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 21:50:41 crc kubenswrapper[4919]: I0310 21:50:41.405721 4919 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:50:41Z is after 2026-02-23T05:33:13Z Mar 10 21:50:41 crc kubenswrapper[4919]: I0310 21:50:41.430365 4919 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 21:50:41 crc kubenswrapper[4919]: E0310 21:50:41.430453 4919 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:50:41Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 10 21:50:41 crc kubenswrapper[4919]: I0310 21:50:41.431588 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:50:41 crc kubenswrapper[4919]: I0310 21:50:41.431629 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:50:41 crc 
kubenswrapper[4919]: I0310 21:50:41.431644 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:50:41 crc kubenswrapper[4919]: I0310 21:50:41.431672 4919 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 10 21:50:41 crc kubenswrapper[4919]: E0310 21:50:41.435593 4919 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:50:41Z is after 2026-02-23T05:33:13Z" node="crc" Mar 10 21:50:41 crc kubenswrapper[4919]: I0310 21:50:41.633797 4919 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 21:50:41 crc kubenswrapper[4919]: I0310 21:50:41.634624 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:50:41 crc kubenswrapper[4919]: I0310 21:50:41.634648 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:50:41 crc kubenswrapper[4919]: I0310 21:50:41.634656 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:50:42 crc kubenswrapper[4919]: I0310 21:50:42.405364 4919 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:50:42Z is after 2026-02-23T05:33:13Z Mar 10 21:50:42 crc kubenswrapper[4919]: I0310 21:50:42.638736 4919 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 21:50:42 crc kubenswrapper[4919]: I0310 21:50:42.639831 4919 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:50:42 crc kubenswrapper[4919]: I0310 21:50:42.639877 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:50:42 crc kubenswrapper[4919]: I0310 21:50:42.639893 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:50:43 crc kubenswrapper[4919]: W0310 21:50:43.287375 4919 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:50:43Z is after 2026-02-23T05:33:13Z Mar 10 21:50:43 crc kubenswrapper[4919]: E0310 21:50:43.287672 4919 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:50:43Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 10 21:50:43 crc kubenswrapper[4919]: I0310 21:50:43.405359 4919 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:50:43Z is after 2026-02-23T05:33:13Z Mar 10 21:50:43 crc kubenswrapper[4919]: E0310 21:50:43.566372 4919 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 10 21:50:44 crc kubenswrapper[4919]: I0310 21:50:44.405237 4919 csi_plugin.go:884] Failed 
to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:50:44Z is after 2026-02-23T05:33:13Z Mar 10 21:50:45 crc kubenswrapper[4919]: I0310 21:50:45.407041 4919 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:50:45Z is after 2026-02-23T05:33:13Z Mar 10 21:50:46 crc kubenswrapper[4919]: I0310 21:50:46.405713 4919 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:50:46Z is after 2026-02-23T05:33:13Z Mar 10 21:50:47 crc kubenswrapper[4919]: I0310 21:50:47.123864 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 21:50:47 crc kubenswrapper[4919]: I0310 21:50:47.124580 4919 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 21:50:47 crc kubenswrapper[4919]: I0310 21:50:47.125551 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:50:47 crc kubenswrapper[4919]: I0310 21:50:47.125584 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:50:47 crc kubenswrapper[4919]: I0310 21:50:47.125597 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 
21:50:47 crc kubenswrapper[4919]: I0310 21:50:47.406378 4919 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:50:47Z is after 2026-02-23T05:33:13Z Mar 10 21:50:48 crc kubenswrapper[4919]: E0310 21:50:48.026506 4919 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:50:48Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189b9948228b0d69 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 21:50:13.396655465 +0000 UTC m=+0.638536113,LastTimestamp:2026-03-10 21:50:13.396655465 +0000 UTC m=+0.638536113,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 21:50:48 crc kubenswrapper[4919]: I0310 21:50:48.405242 4919 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:50:48Z is after 2026-02-23T05:33:13Z Mar 10 21:50:48 crc kubenswrapper[4919]: I0310 21:50:48.436417 4919 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 21:50:48 crc kubenswrapper[4919]: E0310 21:50:48.436480 4919 controller.go:145] 
"Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:50:48Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 10 21:50:48 crc kubenswrapper[4919]: I0310 21:50:48.437516 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:50:48 crc kubenswrapper[4919]: I0310 21:50:48.437596 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:50:48 crc kubenswrapper[4919]: I0310 21:50:48.437616 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:50:48 crc kubenswrapper[4919]: I0310 21:50:48.437651 4919 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 10 21:50:48 crc kubenswrapper[4919]: E0310 21:50:48.440033 4919 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:50:48Z is after 2026-02-23T05:33:13Z" node="crc" Mar 10 21:50:49 crc kubenswrapper[4919]: I0310 21:50:49.405590 4919 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:50:49Z is after 2026-02-23T05:33:13Z Mar 10 21:50:50 crc kubenswrapper[4919]: I0310 21:50:50.124707 4919 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get 
\"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 10 21:50:50 crc kubenswrapper[4919]: I0310 21:50:50.124849 4919 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 10 21:50:50 crc kubenswrapper[4919]: I0310 21:50:50.405556 4919 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:50:50Z is after 2026-02-23T05:33:13Z Mar 10 21:50:51 crc kubenswrapper[4919]: I0310 21:50:51.408149 4919 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:50:51Z is after 2026-02-23T05:33:13Z Mar 10 21:50:51 crc kubenswrapper[4919]: I0310 21:50:51.479240 4919 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 21:50:51 crc kubenswrapper[4919]: I0310 21:50:51.480518 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:50:51 crc kubenswrapper[4919]: I0310 21:50:51.480783 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:50:51 crc kubenswrapper[4919]: I0310 21:50:51.480999 4919 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:50:51 crc kubenswrapper[4919]: I0310 21:50:51.482202 4919 scope.go:117] "RemoveContainer" containerID="79a0a667a93c835771975a1105ed442dff0a1d3b4a8c9e0558435f164fb57bf9" Mar 10 21:50:51 crc kubenswrapper[4919]: W0310 21:50:51.945558 4919 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:50:51Z is after 2026-02-23T05:33:13Z Mar 10 21:50:51 crc kubenswrapper[4919]: E0310 21:50:51.945659 4919 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:50:51Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 10 21:50:52 crc kubenswrapper[4919]: I0310 21:50:52.404847 4919 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:50:52Z is after 2026-02-23T05:33:13Z Mar 10 21:50:52 crc kubenswrapper[4919]: I0310 21:50:52.662440 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 10 21:50:52 crc kubenswrapper[4919]: I0310 21:50:52.663699 4919 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 10 21:50:52 crc kubenswrapper[4919]: I0310 21:50:52.665340 4919 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="95d835488672290c01e0cbd49c42bea7f7747e53f3babd69675c60aa2837820b" exitCode=255 Mar 10 21:50:52 crc kubenswrapper[4919]: I0310 21:50:52.665404 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"95d835488672290c01e0cbd49c42bea7f7747e53f3babd69675c60aa2837820b"} Mar 10 21:50:52 crc kubenswrapper[4919]: I0310 21:50:52.665440 4919 scope.go:117] "RemoveContainer" containerID="79a0a667a93c835771975a1105ed442dff0a1d3b4a8c9e0558435f164fb57bf9" Mar 10 21:50:52 crc kubenswrapper[4919]: I0310 21:50:52.665583 4919 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 21:50:52 crc kubenswrapper[4919]: I0310 21:50:52.666295 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:50:52 crc kubenswrapper[4919]: I0310 21:50:52.666346 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:50:52 crc kubenswrapper[4919]: I0310 21:50:52.666366 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:50:52 crc kubenswrapper[4919]: I0310 21:50:52.667075 4919 scope.go:117] "RemoveContainer" containerID="95d835488672290c01e0cbd49c42bea7f7747e53f3babd69675c60aa2837820b" Mar 10 21:50:52 crc kubenswrapper[4919]: E0310 21:50:52.667346 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 10 21:50:52 crc kubenswrapper[4919]: I0310 21:50:52.850525 4919 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 21:50:52 crc kubenswrapper[4919]: I0310 21:50:52.935317 4919 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 10 21:50:52 crc kubenswrapper[4919]: E0310 21:50:52.939616 4919 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:50:52Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 10 21:50:52 crc kubenswrapper[4919]: E0310 21:50:52.940799 4919 certificate_manager.go:440] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Reached backoff limit, still unable to rotate certs: timed out waiting for the condition" logger="UnhandledError" Mar 10 21:50:53 crc kubenswrapper[4919]: I0310 21:50:53.408226 4919 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:50:53Z is after 2026-02-23T05:33:13Z Mar 10 21:50:53 crc kubenswrapper[4919]: E0310 21:50:53.566707 4919 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 10 
21:50:53 crc kubenswrapper[4919]: I0310 21:50:53.669225 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 10 21:50:53 crc kubenswrapper[4919]: I0310 21:50:53.671178 4919 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 21:50:53 crc kubenswrapper[4919]: I0310 21:50:53.671947 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:50:53 crc kubenswrapper[4919]: I0310 21:50:53.671977 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:50:53 crc kubenswrapper[4919]: I0310 21:50:53.671988 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:50:53 crc kubenswrapper[4919]: I0310 21:50:53.672538 4919 scope.go:117] "RemoveContainer" containerID="95d835488672290c01e0cbd49c42bea7f7747e53f3babd69675c60aa2837820b" Mar 10 21:50:53 crc kubenswrapper[4919]: E0310 21:50:53.672714 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 10 21:50:54 crc kubenswrapper[4919]: I0310 21:50:54.405019 4919 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:50:54Z is after 2026-02-23T05:33:13Z Mar 10 21:50:55 crc kubenswrapper[4919]: 
I0310 21:50:55.406580 4919 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:50:55Z is after 2026-02-23T05:33:13Z Mar 10 21:50:55 crc kubenswrapper[4919]: I0310 21:50:55.440146 4919 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 21:50:55 crc kubenswrapper[4919]: E0310 21:50:55.440896 4919 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:50:55Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 10 21:50:55 crc kubenswrapper[4919]: I0310 21:50:55.441555 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:50:55 crc kubenswrapper[4919]: I0310 21:50:55.441586 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:50:55 crc kubenswrapper[4919]: I0310 21:50:55.441594 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:50:55 crc kubenswrapper[4919]: I0310 21:50:55.441615 4919 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 10 21:50:55 crc kubenswrapper[4919]: E0310 21:50:55.444824 4919 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:50:55Z is after 2026-02-23T05:33:13Z" node="crc" Mar 10 21:50:56 crc kubenswrapper[4919]: 
I0310 21:50:56.407560 4919 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:50:56Z is after 2026-02-23T05:33:13Z Mar 10 21:50:56 crc kubenswrapper[4919]: W0310 21:50:56.716270 4919 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:50:56Z is after 2026-02-23T05:33:13Z Mar 10 21:50:56 crc kubenswrapper[4919]: E0310 21:50:56.716370 4919 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:50:56Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 10 21:50:57 crc kubenswrapper[4919]: I0310 21:50:57.406316 4919 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:50:57Z is after 2026-02-23T05:33:13Z Mar 10 21:50:58 crc kubenswrapper[4919]: E0310 21:50:58.031054 4919 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-03-10T21:50:58Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189b9948228b0d69 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 21:50:13.396655465 +0000 UTC m=+0.638536113,LastTimestamp:2026-03-10 21:50:13.396655465 +0000 UTC m=+0.638536113,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 21:50:58 crc kubenswrapper[4919]: I0310 21:50:58.405558 4919 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:50:58Z is after 2026-02-23T05:33:13Z Mar 10 21:50:59 crc kubenswrapper[4919]: I0310 21:50:59.078982 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 21:50:59 crc kubenswrapper[4919]: I0310 21:50:59.079201 4919 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 21:50:59 crc kubenswrapper[4919]: I0310 21:50:59.080489 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:50:59 crc kubenswrapper[4919]: I0310 21:50:59.080570 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:50:59 crc kubenswrapper[4919]: I0310 21:50:59.080583 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:50:59 crc kubenswrapper[4919]: I0310 21:50:59.081660 4919 scope.go:117] 
"RemoveContainer" containerID="95d835488672290c01e0cbd49c42bea7f7747e53f3babd69675c60aa2837820b" Mar 10 21:50:59 crc kubenswrapper[4919]: E0310 21:50:59.082045 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 10 21:50:59 crc kubenswrapper[4919]: I0310 21:50:59.406267 4919 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:50:59Z is after 2026-02-23T05:33:13Z Mar 10 21:51:00 crc kubenswrapper[4919]: W0310 21:51:00.025051 4919 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:51:00Z is after 2026-02-23T05:33:13Z Mar 10 21:51:00 crc kubenswrapper[4919]: E0310 21:51:00.025120 4919 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:51:00Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 10 21:51:00 crc kubenswrapper[4919]: I0310 21:51:00.124866 4919 patch_prober.go:28] interesting 
pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 10 21:51:00 crc kubenswrapper[4919]: I0310 21:51:00.124990 4919 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 10 21:51:00 crc kubenswrapper[4919]: I0310 21:51:00.405814 4919 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:51:00Z is after 2026-02-23T05:33:13Z Mar 10 21:51:01 crc kubenswrapper[4919]: I0310 21:51:01.405668 4919 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:51:01Z is after 2026-02-23T05:33:13Z Mar 10 21:51:02 crc kubenswrapper[4919]: I0310 21:51:02.410705 4919 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 21:51:02 crc kubenswrapper[4919]: I0310 21:51:02.445553 4919 kubelet_node_status.go:401] "Setting node annotation to enable 
volume controller attach/detach" Mar 10 21:51:02 crc kubenswrapper[4919]: I0310 21:51:02.447055 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:02 crc kubenswrapper[4919]: I0310 21:51:02.447103 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:02 crc kubenswrapper[4919]: I0310 21:51:02.447120 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:02 crc kubenswrapper[4919]: I0310 21:51:02.447153 4919 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 10 21:51:02 crc kubenswrapper[4919]: E0310 21:51:02.449887 4919 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 10 21:51:02 crc kubenswrapper[4919]: E0310 21:51:02.450796 4919 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 10 21:51:03 crc kubenswrapper[4919]: I0310 21:51:03.407676 4919 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 21:51:03 crc kubenswrapper[4919]: E0310 21:51:03.567323 4919 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 10 21:51:04 crc kubenswrapper[4919]: W0310 21:51:04.248635 4919 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User 
"system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope Mar 10 21:51:04 crc kubenswrapper[4919]: E0310 21:51:04.249015 4919 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 10 21:51:04 crc kubenswrapper[4919]: I0310 21:51:04.409287 4919 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 21:51:05 crc kubenswrapper[4919]: I0310 21:51:05.409311 4919 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 21:51:06 crc kubenswrapper[4919]: I0310 21:51:06.409348 4919 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 21:51:07 crc kubenswrapper[4919]: I0310 21:51:07.410476 4919 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 21:51:08 crc kubenswrapper[4919]: E0310 21:51:08.039440 4919 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" 
event="&Event{ObjectMeta:{crc.189b9948228b0d69 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 21:50:13.396655465 +0000 UTC m=+0.638536113,LastTimestamp:2026-03-10 21:50:13.396655465 +0000 UTC m=+0.638536113,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 21:51:08 crc kubenswrapper[4919]: E0310 21:51:08.047677 4919 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b99482663dfc9 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 21:50:13.461196745 +0000 UTC m=+0.703077393,LastTimestamp:2026-03-10 21:50:13.461196745 +0000 UTC m=+0.703077393,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 21:51:08 crc kubenswrapper[4919]: E0310 21:51:08.055689 4919 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b994826645e28 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 21:50:13.461229096 +0000 UTC m=+0.703109744,LastTimestamp:2026-03-10 21:50:13.461229096 +0000 UTC m=+0.703109744,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 21:51:08 crc kubenswrapper[4919]: E0310 21:51:08.062583 4919 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b994826649dac default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 21:50:13.461245356 +0000 UTC m=+0.703126004,LastTimestamp:2026-03-10 21:50:13.461245356 +0000 UTC m=+0.703126004,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 21:51:08 crc kubenswrapper[4919]: E0310 21:51:08.069772 4919 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b99482bd6f6e8 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeAllocatableEnforced,Message:Updated Node Allocatable limit across 
pods,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 21:50:13.552625384 +0000 UTC m=+0.794506032,LastTimestamp:2026-03-10 21:50:13.552625384 +0000 UTC m=+0.794506032,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 21:51:08 crc kubenswrapper[4919]: E0310 21:51:08.078043 4919 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b99482663dfc9\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b99482663dfc9 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 21:50:13.461196745 +0000 UTC m=+0.703077393,LastTimestamp:2026-03-10 21:50:13.580312983 +0000 UTC m=+0.822193621,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 21:51:08 crc kubenswrapper[4919]: E0310 21:51:08.085105 4919 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b994826645e28\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b994826645e28 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 21:50:13.461229096 +0000 UTC m=+0.703109744,LastTimestamp:2026-03-10 
21:50:13.580335194 +0000 UTC m=+0.822215812,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 21:51:08 crc kubenswrapper[4919]: E0310 21:51:08.092759 4919 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b994826649dac\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b994826649dac default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 21:50:13.461245356 +0000 UTC m=+0.703126004,LastTimestamp:2026-03-10 21:50:13.580350834 +0000 UTC m=+0.822231452,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 21:51:08 crc kubenswrapper[4919]: E0310 21:51:08.100579 4919 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b99482663dfc9\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b99482663dfc9 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 21:50:13.461196745 +0000 UTC m=+0.703077393,LastTimestamp:2026-03-10 21:50:13.581661079 +0000 UTC m=+0.823541697,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 21:51:08 crc kubenswrapper[4919]: E0310 21:51:08.108782 4919 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b994826645e28\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b994826645e28 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 21:50:13.461229096 +0000 UTC m=+0.703109744,LastTimestamp:2026-03-10 21:50:13.581685439 +0000 UTC m=+0.823566067,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 21:51:08 crc kubenswrapper[4919]: E0310 21:51:08.113572 4919 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b994826649dac\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b994826649dac default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 21:50:13.461245356 +0000 UTC m=+0.703126004,LastTimestamp:2026-03-10 21:50:13.58169838 +0000 UTC m=+0.823578998,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 21:51:08 crc kubenswrapper[4919]: E0310 21:51:08.117140 4919 event.go:359] 
"Server rejected event (will not retry!)" err="events \"crc.189b99482663dfc9\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b99482663dfc9 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 21:50:13.461196745 +0000 UTC m=+0.703077393,LastTimestamp:2026-03-10 21:50:13.582043789 +0000 UTC m=+0.823924437,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 21:51:08 crc kubenswrapper[4919]: E0310 21:51:08.121809 4919 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b994826645e28\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b994826645e28 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 21:50:13.461229096 +0000 UTC m=+0.703109744,LastTimestamp:2026-03-10 21:50:13.582076849 +0000 UTC m=+0.823957497,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 21:51:08 crc kubenswrapper[4919]: E0310 21:51:08.124312 4919 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b994826649dac\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API 
group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b994826649dac default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 21:50:13.461245356 +0000 UTC m=+0.703126004,LastTimestamp:2026-03-10 21:50:13.58209742 +0000 UTC m=+0.823978068,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 21:51:08 crc kubenswrapper[4919]: E0310 21:51:08.130000 4919 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b99482663dfc9\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b99482663dfc9 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 21:50:13.461196745 +0000 UTC m=+0.703077393,LastTimestamp:2026-03-10 21:50:13.583000374 +0000 UTC m=+0.824881012,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 21:51:08 crc kubenswrapper[4919]: E0310 21:51:08.135223 4919 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b99482663dfc9\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b99482663dfc9 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 21:50:13.461196745 +0000 UTC m=+0.703077393,LastTimestamp:2026-03-10 21:50:13.583012134 +0000 UTC m=+0.824892752,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 21:51:08 crc kubenswrapper[4919]: E0310 21:51:08.142897 4919 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b994826645e28\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b994826645e28 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 21:50:13.461229096 +0000 UTC m=+0.703109744,LastTimestamp:2026-03-10 21:50:13.583024575 +0000 UTC m=+0.824905223,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 21:51:08 crc kubenswrapper[4919]: E0310 21:51:08.150142 4919 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b994826645e28\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b994826645e28 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc 
status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 21:50:13.461229096 +0000 UTC m=+0.703109744,LastTimestamp:2026-03-10 21:50:13.583030715 +0000 UTC m=+0.824911333,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 21:51:08 crc kubenswrapper[4919]: E0310 21:51:08.157292 4919 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b994826649dac\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b994826649dac default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 21:50:13.461245356 +0000 UTC m=+0.703126004,LastTimestamp:2026-03-10 21:50:13.583041795 +0000 UTC m=+0.824922443,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 21:51:08 crc kubenswrapper[4919]: E0310 21:51:08.168169 4919 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b994826649dac\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b994826649dac default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 21:50:13.461245356 +0000 UTC 
m=+0.703126004,LastTimestamp:2026-03-10 21:50:13.583047155 +0000 UTC m=+0.824927783,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 21:51:08 crc kubenswrapper[4919]: E0310 21:51:08.176894 4919 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b99482663dfc9\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b99482663dfc9 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 21:50:13.461196745 +0000 UTC m=+0.703077393,LastTimestamp:2026-03-10 21:50:13.585088109 +0000 UTC m=+0.826968747,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 21:51:08 crc kubenswrapper[4919]: E0310 21:51:08.182151 4919 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b994826645e28\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b994826645e28 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 21:50:13.461229096 +0000 UTC m=+0.703109744,LastTimestamp:2026-03-10 21:50:13.585113679 +0000 UTC m=+0.826994327,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 21:51:08 crc kubenswrapper[4919]: E0310 21:51:08.189424 4919 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b994826649dac\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b994826649dac default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 21:50:13.461245356 +0000 UTC m=+0.703126004,LastTimestamp:2026-03-10 21:50:13.58513117 +0000 UTC m=+0.827011818,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 21:51:08 crc kubenswrapper[4919]: E0310 21:51:08.194801 4919 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b99482663dfc9\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b99482663dfc9 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 21:50:13.461196745 +0000 UTC m=+0.703077393,LastTimestamp:2026-03-10 21:50:13.58629189 +0000 UTC m=+0.828172508,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 21:51:08 crc kubenswrapper[4919]: E0310 21:51:08.201846 4919 
event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b994826645e28\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b994826645e28 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 21:50:13.461229096 +0000 UTC m=+0.703109744,LastTimestamp:2026-03-10 21:50:13.586307301 +0000 UTC m=+0.828187919,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 21:51:08 crc kubenswrapper[4919]: E0310 21:51:08.208080 4919 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189b9948453a180f openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 21:50:13.978552335 +0000 UTC m=+1.220432943,LastTimestamp:2026-03-10 21:50:13.978552335 +0000 UTC m=+1.220432943,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 21:51:08 crc kubenswrapper[4919]: E0310 21:51:08.213525 4919 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b994845420752 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 21:50:13.979072338 +0000 UTC m=+1.220952946,LastTimestamp:2026-03-10 21:50:13.979072338 +0000 UTC m=+1.220952946,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 21:51:08 crc kubenswrapper[4919]: E0310 21:51:08.214939 4919 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b994845b98879 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" 
already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 21:50:13.986904185 +0000 UTC m=+1.228784833,LastTimestamp:2026-03-10 21:50:13.986904185 +0000 UTC m=+1.228784833,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 21:51:08 crc kubenswrapper[4919]: E0310 21:51:08.220630 4919 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b994846ff384d openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 21:50:14.008248397 +0000 UTC m=+1.250129005,LastTimestamp:2026-03-10 21:50:14.008248397 +0000 UTC m=+1.250129005,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 21:51:08 crc kubenswrapper[4919]: E0310 21:51:08.227371 4919 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b994847babf9f openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 21:50:14.020538271 +0000 UTC m=+1.262418919,LastTimestamp:2026-03-10 21:50:14.020538271 +0000 UTC m=+1.262418919,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 21:51:08 crc kubenswrapper[4919]: E0310 21:51:08.235281 4919 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b9948691cd97b openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Created,Message:Created container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 21:50:14.580615547 +0000 UTC m=+1.822496165,LastTimestamp:2026-03-10 21:50:14.580615547 +0000 UTC m=+1.822496165,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 21:51:08 crc kubenswrapper[4919]: E0310 21:51:08.239776 4919 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User 
\"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b99486925902a openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Created,Message:Created container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 21:50:14.581186602 +0000 UTC m=+1.823067220,LastTimestamp:2026-03-10 21:50:14.581186602 +0000 UTC m=+1.823067220,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 21:51:08 crc kubenswrapper[4919]: E0310 21:51:08.243640 4919 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b9948692ee04a openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 21:50:14.581796938 +0000 UTC m=+1.823677556,LastTimestamp:2026-03-10 21:50:14.581796938 +0000 UTC m=+1.823677556,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 21:51:08 crc kubenswrapper[4919]: E0310 21:51:08.246930 4919 event.go:359] "Server rejected event 
(will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189b99486935966b openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 21:50:14.582236779 +0000 UTC m=+1.824117397,LastTimestamp:2026-03-10 21:50:14.582236779 +0000 UTC m=+1.824117397,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 21:51:08 crc kubenswrapper[4919]: E0310 21:51:08.249946 4919 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b99486952daa6 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 21:50:14.58415479 +0000 UTC m=+1.826035408,LastTimestamp:2026-03-10 21:50:14.58415479 +0000 UTC m=+1.826035408,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 21:51:08 crc kubenswrapper[4919]: E0310 21:51:08.252925 4919 event.go:359] "Server rejected event (will not 
retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b994869aea47f openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Started,Message:Started container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 21:50:14.590170239 +0000 UTC m=+1.832050847,LastTimestamp:2026-03-10 21:50:14.590170239 +0000 UTC m=+1.832050847,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 21:51:08 crc kubenswrapper[4919]: E0310 21:51:08.256082 4919 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b99486a019ea8 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Started,Message:Started container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 21:50:14.595608232 +0000 UTC m=+1.837488840,LastTimestamp:2026-03-10 21:50:14.595608232 +0000 UTC m=+1.837488840,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 21:51:08 crc kubenswrapper[4919]: E0310 21:51:08.259101 4919 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b99486a14bfb3 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 21:50:14.596861875 +0000 UTC m=+1.838742493,LastTimestamp:2026-03-10 21:50:14.596861875 +0000 UTC m=+1.838742493,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 21:51:08 crc kubenswrapper[4919]: E0310 21:51:08.264477 4919 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b99486a2bd153 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container 
setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 21:50:14.598373715 +0000 UTC m=+1.840254333,LastTimestamp:2026-03-10 21:50:14.598373715 +0000 UTC m=+1.840254333,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 21:51:08 crc kubenswrapper[4919]: E0310 21:51:08.270957 4919 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189b99486a5cf025 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 21:50:14.601592869 +0000 UTC m=+1.843473497,LastTimestamp:2026-03-10 21:50:14.601592869 +0000 UTC m=+1.843473497,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 21:51:08 crc kubenswrapper[4919]: E0310 21:51:08.278315 4919 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b99486a73df5b openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container 
setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 21:50:14.603095899 +0000 UTC m=+1.844976527,LastTimestamp:2026-03-10 21:50:14.603095899 +0000 UTC m=+1.844976527,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 21:51:08 crc kubenswrapper[4919]: E0310 21:51:08.285548 4919 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b99487b68d148 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 21:50:14.887584072 +0000 UTC m=+2.129464720,LastTimestamp:2026-03-10 21:50:14.887584072 +0000 UTC m=+2.129464720,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 21:51:08 crc kubenswrapper[4919]: E0310 21:51:08.292538 4919 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b99487bf3ad51 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 21:50:14.896684369 +0000 UTC m=+2.138565007,LastTimestamp:2026-03-10 21:50:14.896684369 +0000 UTC m=+2.138565007,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 21:51:08 crc kubenswrapper[4919]: E0310 21:51:08.299105 4919 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b99487c093d08 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 21:50:14.898097416 +0000 UTC m=+2.139978024,LastTimestamp:2026-03-10 21:50:14.898097416 +0000 UTC m=+2.139978024,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 21:51:08 crc kubenswrapper[4919]: E0310 21:51:08.306818 4919 event.go:359] "Server rejected event (will not 
retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b994888d39ebd openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Created,Message:Created container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 21:50:15.112687293 +0000 UTC m=+2.354567911,LastTimestamp:2026-03-10 21:50:15.112687293 +0000 UTC m=+2.354567911,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 21:51:08 crc kubenswrapper[4919]: E0310 21:51:08.313988 4919 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b994889f90549 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Started,Message:Started container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 21:50:15.131915593 +0000 UTC m=+2.373796241,LastTimestamp:2026-03-10 21:50:15.131915593 +0000 UTC m=+2.373796241,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 21:51:08 crc kubenswrapper[4919]: E0310 21:51:08.320337 4919 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b99488a0f5dc9 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 21:50:15.133380041 +0000 UTC m=+2.375260679,LastTimestamp:2026-03-10 21:50:15.133380041 +0000 UTC m=+2.375260679,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 21:51:08 crc kubenswrapper[4919]: E0310 21:51:08.324382 4919 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b9948957b79dd openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Created,Message:Created container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 21:50:15.325014493 +0000 UTC m=+2.566895111,LastTimestamp:2026-03-10 21:50:15.325014493 +0000 UTC m=+2.566895111,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 21:51:08 crc kubenswrapper[4919]: E0310 21:51:08.326862 4919 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b994896517451 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Started,Message:Started container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 21:50:15.339037777 +0000 UTC m=+2.580918395,LastTimestamp:2026-03-10 21:50:15.339037777 +0000 UTC m=+2.580918395,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 21:51:08 crc kubenswrapper[4919]: E0310 21:51:08.331029 4919 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource 
\"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189b99489fc31f20 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 21:50:15.497482016 +0000 UTC m=+2.739362654,LastTimestamp:2026-03-10 21:50:15.497482016 +0000 UTC m=+2.739362654,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 21:51:08 crc kubenswrapper[4919]: E0310 21:51:08.334734 4919 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b99489ff4acce openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 21:50:15.50072955 +0000 UTC m=+2.742610148,LastTimestamp:2026-03-10 21:50:15.50072955 +0000 UTC 
m=+2.742610148,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 21:51:08 crc kubenswrapper[4919]: E0310 21:51:08.341339 4919 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b9948a10e7d24 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 21:50:15.5191985 +0000 UTC m=+2.761079118,LastTimestamp:2026-03-10 21:50:15.5191985 +0000 UTC m=+2.761079118,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 21:51:08 crc kubenswrapper[4919]: E0310 21:51:08.348735 4919 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b9948a114348a openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Pulled,Message:Container image 
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 21:50:15.51957313 +0000 UTC m=+2.761453778,LastTimestamp:2026-03-10 21:50:15.51957313 +0000 UTC m=+2.761453778,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 21:51:08 crc kubenswrapper[4919]: E0310 21:51:08.356570 4919 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189b9948af6ab6ab openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 21:50:15.760123563 +0000 UTC m=+3.002004171,LastTimestamp:2026-03-10 21:50:15.760123563 +0000 UTC m=+3.002004171,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 21:51:08 crc kubenswrapper[4919]: E0310 21:51:08.364667 4919 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b9948b0110784 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Created,Message:Created container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 21:50:15.771023236 +0000 UTC m=+3.012903844,LastTimestamp:2026-03-10 21:50:15.771023236 +0000 UTC m=+3.012903844,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 21:51:08 crc kubenswrapper[4919]: E0310 21:51:08.372271 4919 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189b9948b0311d2b openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 21:50:15.773125931 +0000 UTC m=+3.015006539,LastTimestamp:2026-03-10 21:50:15.773125931 +0000 UTC m=+3.015006539,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 21:51:08 crc kubenswrapper[4919]: E0310 21:51:08.379651 4919 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" 
event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b9948b033ba7b openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Created,Message:Created container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 21:50:15.773297275 +0000 UTC m=+3.015177883,LastTimestamp:2026-03-10 21:50:15.773297275 +0000 UTC m=+3.015177883,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 21:51:08 crc kubenswrapper[4919]: E0310 21:51:08.387995 4919 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b9948b04f1a84 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Created,Message:Created container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 21:50:15.775091332 +0000 UTC m=+3.016971940,LastTimestamp:2026-03-10 21:50:15.775091332 +0000 UTC m=+3.016971940,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 21:51:08 crc kubenswrapper[4919]: E0310 21:51:08.396336 4919 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group 
\"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b9948b0e2e94f openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Started,Message:Started container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 21:50:15.784778063 +0000 UTC m=+3.026658671,LastTimestamp:2026-03-10 21:50:15.784778063 +0000 UTC m=+3.026658671,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 21:51:08 crc kubenswrapper[4919]: E0310 21:51:08.403798 4919 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b9948b0f163e4 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 21:50:15.785726948 +0000 UTC m=+3.027607576,LastTimestamp:2026-03-10 21:50:15.785726948 +0000 UTC m=+3.027607576,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 
21:51:08 crc kubenswrapper[4919]: I0310 21:51:08.405472 4919 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 21:51:08 crc kubenswrapper[4919]: E0310 21:51:08.411836 4919 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b9948b11f31a0 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Started,Message:Started container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 21:50:15.788728736 +0000 UTC m=+3.030609364,LastTimestamp:2026-03-10 21:50:15.788728736 +0000 UTC m=+3.030609364,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 21:51:08 crc kubenswrapper[4919]: E0310 21:51:08.419339 4919 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b9948b192b929 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Started,Message:Started container 
kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 21:50:15.796300073 +0000 UTC m=+3.038180691,LastTimestamp:2026-03-10 21:50:15.796300073 +0000 UTC m=+3.038180691,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 21:51:08 crc kubenswrapper[4919]: E0310 21:51:08.426973 4919 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b9948b1a12658 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 21:50:15.797245528 +0000 UTC m=+3.039126136,LastTimestamp:2026-03-10 21:50:15.797245528 +0000 UTC m=+3.039126136,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 21:51:08 crc kubenswrapper[4919]: E0310 21:51:08.430178 4919 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b9948bae7ddc9 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Created,Message:Created container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 21:50:15.952874953 +0000 UTC m=+3.194755561,LastTimestamp:2026-03-10 21:50:15.952874953 +0000 UTC m=+3.194755561,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 21:51:08 crc kubenswrapper[4919]: E0310 21:51:08.434105 4919 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b9948bbdb8975 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Started,Message:Started container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 21:50:15.968844149 +0000 UTC m=+3.210724757,LastTimestamp:2026-03-10 21:50:15.968844149 +0000 UTC m=+3.210724757,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 21:51:08 crc kubenswrapper[4919]: E0310 21:51:08.437450 4919 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" 
event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b9948bbefb9df openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 21:50:15.970167263 +0000 UTC m=+3.212047871,LastTimestamp:2026-03-10 21:50:15.970167263 +0000 UTC m=+3.212047871,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 21:51:08 crc kubenswrapper[4919]: E0310 21:51:08.442287 4919 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b9948bc11e3af openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Created,Message:Created container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 21:50:15.972406191 +0000 UTC m=+3.214286809,LastTimestamp:2026-03-10 21:50:15.972406191 +0000 UTC m=+3.214286809,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 21:51:08 crc kubenswrapper[4919]: E0310 
21:51:08.444590 4919 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b9948bd1d7c65 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Started,Message:Started container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 21:50:15.989943397 +0000 UTC m=+3.231824005,LastTimestamp:2026-03-10 21:50:15.989943397 +0000 UTC m=+3.231824005,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 21:51:08 crc kubenswrapper[4919]: E0310 21:51:08.449582 4919 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b9948bd3ac878 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 21:50:15.991863416 +0000 UTC m=+3.233744024,LastTimestamp:2026-03-10 
21:50:15.991863416 +0000 UTC m=+3.233744024,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 21:51:08 crc kubenswrapper[4919]: E0310 21:51:08.452344 4919 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b9948c9a2a7d5 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Created,Message:Created container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 21:50:16.199997397 +0000 UTC m=+3.441878005,LastTimestamp:2026-03-10 21:50:16.199997397 +0000 UTC m=+3.441878005,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 21:51:08 crc kubenswrapper[4919]: E0310 21:51:08.457998 4919 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b9948c9b6e7ad openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Created,Message:Created container 
kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 21:50:16.201324461 +0000 UTC m=+3.443205069,LastTimestamp:2026-03-10 21:50:16.201324461 +0000 UTC m=+3.443205069,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 21:51:08 crc kubenswrapper[4919]: E0310 21:51:08.464981 4919 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b9948cab25d07 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Started,Message:Started container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 21:50:16.217804039 +0000 UTC m=+3.459684647,LastTimestamp:2026-03-10 21:50:16.217804039 +0000 UTC m=+3.459684647,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 21:51:08 crc kubenswrapper[4919]: E0310 21:51:08.471800 4919 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b9948cac387f3 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Started,Message:Started container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 21:50:16.218929139 +0000 UTC m=+3.460809747,LastTimestamp:2026-03-10 21:50:16.218929139 +0000 UTC m=+3.460809747,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 21:51:08 crc kubenswrapper[4919]: E0310 21:51:08.478181 4919 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b9948cad571bb openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 21:50:16.220103099 +0000 UTC m=+3.461983707,LastTimestamp:2026-03-10 21:50:16.220103099 +0000 UTC m=+3.461983707,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 21:51:08 crc kubenswrapper[4919]: E0310 21:51:08.486460 4919 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User 
\"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b9948d69286bf openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Created,Message:Created container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 21:50:16.417044159 +0000 UTC m=+3.658924777,LastTimestamp:2026-03-10 21:50:16.417044159 +0000 UTC m=+3.658924777,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 21:51:08 crc kubenswrapper[4919]: E0310 21:51:08.494258 4919 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b9948d779415c openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Started,Message:Started container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 21:50:16.432165212 +0000 UTC m=+3.674045830,LastTimestamp:2026-03-10 21:50:16.432165212 +0000 UTC m=+3.674045830,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 21:51:08 crc kubenswrapper[4919]: E0310 21:51:08.501764 
4919 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b9948d7959b98 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 21:50:16.43402332 +0000 UTC m=+3.675903928,LastTimestamp:2026-03-10 21:50:16.43402332 +0000 UTC m=+3.675903928,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 21:51:08 crc kubenswrapper[4919]: E0310 21:51:08.508769 4919 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b9948dccbd9da openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 21:50:16.521464282 +0000 UTC 
m=+3.763344890,LastTimestamp:2026-03-10 21:50:16.521464282 +0000 UTC m=+3.763344890,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 21:51:08 crc kubenswrapper[4919]: E0310 21:51:08.518309 4919 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b9948e29a12b7 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 21:50:16.618865335 +0000 UTC m=+3.860745943,LastTimestamp:2026-03-10 21:50:16.618865335 +0000 UTC m=+3.860745943,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 21:51:08 crc kubenswrapper[4919]: E0310 21:51:08.526073 4919 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b9948e3020252 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container 
kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 21:50:16.625676882 +0000 UTC m=+3.867557490,LastTimestamp:2026-03-10 21:50:16.625676882 +0000 UTC m=+3.867557490,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 21:51:08 crc kubenswrapper[4919]: E0310 21:51:08.536691 4919 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b9948e6fe713f openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Created,Message:Created container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 21:50:16.692551999 +0000 UTC m=+3.934432617,LastTimestamp:2026-03-10 21:50:16.692551999 +0000 UTC m=+3.934432617,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 21:51:08 crc kubenswrapper[4919]: E0310 21:51:08.542838 4919 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b9948e7c678e9 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Started,Message:Started container 
etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 21:50:16.705661161 +0000 UTC m=+3.947541769,LastTimestamp:2026-03-10 21:50:16.705661161 +0000 UTC m=+3.947541769,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 21:51:08 crc kubenswrapper[4919]: E0310 21:51:08.551705 4919 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b994919810482 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 21:50:17.539970178 +0000 UTC m=+4.781850816,LastTimestamp:2026-03-10 21:50:17.539970178 +0000 UTC m=+4.781850816,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 21:51:08 crc kubenswrapper[4919]: E0310 21:51:08.559068 4919 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b9949281b9564 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Created,Message:Created container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 21:50:17.784980836 +0000 UTC m=+5.026861484,LastTimestamp:2026-03-10 21:50:17.784980836 +0000 UTC m=+5.026861484,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 21:51:08 crc kubenswrapper[4919]: E0310 21:51:08.565373 4919 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b994928f0f046 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Started,Message:Started container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 21:50:17.79896327 +0000 UTC m=+5.040843918,LastTimestamp:2026-03-10 21:50:17.79896327 +0000 UTC m=+5.040843918,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 21:51:08 crc kubenswrapper[4919]: E0310 21:51:08.572919 4919 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b994929077440 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 21:50:17.800438848 +0000 UTC m=+5.042319496,LastTimestamp:2026-03-10 21:50:17.800438848 +0000 UTC m=+5.042319496,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 21:51:08 crc kubenswrapper[4919]: E0310 21:51:08.579343 4919 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b994938d08014 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Created,Message:Created container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 21:50:18.065272852 +0000 UTC m=+5.307153480,LastTimestamp:2026-03-10 21:50:18.065272852 +0000 UTC m=+5.307153480,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 21:51:08 crc kubenswrapper[4919]: E0310 21:51:08.586712 4919 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b994939ef9d99 openshift-etcd 0 0001-01-01 
00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Started,Message:Started container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 21:50:18.084089241 +0000 UTC m=+5.325969859,LastTimestamp:2026-03-10 21:50:18.084089241 +0000 UTC m=+5.325969859,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 21:51:08 crc kubenswrapper[4919]: E0310 21:51:08.593530 4919 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b994939fe357a openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 21:50:18.085045626 +0000 UTC m=+5.326926244,LastTimestamp:2026-03-10 21:50:18.085045626 +0000 UTC m=+5.326926244,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 21:51:08 crc kubenswrapper[4919]: E0310 21:51:08.600329 4919 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" 
event="&Event{ObjectMeta:{etcd-crc.189b9949464ef969 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Created,Message:Created container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 21:50:18.291665257 +0000 UTC m=+5.533545855,LastTimestamp:2026-03-10 21:50:18.291665257 +0000 UTC m=+5.533545855,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 21:51:08 crc kubenswrapper[4919]: E0310 21:51:08.607270 4919 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b994947360d52 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Started,Message:Started container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 21:50:18.30680917 +0000 UTC m=+5.548689828,LastTimestamp:2026-03-10 21:50:18.30680917 +0000 UTC m=+5.548689828,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 21:51:08 crc kubenswrapper[4919]: E0310 21:51:08.614794 4919 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b99494749e77c openshift-etcd 0 0001-01-01 
00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 21:50:18.308110204 +0000 UTC m=+5.549990822,LastTimestamp:2026-03-10 21:50:18.308110204 +0000 UTC m=+5.549990822,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 21:51:08 crc kubenswrapper[4919]: E0310 21:51:08.620449 4919 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b994954e86098 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Created,Message:Created container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 21:50:18.536599704 +0000 UTC m=+5.778480322,LastTimestamp:2026-03-10 21:50:18.536599704 +0000 UTC m=+5.778480322,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 21:51:08 crc kubenswrapper[4919]: E0310 21:51:08.626748 4919 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" 
event="&Event{ObjectMeta:{etcd-crc.189b994956199f71 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Started,Message:Started container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 21:50:18.556604273 +0000 UTC m=+5.798484891,LastTimestamp:2026-03-10 21:50:18.556604273 +0000 UTC m=+5.798484891,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 21:51:08 crc kubenswrapper[4919]: E0310 21:51:08.634201 4919 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b99495628d588 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 21:50:18.55760116 +0000 UTC m=+5.799481778,LastTimestamp:2026-03-10 21:50:18.55760116 +0000 UTC m=+5.799481778,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 21:51:08 crc kubenswrapper[4919]: E0310 21:51:08.642512 4919 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API 
group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b994963172266 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Created,Message:Created container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 21:50:18.774544998 +0000 UTC m=+6.016425646,LastTimestamp:2026-03-10 21:50:18.774544998 +0000 UTC m=+6.016425646,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 21:51:08 crc kubenswrapper[4919]: E0310 21:51:08.647496 4919 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b9949642aeb2e openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Started,Message:Started container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 21:50:18.792618798 +0000 UTC m=+6.034499446,LastTimestamp:2026-03-10 21:50:18.792618798 +0000 UTC m=+6.034499446,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 21:51:08 crc kubenswrapper[4919]: E0310 21:51:08.655213 4919 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 10 21:51:08 crc 
kubenswrapper[4919]: &Event{ObjectMeta:{kube-controller-manager-crc.189b9949b392ab01 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": context deadline exceeded (Client.Timeout exceeded while awaiting headers) Mar 10 21:51:08 crc kubenswrapper[4919]: body: Mar 10 21:51:08 crc kubenswrapper[4919]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 21:50:20.124818177 +0000 UTC m=+7.366698815,LastTimestamp:2026-03-10 21:50:20.124818177 +0000 UTC m=+7.366698815,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 10 21:51:08 crc kubenswrapper[4919]: > Mar 10 21:51:08 crc kubenswrapper[4919]: E0310 21:51:08.661735 4919 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b9949b394559e openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 21:50:20.12492739 +0000 UTC 
m=+7.366808038,LastTimestamp:2026-03-10 21:50:20.12492739 +0000 UTC m=+7.366808038,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 21:51:08 crc kubenswrapper[4919]: E0310 21:51:08.674946 4919 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 10 21:51:08 crc kubenswrapper[4919]: &Event{ObjectMeta:{kube-apiserver-crc.189b994b8aa10ad1 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 10 21:51:08 crc kubenswrapper[4919]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 10 21:51:08 crc kubenswrapper[4919]: Mar 10 21:51:08 crc kubenswrapper[4919]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 21:50:28.027828945 +0000 UTC m=+15.269709593,LastTimestamp:2026-03-10 21:50:28.027828945 +0000 UTC m=+15.269709593,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 10 21:51:08 crc kubenswrapper[4919]: > Mar 10 21:51:08 crc kubenswrapper[4919]: E0310 21:51:08.681577 4919 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" 
event="&Event{ObjectMeta:{kube-apiserver-crc.189b994b8aa1e966 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 21:50:28.027885926 +0000 UTC m=+15.269766584,LastTimestamp:2026-03-10 21:50:28.027885926 +0000 UTC m=+15.269766584,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 21:51:08 crc kubenswrapper[4919]: E0310 21:51:08.689045 4919 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189b994b8aa10ad1\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 10 21:51:08 crc kubenswrapper[4919]: &Event{ObjectMeta:{kube-apiserver-crc.189b994b8aa10ad1 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 10 21:51:08 crc kubenswrapper[4919]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 10 21:51:08 crc kubenswrapper[4919]: Mar 10 21:51:08 crc kubenswrapper[4919]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 21:50:28.027828945 +0000 UTC 
m=+15.269709593,LastTimestamp:2026-03-10 21:50:28.039116458 +0000 UTC m=+15.280997076,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 10 21:51:08 crc kubenswrapper[4919]: > Mar 10 21:51:08 crc kubenswrapper[4919]: E0310 21:51:08.696308 4919 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189b994b8aa1e966\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b994b8aa1e966 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 21:50:28.027885926 +0000 UTC m=+15.269766584,LastTimestamp:2026-03-10 21:50:28.039156389 +0000 UTC m=+15.281037007,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 21:51:08 crc kubenswrapper[4919]: E0310 21:51:08.704909 4919 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 10 21:51:08 crc kubenswrapper[4919]: &Event{ObjectMeta:{kube-apiserver-crc.189b994ba2c053a0 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 500 Mar 10 21:51:08 crc kubenswrapper[4919]: body: [+]ping ok Mar 10 21:51:08 crc kubenswrapper[4919]: [+]log ok Mar 10 21:51:08 crc kubenswrapper[4919]: [+]etcd ok Mar 10 21:51:08 crc kubenswrapper[4919]: [+]poststarthook/openshift.io-startkubeinformers ok Mar 10 21:51:08 crc kubenswrapper[4919]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok Mar 10 21:51:08 crc kubenswrapper[4919]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok Mar 10 21:51:08 crc kubenswrapper[4919]: [+]poststarthook/start-apiserver-admission-initializer ok Mar 10 21:51:08 crc kubenswrapper[4919]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Mar 10 21:51:08 crc kubenswrapper[4919]: [+]poststarthook/openshift.io-api-request-count-filter ok Mar 10 21:51:08 crc kubenswrapper[4919]: [+]poststarthook/generic-apiserver-start-informers ok Mar 10 21:51:08 crc kubenswrapper[4919]: [+]poststarthook/priority-and-fairness-config-consumer ok Mar 10 21:51:08 crc kubenswrapper[4919]: [+]poststarthook/priority-and-fairness-filter ok Mar 10 21:51:08 crc kubenswrapper[4919]: [+]poststarthook/storage-object-count-tracker-hook ok Mar 10 21:51:08 crc kubenswrapper[4919]: [+]poststarthook/start-apiextensions-informers ok Mar 10 21:51:08 crc kubenswrapper[4919]: [+]poststarthook/start-apiextensions-controllers ok Mar 10 21:51:08 crc kubenswrapper[4919]: [+]poststarthook/crd-informer-synced ok Mar 10 21:51:08 crc kubenswrapper[4919]: [+]poststarthook/start-system-namespaces-controller ok Mar 10 21:51:08 crc kubenswrapper[4919]: [+]poststarthook/start-cluster-authentication-info-controller ok Mar 10 21:51:08 crc kubenswrapper[4919]: 
[+]poststarthook/start-kube-apiserver-identity-lease-controller ok Mar 10 21:51:08 crc kubenswrapper[4919]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok Mar 10 21:51:08 crc kubenswrapper[4919]: [+]poststarthook/start-legacy-token-tracking-controller ok Mar 10 21:51:08 crc kubenswrapper[4919]: [+]poststarthook/start-service-ip-repair-controllers ok Mar 10 21:51:08 crc kubenswrapper[4919]: [-]poststarthook/rbac/bootstrap-roles failed: reason withheld Mar 10 21:51:08 crc kubenswrapper[4919]: [-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld Mar 10 21:51:08 crc kubenswrapper[4919]: [+]poststarthook/priority-and-fairness-config-producer ok Mar 10 21:51:08 crc kubenswrapper[4919]: [+]poststarthook/bootstrap-controller ok Mar 10 21:51:08 crc kubenswrapper[4919]: [+]poststarthook/aggregator-reload-proxy-client-cert ok Mar 10 21:51:08 crc kubenswrapper[4919]: [+]poststarthook/start-kube-aggregator-informers ok Mar 10 21:51:08 crc kubenswrapper[4919]: [+]poststarthook/apiservice-status-local-available-controller ok Mar 10 21:51:08 crc kubenswrapper[4919]: [+]poststarthook/apiservice-status-remote-available-controller ok Mar 10 21:51:08 crc kubenswrapper[4919]: [+]poststarthook/apiservice-registration-controller ok Mar 10 21:51:08 crc kubenswrapper[4919]: [+]poststarthook/apiservice-wait-for-first-sync ok Mar 10 21:51:08 crc kubenswrapper[4919]: [+]poststarthook/apiservice-discovery-controller ok Mar 10 21:51:08 crc kubenswrapper[4919]: [+]poststarthook/kube-apiserver-autoregistration ok Mar 10 21:51:08 crc kubenswrapper[4919]: [+]autoregister-completion ok Mar 10 21:51:08 crc kubenswrapper[4919]: [+]poststarthook/apiservice-openapi-controller ok Mar 10 21:51:08 crc kubenswrapper[4919]: [+]poststarthook/apiservice-openapiv3-controller ok Mar 10 21:51:08 crc kubenswrapper[4919]: livez check failed Mar 10 21:51:08 crc kubenswrapper[4919]: Mar 10 21:51:08 crc kubenswrapper[4919]: 
,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 21:50:28.432532384 +0000 UTC m=+15.674413032,LastTimestamp:2026-03-10 21:50:28.432532384 +0000 UTC m=+15.674413032,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 10 21:51:08 crc kubenswrapper[4919]: > Mar 10 21:51:08 crc kubenswrapper[4919]: E0310 21:51:08.711720 4919 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b994ba2c19d5c openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 500,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 21:50:28.432616796 +0000 UTC m=+15.674497444,LastTimestamp:2026-03-10 21:50:28.432616796 +0000 UTC m=+15.674497444,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 21:51:08 crc kubenswrapper[4919]: E0310 21:51:08.722949 4919 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189b9948d7959b98\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b9948d7959b98 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 21:50:16.43402332 +0000 UTC m=+3.675903928,LastTimestamp:2026-03-10 21:50:28.586783184 +0000 UTC m=+15.828663832,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 21:51:08 crc kubenswrapper[4919]: E0310 21:51:08.728956 4919 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 10 21:51:08 crc kubenswrapper[4919]: &Event{ObjectMeta:{kube-controller-manager-crc.189b994c0798a475 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 10 21:51:08 crc kubenswrapper[4919]: body: Mar 10 21:51:08 crc kubenswrapper[4919]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 21:50:30.124430453 +0000 UTC m=+17.366311101,LastTimestamp:2026-03-10 21:50:30.124430453 +0000 UTC 
m=+17.366311101,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 10 21:51:08 crc kubenswrapper[4919]: > Mar 10 21:51:08 crc kubenswrapper[4919]: E0310 21:51:08.733007 4919 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b994c079a72d0 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 21:50:30.124548816 +0000 UTC m=+17.366429464,LastTimestamp:2026-03-10 21:50:30.124548816 +0000 UTC m=+17.366429464,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 21:51:08 crc kubenswrapper[4919]: E0310 21:51:08.740117 4919 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189b994c0798a475\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 10 21:51:08 crc kubenswrapper[4919]: &Event{ObjectMeta:{kube-controller-manager-crc.189b994c0798a475 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 10 21:51:08 crc kubenswrapper[4919]: body: Mar 10 21:51:08 crc kubenswrapper[4919]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 21:50:30.124430453 +0000 UTC m=+17.366311101,LastTimestamp:2026-03-10 21:50:40.125005264 +0000 UTC m=+27.366885872,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 10 21:51:08 crc kubenswrapper[4919]: > Mar 10 21:51:08 crc kubenswrapper[4919]: E0310 21:51:08.747028 4919 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189b994c079a72d0\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b994c079a72d0 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 21:50:30.124548816 +0000 UTC m=+17.366429464,LastTimestamp:2026-03-10 21:50:40.125058505 
+0000 UTC m=+27.366939113,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 21:51:08 crc kubenswrapper[4919]: E0310 21:51:08.751881 4919 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b994e5bca18ab openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Killing,Message:Container cluster-policy-controller failed startup probe, will be restarted,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 21:50:40.126892203 +0000 UTC m=+27.368772811,LastTimestamp:2026-03-10 21:50:40.126892203 +0000 UTC m=+27.368772811,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 21:51:08 crc kubenswrapper[4919]: E0310 21:51:08.756861 4919 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189b99486a14bfb3\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b99486a14bfb3 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 21:50:14.596861875 +0000 UTC m=+1.838742493,LastTimestamp:2026-03-10 21:50:40.277709053 +0000 UTC m=+27.519589661,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 21:51:08 crc kubenswrapper[4919]: E0310 21:51:08.763441 4919 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189b99487b68d148\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b99487b68d148 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 21:50:14.887584072 +0000 UTC m=+2.129464720,LastTimestamp:2026-03-10 21:50:40.443550014 +0000 UTC m=+27.685430622,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 21:51:08 crc kubenswrapper[4919]: E0310 21:51:08.770536 4919 event.go:359] "Server rejected event 
(will not retry!)" err="events \"kube-controller-manager-crc.189b99487bf3ad51\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b99487bf3ad51 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 21:50:14.896684369 +0000 UTC m=+2.138565007,LastTimestamp:2026-03-10 21:50:40.525029922 +0000 UTC m=+27.766910530,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 21:51:08 crc kubenswrapper[4919]: E0310 21:51:08.777928 4919 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189b994c0798a475\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 10 21:51:08 crc kubenswrapper[4919]: &Event{ObjectMeta:{kube-controller-manager-crc.189b994c0798a475 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 
10 21:51:08 crc kubenswrapper[4919]: body: Mar 10 21:51:08 crc kubenswrapper[4919]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 21:50:30.124430453 +0000 UTC m=+17.366311101,LastTimestamp:2026-03-10 21:50:50.124809344 +0000 UTC m=+37.366689992,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 10 21:51:08 crc kubenswrapper[4919]: > Mar 10 21:51:08 crc kubenswrapper[4919]: E0310 21:51:08.782895 4919 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189b994c079a72d0\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b994c079a72d0 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 21:50:30.124548816 +0000 UTC m=+17.366429464,LastTimestamp:2026-03-10 21:50:50.124890976 +0000 UTC m=+37.366771624,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 21:51:08 crc kubenswrapper[4919]: E0310 21:51:08.790442 4919 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189b994c0798a475\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" 
in the namespace \"openshift-kube-controller-manager\"" event=< Mar 10 21:51:08 crc kubenswrapper[4919]: &Event{ObjectMeta:{kube-controller-manager-crc.189b994c0798a475 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 10 21:51:08 crc kubenswrapper[4919]: body: Mar 10 21:51:08 crc kubenswrapper[4919]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 21:50:30.124430453 +0000 UTC m=+17.366311101,LastTimestamp:2026-03-10 21:51:00.124929681 +0000 UTC m=+47.366810329,Count:4,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 10 21:51:08 crc kubenswrapper[4919]: > Mar 10 21:51:09 crc kubenswrapper[4919]: I0310 21:51:09.409256 4919 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 21:51:09 crc kubenswrapper[4919]: I0310 21:51:09.411013 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 10 21:51:09 crc kubenswrapper[4919]: I0310 21:51:09.411249 4919 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 21:51:09 crc kubenswrapper[4919]: I0310 21:51:09.413292 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 
10 21:51:09 crc kubenswrapper[4919]: I0310 21:51:09.413351 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:09 crc kubenswrapper[4919]: I0310 21:51:09.413373 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:09 crc kubenswrapper[4919]: I0310 21:51:09.450505 4919 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 21:51:09 crc kubenswrapper[4919]: I0310 21:51:09.452748 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:09 crc kubenswrapper[4919]: I0310 21:51:09.452816 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:09 crc kubenswrapper[4919]: I0310 21:51:09.452840 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:09 crc kubenswrapper[4919]: I0310 21:51:09.452883 4919 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 10 21:51:09 crc kubenswrapper[4919]: E0310 21:51:09.459781 4919 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 10 21:51:09 crc kubenswrapper[4919]: E0310 21:51:09.460114 4919 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 10 21:51:10 crc kubenswrapper[4919]: I0310 21:51:10.125331 4919 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe 
status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 10 21:51:10 crc kubenswrapper[4919]: I0310 21:51:10.125501 4919 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 10 21:51:10 crc kubenswrapper[4919]: I0310 21:51:10.125590 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 21:51:10 crc kubenswrapper[4919]: I0310 21:51:10.125823 4919 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 21:51:10 crc kubenswrapper[4919]: I0310 21:51:10.127583 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:10 crc kubenswrapper[4919]: I0310 21:51:10.127636 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:10 crc kubenswrapper[4919]: I0310 21:51:10.127657 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:10 crc kubenswrapper[4919]: I0310 21:51:10.128534 4919 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"f30803ec8ed4cbd053df2777bfe3077a7637972562508205711b357011e453dc"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted" Mar 10 21:51:10 crc 
kubenswrapper[4919]: I0310 21:51:10.128691 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://f30803ec8ed4cbd053df2777bfe3077a7637972562508205711b357011e453dc" gracePeriod=30 Mar 10 21:51:10 crc kubenswrapper[4919]: I0310 21:51:10.407809 4919 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 21:51:10 crc kubenswrapper[4919]: I0310 21:51:10.713654 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Mar 10 21:51:10 crc kubenswrapper[4919]: I0310 21:51:10.715249 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 10 21:51:10 crc kubenswrapper[4919]: I0310 21:51:10.715705 4919 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="f30803ec8ed4cbd053df2777bfe3077a7637972562508205711b357011e453dc" exitCode=255 Mar 10 21:51:10 crc kubenswrapper[4919]: I0310 21:51:10.715741 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"f30803ec8ed4cbd053df2777bfe3077a7637972562508205711b357011e453dc"} Mar 10 21:51:10 crc kubenswrapper[4919]: I0310 21:51:10.715771 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"25d19a7d46abf131e552151e3bbb220e3fdf0a3bdb8ff8ca7b082dcc296408c5"} Mar 10 21:51:10 crc kubenswrapper[4919]: I0310 21:51:10.715792 4919 scope.go:117] "RemoveContainer" containerID="aa0e153307a5d1fb56a85cc525ad6ffe2d83bf4e5799981cdeae69f97cfd741e" Mar 10 21:51:10 crc kubenswrapper[4919]: I0310 21:51:10.715879 4919 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 21:51:10 crc kubenswrapper[4919]: I0310 21:51:10.716707 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:10 crc kubenswrapper[4919]: I0310 21:51:10.716753 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:10 crc kubenswrapper[4919]: I0310 21:51:10.716774 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:10 crc kubenswrapper[4919]: I0310 21:51:10.814712 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 21:51:11 crc kubenswrapper[4919]: I0310 21:51:11.409783 4919 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 21:51:11 crc kubenswrapper[4919]: I0310 21:51:11.723425 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Mar 10 21:51:11 crc kubenswrapper[4919]: I0310 21:51:11.726675 4919 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 21:51:11 crc kubenswrapper[4919]: I0310 21:51:11.728060 4919 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:11 crc kubenswrapper[4919]: I0310 21:51:11.728126 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:11 crc kubenswrapper[4919]: I0310 21:51:11.728144 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:12 crc kubenswrapper[4919]: I0310 21:51:12.407566 4919 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 21:51:12 crc kubenswrapper[4919]: I0310 21:51:12.730496 4919 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 21:51:12 crc kubenswrapper[4919]: I0310 21:51:12.735032 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:12 crc kubenswrapper[4919]: I0310 21:51:12.735117 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:12 crc kubenswrapper[4919]: I0310 21:51:12.735146 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:13 crc kubenswrapper[4919]: I0310 21:51:13.409273 4919 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 21:51:13 crc kubenswrapper[4919]: I0310 21:51:13.479304 4919 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 21:51:13 crc kubenswrapper[4919]: I0310 21:51:13.480493 4919 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:13 crc kubenswrapper[4919]: I0310 21:51:13.480561 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:13 crc kubenswrapper[4919]: I0310 21:51:13.480580 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:13 crc kubenswrapper[4919]: I0310 21:51:13.481453 4919 scope.go:117] "RemoveContainer" containerID="95d835488672290c01e0cbd49c42bea7f7747e53f3babd69675c60aa2837820b" Mar 10 21:51:13 crc kubenswrapper[4919]: E0310 21:51:13.567746 4919 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 10 21:51:13 crc kubenswrapper[4919]: I0310 21:51:13.736498 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 10 21:51:13 crc kubenswrapper[4919]: I0310 21:51:13.739302 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"5b2adcae0bf01d646b97b915a28921ad4151ca62e8bdd174b42b5e3dff4b27db"} Mar 10 21:51:13 crc kubenswrapper[4919]: I0310 21:51:13.739467 4919 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 21:51:13 crc kubenswrapper[4919]: I0310 21:51:13.740227 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:13 crc kubenswrapper[4919]: I0310 21:51:13.740268 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:13 crc kubenswrapper[4919]: I0310 21:51:13.740285 4919 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Mar 10 21:51:14 crc kubenswrapper[4919]: I0310 21:51:14.409410 4919 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 21:51:14 crc kubenswrapper[4919]: I0310 21:51:14.743601 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 10 21:51:14 crc kubenswrapper[4919]: I0310 21:51:14.744806 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 10 21:51:14 crc kubenswrapper[4919]: I0310 21:51:14.747230 4919 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="5b2adcae0bf01d646b97b915a28921ad4151ca62e8bdd174b42b5e3dff4b27db" exitCode=255 Mar 10 21:51:14 crc kubenswrapper[4919]: I0310 21:51:14.747292 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"5b2adcae0bf01d646b97b915a28921ad4151ca62e8bdd174b42b5e3dff4b27db"} Mar 10 21:51:14 crc kubenswrapper[4919]: I0310 21:51:14.747353 4919 scope.go:117] "RemoveContainer" containerID="95d835488672290c01e0cbd49c42bea7f7747e53f3babd69675c60aa2837820b" Mar 10 21:51:14 crc kubenswrapper[4919]: I0310 21:51:14.747518 4919 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 21:51:14 crc kubenswrapper[4919]: I0310 21:51:14.748763 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:14 crc kubenswrapper[4919]: I0310 21:51:14.748806 4919 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:14 crc kubenswrapper[4919]: I0310 21:51:14.748820 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:14 crc kubenswrapper[4919]: I0310 21:51:14.749487 4919 scope.go:117] "RemoveContainer" containerID="5b2adcae0bf01d646b97b915a28921ad4151ca62e8bdd174b42b5e3dff4b27db" Mar 10 21:51:14 crc kubenswrapper[4919]: E0310 21:51:14.749678 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 10 21:51:15 crc kubenswrapper[4919]: I0310 21:51:15.408122 4919 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 21:51:15 crc kubenswrapper[4919]: I0310 21:51:15.750270 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 10 21:51:16 crc kubenswrapper[4919]: I0310 21:51:16.408145 4919 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 21:51:16 crc kubenswrapper[4919]: I0310 21:51:16.460340 4919 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 21:51:16 crc kubenswrapper[4919]: I0310 21:51:16.461562 4919 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:16 crc kubenswrapper[4919]: I0310 21:51:16.461605 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:16 crc kubenswrapper[4919]: I0310 21:51:16.461618 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:16 crc kubenswrapper[4919]: I0310 21:51:16.461643 4919 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 10 21:51:16 crc kubenswrapper[4919]: E0310 21:51:16.468357 4919 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 10 21:51:16 crc kubenswrapper[4919]: E0310 21:51:16.469181 4919 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 10 21:51:17 crc kubenswrapper[4919]: I0310 21:51:17.123921 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 21:51:17 crc kubenswrapper[4919]: I0310 21:51:17.124091 4919 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 21:51:17 crc kubenswrapper[4919]: I0310 21:51:17.125012 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:17 crc kubenswrapper[4919]: I0310 21:51:17.125044 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:17 crc kubenswrapper[4919]: I0310 21:51:17.125053 4919 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:17 crc kubenswrapper[4919]: I0310 21:51:17.129100 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 21:51:17 crc kubenswrapper[4919]: I0310 21:51:17.407034 4919 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 21:51:17 crc kubenswrapper[4919]: I0310 21:51:17.756384 4919 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 21:51:17 crc kubenswrapper[4919]: I0310 21:51:17.757543 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:17 crc kubenswrapper[4919]: I0310 21:51:17.757661 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:17 crc kubenswrapper[4919]: I0310 21:51:17.757730 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:18 crc kubenswrapper[4919]: I0310 21:51:18.409820 4919 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 21:51:19 crc kubenswrapper[4919]: I0310 21:51:19.079616 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 21:51:19 crc kubenswrapper[4919]: I0310 21:51:19.079820 4919 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 21:51:19 crc kubenswrapper[4919]: I0310 
21:51:19.080970 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:19 crc kubenswrapper[4919]: I0310 21:51:19.081070 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:19 crc kubenswrapper[4919]: I0310 21:51:19.081257 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:19 crc kubenswrapper[4919]: I0310 21:51:19.081859 4919 scope.go:117] "RemoveContainer" containerID="5b2adcae0bf01d646b97b915a28921ad4151ca62e8bdd174b42b5e3dff4b27db" Mar 10 21:51:19 crc kubenswrapper[4919]: E0310 21:51:19.082207 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 10 21:51:19 crc kubenswrapper[4919]: W0310 21:51:19.180752 4919 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope Mar 10 21:51:19 crc kubenswrapper[4919]: E0310 21:51:19.180795 4919 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 10 21:51:19 crc kubenswrapper[4919]: I0310 21:51:19.407969 4919 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in 
API group "storage.k8s.io" at the cluster scope Mar 10 21:51:20 crc kubenswrapper[4919]: I0310 21:51:20.406167 4919 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 21:51:20 crc kubenswrapper[4919]: I0310 21:51:20.820913 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 21:51:20 crc kubenswrapper[4919]: I0310 21:51:20.821068 4919 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 21:51:20 crc kubenswrapper[4919]: I0310 21:51:20.823471 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:20 crc kubenswrapper[4919]: I0310 21:51:20.823729 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:20 crc kubenswrapper[4919]: I0310 21:51:20.823879 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:21 crc kubenswrapper[4919]: I0310 21:51:21.409662 4919 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 21:51:22 crc kubenswrapper[4919]: I0310 21:51:22.408319 4919 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 21:51:22 crc kubenswrapper[4919]: I0310 21:51:22.849687 4919 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 21:51:22 crc kubenswrapper[4919]: I0310 21:51:22.849852 4919 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 21:51:22 crc kubenswrapper[4919]: I0310 21:51:22.850969 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:22 crc kubenswrapper[4919]: I0310 21:51:22.851024 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:22 crc kubenswrapper[4919]: I0310 21:51:22.851044 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:22 crc kubenswrapper[4919]: I0310 21:51:22.851911 4919 scope.go:117] "RemoveContainer" containerID="5b2adcae0bf01d646b97b915a28921ad4151ca62e8bdd174b42b5e3dff4b27db" Mar 10 21:51:22 crc kubenswrapper[4919]: E0310 21:51:22.852189 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 10 21:51:23 crc kubenswrapper[4919]: I0310 21:51:23.407168 4919 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 21:51:23 crc kubenswrapper[4919]: I0310 21:51:23.468612 4919 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 21:51:23 crc kubenswrapper[4919]: I0310 21:51:23.470382 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Mar 10 21:51:23 crc kubenswrapper[4919]: I0310 21:51:23.470480 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:23 crc kubenswrapper[4919]: I0310 21:51:23.471731 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:23 crc kubenswrapper[4919]: I0310 21:51:23.471983 4919 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 10 21:51:23 crc kubenswrapper[4919]: E0310 21:51:23.476331 4919 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 10 21:51:23 crc kubenswrapper[4919]: E0310 21:51:23.480267 4919 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 10 21:51:23 crc kubenswrapper[4919]: E0310 21:51:23.567843 4919 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 10 21:51:24 crc kubenswrapper[4919]: I0310 21:51:24.410219 4919 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 21:51:24 crc kubenswrapper[4919]: I0310 21:51:24.942009 4919 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 10 21:51:24 crc kubenswrapper[4919]: I0310 21:51:24.962988 4919 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 10 21:51:25 crc 
kubenswrapper[4919]: I0310 21:51:25.410109 4919 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 21:51:26 crc kubenswrapper[4919]: I0310 21:51:26.408926 4919 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 21:51:27 crc kubenswrapper[4919]: I0310 21:51:27.409931 4919 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 21:51:28 crc kubenswrapper[4919]: I0310 21:51:28.369075 4919 csr.go:261] certificate signing request csr-jmdxx is approved, waiting to be issued Mar 10 21:51:28 crc kubenswrapper[4919]: I0310 21:51:28.381197 4919 csr.go:257] certificate signing request csr-jmdxx is issued Mar 10 21:51:28 crc kubenswrapper[4919]: I0310 21:51:28.414954 4919 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Mar 10 21:51:29 crc kubenswrapper[4919]: I0310 21:51:29.240107 4919 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Mar 10 21:51:29 crc kubenswrapper[4919]: I0310 21:51:29.382687 4919 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-17 16:27:08.473993915 +0000 UTC Mar 10 21:51:29 crc kubenswrapper[4919]: I0310 21:51:29.382749 4919 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 6762h35m39.091249529s for next certificate rotation Mar 10 21:51:30 crc kubenswrapper[4919]: I0310 
21:51:30.477307 4919 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 21:51:30 crc kubenswrapper[4919]: I0310 21:51:30.479164 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:30 crc kubenswrapper[4919]: I0310 21:51:30.479215 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:30 crc kubenswrapper[4919]: I0310 21:51:30.479233 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:30 crc kubenswrapper[4919]: I0310 21:51:30.479345 4919 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 10 21:51:30 crc kubenswrapper[4919]: I0310 21:51:30.496483 4919 kubelet_node_status.go:115] "Node was previously registered" node="crc" Mar 10 21:51:30 crc kubenswrapper[4919]: I0310 21:51:30.496836 4919 kubelet_node_status.go:79] "Successfully registered node" node="crc" Mar 10 21:51:30 crc kubenswrapper[4919]: E0310 21:51:30.496872 4919 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Mar 10 21:51:30 crc kubenswrapper[4919]: I0310 21:51:30.501472 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:30 crc kubenswrapper[4919]: I0310 21:51:30.501523 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:30 crc kubenswrapper[4919]: I0310 21:51:30.501541 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:30 crc kubenswrapper[4919]: I0310 21:51:30.501566 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:30 crc kubenswrapper[4919]: I0310 21:51:30.501584 4919 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:30Z","lastTransitionTime":"2026-03-10T21:51:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 21:51:30 crc kubenswrapper[4919]: E0310 21:51:30.521273 4919 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:51:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:51:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:51:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:51:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c22d31cd-a51d-4524-bb69-0b454ae09e98\\\",\\\"systemUUID\\\":\\\"eb24d1fd-ecd7-423c-90f7-cacacceb5386\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:51:30 crc kubenswrapper[4919]: I0310 21:51:30.531812 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:30 crc kubenswrapper[4919]: I0310 21:51:30.531873 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:30 crc kubenswrapper[4919]: I0310 21:51:30.531890 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:30 crc kubenswrapper[4919]: I0310 21:51:30.531916 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:30 crc kubenswrapper[4919]: I0310 21:51:30.531938 4919 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:30Z","lastTransitionTime":"2026-03-10T21:51:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 21:51:30 crc kubenswrapper[4919]: E0310 21:51:30.549199 4919 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:51:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:51:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:51:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:51:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c22d31cd-a51d-4524-bb69-0b454ae09e98\\\",\\\"systemUUID\\\":\\\"eb24d1fd-ecd7-423c-90f7-cacacceb5386\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:51:30 crc kubenswrapper[4919]: I0310 21:51:30.565690 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:30 crc kubenswrapper[4919]: I0310 21:51:30.565772 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:30 crc kubenswrapper[4919]: I0310 21:51:30.565796 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:30 crc kubenswrapper[4919]: I0310 21:51:30.565829 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:30 crc kubenswrapper[4919]: I0310 21:51:30.565851 4919 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:30Z","lastTransitionTime":"2026-03-10T21:51:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 21:51:30 crc kubenswrapper[4919]: E0310 21:51:30.585298 4919 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:51:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:51:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:51:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:51:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c22d31cd-a51d-4524-bb69-0b454ae09e98\\\",\\\"systemUUID\\\":\\\"eb24d1fd-ecd7-423c-90f7-cacacceb5386\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:51:30 crc kubenswrapper[4919]: I0310 21:51:30.597618 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:30 crc kubenswrapper[4919]: I0310 21:51:30.597694 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:30 crc kubenswrapper[4919]: I0310 21:51:30.597719 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:30 crc kubenswrapper[4919]: I0310 21:51:30.597749 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:30 crc kubenswrapper[4919]: I0310 21:51:30.597773 4919 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:30Z","lastTransitionTime":"2026-03-10T21:51:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 21:51:30 crc kubenswrapper[4919]: E0310 21:51:30.615667 4919 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:51:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:51:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:51:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:51:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c22d31cd-a51d-4524-bb69-0b454ae09e98\\\",\\\"systemUUID\\\":\\\"eb24d1fd-ecd7-423c-90f7-cacacceb5386\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:51:30 crc kubenswrapper[4919]: E0310 21:51:30.615907 4919 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 10 21:51:30 crc kubenswrapper[4919]: E0310 21:51:30.615949 4919 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 21:51:30 crc kubenswrapper[4919]: E0310 21:51:30.716649 4919 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 21:51:30 crc kubenswrapper[4919]: E0310 21:51:30.817500 4919 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 21:51:30 crc kubenswrapper[4919]: E0310 21:51:30.918685 4919 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 21:51:31 crc kubenswrapper[4919]: E0310 21:51:31.019523 4919 kubelet_node_status.go:503] "Error getting the current 
node from lister" err="node \"crc\" not found" Mar 10 21:51:31 crc kubenswrapper[4919]: E0310 21:51:31.120162 4919 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 21:51:31 crc kubenswrapper[4919]: E0310 21:51:31.221079 4919 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 21:51:31 crc kubenswrapper[4919]: E0310 21:51:31.321634 4919 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 21:51:31 crc kubenswrapper[4919]: E0310 21:51:31.422190 4919 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 21:51:31 crc kubenswrapper[4919]: E0310 21:51:31.523256 4919 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 21:51:31 crc kubenswrapper[4919]: E0310 21:51:31.624779 4919 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 21:51:31 crc kubenswrapper[4919]: E0310 21:51:31.725966 4919 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 21:51:31 crc kubenswrapper[4919]: E0310 21:51:31.827196 4919 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 21:51:31 crc kubenswrapper[4919]: E0310 21:51:31.927595 4919 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 21:51:32 crc kubenswrapper[4919]: E0310 21:51:32.028793 4919 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 21:51:32 crc kubenswrapper[4919]: E0310 21:51:32.129929 4919 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 21:51:32 crc kubenswrapper[4919]: E0310 21:51:32.231128 4919 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 21:51:32 crc kubenswrapper[4919]: E0310 21:51:32.331880 4919 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 21:51:32 crc kubenswrapper[4919]: E0310 21:51:32.433001 4919 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 21:51:32 crc kubenswrapper[4919]: E0310 21:51:32.533771 4919 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 21:51:32 crc kubenswrapper[4919]: E0310 21:51:32.634708 4919 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 21:51:32 crc kubenswrapper[4919]: E0310 21:51:32.735475 4919 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 21:51:32 crc kubenswrapper[4919]: E0310 21:51:32.836037 4919 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 21:51:32 crc kubenswrapper[4919]: E0310 21:51:32.937013 4919 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 21:51:33 crc kubenswrapper[4919]: E0310 21:51:33.037225 4919 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 21:51:33 crc kubenswrapper[4919]: E0310 21:51:33.137870 4919 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 21:51:33 crc kubenswrapper[4919]: E0310 21:51:33.238251 4919 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 21:51:33 crc kubenswrapper[4919]: E0310 21:51:33.338352 4919 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 21:51:33 crc 
kubenswrapper[4919]: E0310 21:51:33.439503 4919 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 21:51:33 crc kubenswrapper[4919]: E0310 21:51:33.540375 4919 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 21:51:33 crc kubenswrapper[4919]: E0310 21:51:33.567945 4919 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 10 21:51:33 crc kubenswrapper[4919]: E0310 21:51:33.641511 4919 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 21:51:33 crc kubenswrapper[4919]: E0310 21:51:33.742099 4919 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 21:51:33 crc kubenswrapper[4919]: E0310 21:51:33.842789 4919 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 21:51:33 crc kubenswrapper[4919]: E0310 21:51:33.944247 4919 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 21:51:34 crc kubenswrapper[4919]: E0310 21:51:34.044371 4919 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 21:51:34 crc kubenswrapper[4919]: E0310 21:51:34.145423 4919 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 21:51:34 crc kubenswrapper[4919]: E0310 21:51:34.246571 4919 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 21:51:34 crc kubenswrapper[4919]: E0310 21:51:34.347458 4919 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 21:51:34 crc kubenswrapper[4919]: E0310 21:51:34.448157 4919 kubelet_node_status.go:503] "Error getting the current 
node from lister" err="node \"crc\" not found" Mar 10 21:51:34 crc kubenswrapper[4919]: E0310 21:51:34.548881 4919 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 21:51:34 crc kubenswrapper[4919]: E0310 21:51:34.649249 4919 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 21:51:34 crc kubenswrapper[4919]: E0310 21:51:34.750268 4919 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 21:51:34 crc kubenswrapper[4919]: E0310 21:51:34.850873 4919 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 21:51:34 crc kubenswrapper[4919]: E0310 21:51:34.951539 4919 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 21:51:35 crc kubenswrapper[4919]: E0310 21:51:35.051919 4919 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 21:51:35 crc kubenswrapper[4919]: E0310 21:51:35.153004 4919 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 21:51:35 crc kubenswrapper[4919]: E0310 21:51:35.253806 4919 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 21:51:35 crc kubenswrapper[4919]: E0310 21:51:35.354576 4919 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 21:51:35 crc kubenswrapper[4919]: E0310 21:51:35.454780 4919 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 21:51:35 crc kubenswrapper[4919]: E0310 21:51:35.555807 4919 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 21:51:35 crc kubenswrapper[4919]: E0310 21:51:35.656308 4919 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 21:51:35 crc kubenswrapper[4919]: E0310 21:51:35.756679 4919 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 21:51:35 crc kubenswrapper[4919]: E0310 21:51:35.857487 4919 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 21:51:35 crc kubenswrapper[4919]: E0310 21:51:35.958483 4919 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 21:51:36 crc kubenswrapper[4919]: E0310 21:51:36.059024 4919 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 21:51:36 crc kubenswrapper[4919]: E0310 21:51:36.159182 4919 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 21:51:36 crc kubenswrapper[4919]: E0310 21:51:36.259516 4919 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 21:51:36 crc kubenswrapper[4919]: E0310 21:51:36.360135 4919 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 21:51:36 crc kubenswrapper[4919]: E0310 21:51:36.460344 4919 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 21:51:36 crc kubenswrapper[4919]: I0310 21:51:36.480076 4919 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 21:51:36 crc kubenswrapper[4919]: I0310 21:51:36.482207 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:36 crc kubenswrapper[4919]: I0310 21:51:36.482258 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:36 crc 
kubenswrapper[4919]: I0310 21:51:36.482275 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:36 crc kubenswrapper[4919]: I0310 21:51:36.483235 4919 scope.go:117] "RemoveContainer" containerID="5b2adcae0bf01d646b97b915a28921ad4151ca62e8bdd174b42b5e3dff4b27db" Mar 10 21:51:36 crc kubenswrapper[4919]: E0310 21:51:36.483557 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 10 21:51:36 crc kubenswrapper[4919]: I0310 21:51:36.556025 4919 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 10 21:51:36 crc kubenswrapper[4919]: E0310 21:51:36.561175 4919 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 21:51:36 crc kubenswrapper[4919]: E0310 21:51:36.661689 4919 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 21:51:36 crc kubenswrapper[4919]: E0310 21:51:36.762187 4919 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 21:51:36 crc kubenswrapper[4919]: E0310 21:51:36.862874 4919 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 21:51:36 crc kubenswrapper[4919]: E0310 21:51:36.963285 4919 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 21:51:37 crc kubenswrapper[4919]: E0310 21:51:37.063965 4919 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 21:51:37 
crc kubenswrapper[4919]: E0310 21:51:37.164521 4919 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 21:51:37 crc kubenswrapper[4919]: E0310 21:51:37.265289 4919 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 21:51:37 crc kubenswrapper[4919]: E0310 21:51:37.365817 4919 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 21:51:37 crc kubenswrapper[4919]: E0310 21:51:37.465948 4919 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 21:51:37 crc kubenswrapper[4919]: E0310 21:51:37.566136 4919 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 21:51:37 crc kubenswrapper[4919]: E0310 21:51:37.666639 4919 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 21:51:37 crc kubenswrapper[4919]: E0310 21:51:37.767449 4919 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 21:51:37 crc kubenswrapper[4919]: E0310 21:51:37.868489 4919 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 21:51:37 crc kubenswrapper[4919]: E0310 21:51:37.969056 4919 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 21:51:38 crc kubenswrapper[4919]: E0310 21:51:38.069560 4919 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 21:51:38 crc kubenswrapper[4919]: E0310 21:51:38.169725 4919 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.252267 4919 reflector.go:368] Caches populated for *v1.RuntimeClass from 
k8s.io/client-go/informers/factory.go:160
Mar 10 21:51:38 crc kubenswrapper[4919]: E0310 21:51:38.270912 4919 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.286866 4919 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.373461 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.373852 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.373990 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.374125 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.374241 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:38Z","lastTransitionTime":"2026-03-10T21:51:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.428564 4919 apiserver.go:52] "Watching apiserver"
Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.435448 4919 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.435900 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb"]
Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.438073 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.438166 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 10 21:51:38 crc kubenswrapper[4919]: E0310 21:51:38.438645 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.440220 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb"
Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.440603 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.440370 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h"
Mar 10 21:51:38 crc kubenswrapper[4919]: E0310 21:51:38.441325 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.441340 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 10 21:51:38 crc kubenswrapper[4919]: E0310 21:51:38.441567 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.444805 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides"
Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.445322 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm"
Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.445880 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt"
Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.445746 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.445704 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls"
Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.446234 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt"
Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.446273 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.446299 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.446963 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.477615 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.477854 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.478041 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.478183 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.478349 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:38Z","lastTransitionTime":"2026-03-10T21:51:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.491707 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.507129 4919 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world"
Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.509746 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.531558 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.547936 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.564680 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.576671 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.576735 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.576775 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") "
Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.576810 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.576846 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") "
Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.576878 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.576909 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.576943 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") "
Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.576975 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") "
Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.577006 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") "
Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.577054 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.577092 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.577125 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.577157 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.577189 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") "
Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.577221 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") "
Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.577253 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") "
Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.577292 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.577323 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.577354 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") "
Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.577385 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") "
Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.577445 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") "
Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.577480 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") "
Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.577512 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") "
Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.577543 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.577573 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.577606 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") "
Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.577639 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.577669 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.577700 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.577730 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.577762 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.577829 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") "
Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.577862 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.577902 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.577935 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") "
Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.577965 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.577996 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.578035 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") "
Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.578081 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.578115 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") "
Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.578148 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") "
Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.578180 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.578211 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") "
Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.578242 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") "
Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.578274 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") "
Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.578306 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") "
Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.578338 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.578369 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.578427 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.578460 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") "
Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.578493 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.578526 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.578557 4919 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.578588 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.578619 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.578651 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.578683 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.578841 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.578876 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.578908 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.578940 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.578970 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.579000 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.579034 4919 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.579080 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.579092 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.579119 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.579153 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.579189 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.579222 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.579255 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.579287 4919 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.579317 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.579351 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.579416 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.579450 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.579458 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: 
"22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.579486 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.579519 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.579594 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.579631 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.579664 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.579696 4919 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.579730 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.579762 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.579794 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.579826 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.579859 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod 
\"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.579890 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.579923 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.579957 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.579990 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.580020 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.580053 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.580085 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.580118 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.580154 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.580187 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.580219 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: 
\"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.580257 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.580289 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.580340 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.580376 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.580427 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.580437 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.580476 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.580510 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.580544 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.580576 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.580611 4919 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.580645 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.580677 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.580708 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.580740 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.580773 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" 
(UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.580816 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.580852 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.580899 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.580934 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.580967 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.580999 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.581032 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.581067 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.581098 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.581133 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.581166 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod 
\"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.581200 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.581235 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.581267 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.581300 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.581332 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.581368 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.581575 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.581980 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.582087 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.582292 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.582341 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.582418 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.582454 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.582489 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.582523 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.582549 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.582560 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.582666 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.582677 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.582748 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.582802 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.582843 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.582859 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.582854 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.582891 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.582972 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.583067 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.583104 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: 
"87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.583148 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.583223 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.583268 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.583311 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.583330 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.583354 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.583558 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.583850 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.583911 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.584054 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.584083 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.584298 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.584517 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.584585 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.584788 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.584846 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.585000 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.585057 4919 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.585113 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.585183 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.585239 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.585292 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.585351 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: 
\"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.585442 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.585462 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.585498 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.585502 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.585566 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.585625 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.585679 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: 
\"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.585733 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.585783 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.585891 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.585951 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.586010 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.586066 4919 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.586122 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.584598 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.585098 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.585088 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.585315 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.585576 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.585814 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.586146 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.586153 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.587377 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.587448 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.587875 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.587907 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.588172 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.588561 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.589055 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.589270 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.586179 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.589448 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.589496 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.585514 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.589582 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:38 
crc kubenswrapper[4919]: I0310 21:51:38.589657 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:38Z","lastTransitionTime":"2026-03-10T21:51:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.589186 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.589504 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.589511 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.589564 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.590171 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.590166 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.589533 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.590369 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.590374 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.590501 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.590570 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). 
InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.590586 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.590643 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.590695 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.590751 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.590755 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.590799 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.590854 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.590912 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.590971 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.591005 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.591006 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.591025 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.591199 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.591293 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.591353 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.591420 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.591457 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.591494 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.591530 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" 
(UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.591563 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.591597 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.591631 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.591667 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.591743 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " 
pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.591786 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.591822 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.591863 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.591902 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.591940 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.591979 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.592018 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.592064 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.592103 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.592138 4919 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.592177 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.592214 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.592252 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.592340 4919 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.592364 4919 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" 
(UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.592387 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.592436 4919 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.592457 4919 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.592478 4919 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.592499 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.592521 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.592541 4919 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.592562 4919 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.592582 4919 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.592602 4919 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.592624 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.592644 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.592666 4919 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.592686 4919 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.592706 4919 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.592726 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.592747 4919 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.592766 4919 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.592786 4919 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.592806 4919 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.592827 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node 
\"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.592964 4919 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.592986 4919 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.593005 4919 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.593026 4919 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.593045 4919 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.593066 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.593085 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.593107 4919 
reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.593125 4919 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.593145 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.593165 4919 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.593184 4919 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.593204 4919 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.593224 4919 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.593245 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: 
\"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.593264 4919 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.593283 4919 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.593304 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.593324 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.593344 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.593364 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.593384 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: 
\"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.593428 4919 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.591299 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.591461 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.592347 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.592655 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.594989 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.592840 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.593579 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.593605 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.593612 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.593952 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.594155 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.594792 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.595316 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.595541 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.595632 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.595827 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.595943 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.596127 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.596219 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.596245 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.596271 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.596294 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.596524 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.596549 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.597140 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.597361 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.597428 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.598009 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.598204 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.598603 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.598672 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.600672 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.600764 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.600915 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.601055 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.601066 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.601162 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.601318 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.601448 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.601507 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.602104 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.602182 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.602554 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.602688 4919 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.603299 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.603445 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.603749 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:51:38 crc kubenswrapper[4919]: E0310 21:51:38.604036 4919 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.604039 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: 
"kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: E0310 21:51:38.604119 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 21:51:39.104092737 +0000 UTC m=+86.345973425 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.604634 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.604839 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.604891 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.605091 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.606023 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.606119 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.606727 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.607065 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.611106 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.611830 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.615687 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.616180 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.616423 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.616666 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.616818 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: E0310 21:51:38.617063 4919 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 21:51:38 crc kubenswrapper[4919]: E0310 21:51:38.617078 4919 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 21:51:38 crc kubenswrapper[4919]: E0310 21:51:38.617090 4919 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 21:51:38 crc kubenswrapper[4919]: E0310 21:51:38.617142 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-10 21:51:39.117124459 +0000 UTC m=+86.359005067 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.617661 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.617720 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.617957 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.617960 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: E0310 21:51:38.621852 4919 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 21:51:38 crc kubenswrapper[4919]: E0310 21:51:38.621953 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 21:51:39.121907385 +0000 UTC m=+86.363788003 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.622207 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.622682 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.622833 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.622892 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.622930 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.622906 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.623098 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.623223 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.623229 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.623347 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.623551 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.623525 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.623846 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.623858 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.624148 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.624415 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.624425 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.624872 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: E0310 21:51:38.628507 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 21:51:39.128482403 +0000 UTC m=+86.370363021 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.629453 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.629507 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.630035 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.630389 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.631010 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.630928 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.631054 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.631735 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.631858 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.631927 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.632845 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.633770 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.634015 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: E0310 21:51:38.634088 4919 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 21:51:38 crc kubenswrapper[4919]: E0310 21:51:38.634149 4919 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 21:51:38 crc kubenswrapper[4919]: E0310 21:51:38.634171 4919 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.634244 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: E0310 21:51:38.634276 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-10 21:51:39.134251397 +0000 UTC m=+86.376132035 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.634300 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.634474 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.634510 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.634577 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.634591 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.634613 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.634596 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.634717 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.634764 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.635110 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.635414 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.635455 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.637665 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.637782 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.638099 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.638160 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.638564 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.638501 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.640260 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.640749 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.640845 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.641063 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.641684 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.642357 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.642677 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.643612 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.643773 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.644000 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.644113 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.644630 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.644714 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.644821 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.644842 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.644998 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.645269 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.646371 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.646553 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.646645 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.647053 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.647225 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.647306 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.647383 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.647886 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.648016 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.648254 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.648474 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.650720 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.650945 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.650976 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.651035 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.651354 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.652349 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.652637 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.652745 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.652972 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.661313 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.663511 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.668045 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.692926 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.693161 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.693303 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.693476 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.693731 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.693763 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.693812 4919 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.693824 4919 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.693832 4919 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.693843 4919 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.693853 4919 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.693861 4919 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.693870 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.693878 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.693886 4919 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 
crc kubenswrapper[4919]: I0310 21:51:38.693894 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.693903 4919 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.693910 4919 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.693920 4919 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.693929 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.693938 4919 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.693946 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.693956 4919 reconciler_common.go:293] "Volume detached for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.693964 4919 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.693972 4919 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.693981 4919 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.693990 4919 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.693999 4919 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.694009 4919 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.694018 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on 
node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.693720 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:38Z","lastTransitionTime":"2026-03-10T21:51:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.694083 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.694013 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.694028 4919 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.694188 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.694207 4919 
reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.694219 4919 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.694231 4919 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.694242 4919 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.694254 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.694266 4919 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.694277 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.694289 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: 
\"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.694301 4919 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.694312 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.694324 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.694335 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.694347 4919 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.694359 4919 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.694370 4919 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" 
DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.694381 4919 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.694455 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.694469 4919 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.694480 4919 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.694491 4919 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.694502 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.694514 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.694525 4919 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.694537 4919 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.694550 4919 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.694563 4919 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.694574 4919 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.694584 4919 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.694596 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.694607 4919 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") 
on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.694619 4919 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.694630 4919 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.694641 4919 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.694652 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.694663 4919 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.694674 4919 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.694685 4919 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 
21:51:38.694697 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.694708 4919 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.694720 4919 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.694733 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.694744 4919 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.694755 4919 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.694766 4919 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.694778 4919 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.694790 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.694802 4919 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.694814 4919 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.694826 4919 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.694837 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.694849 4919 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.694860 4919 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node 
\"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.694871 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.694883 4919 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.694894 4919 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.694905 4919 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.694917 4919 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.694928 4919 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.694939 4919 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 
21:51:38.694951 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.694963 4919 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.694974 4919 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.694985 4919 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.694997 4919 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.695009 4919 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.695021 4919 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.695033 4919 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.695044 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.695056 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.695067 4919 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.695078 4919 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.695089 4919 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.695100 4919 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.695111 4919 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: 
\"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.695405 4919 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.695480 4919 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.695494 4919 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.695508 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.695519 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.695534 4919 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.695545 4919 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.695557 4919 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.695569 4919 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.695580 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.695592 4919 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.695604 4919 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.695616 4919 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.695629 4919 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.695640 4919 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.695651 4919 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.695662 4919 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.695673 4919 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.695686 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.695698 4919 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.695709 4919 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node 
\"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.695721 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.695734 4919 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.695746 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.695758 4919 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.695769 4919 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.695780 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.695791 4919 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 
21:51:38.695802 4919 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.695813 4919 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.695825 4919 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.695836 4919 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.695847 4919 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.695858 4919 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.695870 4919 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.695881 4919 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.695893 4919 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.695904 4919 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.695915 4919 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.695926 4919 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.695939 4919 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.695951 4919 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.695963 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: 
\"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.695975 4919 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.695986 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.695998 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.696009 4919 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.696020 4919 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.765830 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.780879 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 10 21:51:38 crc kubenswrapper[4919]: E0310 21:51:38.788751 4919 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 10 21:51:38 crc kubenswrapper[4919]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash Mar 10 21:51:38 crc kubenswrapper[4919]: set -o allexport Mar 10 21:51:38 crc kubenswrapper[4919]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Mar 10 21:51:38 crc kubenswrapper[4919]: source /etc/kubernetes/apiserver-url.env Mar 10 21:51:38 crc kubenswrapper[4919]: else Mar 10 21:51:38 crc kubenswrapper[4919]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Mar 10 21:51:38 crc kubenswrapper[4919]: exit 1 Mar 10 21:51:38 crc kubenswrapper[4919]: fi Mar 10 21:51:38 crc kubenswrapper[4919]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Mar 10 21:51:38 crc kubenswrapper[4919]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},
EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFi
eldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 10 21:51:38 crc kubenswrapper[4919]: > logger="UnhandledError" Mar 10 21:51:38 crc kubenswrapper[4919]: E0310 21:51:38.789945 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.792528 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.796619 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.796780 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.796799 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.796818 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.796877 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:38Z","lastTransitionTime":"2026-03-10T21:51:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:51:38 crc kubenswrapper[4919]: E0310 21:51:38.803320 4919 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},Re
startPolicy:nil,} start failed in pod iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 10 21:51:38 crc kubenswrapper[4919]: E0310 21:51:38.804645 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.811538 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"95cb12fa1d29298a56ba263783b7942797dd7765399c92e8d78306117d333ae0"} Mar 10 21:51:38 crc kubenswrapper[4919]: W0310 21:51:38.812636 4919 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-48b4734c7170a05ceeb58a99ad16e7b4fe86603445578a10dcacb07b02fb32a8 WatchSource:0}: Error finding container 48b4734c7170a05ceeb58a99ad16e7b4fe86603445578a10dcacb07b02fb32a8: Status 404 returned error can't find the container with id 48b4734c7170a05ceeb58a99ad16e7b4fe86603445578a10dcacb07b02fb32a8 Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.813730 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"6beaae2cec2ab55604c641ed4073d842e91e659eb4e421de8ce26eb074518500"} Mar 10 21:51:38 crc kubenswrapper[4919]: E0310 21:51:38.815187 4919 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services 
have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 10 21:51:38 crc kubenswrapper[4919]: E0310 21:51:38.816245 4919 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 10 21:51:38 crc kubenswrapper[4919]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash Mar 10 21:51:38 crc kubenswrapper[4919]: set -o allexport Mar 10 21:51:38 crc kubenswrapper[4919]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Mar 10 21:51:38 crc kubenswrapper[4919]: source /etc/kubernetes/apiserver-url.env Mar 10 21:51:38 crc kubenswrapper[4919]: else Mar 10 21:51:38 crc kubenswrapper[4919]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Mar 10 21:51:38 crc kubenswrapper[4919]: exit 1 Mar 10 21:51:38 crc kubenswrapper[4919]: fi Mar 10 21:51:38 crc kubenswrapper[4919]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Mar 10 21:51:38 crc kubenswrapper[4919]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},
EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFi
eldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 10 21:51:38 crc kubenswrapper[4919]: > logger="UnhandledError" Mar 10 21:51:38 crc kubenswrapper[4919]: E0310 21:51:38.816321 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" Mar 10 21:51:38 crc kubenswrapper[4919]: E0310 21:51:38.817129 4919 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 10 21:51:38 crc kubenswrapper[4919]: container 
&Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 10 21:51:38 crc kubenswrapper[4919]: if [[ -f "/env/_master" ]]; then Mar 10 21:51:38 crc kubenswrapper[4919]: set -o allexport Mar 10 21:51:38 crc kubenswrapper[4919]: source "/env/_master" Mar 10 21:51:38 crc kubenswrapper[4919]: set +o allexport Mar 10 21:51:38 crc kubenswrapper[4919]: fi Mar 10 21:51:38 crc kubenswrapper[4919]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled. Mar 10 21:51:38 crc kubenswrapper[4919]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791 Mar 10 21:51:38 crc kubenswrapper[4919]: ho_enable="--enable-hybrid-overlay" Mar 10 21:51:38 crc kubenswrapper[4919]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook" Mar 10 21:51:38 crc kubenswrapper[4919]: # extra-allowed-user: service account `ovn-kubernetes-control-plane` Mar 10 21:51:38 crc kubenswrapper[4919]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager) Mar 10 21:51:38 crc kubenswrapper[4919]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 10 21:51:38 crc kubenswrapper[4919]: --webhook-cert-dir="/etc/webhook-cert" \ Mar 10 21:51:38 crc kubenswrapper[4919]: --webhook-host=127.0.0.1 \ Mar 10 21:51:38 crc kubenswrapper[4919]: --webhook-port=9743 \ Mar 10 21:51:38 crc kubenswrapper[4919]: ${ho_enable} \ Mar 10 21:51:38 crc kubenswrapper[4919]: --enable-interconnect \ Mar 10 21:51:38 crc kubenswrapper[4919]: --disable-approver \ Mar 10 21:51:38 crc kubenswrapper[4919]: --extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \ Mar 10 21:51:38 crc kubenswrapper[4919]: --wait-for-kubernetes-api=200s \ Mar 10 21:51:38 crc kubenswrapper[4919]: 
--pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \ Mar 10 21:51:38 crc kubenswrapper[4919]: --loglevel="${LOGLEVEL}" Mar 10 21:51:38 crc kubenswrapper[4919]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 10 21:51:38 crc kubenswrapper[4919]: > logger="UnhandledError" Mar 10 21:51:38 crc kubenswrapper[4919]: E0310 21:51:38.817863 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312" Mar 10 21:51:38 crc kubenswrapper[4919]: E0310 21:51:38.821141 4919 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 10 21:51:38 crc kubenswrapper[4919]: container &Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 10 21:51:38 crc kubenswrapper[4919]: if [[ -f "/env/_master" ]]; then Mar 10 21:51:38 crc kubenswrapper[4919]: set -o allexport Mar 10 21:51:38 crc kubenswrapper[4919]: source "/env/_master" Mar 10 21:51:38 crc kubenswrapper[4919]: set +o allexport Mar 10 21:51:38 crc kubenswrapper[4919]: fi Mar 10 21:51:38 crc kubenswrapper[4919]: Mar 10 21:51:38 crc kubenswrapper[4919]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver" Mar 10 21:51:38 crc kubenswrapper[4919]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 10 21:51:38 crc kubenswrapper[4919]: --disable-webhook \ Mar 10 21:51:38 crc kubenswrapper[4919]: --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \ Mar 10 21:51:38 crc kubenswrapper[4919]: --loglevel="${LOGLEVEL}" Mar 10 21:51:38 crc kubenswrapper[4919]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 10 21:51:38 crc kubenswrapper[4919]: > logger="UnhandledError" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.821758 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:51:38 crc kubenswrapper[4919]: E0310 21:51:38.824595 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.834868 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.846113 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.860017 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.874997 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.886455 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.897944 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.900629 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.900658 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.900695 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.900714 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.900726 4919 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:38Z","lastTransitionTime":"2026-03-10T21:51:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.909069 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.924141 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.936093 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.948361 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:51:38 crc kubenswrapper[4919]: I0310 21:51:38.964316 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:51:39 crc kubenswrapper[4919]: I0310 21:51:39.004284 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:39 crc kubenswrapper[4919]: I0310 21:51:39.004322 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 10 21:51:39 crc kubenswrapper[4919]: I0310 21:51:39.004336 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:39 crc kubenswrapper[4919]: I0310 21:51:39.004354 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:39 crc kubenswrapper[4919]: I0310 21:51:39.004370 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:39Z","lastTransitionTime":"2026-03-10T21:51:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 21:51:39 crc kubenswrapper[4919]: I0310 21:51:39.107119 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:39 crc kubenswrapper[4919]: I0310 21:51:39.107152 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:39 crc kubenswrapper[4919]: I0310 21:51:39.107162 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:39 crc kubenswrapper[4919]: I0310 21:51:39.107177 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:39 crc kubenswrapper[4919]: I0310 21:51:39.107215 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:39Z","lastTransitionTime":"2026-03-10T21:51:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 21:51:39 crc kubenswrapper[4919]: I0310 21:51:39.201959 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 21:51:39 crc kubenswrapper[4919]: I0310 21:51:39.202067 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 21:51:39 crc kubenswrapper[4919]: I0310 21:51:39.202110 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 21:51:39 crc kubenswrapper[4919]: I0310 21:51:39.202142 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 21:51:39 crc kubenswrapper[4919]: E0310 21:51:39.202190 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 21:51:40.202154842 +0000 UTC m=+87.444035490 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 21:51:39 crc kubenswrapper[4919]: E0310 21:51:39.202230 4919 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 21:51:39 crc kubenswrapper[4919]: I0310 21:51:39.202248 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 21:51:39 crc kubenswrapper[4919]: E0310 21:51:39.202287 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 21:51:40.202269785 +0000 UTC m=+87.444150433 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 21:51:39 crc kubenswrapper[4919]: E0310 21:51:39.202378 4919 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 21:51:39 crc kubenswrapper[4919]: E0310 21:51:39.202386 4919 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 21:51:39 crc kubenswrapper[4919]: E0310 21:51:39.202459 4919 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 21:51:39 crc kubenswrapper[4919]: E0310 21:51:39.202484 4919 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 21:51:39 crc kubenswrapper[4919]: E0310 21:51:39.202530 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 21:51:40.202505012 +0000 UTC m=+87.444385670 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 21:51:39 crc kubenswrapper[4919]: E0310 21:51:39.202564 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-10 21:51:40.202546723 +0000 UTC m=+87.444427371 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 21:51:39 crc kubenswrapper[4919]: E0310 21:51:39.202579 4919 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 21:51:39 crc kubenswrapper[4919]: E0310 21:51:39.202686 4919 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 21:51:39 crc kubenswrapper[4919]: E0310 21:51:39.202756 4919 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not 
registered] Mar 10 21:51:39 crc kubenswrapper[4919]: E0310 21:51:39.202917 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-10 21:51:40.202882162 +0000 UTC m=+87.444762810 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 21:51:39 crc kubenswrapper[4919]: I0310 21:51:39.211363 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:39 crc kubenswrapper[4919]: I0310 21:51:39.211429 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:39 crc kubenswrapper[4919]: I0310 21:51:39.211448 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:39 crc kubenswrapper[4919]: I0310 21:51:39.211471 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:39 crc kubenswrapper[4919]: I0310 21:51:39.211487 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:39Z","lastTransitionTime":"2026-03-10T21:51:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:51:39 crc kubenswrapper[4919]: I0310 21:51:39.314088 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:39 crc kubenswrapper[4919]: I0310 21:51:39.314149 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:39 crc kubenswrapper[4919]: I0310 21:51:39.314157 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:39 crc kubenswrapper[4919]: I0310 21:51:39.314170 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:39 crc kubenswrapper[4919]: I0310 21:51:39.314180 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:39Z","lastTransitionTime":"2026-03-10T21:51:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:51:39 crc kubenswrapper[4919]: I0310 21:51:39.417650 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:39 crc kubenswrapper[4919]: I0310 21:51:39.417742 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:39 crc kubenswrapper[4919]: I0310 21:51:39.417790 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:39 crc kubenswrapper[4919]: I0310 21:51:39.417829 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:39 crc kubenswrapper[4919]: I0310 21:51:39.417854 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:39Z","lastTransitionTime":"2026-03-10T21:51:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:51:39 crc kubenswrapper[4919]: I0310 21:51:39.485685 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Mar 10 21:51:39 crc kubenswrapper[4919]: I0310 21:51:39.486422 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Mar 10 21:51:39 crc kubenswrapper[4919]: I0310 21:51:39.487360 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Mar 10 21:51:39 crc kubenswrapper[4919]: I0310 21:51:39.488824 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Mar 10 21:51:39 crc kubenswrapper[4919]: I0310 21:51:39.490004 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Mar 10 21:51:39 crc kubenswrapper[4919]: I0310 21:51:39.491094 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Mar 10 21:51:39 crc kubenswrapper[4919]: I0310 21:51:39.491856 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Mar 10 21:51:39 crc kubenswrapper[4919]: I0310 21:51:39.492569 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" 
path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Mar 10 21:51:39 crc kubenswrapper[4919]: I0310 21:51:39.493855 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Mar 10 21:51:39 crc kubenswrapper[4919]: I0310 21:51:39.494591 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Mar 10 21:51:39 crc kubenswrapper[4919]: I0310 21:51:39.495698 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Mar 10 21:51:39 crc kubenswrapper[4919]: I0310 21:51:39.496710 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Mar 10 21:51:39 crc kubenswrapper[4919]: I0310 21:51:39.497835 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Mar 10 21:51:39 crc kubenswrapper[4919]: I0310 21:51:39.498599 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Mar 10 21:51:39 crc kubenswrapper[4919]: I0310 21:51:39.499701 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Mar 10 21:51:39 crc kubenswrapper[4919]: I0310 21:51:39.500312 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" 
path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Mar 10 21:51:39 crc kubenswrapper[4919]: I0310 21:51:39.500903 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Mar 10 21:51:39 crc kubenswrapper[4919]: I0310 21:51:39.501758 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Mar 10 21:51:39 crc kubenswrapper[4919]: I0310 21:51:39.502307 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Mar 10 21:51:39 crc kubenswrapper[4919]: I0310 21:51:39.502831 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Mar 10 21:51:39 crc kubenswrapper[4919]: I0310 21:51:39.503665 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Mar 10 21:51:39 crc kubenswrapper[4919]: I0310 21:51:39.504203 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Mar 10 21:51:39 crc kubenswrapper[4919]: I0310 21:51:39.505049 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Mar 10 21:51:39 crc kubenswrapper[4919]: I0310 21:51:39.505802 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" 
path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Mar 10 21:51:39 crc kubenswrapper[4919]: I0310 21:51:39.506662 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Mar 10 21:51:39 crc kubenswrapper[4919]: I0310 21:51:39.507385 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Mar 10 21:51:39 crc kubenswrapper[4919]: I0310 21:51:39.508546 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Mar 10 21:51:39 crc kubenswrapper[4919]: I0310 21:51:39.509026 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Mar 10 21:51:39 crc kubenswrapper[4919]: I0310 21:51:39.509600 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Mar 10 21:51:39 crc kubenswrapper[4919]: I0310 21:51:39.510422 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Mar 10 21:51:39 crc kubenswrapper[4919]: I0310 21:51:39.510849 4919 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Mar 10 21:51:39 crc kubenswrapper[4919]: I0310 21:51:39.510944 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Mar 10 21:51:39 crc kubenswrapper[4919]: I0310 21:51:39.513319 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Mar 10 21:51:39 crc kubenswrapper[4919]: I0310 21:51:39.514050 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Mar 10 21:51:39 crc kubenswrapper[4919]: I0310 21:51:39.514654 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Mar 10 21:51:39 crc kubenswrapper[4919]: I0310 21:51:39.516546 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Mar 10 21:51:39 crc kubenswrapper[4919]: I0310 21:51:39.517417 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Mar 10 21:51:39 crc kubenswrapper[4919]: I0310 21:51:39.518117 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Mar 10 21:51:39 crc kubenswrapper[4919]: I0310 21:51:39.518982 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Mar 10 21:51:39 crc kubenswrapper[4919]: I0310 21:51:39.521376 4919 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:39 crc kubenswrapper[4919]: I0310 21:51:39.521459 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:39 crc kubenswrapper[4919]: I0310 21:51:39.521477 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:39 crc kubenswrapper[4919]: I0310 21:51:39.521503 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:39 crc kubenswrapper[4919]: I0310 21:51:39.521520 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:39Z","lastTransitionTime":"2026-03-10T21:51:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:51:39 crc kubenswrapper[4919]: I0310 21:51:39.522374 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Mar 10 21:51:39 crc kubenswrapper[4919]: I0310 21:51:39.522998 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Mar 10 21:51:39 crc kubenswrapper[4919]: I0310 21:51:39.524381 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Mar 10 21:51:39 crc kubenswrapper[4919]: I0310 21:51:39.525193 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Mar 10 21:51:39 crc kubenswrapper[4919]: I0310 21:51:39.526541 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Mar 10 21:51:39 crc kubenswrapper[4919]: I0310 21:51:39.527202 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Mar 10 21:51:39 crc kubenswrapper[4919]: I0310 21:51:39.528567 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Mar 10 21:51:39 crc kubenswrapper[4919]: I0310 21:51:39.529259 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" 
path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Mar 10 21:51:39 crc kubenswrapper[4919]: I0310 21:51:39.530764 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Mar 10 21:51:39 crc kubenswrapper[4919]: I0310 21:51:39.531423 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Mar 10 21:51:39 crc kubenswrapper[4919]: I0310 21:51:39.532539 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Mar 10 21:51:39 crc kubenswrapper[4919]: I0310 21:51:39.533133 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Mar 10 21:51:39 crc kubenswrapper[4919]: I0310 21:51:39.534348 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Mar 10 21:51:39 crc kubenswrapper[4919]: I0310 21:51:39.535109 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Mar 10 21:51:39 crc kubenswrapper[4919]: I0310 21:51:39.535809 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Mar 10 21:51:39 crc kubenswrapper[4919]: I0310 21:51:39.624305 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" 
Mar 10 21:51:39 crc kubenswrapper[4919]: I0310 21:51:39.624366 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:39 crc kubenswrapper[4919]: I0310 21:51:39.624418 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:39 crc kubenswrapper[4919]: I0310 21:51:39.624452 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:39 crc kubenswrapper[4919]: I0310 21:51:39.624479 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:39Z","lastTransitionTime":"2026-03-10T21:51:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 21:51:39 crc kubenswrapper[4919]: I0310 21:51:39.727789 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:39 crc kubenswrapper[4919]: I0310 21:51:39.727861 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:39 crc kubenswrapper[4919]: I0310 21:51:39.727884 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:39 crc kubenswrapper[4919]: I0310 21:51:39.727912 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:39 crc kubenswrapper[4919]: I0310 21:51:39.727931 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:39Z","lastTransitionTime":"2026-03-10T21:51:39Z","reason":"KubeletNotReady","message":"container runtime 
network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 21:51:39 crc kubenswrapper[4919]: I0310 21:51:39.817451 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"48b4734c7170a05ceeb58a99ad16e7b4fe86603445578a10dcacb07b02fb32a8"} Mar 10 21:51:39 crc kubenswrapper[4919]: E0310 21:51:39.820048 4919 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 10 21:51:39 crc kubenswrapper[4919]: container &Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 10 21:51:39 crc kubenswrapper[4919]: if [[ -f "/env/_master" ]]; then Mar 10 21:51:39 crc kubenswrapper[4919]: set -o allexport Mar 10 21:51:39 crc kubenswrapper[4919]: source "/env/_master" Mar 10 21:51:39 crc kubenswrapper[4919]: set +o allexport Mar 10 21:51:39 crc kubenswrapper[4919]: fi Mar 10 21:51:39 crc kubenswrapper[4919]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled. 
Mar 10 21:51:39 crc kubenswrapper[4919]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791 Mar 10 21:51:39 crc kubenswrapper[4919]: ho_enable="--enable-hybrid-overlay" Mar 10 21:51:39 crc kubenswrapper[4919]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook" Mar 10 21:51:39 crc kubenswrapper[4919]: # extra-allowed-user: service account `ovn-kubernetes-control-plane` Mar 10 21:51:39 crc kubenswrapper[4919]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager) Mar 10 21:51:39 crc kubenswrapper[4919]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 10 21:51:39 crc kubenswrapper[4919]: --webhook-cert-dir="/etc/webhook-cert" \ Mar 10 21:51:39 crc kubenswrapper[4919]: --webhook-host=127.0.0.1 \ Mar 10 21:51:39 crc kubenswrapper[4919]: --webhook-port=9743 \ Mar 10 21:51:39 crc kubenswrapper[4919]: ${ho_enable} \ Mar 10 21:51:39 crc kubenswrapper[4919]: --enable-interconnect \ Mar 10 21:51:39 crc kubenswrapper[4919]: --disable-approver \ Mar 10 21:51:39 crc kubenswrapper[4919]: --extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \ Mar 10 21:51:39 crc kubenswrapper[4919]: --wait-for-kubernetes-api=200s \ Mar 10 21:51:39 crc kubenswrapper[4919]: --pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \ Mar 10 21:51:39 crc kubenswrapper[4919]: --loglevel="${LOGLEVEL}" Mar 10 21:51:39 crc kubenswrapper[4919]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: 
{{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 10 21:51:39 crc kubenswrapper[4919]: > logger="UnhandledError" Mar 10 21:51:39 crc kubenswrapper[4919]: E0310 21:51:39.823309 4919 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 10 21:51:39 crc kubenswrapper[4919]: container &Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 10 21:51:39 crc 
kubenswrapper[4919]: if [[ -f "/env/_master" ]]; then Mar 10 21:51:39 crc kubenswrapper[4919]: set -o allexport Mar 10 21:51:39 crc kubenswrapper[4919]: source "/env/_master" Mar 10 21:51:39 crc kubenswrapper[4919]: set +o allexport Mar 10 21:51:39 crc kubenswrapper[4919]: fi Mar 10 21:51:39 crc kubenswrapper[4919]: Mar 10 21:51:39 crc kubenswrapper[4919]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver" Mar 10 21:51:39 crc kubenswrapper[4919]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 10 21:51:39 crc kubenswrapper[4919]: --disable-webhook \ Mar 10 21:51:39 crc kubenswrapper[4919]: --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \ Mar 10 21:51:39 crc kubenswrapper[4919]: --loglevel="${LOGLEVEL}" Mar 10 21:51:39 crc kubenswrapper[4919]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 10 21:51:39 crc kubenswrapper[4919]: > logger="UnhandledError" Mar 10 21:51:39 crc kubenswrapper[4919]: E0310 21:51:39.825532 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" Mar 10 21:51:39 crc kubenswrapper[4919]: I0310 21:51:39.830756 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:39 crc kubenswrapper[4919]: I0310 21:51:39.830973 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:39 crc kubenswrapper[4919]: I0310 21:51:39.830998 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:39 crc kubenswrapper[4919]: I0310 21:51:39.831072 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:39 crc kubenswrapper[4919]: I0310 21:51:39.831092 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:39Z","lastTransitionTime":"2026-03-10T21:51:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:51:39 crc kubenswrapper[4919]: I0310 21:51:39.837035 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:51:39 crc kubenswrapper[4919]: I0310 21:51:39.854031 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:51:39 crc kubenswrapper[4919]: I0310 21:51:39.871004 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:51:39 crc kubenswrapper[4919]: I0310 21:51:39.885503 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:51:39 crc kubenswrapper[4919]: I0310 21:51:39.899273 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:51:39 crc kubenswrapper[4919]: I0310 21:51:39.913384 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:51:39 crc kubenswrapper[4919]: I0310 21:51:39.933644 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:39 crc kubenswrapper[4919]: I0310 21:51:39.933704 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:39 crc kubenswrapper[4919]: I0310 21:51:39.933724 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:39 crc kubenswrapper[4919]: I0310 21:51:39.933748 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:39 crc kubenswrapper[4919]: I0310 21:51:39.933765 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:39Z","lastTransitionTime":"2026-03-10T21:51:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 21:51:40 crc kubenswrapper[4919]: I0310 21:51:40.036517 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:40 crc kubenswrapper[4919]: I0310 21:51:40.036848 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:40 crc kubenswrapper[4919]: I0310 21:51:40.036996 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:40 crc kubenswrapper[4919]: I0310 21:51:40.037141 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:40 crc kubenswrapper[4919]: I0310 21:51:40.037291 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:40Z","lastTransitionTime":"2026-03-10T21:51:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:51:40 crc kubenswrapper[4919]: I0310 21:51:40.140771 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:40 crc kubenswrapper[4919]: I0310 21:51:40.141139 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:40 crc kubenswrapper[4919]: I0310 21:51:40.141306 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:40 crc kubenswrapper[4919]: I0310 21:51:40.141496 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:40 crc kubenswrapper[4919]: I0310 21:51:40.141628 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:40Z","lastTransitionTime":"2026-03-10T21:51:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:51:40 crc kubenswrapper[4919]: I0310 21:51:40.212538 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 21:51:40 crc kubenswrapper[4919]: I0310 21:51:40.212684 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 21:51:40 crc kubenswrapper[4919]: I0310 21:51:40.212752 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 21:51:40 crc kubenswrapper[4919]: I0310 21:51:40.212828 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 21:51:40 crc kubenswrapper[4919]: I0310 21:51:40.212866 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 21:51:40 crc kubenswrapper[4919]: E0310 21:51:40.213009 4919 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 21:51:40 crc kubenswrapper[4919]: E0310 21:51:40.213082 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 21:51:42.213059755 +0000 UTC m=+89.454940403 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 21:51:40 crc kubenswrapper[4919]: E0310 21:51:40.213549 4919 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 21:51:40 crc kubenswrapper[4919]: E0310 21:51:40.213647 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 21:51:42.21359844 +0000 UTC m=+89.455479088 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 21:51:40 crc kubenswrapper[4919]: E0310 21:51:40.213698 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 21:51:42.213683432 +0000 UTC m=+89.455564070 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 21:51:40 crc kubenswrapper[4919]: E0310 21:51:40.213891 4919 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 21:51:40 crc kubenswrapper[4919]: E0310 21:51:40.214039 4919 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 21:51:40 crc kubenswrapper[4919]: E0310 21:51:40.214213 4919 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not 
registered] Mar 10 21:51:40 crc kubenswrapper[4919]: E0310 21:51:40.214438 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-10 21:51:42.214369192 +0000 UTC m=+89.456249830 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 21:51:40 crc kubenswrapper[4919]: E0310 21:51:40.214649 4919 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 21:51:40 crc kubenswrapper[4919]: E0310 21:51:40.214776 4919 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 21:51:40 crc kubenswrapper[4919]: E0310 21:51:40.214901 4919 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 21:51:40 crc kubenswrapper[4919]: E0310 21:51:40.215079 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. 
No retries permitted until 2026-03-10 21:51:42.215060282 +0000 UTC m=+89.456940920 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 21:51:40 crc kubenswrapper[4919]: I0310 21:51:40.244465 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:40 crc kubenswrapper[4919]: I0310 21:51:40.244646 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:40 crc kubenswrapper[4919]: I0310 21:51:40.244658 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:40 crc kubenswrapper[4919]: I0310 21:51:40.244700 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:40 crc kubenswrapper[4919]: I0310 21:51:40.244713 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:40Z","lastTransitionTime":"2026-03-10T21:51:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:51:40 crc kubenswrapper[4919]: I0310 21:51:40.348061 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:40 crc kubenswrapper[4919]: I0310 21:51:40.348428 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:40 crc kubenswrapper[4919]: I0310 21:51:40.348631 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:40 crc kubenswrapper[4919]: I0310 21:51:40.348814 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:40 crc kubenswrapper[4919]: I0310 21:51:40.348940 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:40Z","lastTransitionTime":"2026-03-10T21:51:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:51:40 crc kubenswrapper[4919]: I0310 21:51:40.452210 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:40 crc kubenswrapper[4919]: I0310 21:51:40.452538 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:40 crc kubenswrapper[4919]: I0310 21:51:40.452746 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:40 crc kubenswrapper[4919]: I0310 21:51:40.452892 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:40 crc kubenswrapper[4919]: I0310 21:51:40.453013 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:40Z","lastTransitionTime":"2026-03-10T21:51:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 21:51:40 crc kubenswrapper[4919]: I0310 21:51:40.479105 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 21:51:40 crc kubenswrapper[4919]: E0310 21:51:40.479559 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 21:51:40 crc kubenswrapper[4919]: I0310 21:51:40.480203 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 21:51:40 crc kubenswrapper[4919]: E0310 21:51:40.480492 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 21:51:40 crc kubenswrapper[4919]: I0310 21:51:40.480691 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 21:51:40 crc kubenswrapper[4919]: E0310 21:51:40.480888 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 21:51:40 crc kubenswrapper[4919]: I0310 21:51:40.556216 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:40 crc kubenswrapper[4919]: I0310 21:51:40.556543 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:40 crc kubenswrapper[4919]: I0310 21:51:40.556849 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:40 crc kubenswrapper[4919]: I0310 21:51:40.557178 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:40 crc kubenswrapper[4919]: I0310 21:51:40.557513 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:40Z","lastTransitionTime":"2026-03-10T21:51:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:51:40 crc kubenswrapper[4919]: I0310 21:51:40.660218 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:40 crc kubenswrapper[4919]: I0310 21:51:40.661296 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:40 crc kubenswrapper[4919]: I0310 21:51:40.661678 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:40 crc kubenswrapper[4919]: I0310 21:51:40.662048 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:40 crc kubenswrapper[4919]: I0310 21:51:40.662432 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:40Z","lastTransitionTime":"2026-03-10T21:51:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:51:40 crc kubenswrapper[4919]: I0310 21:51:40.766468 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:40 crc kubenswrapper[4919]: I0310 21:51:40.766683 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:40 crc kubenswrapper[4919]: I0310 21:51:40.767053 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:40 crc kubenswrapper[4919]: I0310 21:51:40.767279 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:40 crc kubenswrapper[4919]: I0310 21:51:40.767533 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:40Z","lastTransitionTime":"2026-03-10T21:51:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:51:40 crc kubenswrapper[4919]: I0310 21:51:40.798658 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:40 crc kubenswrapper[4919]: I0310 21:51:40.798850 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:40 crc kubenswrapper[4919]: I0310 21:51:40.798972 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:40 crc kubenswrapper[4919]: I0310 21:51:40.799106 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:40 crc kubenswrapper[4919]: I0310 21:51:40.799225 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:40Z","lastTransitionTime":"2026-03-10T21:51:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:51:40 crc kubenswrapper[4919]: E0310 21:51:40.816761 4919 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:51:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:51:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:51:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:51:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c22d31cd-a51d-4524-bb69-0b454ae09e98\\\",\\\"systemUUID\\\":\\\"eb24d1fd-ecd7-423c-90f7-cacacceb5386\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:51:40 crc kubenswrapper[4919]: I0310 21:51:40.823786 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:40 crc kubenswrapper[4919]: I0310 21:51:40.824046 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:40 crc kubenswrapper[4919]: I0310 21:51:40.824217 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:40 crc kubenswrapper[4919]: I0310 21:51:40.824385 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:40 crc kubenswrapper[4919]: I0310 21:51:40.824590 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:40Z","lastTransitionTime":"2026-03-10T21:51:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:51:40 crc kubenswrapper[4919]: E0310 21:51:40.841829 4919 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:51:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:51:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:51:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:51:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c22d31cd-a51d-4524-bb69-0b454ae09e98\\\",\\\"systemUUID\\\":\\\"eb24d1fd-ecd7-423c-90f7-cacacceb5386\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:51:40 crc kubenswrapper[4919]: I0310 21:51:40.848445 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:40 crc kubenswrapper[4919]: I0310 21:51:40.848682 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:40 crc kubenswrapper[4919]: I0310 21:51:40.848899 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:40 crc kubenswrapper[4919]: I0310 21:51:40.849049 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:40 crc kubenswrapper[4919]: I0310 21:51:40.849179 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:40Z","lastTransitionTime":"2026-03-10T21:51:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:51:40 crc kubenswrapper[4919]: E0310 21:51:40.866519 4919 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:51:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:51:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:51:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:51:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c22d31cd-a51d-4524-bb69-0b454ae09e98\\\",\\\"systemUUID\\\":\\\"eb24d1fd-ecd7-423c-90f7-cacacceb5386\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:51:40 crc kubenswrapper[4919]: I0310 21:51:40.871992 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:40 crc kubenswrapper[4919]: I0310 21:51:40.872197 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:40 crc kubenswrapper[4919]: I0310 21:51:40.872424 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:40 crc kubenswrapper[4919]: I0310 21:51:40.872594 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:40 crc kubenswrapper[4919]: I0310 21:51:40.873036 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:40Z","lastTransitionTime":"2026-03-10T21:51:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:51:40 crc kubenswrapper[4919]: E0310 21:51:40.890132 4919 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:51:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:51:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:51:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:51:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c22d31cd-a51d-4524-bb69-0b454ae09e98\\\",\\\"systemUUID\\\":\\\"eb24d1fd-ecd7-423c-90f7-cacacceb5386\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:51:40 crc kubenswrapper[4919]: I0310 21:51:40.895249 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:40 crc kubenswrapper[4919]: I0310 21:51:40.895487 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:40 crc kubenswrapper[4919]: I0310 21:51:40.895659 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:40 crc kubenswrapper[4919]: I0310 21:51:40.895804 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:40 crc kubenswrapper[4919]: I0310 21:51:40.895926 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:40Z","lastTransitionTime":"2026-03-10T21:51:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:51:40 crc kubenswrapper[4919]: E0310 21:51:40.915518 4919 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:51:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:51:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:51:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:51:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c22d31cd-a51d-4524-bb69-0b454ae09e98\\\",\\\"systemUUID\\\":\\\"eb24d1fd-ecd7-423c-90f7-cacacceb5386\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:51:40 crc kubenswrapper[4919]: E0310 21:51:40.915832 4919 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 10 21:51:40 crc kubenswrapper[4919]: I0310 21:51:40.918778 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:40 crc kubenswrapper[4919]: I0310 21:51:40.918841 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:40 crc kubenswrapper[4919]: I0310 21:51:40.918859 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:40 crc kubenswrapper[4919]: I0310 21:51:40.918890 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:40 crc kubenswrapper[4919]: I0310 21:51:40.918909 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:40Z","lastTransitionTime":"2026-03-10T21:51:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:51:41 crc kubenswrapper[4919]: I0310 21:51:41.022607 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:41 crc kubenswrapper[4919]: I0310 21:51:41.023005 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:41 crc kubenswrapper[4919]: I0310 21:51:41.023170 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:41 crc kubenswrapper[4919]: I0310 21:51:41.023327 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:41 crc kubenswrapper[4919]: I0310 21:51:41.023527 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:41Z","lastTransitionTime":"2026-03-10T21:51:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:51:41 crc kubenswrapper[4919]: I0310 21:51:41.126639 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:41 crc kubenswrapper[4919]: I0310 21:51:41.127135 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:41 crc kubenswrapper[4919]: I0310 21:51:41.127448 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:41 crc kubenswrapper[4919]: I0310 21:51:41.127705 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:41 crc kubenswrapper[4919]: I0310 21:51:41.127908 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:41Z","lastTransitionTime":"2026-03-10T21:51:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:51:41 crc kubenswrapper[4919]: I0310 21:51:41.231925 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:41 crc kubenswrapper[4919]: I0310 21:51:41.232355 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:41 crc kubenswrapper[4919]: I0310 21:51:41.232652 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:41 crc kubenswrapper[4919]: I0310 21:51:41.232893 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:41 crc kubenswrapper[4919]: I0310 21:51:41.233109 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:41Z","lastTransitionTime":"2026-03-10T21:51:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:51:41 crc kubenswrapper[4919]: I0310 21:51:41.344146 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:41 crc kubenswrapper[4919]: I0310 21:51:41.344647 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:41 crc kubenswrapper[4919]: I0310 21:51:41.344822 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:41 crc kubenswrapper[4919]: I0310 21:51:41.344986 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:41 crc kubenswrapper[4919]: I0310 21:51:41.345138 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:41Z","lastTransitionTime":"2026-03-10T21:51:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:51:41 crc kubenswrapper[4919]: I0310 21:51:41.448048 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:41 crc kubenswrapper[4919]: I0310 21:51:41.448102 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:41 crc kubenswrapper[4919]: I0310 21:51:41.448119 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:41 crc kubenswrapper[4919]: I0310 21:51:41.448142 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:41 crc kubenswrapper[4919]: I0310 21:51:41.448163 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:41Z","lastTransitionTime":"2026-03-10T21:51:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:51:41 crc kubenswrapper[4919]: I0310 21:51:41.551015 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:41 crc kubenswrapper[4919]: I0310 21:51:41.551081 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:41 crc kubenswrapper[4919]: I0310 21:51:41.551099 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:41 crc kubenswrapper[4919]: I0310 21:51:41.551125 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:41 crc kubenswrapper[4919]: I0310 21:51:41.551143 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:41Z","lastTransitionTime":"2026-03-10T21:51:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:51:41 crc kubenswrapper[4919]: I0310 21:51:41.654900 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:41 crc kubenswrapper[4919]: I0310 21:51:41.654967 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:41 crc kubenswrapper[4919]: I0310 21:51:41.654984 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:41 crc kubenswrapper[4919]: I0310 21:51:41.655011 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:41 crc kubenswrapper[4919]: I0310 21:51:41.655029 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:41Z","lastTransitionTime":"2026-03-10T21:51:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:51:41 crc kubenswrapper[4919]: I0310 21:51:41.757808 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:41 crc kubenswrapper[4919]: I0310 21:51:41.757878 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:41 crc kubenswrapper[4919]: I0310 21:51:41.757901 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:41 crc kubenswrapper[4919]: I0310 21:51:41.757935 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:41 crc kubenswrapper[4919]: I0310 21:51:41.757959 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:41Z","lastTransitionTime":"2026-03-10T21:51:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:51:41 crc kubenswrapper[4919]: I0310 21:51:41.860275 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:41 crc kubenswrapper[4919]: I0310 21:51:41.860345 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:41 crc kubenswrapper[4919]: I0310 21:51:41.860364 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:41 crc kubenswrapper[4919]: I0310 21:51:41.860387 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:41 crc kubenswrapper[4919]: I0310 21:51:41.860470 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:41Z","lastTransitionTime":"2026-03-10T21:51:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:51:41 crc kubenswrapper[4919]: I0310 21:51:41.962870 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:41 crc kubenswrapper[4919]: I0310 21:51:41.962924 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:41 crc kubenswrapper[4919]: I0310 21:51:41.962941 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:41 crc kubenswrapper[4919]: I0310 21:51:41.962965 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:41 crc kubenswrapper[4919]: I0310 21:51:41.962983 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:41Z","lastTransitionTime":"2026-03-10T21:51:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:51:42 crc kubenswrapper[4919]: I0310 21:51:42.065071 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:42 crc kubenswrapper[4919]: I0310 21:51:42.065121 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:42 crc kubenswrapper[4919]: I0310 21:51:42.065135 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:42 crc kubenswrapper[4919]: I0310 21:51:42.065155 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:42 crc kubenswrapper[4919]: I0310 21:51:42.065169 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:42Z","lastTransitionTime":"2026-03-10T21:51:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:51:42 crc kubenswrapper[4919]: I0310 21:51:42.167448 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:42 crc kubenswrapper[4919]: I0310 21:51:42.167489 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:42 crc kubenswrapper[4919]: I0310 21:51:42.167498 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:42 crc kubenswrapper[4919]: I0310 21:51:42.167512 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:42 crc kubenswrapper[4919]: I0310 21:51:42.167521 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:42Z","lastTransitionTime":"2026-03-10T21:51:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:51:42 crc kubenswrapper[4919]: I0310 21:51:42.229697 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 21:51:42 crc kubenswrapper[4919]: I0310 21:51:42.229771 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 21:51:42 crc kubenswrapper[4919]: I0310 21:51:42.229800 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 21:51:42 crc kubenswrapper[4919]: I0310 21:51:42.229818 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 21:51:42 crc kubenswrapper[4919]: I0310 21:51:42.229834 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 21:51:42 crc kubenswrapper[4919]: E0310 21:51:42.229922 4919 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 21:51:42 crc kubenswrapper[4919]: E0310 21:51:42.229968 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 21:51:46.229955759 +0000 UTC m=+93.471836367 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 21:51:42 crc kubenswrapper[4919]: E0310 21:51:42.229982 4919 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 21:51:42 crc kubenswrapper[4919]: E0310 21:51:42.230071 4919 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 21:51:42 crc kubenswrapper[4919]: E0310 21:51:42.230112 4919 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 21:51:42 crc kubenswrapper[4919]: E0310 21:51:42.230125 4919 projected.go:194] Error 
preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 21:51:42 crc kubenswrapper[4919]: E0310 21:51:42.230139 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 21:51:46.230114554 +0000 UTC m=+93.471995232 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 21:51:42 crc kubenswrapper[4919]: E0310 21:51:42.230180 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-10 21:51:46.230160296 +0000 UTC m=+93.472040994 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 21:51:42 crc kubenswrapper[4919]: E0310 21:51:42.230236 4919 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 21:51:42 crc kubenswrapper[4919]: E0310 21:51:42.230246 4919 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 21:51:42 crc kubenswrapper[4919]: E0310 21:51:42.230250 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 21:51:46.230236548 +0000 UTC m=+93.472117266 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 21:51:42 crc kubenswrapper[4919]: E0310 21:51:42.230253 4919 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 21:51:42 crc kubenswrapper[4919]: E0310 21:51:42.230636 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-10 21:51:46.230620289 +0000 UTC m=+93.472501007 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 21:51:42 crc kubenswrapper[4919]: I0310 21:51:42.270117 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:42 crc kubenswrapper[4919]: I0310 21:51:42.270166 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:42 crc kubenswrapper[4919]: I0310 21:51:42.270177 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:42 crc kubenswrapper[4919]: I0310 21:51:42.270195 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:42 crc kubenswrapper[4919]: I0310 21:51:42.270210 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:42Z","lastTransitionTime":"2026-03-10T21:51:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:51:42 crc kubenswrapper[4919]: I0310 21:51:42.376486 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:42 crc kubenswrapper[4919]: I0310 21:51:42.376551 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:42 crc kubenswrapper[4919]: I0310 21:51:42.376568 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:42 crc kubenswrapper[4919]: I0310 21:51:42.376591 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:42 crc kubenswrapper[4919]: I0310 21:51:42.376608 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:42Z","lastTransitionTime":"2026-03-10T21:51:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 21:51:42 crc kubenswrapper[4919]: I0310 21:51:42.479013 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 21:51:42 crc kubenswrapper[4919]: E0310 21:51:42.479186 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 21:51:42 crc kubenswrapper[4919]: I0310 21:51:42.479045 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 21:51:42 crc kubenswrapper[4919]: I0310 21:51:42.479255 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:42 crc kubenswrapper[4919]: I0310 21:51:42.479286 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:42 crc kubenswrapper[4919]: I0310 21:51:42.479302 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:42 crc kubenswrapper[4919]: I0310 21:51:42.479318 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 21:51:42 crc kubenswrapper[4919]: E0310 21:51:42.479376 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 21:51:42 crc kubenswrapper[4919]: I0310 21:51:42.479326 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:42 crc kubenswrapper[4919]: E0310 21:51:42.479432 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 21:51:42 crc kubenswrapper[4919]: I0310 21:51:42.479469 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:42Z","lastTransitionTime":"2026-03-10T21:51:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:51:42 crc kubenswrapper[4919]: I0310 21:51:42.581952 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:42 crc kubenswrapper[4919]: I0310 21:51:42.582000 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:42 crc kubenswrapper[4919]: I0310 21:51:42.582009 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:42 crc kubenswrapper[4919]: I0310 21:51:42.582028 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:42 crc kubenswrapper[4919]: I0310 21:51:42.582039 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:42Z","lastTransitionTime":"2026-03-10T21:51:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:51:42 crc kubenswrapper[4919]: I0310 21:51:42.688042 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:42 crc kubenswrapper[4919]: I0310 21:51:42.688089 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:42 crc kubenswrapper[4919]: I0310 21:51:42.688105 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:42 crc kubenswrapper[4919]: I0310 21:51:42.688128 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:42 crc kubenswrapper[4919]: I0310 21:51:42.688146 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:42Z","lastTransitionTime":"2026-03-10T21:51:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:51:42 crc kubenswrapper[4919]: I0310 21:51:42.791125 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:42 crc kubenswrapper[4919]: I0310 21:51:42.791188 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:42 crc kubenswrapper[4919]: I0310 21:51:42.791210 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:42 crc kubenswrapper[4919]: I0310 21:51:42.791238 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:42 crc kubenswrapper[4919]: I0310 21:51:42.791258 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:42Z","lastTransitionTime":"2026-03-10T21:51:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:51:42 crc kubenswrapper[4919]: I0310 21:51:42.893498 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:42 crc kubenswrapper[4919]: I0310 21:51:42.893571 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:42 crc kubenswrapper[4919]: I0310 21:51:42.893588 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:42 crc kubenswrapper[4919]: I0310 21:51:42.893613 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:42 crc kubenswrapper[4919]: I0310 21:51:42.893630 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:42Z","lastTransitionTime":"2026-03-10T21:51:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 10 21:51:42 crc kubenswrapper[4919]: I0310 21:51:42.997377 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 21:51:42 crc kubenswrapper[4919]: I0310 21:51:42.997441 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 21:51:42 crc kubenswrapper[4919]: I0310 21:51:42.997453 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 21:51:42 crc kubenswrapper[4919]: I0310 21:51:42.997478 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 21:51:42 crc kubenswrapper[4919]: I0310 21:51:42.997489 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:42Z","lastTransitionTime":"2026-03-10T21:51:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 21:51:43 crc kubenswrapper[4919]: I0310 21:51:43.100792 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 21:51:43 crc kubenswrapper[4919]: I0310 21:51:43.100838 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 21:51:43 crc kubenswrapper[4919]: I0310 21:51:43.100850 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 21:51:43 crc kubenswrapper[4919]: I0310 21:51:43.100865 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 21:51:43 crc kubenswrapper[4919]: I0310 21:51:43.100877 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:43Z","lastTransitionTime":"2026-03-10T21:51:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 21:51:43 crc kubenswrapper[4919]: I0310 21:51:43.203578 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 21:51:43 crc kubenswrapper[4919]: I0310 21:51:43.203625 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 21:51:43 crc kubenswrapper[4919]: I0310 21:51:43.203636 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 21:51:43 crc kubenswrapper[4919]: I0310 21:51:43.203653 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 21:51:43 crc kubenswrapper[4919]: I0310 21:51:43.203665 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:43Z","lastTransitionTime":"2026-03-10T21:51:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 21:51:43 crc kubenswrapper[4919]: I0310 21:51:43.306837 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 21:51:43 crc kubenswrapper[4919]: I0310 21:51:43.306896 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 21:51:43 crc kubenswrapper[4919]: I0310 21:51:43.306917 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 21:51:43 crc kubenswrapper[4919]: I0310 21:51:43.306947 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 21:51:43 crc kubenswrapper[4919]: I0310 21:51:43.306974 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:43Z","lastTransitionTime":"2026-03-10T21:51:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 21:51:43 crc kubenswrapper[4919]: I0310 21:51:43.410067 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 21:51:43 crc kubenswrapper[4919]: I0310 21:51:43.410305 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 21:51:43 crc kubenswrapper[4919]: I0310 21:51:43.410496 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 21:51:43 crc kubenswrapper[4919]: I0310 21:51:43.410596 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 21:51:43 crc kubenswrapper[4919]: I0310 21:51:43.410686 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:43Z","lastTransitionTime":"2026-03-10T21:51:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"} Mar 10 21:51:43 crc kubenswrapper[4919]: I0310 21:51:43.493300 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:51:43 crc kubenswrapper[4919]: I0310 21:51:43.506303 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:51:43 crc kubenswrapper[4919]: I0310 21:51:43.513412 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory"
Mar 10 21:51:43 crc kubenswrapper[4919]: I0310 21:51:43.513447 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 21:51:43 crc kubenswrapper[4919]: I0310 21:51:43.513458 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 21:51:43 crc kubenswrapper[4919]: I0310 21:51:43.513474 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 21:51:43 crc kubenswrapper[4919]: I0310 21:51:43.513486 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:43Z","lastTransitionTime":"2026-03-10T21:51:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"} Mar 10 21:51:43 crc kubenswrapper[4919]: I0310 21:51:43.519833 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:51:43 crc kubenswrapper[4919]: I0310 21:51:43.537195 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:51:43 crc kubenswrapper[4919]: I0310 21:51:43.548119 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:51:43 crc kubenswrapper[4919]: I0310 21:51:43.558772 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:51:43 crc kubenswrapper[4919]: I0310 21:51:43.615767 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:43 crc kubenswrapper[4919]: I0310 21:51:43.615795 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure"
Mar 10 21:51:43 crc kubenswrapper[4919]: I0310 21:51:43.615804 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 21:51:43 crc kubenswrapper[4919]: I0310 21:51:43.615817 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 21:51:43 crc kubenswrapper[4919]: I0310 21:51:43.615825 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:43Z","lastTransitionTime":"2026-03-10T21:51:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 21:51:43 crc kubenswrapper[4919]: I0310 21:51:43.717853 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 21:51:43 crc kubenswrapper[4919]: I0310 21:51:43.717898 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 21:51:43 crc kubenswrapper[4919]: I0310 21:51:43.717911 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 21:51:43 crc kubenswrapper[4919]: I0310 21:51:43.717929 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 21:51:43 crc kubenswrapper[4919]: I0310 21:51:43.717941 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:43Z","lastTransitionTime":"2026-03-10T21:51:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 21:51:43 crc kubenswrapper[4919]: I0310 21:51:43.820956 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 21:51:43 crc kubenswrapper[4919]: I0310 21:51:43.820999 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 21:51:43 crc kubenswrapper[4919]: I0310 21:51:43.821008 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 21:51:43 crc kubenswrapper[4919]: I0310 21:51:43.821082 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 21:51:43 crc kubenswrapper[4919]: I0310 21:51:43.821096 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:43Z","lastTransitionTime":"2026-03-10T21:51:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 21:51:43 crc kubenswrapper[4919]: I0310 21:51:43.923342 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 21:51:43 crc kubenswrapper[4919]: I0310 21:51:43.923380 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 21:51:43 crc kubenswrapper[4919]: I0310 21:51:43.923415 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 21:51:43 crc kubenswrapper[4919]: I0310 21:51:43.923438 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 21:51:43 crc kubenswrapper[4919]: I0310 21:51:43.923450 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:43Z","lastTransitionTime":"2026-03-10T21:51:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 21:51:44 crc kubenswrapper[4919]: I0310 21:51:44.025637 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 21:51:44 crc kubenswrapper[4919]: I0310 21:51:44.025687 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 21:51:44 crc kubenswrapper[4919]: I0310 21:51:44.025696 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 21:51:44 crc kubenswrapper[4919]: I0310 21:51:44.025717 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 21:51:44 crc kubenswrapper[4919]: I0310 21:51:44.025728 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:44Z","lastTransitionTime":"2026-03-10T21:51:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"}
Mar 10 21:51:44 crc kubenswrapper[4919]: I0310 21:51:44.129025 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 21:51:44 crc kubenswrapper[4919]: I0310 21:51:44.129079 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 21:51:44 crc kubenswrapper[4919]: I0310 21:51:44.129091 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 21:51:44 crc kubenswrapper[4919]: I0310 21:51:44.129109 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 21:51:44 crc kubenswrapper[4919]: I0310 21:51:44.129118 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:44Z","lastTransitionTime":"2026-03-10T21:51:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 21:51:44 crc kubenswrapper[4919]: I0310 21:51:44.231910 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 21:51:44 crc kubenswrapper[4919]: I0310 21:51:44.231965 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 21:51:44 crc kubenswrapper[4919]: I0310 21:51:44.231984 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 21:51:44 crc kubenswrapper[4919]: I0310 21:51:44.232008 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 21:51:44 crc kubenswrapper[4919]: I0310 21:51:44.232026 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:44Z","lastTransitionTime":"2026-03-10T21:51:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 21:51:44 crc kubenswrapper[4919]: I0310 21:51:44.334890 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 21:51:44 crc kubenswrapper[4919]: I0310 21:51:44.334919 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 21:51:44 crc kubenswrapper[4919]: I0310 21:51:44.334929 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 21:51:44 crc kubenswrapper[4919]: I0310 21:51:44.334941 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 21:51:44 crc kubenswrapper[4919]: I0310 21:51:44.334951 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:44Z","lastTransitionTime":"2026-03-10T21:51:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"}
Mar 10 21:51:44 crc kubenswrapper[4919]: I0310 21:51:44.438102 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 21:51:44 crc kubenswrapper[4919]: I0310 21:51:44.438206 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 21:51:44 crc kubenswrapper[4919]: I0310 21:51:44.438225 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 21:51:44 crc kubenswrapper[4919]: I0310 21:51:44.438250 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 21:51:44 crc kubenswrapper[4919]: I0310 21:51:44.438266 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:44Z","lastTransitionTime":"2026-03-10T21:51:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 21:51:44 crc kubenswrapper[4919]: I0310 21:51:44.479105 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 10 21:51:44 crc kubenswrapper[4919]: I0310 21:51:44.479204 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 10 21:51:44 crc kubenswrapper[4919]: I0310 21:51:44.479226 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 10 21:51:44 crc kubenswrapper[4919]: E0310 21:51:44.479360 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 10 21:51:44 crc kubenswrapper[4919]: E0310 21:51:44.479570 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 10 21:51:44 crc kubenswrapper[4919]: E0310 21:51:44.479630 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 10 21:51:44 crc kubenswrapper[4919]: I0310 21:51:44.540980 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 21:51:44 crc kubenswrapper[4919]: I0310 21:51:44.541009 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 21:51:44 crc kubenswrapper[4919]: I0310 21:51:44.541031 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 21:51:44 crc kubenswrapper[4919]: I0310 21:51:44.541045 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 21:51:44 crc kubenswrapper[4919]: I0310 21:51:44.541056 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:44Z","lastTransitionTime":"2026-03-10T21:51:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 21:51:44 crc kubenswrapper[4919]: I0310 21:51:44.643771 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 21:51:44 crc kubenswrapper[4919]: I0310 21:51:44.643845 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 21:51:44 crc kubenswrapper[4919]: I0310 21:51:44.643862 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 21:51:44 crc kubenswrapper[4919]: I0310 21:51:44.643879 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 21:51:44 crc kubenswrapper[4919]: I0310 21:51:44.643893 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:44Z","lastTransitionTime":"2026-03-10T21:51:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"} Mar 10 21:51:44 crc kubenswrapper[4919]: I0310 21:51:44.747229 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:44 crc kubenswrapper[4919]: I0310 21:51:44.747285 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:44 crc kubenswrapper[4919]: I0310 21:51:44.747308 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:44 crc kubenswrapper[4919]: I0310 21:51:44.747324 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:44 crc kubenswrapper[4919]: I0310 21:51:44.747337 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:44Z","lastTransitionTime":"2026-03-10T21:51:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:51:44 crc kubenswrapper[4919]: I0310 21:51:44.849473 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:44 crc kubenswrapper[4919]: I0310 21:51:44.849504 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:44 crc kubenswrapper[4919]: I0310 21:51:44.849512 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:44 crc kubenswrapper[4919]: I0310 21:51:44.849524 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:44 crc kubenswrapper[4919]: I0310 21:51:44.849532 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:44Z","lastTransitionTime":"2026-03-10T21:51:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:51:44 crc kubenswrapper[4919]: I0310 21:51:44.952683 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:44 crc kubenswrapper[4919]: I0310 21:51:44.952753 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:44 crc kubenswrapper[4919]: I0310 21:51:44.952777 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:44 crc kubenswrapper[4919]: I0310 21:51:44.952803 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:44 crc kubenswrapper[4919]: I0310 21:51:44.952822 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:44Z","lastTransitionTime":"2026-03-10T21:51:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:51:45 crc kubenswrapper[4919]: I0310 21:51:45.055841 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:45 crc kubenswrapper[4919]: I0310 21:51:45.055883 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:45 crc kubenswrapper[4919]: I0310 21:51:45.055896 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:45 crc kubenswrapper[4919]: I0310 21:51:45.055914 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:45 crc kubenswrapper[4919]: I0310 21:51:45.055925 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:45Z","lastTransitionTime":"2026-03-10T21:51:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:51:45 crc kubenswrapper[4919]: I0310 21:51:45.158416 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:45 crc kubenswrapper[4919]: I0310 21:51:45.158453 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:45 crc kubenswrapper[4919]: I0310 21:51:45.158463 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:45 crc kubenswrapper[4919]: I0310 21:51:45.158479 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:45 crc kubenswrapper[4919]: I0310 21:51:45.158489 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:45Z","lastTransitionTime":"2026-03-10T21:51:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:51:45 crc kubenswrapper[4919]: I0310 21:51:45.261891 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:45 crc kubenswrapper[4919]: I0310 21:51:45.262162 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:45 crc kubenswrapper[4919]: I0310 21:51:45.262265 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:45 crc kubenswrapper[4919]: I0310 21:51:45.262350 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:45 crc kubenswrapper[4919]: I0310 21:51:45.262446 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:45Z","lastTransitionTime":"2026-03-10T21:51:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:51:45 crc kubenswrapper[4919]: I0310 21:51:45.365571 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:45 crc kubenswrapper[4919]: I0310 21:51:45.365636 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:45 crc kubenswrapper[4919]: I0310 21:51:45.365648 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:45 crc kubenswrapper[4919]: I0310 21:51:45.365729 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:45 crc kubenswrapper[4919]: I0310 21:51:45.365744 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:45Z","lastTransitionTime":"2026-03-10T21:51:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:51:45 crc kubenswrapper[4919]: I0310 21:51:45.468835 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:45 crc kubenswrapper[4919]: I0310 21:51:45.468911 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:45 crc kubenswrapper[4919]: I0310 21:51:45.468930 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:45 crc kubenswrapper[4919]: I0310 21:51:45.468959 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:45 crc kubenswrapper[4919]: I0310 21:51:45.468984 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:45Z","lastTransitionTime":"2026-03-10T21:51:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:51:45 crc kubenswrapper[4919]: I0310 21:51:45.497172 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Mar 10 21:51:45 crc kubenswrapper[4919]: I0310 21:51:45.571476 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:45 crc kubenswrapper[4919]: I0310 21:51:45.571533 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:45 crc kubenswrapper[4919]: I0310 21:51:45.571552 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:45 crc kubenswrapper[4919]: I0310 21:51:45.571575 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:45 crc kubenswrapper[4919]: I0310 21:51:45.571593 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:45Z","lastTransitionTime":"2026-03-10T21:51:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:51:45 crc kubenswrapper[4919]: I0310 21:51:45.674511 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:45 crc kubenswrapper[4919]: I0310 21:51:45.674854 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:45 crc kubenswrapper[4919]: I0310 21:51:45.675209 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:45 crc kubenswrapper[4919]: I0310 21:51:45.675423 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:45 crc kubenswrapper[4919]: I0310 21:51:45.675589 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:45Z","lastTransitionTime":"2026-03-10T21:51:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:51:45 crc kubenswrapper[4919]: I0310 21:51:45.779188 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:45 crc kubenswrapper[4919]: I0310 21:51:45.779254 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:45 crc kubenswrapper[4919]: I0310 21:51:45.779275 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:45 crc kubenswrapper[4919]: I0310 21:51:45.779300 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:45 crc kubenswrapper[4919]: I0310 21:51:45.779321 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:45Z","lastTransitionTime":"2026-03-10T21:51:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:51:45 crc kubenswrapper[4919]: I0310 21:51:45.881583 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:45 crc kubenswrapper[4919]: I0310 21:51:45.882263 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:45 crc kubenswrapper[4919]: I0310 21:51:45.882335 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:45 crc kubenswrapper[4919]: I0310 21:51:45.882442 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:45 crc kubenswrapper[4919]: I0310 21:51:45.882528 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:45Z","lastTransitionTime":"2026-03-10T21:51:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:51:45 crc kubenswrapper[4919]: I0310 21:51:45.985247 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:45 crc kubenswrapper[4919]: I0310 21:51:45.985725 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:45 crc kubenswrapper[4919]: I0310 21:51:45.985807 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:45 crc kubenswrapper[4919]: I0310 21:51:45.985892 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:45 crc kubenswrapper[4919]: I0310 21:51:45.985968 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:45Z","lastTransitionTime":"2026-03-10T21:51:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:51:46 crc kubenswrapper[4919]: I0310 21:51:46.089378 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:46 crc kubenswrapper[4919]: I0310 21:51:46.089618 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:46 crc kubenswrapper[4919]: I0310 21:51:46.089633 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:46 crc kubenswrapper[4919]: I0310 21:51:46.089651 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:46 crc kubenswrapper[4919]: I0310 21:51:46.089664 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:46Z","lastTransitionTime":"2026-03-10T21:51:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:51:46 crc kubenswrapper[4919]: I0310 21:51:46.192629 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:46 crc kubenswrapper[4919]: I0310 21:51:46.192689 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:46 crc kubenswrapper[4919]: I0310 21:51:46.192700 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:46 crc kubenswrapper[4919]: I0310 21:51:46.192716 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:46 crc kubenswrapper[4919]: I0310 21:51:46.192750 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:46Z","lastTransitionTime":"2026-03-10T21:51:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:51:46 crc kubenswrapper[4919]: I0310 21:51:46.269170 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 21:51:46 crc kubenswrapper[4919]: I0310 21:51:46.269257 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 21:51:46 crc kubenswrapper[4919]: I0310 21:51:46.269288 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 21:51:46 crc kubenswrapper[4919]: I0310 21:51:46.269309 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 21:51:46 crc kubenswrapper[4919]: I0310 21:51:46.269327 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 21:51:46 crc kubenswrapper[4919]: E0310 21:51:46.269450 4919 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 21:51:46 crc kubenswrapper[4919]: E0310 21:51:46.269456 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 21:51:54.269418881 +0000 UTC m=+101.511299529 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 21:51:46 crc kubenswrapper[4919]: E0310 21:51:46.269480 4919 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 21:51:46 crc kubenswrapper[4919]: E0310 21:51:46.269504 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 21:51:54.269490083 +0000 UTC m=+101.511370691 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 21:51:46 crc kubenswrapper[4919]: E0310 21:51:46.269538 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 21:51:54.269519834 +0000 UTC m=+101.511400442 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 21:51:46 crc kubenswrapper[4919]: E0310 21:51:46.269595 4919 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 21:51:46 crc kubenswrapper[4919]: E0310 21:51:46.269595 4919 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 21:51:46 crc kubenswrapper[4919]: E0310 21:51:46.269620 4919 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 21:51:46 crc kubenswrapper[4919]: E0310 21:51:46.269639 4919 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not 
registered Mar 10 21:51:46 crc kubenswrapper[4919]: E0310 21:51:46.269645 4919 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 21:51:46 crc kubenswrapper[4919]: E0310 21:51:46.269658 4919 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 21:51:46 crc kubenswrapper[4919]: E0310 21:51:46.269701 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-10 21:51:54.269687729 +0000 UTC m=+101.511568377 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 21:51:46 crc kubenswrapper[4919]: E0310 21:51:46.269728 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-10 21:51:54.26971391 +0000 UTC m=+101.511594548 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 21:51:46 crc kubenswrapper[4919]: I0310 21:51:46.294677 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:46 crc kubenswrapper[4919]: I0310 21:51:46.294783 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:46 crc kubenswrapper[4919]: I0310 21:51:46.294798 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:46 crc kubenswrapper[4919]: I0310 21:51:46.294814 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:46 crc kubenswrapper[4919]: I0310 21:51:46.294827 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:46Z","lastTransitionTime":"2026-03-10T21:51:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 10 21:51:46 crc kubenswrapper[4919]: I0310 21:51:46.396977 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 21:51:46 crc kubenswrapper[4919]: I0310 21:51:46.397010 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 21:51:46 crc kubenswrapper[4919]: I0310 21:51:46.397018 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 21:51:46 crc kubenswrapper[4919]: I0310 21:51:46.397032 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 21:51:46 crc kubenswrapper[4919]: I0310 21:51:46.397041 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:46Z","lastTransitionTime":"2026-03-10T21:51:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 21:51:46 crc kubenswrapper[4919]: I0310 21:51:46.479009 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 10 21:51:46 crc kubenswrapper[4919]: I0310 21:51:46.479067 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 10 21:51:46 crc kubenswrapper[4919]: I0310 21:51:46.479098 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 10 21:51:46 crc kubenswrapper[4919]: E0310 21:51:46.479131 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 10 21:51:46 crc kubenswrapper[4919]: E0310 21:51:46.479310 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 10 21:51:46 crc kubenswrapper[4919]: E0310 21:51:46.479429 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 10 21:51:46 crc kubenswrapper[4919]: I0310 21:51:46.499770 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 21:51:46 crc kubenswrapper[4919]: I0310 21:51:46.499822 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 21:51:46 crc kubenswrapper[4919]: I0310 21:51:46.499845 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 21:51:46 crc kubenswrapper[4919]: I0310 21:51:46.499872 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 21:51:46 crc kubenswrapper[4919]: I0310 21:51:46.499893 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:46Z","lastTransitionTime":"2026-03-10T21:51:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 21:51:46 crc kubenswrapper[4919]: I0310 21:51:46.602690 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 21:51:46 crc kubenswrapper[4919]: I0310 21:51:46.602757 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 21:51:46 crc kubenswrapper[4919]: I0310 21:51:46.602782 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 21:51:46 crc kubenswrapper[4919]: I0310 21:51:46.602814 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 21:51:46 crc kubenswrapper[4919]: I0310 21:51:46.602837 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:46Z","lastTransitionTime":"2026-03-10T21:51:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 21:51:46 crc kubenswrapper[4919]: I0310 21:51:46.705339 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 21:51:46 crc kubenswrapper[4919]: I0310 21:51:46.705367 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 21:51:46 crc kubenswrapper[4919]: I0310 21:51:46.705375 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 21:51:46 crc kubenswrapper[4919]: I0310 21:51:46.705410 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 21:51:46 crc kubenswrapper[4919]: I0310 21:51:46.705419 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:46Z","lastTransitionTime":"2026-03-10T21:51:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 21:51:46 crc kubenswrapper[4919]: I0310 21:51:46.807830 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 21:51:46 crc kubenswrapper[4919]: I0310 21:51:46.807858 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 21:51:46 crc kubenswrapper[4919]: I0310 21:51:46.807866 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 21:51:46 crc kubenswrapper[4919]: I0310 21:51:46.807879 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 21:51:46 crc kubenswrapper[4919]: I0310 21:51:46.807887 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:46Z","lastTransitionTime":"2026-03-10T21:51:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 21:51:46 crc kubenswrapper[4919]: I0310 21:51:46.910296 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 21:51:46 crc kubenswrapper[4919]: I0310 21:51:46.910372 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 21:51:46 crc kubenswrapper[4919]: I0310 21:51:46.910425 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 21:51:46 crc kubenswrapper[4919]: I0310 21:51:46.910452 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 21:51:46 crc kubenswrapper[4919]: I0310 21:51:46.910470 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:46Z","lastTransitionTime":"2026-03-10T21:51:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 21:51:47 crc kubenswrapper[4919]: I0310 21:51:47.014365 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 21:51:47 crc kubenswrapper[4919]: I0310 21:51:47.014861 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 21:51:47 crc kubenswrapper[4919]: I0310 21:51:47.015022 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 21:51:47 crc kubenswrapper[4919]: I0310 21:51:47.015169 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 21:51:47 crc kubenswrapper[4919]: I0310 21:51:47.015346 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:47Z","lastTransitionTime":"2026-03-10T21:51:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 21:51:47 crc kubenswrapper[4919]: I0310 21:51:47.117859 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 21:51:47 crc kubenswrapper[4919]: I0310 21:51:47.117904 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 21:51:47 crc kubenswrapper[4919]: I0310 21:51:47.117916 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 21:51:47 crc kubenswrapper[4919]: I0310 21:51:47.117930 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 21:51:47 crc kubenswrapper[4919]: I0310 21:51:47.117943 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:47Z","lastTransitionTime":"2026-03-10T21:51:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 21:51:47 crc kubenswrapper[4919]: I0310 21:51:47.221275 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 21:51:47 crc kubenswrapper[4919]: I0310 21:51:47.221431 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 21:51:47 crc kubenswrapper[4919]: I0310 21:51:47.221456 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 21:51:47 crc kubenswrapper[4919]: I0310 21:51:47.221481 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 21:51:47 crc kubenswrapper[4919]: I0310 21:51:47.221532 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:47Z","lastTransitionTime":"2026-03-10T21:51:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 21:51:47 crc kubenswrapper[4919]: I0310 21:51:47.324582 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 21:51:47 crc kubenswrapper[4919]: I0310 21:51:47.324639 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 21:51:47 crc kubenswrapper[4919]: I0310 21:51:47.324660 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 21:51:47 crc kubenswrapper[4919]: I0310 21:51:47.324684 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 21:51:47 crc kubenswrapper[4919]: I0310 21:51:47.324703 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:47Z","lastTransitionTime":"2026-03-10T21:51:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 21:51:47 crc kubenswrapper[4919]: I0310 21:51:47.428429 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 21:51:47 crc kubenswrapper[4919]: I0310 21:51:47.428509 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 21:51:47 crc kubenswrapper[4919]: I0310 21:51:47.428533 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 21:51:47 crc kubenswrapper[4919]: I0310 21:51:47.428567 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 21:51:47 crc kubenswrapper[4919]: I0310 21:51:47.428591 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:47Z","lastTransitionTime":"2026-03-10T21:51:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 21:51:47 crc kubenswrapper[4919]: I0310 21:51:47.533005 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 21:51:47 crc kubenswrapper[4919]: I0310 21:51:47.533062 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 21:51:47 crc kubenswrapper[4919]: I0310 21:51:47.533079 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 21:51:47 crc kubenswrapper[4919]: I0310 21:51:47.533100 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 21:51:47 crc kubenswrapper[4919]: I0310 21:51:47.533117 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:47Z","lastTransitionTime":"2026-03-10T21:51:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 21:51:47 crc kubenswrapper[4919]: I0310 21:51:47.635246 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 21:51:47 crc kubenswrapper[4919]: I0310 21:51:47.635301 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 21:51:47 crc kubenswrapper[4919]: I0310 21:51:47.635317 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 21:51:47 crc kubenswrapper[4919]: I0310 21:51:47.635338 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 21:51:47 crc kubenswrapper[4919]: I0310 21:51:47.635356 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:47Z","lastTransitionTime":"2026-03-10T21:51:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 21:51:47 crc kubenswrapper[4919]: I0310 21:51:47.742683 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 21:51:47 crc kubenswrapper[4919]: I0310 21:51:47.742734 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 21:51:47 crc kubenswrapper[4919]: I0310 21:51:47.742746 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 21:51:47 crc kubenswrapper[4919]: I0310 21:51:47.742762 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 21:51:47 crc kubenswrapper[4919]: I0310 21:51:47.742774 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:47Z","lastTransitionTime":"2026-03-10T21:51:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 21:51:47 crc kubenswrapper[4919]: I0310 21:51:47.845449 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 21:51:47 crc kubenswrapper[4919]: I0310 21:51:47.845521 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 21:51:47 crc kubenswrapper[4919]: I0310 21:51:47.845534 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 21:51:47 crc kubenswrapper[4919]: I0310 21:51:47.845581 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 21:51:47 crc kubenswrapper[4919]: I0310 21:51:47.845593 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:47Z","lastTransitionTime":"2026-03-10T21:51:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 21:51:47 crc kubenswrapper[4919]: I0310 21:51:47.948847 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 21:51:47 crc kubenswrapper[4919]: I0310 21:51:47.948968 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 21:51:47 crc kubenswrapper[4919]: I0310 21:51:47.948988 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 21:51:47 crc kubenswrapper[4919]: I0310 21:51:47.949011 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 21:51:47 crc kubenswrapper[4919]: I0310 21:51:47.949027 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:47Z","lastTransitionTime":"2026-03-10T21:51:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 21:51:48 crc kubenswrapper[4919]: I0310 21:51:48.052621 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 21:51:48 crc kubenswrapper[4919]: I0310 21:51:48.052692 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 21:51:48 crc kubenswrapper[4919]: I0310 21:51:48.052713 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 21:51:48 crc kubenswrapper[4919]: I0310 21:51:48.052738 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 21:51:48 crc kubenswrapper[4919]: I0310 21:51:48.052758 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:48Z","lastTransitionTime":"2026-03-10T21:51:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 21:51:48 crc kubenswrapper[4919]: I0310 21:51:48.156092 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 21:51:48 crc kubenswrapper[4919]: I0310 21:51:48.156170 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 21:51:48 crc kubenswrapper[4919]: I0310 21:51:48.156192 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 21:51:48 crc kubenswrapper[4919]: I0310 21:51:48.156223 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 21:51:48 crc kubenswrapper[4919]: I0310 21:51:48.156246 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:48Z","lastTransitionTime":"2026-03-10T21:51:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 21:51:48 crc kubenswrapper[4919]: I0310 21:51:48.259051 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 21:51:48 crc kubenswrapper[4919]: I0310 21:51:48.259102 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 21:51:48 crc kubenswrapper[4919]: I0310 21:51:48.259120 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 21:51:48 crc kubenswrapper[4919]: I0310 21:51:48.259144 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 21:51:48 crc kubenswrapper[4919]: I0310 21:51:48.259163 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:48Z","lastTransitionTime":"2026-03-10T21:51:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 21:51:48 crc kubenswrapper[4919]: I0310 21:51:48.362122 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 21:51:48 crc kubenswrapper[4919]: I0310 21:51:48.362195 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 21:51:48 crc kubenswrapper[4919]: I0310 21:51:48.362219 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 21:51:48 crc kubenswrapper[4919]: I0310 21:51:48.362252 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 21:51:48 crc kubenswrapper[4919]: I0310 21:51:48.362277 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:48Z","lastTransitionTime":"2026-03-10T21:51:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 21:51:48 crc kubenswrapper[4919]: I0310 21:51:48.464685 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 21:51:48 crc kubenswrapper[4919]: I0310 21:51:48.465062 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 21:51:48 crc kubenswrapper[4919]: I0310 21:51:48.465167 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 21:51:48 crc kubenswrapper[4919]: I0310 21:51:48.465250 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 21:51:48 crc kubenswrapper[4919]: I0310 21:51:48.465332 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:48Z","lastTransitionTime":"2026-03-10T21:51:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 21:51:48 crc kubenswrapper[4919]: I0310 21:51:48.479701 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 10 21:51:48 crc kubenswrapper[4919]: E0310 21:51:48.479942 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 10 21:51:48 crc kubenswrapper[4919]: I0310 21:51:48.479772 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 10 21:51:48 crc kubenswrapper[4919]: E0310 21:51:48.480784 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 10 21:51:48 crc kubenswrapper[4919]: I0310 21:51:48.479716 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 10 21:51:48 crc kubenswrapper[4919]: E0310 21:51:48.480939 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 10 21:51:48 crc kubenswrapper[4919]: I0310 21:51:48.568039 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 21:51:48 crc kubenswrapper[4919]: I0310 21:51:48.568343 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 21:51:48 crc kubenswrapper[4919]: I0310 21:51:48.568475 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 21:51:48 crc kubenswrapper[4919]: I0310 21:51:48.568585 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 21:51:48 crc kubenswrapper[4919]: I0310 21:51:48.568679 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:48Z","lastTransitionTime":"2026-03-10T21:51:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 21:51:48 crc kubenswrapper[4919]: I0310 21:51:48.671077 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 21:51:48 crc kubenswrapper[4919]: I0310 21:51:48.671148 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 21:51:48 crc kubenswrapper[4919]: I0310 21:51:48.671176 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 21:51:48 crc kubenswrapper[4919]: I0310 21:51:48.671209 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 21:51:48 crc kubenswrapper[4919]: I0310 21:51:48.671231 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:48Z","lastTransitionTime":"2026-03-10T21:51:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 21:51:48 crc kubenswrapper[4919]: I0310 21:51:48.774932 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 21:51:48 crc kubenswrapper[4919]: I0310 21:51:48.774999 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 21:51:48 crc kubenswrapper[4919]: I0310 21:51:48.775018 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 21:51:48 crc kubenswrapper[4919]: I0310 21:51:48.775041 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 21:51:48 crc kubenswrapper[4919]: I0310 21:51:48.775058 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:48Z","lastTransitionTime":"2026-03-10T21:51:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 21:51:48 crc kubenswrapper[4919]: I0310 21:51:48.877164 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 21:51:48 crc kubenswrapper[4919]: I0310 21:51:48.877536 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 21:51:48 crc kubenswrapper[4919]: I0310 21:51:48.877689 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 21:51:48 crc kubenswrapper[4919]: I0310 21:51:48.877862 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 21:51:48 crc kubenswrapper[4919]: I0310 21:51:48.878018 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:48Z","lastTransitionTime":"2026-03-10T21:51:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 21:51:48 crc kubenswrapper[4919]: I0310 21:51:48.980944 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 21:51:48 crc kubenswrapper[4919]: I0310 21:51:48.980993 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 21:51:48 crc kubenswrapper[4919]: I0310 21:51:48.981009 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 21:51:48 crc kubenswrapper[4919]: I0310 21:51:48.981030 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 21:51:48 crc kubenswrapper[4919]: I0310 21:51:48.981046 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:48Z","lastTransitionTime":"2026-03-10T21:51:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 21:51:49 crc kubenswrapper[4919]: I0310 21:51:49.083028 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 21:51:49 crc kubenswrapper[4919]: I0310 21:51:49.083078 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 21:51:49 crc kubenswrapper[4919]: I0310 21:51:49.083095 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 21:51:49 crc kubenswrapper[4919]: I0310 21:51:49.083117 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 21:51:49 crc kubenswrapper[4919]: I0310 21:51:49.083133 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:49Z","lastTransitionTime":"2026-03-10T21:51:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 21:51:49 crc kubenswrapper[4919]: I0310 21:51:49.185977 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 21:51:49 crc kubenswrapper[4919]: I0310 21:51:49.186330 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 21:51:49 crc kubenswrapper[4919]: I0310 21:51:49.186489 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 21:51:49 crc kubenswrapper[4919]: I0310 21:51:49.186668 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 21:51:49 crc kubenswrapper[4919]: I0310 21:51:49.186787 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:49Z","lastTransitionTime":"2026-03-10T21:51:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 21:51:49 crc kubenswrapper[4919]: I0310 21:51:49.289621 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 21:51:49 crc kubenswrapper[4919]: I0310 21:51:49.289727 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 21:51:49 crc kubenswrapper[4919]: I0310 21:51:49.289780 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 21:51:49 crc kubenswrapper[4919]: I0310 21:51:49.289805 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 21:51:49 crc kubenswrapper[4919]: I0310 21:51:49.289830 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:49Z","lastTransitionTime":"2026-03-10T21:51:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:51:49 crc kubenswrapper[4919]: I0310 21:51:49.392440 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:49 crc kubenswrapper[4919]: I0310 21:51:49.392470 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:49 crc kubenswrapper[4919]: I0310 21:51:49.392478 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:49 crc kubenswrapper[4919]: I0310 21:51:49.392491 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:49 crc kubenswrapper[4919]: I0310 21:51:49.392501 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:49Z","lastTransitionTime":"2026-03-10T21:51:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:51:49 crc kubenswrapper[4919]: I0310 21:51:49.495103 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:49 crc kubenswrapper[4919]: I0310 21:51:49.495151 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:49 crc kubenswrapper[4919]: I0310 21:51:49.495161 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:49 crc kubenswrapper[4919]: I0310 21:51:49.495178 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:49 crc kubenswrapper[4919]: I0310 21:51:49.495188 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:49Z","lastTransitionTime":"2026-03-10T21:51:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:51:49 crc kubenswrapper[4919]: I0310 21:51:49.597067 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:49 crc kubenswrapper[4919]: I0310 21:51:49.597106 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:49 crc kubenswrapper[4919]: I0310 21:51:49.597115 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:49 crc kubenswrapper[4919]: I0310 21:51:49.597132 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:49 crc kubenswrapper[4919]: I0310 21:51:49.597141 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:49Z","lastTransitionTime":"2026-03-10T21:51:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:51:49 crc kubenswrapper[4919]: I0310 21:51:49.699794 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:49 crc kubenswrapper[4919]: I0310 21:51:49.699854 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:49 crc kubenswrapper[4919]: I0310 21:51:49.699872 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:49 crc kubenswrapper[4919]: I0310 21:51:49.699895 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:49 crc kubenswrapper[4919]: I0310 21:51:49.699914 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:49Z","lastTransitionTime":"2026-03-10T21:51:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:51:49 crc kubenswrapper[4919]: I0310 21:51:49.802823 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:49 crc kubenswrapper[4919]: I0310 21:51:49.802866 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:49 crc kubenswrapper[4919]: I0310 21:51:49.802875 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:49 crc kubenswrapper[4919]: I0310 21:51:49.802890 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:49 crc kubenswrapper[4919]: I0310 21:51:49.802901 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:49Z","lastTransitionTime":"2026-03-10T21:51:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:51:49 crc kubenswrapper[4919]: I0310 21:51:49.905158 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:49 crc kubenswrapper[4919]: I0310 21:51:49.905221 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:49 crc kubenswrapper[4919]: I0310 21:51:49.905245 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:49 crc kubenswrapper[4919]: I0310 21:51:49.905275 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:49 crc kubenswrapper[4919]: I0310 21:51:49.905296 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:49Z","lastTransitionTime":"2026-03-10T21:51:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:51:50 crc kubenswrapper[4919]: I0310 21:51:50.008038 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:50 crc kubenswrapper[4919]: I0310 21:51:50.008097 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:50 crc kubenswrapper[4919]: I0310 21:51:50.008116 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:50 crc kubenswrapper[4919]: I0310 21:51:50.008138 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:50 crc kubenswrapper[4919]: I0310 21:51:50.008155 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:50Z","lastTransitionTime":"2026-03-10T21:51:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:51:50 crc kubenswrapper[4919]: I0310 21:51:50.111374 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:50 crc kubenswrapper[4919]: I0310 21:51:50.111459 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:50 crc kubenswrapper[4919]: I0310 21:51:50.111477 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:50 crc kubenswrapper[4919]: I0310 21:51:50.111500 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:50 crc kubenswrapper[4919]: I0310 21:51:50.111518 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:50Z","lastTransitionTime":"2026-03-10T21:51:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:51:50 crc kubenswrapper[4919]: I0310 21:51:50.214355 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:50 crc kubenswrapper[4919]: I0310 21:51:50.214423 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:50 crc kubenswrapper[4919]: I0310 21:51:50.214436 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:50 crc kubenswrapper[4919]: I0310 21:51:50.214468 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:50 crc kubenswrapper[4919]: I0310 21:51:50.214481 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:50Z","lastTransitionTime":"2026-03-10T21:51:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:51:50 crc kubenswrapper[4919]: I0310 21:51:50.317478 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:50 crc kubenswrapper[4919]: I0310 21:51:50.317517 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:50 crc kubenswrapper[4919]: I0310 21:51:50.317527 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:50 crc kubenswrapper[4919]: I0310 21:51:50.317542 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:50 crc kubenswrapper[4919]: I0310 21:51:50.317550 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:50Z","lastTransitionTime":"2026-03-10T21:51:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:51:50 crc kubenswrapper[4919]: I0310 21:51:50.420000 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:50 crc kubenswrapper[4919]: I0310 21:51:50.420036 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:50 crc kubenswrapper[4919]: I0310 21:51:50.420048 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:50 crc kubenswrapper[4919]: I0310 21:51:50.420065 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:50 crc kubenswrapper[4919]: I0310 21:51:50.420076 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:50Z","lastTransitionTime":"2026-03-10T21:51:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 21:51:50 crc kubenswrapper[4919]: I0310 21:51:50.479332 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 21:51:50 crc kubenswrapper[4919]: I0310 21:51:50.479472 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 21:51:50 crc kubenswrapper[4919]: I0310 21:51:50.479584 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 21:51:50 crc kubenswrapper[4919]: E0310 21:51:50.479584 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 21:51:50 crc kubenswrapper[4919]: E0310 21:51:50.480095 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 21:51:50 crc kubenswrapper[4919]: E0310 21:51:50.480365 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 21:51:50 crc kubenswrapper[4919]: I0310 21:51:50.500177 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Mar 10 21:51:50 crc kubenswrapper[4919]: I0310 21:51:50.522529 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:50 crc kubenswrapper[4919]: I0310 21:51:50.522579 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:50 crc kubenswrapper[4919]: I0310 21:51:50.522597 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:50 crc kubenswrapper[4919]: I0310 21:51:50.522619 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:50 crc kubenswrapper[4919]: I0310 21:51:50.522637 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:50Z","lastTransitionTime":"2026-03-10T21:51:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:51:50 crc kubenswrapper[4919]: I0310 21:51:50.634183 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:50 crc kubenswrapper[4919]: I0310 21:51:50.634243 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:50 crc kubenswrapper[4919]: I0310 21:51:50.634264 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:50 crc kubenswrapper[4919]: I0310 21:51:50.634375 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:50 crc kubenswrapper[4919]: I0310 21:51:50.634474 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:50Z","lastTransitionTime":"2026-03-10T21:51:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:51:50 crc kubenswrapper[4919]: I0310 21:51:50.737200 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:50 crc kubenswrapper[4919]: I0310 21:51:50.737269 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:50 crc kubenswrapper[4919]: I0310 21:51:50.737286 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:50 crc kubenswrapper[4919]: I0310 21:51:50.737310 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:50 crc kubenswrapper[4919]: I0310 21:51:50.737327 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:50Z","lastTransitionTime":"2026-03-10T21:51:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:51:50 crc kubenswrapper[4919]: I0310 21:51:50.840348 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:50 crc kubenswrapper[4919]: I0310 21:51:50.840453 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:50 crc kubenswrapper[4919]: I0310 21:51:50.840472 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:50 crc kubenswrapper[4919]: I0310 21:51:50.840496 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:50 crc kubenswrapper[4919]: I0310 21:51:50.840515 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:50Z","lastTransitionTime":"2026-03-10T21:51:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:51:50 crc kubenswrapper[4919]: I0310 21:51:50.942385 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:50 crc kubenswrapper[4919]: I0310 21:51:50.942456 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:50 crc kubenswrapper[4919]: I0310 21:51:50.942467 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:50 crc kubenswrapper[4919]: I0310 21:51:50.942517 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:50 crc kubenswrapper[4919]: I0310 21:51:50.942528 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:50Z","lastTransitionTime":"2026-03-10T21:51:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:51:51 crc kubenswrapper[4919]: I0310 21:51:51.045143 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:51 crc kubenswrapper[4919]: I0310 21:51:51.045184 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:51 crc kubenswrapper[4919]: I0310 21:51:51.045199 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:51 crc kubenswrapper[4919]: I0310 21:51:51.045222 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:51 crc kubenswrapper[4919]: I0310 21:51:51.045236 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:51Z","lastTransitionTime":"2026-03-10T21:51:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:51:51 crc kubenswrapper[4919]: I0310 21:51:51.148699 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:51 crc kubenswrapper[4919]: I0310 21:51:51.148744 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:51 crc kubenswrapper[4919]: I0310 21:51:51.148759 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:51 crc kubenswrapper[4919]: I0310 21:51:51.148782 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:51 crc kubenswrapper[4919]: I0310 21:51:51.148799 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:51Z","lastTransitionTime":"2026-03-10T21:51:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:51:51 crc kubenswrapper[4919]: I0310 21:51:51.254985 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:51 crc kubenswrapper[4919]: I0310 21:51:51.255044 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:51 crc kubenswrapper[4919]: I0310 21:51:51.255063 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:51 crc kubenswrapper[4919]: I0310 21:51:51.255089 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:51 crc kubenswrapper[4919]: I0310 21:51:51.255107 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:51Z","lastTransitionTime":"2026-03-10T21:51:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:51:51 crc kubenswrapper[4919]: I0310 21:51:51.300103 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:51 crc kubenswrapper[4919]: I0310 21:51:51.300137 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:51 crc kubenswrapper[4919]: I0310 21:51:51.300148 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:51 crc kubenswrapper[4919]: I0310 21:51:51.300163 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:51 crc kubenswrapper[4919]: I0310 21:51:51.300174 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:51Z","lastTransitionTime":"2026-03-10T21:51:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:51:51 crc kubenswrapper[4919]: E0310 21:51:51.314279 4919 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:51:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:51:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:51:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:51:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c22d31cd-a51d-4524-bb69-0b454ae09e98\\\",\\\"systemUUID\\\":\\\"eb24d1fd-ecd7-423c-90f7-cacacceb5386\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:51:51 crc kubenswrapper[4919]: I0310 21:51:51.318435 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:51 crc kubenswrapper[4919]: I0310 21:51:51.318469 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:51 crc kubenswrapper[4919]: I0310 21:51:51.318484 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:51 crc kubenswrapper[4919]: I0310 21:51:51.318505 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:51 crc kubenswrapper[4919]: I0310 21:51:51.318530 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:51Z","lastTransitionTime":"2026-03-10T21:51:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:51:51 crc kubenswrapper[4919]: I0310 21:51:51.353608 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:51 crc kubenswrapper[4919]: I0310 21:51:51.353657 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:51 crc kubenswrapper[4919]: I0310 21:51:51.353674 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:51 crc kubenswrapper[4919]: I0310 21:51:51.353694 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:51 crc kubenswrapper[4919]: I0310 21:51:51.353709 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:51Z","lastTransitionTime":"2026-03-10T21:51:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:51:51 crc kubenswrapper[4919]: E0310 21:51:51.364129 4919 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:51:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:51:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:51:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:51:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c22d31cd-a51d-4524-bb69-0b454ae09e98\\\",\\\"systemUUID\\\":\\\"eb24d1fd-ecd7-423c-90f7-cacacceb5386\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:51:51 crc kubenswrapper[4919]: I0310 21:51:51.367981 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:51 crc kubenswrapper[4919]: I0310 21:51:51.368021 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:51 crc kubenswrapper[4919]: I0310 21:51:51.368038 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:51 crc kubenswrapper[4919]: I0310 21:51:51.368057 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:51 crc kubenswrapper[4919]: I0310 21:51:51.368074 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:51Z","lastTransitionTime":"2026-03-10T21:51:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:51:51 crc kubenswrapper[4919]: E0310 21:51:51.376815 4919 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:51:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:51:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:51:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:51:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c22d31cd-a51d-4524-bb69-0b454ae09e98\\\",\\\"systemUUID\\\":\\\"eb24d1fd-ecd7-423c-90f7-cacacceb5386\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:51:51 crc kubenswrapper[4919]: E0310 21:51:51.377181 4919 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 10 21:51:51 crc kubenswrapper[4919]: I0310 21:51:51.379110 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:51 crc kubenswrapper[4919]: I0310 21:51:51.379160 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:51 crc kubenswrapper[4919]: I0310 21:51:51.379182 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:51 crc kubenswrapper[4919]: I0310 21:51:51.379209 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:51 crc kubenswrapper[4919]: I0310 21:51:51.379228 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:51Z","lastTransitionTime":"2026-03-10T21:51:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:51:51 crc kubenswrapper[4919]: I0310 21:51:51.482593 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:51 crc kubenswrapper[4919]: I0310 21:51:51.482638 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:51 crc kubenswrapper[4919]: I0310 21:51:51.482655 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:51 crc kubenswrapper[4919]: I0310 21:51:51.482681 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:51 crc kubenswrapper[4919]: I0310 21:51:51.482698 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:51Z","lastTransitionTime":"2026-03-10T21:51:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:51:51 crc kubenswrapper[4919]: E0310 21:51:51.482886 4919 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 10 21:51:51 crc kubenswrapper[4919]: container &Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 10 21:51:51 crc kubenswrapper[4919]: if [[ -f "/env/_master" ]]; then Mar 10 21:51:51 crc kubenswrapper[4919]: set -o allexport Mar 10 21:51:51 crc kubenswrapper[4919]: source "/env/_master" Mar 10 21:51:51 crc kubenswrapper[4919]: set +o allexport Mar 10 21:51:51 crc kubenswrapper[4919]: fi Mar 10 21:51:51 crc kubenswrapper[4919]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled. Mar 10 21:51:51 crc kubenswrapper[4919]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791 Mar 10 21:51:51 crc kubenswrapper[4919]: ho_enable="--enable-hybrid-overlay" Mar 10 21:51:51 crc kubenswrapper[4919]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook" Mar 10 21:51:51 crc kubenswrapper[4919]: # extra-allowed-user: service account `ovn-kubernetes-control-plane` Mar 10 21:51:51 crc kubenswrapper[4919]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager) Mar 10 21:51:51 crc kubenswrapper[4919]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 10 21:51:51 crc kubenswrapper[4919]: --webhook-cert-dir="/etc/webhook-cert" \ Mar 10 21:51:51 crc kubenswrapper[4919]: --webhook-host=127.0.0.1 \ Mar 10 21:51:51 crc kubenswrapper[4919]: --webhook-port=9743 \ Mar 10 21:51:51 crc kubenswrapper[4919]: ${ho_enable} \ Mar 10 21:51:51 crc kubenswrapper[4919]: --enable-interconnect \ Mar 10 21:51:51 crc kubenswrapper[4919]: --disable-approver \ Mar 10 21:51:51 crc kubenswrapper[4919]: 
--extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \ Mar 10 21:51:51 crc kubenswrapper[4919]: --wait-for-kubernetes-api=200s \ Mar 10 21:51:51 crc kubenswrapper[4919]: --pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \ Mar 10 21:51:51 crc kubenswrapper[4919]: --loglevel="${LOGLEVEL}" Mar 10 21:51:51 crc kubenswrapper[4919]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false
,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 10 21:51:51 crc kubenswrapper[4919]: > logger="UnhandledError" Mar 10 21:51:51 crc kubenswrapper[4919]: E0310 21:51:51.483013 4919 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 10 21:51:51 crc kubenswrapper[4919]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash Mar 10 21:51:51 crc kubenswrapper[4919]: set -o allexport Mar 10 21:51:51 crc kubenswrapper[4919]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Mar 10 21:51:51 crc kubenswrapper[4919]: source /etc/kubernetes/apiserver-url.env Mar 10 21:51:51 crc kubenswrapper[4919]: else Mar 10 21:51:51 crc kubenswrapper[4919]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Mar 10 21:51:51 crc kubenswrapper[4919]: exit 1 Mar 10 21:51:51 crc kubenswrapper[4919]: fi Mar 10 21:51:51 crc kubenswrapper[4919]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Mar 10 21:51:51 crc kubenswrapper[4919]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},
EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFi
eldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 10 21:51:51 crc kubenswrapper[4919]: > logger="UnhandledError" Mar 10 21:51:51 crc kubenswrapper[4919]: E0310 21:51:51.484151 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312" Mar 10 21:51:51 crc kubenswrapper[4919]: E0310 21:51:51.484916 4919 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 10 21:51:51 crc kubenswrapper[4919]: container 
&Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 10 21:51:51 crc kubenswrapper[4919]: if [[ -f "/env/_master" ]]; then Mar 10 21:51:51 crc kubenswrapper[4919]: set -o allexport Mar 10 21:51:51 crc kubenswrapper[4919]: source "/env/_master" Mar 10 21:51:51 crc kubenswrapper[4919]: set +o allexport Mar 10 21:51:51 crc kubenswrapper[4919]: fi Mar 10 21:51:51 crc kubenswrapper[4919]: Mar 10 21:51:51 crc kubenswrapper[4919]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver" Mar 10 21:51:51 crc kubenswrapper[4919]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 10 21:51:51 crc kubenswrapper[4919]: --disable-webhook \ Mar 10 21:51:51 crc kubenswrapper[4919]: --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \ Mar 10 21:51:51 crc kubenswrapper[4919]: --loglevel="${LOGLEVEL}" Mar 10 21:51:51 crc kubenswrapper[4919]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 10 21:51:51 crc kubenswrapper[4919]: > logger="UnhandledError" Mar 10 21:51:51 crc kubenswrapper[4919]: E0310 21:51:51.486767 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" Mar 10 21:51:51 crc kubenswrapper[4919]: I0310 21:51:51.495323 4919 scope.go:117] "RemoveContainer" containerID="5b2adcae0bf01d646b97b915a28921ad4151ca62e8bdd174b42b5e3dff4b27db" Mar 10 21:51:51 crc kubenswrapper[4919]: I0310 21:51:51.495473 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 10 21:51:51 crc kubenswrapper[4919]: E0310 21:51:51.495488 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 10 21:51:51 crc kubenswrapper[4919]: I0310 21:51:51.586022 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:51 crc kubenswrapper[4919]: I0310 21:51:51.586079 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:51 crc kubenswrapper[4919]: I0310 21:51:51.586096 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:51 crc kubenswrapper[4919]: I0310 21:51:51.586121 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:51 crc kubenswrapper[4919]: I0310 21:51:51.586138 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:51Z","lastTransitionTime":"2026-03-10T21:51:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 21:51:51 crc kubenswrapper[4919]: I0310 21:51:51.689066 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:51 crc kubenswrapper[4919]: I0310 21:51:51.689133 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:51 crc kubenswrapper[4919]: I0310 21:51:51.689151 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:51 crc kubenswrapper[4919]: I0310 21:51:51.689180 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:51 crc kubenswrapper[4919]: I0310 21:51:51.689197 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:51Z","lastTransitionTime":"2026-03-10T21:51:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:51:51 crc kubenswrapper[4919]: I0310 21:51:51.791663 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:51 crc kubenswrapper[4919]: I0310 21:51:51.791739 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:51 crc kubenswrapper[4919]: I0310 21:51:51.791762 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:51 crc kubenswrapper[4919]: I0310 21:51:51.791797 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:51 crc kubenswrapper[4919]: I0310 21:51:51.791821 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:51Z","lastTransitionTime":"2026-03-10T21:51:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:51:51 crc kubenswrapper[4919]: I0310 21:51:51.848977 4919 scope.go:117] "RemoveContainer" containerID="5b2adcae0bf01d646b97b915a28921ad4151ca62e8bdd174b42b5e3dff4b27db" Mar 10 21:51:51 crc kubenswrapper[4919]: E0310 21:51:51.849232 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 10 21:51:51 crc kubenswrapper[4919]: I0310 21:51:51.895754 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:51 crc kubenswrapper[4919]: I0310 21:51:51.895822 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:51 crc kubenswrapper[4919]: I0310 21:51:51.895842 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:51 crc kubenswrapper[4919]: I0310 21:51:51.895868 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:51 crc kubenswrapper[4919]: I0310 21:51:51.895886 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:51Z","lastTransitionTime":"2026-03-10T21:51:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:51:51 crc kubenswrapper[4919]: I0310 21:51:51.998903 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:51 crc kubenswrapper[4919]: I0310 21:51:51.998960 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:51 crc kubenswrapper[4919]: I0310 21:51:51.998977 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:51 crc kubenswrapper[4919]: I0310 21:51:51.998999 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:51 crc kubenswrapper[4919]: I0310 21:51:51.999016 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:51Z","lastTransitionTime":"2026-03-10T21:51:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:51:52 crc kubenswrapper[4919]: I0310 21:51:52.102717 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:52 crc kubenswrapper[4919]: I0310 21:51:52.102784 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:52 crc kubenswrapper[4919]: I0310 21:51:52.102810 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:52 crc kubenswrapper[4919]: I0310 21:51:52.102841 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:52 crc kubenswrapper[4919]: I0310 21:51:52.102866 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:52Z","lastTransitionTime":"2026-03-10T21:51:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:51:52 crc kubenswrapper[4919]: I0310 21:51:52.206523 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:52 crc kubenswrapper[4919]: I0310 21:51:52.206576 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:52 crc kubenswrapper[4919]: I0310 21:51:52.206597 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:52 crc kubenswrapper[4919]: I0310 21:51:52.206621 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:52 crc kubenswrapper[4919]: I0310 21:51:52.206638 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:52Z","lastTransitionTime":"2026-03-10T21:51:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:51:52 crc kubenswrapper[4919]: I0310 21:51:52.309847 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:52 crc kubenswrapper[4919]: I0310 21:51:52.309928 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:52 crc kubenswrapper[4919]: I0310 21:51:52.309951 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:52 crc kubenswrapper[4919]: I0310 21:51:52.309982 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:52 crc kubenswrapper[4919]: I0310 21:51:52.310011 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:52Z","lastTransitionTime":"2026-03-10T21:51:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:51:52 crc kubenswrapper[4919]: I0310 21:51:52.413594 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:52 crc kubenswrapper[4919]: I0310 21:51:52.413666 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:52 crc kubenswrapper[4919]: I0310 21:51:52.413688 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:52 crc kubenswrapper[4919]: I0310 21:51:52.413719 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:52 crc kubenswrapper[4919]: I0310 21:51:52.413741 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:52Z","lastTransitionTime":"2026-03-10T21:51:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 21:51:52 crc kubenswrapper[4919]: I0310 21:51:52.479726 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 21:51:52 crc kubenswrapper[4919]: I0310 21:51:52.479779 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 21:51:52 crc kubenswrapper[4919]: E0310 21:51:52.479906 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 21:51:52 crc kubenswrapper[4919]: I0310 21:51:52.479941 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 21:51:52 crc kubenswrapper[4919]: E0310 21:51:52.480164 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 21:51:52 crc kubenswrapper[4919]: E0310 21:51:52.480262 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 21:51:52 crc kubenswrapper[4919]: I0310 21:51:52.516644 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:52 crc kubenswrapper[4919]: I0310 21:51:52.516704 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:52 crc kubenswrapper[4919]: I0310 21:51:52.516721 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:52 crc kubenswrapper[4919]: I0310 21:51:52.516743 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:52 crc kubenswrapper[4919]: I0310 21:51:52.516760 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:52Z","lastTransitionTime":"2026-03-10T21:51:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:51:52 crc kubenswrapper[4919]: I0310 21:51:52.619676 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:52 crc kubenswrapper[4919]: I0310 21:51:52.619744 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:52 crc kubenswrapper[4919]: I0310 21:51:52.619761 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:52 crc kubenswrapper[4919]: I0310 21:51:52.619785 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:52 crc kubenswrapper[4919]: I0310 21:51:52.619803 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:52Z","lastTransitionTime":"2026-03-10T21:51:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:51:52 crc kubenswrapper[4919]: I0310 21:51:52.723508 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:52 crc kubenswrapper[4919]: I0310 21:51:52.723579 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:52 crc kubenswrapper[4919]: I0310 21:51:52.723601 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:52 crc kubenswrapper[4919]: I0310 21:51:52.723627 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:52 crc kubenswrapper[4919]: I0310 21:51:52.723655 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:52Z","lastTransitionTime":"2026-03-10T21:51:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:51:52 crc kubenswrapper[4919]: I0310 21:51:52.826361 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:52 crc kubenswrapper[4919]: I0310 21:51:52.826480 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:52 crc kubenswrapper[4919]: I0310 21:51:52.826505 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:52 crc kubenswrapper[4919]: I0310 21:51:52.826529 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:52 crc kubenswrapper[4919]: I0310 21:51:52.826547 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:52Z","lastTransitionTime":"2026-03-10T21:51:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:51:52 crc kubenswrapper[4919]: I0310 21:51:52.933635 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:52 crc kubenswrapper[4919]: I0310 21:51:52.933706 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:52 crc kubenswrapper[4919]: I0310 21:51:52.933724 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:52 crc kubenswrapper[4919]: I0310 21:51:52.933750 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:52 crc kubenswrapper[4919]: I0310 21:51:52.933804 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:52Z","lastTransitionTime":"2026-03-10T21:51:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:51:53 crc kubenswrapper[4919]: I0310 21:51:53.035993 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:53 crc kubenswrapper[4919]: I0310 21:51:53.036048 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:53 crc kubenswrapper[4919]: I0310 21:51:53.036067 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:53 crc kubenswrapper[4919]: I0310 21:51:53.036090 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:53 crc kubenswrapper[4919]: I0310 21:51:53.036108 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:53Z","lastTransitionTime":"2026-03-10T21:51:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:51:53 crc kubenswrapper[4919]: I0310 21:51:53.139009 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:53 crc kubenswrapper[4919]: I0310 21:51:53.139596 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:53 crc kubenswrapper[4919]: I0310 21:51:53.139696 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:53 crc kubenswrapper[4919]: I0310 21:51:53.139783 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:53 crc kubenswrapper[4919]: I0310 21:51:53.139877 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:53Z","lastTransitionTime":"2026-03-10T21:51:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:51:53 crc kubenswrapper[4919]: I0310 21:51:53.243499 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:53 crc kubenswrapper[4919]: I0310 21:51:53.243850 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:53 crc kubenswrapper[4919]: I0310 21:51:53.243978 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:53 crc kubenswrapper[4919]: I0310 21:51:53.244113 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:53 crc kubenswrapper[4919]: I0310 21:51:53.244272 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:53Z","lastTransitionTime":"2026-03-10T21:51:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:51:53 crc kubenswrapper[4919]: I0310 21:51:53.348076 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:53 crc kubenswrapper[4919]: I0310 21:51:53.348205 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:53 crc kubenswrapper[4919]: I0310 21:51:53.348231 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:53 crc kubenswrapper[4919]: I0310 21:51:53.348273 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:53 crc kubenswrapper[4919]: I0310 21:51:53.348301 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:53Z","lastTransitionTime":"2026-03-10T21:51:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:51:53 crc kubenswrapper[4919]: I0310 21:51:53.452487 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:53 crc kubenswrapper[4919]: I0310 21:51:53.452568 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:53 crc kubenswrapper[4919]: I0310 21:51:53.452587 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:53 crc kubenswrapper[4919]: I0310 21:51:53.452620 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:53 crc kubenswrapper[4919]: I0310 21:51:53.452638 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:53Z","lastTransitionTime":"2026-03-10T21:51:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:51:53 crc kubenswrapper[4919]: E0310 21:51:53.482019 4919 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},Re
startPolicy:nil,} start failed in pod iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 10 21:51:53 crc kubenswrapper[4919]: E0310 21:51:53.483635 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" Mar 10 21:51:53 crc kubenswrapper[4919]: I0310 21:51:53.497995 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:51:53 crc kubenswrapper[4919]: I0310 21:51:53.513094 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:51:53 crc kubenswrapper[4919]: I0310 21:51:53.531831 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:51:53 crc kubenswrapper[4919]: I0310 21:51:53.551292 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:51:53 crc kubenswrapper[4919]: I0310 21:51:53.555786 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:53 crc kubenswrapper[4919]: I0310 21:51:53.555862 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:53 crc kubenswrapper[4919]: I0310 21:51:53.555893 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:53 crc kubenswrapper[4919]: I0310 21:51:53.555927 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:53 crc kubenswrapper[4919]: I0310 21:51:53.555946 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:53Z","lastTransitionTime":"2026-03-10T21:51:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 21:51:53 crc kubenswrapper[4919]: I0310 21:51:53.565491 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed0693f5-4dbc-4621-9cf6-450d64aaea59\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f338ebc5fc228d07415015c51f7ed4fcc24d5bf76a644e491b5c4b9dc51b71f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\
\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5205e89649c7c1e920c967873bb1a404548b539605a180b02a1c249ff499e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5205e89649c7c1e920c967873bb1a404548b539605a180b02a1c249ff499e32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:50:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:51:53 crc kubenswrapper[4919]: I0310 21:51:53.597282 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5163ae00-7b50-497d-9770-0d787026b436\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://322af039c736bcf0b853ee5527ebb1b1750484dfab074745abcd75c24fdcccbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bfa2696bd9b6e5d247686e5297b6ae2f49e5b216174391f211cb2a3a4966135\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c23aaef6f076ab2a428323d38fac48e0c55ad52c55a46c942bccad06474fd31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a88714c27822fd18dff500c973b9d548414d59c7666de938e3cb0c6b18e277e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67063544d268ca488af7ae401113e6f35bb48688e50f944cfa03360de376611a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc6d32c621a9826f8539d4c1806a83f694eaa8874384bd615c2fa1cee02d2e98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc6d32c621a9826f8539d4c1806a83f694eaa8874384bd615c2fa1cee02d2e98\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T21:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8aa3a4aef61b7421411f28beecceab476e43c52eb51ac2a257399b01e4c2bcb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8aa3a4aef61b7421411f28beecceab476e43c52eb51ac2a257399b01e4c2bcb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2126875e4bacb36f72de8a3a8bd7c799b89e7691a83fd39d7645a97e95db91b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2126875e4bacb36f72de8a3a8bd7c799b89e7691a83fd39d7645a97e95db91b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:50:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:51:53 crc kubenswrapper[4919]: I0310 21:51:53.617582 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9ed1501-15da-4419-aa12-171e610438d6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9e6a8efa1e2d16b45fe6362b326e3f89333864dc74f3b298d2e500a90d303b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37d8507fd02b92972ed41aa2c4d53fceb1c9d58864e46ddc7991f94fb4d9b3e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://dfb03c5f450790952fc7173bc2a6d723c777921f5f74963bfdbc3573ec1d21cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b2adcae0bf01d646b97b915a28921ad4151ca62e8bdd174b42b5e3dff4b27db\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b2adcae0bf01d646b97b915a28921ad4151ca62e8bdd174b42b5e3dff4b27db\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T21:51:14Z\\\",\\\"message\\\":\\\"file observer\\\\nW0310 21:51:14.180982 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 21:51:14.181120 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 21:51:14.182146 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-915015507/tls.crt::/tmp/serving-cert-915015507/tls.key\\\\\\\"\\\\nI0310 21:51:14.490188 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 21:51:14.496972 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 21:51:14.497009 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 21:51:14.497047 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 21:51:14.497058 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 21:51:14.505682 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 21:51:14.505718 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 21:51:14.505729 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 21:51:14.505740 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0310 21:51:14.505737 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 21:51:14.505748 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 21:51:14.505777 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 21:51:14.505783 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 21:51:14.508219 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T21:51:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47a772db349df6c0c6fe27be93d19e02d66cfaf9739ee12e89730ece1da11473\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15b81f8be7635a10cf516cd17c9ab3d48b74d3421098124fcc47dbab6691e3c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15b81f8be7635a10cf516cd17c9ab3d48b74d3421098124fcc47dbab6691e3c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:50:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:51:53 crc kubenswrapper[4919]: I0310 21:51:53.635089 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:51:53 crc kubenswrapper[4919]: I0310 21:51:53.650659 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:51:53 crc kubenswrapper[4919]: I0310 21:51:53.659559 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:53 crc kubenswrapper[4919]: I0310 21:51:53.659614 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:53 crc kubenswrapper[4919]: I0310 21:51:53.659634 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:53 crc kubenswrapper[4919]: I0310 21:51:53.659657 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:53 crc kubenswrapper[4919]: I0310 21:51:53.659673 4919 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:53Z","lastTransitionTime":"2026-03-10T21:51:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 21:51:53 crc kubenswrapper[4919]: I0310 21:51:53.762747 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:53 crc kubenswrapper[4919]: I0310 21:51:53.762799 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:53 crc kubenswrapper[4919]: I0310 21:51:53.762817 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:53 crc kubenswrapper[4919]: I0310 21:51:53.762840 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:53 crc kubenswrapper[4919]: I0310 21:51:53.762860 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:53Z","lastTransitionTime":"2026-03-10T21:51:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:51:53 crc kubenswrapper[4919]: I0310 21:51:53.865691 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:53 crc kubenswrapper[4919]: I0310 21:51:53.865753 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:53 crc kubenswrapper[4919]: I0310 21:51:53.865785 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:53 crc kubenswrapper[4919]: I0310 21:51:53.865811 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:53 crc kubenswrapper[4919]: I0310 21:51:53.865831 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:53Z","lastTransitionTime":"2026-03-10T21:51:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:51:53 crc kubenswrapper[4919]: I0310 21:51:53.968334 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:53 crc kubenswrapper[4919]: I0310 21:51:53.968607 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:53 crc kubenswrapper[4919]: I0310 21:51:53.968750 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:53 crc kubenswrapper[4919]: I0310 21:51:53.968844 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:53 crc kubenswrapper[4919]: I0310 21:51:53.968920 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:53Z","lastTransitionTime":"2026-03-10T21:51:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:51:54 crc kubenswrapper[4919]: I0310 21:51:54.071469 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:54 crc kubenswrapper[4919]: I0310 21:51:54.071509 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:54 crc kubenswrapper[4919]: I0310 21:51:54.071519 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:54 crc kubenswrapper[4919]: I0310 21:51:54.071534 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:54 crc kubenswrapper[4919]: I0310 21:51:54.071544 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:54Z","lastTransitionTime":"2026-03-10T21:51:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:51:54 crc kubenswrapper[4919]: I0310 21:51:54.173337 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:54 crc kubenswrapper[4919]: I0310 21:51:54.173417 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:54 crc kubenswrapper[4919]: I0310 21:51:54.173430 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:54 crc kubenswrapper[4919]: I0310 21:51:54.173448 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:54 crc kubenswrapper[4919]: I0310 21:51:54.173461 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:54Z","lastTransitionTime":"2026-03-10T21:51:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:51:54 crc kubenswrapper[4919]: I0310 21:51:54.276970 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:54 crc kubenswrapper[4919]: I0310 21:51:54.277014 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:54 crc kubenswrapper[4919]: I0310 21:51:54.277043 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:54 crc kubenswrapper[4919]: I0310 21:51:54.277060 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:54 crc kubenswrapper[4919]: I0310 21:51:54.277072 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:54Z","lastTransitionTime":"2026-03-10T21:51:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:51:54 crc kubenswrapper[4919]: I0310 21:51:54.341495 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 21:51:54 crc kubenswrapper[4919]: I0310 21:51:54.341720 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 21:51:54 crc kubenswrapper[4919]: I0310 21:51:54.341842 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 21:51:54 crc kubenswrapper[4919]: E0310 21:51:54.341862 4919 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 21:51:54 crc kubenswrapper[4919]: I0310 21:51:54.341882 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 21:51:54 crc kubenswrapper[4919]: E0310 
21:51:54.341893 4919 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 21:51:54 crc kubenswrapper[4919]: E0310 21:51:54.341915 4919 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 21:51:54 crc kubenswrapper[4919]: I0310 21:51:54.341920 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 21:51:54 crc kubenswrapper[4919]: E0310 21:51:54.341983 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-10 21:52:10.341961731 +0000 UTC m=+117.583842369 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 21:51:54 crc kubenswrapper[4919]: E0310 21:51:54.342054 4919 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 21:51:54 crc kubenswrapper[4919]: E0310 21:51:54.342174 4919 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 21:51:54 crc kubenswrapper[4919]: E0310 21:51:54.342182 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 21:52:10.342143026 +0000 UTC m=+117.584023674 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 21:51:54 crc kubenswrapper[4919]: E0310 21:51:54.342233 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 21:52:10.342215238 +0000 UTC m=+117.584095886 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 21:51:54 crc kubenswrapper[4919]: E0310 21:51:54.342377 4919 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 21:51:54 crc kubenswrapper[4919]: E0310 21:51:54.342447 4919 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 21:51:54 crc kubenswrapper[4919]: E0310 21:51:54.342469 4919 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 21:51:54 crc kubenswrapper[4919]: E0310 21:51:54.342540 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-10 21:52:10.342518877 +0000 UTC m=+117.584399585 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 21:51:54 crc kubenswrapper[4919]: E0310 21:51:54.342717 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 21:52:10.342695632 +0000 UTC m=+117.584576270 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 21:51:54 crc kubenswrapper[4919]: I0310 21:51:54.379913 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:54 crc kubenswrapper[4919]: I0310 21:51:54.379999 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:54 crc kubenswrapper[4919]: I0310 21:51:54.380021 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:54 crc kubenswrapper[4919]: I0310 21:51:54.380046 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:54 crc kubenswrapper[4919]: I0310 21:51:54.380064 4919 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:54Z","lastTransitionTime":"2026-03-10T21:51:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 21:51:54 crc kubenswrapper[4919]: I0310 21:51:54.479500 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 21:51:54 crc kubenswrapper[4919]: E0310 21:51:54.479681 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 21:51:54 crc kubenswrapper[4919]: I0310 21:51:54.479500 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 21:51:54 crc kubenswrapper[4919]: I0310 21:51:54.479532 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 21:51:54 crc kubenswrapper[4919]: E0310 21:51:54.479950 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 21:51:54 crc kubenswrapper[4919]: E0310 21:51:54.480076 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 21:51:54 crc kubenswrapper[4919]: I0310 21:51:54.483652 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:54 crc kubenswrapper[4919]: I0310 21:51:54.483753 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:54 crc kubenswrapper[4919]: I0310 21:51:54.483781 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:54 crc kubenswrapper[4919]: I0310 21:51:54.483860 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:54 crc kubenswrapper[4919]: I0310 21:51:54.483886 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:54Z","lastTransitionTime":"2026-03-10T21:51:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:51:54 crc kubenswrapper[4919]: I0310 21:51:54.587775 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:54 crc kubenswrapper[4919]: I0310 21:51:54.587858 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:54 crc kubenswrapper[4919]: I0310 21:51:54.587880 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:54 crc kubenswrapper[4919]: I0310 21:51:54.587911 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:54 crc kubenswrapper[4919]: I0310 21:51:54.587936 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:54Z","lastTransitionTime":"2026-03-10T21:51:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:51:54 crc kubenswrapper[4919]: I0310 21:51:54.692092 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:54 crc kubenswrapper[4919]: I0310 21:51:54.692161 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:54 crc kubenswrapper[4919]: I0310 21:51:54.692185 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:54 crc kubenswrapper[4919]: I0310 21:51:54.692213 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:54 crc kubenswrapper[4919]: I0310 21:51:54.692235 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:54Z","lastTransitionTime":"2026-03-10T21:51:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:51:54 crc kubenswrapper[4919]: I0310 21:51:54.795499 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:54 crc kubenswrapper[4919]: I0310 21:51:54.795588 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:54 crc kubenswrapper[4919]: I0310 21:51:54.795612 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:54 crc kubenswrapper[4919]: I0310 21:51:54.795642 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:54 crc kubenswrapper[4919]: I0310 21:51:54.795663 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:54Z","lastTransitionTime":"2026-03-10T21:51:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:51:54 crc kubenswrapper[4919]: I0310 21:51:54.898511 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:54 crc kubenswrapper[4919]: I0310 21:51:54.898580 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:54 crc kubenswrapper[4919]: I0310 21:51:54.898601 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:54 crc kubenswrapper[4919]: I0310 21:51:54.898629 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:54 crc kubenswrapper[4919]: I0310 21:51:54.898648 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:54Z","lastTransitionTime":"2026-03-10T21:51:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:51:55 crc kubenswrapper[4919]: I0310 21:51:55.001386 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:55 crc kubenswrapper[4919]: I0310 21:51:55.001642 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:55 crc kubenswrapper[4919]: I0310 21:51:55.001764 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:55 crc kubenswrapper[4919]: I0310 21:51:55.001866 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:55 crc kubenswrapper[4919]: I0310 21:51:55.001938 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:55Z","lastTransitionTime":"2026-03-10T21:51:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:51:55 crc kubenswrapper[4919]: I0310 21:51:55.104759 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:55 crc kubenswrapper[4919]: I0310 21:51:55.104822 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:55 crc kubenswrapper[4919]: I0310 21:51:55.104841 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:55 crc kubenswrapper[4919]: I0310 21:51:55.104868 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:55 crc kubenswrapper[4919]: I0310 21:51:55.104887 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:55Z","lastTransitionTime":"2026-03-10T21:51:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:51:55 crc kubenswrapper[4919]: I0310 21:51:55.207450 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:55 crc kubenswrapper[4919]: I0310 21:51:55.208176 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:55 crc kubenswrapper[4919]: I0310 21:51:55.208237 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:55 crc kubenswrapper[4919]: I0310 21:51:55.208278 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:55 crc kubenswrapper[4919]: I0310 21:51:55.208301 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:55Z","lastTransitionTime":"2026-03-10T21:51:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:51:55 crc kubenswrapper[4919]: I0310 21:51:55.312643 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:55 crc kubenswrapper[4919]: I0310 21:51:55.312709 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:55 crc kubenswrapper[4919]: I0310 21:51:55.312728 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:55 crc kubenswrapper[4919]: I0310 21:51:55.312756 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:55 crc kubenswrapper[4919]: I0310 21:51:55.312779 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:55Z","lastTransitionTime":"2026-03-10T21:51:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:51:55 crc kubenswrapper[4919]: I0310 21:51:55.415694 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:55 crc kubenswrapper[4919]: I0310 21:51:55.415771 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:55 crc kubenswrapper[4919]: I0310 21:51:55.415792 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:55 crc kubenswrapper[4919]: I0310 21:51:55.415821 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:55 crc kubenswrapper[4919]: I0310 21:51:55.415843 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:55Z","lastTransitionTime":"2026-03-10T21:51:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:51:55 crc kubenswrapper[4919]: I0310 21:51:55.519230 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:55 crc kubenswrapper[4919]: I0310 21:51:55.519308 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:55 crc kubenswrapper[4919]: I0310 21:51:55.519330 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:55 crc kubenswrapper[4919]: I0310 21:51:55.519359 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:55 crc kubenswrapper[4919]: I0310 21:51:55.519382 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:55Z","lastTransitionTime":"2026-03-10T21:51:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:51:55 crc kubenswrapper[4919]: I0310 21:51:55.622457 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:55 crc kubenswrapper[4919]: I0310 21:51:55.622544 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:55 crc kubenswrapper[4919]: I0310 21:51:55.622561 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:55 crc kubenswrapper[4919]: I0310 21:51:55.622586 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:55 crc kubenswrapper[4919]: I0310 21:51:55.622606 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:55Z","lastTransitionTime":"2026-03-10T21:51:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:51:55 crc kubenswrapper[4919]: I0310 21:51:55.671348 4919 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 10 21:51:55 crc kubenswrapper[4919]: I0310 21:51:55.726000 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:55 crc kubenswrapper[4919]: I0310 21:51:55.726071 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:55 crc kubenswrapper[4919]: I0310 21:51:55.726090 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:55 crc kubenswrapper[4919]: I0310 21:51:55.726116 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:55 crc kubenswrapper[4919]: I0310 21:51:55.726136 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:55Z","lastTransitionTime":"2026-03-10T21:51:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:51:55 crc kubenswrapper[4919]: I0310 21:51:55.830164 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:55 crc kubenswrapper[4919]: I0310 21:51:55.830237 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:55 crc kubenswrapper[4919]: I0310 21:51:55.830257 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:55 crc kubenswrapper[4919]: I0310 21:51:55.830289 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:55 crc kubenswrapper[4919]: I0310 21:51:55.830308 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:55Z","lastTransitionTime":"2026-03-10T21:51:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:51:55 crc kubenswrapper[4919]: I0310 21:51:55.933962 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:55 crc kubenswrapper[4919]: I0310 21:51:55.934027 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:55 crc kubenswrapper[4919]: I0310 21:51:55.934051 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:55 crc kubenswrapper[4919]: I0310 21:51:55.934082 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:55 crc kubenswrapper[4919]: I0310 21:51:55.934104 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:55Z","lastTransitionTime":"2026-03-10T21:51:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:51:56 crc kubenswrapper[4919]: I0310 21:51:56.037927 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:56 crc kubenswrapper[4919]: I0310 21:51:56.037987 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:56 crc kubenswrapper[4919]: I0310 21:51:56.038006 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:56 crc kubenswrapper[4919]: I0310 21:51:56.038045 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:56 crc kubenswrapper[4919]: I0310 21:51:56.038065 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:56Z","lastTransitionTime":"2026-03-10T21:51:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:51:56 crc kubenswrapper[4919]: I0310 21:51:56.142070 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:56 crc kubenswrapper[4919]: I0310 21:51:56.142515 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:56 crc kubenswrapper[4919]: I0310 21:51:56.142533 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:56 crc kubenswrapper[4919]: I0310 21:51:56.142560 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:56 crc kubenswrapper[4919]: I0310 21:51:56.142580 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:56Z","lastTransitionTime":"2026-03-10T21:51:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:51:56 crc kubenswrapper[4919]: I0310 21:51:56.246069 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:56 crc kubenswrapper[4919]: I0310 21:51:56.246129 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:56 crc kubenswrapper[4919]: I0310 21:51:56.246143 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:56 crc kubenswrapper[4919]: I0310 21:51:56.246163 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:56 crc kubenswrapper[4919]: I0310 21:51:56.246175 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:56Z","lastTransitionTime":"2026-03-10T21:51:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:51:56 crc kubenswrapper[4919]: I0310 21:51:56.349625 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:56 crc kubenswrapper[4919]: I0310 21:51:56.349675 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:56 crc kubenswrapper[4919]: I0310 21:51:56.349687 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:56 crc kubenswrapper[4919]: I0310 21:51:56.349704 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:56 crc kubenswrapper[4919]: I0310 21:51:56.349717 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:56Z","lastTransitionTime":"2026-03-10T21:51:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:51:56 crc kubenswrapper[4919]: I0310 21:51:56.452123 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:56 crc kubenswrapper[4919]: I0310 21:51:56.452200 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:56 crc kubenswrapper[4919]: I0310 21:51:56.452225 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:56 crc kubenswrapper[4919]: I0310 21:51:56.452254 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:56 crc kubenswrapper[4919]: I0310 21:51:56.452274 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:56Z","lastTransitionTime":"2026-03-10T21:51:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 21:51:56 crc kubenswrapper[4919]: I0310 21:51:56.479169 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 21:51:56 crc kubenswrapper[4919]: E0310 21:51:56.479317 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 21:51:56 crc kubenswrapper[4919]: I0310 21:51:56.479523 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 21:51:56 crc kubenswrapper[4919]: E0310 21:51:56.479643 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 21:51:56 crc kubenswrapper[4919]: I0310 21:51:56.479702 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 21:51:56 crc kubenswrapper[4919]: E0310 21:51:56.479746 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 21:51:56 crc kubenswrapper[4919]: I0310 21:51:56.556085 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:56 crc kubenswrapper[4919]: I0310 21:51:56.556141 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:56 crc kubenswrapper[4919]: I0310 21:51:56.556155 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:56 crc kubenswrapper[4919]: I0310 21:51:56.556176 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:56 crc kubenswrapper[4919]: I0310 21:51:56.556193 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:56Z","lastTransitionTime":"2026-03-10T21:51:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:51:56 crc kubenswrapper[4919]: I0310 21:51:56.659888 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:56 crc kubenswrapper[4919]: I0310 21:51:56.659968 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:56 crc kubenswrapper[4919]: I0310 21:51:56.659993 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:56 crc kubenswrapper[4919]: I0310 21:51:56.660020 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:56 crc kubenswrapper[4919]: I0310 21:51:56.660037 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:56Z","lastTransitionTime":"2026-03-10T21:51:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:51:56 crc kubenswrapper[4919]: I0310 21:51:56.762545 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:56 crc kubenswrapper[4919]: I0310 21:51:56.762617 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:56 crc kubenswrapper[4919]: I0310 21:51:56.762632 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:56 crc kubenswrapper[4919]: I0310 21:51:56.762653 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:56 crc kubenswrapper[4919]: I0310 21:51:56.762669 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:56Z","lastTransitionTime":"2026-03-10T21:51:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:51:56 crc kubenswrapper[4919]: I0310 21:51:56.865223 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:56 crc kubenswrapper[4919]: I0310 21:51:56.865291 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:56 crc kubenswrapper[4919]: I0310 21:51:56.865310 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:56 crc kubenswrapper[4919]: I0310 21:51:56.865336 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:56 crc kubenswrapper[4919]: I0310 21:51:56.865355 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:56Z","lastTransitionTime":"2026-03-10T21:51:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:51:56 crc kubenswrapper[4919]: I0310 21:51:56.968764 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:56 crc kubenswrapper[4919]: I0310 21:51:56.968834 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:56 crc kubenswrapper[4919]: I0310 21:51:56.968847 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:56 crc kubenswrapper[4919]: I0310 21:51:56.968874 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:56 crc kubenswrapper[4919]: I0310 21:51:56.968888 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:56Z","lastTransitionTime":"2026-03-10T21:51:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:51:57 crc kubenswrapper[4919]: I0310 21:51:57.073736 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:57 crc kubenswrapper[4919]: I0310 21:51:57.073824 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:57 crc kubenswrapper[4919]: I0310 21:51:57.073861 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:57 crc kubenswrapper[4919]: I0310 21:51:57.073900 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:57 crc kubenswrapper[4919]: I0310 21:51:57.073925 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:57Z","lastTransitionTime":"2026-03-10T21:51:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:51:57 crc kubenswrapper[4919]: I0310 21:51:57.177600 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:57 crc kubenswrapper[4919]: I0310 21:51:57.177690 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:57 crc kubenswrapper[4919]: I0310 21:51:57.177711 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:57 crc kubenswrapper[4919]: I0310 21:51:57.177748 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:57 crc kubenswrapper[4919]: I0310 21:51:57.177769 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:57Z","lastTransitionTime":"2026-03-10T21:51:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:51:57 crc kubenswrapper[4919]: I0310 21:51:57.280955 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:57 crc kubenswrapper[4919]: I0310 21:51:57.280996 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:57 crc kubenswrapper[4919]: I0310 21:51:57.281015 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:57 crc kubenswrapper[4919]: I0310 21:51:57.281046 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:57 crc kubenswrapper[4919]: I0310 21:51:57.281084 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:57Z","lastTransitionTime":"2026-03-10T21:51:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:51:57 crc kubenswrapper[4919]: I0310 21:51:57.384001 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:57 crc kubenswrapper[4919]: I0310 21:51:57.384066 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:57 crc kubenswrapper[4919]: I0310 21:51:57.384084 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:57 crc kubenswrapper[4919]: I0310 21:51:57.384117 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:57 crc kubenswrapper[4919]: I0310 21:51:57.384134 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:57Z","lastTransitionTime":"2026-03-10T21:51:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:51:57 crc kubenswrapper[4919]: I0310 21:51:57.486142 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:57 crc kubenswrapper[4919]: I0310 21:51:57.486197 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:57 crc kubenswrapper[4919]: I0310 21:51:57.486214 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:57 crc kubenswrapper[4919]: I0310 21:51:57.486238 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:57 crc kubenswrapper[4919]: I0310 21:51:57.486256 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:57Z","lastTransitionTime":"2026-03-10T21:51:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:51:57 crc kubenswrapper[4919]: I0310 21:51:57.588907 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:57 crc kubenswrapper[4919]: I0310 21:51:57.588954 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:57 crc kubenswrapper[4919]: I0310 21:51:57.588970 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:57 crc kubenswrapper[4919]: I0310 21:51:57.588996 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:57 crc kubenswrapper[4919]: I0310 21:51:57.589014 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:57Z","lastTransitionTime":"2026-03-10T21:51:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:51:57 crc kubenswrapper[4919]: I0310 21:51:57.692137 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:57 crc kubenswrapper[4919]: I0310 21:51:57.692567 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:57 crc kubenswrapper[4919]: I0310 21:51:57.692752 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:57 crc kubenswrapper[4919]: I0310 21:51:57.692903 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:57 crc kubenswrapper[4919]: I0310 21:51:57.693047 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:57Z","lastTransitionTime":"2026-03-10T21:51:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:51:57 crc kubenswrapper[4919]: I0310 21:51:57.796163 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:57 crc kubenswrapper[4919]: I0310 21:51:57.796564 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:57 crc kubenswrapper[4919]: I0310 21:51:57.796777 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:57 crc kubenswrapper[4919]: I0310 21:51:57.796994 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:57 crc kubenswrapper[4919]: I0310 21:51:57.797197 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:57Z","lastTransitionTime":"2026-03-10T21:51:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:51:57 crc kubenswrapper[4919]: I0310 21:51:57.900836 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:57 crc kubenswrapper[4919]: I0310 21:51:57.900910 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:57 crc kubenswrapper[4919]: I0310 21:51:57.900928 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:57 crc kubenswrapper[4919]: I0310 21:51:57.900957 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:57 crc kubenswrapper[4919]: I0310 21:51:57.900976 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:57Z","lastTransitionTime":"2026-03-10T21:51:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:51:58 crc kubenswrapper[4919]: I0310 21:51:58.004266 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:58 crc kubenswrapper[4919]: I0310 21:51:58.004318 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:58 crc kubenswrapper[4919]: I0310 21:51:58.004337 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:58 crc kubenswrapper[4919]: I0310 21:51:58.004361 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:58 crc kubenswrapper[4919]: I0310 21:51:58.004378 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:58Z","lastTransitionTime":"2026-03-10T21:51:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:51:58 crc kubenswrapper[4919]: I0310 21:51:58.112218 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:58 crc kubenswrapper[4919]: I0310 21:51:58.112298 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:58 crc kubenswrapper[4919]: I0310 21:51:58.112321 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:58 crc kubenswrapper[4919]: I0310 21:51:58.112346 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:58 crc kubenswrapper[4919]: I0310 21:51:58.112366 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:58Z","lastTransitionTime":"2026-03-10T21:51:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:51:58 crc kubenswrapper[4919]: I0310 21:51:58.215109 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:58 crc kubenswrapper[4919]: I0310 21:51:58.215182 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:58 crc kubenswrapper[4919]: I0310 21:51:58.215199 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:58 crc kubenswrapper[4919]: I0310 21:51:58.215222 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:58 crc kubenswrapper[4919]: I0310 21:51:58.215239 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:58Z","lastTransitionTime":"2026-03-10T21:51:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:51:58 crc kubenswrapper[4919]: I0310 21:51:58.318170 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:58 crc kubenswrapper[4919]: I0310 21:51:58.318251 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:58 crc kubenswrapper[4919]: I0310 21:51:58.318274 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:58 crc kubenswrapper[4919]: I0310 21:51:58.318306 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:58 crc kubenswrapper[4919]: I0310 21:51:58.318330 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:58Z","lastTransitionTime":"2026-03-10T21:51:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:51:58 crc kubenswrapper[4919]: I0310 21:51:58.421649 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:58 crc kubenswrapper[4919]: I0310 21:51:58.421696 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:58 crc kubenswrapper[4919]: I0310 21:51:58.421708 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:58 crc kubenswrapper[4919]: I0310 21:51:58.421728 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:58 crc kubenswrapper[4919]: I0310 21:51:58.421744 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:58Z","lastTransitionTime":"2026-03-10T21:51:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 21:51:58 crc kubenswrapper[4919]: I0310 21:51:58.444910 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-hzq7c"] Mar 10 21:51:58 crc kubenswrapper[4919]: I0310 21:51:58.445365 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-hzq7c" Mar 10 21:51:58 crc kubenswrapper[4919]: I0310 21:51:58.448097 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 10 21:51:58 crc kubenswrapper[4919]: I0310 21:51:58.449705 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 10 21:51:58 crc kubenswrapper[4919]: I0310 21:51:58.450153 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 10 21:51:58 crc kubenswrapper[4919]: I0310 21:51:58.463901 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:51:58 crc kubenswrapper[4919]: I0310 21:51:58.480034 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 21:51:58 crc kubenswrapper[4919]: I0310 21:51:58.480028 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:51:58 crc kubenswrapper[4919]: E0310 21:51:58.480175 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 21:51:58 crc kubenswrapper[4919]: I0310 21:51:58.480272 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 21:51:58 crc kubenswrapper[4919]: E0310 21:51:58.480359 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 21:51:58 crc kubenswrapper[4919]: I0310 21:51:58.480917 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 21:51:58 crc kubenswrapper[4919]: E0310 21:51:58.481327 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 21:51:58 crc kubenswrapper[4919]: I0310 21:51:58.482907 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qw7c6\" (UniqueName: \"kubernetes.io/projected/e92ce303-b70d-4416-b8f1-520b49dca2e6-kube-api-access-qw7c6\") pod \"node-resolver-hzq7c\" (UID: \"e92ce303-b70d-4416-b8f1-520b49dca2e6\") " pod="openshift-dns/node-resolver-hzq7c" Mar 10 21:51:58 crc kubenswrapper[4919]: I0310 21:51:58.482955 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e92ce303-b70d-4416-b8f1-520b49dca2e6-hosts-file\") pod \"node-resolver-hzq7c\" (UID: \"e92ce303-b70d-4416-b8f1-520b49dca2e6\") " pod="openshift-dns/node-resolver-hzq7c" Mar 10 21:51:58 crc kubenswrapper[4919]: I0310 21:51:58.497427 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:51:58 crc kubenswrapper[4919]: I0310 21:51:58.513134 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:51:58 crc kubenswrapper[4919]: I0310 21:51:58.524961 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:58 crc kubenswrapper[4919]: I0310 21:51:58.525014 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:58 crc kubenswrapper[4919]: I0310 21:51:58.525034 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:58 crc kubenswrapper[4919]: I0310 21:51:58.525057 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:58 crc kubenswrapper[4919]: I0310 21:51:58.525075 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:58Z","lastTransitionTime":"2026-03-10T21:51:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 21:51:58 crc kubenswrapper[4919]: I0310 21:51:58.525912 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed0693f5-4dbc-4621-9cf6-450d64aaea59\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f338ebc5fc228d07415015c51f7ed4fcc24d5bf76a644e491b5c4b9dc51b71f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\
\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5205e89649c7c1e920c967873bb1a404548b539605a180b02a1c249ff499e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5205e89649c7c1e920c967873bb1a404548b539605a180b02a1c249ff499e32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:50:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:51:58 crc kubenswrapper[4919]: I0310 21:51:58.557022 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5163ae00-7b50-497d-9770-0d787026b436\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://322af039c736bcf0b853ee5527ebb1b1750484dfab074745abcd75c24fdcccbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bfa2696bd9b6e5d247686e5297b6ae2f49e5b216174391f211cb2a3a4966135\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c23aaef6f076ab2a428323d38fac48e0c55ad52c55a46c942bccad06474fd31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a88714c27822fd18dff500c973b9d548414d59c7666de938e3cb0c6b18e277e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67063544d268ca488af7ae401113e6f35bb48688e50f944cfa03360de376611a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc6d32c621a9826f8539d4c1806a83f694eaa8874384bd615c2fa1cee02d2e98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc6d32c621a9826f8539d4c1806a83f694eaa8874384bd615c2fa1cee02d2e98\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T21:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8aa3a4aef61b7421411f28beecceab476e43c52eb51ac2a257399b01e4c2bcb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8aa3a4aef61b7421411f28beecceab476e43c52eb51ac2a257399b01e4c2bcb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2126875e4bacb36f72de8a3a8bd7c799b89e7691a83fd39d7645a97e95db91b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2126875e4bacb36f72de8a3a8bd7c799b89e7691a83fd39d7645a97e95db91b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:50:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:51:58 crc kubenswrapper[4919]: I0310 21:51:58.576746 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9ed1501-15da-4419-aa12-171e610438d6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9e6a8efa1e2d16b45fe6362b326e3f89333864dc74f3b298d2e500a90d303b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37d8507fd02b92972ed41aa2c4d53fceb1c9d58864e46ddc7991f94fb4d9b3e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://dfb03c5f450790952fc7173bc2a6d723c777921f5f74963bfdbc3573ec1d21cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b2adcae0bf01d646b97b915a28921ad4151ca62e8bdd174b42b5e3dff4b27db\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b2adcae0bf01d646b97b915a28921ad4151ca62e8bdd174b42b5e3dff4b27db\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T21:51:14Z\\\",\\\"message\\\":\\\"file observer\\\\nW0310 21:51:14.180982 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 21:51:14.181120 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 21:51:14.182146 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-915015507/tls.crt::/tmp/serving-cert-915015507/tls.key\\\\\\\"\\\\nI0310 21:51:14.490188 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 21:51:14.496972 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 21:51:14.497009 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 21:51:14.497047 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 21:51:14.497058 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 21:51:14.505682 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 21:51:14.505718 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 21:51:14.505729 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 21:51:14.505740 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0310 21:51:14.505737 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 21:51:14.505748 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 21:51:14.505777 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 21:51:14.505783 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 21:51:14.508219 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T21:51:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47a772db349df6c0c6fe27be93d19e02d66cfaf9739ee12e89730ece1da11473\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15b81f8be7635a10cf516cd17c9ab3d48b74d3421098124fcc47dbab6691e3c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15b81f8be7635a10cf516cd17c9ab3d48b74d3421098124fcc47dbab6691e3c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:50:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:51:58 crc kubenswrapper[4919]: I0310 21:51:58.583655 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qw7c6\" (UniqueName: \"kubernetes.io/projected/e92ce303-b70d-4416-b8f1-520b49dca2e6-kube-api-access-qw7c6\") pod \"node-resolver-hzq7c\" (UID: \"e92ce303-b70d-4416-b8f1-520b49dca2e6\") " pod="openshift-dns/node-resolver-hzq7c" Mar 10 21:51:58 crc kubenswrapper[4919]: I0310 21:51:58.583726 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e92ce303-b70d-4416-b8f1-520b49dca2e6-hosts-file\") pod \"node-resolver-hzq7c\" (UID: \"e92ce303-b70d-4416-b8f1-520b49dca2e6\") " pod="openshift-dns/node-resolver-hzq7c" Mar 10 21:51:58 crc kubenswrapper[4919]: I0310 21:51:58.583926 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e92ce303-b70d-4416-b8f1-520b49dca2e6-hosts-file\") pod \"node-resolver-hzq7c\" (UID: \"e92ce303-b70d-4416-b8f1-520b49dca2e6\") " pod="openshift-dns/node-resolver-hzq7c" Mar 10 21:51:58 crc kubenswrapper[4919]: I0310 21:51:58.589249 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hzq7c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e92ce303-b70d-4416-b8f1-520b49dca2e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qw7c6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hzq7c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:51:58 crc kubenswrapper[4919]: I0310 21:51:58.608829 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:51:58 crc kubenswrapper[4919]: I0310 21:51:58.619689 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qw7c6\" (UniqueName: \"kubernetes.io/projected/e92ce303-b70d-4416-b8f1-520b49dca2e6-kube-api-access-qw7c6\") pod \"node-resolver-hzq7c\" (UID: \"e92ce303-b70d-4416-b8f1-520b49dca2e6\") " pod="openshift-dns/node-resolver-hzq7c" Mar 10 21:51:58 crc kubenswrapper[4919]: I0310 21:51:58.625901 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:51:58 crc kubenswrapper[4919]: I0310 21:51:58.627668 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:58 crc kubenswrapper[4919]: I0310 21:51:58.627861 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:58 crc kubenswrapper[4919]: I0310 21:51:58.627992 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:58 crc kubenswrapper[4919]: I0310 21:51:58.628116 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:58 crc kubenswrapper[4919]: I0310 21:51:58.628239 4919 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:58Z","lastTransitionTime":"2026-03-10T21:51:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 21:51:58 crc kubenswrapper[4919]: I0310 21:51:58.731603 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:58 crc kubenswrapper[4919]: I0310 21:51:58.731654 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:58 crc kubenswrapper[4919]: I0310 21:51:58.731672 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:58 crc kubenswrapper[4919]: I0310 21:51:58.731699 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:58 crc kubenswrapper[4919]: I0310 21:51:58.731716 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:58Z","lastTransitionTime":"2026-03-10T21:51:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 21:51:58 crc kubenswrapper[4919]: I0310 21:51:58.765223 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-hzq7c" Mar 10 21:51:58 crc kubenswrapper[4919]: I0310 21:51:58.842144 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:58 crc kubenswrapper[4919]: I0310 21:51:58.842220 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:58 crc kubenswrapper[4919]: I0310 21:51:58.842239 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:58 crc kubenswrapper[4919]: I0310 21:51:58.842265 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:58 crc kubenswrapper[4919]: I0310 21:51:58.842282 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:58Z","lastTransitionTime":"2026-03-10T21:51:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 21:51:58 crc kubenswrapper[4919]: I0310 21:51:58.843111 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-z7v4t"] Mar 10 21:51:58 crc kubenswrapper[4919]: I0310 21:51:58.843477 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-z6pc7"] Mar 10 21:51:58 crc kubenswrapper[4919]: I0310 21:51:58.843837 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" Mar 10 21:51:58 crc kubenswrapper[4919]: I0310 21:51:58.844571 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-hbw8v"] Mar 10 21:51:58 crc kubenswrapper[4919]: I0310 21:51:58.845271 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-hbw8v" Mar 10 21:51:58 crc kubenswrapper[4919]: I0310 21:51:58.844951 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-z6pc7" Mar 10 21:51:58 crc kubenswrapper[4919]: I0310 21:51:58.846998 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 10 21:51:58 crc kubenswrapper[4919]: I0310 21:51:58.848083 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 10 21:51:58 crc kubenswrapper[4919]: I0310 21:51:58.849131 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 10 21:51:58 crc kubenswrapper[4919]: I0310 21:51:58.849259 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 10 21:51:58 crc kubenswrapper[4919]: I0310 21:51:58.849315 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 10 21:51:58 crc kubenswrapper[4919]: I0310 21:51:58.850435 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 10 21:51:58 crc kubenswrapper[4919]: I0310 21:51:58.851355 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 10 21:51:58 crc kubenswrapper[4919]: I0310 21:51:58.851687 4919 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 10 21:51:58 crc kubenswrapper[4919]: I0310 21:51:58.851719 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 10 21:51:58 crc kubenswrapper[4919]: I0310 21:51:58.851813 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 10 21:51:58 crc kubenswrapper[4919]: I0310 21:51:58.851980 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 10 21:51:58 crc kubenswrapper[4919]: I0310 21:51:58.853057 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 10 21:51:58 crc kubenswrapper[4919]: I0310 21:51:58.866959 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:51:58 crc kubenswrapper[4919]: I0310 21:51:58.875466 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-hzq7c" event={"ID":"e92ce303-b70d-4416-b8f1-520b49dca2e6","Type":"ContainerStarted","Data":"3e2ccbc10c9983130597f44f500f75c4fc00f1cc384bc3493cf6b1579bf0a4b5"} Mar 10 21:51:58 crc kubenswrapper[4919]: I0310 21:51:58.879303 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"566678d1-f416-4116-ab20-b30dceb86cdc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrhvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrhvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-z7v4t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:51:58 crc kubenswrapper[4919]: I0310 21:51:58.885996 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6a5db7c3-2a96-4030-8c88-5d82d325b62d-multus-cni-dir\") pod \"multus-hbw8v\" (UID: \"6a5db7c3-2a96-4030-8c88-5d82d325b62d\") " pod="openshift-multus/multus-hbw8v" Mar 10 21:51:58 crc kubenswrapper[4919]: I0310 21:51:58.886036 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/6a5db7c3-2a96-4030-8c88-5d82d325b62d-hostroot\") pod \"multus-hbw8v\" (UID: \"6a5db7c3-2a96-4030-8c88-5d82d325b62d\") " pod="openshift-multus/multus-hbw8v" Mar 10 21:51:58 crc kubenswrapper[4919]: I0310 21:51:58.886068 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2ace27ab-c4c7-412b-9ae8-a3e4ff15faec-os-release\") pod \"multus-additional-cni-plugins-z6pc7\" (UID: \"2ace27ab-c4c7-412b-9ae8-a3e4ff15faec\") " pod="openshift-multus/multus-additional-cni-plugins-z6pc7" Mar 10 21:51:58 crc kubenswrapper[4919]: I0310 21:51:58.886087 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6a5db7c3-2a96-4030-8c88-5d82d325b62d-os-release\") pod \"multus-hbw8v\" (UID: \"6a5db7c3-2a96-4030-8c88-5d82d325b62d\") " pod="openshift-multus/multus-hbw8v" Mar 10 21:51:58 crc kubenswrapper[4919]: I0310 21:51:58.886106 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6a5db7c3-2a96-4030-8c88-5d82d325b62d-host-var-lib-kubelet\") pod \"multus-hbw8v\" (UID: 
\"6a5db7c3-2a96-4030-8c88-5d82d325b62d\") " pod="openshift-multus/multus-hbw8v" Mar 10 21:51:58 crc kubenswrapper[4919]: I0310 21:51:58.886124 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/6a5db7c3-2a96-4030-8c88-5d82d325b62d-multus-daemon-config\") pod \"multus-hbw8v\" (UID: \"6a5db7c3-2a96-4030-8c88-5d82d325b62d\") " pod="openshift-multus/multus-hbw8v" Mar 10 21:51:58 crc kubenswrapper[4919]: I0310 21:51:58.886156 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/566678d1-f416-4116-ab20-b30dceb86cdc-mcd-auth-proxy-config\") pod \"machine-config-daemon-z7v4t\" (UID: \"566678d1-f416-4116-ab20-b30dceb86cdc\") " pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" Mar 10 21:51:58 crc kubenswrapper[4919]: I0310 21:51:58.886177 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2ace27ab-c4c7-412b-9ae8-a3e4ff15faec-system-cni-dir\") pod \"multus-additional-cni-plugins-z6pc7\" (UID: \"2ace27ab-c4c7-412b-9ae8-a3e4ff15faec\") " pod="openshift-multus/multus-additional-cni-plugins-z6pc7" Mar 10 21:51:58 crc kubenswrapper[4919]: I0310 21:51:58.886200 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2ace27ab-c4c7-412b-9ae8-a3e4ff15faec-tuning-conf-dir\") pod \"multus-additional-cni-plugins-z6pc7\" (UID: \"2ace27ab-c4c7-412b-9ae8-a3e4ff15faec\") " pod="openshift-multus/multus-additional-cni-plugins-z6pc7" Mar 10 21:51:58 crc kubenswrapper[4919]: I0310 21:51:58.886219 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: 
\"kubernetes.io/host-path/6a5db7c3-2a96-4030-8c88-5d82d325b62d-host-var-lib-cni-multus\") pod \"multus-hbw8v\" (UID: \"6a5db7c3-2a96-4030-8c88-5d82d325b62d\") " pod="openshift-multus/multus-hbw8v" Mar 10 21:51:58 crc kubenswrapper[4919]: I0310 21:51:58.886245 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/2ace27ab-c4c7-412b-9ae8-a3e4ff15faec-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-z6pc7\" (UID: \"2ace27ab-c4c7-412b-9ae8-a3e4ff15faec\") " pod="openshift-multus/multus-additional-cni-plugins-z6pc7" Mar 10 21:51:58 crc kubenswrapper[4919]: I0310 21:51:58.886265 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6a5db7c3-2a96-4030-8c88-5d82d325b62d-etc-kubernetes\") pod \"multus-hbw8v\" (UID: \"6a5db7c3-2a96-4030-8c88-5d82d325b62d\") " pod="openshift-multus/multus-hbw8v" Mar 10 21:51:58 crc kubenswrapper[4919]: I0310 21:51:58.886287 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hrhvw\" (UniqueName: \"kubernetes.io/projected/566678d1-f416-4116-ab20-b30dceb86cdc-kube-api-access-hrhvw\") pod \"machine-config-daemon-z7v4t\" (UID: \"566678d1-f416-4116-ab20-b30dceb86cdc\") " pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" Mar 10 21:51:58 crc kubenswrapper[4919]: I0310 21:51:58.886310 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/566678d1-f416-4116-ab20-b30dceb86cdc-rootfs\") pod \"machine-config-daemon-z7v4t\" (UID: \"566678d1-f416-4116-ab20-b30dceb86cdc\") " pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" Mar 10 21:51:58 crc kubenswrapper[4919]: I0310 21:51:58.886330 4919 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5jdcd\" (UniqueName: \"kubernetes.io/projected/2ace27ab-c4c7-412b-9ae8-a3e4ff15faec-kube-api-access-5jdcd\") pod \"multus-additional-cni-plugins-z6pc7\" (UID: \"2ace27ab-c4c7-412b-9ae8-a3e4ff15faec\") " pod="openshift-multus/multus-additional-cni-plugins-z6pc7" Mar 10 21:51:58 crc kubenswrapper[4919]: I0310 21:51:58.886371 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6a5db7c3-2a96-4030-8c88-5d82d325b62d-system-cni-dir\") pod \"multus-hbw8v\" (UID: \"6a5db7c3-2a96-4030-8c88-5d82d325b62d\") " pod="openshift-multus/multus-hbw8v" Mar 10 21:51:58 crc kubenswrapper[4919]: I0310 21:51:58.886416 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6a5db7c3-2a96-4030-8c88-5d82d325b62d-host-run-netns\") pod \"multus-hbw8v\" (UID: \"6a5db7c3-2a96-4030-8c88-5d82d325b62d\") " pod="openshift-multus/multus-hbw8v" Mar 10 21:51:58 crc kubenswrapper[4919]: I0310 21:51:58.886438 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/6a5db7c3-2a96-4030-8c88-5d82d325b62d-host-run-multus-certs\") pod \"multus-hbw8v\" (UID: \"6a5db7c3-2a96-4030-8c88-5d82d325b62d\") " pod="openshift-multus/multus-hbw8v" Mar 10 21:51:58 crc kubenswrapper[4919]: I0310 21:51:58.886458 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/6a5db7c3-2a96-4030-8c88-5d82d325b62d-host-run-k8s-cni-cncf-io\") pod \"multus-hbw8v\" (UID: \"6a5db7c3-2a96-4030-8c88-5d82d325b62d\") " pod="openshift-multus/multus-hbw8v" Mar 10 21:51:58 crc kubenswrapper[4919]: I0310 21:51:58.886475 4919 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwtj4\" (UniqueName: \"kubernetes.io/projected/6a5db7c3-2a96-4030-8c88-5d82d325b62d-kube-api-access-dwtj4\") pod \"multus-hbw8v\" (UID: \"6a5db7c3-2a96-4030-8c88-5d82d325b62d\") " pod="openshift-multus/multus-hbw8v" Mar 10 21:51:58 crc kubenswrapper[4919]: I0310 21:51:58.886502 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/566678d1-f416-4116-ab20-b30dceb86cdc-proxy-tls\") pod \"machine-config-daemon-z7v4t\" (UID: \"566678d1-f416-4116-ab20-b30dceb86cdc\") " pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" Mar 10 21:51:58 crc kubenswrapper[4919]: I0310 21:51:58.886519 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6a5db7c3-2a96-4030-8c88-5d82d325b62d-cnibin\") pod \"multus-hbw8v\" (UID: \"6a5db7c3-2a96-4030-8c88-5d82d325b62d\") " pod="openshift-multus/multus-hbw8v" Mar 10 21:51:58 crc kubenswrapper[4919]: I0310 21:51:58.886541 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6a5db7c3-2a96-4030-8c88-5d82d325b62d-host-var-lib-cni-bin\") pod \"multus-hbw8v\" (UID: \"6a5db7c3-2a96-4030-8c88-5d82d325b62d\") " pod="openshift-multus/multus-hbw8v" Mar 10 21:51:58 crc kubenswrapper[4919]: I0310 21:51:58.886561 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2ace27ab-c4c7-412b-9ae8-a3e4ff15faec-cni-binary-copy\") pod \"multus-additional-cni-plugins-z6pc7\" (UID: \"2ace27ab-c4c7-412b-9ae8-a3e4ff15faec\") " pod="openshift-multus/multus-additional-cni-plugins-z6pc7" Mar 10 21:51:58 crc kubenswrapper[4919]: I0310 
21:51:58.886581 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6a5db7c3-2a96-4030-8c88-5d82d325b62d-cni-binary-copy\") pod \"multus-hbw8v\" (UID: \"6a5db7c3-2a96-4030-8c88-5d82d325b62d\") " pod="openshift-multus/multus-hbw8v" Mar 10 21:51:58 crc kubenswrapper[4919]: I0310 21:51:58.886599 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/6a5db7c3-2a96-4030-8c88-5d82d325b62d-multus-socket-dir-parent\") pod \"multus-hbw8v\" (UID: \"6a5db7c3-2a96-4030-8c88-5d82d325b62d\") " pod="openshift-multus/multus-hbw8v" Mar 10 21:51:58 crc kubenswrapper[4919]: I0310 21:51:58.886617 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6a5db7c3-2a96-4030-8c88-5d82d325b62d-multus-conf-dir\") pod \"multus-hbw8v\" (UID: \"6a5db7c3-2a96-4030-8c88-5d82d325b62d\") " pod="openshift-multus/multus-hbw8v" Mar 10 21:51:58 crc kubenswrapper[4919]: I0310 21:51:58.886636 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2ace27ab-c4c7-412b-9ae8-a3e4ff15faec-cnibin\") pod \"multus-additional-cni-plugins-z6pc7\" (UID: \"2ace27ab-c4c7-412b-9ae8-a3e4ff15faec\") " pod="openshift-multus/multus-additional-cni-plugins-z6pc7" Mar 10 21:51:58 crc kubenswrapper[4919]: I0310 21:51:58.896527 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:51:58 crc kubenswrapper[4919]: I0310 21:51:58.911919 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9ed1501-15da-4419-aa12-171e610438d6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9e6a8efa1e2d16b45fe6362b326e3f89333864dc74f3b298d2e500a90d303b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37d8507fd02b92972ed41aa2c4d53fceb1c9d58864e46ddc7991f94fb4d9b3e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfb03c5f450790952fc7173bc2a6d723c777921f5f74963bfdbc3573ec1d21cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b2adcae0bf01d646b97b915a28921ad4151ca62e8bdd174b42b5e3dff4b27db\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b2adcae0bf01d646b97b915a28921ad4151ca62e8bdd174b42b5e3dff4b27db\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T21:51:14Z\\\",\\\"message\\\":\\\"file observer\\\\nW0310 21:51:14.180982 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 21:51:14.181120 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 21:51:14.182146 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-915015507/tls.crt::/tmp/serving-cert-915015507/tls.key\\\\\\\"\\\\nI0310 21:51:14.490188 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 21:51:14.496972 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 21:51:14.497009 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 21:51:14.497047 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 21:51:14.497058 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 21:51:14.505682 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 21:51:14.505718 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 21:51:14.505729 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 21:51:14.505740 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0310 21:51:14.505737 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 21:51:14.505748 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 21:51:14.505777 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 21:51:14.505783 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 21:51:14.508219 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T21:51:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47a772db349df6c0c6fe27be93d19e02d66cfaf9739ee12e89730ece1da11473\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15b81f8be7635a10cf516cd17c9ab3d48b74d3421098124fcc47dbab6691e3c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15b
81f8be7635a10cf516cd17c9ab3d48b74d3421098124fcc47dbab6691e3c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:50:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:51:58 crc kubenswrapper[4919]: I0310 21:51:58.929726 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:51:58 crc kubenswrapper[4919]: I0310 21:51:58.941625 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:51:58 crc kubenswrapper[4919]: I0310 21:51:58.944617 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:58 crc kubenswrapper[4919]: I0310 21:51:58.944661 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:58 crc kubenswrapper[4919]: I0310 21:51:58.944678 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:58 crc kubenswrapper[4919]: I0310 21:51:58.944703 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:58 crc kubenswrapper[4919]: I0310 21:51:58.944722 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:58Z","lastTransitionTime":"2026-03-10T21:51:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:51:58 crc kubenswrapper[4919]: I0310 21:51:58.951045 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:51:58 crc kubenswrapper[4919]: I0310 21:51:58.991831 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:51:58 crc kubenswrapper[4919]: I0310 21:51:58.992807 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/2ace27ab-c4c7-412b-9ae8-a3e4ff15faec-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-z6pc7\" 
(UID: \"2ace27ab-c4c7-412b-9ae8-a3e4ff15faec\") " pod="openshift-multus/multus-additional-cni-plugins-z6pc7" Mar 10 21:51:58 crc kubenswrapper[4919]: I0310 21:51:58.992851 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6a5db7c3-2a96-4030-8c88-5d82d325b62d-etc-kubernetes\") pod \"multus-hbw8v\" (UID: \"6a5db7c3-2a96-4030-8c88-5d82d325b62d\") " pod="openshift-multus/multus-hbw8v" Mar 10 21:51:58 crc kubenswrapper[4919]: I0310 21:51:58.992877 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hrhvw\" (UniqueName: \"kubernetes.io/projected/566678d1-f416-4116-ab20-b30dceb86cdc-kube-api-access-hrhvw\") pod \"machine-config-daemon-z7v4t\" (UID: \"566678d1-f416-4116-ab20-b30dceb86cdc\") " pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" Mar 10 21:51:58 crc kubenswrapper[4919]: I0310 21:51:58.992900 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/566678d1-f416-4116-ab20-b30dceb86cdc-rootfs\") pod \"machine-config-daemon-z7v4t\" (UID: \"566678d1-f416-4116-ab20-b30dceb86cdc\") " pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" Mar 10 21:51:58 crc kubenswrapper[4919]: I0310 21:51:58.992922 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5jdcd\" (UniqueName: \"kubernetes.io/projected/2ace27ab-c4c7-412b-9ae8-a3e4ff15faec-kube-api-access-5jdcd\") pod \"multus-additional-cni-plugins-z6pc7\" (UID: \"2ace27ab-c4c7-412b-9ae8-a3e4ff15faec\") " pod="openshift-multus/multus-additional-cni-plugins-z6pc7" Mar 10 21:51:58 crc kubenswrapper[4919]: I0310 21:51:58.992942 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6a5db7c3-2a96-4030-8c88-5d82d325b62d-system-cni-dir\") pod \"multus-hbw8v\" 
(UID: \"6a5db7c3-2a96-4030-8c88-5d82d325b62d\") " pod="openshift-multus/multus-hbw8v" Mar 10 21:51:58 crc kubenswrapper[4919]: I0310 21:51:58.992960 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6a5db7c3-2a96-4030-8c88-5d82d325b62d-host-run-netns\") pod \"multus-hbw8v\" (UID: \"6a5db7c3-2a96-4030-8c88-5d82d325b62d\") " pod="openshift-multus/multus-hbw8v" Mar 10 21:51:58 crc kubenswrapper[4919]: I0310 21:51:58.992982 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/6a5db7c3-2a96-4030-8c88-5d82d325b62d-host-run-multus-certs\") pod \"multus-hbw8v\" (UID: \"6a5db7c3-2a96-4030-8c88-5d82d325b62d\") " pod="openshift-multus/multus-hbw8v" Mar 10 21:51:58 crc kubenswrapper[4919]: I0310 21:51:58.993001 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/6a5db7c3-2a96-4030-8c88-5d82d325b62d-host-run-k8s-cni-cncf-io\") pod \"multus-hbw8v\" (UID: \"6a5db7c3-2a96-4030-8c88-5d82d325b62d\") " pod="openshift-multus/multus-hbw8v" Mar 10 21:51:58 crc kubenswrapper[4919]: I0310 21:51:58.993020 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dwtj4\" (UniqueName: \"kubernetes.io/projected/6a5db7c3-2a96-4030-8c88-5d82d325b62d-kube-api-access-dwtj4\") pod \"multus-hbw8v\" (UID: \"6a5db7c3-2a96-4030-8c88-5d82d325b62d\") " pod="openshift-multus/multus-hbw8v" Mar 10 21:51:58 crc kubenswrapper[4919]: I0310 21:51:58.993047 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/566678d1-f416-4116-ab20-b30dceb86cdc-proxy-tls\") pod \"machine-config-daemon-z7v4t\" (UID: \"566678d1-f416-4116-ab20-b30dceb86cdc\") " pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" Mar 10 
21:51:58 crc kubenswrapper[4919]: I0310 21:51:58.993065 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6a5db7c3-2a96-4030-8c88-5d82d325b62d-cnibin\") pod \"multus-hbw8v\" (UID: \"6a5db7c3-2a96-4030-8c88-5d82d325b62d\") " pod="openshift-multus/multus-hbw8v" Mar 10 21:51:58 crc kubenswrapper[4919]: I0310 21:51:58.993085 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6a5db7c3-2a96-4030-8c88-5d82d325b62d-host-var-lib-cni-bin\") pod \"multus-hbw8v\" (UID: \"6a5db7c3-2a96-4030-8c88-5d82d325b62d\") " pod="openshift-multus/multus-hbw8v" Mar 10 21:51:58 crc kubenswrapper[4919]: I0310 21:51:58.993104 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2ace27ab-c4c7-412b-9ae8-a3e4ff15faec-cni-binary-copy\") pod \"multus-additional-cni-plugins-z6pc7\" (UID: \"2ace27ab-c4c7-412b-9ae8-a3e4ff15faec\") " pod="openshift-multus/multus-additional-cni-plugins-z6pc7" Mar 10 21:51:58 crc kubenswrapper[4919]: I0310 21:51:58.993122 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6a5db7c3-2a96-4030-8c88-5d82d325b62d-cni-binary-copy\") pod \"multus-hbw8v\" (UID: \"6a5db7c3-2a96-4030-8c88-5d82d325b62d\") " pod="openshift-multus/multus-hbw8v" Mar 10 21:51:58 crc kubenswrapper[4919]: I0310 21:51:58.993141 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/6a5db7c3-2a96-4030-8c88-5d82d325b62d-multus-socket-dir-parent\") pod \"multus-hbw8v\" (UID: \"6a5db7c3-2a96-4030-8c88-5d82d325b62d\") " pod="openshift-multus/multus-hbw8v" Mar 10 21:51:58 crc kubenswrapper[4919]: I0310 21:51:58.993160 4919 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6a5db7c3-2a96-4030-8c88-5d82d325b62d-multus-conf-dir\") pod \"multus-hbw8v\" (UID: \"6a5db7c3-2a96-4030-8c88-5d82d325b62d\") " pod="openshift-multus/multus-hbw8v" Mar 10 21:51:58 crc kubenswrapper[4919]: I0310 21:51:58.993178 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2ace27ab-c4c7-412b-9ae8-a3e4ff15faec-cnibin\") pod \"multus-additional-cni-plugins-z6pc7\" (UID: \"2ace27ab-c4c7-412b-9ae8-a3e4ff15faec\") " pod="openshift-multus/multus-additional-cni-plugins-z6pc7" Mar 10 21:51:58 crc kubenswrapper[4919]: I0310 21:51:58.993215 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6a5db7c3-2a96-4030-8c88-5d82d325b62d-multus-cni-dir\") pod \"multus-hbw8v\" (UID: \"6a5db7c3-2a96-4030-8c88-5d82d325b62d\") " pod="openshift-multus/multus-hbw8v" Mar 10 21:51:58 crc kubenswrapper[4919]: I0310 21:51:58.993247 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2ace27ab-c4c7-412b-9ae8-a3e4ff15faec-os-release\") pod \"multus-additional-cni-plugins-z6pc7\" (UID: \"2ace27ab-c4c7-412b-9ae8-a3e4ff15faec\") " pod="openshift-multus/multus-additional-cni-plugins-z6pc7" Mar 10 21:51:58 crc kubenswrapper[4919]: I0310 21:51:58.993265 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6a5db7c3-2a96-4030-8c88-5d82d325b62d-os-release\") pod \"multus-hbw8v\" (UID: \"6a5db7c3-2a96-4030-8c88-5d82d325b62d\") " pod="openshift-multus/multus-hbw8v" Mar 10 21:51:58 crc kubenswrapper[4919]: I0310 21:51:58.993282 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: 
\"kubernetes.io/host-path/6a5db7c3-2a96-4030-8c88-5d82d325b62d-hostroot\") pod \"multus-hbw8v\" (UID: \"6a5db7c3-2a96-4030-8c88-5d82d325b62d\") " pod="openshift-multus/multus-hbw8v" Mar 10 21:51:58 crc kubenswrapper[4919]: I0310 21:51:58.993316 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2ace27ab-c4c7-412b-9ae8-a3e4ff15faec-tuning-conf-dir\") pod \"multus-additional-cni-plugins-z6pc7\" (UID: \"2ace27ab-c4c7-412b-9ae8-a3e4ff15faec\") " pod="openshift-multus/multus-additional-cni-plugins-z6pc7" Mar 10 21:51:58 crc kubenswrapper[4919]: I0310 21:51:58.993336 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/6a5db7c3-2a96-4030-8c88-5d82d325b62d-host-var-lib-cni-multus\") pod \"multus-hbw8v\" (UID: \"6a5db7c3-2a96-4030-8c88-5d82d325b62d\") " pod="openshift-multus/multus-hbw8v" Mar 10 21:51:58 crc kubenswrapper[4919]: I0310 21:51:58.993355 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6a5db7c3-2a96-4030-8c88-5d82d325b62d-host-var-lib-kubelet\") pod \"multus-hbw8v\" (UID: \"6a5db7c3-2a96-4030-8c88-5d82d325b62d\") " pod="openshift-multus/multus-hbw8v" Mar 10 21:51:58 crc kubenswrapper[4919]: I0310 21:51:58.993373 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/6a5db7c3-2a96-4030-8c88-5d82d325b62d-multus-daemon-config\") pod \"multus-hbw8v\" (UID: \"6a5db7c3-2a96-4030-8c88-5d82d325b62d\") " pod="openshift-multus/multus-hbw8v" Mar 10 21:51:58 crc kubenswrapper[4919]: I0310 21:51:58.993425 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/566678d1-f416-4116-ab20-b30dceb86cdc-mcd-auth-proxy-config\") 
pod \"machine-config-daemon-z7v4t\" (UID: \"566678d1-f416-4116-ab20-b30dceb86cdc\") " pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" Mar 10 21:51:58 crc kubenswrapper[4919]: I0310 21:51:58.993445 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2ace27ab-c4c7-412b-9ae8-a3e4ff15faec-system-cni-dir\") pod \"multus-additional-cni-plugins-z6pc7\" (UID: \"2ace27ab-c4c7-412b-9ae8-a3e4ff15faec\") " pod="openshift-multus/multus-additional-cni-plugins-z6pc7" Mar 10 21:51:58 crc kubenswrapper[4919]: I0310 21:51:58.993528 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2ace27ab-c4c7-412b-9ae8-a3e4ff15faec-system-cni-dir\") pod \"multus-additional-cni-plugins-z6pc7\" (UID: \"2ace27ab-c4c7-412b-9ae8-a3e4ff15faec\") " pod="openshift-multus/multus-additional-cni-plugins-z6pc7" Mar 10 21:51:58 crc kubenswrapper[4919]: I0310 21:51:58.994802 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/2ace27ab-c4c7-412b-9ae8-a3e4ff15faec-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-z6pc7\" (UID: \"2ace27ab-c4c7-412b-9ae8-a3e4ff15faec\") " pod="openshift-multus/multus-additional-cni-plugins-z6pc7" Mar 10 21:51:58 crc kubenswrapper[4919]: I0310 21:51:58.994858 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6a5db7c3-2a96-4030-8c88-5d82d325b62d-etc-kubernetes\") pod \"multus-hbw8v\" (UID: \"6a5db7c3-2a96-4030-8c88-5d82d325b62d\") " pod="openshift-multus/multus-hbw8v" Mar 10 21:51:58 crc kubenswrapper[4919]: I0310 21:51:58.995167 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/566678d1-f416-4116-ab20-b30dceb86cdc-rootfs\") pod 
\"machine-config-daemon-z7v4t\" (UID: \"566678d1-f416-4116-ab20-b30dceb86cdc\") " pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" Mar 10 21:51:58 crc kubenswrapper[4919]: I0310 21:51:58.995480 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6a5db7c3-2a96-4030-8c88-5d82d325b62d-system-cni-dir\") pod \"multus-hbw8v\" (UID: \"6a5db7c3-2a96-4030-8c88-5d82d325b62d\") " pod="openshift-multus/multus-hbw8v" Mar 10 21:51:58 crc kubenswrapper[4919]: I0310 21:51:58.995516 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6a5db7c3-2a96-4030-8c88-5d82d325b62d-host-run-netns\") pod \"multus-hbw8v\" (UID: \"6a5db7c3-2a96-4030-8c88-5d82d325b62d\") " pod="openshift-multus/multus-hbw8v" Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:58.995553 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/6a5db7c3-2a96-4030-8c88-5d82d325b62d-host-run-multus-certs\") pod \"multus-hbw8v\" (UID: \"6a5db7c3-2a96-4030-8c88-5d82d325b62d\") " pod="openshift-multus/multus-hbw8v" Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:58.995593 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/6a5db7c3-2a96-4030-8c88-5d82d325b62d-host-run-k8s-cni-cncf-io\") pod \"multus-hbw8v\" (UID: \"6a5db7c3-2a96-4030-8c88-5d82d325b62d\") " pod="openshift-multus/multus-hbw8v" Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:58.996316 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6a5db7c3-2a96-4030-8c88-5d82d325b62d-multus-cni-dir\") pod \"multus-hbw8v\" (UID: \"6a5db7c3-2a96-4030-8c88-5d82d325b62d\") " pod="openshift-multus/multus-hbw8v" Mar 10 21:51:59 crc 
kubenswrapper[4919]: I0310 21:51:58.996456 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/6a5db7c3-2a96-4030-8c88-5d82d325b62d-multus-socket-dir-parent\") pod \"multus-hbw8v\" (UID: \"6a5db7c3-2a96-4030-8c88-5d82d325b62d\") " pod="openshift-multus/multus-hbw8v" Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:58.996519 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6a5db7c3-2a96-4030-8c88-5d82d325b62d-multus-conf-dir\") pod \"multus-hbw8v\" (UID: \"6a5db7c3-2a96-4030-8c88-5d82d325b62d\") " pod="openshift-multus/multus-hbw8v" Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:58.996567 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2ace27ab-c4c7-412b-9ae8-a3e4ff15faec-cnibin\") pod \"multus-additional-cni-plugins-z6pc7\" (UID: \"2ace27ab-c4c7-412b-9ae8-a3e4ff15faec\") " pod="openshift-multus/multus-additional-cni-plugins-z6pc7" Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:58.997377 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2ace27ab-c4c7-412b-9ae8-a3e4ff15faec-tuning-conf-dir\") pod \"multus-additional-cni-plugins-z6pc7\" (UID: \"2ace27ab-c4c7-412b-9ae8-a3e4ff15faec\") " pod="openshift-multus/multus-additional-cni-plugins-z6pc7" Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:58.997512 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2ace27ab-c4c7-412b-9ae8-a3e4ff15faec-os-release\") pod \"multus-additional-cni-plugins-z6pc7\" (UID: \"2ace27ab-c4c7-412b-9ae8-a3e4ff15faec\") " pod="openshift-multus/multus-additional-cni-plugins-z6pc7" Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:58.997589 4919 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6a5db7c3-2a96-4030-8c88-5d82d325b62d-os-release\") pod \"multus-hbw8v\" (UID: \"6a5db7c3-2a96-4030-8c88-5d82d325b62d\") " pod="openshift-multus/multus-hbw8v" Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:58.997636 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/6a5db7c3-2a96-4030-8c88-5d82d325b62d-hostroot\") pod \"multus-hbw8v\" (UID: \"6a5db7c3-2a96-4030-8c88-5d82d325b62d\") " pod="openshift-multus/multus-hbw8v" Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:58.999530 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6a5db7c3-2a96-4030-8c88-5d82d325b62d-host-var-lib-kubelet\") pod \"multus-hbw8v\" (UID: \"6a5db7c3-2a96-4030-8c88-5d82d325b62d\") " pod="openshift-multus/multus-hbw8v" Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:58.999623 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/6a5db7c3-2a96-4030-8c88-5d82d325b62d-host-var-lib-cni-multus\") pod \"multus-hbw8v\" (UID: \"6a5db7c3-2a96-4030-8c88-5d82d325b62d\") " pod="openshift-multus/multus-hbw8v" Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.000347 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2ace27ab-c4c7-412b-9ae8-a3e4ff15faec-cni-binary-copy\") pod \"multus-additional-cni-plugins-z6pc7\" (UID: \"2ace27ab-c4c7-412b-9ae8-a3e4ff15faec\") " pod="openshift-multus/multus-additional-cni-plugins-z6pc7" Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.000430 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6a5db7c3-2a96-4030-8c88-5d82d325b62d-cnibin\") pod \"multus-hbw8v\" 
(UID: \"6a5db7c3-2a96-4030-8c88-5d82d325b62d\") " pod="openshift-multus/multus-hbw8v" Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.004409 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/566678d1-f416-4116-ab20-b30dceb86cdc-mcd-auth-proxy-config\") pod \"machine-config-daemon-z7v4t\" (UID: \"566678d1-f416-4116-ab20-b30dceb86cdc\") " pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.006112 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6a5db7c3-2a96-4030-8c88-5d82d325b62d-cni-binary-copy\") pod \"multus-hbw8v\" (UID: \"6a5db7c3-2a96-4030-8c88-5d82d325b62d\") " pod="openshift-multus/multus-hbw8v" Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.006845 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/566678d1-f416-4116-ab20-b30dceb86cdc-proxy-tls\") pod \"machine-config-daemon-z7v4t\" (UID: \"566678d1-f416-4116-ab20-b30dceb86cdc\") " pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.008991 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/6a5db7c3-2a96-4030-8c88-5d82d325b62d-multus-daemon-config\") pod \"multus-hbw8v\" (UID: \"6a5db7c3-2a96-4030-8c88-5d82d325b62d\") " pod="openshift-multus/multus-hbw8v" Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.009417 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6a5db7c3-2a96-4030-8c88-5d82d325b62d-host-var-lib-cni-bin\") pod \"multus-hbw8v\" (UID: \"6a5db7c3-2a96-4030-8c88-5d82d325b62d\") " pod="openshift-multus/multus-hbw8v" Mar 10 
21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.009869 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed0693f5-4dbc-4621-9cf6-450d64aaea59\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f338ebc5fc228d07415015c51f7ed4fcc24d5bf76a644e491b5c4b9dc51b71f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"c
ontainerID\\\":\\\"cri-o://a5205e89649c7c1e920c967873bb1a404548b539605a180b02a1c249ff499e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5205e89649c7c1e920c967873bb1a404548b539605a180b02a1c249ff499e32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:50:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.027525 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hrhvw\" (UniqueName: \"kubernetes.io/projected/566678d1-f416-4116-ab20-b30dceb86cdc-kube-api-access-hrhvw\") pod \"machine-config-daemon-z7v4t\" (UID: \"566678d1-f416-4116-ab20-b30dceb86cdc\") " pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.027645 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5jdcd\" (UniqueName: 
\"kubernetes.io/projected/2ace27ab-c4c7-412b-9ae8-a3e4ff15faec-kube-api-access-5jdcd\") pod \"multus-additional-cni-plugins-z6pc7\" (UID: \"2ace27ab-c4c7-412b-9ae8-a3e4ff15faec\") " pod="openshift-multus/multus-additional-cni-plugins-z6pc7" Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.028016 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dwtj4\" (UniqueName: \"kubernetes.io/projected/6a5db7c3-2a96-4030-8c88-5d82d325b62d-kube-api-access-dwtj4\") pod \"multus-hbw8v\" (UID: \"6a5db7c3-2a96-4030-8c88-5d82d325b62d\") " pod="openshift-multus/multus-hbw8v" Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.046936 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5163ae00-7b50-497d-9770-0d787026b436\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://322af039c736bcf0b853ee5527ebb1b1750484dfab074745abcd75c24fdcccbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b
7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bfa2696bd9b6e5d247686e5297b6ae2f49e5b216174391f211cb2a3a4966135\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c23aaef6f076ab2a428323d38fac48e0c55ad52c55a46c942bccad06474fd31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\
\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a88714c27822fd18dff500c973b9d548414d59c7666de938e3cb0c6b18e277e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67063544d268ca488af7ae401113e6f35bb48688e50f944cfa03360de376611a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]
}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc6d32c621a9826f8539d4c1806a83f694eaa8874384bd615c2fa1cee02d2e98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc6d32c621a9826f8539d4c1806a83f694eaa8874384bd615c2fa1cee02d2e98\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8aa3a4aef61b7421411f28beecceab476e43c52eb51ac2a257399b01e4c2bcb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8aa3a4aef61b7421411f28beecceab476e43c52eb51ac2a257399b01e4c2bcb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2126875e4bacb36f72de8a3a8bd7c799b89e7691a83fd39d7645a97e95db91b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb
68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2126875e4bacb36f72de8a3a8bd7c799b89e7691a83fd39d7645a97e95db91b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:50:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.047143 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.047264 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.047278 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.047293 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeNotReady" Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.047304 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:59Z","lastTransitionTime":"2026-03-10T21:51:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.055352 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hzq7c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e92ce303-b70d-4416-b8f1-520b49dca2e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qw7c6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hzq7c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.064784 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.075878 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.087213 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-z6pc7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ace27ab-c4c7-412b-9ae8-a3e4ff15faec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-z6pc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.098156 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.105949 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"566678d1-f416-4116-ab20-b30dceb86cdc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrhvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrhvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],
\\\"startTime\\\":\\\"2026-03-10T21:51:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-z7v4t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.115914 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hbw8v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a5db7c3-2a96-4030-8c88-5d82d325b62d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dwtj4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hbw8v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.125497 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The 
container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.142263 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9ed1501-15da-4419-aa12-171e610438d6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9e6a8efa1e2d16b45fe6362b326e3f89333864dc74f3b298d2e500a90d303b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37d8507fd02b92972ed41aa2c4d53fceb1c9d58864e46ddc7991f94fb4d9b3e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfb03c5f450790952fc7173bc2a6d723c777921f5f74963bfdbc3573ec1d21cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b2adcae0bf01d646b97b915a28921ad4151ca62e8bdd174b42b5e3dff4b27db\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b2adcae0bf01d646b97b915a28921ad4151ca62e8bdd174b42b5e3dff4b27db\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T21:51:14Z\\\",\\\"message\\\":\\\"file observer\\\\nW0310 21:51:14.180982 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 21:51:14.181120 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 21:51:14.182146 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-915015507/tls.crt::/tmp/serving-cert-915015507/tls.key\\\\\\\"\\\\nI0310 21:51:14.490188 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 21:51:14.496972 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 21:51:14.497009 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 21:51:14.497047 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 21:51:14.497058 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 21:51:14.505682 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 21:51:14.505718 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 21:51:14.505729 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 21:51:14.505740 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0310 21:51:14.505737 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 21:51:14.505748 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 21:51:14.505777 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 21:51:14.505783 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 21:51:14.508219 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T21:51:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47a772db349df6c0c6fe27be93d19e02d66cfaf9739ee12e89730ece1da11473\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15b81f8be7635a10cf516cd17c9ab3d48b74d3421098124fcc47dbab6691e3c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15b
81f8be7635a10cf516cd17c9ab3d48b74d3421098124fcc47dbab6691e3c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:50:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.150057 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.150143 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.150179 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.150207 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.150264 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:59Z","lastTransitionTime":"2026-03-10T21:51:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.152154 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.161671 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.168986 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hzq7c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e92ce303-b70d-4416-b8f1-520b49dca2e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qw7c6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hzq7c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.174907 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.175268 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed0693f5-4dbc-4621-9cf6-450d64aaea59\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f338ebc5fc228d07415015c51f7ed4fcc24d5bf76a644e491b5c4b9dc51b71f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.12
6.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5205e89649c7c1e920c967873bb1a404548b539605a180b02a1c249ff499e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5205e89649c7c1e920c967873bb1a404548b539605a180b02a1c249ff499e32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:50:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.191939 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5163ae00-7b50-497d-9770-0d787026b436\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://322af039c736bcf0b853ee5527ebb1b1750484dfab074745abcd75c24fdcccbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bfa2696bd9b6e5d247686e5297b6ae2f49e5b216174391f211cb2a3a4966135\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c23aaef6f076ab2a428323d38fac48e0c55ad52c55a46c942bccad06474fd31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a88714c27822fd18dff500c973b9d548414d59c7666de938e3cb0c6b18e277e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67063544d268ca488af7ae401113e6f35bb48688e50f944cfa03360de376611a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc6d32c621a9826f8539d4c1806a83f694eaa8874384bd615c2fa1cee02d2e98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc6d32c621a9826f8539d4c1806a83f694eaa8874384bd615c2fa1cee02d2e98\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T21:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8aa3a4aef61b7421411f28beecceab476e43c52eb51ac2a257399b01e4c2bcb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8aa3a4aef61b7421411f28beecceab476e43c52eb51ac2a257399b01e4c2bcb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2126875e4bacb36f72de8a3a8bd7c799b89e7691a83fd39d7645a97e95db91b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2126875e4bacb36f72de8a3a8bd7c799b89e7691a83fd39d7645a97e95db91b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:50:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.197417 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-4dp67"] Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.198643 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-4dp67" Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.201322 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.201573 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.201724 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.201839 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.202015 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.202818 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-hbw8v" Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.202987 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.203183 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.214060 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-z6pc7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ace27ab-c4c7-412b-9ae8-a3e4ff15faec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-z6pc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.217536 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-z6pc7" Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.225635 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:51:59 crc kubenswrapper[4919]: W0310 21:51:59.228304 4919 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6a5db7c3_2a96_4030_8c88_5d82d325b62d.slice/crio-2de602fdb8f9552b4463b08f0050a917c1afb7442065c99f6f36f75821b4393a WatchSource:0}: Error finding container 2de602fdb8f9552b4463b08f0050a917c1afb7442065c99f6f36f75821b4393a: Status 404 returned error can't find the container with id 2de602fdb8f9552b4463b08f0050a917c1afb7442065c99f6f36f75821b4393a Mar 10 21:51:59 crc kubenswrapper[4919]: W0310 21:51:59.231878 4919 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2ace27ab_c4c7_412b_9ae8_a3e4ff15faec.slice/crio-c1f4ffa5088c8d6a6ab332ee548fe46ff853e49b8ad47915b9cf130494e24ed4 WatchSource:0}: Error finding container c1f4ffa5088c8d6a6ab332ee548fe46ff853e49b8ad47915b9cf130494e24ed4: Status 404 returned error can't find the container with id c1f4ffa5088c8d6a6ab332ee548fe46ff853e49b8ad47915b9cf130494e24ed4 Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.236928 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.250334 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hbw8v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a5db7c3-2a96-4030-8c88-5d82d325b62d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dwtj4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hbw8v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.252800 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.252901 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.252931 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.252957 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.252976 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:59Z","lastTransitionTime":"2026-03-10T21:51:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.260654 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.270783 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"566678d1-f416-4116-ab20-b30dceb86cdc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrhvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrhvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-z7v4t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.280551 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.293349 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.301266 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hzq7c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e92ce303-b70d-4416-b8f1-520b49dca2e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qw7c6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hzq7c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.315901 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4dp67" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2e7c6fb-9e33-441d-9197-719929eb9e21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4dp67\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.324660 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed0693f5-4dbc-4621-9cf6-450d64aaea59\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f338ebc5fc228d07415015c51f7ed4fcc24d5bf76a644e491b5c4b9dc51b71f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kube
let\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5205e89649c7c1e920c967873bb1a404548b539605a180b02a1c249ff499e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5205e89649c7c1e920c967873bb1a404548b539605a180b02a1c249ff499e32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:50:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.351832 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5163ae00-7b50-497d-9770-0d787026b436\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://322af039c736bcf0b853ee5527ebb1b1750484dfab074745abcd75c24fdcccbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bfa2696bd9b6e5d247686e5297b6ae2f49e5b216174391f211cb2a3a4966135\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c23aaef6f076ab2a428323d38fac48e0c55ad52c55a46c942bccad06474fd31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a88714c27822fd18dff500c973b9d548414d59c7666de938e3cb0c6b18e277e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67063544d268ca488af7ae401113e6f35bb48688e50f944cfa03360de376611a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc6d32c621a9826f8539d4c1806a83f694eaa8874384bd615c2fa1cee02d2e98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc6d32c621a9826f8539d4c1806a83f694eaa8874384bd615c2fa1cee02d2e98\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T21:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8aa3a4aef61b7421411f28beecceab476e43c52eb51ac2a257399b01e4c2bcb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8aa3a4aef61b7421411f28beecceab476e43c52eb51ac2a257399b01e4c2bcb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2126875e4bacb36f72de8a3a8bd7c799b89e7691a83fd39d7645a97e95db91b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2126875e4bacb36f72de8a3a8bd7c799b89e7691a83fd39d7645a97e95db91b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:50:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.361346 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.361376 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.361400 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.361417 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.361426 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:59Z","lastTransitionTime":"2026-03-10T21:51:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.366437 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9ed1501-15da-4419-aa12-171e610438d6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9e6a8efa1e2d16b45fe6362b326e3f89333864dc74f3b298d2e500a90d303b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37d8507fd02b92972ed41aa2c4d53fceb1c9d58864e46ddc7991f94fb4d9b3e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://dfb03c5f450790952fc7173bc2a6d723c777921f5f74963bfdbc3573ec1d21cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b2adcae0bf01d646b97b915a28921ad4151ca62e8bdd174b42b5e3dff4b27db\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b2adcae0bf01d646b97b915a28921ad4151ca62e8bdd174b42b5e3dff4b27db\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T21:51:14Z\\\",\\\"message\\\":\\\"file observer\\\\nW0310 21:51:14.180982 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 21:51:14.181120 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 21:51:14.182146 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-915015507/tls.crt::/tmp/serving-cert-915015507/tls.key\\\\\\\"\\\\nI0310 21:51:14.490188 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 21:51:14.496972 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 21:51:14.497009 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 21:51:14.497047 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 21:51:14.497058 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 21:51:14.505682 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 21:51:14.505718 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 21:51:14.505729 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 21:51:14.505740 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0310 21:51:14.505737 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 21:51:14.505748 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 21:51:14.505777 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 21:51:14.505783 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 21:51:14.508219 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T21:51:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47a772db349df6c0c6fe27be93d19e02d66cfaf9739ee12e89730ece1da11473\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15b81f8be7635a10cf516cd17c9ab3d48b74d3421098124fcc47dbab6691e3c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15b81f8be7635a10cf516cd17c9ab3d48b74d3421098124fcc47dbab6691e3c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:50:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.380171 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.396521 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a2e7c6fb-9e33-441d-9197-719929eb9e21-var-lib-openvswitch\") pod \"ovnkube-node-4dp67\" (UID: \"a2e7c6fb-9e33-441d-9197-719929eb9e21\") " pod="openshift-ovn-kubernetes/ovnkube-node-4dp67" Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.396574 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a2e7c6fb-9e33-441d-9197-719929eb9e21-run-ovn\") pod \"ovnkube-node-4dp67\" (UID: \"a2e7c6fb-9e33-441d-9197-719929eb9e21\") " pod="openshift-ovn-kubernetes/ovnkube-node-4dp67" Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.396604 
4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a2e7c6fb-9e33-441d-9197-719929eb9e21-ovnkube-config\") pod \"ovnkube-node-4dp67\" (UID: \"a2e7c6fb-9e33-441d-9197-719929eb9e21\") " pod="openshift-ovn-kubernetes/ovnkube-node-4dp67" Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.396654 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a2e7c6fb-9e33-441d-9197-719929eb9e21-log-socket\") pod \"ovnkube-node-4dp67\" (UID: \"a2e7c6fb-9e33-441d-9197-719929eb9e21\") " pod="openshift-ovn-kubernetes/ovnkube-node-4dp67" Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.396675 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a2e7c6fb-9e33-441d-9197-719929eb9e21-run-openvswitch\") pod \"ovnkube-node-4dp67\" (UID: \"a2e7c6fb-9e33-441d-9197-719929eb9e21\") " pod="openshift-ovn-kubernetes/ovnkube-node-4dp67" Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.396725 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a2e7c6fb-9e33-441d-9197-719929eb9e21-node-log\") pod \"ovnkube-node-4dp67\" (UID: \"a2e7c6fb-9e33-441d-9197-719929eb9e21\") " pod="openshift-ovn-kubernetes/ovnkube-node-4dp67" Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.396762 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a2e7c6fb-9e33-441d-9197-719929eb9e21-env-overrides\") pod \"ovnkube-node-4dp67\" (UID: \"a2e7c6fb-9e33-441d-9197-719929eb9e21\") " pod="openshift-ovn-kubernetes/ovnkube-node-4dp67" Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.396787 4919 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c5rvw\" (UniqueName: \"kubernetes.io/projected/a2e7c6fb-9e33-441d-9197-719929eb9e21-kube-api-access-c5rvw\") pod \"ovnkube-node-4dp67\" (UID: \"a2e7c6fb-9e33-441d-9197-719929eb9e21\") " pod="openshift-ovn-kubernetes/ovnkube-node-4dp67" Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.396819 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a2e7c6fb-9e33-441d-9197-719929eb9e21-host-kubelet\") pod \"ovnkube-node-4dp67\" (UID: \"a2e7c6fb-9e33-441d-9197-719929eb9e21\") " pod="openshift-ovn-kubernetes/ovnkube-node-4dp67" Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.396839 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a2e7c6fb-9e33-441d-9197-719929eb9e21-host-run-netns\") pod \"ovnkube-node-4dp67\" (UID: \"a2e7c6fb-9e33-441d-9197-719929eb9e21\") " pod="openshift-ovn-kubernetes/ovnkube-node-4dp67" Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.396858 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a2e7c6fb-9e33-441d-9197-719929eb9e21-host-cni-netd\") pod \"ovnkube-node-4dp67\" (UID: \"a2e7c6fb-9e33-441d-9197-719929eb9e21\") " pod="openshift-ovn-kubernetes/ovnkube-node-4dp67" Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.396878 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a2e7c6fb-9e33-441d-9197-719929eb9e21-ovnkube-script-lib\") pod \"ovnkube-node-4dp67\" (UID: \"a2e7c6fb-9e33-441d-9197-719929eb9e21\") " pod="openshift-ovn-kubernetes/ovnkube-node-4dp67" Mar 10 21:51:59 crc 
kubenswrapper[4919]: I0310 21:51:59.396897 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a2e7c6fb-9e33-441d-9197-719929eb9e21-host-slash\") pod \"ovnkube-node-4dp67\" (UID: \"a2e7c6fb-9e33-441d-9197-719929eb9e21\") " pod="openshift-ovn-kubernetes/ovnkube-node-4dp67" Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.396916 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a2e7c6fb-9e33-441d-9197-719929eb9e21-host-run-ovn-kubernetes\") pod \"ovnkube-node-4dp67\" (UID: \"a2e7c6fb-9e33-441d-9197-719929eb9e21\") " pod="openshift-ovn-kubernetes/ovnkube-node-4dp67" Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.396935 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a2e7c6fb-9e33-441d-9197-719929eb9e21-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-4dp67\" (UID: \"a2e7c6fb-9e33-441d-9197-719929eb9e21\") " pod="openshift-ovn-kubernetes/ovnkube-node-4dp67" Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.396954 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a2e7c6fb-9e33-441d-9197-719929eb9e21-host-cni-bin\") pod \"ovnkube-node-4dp67\" (UID: \"a2e7c6fb-9e33-441d-9197-719929eb9e21\") " pod="openshift-ovn-kubernetes/ovnkube-node-4dp67" Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.396977 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a2e7c6fb-9e33-441d-9197-719929eb9e21-etc-openvswitch\") pod \"ovnkube-node-4dp67\" (UID: 
\"a2e7c6fb-9e33-441d-9197-719929eb9e21\") " pod="openshift-ovn-kubernetes/ovnkube-node-4dp67" Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.396997 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a2e7c6fb-9e33-441d-9197-719929eb9e21-ovn-node-metrics-cert\") pod \"ovnkube-node-4dp67\" (UID: \"a2e7c6fb-9e33-441d-9197-719929eb9e21\") " pod="openshift-ovn-kubernetes/ovnkube-node-4dp67" Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.397019 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a2e7c6fb-9e33-441d-9197-719929eb9e21-run-systemd\") pod \"ovnkube-node-4dp67\" (UID: \"a2e7c6fb-9e33-441d-9197-719929eb9e21\") " pod="openshift-ovn-kubernetes/ovnkube-node-4dp67" Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.397368 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a2e7c6fb-9e33-441d-9197-719929eb9e21-systemd-units\") pod \"ovnkube-node-4dp67\" (UID: \"a2e7c6fb-9e33-441d-9197-719929eb9e21\") " pod="openshift-ovn-kubernetes/ovnkube-node-4dp67" Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.465617 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.465666 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.465683 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.465705 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 
21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.465722 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:59Z","lastTransitionTime":"2026-03-10T21:51:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.498708 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a2e7c6fb-9e33-441d-9197-719929eb9e21-etc-openvswitch\") pod \"ovnkube-node-4dp67\" (UID: \"a2e7c6fb-9e33-441d-9197-719929eb9e21\") " pod="openshift-ovn-kubernetes/ovnkube-node-4dp67" Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.498759 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a2e7c6fb-9e33-441d-9197-719929eb9e21-ovn-node-metrics-cert\") pod \"ovnkube-node-4dp67\" (UID: \"a2e7c6fb-9e33-441d-9197-719929eb9e21\") " pod="openshift-ovn-kubernetes/ovnkube-node-4dp67" Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.498783 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a2e7c6fb-9e33-441d-9197-719929eb9e21-run-systemd\") pod \"ovnkube-node-4dp67\" (UID: \"a2e7c6fb-9e33-441d-9197-719929eb9e21\") " pod="openshift-ovn-kubernetes/ovnkube-node-4dp67" Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.498898 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a2e7c6fb-9e33-441d-9197-719929eb9e21-etc-openvswitch\") pod \"ovnkube-node-4dp67\" (UID: \"a2e7c6fb-9e33-441d-9197-719929eb9e21\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-4dp67" Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.498975 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a2e7c6fb-9e33-441d-9197-719929eb9e21-run-systemd\") pod \"ovnkube-node-4dp67\" (UID: \"a2e7c6fb-9e33-441d-9197-719929eb9e21\") " pod="openshift-ovn-kubernetes/ovnkube-node-4dp67" Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.499144 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a2e7c6fb-9e33-441d-9197-719929eb9e21-systemd-units\") pod \"ovnkube-node-4dp67\" (UID: \"a2e7c6fb-9e33-441d-9197-719929eb9e21\") " pod="openshift-ovn-kubernetes/ovnkube-node-4dp67" Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.499337 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a2e7c6fb-9e33-441d-9197-719929eb9e21-systemd-units\") pod \"ovnkube-node-4dp67\" (UID: \"a2e7c6fb-9e33-441d-9197-719929eb9e21\") " pod="openshift-ovn-kubernetes/ovnkube-node-4dp67" Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.499713 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a2e7c6fb-9e33-441d-9197-719929eb9e21-var-lib-openvswitch\") pod \"ovnkube-node-4dp67\" (UID: \"a2e7c6fb-9e33-441d-9197-719929eb9e21\") " pod="openshift-ovn-kubernetes/ovnkube-node-4dp67" Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.499787 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a2e7c6fb-9e33-441d-9197-719929eb9e21-run-ovn\") pod \"ovnkube-node-4dp67\" (UID: \"a2e7c6fb-9e33-441d-9197-719929eb9e21\") " pod="openshift-ovn-kubernetes/ovnkube-node-4dp67" Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 
21:51:59.499824 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a2e7c6fb-9e33-441d-9197-719929eb9e21-ovnkube-config\") pod \"ovnkube-node-4dp67\" (UID: \"a2e7c6fb-9e33-441d-9197-719929eb9e21\") " pod="openshift-ovn-kubernetes/ovnkube-node-4dp67" Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.499855 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a2e7c6fb-9e33-441d-9197-719929eb9e21-var-lib-openvswitch\") pod \"ovnkube-node-4dp67\" (UID: \"a2e7c6fb-9e33-441d-9197-719929eb9e21\") " pod="openshift-ovn-kubernetes/ovnkube-node-4dp67" Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.499884 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a2e7c6fb-9e33-441d-9197-719929eb9e21-log-socket\") pod \"ovnkube-node-4dp67\" (UID: \"a2e7c6fb-9e33-441d-9197-719929eb9e21\") " pod="openshift-ovn-kubernetes/ovnkube-node-4dp67" Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.499890 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a2e7c6fb-9e33-441d-9197-719929eb9e21-run-ovn\") pod \"ovnkube-node-4dp67\" (UID: \"a2e7c6fb-9e33-441d-9197-719929eb9e21\") " pod="openshift-ovn-kubernetes/ovnkube-node-4dp67" Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.499924 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a2e7c6fb-9e33-441d-9197-719929eb9e21-run-openvswitch\") pod \"ovnkube-node-4dp67\" (UID: \"a2e7c6fb-9e33-441d-9197-719929eb9e21\") " pod="openshift-ovn-kubernetes/ovnkube-node-4dp67" Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.499965 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"node-log\" (UniqueName: \"kubernetes.io/host-path/a2e7c6fb-9e33-441d-9197-719929eb9e21-node-log\") pod \"ovnkube-node-4dp67\" (UID: \"a2e7c6fb-9e33-441d-9197-719929eb9e21\") " pod="openshift-ovn-kubernetes/ovnkube-node-4dp67" Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.500013 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a2e7c6fb-9e33-441d-9197-719929eb9e21-log-socket\") pod \"ovnkube-node-4dp67\" (UID: \"a2e7c6fb-9e33-441d-9197-719929eb9e21\") " pod="openshift-ovn-kubernetes/ovnkube-node-4dp67" Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.500037 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a2e7c6fb-9e33-441d-9197-719929eb9e21-run-openvswitch\") pod \"ovnkube-node-4dp67\" (UID: \"a2e7c6fb-9e33-441d-9197-719929eb9e21\") " pod="openshift-ovn-kubernetes/ovnkube-node-4dp67" Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.500074 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a2e7c6fb-9e33-441d-9197-719929eb9e21-node-log\") pod \"ovnkube-node-4dp67\" (UID: \"a2e7c6fb-9e33-441d-9197-719929eb9e21\") " pod="openshift-ovn-kubernetes/ovnkube-node-4dp67" Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.500057 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a2e7c6fb-9e33-441d-9197-719929eb9e21-env-overrides\") pod \"ovnkube-node-4dp67\" (UID: \"a2e7c6fb-9e33-441d-9197-719929eb9e21\") " pod="openshift-ovn-kubernetes/ovnkube-node-4dp67" Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.500131 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c5rvw\" (UniqueName: \"kubernetes.io/projected/a2e7c6fb-9e33-441d-9197-719929eb9e21-kube-api-access-c5rvw\") pod 
\"ovnkube-node-4dp67\" (UID: \"a2e7c6fb-9e33-441d-9197-719929eb9e21\") " pod="openshift-ovn-kubernetes/ovnkube-node-4dp67" Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.500173 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a2e7c6fb-9e33-441d-9197-719929eb9e21-host-kubelet\") pod \"ovnkube-node-4dp67\" (UID: \"a2e7c6fb-9e33-441d-9197-719929eb9e21\") " pod="openshift-ovn-kubernetes/ovnkube-node-4dp67" Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.500194 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a2e7c6fb-9e33-441d-9197-719929eb9e21-host-run-netns\") pod \"ovnkube-node-4dp67\" (UID: \"a2e7c6fb-9e33-441d-9197-719929eb9e21\") " pod="openshift-ovn-kubernetes/ovnkube-node-4dp67" Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.500216 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a2e7c6fb-9e33-441d-9197-719929eb9e21-host-cni-netd\") pod \"ovnkube-node-4dp67\" (UID: \"a2e7c6fb-9e33-441d-9197-719929eb9e21\") " pod="openshift-ovn-kubernetes/ovnkube-node-4dp67" Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.500236 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a2e7c6fb-9e33-441d-9197-719929eb9e21-ovnkube-script-lib\") pod \"ovnkube-node-4dp67\" (UID: \"a2e7c6fb-9e33-441d-9197-719929eb9e21\") " pod="openshift-ovn-kubernetes/ovnkube-node-4dp67" Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.500258 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a2e7c6fb-9e33-441d-9197-719929eb9e21-host-slash\") pod \"ovnkube-node-4dp67\" (UID: \"a2e7c6fb-9e33-441d-9197-719929eb9e21\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-4dp67" Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.500277 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a2e7c6fb-9e33-441d-9197-719929eb9e21-host-cni-netd\") pod \"ovnkube-node-4dp67\" (UID: \"a2e7c6fb-9e33-441d-9197-719929eb9e21\") " pod="openshift-ovn-kubernetes/ovnkube-node-4dp67" Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.500283 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a2e7c6fb-9e33-441d-9197-719929eb9e21-host-run-netns\") pod \"ovnkube-node-4dp67\" (UID: \"a2e7c6fb-9e33-441d-9197-719929eb9e21\") " pod="openshift-ovn-kubernetes/ovnkube-node-4dp67" Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.500280 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a2e7c6fb-9e33-441d-9197-719929eb9e21-host-run-ovn-kubernetes\") pod \"ovnkube-node-4dp67\" (UID: \"a2e7c6fb-9e33-441d-9197-719929eb9e21\") " pod="openshift-ovn-kubernetes/ovnkube-node-4dp67" Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.500235 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a2e7c6fb-9e33-441d-9197-719929eb9e21-host-kubelet\") pod \"ovnkube-node-4dp67\" (UID: \"a2e7c6fb-9e33-441d-9197-719929eb9e21\") " pod="openshift-ovn-kubernetes/ovnkube-node-4dp67" Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.500312 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a2e7c6fb-9e33-441d-9197-719929eb9e21-host-run-ovn-kubernetes\") pod \"ovnkube-node-4dp67\" (UID: \"a2e7c6fb-9e33-441d-9197-719929eb9e21\") " pod="openshift-ovn-kubernetes/ovnkube-node-4dp67" Mar 10 21:51:59 crc 
kubenswrapper[4919]: I0310 21:51:59.500337 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a2e7c6fb-9e33-441d-9197-719929eb9e21-host-slash\") pod \"ovnkube-node-4dp67\" (UID: \"a2e7c6fb-9e33-441d-9197-719929eb9e21\") " pod="openshift-ovn-kubernetes/ovnkube-node-4dp67" Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.500344 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a2e7c6fb-9e33-441d-9197-719929eb9e21-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-4dp67\" (UID: \"a2e7c6fb-9e33-441d-9197-719929eb9e21\") " pod="openshift-ovn-kubernetes/ovnkube-node-4dp67" Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.500376 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a2e7c6fb-9e33-441d-9197-719929eb9e21-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-4dp67\" (UID: \"a2e7c6fb-9e33-441d-9197-719929eb9e21\") " pod="openshift-ovn-kubernetes/ovnkube-node-4dp67" Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.500377 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a2e7c6fb-9e33-441d-9197-719929eb9e21-host-cni-bin\") pod \"ovnkube-node-4dp67\" (UID: \"a2e7c6fb-9e33-441d-9197-719929eb9e21\") " pod="openshift-ovn-kubernetes/ovnkube-node-4dp67" Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.500490 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a2e7c6fb-9e33-441d-9197-719929eb9e21-host-cni-bin\") pod \"ovnkube-node-4dp67\" (UID: \"a2e7c6fb-9e33-441d-9197-719929eb9e21\") " pod="openshift-ovn-kubernetes/ovnkube-node-4dp67" Mar 10 21:51:59 crc 
kubenswrapper[4919]: I0310 21:51:59.500870 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a2e7c6fb-9e33-441d-9197-719929eb9e21-env-overrides\") pod \"ovnkube-node-4dp67\" (UID: \"a2e7c6fb-9e33-441d-9197-719929eb9e21\") " pod="openshift-ovn-kubernetes/ovnkube-node-4dp67" Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.501057 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a2e7c6fb-9e33-441d-9197-719929eb9e21-ovnkube-script-lib\") pod \"ovnkube-node-4dp67\" (UID: \"a2e7c6fb-9e33-441d-9197-719929eb9e21\") " pod="openshift-ovn-kubernetes/ovnkube-node-4dp67" Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.501659 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a2e7c6fb-9e33-441d-9197-719929eb9e21-ovnkube-config\") pod \"ovnkube-node-4dp67\" (UID: \"a2e7c6fb-9e33-441d-9197-719929eb9e21\") " pod="openshift-ovn-kubernetes/ovnkube-node-4dp67" Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.503929 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a2e7c6fb-9e33-441d-9197-719929eb9e21-ovn-node-metrics-cert\") pod \"ovnkube-node-4dp67\" (UID: \"a2e7c6fb-9e33-441d-9197-719929eb9e21\") " pod="openshift-ovn-kubernetes/ovnkube-node-4dp67" Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.519281 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c5rvw\" (UniqueName: \"kubernetes.io/projected/a2e7c6fb-9e33-441d-9197-719929eb9e21-kube-api-access-c5rvw\") pod \"ovnkube-node-4dp67\" (UID: \"a2e7c6fb-9e33-441d-9197-719929eb9e21\") " pod="openshift-ovn-kubernetes/ovnkube-node-4dp67" Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.568086 4919 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.568117 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.568127 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.568147 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.568157 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:59Z","lastTransitionTime":"2026-03-10T21:51:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.670583 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.670620 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.670630 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.670646 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.670700 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:59Z","lastTransitionTime":"2026-03-10T21:51:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.772808 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.772855 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.772867 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.772884 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.772895 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:59Z","lastTransitionTime":"2026-03-10T21:51:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.813093 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-4dp67" Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.875722 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.875794 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.875806 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.875847 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.875862 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:59Z","lastTransitionTime":"2026-03-10T21:51:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.880666 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" event={"ID":"566678d1-f416-4116-ab20-b30dceb86cdc","Type":"ContainerStarted","Data":"603ee76064368a216672f45eb860628d301968c311e0bc75b9a73c01f351c9c9"} Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.880707 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" event={"ID":"566678d1-f416-4116-ab20-b30dceb86cdc","Type":"ContainerStarted","Data":"9b645dc541f9bef5d9710345252c2ff48e91412f10d1c0c1bfaa06cf9e82210f"} Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.880721 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" event={"ID":"566678d1-f416-4116-ab20-b30dceb86cdc","Type":"ContainerStarted","Data":"24716946b8aa3c9b81e5de4f096bafc8394795224d36404b7b52d817376c4ec5"} Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.884694 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-hzq7c" event={"ID":"e92ce303-b70d-4416-b8f1-520b49dca2e6","Type":"ContainerStarted","Data":"6cc1a7ce601001a487303cfae1cef980407a59cc27d02d4f3c4a303b7668639f"} Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.886767 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4dp67" event={"ID":"a2e7c6fb-9e33-441d-9197-719929eb9e21","Type":"ContainerStarted","Data":"d527ec7ec526f114e00e9b88c707cae3e833fb2d8c9853761e82310aa9ca2239"} Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.888365 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-hbw8v" event={"ID":"6a5db7c3-2a96-4030-8c88-5d82d325b62d","Type":"ContainerStarted","Data":"abe6d0aa7236ecb1ecf10432a82e6fd0b3103606dbf07a21f54a1908c77ef697"} Mar 10 
21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.888427 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-hbw8v" event={"ID":"6a5db7c3-2a96-4030-8c88-5d82d325b62d","Type":"ContainerStarted","Data":"2de602fdb8f9552b4463b08f0050a917c1afb7442065c99f6f36f75821b4393a"} Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.891184 4919 generic.go:334] "Generic (PLEG): container finished" podID="2ace27ab-c4c7-412b-9ae8-a3e4ff15faec" containerID="d5636b870a2c5e13abd72ff1e7c785363d11540245c14af6f5de75a0b400a6aa" exitCode=0 Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.891248 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-z6pc7" event={"ID":"2ace27ab-c4c7-412b-9ae8-a3e4ff15faec","Type":"ContainerDied","Data":"d5636b870a2c5e13abd72ff1e7c785363d11540245c14af6f5de75a0b400a6aa"} Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.891329 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-z6pc7" event={"ID":"2ace27ab-c4c7-412b-9ae8-a3e4ff15faec","Type":"ContainerStarted","Data":"c1f4ffa5088c8d6a6ab332ee548fe46ff853e49b8ad47915b9cf130494e24ed4"} Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.900910 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.919515 4919 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.935642 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.958874 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-z6pc7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ace27ab-c4c7-412b-9ae8-a3e4ff15faec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-z6pc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.976610 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.979451 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.979493 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.979510 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.979537 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.979556 4919 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:51:59Z","lastTransitionTime":"2026-03-10T21:51:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 21:51:59 crc kubenswrapper[4919]: I0310 21:51:59.988898 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"566678d1-f416-4116-ab20-b30dceb86cdc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://603ee76064368a216672f45eb860628d301968c311e0bc75b9a73c01f351c9c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrhvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b645dc541f9bef5d9710345252c2ff48e91412f10d1c0c1bfaa06cf9e82210f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrhvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-z7v4t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:52:00 crc kubenswrapper[4919]: 
I0310 21:52:00.006726 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hbw8v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a5db7c3-2a96-4030-8c88-5d82d325b62d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dwtj4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hbw8v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:52:00 crc kubenswrapper[4919]: I0310 21:52:00.016851 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed0693f5-4dbc-4621-9cf6-450d64aaea59\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f338ebc5fc228d07415015c51f7ed4fcc24d5bf76a644e491b5c4b9dc51b71f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"image
ID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5205e89649c7c1e920c967873bb1a404548b539605a180b02a1c249ff499e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5205e89649c7c1e920c967873bb1a404548b539605a180b02a1c249ff499e32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:50:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 
21:52:00 crc kubenswrapper[4919]: I0310 21:52:00.036692 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5163ae00-7b50-497d-9770-0d787026b436\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://322af039c736bcf0b853ee5527ebb1b1750484dfab074745abcd75c24fdcccbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"
data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bfa2696bd9b6e5d247686e5297b6ae2f49e5b216174391f211cb2a3a4966135\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c23aaef6f076ab2a428323d38fac48e0c55ad52c55a46c942bccad06474fd31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a88714c27822fd18dff500c973b9d548414d59c7666de938e3cb0c6b18e277e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26
702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67063544d268ca488af7ae401113e6f35bb48688e50f944cfa03360de376611a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc6d32c621a9826f8539d4c1806a83f694eaa8874384bd615c2fa1cee02d2e98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":
false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc6d32c621a9826f8539d4c1806a83f694eaa8874384bd615c2fa1cee02d2e98\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8aa3a4aef61b7421411f28beecceab476e43c52eb51ac2a257399b01e4c2bcb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8aa3a4aef61b7421411f28beecceab476e43c52eb51ac2a257399b01e4c2bcb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2126875e4bacb36f72de8a3a8bd7c799b89e7691a83fd39d7645a97e95db91b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2126875e4bacb36f72de8a3a8bd7c799b89e7691a83fd39d7645a97e95db91b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:16Z\\\"}},\\\"volumeMounts\\
\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:50:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:52:00 crc kubenswrapper[4919]: I0310 21:52:00.054734 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9ed1501-15da-4419-aa12-171e610438d6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9e6a8efa1e2d16b45fe6362b326e3f89333864dc74f3b298d2e500a90d303b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37d8507fd02b92972ed41aa2c4d53fceb1c9d58864e46ddc7991f94fb4d9b3e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://dfb03c5f450790952fc7173bc2a6d723c777921f5f74963bfdbc3573ec1d21cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b2adcae0bf01d646b97b915a28921ad4151ca62e8bdd174b42b5e3dff4b27db\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b2adcae0bf01d646b97b915a28921ad4151ca62e8bdd174b42b5e3dff4b27db\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T21:51:14Z\\\",\\\"message\\\":\\\"file observer\\\\nW0310 21:51:14.180982 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 21:51:14.181120 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 21:51:14.182146 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-915015507/tls.crt::/tmp/serving-cert-915015507/tls.key\\\\\\\"\\\\nI0310 21:51:14.490188 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 21:51:14.496972 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 21:51:14.497009 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 21:51:14.497047 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 21:51:14.497058 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 21:51:14.505682 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 21:51:14.505718 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 21:51:14.505729 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 21:51:14.505740 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0310 21:51:14.505737 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 21:51:14.505748 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 21:51:14.505777 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 21:51:14.505783 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 21:51:14.508219 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T21:51:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47a772db349df6c0c6fe27be93d19e02d66cfaf9739ee12e89730ece1da11473\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15b81f8be7635a10cf516cd17c9ab3d48b74d3421098124fcc47dbab6691e3c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15b81f8be7635a10cf516cd17c9ab3d48b74d3421098124fcc47dbab6691e3c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:50:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:52:00 crc kubenswrapper[4919]: I0310 21:52:00.066682 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:52:00 crc kubenswrapper[4919]: I0310 21:52:00.077108 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:52:00 crc kubenswrapper[4919]: I0310 21:52:00.081849 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:52:00 crc kubenswrapper[4919]: I0310 21:52:00.081899 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:52:00 crc kubenswrapper[4919]: I0310 
21:52:00.081915 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:52:00 crc kubenswrapper[4919]: I0310 21:52:00.081934 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:52:00 crc kubenswrapper[4919]: I0310 21:52:00.081946 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:52:00Z","lastTransitionTime":"2026-03-10T21:52:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 21:52:00 crc kubenswrapper[4919]: I0310 21:52:00.088655 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hzq7c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e92ce303-b70d-4416-b8f1-520b49dca2e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qw7c6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hzq7c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:52:00 crc kubenswrapper[4919]: I0310 21:52:00.106779 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4dp67" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2e7c6fb-9e33-441d-9197-719929eb9e21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4dp67\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:52:00 crc kubenswrapper[4919]: I0310 21:52:00.128629 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-z6pc7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ace27ab-c4c7-412b-9ae8-a3e4ff15faec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5636b870a2c5e13abd72ff1e7c785363d11540245c14af6f5de75a0b400a6aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5636b870a2c5e13abd72ff1e7c785363d11540245c14af6f5de75a0b400a6aa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-z6pc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:52:00 crc kubenswrapper[4919]: I0310 21:52:00.146591 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:52:00 crc kubenswrapper[4919]: I0310 21:52:00.166005 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:52:00 crc kubenswrapper[4919]: I0310 21:52:00.180902 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hbw8v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a5db7c3-2a96-4030-8c88-5d82d325b62d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abe6d0aa7236ecb1ecf10432a82e6fd0b3103606dbf07a21f54a1908c77ef697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dwtj4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hbw8v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:52:00 crc kubenswrapper[4919]: I0310 21:52:00.185152 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:52:00 crc kubenswrapper[4919]: I0310 21:52:00.185209 4919 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:52:00 crc kubenswrapper[4919]: I0310 21:52:00.185223 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:52:00 crc kubenswrapper[4919]: I0310 21:52:00.185247 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:52:00 crc kubenswrapper[4919]: I0310 21:52:00.185264 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:52:00Z","lastTransitionTime":"2026-03-10T21:52:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 21:52:00 crc kubenswrapper[4919]: I0310 21:52:00.197028 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:52:00 crc kubenswrapper[4919]: I0310 21:52:00.214786 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"566678d1-f416-4116-ab20-b30dceb86cdc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://603ee76064368a216672f45eb860628d301968c311e0bc75b9a73c01f351c9c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrhvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b645dc541f9bef5d9710345252c2ff48e91412f
10d1c0c1bfaa06cf9e82210f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrhvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-z7v4t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:52:00 crc kubenswrapper[4919]: I0310 21:52:00.228731 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:52:00 crc kubenswrapper[4919]: I0310 21:52:00.243637 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:52:00 crc kubenswrapper[4919]: I0310 21:52:00.256242 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hzq7c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e92ce303-b70d-4416-b8f1-520b49dca2e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cc1a7ce601001a487303cfae1cef980407a59cc27d02d4f3c4a303b7668639f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qw7c6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hzq7c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:52:00 crc kubenswrapper[4919]: I0310 21:52:00.280808 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4dp67" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2e7c6fb-9e33-441d-9197-719929eb9e21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4dp67\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:52:00 crc kubenswrapper[4919]: I0310 21:52:00.288025 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:52:00 crc kubenswrapper[4919]: I0310 21:52:00.288061 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:52:00 crc kubenswrapper[4919]: I0310 21:52:00.288073 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:52:00 crc kubenswrapper[4919]: I0310 21:52:00.288098 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:52:00 crc kubenswrapper[4919]: I0310 21:52:00.288113 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:52:00Z","lastTransitionTime":"2026-03-10T21:52:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:52:00 crc kubenswrapper[4919]: I0310 21:52:00.289308 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed0693f5-4dbc-4621-9cf6-450d64aaea59\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f338ebc5fc228d07415015c51f7ed4fcc24d5bf76a644e491b5c4b9dc51b71f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11
\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5205e89649c7c1e920c967873bb1a404548b539605a180b02a1c249ff499e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5205e89649c7c1e920c967873bb1a404548b539605a180b02a1c249ff499e32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:50:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:52:00 crc kubenswrapper[4919]: I0310 21:52:00.316179 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5163ae00-7b50-497d-9770-0d787026b436\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://322af039c736bcf0b853ee5527ebb1b1750484dfab074745abcd75c24fdcccbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bfa2696bd9b6e5d247686e5297b6ae2f49e5b216174391f211cb2a3a4966135\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c23aaef6f076ab2a428323d38fac48e0c55ad52c55a46c942bccad06474fd31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a88714c27822fd18dff500c973b9d548414d59c7666de938e3cb0c6b18e277e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67063544d268ca488af7ae401113e6f35bb48688e50f944cfa03360de376611a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc6d32c621a9826f8539d4c1806a83f694eaa8874384bd615c2fa1cee02d2e98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc6d32c621a9826f8539d4c1806a83f694eaa8874384bd615c2fa1cee02d2e98\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T21:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8aa3a4aef61b7421411f28beecceab476e43c52eb51ac2a257399b01e4c2bcb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8aa3a4aef61b7421411f28beecceab476e43c52eb51ac2a257399b01e4c2bcb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2126875e4bacb36f72de8a3a8bd7c799b89e7691a83fd39d7645a97e95db91b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2126875e4bacb36f72de8a3a8bd7c799b89e7691a83fd39d7645a97e95db91b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:50:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:52:00 crc kubenswrapper[4919]: I0310 21:52:00.329624 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9ed1501-15da-4419-aa12-171e610438d6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9e6a8efa1e2d16b45fe6362b326e3f89333864dc74f3b298d2e500a90d303b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37d8507fd02b92972ed41aa2c4d53fceb1c9d58864e46ddc7991f94fb4d9b3e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://dfb03c5f450790952fc7173bc2a6d723c777921f5f74963bfdbc3573ec1d21cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b2adcae0bf01d646b97b915a28921ad4151ca62e8bdd174b42b5e3dff4b27db\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b2adcae0bf01d646b97b915a28921ad4151ca62e8bdd174b42b5e3dff4b27db\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T21:51:14Z\\\",\\\"message\\\":\\\"file observer\\\\nW0310 21:51:14.180982 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 21:51:14.181120 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 21:51:14.182146 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-915015507/tls.crt::/tmp/serving-cert-915015507/tls.key\\\\\\\"\\\\nI0310 21:51:14.490188 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 21:51:14.496972 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 21:51:14.497009 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 21:51:14.497047 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 21:51:14.497058 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 21:51:14.505682 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 21:51:14.505718 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 21:51:14.505729 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 21:51:14.505740 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0310 21:51:14.505737 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 21:51:14.505748 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 21:51:14.505777 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 21:51:14.505783 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 21:51:14.508219 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T21:51:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47a772db349df6c0c6fe27be93d19e02d66cfaf9739ee12e89730ece1da11473\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15b81f8be7635a10cf516cd17c9ab3d48b74d3421098124fcc47dbab6691e3c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15b81f8be7635a10cf516cd17c9ab3d48b74d3421098124fcc47dbab6691e3c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:50:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:52:00 crc kubenswrapper[4919]: I0310 21:52:00.353582 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:52:00 crc kubenswrapper[4919]: I0310 21:52:00.391128 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:52:00 crc kubenswrapper[4919]: I0310 21:52:00.391336 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:52:00 crc kubenswrapper[4919]: I0310 21:52:00.391423 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:52:00 crc kubenswrapper[4919]: I0310 21:52:00.391497 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:52:00 crc kubenswrapper[4919]: I0310 21:52:00.391569 4919 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:52:00Z","lastTransitionTime":"2026-03-10T21:52:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 21:52:00 crc kubenswrapper[4919]: I0310 21:52:00.479187 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 21:52:00 crc kubenswrapper[4919]: I0310 21:52:00.479198 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 21:52:00 crc kubenswrapper[4919]: E0310 21:52:00.479321 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 21:52:00 crc kubenswrapper[4919]: E0310 21:52:00.479629 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 21:52:00 crc kubenswrapper[4919]: I0310 21:52:00.479744 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 21:52:00 crc kubenswrapper[4919]: E0310 21:52:00.480001 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 21:52:00 crc kubenswrapper[4919]: I0310 21:52:00.494208 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:52:00 crc kubenswrapper[4919]: I0310 21:52:00.494268 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:52:00 crc kubenswrapper[4919]: I0310 21:52:00.494291 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:52:00 crc kubenswrapper[4919]: I0310 21:52:00.494320 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:52:00 crc kubenswrapper[4919]: I0310 21:52:00.494344 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:52:00Z","lastTransitionTime":"2026-03-10T21:52:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:52:00 crc kubenswrapper[4919]: I0310 21:52:00.597052 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:52:00 crc kubenswrapper[4919]: I0310 21:52:00.597261 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:52:00 crc kubenswrapper[4919]: I0310 21:52:00.597327 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:52:00 crc kubenswrapper[4919]: I0310 21:52:00.597409 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:52:00 crc kubenswrapper[4919]: I0310 21:52:00.597480 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:52:00Z","lastTransitionTime":"2026-03-10T21:52:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:52:00 crc kubenswrapper[4919]: I0310 21:52:00.701695 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:52:00 crc kubenswrapper[4919]: I0310 21:52:00.701746 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:52:00 crc kubenswrapper[4919]: I0310 21:52:00.701763 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:52:00 crc kubenswrapper[4919]: I0310 21:52:00.701787 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:52:00 crc kubenswrapper[4919]: I0310 21:52:00.701804 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:52:00Z","lastTransitionTime":"2026-03-10T21:52:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:52:00 crc kubenswrapper[4919]: I0310 21:52:00.804763 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:52:00 crc kubenswrapper[4919]: I0310 21:52:00.805165 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:52:00 crc kubenswrapper[4919]: I0310 21:52:00.805502 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:52:00 crc kubenswrapper[4919]: I0310 21:52:00.805607 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:52:00 crc kubenswrapper[4919]: I0310 21:52:00.805721 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:52:00Z","lastTransitionTime":"2026-03-10T21:52:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:52:00 crc kubenswrapper[4919]: I0310 21:52:00.896565 4919 generic.go:334] "Generic (PLEG): container finished" podID="a2e7c6fb-9e33-441d-9197-719929eb9e21" containerID="9ca4b1de4425e9aa53f9cf24a56759708297b223771278a445dcf171f4dd6493" exitCode=0 Mar 10 21:52:00 crc kubenswrapper[4919]: I0310 21:52:00.896637 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4dp67" event={"ID":"a2e7c6fb-9e33-441d-9197-719929eb9e21","Type":"ContainerDied","Data":"9ca4b1de4425e9aa53f9cf24a56759708297b223771278a445dcf171f4dd6493"} Mar 10 21:52:00 crc kubenswrapper[4919]: I0310 21:52:00.902759 4919 generic.go:334] "Generic (PLEG): container finished" podID="2ace27ab-c4c7-412b-9ae8-a3e4ff15faec" containerID="a7cd0e8267d82b65679954a7b935e138a06a6634311569a48268ba5917d6cb7d" exitCode=0 Mar 10 21:52:00 crc kubenswrapper[4919]: I0310 21:52:00.902838 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-z6pc7" event={"ID":"2ace27ab-c4c7-412b-9ae8-a3e4ff15faec","Type":"ContainerDied","Data":"a7cd0e8267d82b65679954a7b935e138a06a6634311569a48268ba5917d6cb7d"} Mar 10 21:52:00 crc kubenswrapper[4919]: I0310 21:52:00.910532 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:52:00 crc kubenswrapper[4919]: I0310 21:52:00.910579 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:52:00 crc kubenswrapper[4919]: I0310 21:52:00.910597 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:52:00 crc kubenswrapper[4919]: I0310 21:52:00.910621 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:52:00 crc kubenswrapper[4919]: I0310 21:52:00.910639 4919 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:52:00Z","lastTransitionTime":"2026-03-10T21:52:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 21:52:00 crc kubenswrapper[4919]: I0310 21:52:00.931803 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4dp67" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2e7c6fb-9e33-441d-9197-719929eb9e21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ca4b1de4425e9aa53f9cf24a56759708297b223771278a445dcf171f4dd6493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ca4b1de4425e9aa53f9cf24a56759708297b223771278a445dcf171f4dd6493\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4dp67\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:52:00 crc kubenswrapper[4919]: I0310 21:52:00.946179 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed0693f5-4dbc-4621-9cf6-450d64aaea59\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f338ebc5fc228d07415015c51f7ed4fcc24d5bf76a644e491b5c4b9dc51b71f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5205e89649c7c1e920c967873bb1a404548b539605a180b02a1c249ff499e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5205e89649c7c1e920c967873bb1a404548b539605a180b02a1c249ff499e32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:50:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:52:00 crc kubenswrapper[4919]: I0310 
21:52:00.981624 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5163ae00-7b50-497d-9770-0d787026b436\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://322af039c736bcf0b853ee5527ebb1b1750484dfab074745abcd75c24fdcccbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\
"cri-o://8bfa2696bd9b6e5d247686e5297b6ae2f49e5b216174391f211cb2a3a4966135\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c23aaef6f076ab2a428323d38fac48e0c55ad52c55a46c942bccad06474fd31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a88714c27822fd18dff500c973b9d548414d59c7666de938e3cb0c6b18e277e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93
\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67063544d268ca488af7ae401113e6f35bb48688e50f944cfa03360de376611a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc6d32c621a9826f8539d4c1806a83f694eaa8874384bd615c2fa1cee02d2e98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":
{\\\"containerID\\\":\\\"cri-o://cc6d32c621a9826f8539d4c1806a83f694eaa8874384bd615c2fa1cee02d2e98\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8aa3a4aef61b7421411f28beecceab476e43c52eb51ac2a257399b01e4c2bcb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8aa3a4aef61b7421411f28beecceab476e43c52eb51ac2a257399b01e4c2bcb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2126875e4bacb36f72de8a3a8bd7c799b89e7691a83fd39d7645a97e95db91b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2126875e4bacb36f72de8a3a8bd7c799b89e7691a83fd39d7645a97e95db91b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernet
es/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:50:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:52:01 crc kubenswrapper[4919]: I0310 21:52:01.004237 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9ed1501-15da-4419-aa12-171e610438d6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9e6a8efa1e2d16b45fe6362b326e3f89333864dc74f3b298d2e500a90d303b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37d8507fd02b92972ed41aa2c4d53fceb1c9d58864e46ddc7991f94fb4d9b3e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://dfb03c5f450790952fc7173bc2a6d723c777921f5f74963bfdbc3573ec1d21cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b2adcae0bf01d646b97b915a28921ad4151ca62e8bdd174b42b5e3dff4b27db\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b2adcae0bf01d646b97b915a28921ad4151ca62e8bdd174b42b5e3dff4b27db\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T21:51:14Z\\\",\\\"message\\\":\\\"file observer\\\\nW0310 21:51:14.180982 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 21:51:14.181120 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 21:51:14.182146 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-915015507/tls.crt::/tmp/serving-cert-915015507/tls.key\\\\\\\"\\\\nI0310 21:51:14.490188 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 21:51:14.496972 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 21:51:14.497009 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 21:51:14.497047 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 21:51:14.497058 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 21:51:14.505682 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 21:51:14.505718 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 21:51:14.505729 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 21:51:14.505740 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0310 21:51:14.505737 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 21:51:14.505748 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 21:51:14.505777 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 21:51:14.505783 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 21:51:14.508219 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T21:51:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47a772db349df6c0c6fe27be93d19e02d66cfaf9739ee12e89730ece1da11473\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15b81f8be7635a10cf516cd17c9ab3d48b74d3421098124fcc47dbab6691e3c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15b81f8be7635a10cf516cd17c9ab3d48b74d3421098124fcc47dbab6691e3c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:50:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:52:01 crc kubenswrapper[4919]: I0310 21:52:01.016207 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:52:01 crc kubenswrapper[4919]: I0310 21:52:01.016253 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:52:01 crc kubenswrapper[4919]: I0310 21:52:01.016270 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:52:01 crc kubenswrapper[4919]: I0310 21:52:01.016295 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:52:01 crc kubenswrapper[4919]: I0310 21:52:01.016314 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:52:01Z","lastTransitionTime":"2026-03-10T21:52:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:52:01 crc kubenswrapper[4919]: I0310 21:52:01.019140 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:52:01 crc kubenswrapper[4919]: I0310 21:52:01.032941 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:52:01 crc kubenswrapper[4919]: I0310 21:52:01.046128 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hzq7c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e92ce303-b70d-4416-b8f1-520b49dca2e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cc1a7ce601001a487303cfae1cef980407a59cc27d02d4f3c4a303b7668639f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qw7c6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hzq7c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:52:01 crc kubenswrapper[4919]: I0310 21:52:01.061004 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:52:01 crc kubenswrapper[4919]: I0310 21:52:01.075980 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:52:01 crc kubenswrapper[4919]: I0310 21:52:01.091965 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:52:01 crc kubenswrapper[4919]: I0310 21:52:01.108536 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-z6pc7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ace27ab-c4c7-412b-9ae8-a3e4ff15faec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5636b870a2c5e13abd72ff1e7c785363d11540245c14af6f5de75a0b400a6aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://d5636b870a2c5e13abd72ff1e7c785363d11540245c14af6f5de75a0b400a6aa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"rea
dy\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host
/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-z6pc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:52:01 crc kubenswrapper[4919]: I0310 21:52:01.118944 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:52:01 crc kubenswrapper[4919]: I0310 21:52:01.118979 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:52:01 crc kubenswrapper[4919]: I0310 21:52:01.118992 4919 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:52:01 crc kubenswrapper[4919]: I0310 21:52:01.119011 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:52:01 crc kubenswrapper[4919]: I0310 21:52:01.119023 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:52:01Z","lastTransitionTime":"2026-03-10T21:52:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 21:52:01 crc kubenswrapper[4919]: I0310 21:52:01.126116 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:52:01 crc kubenswrapper[4919]: I0310 21:52:01.140689 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"566678d1-f416-4116-ab20-b30dceb86cdc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://603ee76064368a216672f45eb860628d301968c311e0bc75b9a73c01f351c9c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrhvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b645dc541f9bef5d9710345252c2ff48e91412f
10d1c0c1bfaa06cf9e82210f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrhvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-z7v4t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:52:01 crc kubenswrapper[4919]: I0310 21:52:01.152861 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hbw8v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a5db7c3-2a96-4030-8c88-5d82d325b62d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abe6d0aa7236ecb1ecf10432a82e6fd0b3103606dbf07a21f54a1908c77ef697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dwtj4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hbw8v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:52:01 crc kubenswrapper[4919]: I0310 21:52:01.162747 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed0693f5-4dbc-4621-9cf6-450d64aaea59\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f338ebc5fc228d07415015c51f7ed4fcc24d5bf76a644e491b5c4b9dc51b71f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5205e89649c7c1e920c967873bb1a404548b539605a180b02a1c249ff499e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5205e89649c7c1e920c967873bb1a404548b539605a180b02a1c249ff499e32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:50:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:52:01 crc kubenswrapper[4919]: I0310 21:52:01.183664 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5163ae00-7b50-497d-9770-0d787026b436\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://322af039c736bcf0b853ee5527ebb1b1750484dfab074745abcd75c24fdcccbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bfa2696bd9b6e5d247686e5297b6ae2f49e5b216174391f211cb2a3a4966135\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c23aaef6f076ab2a428323d38fac48e0c55ad52c55a46c942bccad06474fd31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a88714c27822fd18dff500c973b9d548414d59c7666de938e3cb0c6b18e277e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67063544d268ca488af7ae401113e6f35bb48688e50f944cfa03360de376611a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc6d32c621a9826f8539d4c1806a83f694eaa8874384bd615c2fa1cee02d2e98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc6d32c621a9826f8539d4c1806a83f694eaa8874384bd615c2fa1cee02d2e98\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T21:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8aa3a4aef61b7421411f28beecceab476e43c52eb51ac2a257399b01e4c2bcb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8aa3a4aef61b7421411f28beecceab476e43c52eb51ac2a257399b01e4c2bcb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2126875e4bacb36f72de8a3a8bd7c799b89e7691a83fd39d7645a97e95db91b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2126875e4bacb36f72de8a3a8bd7c799b89e7691a83fd39d7645a97e95db91b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:50:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:52:01 crc kubenswrapper[4919]: I0310 21:52:01.198471 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9ed1501-15da-4419-aa12-171e610438d6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9e6a8efa1e2d16b45fe6362b326e3f89333864dc74f3b298d2e500a90d303b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37d8507fd02b92972ed41aa2c4d53fceb1c9d58864e46ddc7991f94fb4d9b3e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://dfb03c5f450790952fc7173bc2a6d723c777921f5f74963bfdbc3573ec1d21cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b2adcae0bf01d646b97b915a28921ad4151ca62e8bdd174b42b5e3dff4b27db\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b2adcae0bf01d646b97b915a28921ad4151ca62e8bdd174b42b5e3dff4b27db\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T21:51:14Z\\\",\\\"message\\\":\\\"file observer\\\\nW0310 21:51:14.180982 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 21:51:14.181120 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 21:51:14.182146 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-915015507/tls.crt::/tmp/serving-cert-915015507/tls.key\\\\\\\"\\\\nI0310 21:51:14.490188 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 21:51:14.496972 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 21:51:14.497009 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 21:51:14.497047 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 21:51:14.497058 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 21:51:14.505682 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 21:51:14.505718 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 21:51:14.505729 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 21:51:14.505740 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0310 21:51:14.505737 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 21:51:14.505748 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 21:51:14.505777 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 21:51:14.505783 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 21:51:14.508219 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T21:51:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47a772db349df6c0c6fe27be93d19e02d66cfaf9739ee12e89730ece1da11473\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15b81f8be7635a10cf516cd17c9ab3d48b74d3421098124fcc47dbab6691e3c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15b81f8be7635a10cf516cd17c9ab3d48b74d3421098124fcc47dbab6691e3c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:50:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:52:01 crc kubenswrapper[4919]: I0310 21:52:01.209519 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:52:01 crc kubenswrapper[4919]: I0310 21:52:01.219670 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:52:01 crc kubenswrapper[4919]: I0310 21:52:01.223065 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:52:01 crc kubenswrapper[4919]: I0310 21:52:01.223092 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:52:01 crc kubenswrapper[4919]: I0310 
21:52:01.223103 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:52:01 crc kubenswrapper[4919]: I0310 21:52:01.223117 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:52:01 crc kubenswrapper[4919]: I0310 21:52:01.223125 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:52:01Z","lastTransitionTime":"2026-03-10T21:52:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 21:52:01 crc kubenswrapper[4919]: I0310 21:52:01.227678 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hzq7c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e92ce303-b70d-4416-b8f1-520b49dca2e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cc1a7ce601001a487
303cfae1cef980407a59cc27d02d4f3c4a303b7668639f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qw7c6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hzq7c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:52:01 crc kubenswrapper[4919]: I0310 21:52:01.241747 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4dp67" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2e7c6fb-9e33-441d-9197-719929eb9e21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ca4b1de4425e9aa53f9cf24a56759708297b223771278a445dcf171f4dd6493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ca4b1de4425e9aa53f9cf24a56759708297b223771278a445dcf171f4dd6493\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4dp67\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:52:01 crc kubenswrapper[4919]: I0310 21:52:01.252366 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod 
was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:52:01 crc kubenswrapper[4919]: I0310 21:52:01.262076 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:52:01 crc kubenswrapper[4919]: I0310 21:52:01.270638 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:52:01 crc kubenswrapper[4919]: I0310 21:52:01.286272 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-z6pc7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ace27ab-c4c7-412b-9ae8-a3e4ff15faec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5636b870a2c5e13abd72ff1e7c785363d11540245c14af6f5de75a0b400a6aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://d5636b870a2c5e13abd72ff1e7c785363d11540245c14af6f5de75a0b400a6aa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7cd0e8267d82b65679954a7b935e138a06a6634311569a48268ba5917d6cb7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7cd0e8267d82b65679954a7b935e138a06a6634311569a48268ba5917d6cb7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:52:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-z6pc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:52:01 crc 
kubenswrapper[4919]: I0310 21:52:01.297451 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:52:01 crc kubenswrapper[4919]: I0310 21:52:01.307430 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"566678d1-f416-4116-ab20-b30dceb86cdc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://603ee76064368a216672f45eb860628d301968c311e0bc75b9a73c01f351c9c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrhvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b645dc541f9bef5d9710345252c2ff48e91412f
10d1c0c1bfaa06cf9e82210f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrhvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-z7v4t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:52:01 crc kubenswrapper[4919]: I0310 21:52:01.319242 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hbw8v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a5db7c3-2a96-4030-8c88-5d82d325b62d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abe6d0aa7236ecb1ecf10432a82e6fd0b3103606dbf07a21f54a1908c77ef697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dwtj4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hbw8v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:52:01 crc kubenswrapper[4919]: I0310 21:52:01.326504 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:52:01 crc kubenswrapper[4919]: I0310 21:52:01.326536 4919 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:52:01 crc kubenswrapper[4919]: I0310 21:52:01.326546 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:52:01 crc kubenswrapper[4919]: I0310 21:52:01.326561 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:52:01 crc kubenswrapper[4919]: I0310 21:52:01.326571 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:52:01Z","lastTransitionTime":"2026-03-10T21:52:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 21:52:01 crc kubenswrapper[4919]: I0310 21:52:01.428563 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:52:01 crc kubenswrapper[4919]: I0310 21:52:01.428596 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:52:01 crc kubenswrapper[4919]: I0310 21:52:01.428607 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:52:01 crc kubenswrapper[4919]: I0310 21:52:01.428629 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:52:01 crc kubenswrapper[4919]: I0310 21:52:01.428640 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:52:01Z","lastTransitionTime":"2026-03-10T21:52:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 21:52:01 crc kubenswrapper[4919]: I0310 21:52:01.530815 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:52:01 crc kubenswrapper[4919]: I0310 21:52:01.530852 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:52:01 crc kubenswrapper[4919]: I0310 21:52:01.530862 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:52:01 crc kubenswrapper[4919]: I0310 21:52:01.530877 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:52:01 crc kubenswrapper[4919]: I0310 21:52:01.530887 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:52:01Z","lastTransitionTime":"2026-03-10T21:52:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:52:01 crc kubenswrapper[4919]: I0310 21:52:01.608949 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:52:01 crc kubenswrapper[4919]: I0310 21:52:01.608994 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:52:01 crc kubenswrapper[4919]: I0310 21:52:01.609005 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:52:01 crc kubenswrapper[4919]: I0310 21:52:01.609022 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:52:01 crc kubenswrapper[4919]: I0310 21:52:01.609032 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:52:01Z","lastTransitionTime":"2026-03-10T21:52:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:52:01 crc kubenswrapper[4919]: E0310 21:52:01.618662 4919 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:52:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:52:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:52:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:52:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c22d31cd-a51d-4524-bb69-0b454ae09e98\\\",\\\"systemUUID\\\":\\\"eb24d1fd-ecd7-423c-90f7-cacacceb5386\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:52:01 crc kubenswrapper[4919]: I0310 21:52:01.621617 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:52:01 crc kubenswrapper[4919]: I0310 21:52:01.621647 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:52:01 crc kubenswrapper[4919]: I0310 21:52:01.621655 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:52:01 crc kubenswrapper[4919]: I0310 21:52:01.621672 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:52:01 crc kubenswrapper[4919]: I0310 21:52:01.621682 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:52:01Z","lastTransitionTime":"2026-03-10T21:52:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:52:01 crc kubenswrapper[4919]: E0310 21:52:01.629511 4919 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:52:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:52:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:52:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:52:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c22d31cd-a51d-4524-bb69-0b454ae09e98\\\",\\\"systemUUID\\\":\\\"eb24d1fd-ecd7-423c-90f7-cacacceb5386\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:52:01 crc kubenswrapper[4919]: I0310 21:52:01.632468 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:52:01 crc kubenswrapper[4919]: I0310 21:52:01.632496 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:52:01 crc kubenswrapper[4919]: I0310 21:52:01.632504 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:52:01 crc kubenswrapper[4919]: I0310 21:52:01.632517 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:52:01 crc kubenswrapper[4919]: I0310 21:52:01.632526 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:52:01Z","lastTransitionTime":"2026-03-10T21:52:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:52:01 crc kubenswrapper[4919]: E0310 21:52:01.640548 4919 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:52:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:52:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:52:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:52:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c22d31cd-a51d-4524-bb69-0b454ae09e98\\\",\\\"systemUUID\\\":\\\"eb24d1fd-ecd7-423c-90f7-cacacceb5386\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:52:01 crc kubenswrapper[4919]: I0310 21:52:01.643849 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:52:01 crc kubenswrapper[4919]: I0310 21:52:01.643897 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:52:01 crc kubenswrapper[4919]: I0310 21:52:01.643914 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:52:01 crc kubenswrapper[4919]: I0310 21:52:01.643939 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:52:01 crc kubenswrapper[4919]: I0310 21:52:01.643957 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:52:01Z","lastTransitionTime":"2026-03-10T21:52:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:52:01 crc kubenswrapper[4919]: E0310 21:52:01.653868 4919 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:52:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:52:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:52:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:52:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c22d31cd-a51d-4524-bb69-0b454ae09e98\\\",\\\"systemUUID\\\":\\\"eb24d1fd-ecd7-423c-90f7-cacacceb5386\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:52:01 crc kubenswrapper[4919]: I0310 21:52:01.657210 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:52:01 crc kubenswrapper[4919]: I0310 21:52:01.657245 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:52:01 crc kubenswrapper[4919]: I0310 21:52:01.657258 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:52:01 crc kubenswrapper[4919]: I0310 21:52:01.657274 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:52:01 crc kubenswrapper[4919]: I0310 21:52:01.657286 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:52:01Z","lastTransitionTime":"2026-03-10T21:52:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:52:01 crc kubenswrapper[4919]: E0310 21:52:01.666579 4919 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:52:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:52:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:52:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:52:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c22d31cd-a51d-4524-bb69-0b454ae09e98\\\",\\\"systemUUID\\\":\\\"eb24d1fd-ecd7-423c-90f7-cacacceb5386\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 10 21:52:01 crc kubenswrapper[4919]: E0310 21:52:01.666706 4919 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count"
Mar 10 21:52:01 crc kubenswrapper[4919]: I0310 21:52:01.668272 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 21:52:01 crc kubenswrapper[4919]: I0310 21:52:01.668309 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 21:52:01 crc kubenswrapper[4919]: I0310 21:52:01.668321 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 21:52:01 crc kubenswrapper[4919]: I0310 21:52:01.668338 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 21:52:01 crc kubenswrapper[4919]: I0310 21:52:01.668359 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:52:01Z","lastTransitionTime":"2026-03-10T21:52:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 10 21:52:01 crc kubenswrapper[4919]: I0310 21:52:01.771013 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 21:52:01 crc kubenswrapper[4919]: I0310 21:52:01.771075 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 21:52:01 crc kubenswrapper[4919]: I0310 21:52:01.771093 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 21:52:01 crc kubenswrapper[4919]: I0310 21:52:01.771117 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 21:52:01 crc kubenswrapper[4919]: I0310 21:52:01.771135 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:52:01Z","lastTransitionTime":"2026-03-10T21:52:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 10 21:52:01 crc kubenswrapper[4919]: I0310 21:52:01.874732 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 21:52:01 crc kubenswrapper[4919]: I0310 21:52:01.874844 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 21:52:01 crc kubenswrapper[4919]: I0310 21:52:01.874874 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 21:52:01 crc kubenswrapper[4919]: I0310 21:52:01.874906 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 21:52:01 crc kubenswrapper[4919]: I0310 21:52:01.874925 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:52:01Z","lastTransitionTime":"2026-03-10T21:52:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 10 21:52:01 crc kubenswrapper[4919]: I0310 21:52:01.910079 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4dp67" event={"ID":"a2e7c6fb-9e33-441d-9197-719929eb9e21","Type":"ContainerStarted","Data":"4dd430eed1c68d77dcea63fe5885eec7aae3fee6a0008bd9e145b93aef04aa26"}
Mar 10 21:52:01 crc kubenswrapper[4919]: I0310 21:52:01.910169 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4dp67" event={"ID":"a2e7c6fb-9e33-441d-9197-719929eb9e21","Type":"ContainerStarted","Data":"648a881496440d2732bd49969e863fcde223e95156c280e7699782a5521f1580"}
Mar 10 21:52:01 crc kubenswrapper[4919]: I0310 21:52:01.910200 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4dp67" event={"ID":"a2e7c6fb-9e33-441d-9197-719929eb9e21","Type":"ContainerStarted","Data":"c1a8f975d3caae031d32429c7fdf67356ff18bae1f0eefa3b99758d6c461a736"}
Mar 10 21:52:01 crc kubenswrapper[4919]: I0310 21:52:01.910222 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4dp67" event={"ID":"a2e7c6fb-9e33-441d-9197-719929eb9e21","Type":"ContainerStarted","Data":"ad5abb4f4c3ead1b05c1f3efbb6bb59fc71bcf5796e16a558ef7e16375571fb0"}
Mar 10 21:52:01 crc kubenswrapper[4919]: I0310 21:52:01.910245 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4dp67" event={"ID":"a2e7c6fb-9e33-441d-9197-719929eb9e21","Type":"ContainerStarted","Data":"06e9ce48fc10caa34e8b774ce3039117386ffe276cf268d197ac46518a4baf91"}
Mar 10 21:52:01 crc kubenswrapper[4919]: I0310 21:52:01.910268 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4dp67" event={"ID":"a2e7c6fb-9e33-441d-9197-719929eb9e21","Type":"ContainerStarted","Data":"a9fa1c50eff4c6bb37ea60631a4278c6a15d6398d6fad6fb1a914bd6cccf15b2"}
Mar 10 21:52:01 crc kubenswrapper[4919]: 
I0310 21:52:01.913250 4919 generic.go:334] "Generic (PLEG): container finished" podID="2ace27ab-c4c7-412b-9ae8-a3e4ff15faec" containerID="d4b4f2cd55ec546c6f44490159ec5b303d67b584fce7afb4b6fe3dd93e95d843" exitCode=0 Mar 10 21:52:01 crc kubenswrapper[4919]: I0310 21:52:01.913318 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-z6pc7" event={"ID":"2ace27ab-c4c7-412b-9ae8-a3e4ff15faec","Type":"ContainerDied","Data":"d4b4f2cd55ec546c6f44490159ec5b303d67b584fce7afb4b6fe3dd93e95d843"} Mar 10 21:52:01 crc kubenswrapper[4919]: I0310 21:52:01.932316 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-z6pc7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ace27ab-c4c7-412b-9ae8-a3e4ff15faec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5636b870a2c5e13abd72ff1e7c785363d11540245c14af6f5de75a0b400a6aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5636b870a2c5e13abd72ff1e7c785363d11540245c14af6f5de75a0b400a6aa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7cd0e8267d82b65679954a7b935e138a06a6634311569a48268ba5917d6cb7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7cd0e8267d82b65679954a7b935e138a06a6634311569a48268ba5917d6cb7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:52:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4b4f2cd55ec546c6f44490159ec5b303d67b584fce7afb4b6fe3dd93e95d843\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4b4f2cd55ec546c6f44490159ec5b303d67b584fce7afb4b6fe3dd93e95d843\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:52:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-z6pc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:52:01 crc kubenswrapper[4919]: I0310 21:52:01.945927 4919 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:52:01 crc kubenswrapper[4919]: I0310 21:52:01.958239 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:52:01 crc kubenswrapper[4919]: I0310 21:52:01.969935 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hbw8v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a5db7c3-2a96-4030-8c88-5d82d325b62d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abe6d0aa7236ecb1ecf10432a82e6fd0b3103606dbf07a21f54a1908c77ef697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dwtj4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hbw8v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:52:01 crc kubenswrapper[4919]: I0310 21:52:01.976874 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:52:01 crc kubenswrapper[4919]: I0310 21:52:01.976918 4919 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:52:01 crc kubenswrapper[4919]: I0310 21:52:01.976927 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:52:01 crc kubenswrapper[4919]: I0310 21:52:01.976941 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:52:01 crc kubenswrapper[4919]: I0310 21:52:01.976950 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:52:01Z","lastTransitionTime":"2026-03-10T21:52:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 21:52:01 crc kubenswrapper[4919]: I0310 21:52:01.988290 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:52:01 crc kubenswrapper[4919]: I0310 21:52:01.996465 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"566678d1-f416-4116-ab20-b30dceb86cdc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://603ee76064368a216672f45eb860628d301968c311e0bc75b9a73c01f351c9c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrhvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b645dc541f9bef5d9710345252c2ff48e91412f
10d1c0c1bfaa06cf9e82210f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrhvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-z7v4t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:52:02 crc kubenswrapper[4919]: I0310 21:52:02.006523 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:52:02 crc kubenswrapper[4919]: I0310 21:52:02.017819 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:52:02 crc kubenswrapper[4919]: I0310 21:52:02.025953 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hzq7c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e92ce303-b70d-4416-b8f1-520b49dca2e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cc1a7ce601001a487303cfae1cef980407a59cc27d02d4f3c4a303b7668639f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qw7c6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hzq7c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:52:02 crc kubenswrapper[4919]: I0310 21:52:02.050552 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4dp67" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2e7c6fb-9e33-441d-9197-719929eb9e21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ca4b1de4425e9aa53f9cf24a56759708297b223771278a445dcf171f4dd6493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ca4b1de4425e9aa53f9cf24a56759708297b223771278a445dcf171f4dd6493\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4dp67\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:52:02 crc kubenswrapper[4919]: I0310 21:52:02.059746 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed0693f5-4dbc-4621-9cf6-450d64aaea59\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f338ebc5fc228d07415015c51f7ed4fcc24d5bf76a644e491b5c4b9dc51b71f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5205e89649c7c1e920c967873bb1a404548b539605a180b02a1c249ff499e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5205e89649c7c1e920c967873bb1a404548b539605a180b02a1c249ff499e32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:50:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:52:02 crc kubenswrapper[4919]: I0310 
21:52:02.076085 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5163ae00-7b50-497d-9770-0d787026b436\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://322af039c736bcf0b853ee5527ebb1b1750484dfab074745abcd75c24fdcccbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\
"cri-o://8bfa2696bd9b6e5d247686e5297b6ae2f49e5b216174391f211cb2a3a4966135\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c23aaef6f076ab2a428323d38fac48e0c55ad52c55a46c942bccad06474fd31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a88714c27822fd18dff500c973b9d548414d59c7666de938e3cb0c6b18e277e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93
\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67063544d268ca488af7ae401113e6f35bb48688e50f944cfa03360de376611a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc6d32c621a9826f8539d4c1806a83f694eaa8874384bd615c2fa1cee02d2e98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":
{\\\"containerID\\\":\\\"cri-o://cc6d32c621a9826f8539d4c1806a83f694eaa8874384bd615c2fa1cee02d2e98\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8aa3a4aef61b7421411f28beecceab476e43c52eb51ac2a257399b01e4c2bcb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8aa3a4aef61b7421411f28beecceab476e43c52eb51ac2a257399b01e4c2bcb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2126875e4bacb36f72de8a3a8bd7c799b89e7691a83fd39d7645a97e95db91b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2126875e4bacb36f72de8a3a8bd7c799b89e7691a83fd39d7645a97e95db91b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernet
es/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:50:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:52:02 crc kubenswrapper[4919]: I0310 21:52:02.078653 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:52:02 crc kubenswrapper[4919]: I0310 21:52:02.078680 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:52:02 crc kubenswrapper[4919]: I0310 21:52:02.078689 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:52:02 crc kubenswrapper[4919]: I0310 21:52:02.078701 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:52:02 crc kubenswrapper[4919]: I0310 21:52:02.078712 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:52:02Z","lastTransitionTime":"2026-03-10T21:52:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:52:02 crc kubenswrapper[4919]: I0310 21:52:02.093923 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9ed1501-15da-4419-aa12-171e610438d6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9e6a8efa1e2d16b45fe6362b326e3f89333864dc74f3b298d2e500a90d303b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37d8507fd02b92972ed41aa2c4d53fceb1c9d58864e46ddc7991f94fb4d9b3e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://dfb03c5f450790952fc7173bc2a6d723c777921f5f74963bfdbc3573ec1d21cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b2adcae0bf01d646b97b915a28921ad4151ca62e8bdd174b42b5e3dff4b27db\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b2adcae0bf01d646b97b915a28921ad4151ca62e8bdd174b42b5e3dff4b27db\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T21:51:14Z\\\",\\\"message\\\":\\\"file observer\\\\nW0310 21:51:14.180982 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 21:51:14.181120 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 21:51:14.182146 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-915015507/tls.crt::/tmp/serving-cert-915015507/tls.key\\\\\\\"\\\\nI0310 21:51:14.490188 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 21:51:14.496972 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 21:51:14.497009 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 21:51:14.497047 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 21:51:14.497058 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 21:51:14.505682 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 21:51:14.505718 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 21:51:14.505729 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 21:51:14.505740 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0310 21:51:14.505737 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 21:51:14.505748 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 21:51:14.505777 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 21:51:14.505783 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 21:51:14.508219 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T21:51:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47a772db349df6c0c6fe27be93d19e02d66cfaf9739ee12e89730ece1da11473\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15b81f8be7635a10cf516cd17c9ab3d48b74d3421098124fcc47dbab6691e3c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15b81f8be7635a10cf516cd17c9ab3d48b74d3421098124fcc47dbab6691e3c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:50:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:52:02 crc kubenswrapper[4919]: I0310 21:52:02.105758 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:52:02 crc kubenswrapper[4919]: I0310 21:52:02.181580 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:52:02 crc kubenswrapper[4919]: I0310 21:52:02.181671 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:52:02 crc kubenswrapper[4919]: I0310 21:52:02.181697 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:52:02 crc kubenswrapper[4919]: I0310 21:52:02.181730 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:52:02 crc kubenswrapper[4919]: I0310 21:52:02.181758 4919 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:52:02Z","lastTransitionTime":"2026-03-10T21:52:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 21:52:02 crc kubenswrapper[4919]: I0310 21:52:02.285464 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:52:02 crc kubenswrapper[4919]: I0310 21:52:02.285523 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:52:02 crc kubenswrapper[4919]: I0310 21:52:02.285539 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:52:02 crc kubenswrapper[4919]: I0310 21:52:02.285561 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:52:02 crc kubenswrapper[4919]: I0310 21:52:02.285578 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:52:02Z","lastTransitionTime":"2026-03-10T21:52:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:52:02 crc kubenswrapper[4919]: I0310 21:52:02.389071 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:52:02 crc kubenswrapper[4919]: I0310 21:52:02.389131 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:52:02 crc kubenswrapper[4919]: I0310 21:52:02.389149 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:52:02 crc kubenswrapper[4919]: I0310 21:52:02.389172 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:52:02 crc kubenswrapper[4919]: I0310 21:52:02.389191 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:52:02Z","lastTransitionTime":"2026-03-10T21:52:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 21:52:02 crc kubenswrapper[4919]: I0310 21:52:02.484183 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 21:52:02 crc kubenswrapper[4919]: I0310 21:52:02.484240 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 21:52:02 crc kubenswrapper[4919]: E0310 21:52:02.484325 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 21:52:02 crc kubenswrapper[4919]: E0310 21:52:02.484499 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 21:52:02 crc kubenswrapper[4919]: I0310 21:52:02.484637 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 21:52:02 crc kubenswrapper[4919]: E0310 21:52:02.484784 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 21:52:02 crc kubenswrapper[4919]: I0310 21:52:02.494106 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:52:02 crc kubenswrapper[4919]: I0310 21:52:02.494156 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:52:02 crc kubenswrapper[4919]: I0310 21:52:02.494175 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:52:02 crc kubenswrapper[4919]: I0310 21:52:02.494199 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:52:02 crc kubenswrapper[4919]: I0310 21:52:02.494216 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:52:02Z","lastTransitionTime":"2026-03-10T21:52:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:52:02 crc kubenswrapper[4919]: I0310 21:52:02.596872 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:52:02 crc kubenswrapper[4919]: I0310 21:52:02.596915 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:52:02 crc kubenswrapper[4919]: I0310 21:52:02.596929 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:52:02 crc kubenswrapper[4919]: I0310 21:52:02.596945 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:52:02 crc kubenswrapper[4919]: I0310 21:52:02.596956 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:52:02Z","lastTransitionTime":"2026-03-10T21:52:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:52:02 crc kubenswrapper[4919]: I0310 21:52:02.699591 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:52:02 crc kubenswrapper[4919]: I0310 21:52:02.699661 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:52:02 crc kubenswrapper[4919]: I0310 21:52:02.699683 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:52:02 crc kubenswrapper[4919]: I0310 21:52:02.699720 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:52:02 crc kubenswrapper[4919]: I0310 21:52:02.699745 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:52:02Z","lastTransitionTime":"2026-03-10T21:52:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:52:02 crc kubenswrapper[4919]: I0310 21:52:02.802597 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:52:02 crc kubenswrapper[4919]: I0310 21:52:02.802660 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:52:02 crc kubenswrapper[4919]: I0310 21:52:02.802677 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:52:02 crc kubenswrapper[4919]: I0310 21:52:02.802705 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:52:02 crc kubenswrapper[4919]: I0310 21:52:02.802721 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:52:02Z","lastTransitionTime":"2026-03-10T21:52:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:52:02 crc kubenswrapper[4919]: I0310 21:52:02.906069 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:52:02 crc kubenswrapper[4919]: I0310 21:52:02.906433 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:52:02 crc kubenswrapper[4919]: I0310 21:52:02.906634 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:52:02 crc kubenswrapper[4919]: I0310 21:52:02.906790 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:52:02 crc kubenswrapper[4919]: I0310 21:52:02.906934 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:52:02Z","lastTransitionTime":"2026-03-10T21:52:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:52:02 crc kubenswrapper[4919]: I0310 21:52:02.919981 4919 generic.go:334] "Generic (PLEG): container finished" podID="2ace27ab-c4c7-412b-9ae8-a3e4ff15faec" containerID="cebcf235b45b59a23c89f44855f381557a49fc94ec9332c699274507cda4cd12" exitCode=0 Mar 10 21:52:02 crc kubenswrapper[4919]: I0310 21:52:02.920051 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-z6pc7" event={"ID":"2ace27ab-c4c7-412b-9ae8-a3e4ff15faec","Type":"ContainerDied","Data":"cebcf235b45b59a23c89f44855f381557a49fc94ec9332c699274507cda4cd12"} Mar 10 21:52:02 crc kubenswrapper[4919]: I0310 21:52:02.937510 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:52:02 crc kubenswrapper[4919]: I0310 21:52:02.952775 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:52:02 crc kubenswrapper[4919]: I0310 21:52:02.972040 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-z6pc7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ace27ab-c4c7-412b-9ae8-a3e4ff15faec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5636b870a2c5e13abd72ff1e7c785363d11540245c14af6f5de75a0b400a6aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://d5636b870a2c5e13abd72ff1e7c785363d11540245c14af6f5de75a0b400a6aa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7cd0e8267d82b65679954a7b935e138a06a6634311569a48268ba5917d6cb7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7cd0e8267d82b65679954a7b935e138a06a6634311569a48268ba5917d6cb7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:52:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4b4f2cd55ec546c6f44490159ec5b303d67b584fce7afb4b6fe3dd93e95d843\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4b4f2cd55ec546c6f44490159ec5b303d67b584fce7afb4b6fe3dd93e95d843\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:52:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cebcf235b45b59a23c89f44855f381557a49fc94ec9332c699274507cda4cd12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cebcf235b45b59a23c89f44855f381557a49fc94ec9332c699274507cda4cd12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:52:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\
\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-z6pc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:52:02 crc kubenswrapper[4919]: I0310 21:52:02.984586 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:52:02 crc kubenswrapper[4919]: I0310 21:52:02.998702 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"566678d1-f416-4116-ab20-b30dceb86cdc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://603ee76064368a216672f45eb860628d301968c311e0bc75b9a73c01f351c9c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrhvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b645dc541f9bef5d9710345252c2ff48e91412f
10d1c0c1bfaa06cf9e82210f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrhvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-z7v4t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:52:03 crc kubenswrapper[4919]: I0310 21:52:03.009941 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:52:03 crc kubenswrapper[4919]: I0310 21:52:03.009981 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:52:03 crc kubenswrapper[4919]: I0310 21:52:03.009992 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:52:03 crc kubenswrapper[4919]: I0310 21:52:03.010010 4919 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeNotReady" Mar 10 21:52:03 crc kubenswrapper[4919]: I0310 21:52:03.010022 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:52:03Z","lastTransitionTime":"2026-03-10T21:52:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 21:52:03 crc kubenswrapper[4919]: I0310 21:52:03.014346 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hbw8v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a5db7c3-2a96-4030-8c88-5d82d325b62d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abe6d0aa7236ecb1ecf10432a82e6fd0b3103606dbf07a21f54a1908c77ef697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dwtj4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}]
,\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hbw8v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:52:03 crc kubenswrapper[4919]: I0310 21:52:03.027847 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:52:03 crc kubenswrapper[4919]: I0310 21:52:03.036245 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hzq7c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e92ce303-b70d-4416-b8f1-520b49dca2e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cc1a7ce601001a487303cfae1cef980407a59cc27d02d4f3c4a303b7668639f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qw7c6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hzq7c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:52:03 crc kubenswrapper[4919]: I0310 21:52:03.053604 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4dp67" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2e7c6fb-9e33-441d-9197-719929eb9e21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ca4b1de4425e9aa53f9cf24a56759708297b223771278a445dcf171f4dd6493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ca4b1de4425e9aa53f9cf24a56759708297b223771278a445dcf171f4dd6493\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4dp67\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:52:03 crc kubenswrapper[4919]: I0310 21:52:03.063566 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed0693f5-4dbc-4621-9cf6-450d64aaea59\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f338ebc5fc228d07415015c51f7ed4fcc24d5bf76a644e491b5c4b9dc51b71f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5205e89649c7c1e920c967873bb1a404548b539605a180b02a1c249ff499e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5205e89649c7c1e920c967873bb1a404548b539605a180b02a1c249ff499e32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:50:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:52:03 crc kubenswrapper[4919]: I0310 
21:52:03.078342 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5163ae00-7b50-497d-9770-0d787026b436\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://322af039c736bcf0b853ee5527ebb1b1750484dfab074745abcd75c24fdcccbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\
"cri-o://8bfa2696bd9b6e5d247686e5297b6ae2f49e5b216174391f211cb2a3a4966135\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c23aaef6f076ab2a428323d38fac48e0c55ad52c55a46c942bccad06474fd31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a88714c27822fd18dff500c973b9d548414d59c7666de938e3cb0c6b18e277e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93
\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67063544d268ca488af7ae401113e6f35bb48688e50f944cfa03360de376611a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc6d32c621a9826f8539d4c1806a83f694eaa8874384bd615c2fa1cee02d2e98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":
{\\\"containerID\\\":\\\"cri-o://cc6d32c621a9826f8539d4c1806a83f694eaa8874384bd615c2fa1cee02d2e98\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8aa3a4aef61b7421411f28beecceab476e43c52eb51ac2a257399b01e4c2bcb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8aa3a4aef61b7421411f28beecceab476e43c52eb51ac2a257399b01e4c2bcb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2126875e4bacb36f72de8a3a8bd7c799b89e7691a83fd39d7645a97e95db91b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2126875e4bacb36f72de8a3a8bd7c799b89e7691a83fd39d7645a97e95db91b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernet
es/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:50:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:52:03 crc kubenswrapper[4919]: I0310 21:52:03.090797 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9ed1501-15da-4419-aa12-171e610438d6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9e6a8efa1e2d16b45fe6362b326e3f89333864dc74f3b298d2e500a90d303b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37d8507fd02b92972ed41aa2c4d53fceb1c9d58864e46ddc7991f94fb4d9b3e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://dfb03c5f450790952fc7173bc2a6d723c777921f5f74963bfdbc3573ec1d21cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b2adcae0bf01d646b97b915a28921ad4151ca62e8bdd174b42b5e3dff4b27db\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b2adcae0bf01d646b97b915a28921ad4151ca62e8bdd174b42b5e3dff4b27db\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T21:51:14Z\\\",\\\"message\\\":\\\"file observer\\\\nW0310 21:51:14.180982 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 21:51:14.181120 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 21:51:14.182146 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-915015507/tls.crt::/tmp/serving-cert-915015507/tls.key\\\\\\\"\\\\nI0310 21:51:14.490188 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 21:51:14.496972 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 21:51:14.497009 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 21:51:14.497047 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 21:51:14.497058 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 21:51:14.505682 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 21:51:14.505718 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 21:51:14.505729 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 21:51:14.505740 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0310 21:51:14.505737 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 21:51:14.505748 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 21:51:14.505777 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 21:51:14.505783 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 21:51:14.508219 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T21:51:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47a772db349df6c0c6fe27be93d19e02d66cfaf9739ee12e89730ece1da11473\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15b81f8be7635a10cf516cd17c9ab3d48b74d3421098124fcc47dbab6691e3c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15b81f8be7635a10cf516cd17c9ab3d48b74d3421098124fcc47dbab6691e3c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:50:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:52:03 crc kubenswrapper[4919]: I0310 21:52:03.100830 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:52:03 crc kubenswrapper[4919]: I0310 21:52:03.112540 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:52:03 crc kubenswrapper[4919]: I0310 21:52:03.113474 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:52:03 crc kubenswrapper[4919]: I0310 21:52:03.113498 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:52:03 crc 
kubenswrapper[4919]: I0310 21:52:03.113507 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:52:03 crc kubenswrapper[4919]: I0310 21:52:03.113520 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:52:03 crc kubenswrapper[4919]: I0310 21:52:03.113529 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:52:03Z","lastTransitionTime":"2026-03-10T21:52:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 21:52:03 crc kubenswrapper[4919]: I0310 21:52:03.215168 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:52:03 crc kubenswrapper[4919]: I0310 21:52:03.215196 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:52:03 crc kubenswrapper[4919]: I0310 21:52:03.215206 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:52:03 crc kubenswrapper[4919]: I0310 21:52:03.215221 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:52:03 crc kubenswrapper[4919]: I0310 21:52:03.215231 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:52:03Z","lastTransitionTime":"2026-03-10T21:52:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:52:03 crc kubenswrapper[4919]: I0310 21:52:03.317990 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:52:03 crc kubenswrapper[4919]: I0310 21:52:03.318036 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:52:03 crc kubenswrapper[4919]: I0310 21:52:03.318048 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:52:03 crc kubenswrapper[4919]: I0310 21:52:03.318064 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:52:03 crc kubenswrapper[4919]: I0310 21:52:03.318073 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:52:03Z","lastTransitionTime":"2026-03-10T21:52:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:52:03 crc kubenswrapper[4919]: I0310 21:52:03.420899 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:52:03 crc kubenswrapper[4919]: I0310 21:52:03.420942 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:52:03 crc kubenswrapper[4919]: I0310 21:52:03.420954 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:52:03 crc kubenswrapper[4919]: I0310 21:52:03.420975 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:52:03 crc kubenswrapper[4919]: I0310 21:52:03.420988 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:52:03Z","lastTransitionTime":"2026-03-10T21:52:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:52:03 crc kubenswrapper[4919]: I0310 21:52:03.494374 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:52:03 crc kubenswrapper[4919]: I0310 21:52:03.506565 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"566678d1-f416-4116-ab20-b30dceb86cdc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://603ee76064368a216672f45eb860628d301968c311e0bc75b9a73c01f351c9c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrhvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b645dc541f9bef5d9710345252c2ff48e91412f
10d1c0c1bfaa06cf9e82210f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrhvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-z7v4t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:52:03 crc kubenswrapper[4919]: I0310 21:52:03.516442 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hbw8v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a5db7c3-2a96-4030-8c88-5d82d325b62d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abe6d0aa7236ecb1ecf10432a82e6fd0b3103606dbf07a21f54a1908c77ef697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dwtj4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hbw8v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:52:03 crc kubenswrapper[4919]: I0310 21:52:03.523933 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:52:03 crc kubenswrapper[4919]: I0310 21:52:03.523979 4919 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:52:03 crc kubenswrapper[4919]: I0310 21:52:03.523992 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:52:03 crc kubenswrapper[4919]: I0310 21:52:03.524009 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:52:03 crc kubenswrapper[4919]: I0310 21:52:03.524022 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:52:03Z","lastTransitionTime":"2026-03-10T21:52:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 21:52:03 crc kubenswrapper[4919]: I0310 21:52:03.529870 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed0693f5-4dbc-4621-9cf6-450d64aaea59\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f338ebc5fc228d07415015c51f7ed4fcc24d5bf76a644e491b5c4b9dc51b71f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5205e89649c7c1e920c967873bb1a404548b539605a180b02a1c249ff499e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5205e89649c7c1e920c967873bb1a404548b539605a180b02a1c249ff499e32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:50:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:52:03 crc kubenswrapper[4919]: I0310 21:52:03.552840 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5163ae00-7b50-497d-9770-0d787026b436\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://322af039c736bcf0b853ee5527ebb1b1750484dfab074745abcd75c24fdcccbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bfa2696bd9b6e5d247686e5297b6ae2f49e5b216174391f211cb2a3a4966135\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c23aaef6f076ab2a428323d38fac48e0c55ad52c55a46c942bccad06474fd31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a88714c27822fd18dff500c973b9d548414d59c7666de938e3cb0c6b18e277e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67063544d268ca488af7ae401113e6f35bb48688e50f944cfa03360de376611a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc6d32c621a9826f8539d4c1806a83f694eaa8874384bd615c2fa1cee02d2e98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc6d32c621a9826f8539d4c1806a83f694eaa8874384bd615c2fa1cee02d2e98\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T21:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8aa3a4aef61b7421411f28beecceab476e43c52eb51ac2a257399b01e4c2bcb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8aa3a4aef61b7421411f28beecceab476e43c52eb51ac2a257399b01e4c2bcb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2126875e4bacb36f72de8a3a8bd7c799b89e7691a83fd39d7645a97e95db91b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2126875e4bacb36f72de8a3a8bd7c799b89e7691a83fd39d7645a97e95db91b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:50:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:52:03 crc kubenswrapper[4919]: I0310 21:52:03.564258 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9ed1501-15da-4419-aa12-171e610438d6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9e6a8efa1e2d16b45fe6362b326e3f89333864dc74f3b298d2e500a90d303b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37d8507fd02b92972ed41aa2c4d53fceb1c9d58864e46ddc7991f94fb4d9b3e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://dfb03c5f450790952fc7173bc2a6d723c777921f5f74963bfdbc3573ec1d21cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b2adcae0bf01d646b97b915a28921ad4151ca62e8bdd174b42b5e3dff4b27db\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b2adcae0bf01d646b97b915a28921ad4151ca62e8bdd174b42b5e3dff4b27db\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T21:51:14Z\\\",\\\"message\\\":\\\"file observer\\\\nW0310 21:51:14.180982 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 21:51:14.181120 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 21:51:14.182146 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-915015507/tls.crt::/tmp/serving-cert-915015507/tls.key\\\\\\\"\\\\nI0310 21:51:14.490188 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 21:51:14.496972 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 21:51:14.497009 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 21:51:14.497047 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 21:51:14.497058 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 21:51:14.505682 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 21:51:14.505718 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 21:51:14.505729 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 21:51:14.505740 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0310 21:51:14.505737 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 21:51:14.505748 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 21:51:14.505777 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 21:51:14.505783 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 21:51:14.508219 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T21:51:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47a772db349df6c0c6fe27be93d19e02d66cfaf9739ee12e89730ece1da11473\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15b81f8be7635a10cf516cd17c9ab3d48b74d3421098124fcc47dbab6691e3c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15b81f8be7635a10cf516cd17c9ab3d48b74d3421098124fcc47dbab6691e3c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:50:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:52:03 crc kubenswrapper[4919]: I0310 21:52:03.574337 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:52:03 crc kubenswrapper[4919]: I0310 21:52:03.583586 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:52:03 crc kubenswrapper[4919]: I0310 21:52:03.590753 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hzq7c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e92ce303-b70d-4416-b8f1-520b49dca2e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cc1a7ce601001a487303cfae1cef980407a59cc27d02d4f3c4a303b7668639f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qw7c6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hzq7c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:52:03 crc kubenswrapper[4919]: I0310 21:52:03.604474 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4dp67" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2e7c6fb-9e33-441d-9197-719929eb9e21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ca4b1de4425e9aa53f9cf24a56759708297b223771278a445dcf171f4dd6493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ca4b1de4425e9aa53f9cf24a56759708297b223771278a445dcf171f4dd6493\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4dp67\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:52:03 crc kubenswrapper[4919]: I0310 21:52:03.615639 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod 
was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:52:03 crc kubenswrapper[4919]: I0310 21:52:03.626226 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:52:03 crc kubenswrapper[4919]: I0310 21:52:03.626265 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:52:03 crc kubenswrapper[4919]: I0310 21:52:03.626279 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:52:03 crc kubenswrapper[4919]: I0310 21:52:03.626299 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:52:03 crc kubenswrapper[4919]: I0310 21:52:03.626301 4919 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:52:03 crc kubenswrapper[4919]: I0310 21:52:03.626311 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:52:03Z","lastTransitionTime":"2026-03-10T21:52:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:52:03 crc kubenswrapper[4919]: I0310 21:52:03.635802 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:52:03 crc kubenswrapper[4919]: I0310 21:52:03.646141 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-z6pc7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ace27ab-c4c7-412b-9ae8-a3e4ff15faec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5636b870a2c5e13abd72ff1e7c785363d11540245c14af6f5de75a0b400a6aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://d5636b870a2c5e13abd72ff1e7c785363d11540245c14af6f5de75a0b400a6aa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7cd0e8267d82b65679954a7b935e138a06a6634311569a48268ba5917d6cb7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7cd0e8267d82b65679954a7b935e138a06a6634311569a48268ba5917d6cb7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:52:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4b4f2cd55ec546c6f44490159ec5b303d67b584fce7afb4b6fe3dd93e95d843\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4b4f2cd55ec546c6f44490159ec5b303d67b584fce7afb4b6fe3dd93e95d843\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:52:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cebcf235b45b59a23c89f44855f381557a49fc94ec9332c699274507cda4cd12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cebcf235b45b59a23c89f44855f381557a49fc94ec9332c699274507cda4cd12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:52:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\
\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-z6pc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:52:03 crc kubenswrapper[4919]: E0310 21:52:03.670325 4919 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2ace27ab_c4c7_412b_9ae8_a3e4ff15faec.slice/crio-conmon-5b01604ec0336bae6aa21d9b124596bb46f6351b67c413f2fade87febd0887a6.scope\": RecentStats: unable to find data in memory cache]" Mar 10 21:52:03 crc kubenswrapper[4919]: I0310 21:52:03.728260 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:52:03 crc kubenswrapper[4919]: I0310 21:52:03.728297 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:52:03 crc kubenswrapper[4919]: I0310 21:52:03.728306 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:52:03 crc kubenswrapper[4919]: I0310 21:52:03.728320 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:52:03 crc kubenswrapper[4919]: I0310 21:52:03.728330 4919 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:52:03Z","lastTransitionTime":"2026-03-10T21:52:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 21:52:03 crc kubenswrapper[4919]: I0310 21:52:03.830425 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:52:03 crc kubenswrapper[4919]: I0310 21:52:03.830465 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:52:03 crc kubenswrapper[4919]: I0310 21:52:03.830474 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:52:03 crc kubenswrapper[4919]: I0310 21:52:03.830489 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:52:03 crc kubenswrapper[4919]: I0310 21:52:03.830500 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:52:03Z","lastTransitionTime":"2026-03-10T21:52:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:52:03 crc kubenswrapper[4919]: I0310 21:52:03.927195 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4dp67" event={"ID":"a2e7c6fb-9e33-441d-9197-719929eb9e21","Type":"ContainerStarted","Data":"81583105305b81fde3bde968fbbd5463ad1100d8f444b22f62764c4676b95325"} Mar 10 21:52:03 crc kubenswrapper[4919]: I0310 21:52:03.930722 4919 generic.go:334] "Generic (PLEG): container finished" podID="2ace27ab-c4c7-412b-9ae8-a3e4ff15faec" containerID="5b01604ec0336bae6aa21d9b124596bb46f6351b67c413f2fade87febd0887a6" exitCode=0 Mar 10 21:52:03 crc kubenswrapper[4919]: I0310 21:52:03.930806 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-z6pc7" event={"ID":"2ace27ab-c4c7-412b-9ae8-a3e4ff15faec","Type":"ContainerDied","Data":"5b01604ec0336bae6aa21d9b124596bb46f6351b67c413f2fade87febd0887a6"} Mar 10 21:52:03 crc kubenswrapper[4919]: I0310 21:52:03.931740 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:52:03 crc kubenswrapper[4919]: I0310 21:52:03.931763 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:52:03 crc kubenswrapper[4919]: I0310 21:52:03.931773 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:52:03 crc kubenswrapper[4919]: I0310 21:52:03.931786 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:52:03 crc kubenswrapper[4919]: I0310 21:52:03.931794 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:52:03Z","lastTransitionTime":"2026-03-10T21:52:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 21:52:03 crc kubenswrapper[4919]: I0310 21:52:03.933013 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"ac8c9a9627b63f7b2a9c80571ca8f781eec442b1fd148631fa417b2e11943437"} Mar 10 21:52:03 crc kubenswrapper[4919]: I0310 21:52:03.933041 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"18a390bbc216535df32a4dab5fb983494134d2e9f87a689ea39d3e32592ec663"} Mar 10 21:52:03 crc kubenswrapper[4919]: I0310 21:52:03.940154 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed0693f5-4dbc-4621-9cf6-450d64aaea59\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f338ebc5fc228d07415015c51f7ed4fcc24d5bf76a644e491b5c4b9dc51b71f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5205e89649c7c1e920c967873bb1a404548b539605a180b02a1c249ff499e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5205e89649c7c1e920c967873bb1a404548b539605a180b02a1c249ff499e32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:50:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:52:03 crc kubenswrapper[4919]: I0310 21:52:03.966898 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5163ae00-7b50-497d-9770-0d787026b436\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://322af039c736bcf0b853ee5527ebb1b1750484dfab074745abcd75c24fdcccbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bfa2696bd9b6e5d247686e5297b6ae2f49e5b216174391f211cb2a3a4966135\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c23aaef6f076ab2a428323d38fac48e0c55ad52c55a46c942bccad06474fd31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a88714c27822fd18dff500c973b9d548414d59c7666de938e3cb0c6b18e277e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67063544d268ca488af7ae401113e6f35bb48688e50f944cfa03360de376611a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc6d32c621a9826f8539d4c1806a83f694eaa8874384bd615c2fa1cee02d2e98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc6d32c621a9826f8539d4c1806a83f694eaa8874384bd615c2fa1cee02d2e98\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T21:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8aa3a4aef61b7421411f28beecceab476e43c52eb51ac2a257399b01e4c2bcb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8aa3a4aef61b7421411f28beecceab476e43c52eb51ac2a257399b01e4c2bcb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2126875e4bacb36f72de8a3a8bd7c799b89e7691a83fd39d7645a97e95db91b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2126875e4bacb36f72de8a3a8bd7c799b89e7691a83fd39d7645a97e95db91b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:50:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:52:03 crc kubenswrapper[4919]: I0310 21:52:03.977492 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9ed1501-15da-4419-aa12-171e610438d6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9e6a8efa1e2d16b45fe6362b326e3f89333864dc74f3b298d2e500a90d303b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37d8507fd02b92972ed41aa2c4d53fceb1c9d58864e46ddc7991f94fb4d9b3e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://dfb03c5f450790952fc7173bc2a6d723c777921f5f74963bfdbc3573ec1d21cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b2adcae0bf01d646b97b915a28921ad4151ca62e8bdd174b42b5e3dff4b27db\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b2adcae0bf01d646b97b915a28921ad4151ca62e8bdd174b42b5e3dff4b27db\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T21:51:14Z\\\",\\\"message\\\":\\\"file observer\\\\nW0310 21:51:14.180982 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 21:51:14.181120 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 21:51:14.182146 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-915015507/tls.crt::/tmp/serving-cert-915015507/tls.key\\\\\\\"\\\\nI0310 21:51:14.490188 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 21:51:14.496972 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 21:51:14.497009 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 21:51:14.497047 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 21:51:14.497058 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 21:51:14.505682 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 21:51:14.505718 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 21:51:14.505729 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 21:51:14.505740 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0310 21:51:14.505737 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 21:51:14.505748 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 21:51:14.505777 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 21:51:14.505783 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 21:51:14.508219 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T21:51:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47a772db349df6c0c6fe27be93d19e02d66cfaf9739ee12e89730ece1da11473\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15b81f8be7635a10cf516cd17c9ab3d48b74d3421098124fcc47dbab6691e3c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15b81f8be7635a10cf516cd17c9ab3d48b74d3421098124fcc47dbab6691e3c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:50:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:52:03 crc kubenswrapper[4919]: I0310 21:52:03.988007 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:52:03 crc kubenswrapper[4919]: I0310 21:52:03.998019 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:52:04 crc kubenswrapper[4919]: I0310 21:52:04.006204 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hzq7c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e92ce303-b70d-4416-b8f1-520b49dca2e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cc1a7ce601001a487303cfae1cef980407a59cc27d02d4f3c4a303b7668639f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qw7c6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hzq7c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:52:04 crc kubenswrapper[4919]: I0310 21:52:04.024475 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4dp67" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2e7c6fb-9e33-441d-9197-719929eb9e21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ca4b1de4425e9aa53f9cf24a56759708297b223771278a445dcf171f4dd6493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ca4b1de4425e9aa53f9cf24a56759708297b223771278a445dcf171f4dd6493\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4dp67\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:52:04 crc kubenswrapper[4919]: I0310 21:52:04.033973 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod 
was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:52:04 crc kubenswrapper[4919]: I0310 21:52:04.034380 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:52:04 crc kubenswrapper[4919]: I0310 21:52:04.034436 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:52:04 crc kubenswrapper[4919]: I0310 21:52:04.034446 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:52:04 crc kubenswrapper[4919]: I0310 21:52:04.034468 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:52:04 crc kubenswrapper[4919]: I0310 21:52:04.034478 4919 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:52:04Z","lastTransitionTime":"2026-03-10T21:52:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 21:52:04 crc kubenswrapper[4919]: I0310 21:52:04.043082 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:52:04 crc kubenswrapper[4919]: I0310 21:52:04.051072 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:52:04 crc kubenswrapper[4919]: I0310 21:52:04.061868 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-z6pc7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ace27ab-c4c7-412b-9ae8-a3e4ff15faec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5636b870a2c5e13abd72ff1e7c785363d11540245c14af6f5de75a0b400a6aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5636b870a2c5e13abd72ff1e7c785363d11540245c14af6f5de75a0b400a6aa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7cd0e8267d82b65679954a7b935e138a06a6634311569a48268ba5917d6cb7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7cd0e8267d82b65679954a7b935e138a06a6634311569a48268ba5917d6cb7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:52:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4b4f2cd55ec546c6f44490159ec5b303d67b584fce7afb4b6fe3dd93e95d843\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4b4f2cd55ec546c6f44490159ec5b303d67b584fce7afb4b6fe3dd93e95d843\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:52:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cebcf235b45b59a23c89f44855f381557a49fc94ec9332c699274507cda4cd12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cebcf235b45b59a23c89f44855f381557a49fc94ec9332c699274507cda4cd12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:52:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b01604ec0336bae6aa21d9b124596bb46f6351b67c413f2fade87febd0887a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b01604ec0336bae6aa21d9b124596bb46f6351b67c413f2fade87febd0887a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:52:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-z6pc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:52:04 crc kubenswrapper[4919]: I0310 21:52:04.069787 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:52:04 crc kubenswrapper[4919]: I0310 21:52:04.077087 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"566678d1-f416-4116-ab20-b30dceb86cdc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://603ee76064368a216672f45eb860628d301968c311e0bc75b9a73c01f351c9c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrhvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b645dc541f9bef5d9710345252c2ff48e91412f
10d1c0c1bfaa06cf9e82210f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrhvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-z7v4t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:52:04 crc kubenswrapper[4919]: I0310 21:52:04.085309 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hbw8v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a5db7c3-2a96-4030-8c88-5d82d325b62d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abe6d0aa7236ecb1ecf10432a82e6fd0b3103606dbf07a21f54a1908c77ef697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dwtj4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hbw8v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:52:04 crc kubenswrapper[4919]: I0310 21:52:04.099304 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:52:04 crc kubenswrapper[4919]: I0310 21:52:04.107747 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:52:04 crc kubenswrapper[4919]: I0310 21:52:04.117906 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-z6pc7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ace27ab-c4c7-412b-9ae8-a3e4ff15faec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5636b870a2c5e13abd72ff1e7c785363d11540245c14af6f5de75a0b400a6aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5636b870a2c5e13abd72ff1e7c785363d11540245c14af6f5de75a0b400a6aa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7cd0e8267d82b65679954a7b935e138a06a6634311569a48268ba5917d6cb7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7cd0e8267d82b65679954a7b935e138a06a6634311569a48268ba5917d6cb7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:52:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4b4f2cd55ec546c6f44490159ec5b303d67b584fce7afb4b6fe3dd93e95d843\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4b4f2cd55ec546c6f44490159ec5b303d67b584fce7afb4b6fe3dd93e95d843\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:52:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cebcf235b45b59a23c89f44855f381557a49fc94ec9332c699274507cda4cd12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cebcf235b45b59a23c89f44855f381557a49fc94ec9332c699274507cda4cd12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:52:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b01604ec0336bae6aa21d9b124596bb46f6351b67c413f2fade87febd0887a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b01604ec0336bae6aa21d9b124596bb46f6351b67c413f2fade87febd0887a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:52:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-z6pc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:52:04 crc kubenswrapper[4919]: I0310 21:52:04.128589 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac8c9a9627b63f7b2a9c80571ca8f781eec442b1fd148631fa417b2e11943437\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\
"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18a390bbc216535df32a4dab5fb983494134d2e9f87a689ea39d3e32592ec663\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 21:52:04 crc kubenswrapper[4919]: I0310 21:52:04.136355 4919 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:52:04 crc kubenswrapper[4919]: I0310 21:52:04.136411 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:52:04 crc kubenswrapper[4919]: I0310 21:52:04.136421 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:52:04 crc kubenswrapper[4919]: I0310 21:52:04.136435 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:52:04 crc kubenswrapper[4919]: I0310 21:52:04.136446 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:52:04Z","lastTransitionTime":"2026-03-10T21:52:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:52:04 crc kubenswrapper[4919]: I0310 21:52:04.138684 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"566678d1-f416-4116-ab20-b30dceb86cdc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://603ee76064368a216672f45eb860628d301968c311e0bc75b9a73c01f351c9c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrhvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b645dc541f9bef5d9710345252c2ff48e91412f10d1c0c1bfaa06cf9e82210f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrhvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-z7v4t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:04Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:04 crc kubenswrapper[4919]: I0310 21:52:04.157645 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hbw8v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a5db7c3-2a96-4030-8c88-5d82d325b62d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abe6d0aa7236ecb1ecf10432a82e6fd0b3103606dbf07a21f54a1908c77ef697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dwtj4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hbw8v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:04Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:04 crc kubenswrapper[4919]: I0310 21:52:04.167914 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed0693f5-4dbc-4621-9cf6-450d64aaea59\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f338ebc5fc228d07415015c51f7ed4fcc24d5bf76a644e491b5c4b9dc51b71f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5205e89649c7c1e920c967873bb1a404548b539605a180b02a1c249ff499e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c427
45f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5205e89649c7c1e920c967873bb1a404548b539605a180b02a1c249ff499e32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:50:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:04Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:04 crc kubenswrapper[4919]: I0310 21:52:04.187910 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5163ae00-7b50-497d-9770-0d787026b436\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://322af039c736bcf0b853ee5527ebb1b1750484dfab074745abcd75c24fdcccbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bfa2696bd9b6e5d247686e5297b6ae2f49e5b216174391f211cb2a3a4966135\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c23aaef6f076ab2a428323d38fac48e0c55ad52c55a46c942bccad06474fd31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a88714c27822fd18dff500c973b9d548414d59c7666de938e3cb0c6b18e277e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67063544d268ca488af7ae401113e6f35bb48688e50f944cfa03360de376611a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc6d32c621a9826f8539d4c1806a83f694eaa8874384bd615c2fa1cee02d2e98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc6d32c621a9826f8539d4c1806a83f694eaa8874384bd615c2fa1cee02d2e98\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T21:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8aa3a4aef61b7421411f28beecceab476e43c52eb51ac2a257399b01e4c2bcb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8aa3a4aef61b7421411f28beecceab476e43c52eb51ac2a257399b01e4c2bcb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2126875e4bacb36f72de8a3a8bd7c799b89e7691a83fd39d7645a97e95db91b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2126875e4bacb36f72de8a3a8bd7c799b89e7691a83fd39d7645a97e95db91b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:50:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:04Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:04 crc kubenswrapper[4919]: I0310 21:52:04.207379 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9ed1501-15da-4419-aa12-171e610438d6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9e6a8efa1e2d16b45fe6362b326e3f89333864dc74f3b298d2e500a90d303b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37d8507fd02b92972ed41aa2c4d53fceb1c9d58864e46ddc7991f94fb4d9b3e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://dfb03c5f450790952fc7173bc2a6d723c777921f5f74963bfdbc3573ec1d21cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b2adcae0bf01d646b97b915a28921ad4151ca62e8bdd174b42b5e3dff4b27db\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b2adcae0bf01d646b97b915a28921ad4151ca62e8bdd174b42b5e3dff4b27db\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T21:51:14Z\\\",\\\"message\\\":\\\"file observer\\\\nW0310 21:51:14.180982 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 21:51:14.181120 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 21:51:14.182146 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-915015507/tls.crt::/tmp/serving-cert-915015507/tls.key\\\\\\\"\\\\nI0310 21:51:14.490188 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 21:51:14.496972 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 21:51:14.497009 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 21:51:14.497047 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 21:51:14.497058 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 21:51:14.505682 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 21:51:14.505718 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 21:51:14.505729 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 21:51:14.505740 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0310 21:51:14.505737 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 21:51:14.505748 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 21:51:14.505777 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 21:51:14.505783 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 21:51:14.508219 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T21:51:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47a772db349df6c0c6fe27be93d19e02d66cfaf9739ee12e89730ece1da11473\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15b81f8be7635a10cf516cd17c9ab3d48b74d3421098124fcc47dbab6691e3c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15b81f8be7635a10cf516cd17c9ab3d48b74d3421098124fcc47dbab6691e3c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:50:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:04Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:04 crc kubenswrapper[4919]: I0310 21:52:04.223426 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:04Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:04 crc kubenswrapper[4919]: I0310 21:52:04.232977 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:04Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:04 crc kubenswrapper[4919]: I0310 21:52:04.239595 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:52:04 crc kubenswrapper[4919]: I0310 21:52:04.239630 4919 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:52:04 crc kubenswrapper[4919]: I0310 21:52:04.239642 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:52:04 crc kubenswrapper[4919]: I0310 21:52:04.239658 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:52:04 crc kubenswrapper[4919]: I0310 21:52:04.239669 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:52:04Z","lastTransitionTime":"2026-03-10T21:52:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 21:52:04 crc kubenswrapper[4919]: I0310 21:52:04.245793 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hzq7c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e92ce303-b70d-4416-b8f1-520b49dca2e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cc1a7ce601001a487303cfae1cef980407a59cc27d02d4f3c4a303b7668639f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qw7c6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hzq7c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:04Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:04 crc kubenswrapper[4919]: I0310 21:52:04.265513 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4dp67" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2e7c6fb-9e33-441d-9197-719929eb9e21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ca4b1de4425e9aa53f9cf24a56759708297b223771278a445dcf171f4dd6493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ca4b1de4425e9aa53f9cf24a56759708297b223771278a445dcf171f4dd6493\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4dp67\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:04Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:04 crc kubenswrapper[4919]: I0310 21:52:04.282163 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:04Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:04 crc kubenswrapper[4919]: I0310 21:52:04.342421 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:52:04 crc kubenswrapper[4919]: I0310 21:52:04.342476 4919 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:52:04 crc kubenswrapper[4919]: I0310 21:52:04.342493 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:52:04 crc kubenswrapper[4919]: I0310 21:52:04.342517 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:52:04 crc kubenswrapper[4919]: I0310 21:52:04.342534 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:52:04Z","lastTransitionTime":"2026-03-10T21:52:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 21:52:04 crc kubenswrapper[4919]: I0310 21:52:04.445054 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:52:04 crc kubenswrapper[4919]: I0310 21:52:04.445100 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:52:04 crc kubenswrapper[4919]: I0310 21:52:04.445116 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:52:04 crc kubenswrapper[4919]: I0310 21:52:04.445140 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:52:04 crc kubenswrapper[4919]: I0310 21:52:04.445157 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:52:04Z","lastTransitionTime":"2026-03-10T21:52:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 21:52:04 crc kubenswrapper[4919]: I0310 21:52:04.479657 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 21:52:04 crc kubenswrapper[4919]: I0310 21:52:04.479698 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 21:52:04 crc kubenswrapper[4919]: E0310 21:52:04.479779 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 21:52:04 crc kubenswrapper[4919]: E0310 21:52:04.479926 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 21:52:04 crc kubenswrapper[4919]: I0310 21:52:04.479657 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 21:52:04 crc kubenswrapper[4919]: E0310 21:52:04.480432 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 21:52:04 crc kubenswrapper[4919]: I0310 21:52:04.547277 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:52:04 crc kubenswrapper[4919]: I0310 21:52:04.547320 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:52:04 crc kubenswrapper[4919]: I0310 21:52:04.547330 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:52:04 crc kubenswrapper[4919]: I0310 21:52:04.547343 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:52:04 crc kubenswrapper[4919]: I0310 21:52:04.547353 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:52:04Z","lastTransitionTime":"2026-03-10T21:52:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:52:04 crc kubenswrapper[4919]: I0310 21:52:04.650121 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:52:04 crc kubenswrapper[4919]: I0310 21:52:04.650162 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:52:04 crc kubenswrapper[4919]: I0310 21:52:04.650173 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:52:04 crc kubenswrapper[4919]: I0310 21:52:04.650193 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:52:04 crc kubenswrapper[4919]: I0310 21:52:04.650205 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:52:04Z","lastTransitionTime":"2026-03-10T21:52:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:52:04 crc kubenswrapper[4919]: I0310 21:52:04.752441 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:52:04 crc kubenswrapper[4919]: I0310 21:52:04.752514 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:52:04 crc kubenswrapper[4919]: I0310 21:52:04.752538 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:52:04 crc kubenswrapper[4919]: I0310 21:52:04.752567 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:52:04 crc kubenswrapper[4919]: I0310 21:52:04.752659 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:52:04Z","lastTransitionTime":"2026-03-10T21:52:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:52:04 crc kubenswrapper[4919]: I0310 21:52:04.855048 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:52:04 crc kubenswrapper[4919]: I0310 21:52:04.855105 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:52:04 crc kubenswrapper[4919]: I0310 21:52:04.855125 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:52:04 crc kubenswrapper[4919]: I0310 21:52:04.855149 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:52:04 crc kubenswrapper[4919]: I0310 21:52:04.855166 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:52:04Z","lastTransitionTime":"2026-03-10T21:52:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:52:04 crc kubenswrapper[4919]: I0310 21:52:04.942525 4919 generic.go:334] "Generic (PLEG): container finished" podID="2ace27ab-c4c7-412b-9ae8-a3e4ff15faec" containerID="f3c386dab63ce40e10fc9e5617d1e0c1e87eee2a1148cc64d5faa74542014155" exitCode=0 Mar 10 21:52:04 crc kubenswrapper[4919]: I0310 21:52:04.942651 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-z6pc7" event={"ID":"2ace27ab-c4c7-412b-9ae8-a3e4ff15faec","Type":"ContainerDied","Data":"f3c386dab63ce40e10fc9e5617d1e0c1e87eee2a1148cc64d5faa74542014155"} Mar 10 21:52:04 crc kubenswrapper[4919]: I0310 21:52:04.958871 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:52:04 crc kubenswrapper[4919]: I0310 21:52:04.958916 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:52:04 crc kubenswrapper[4919]: I0310 21:52:04.958935 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:52:04 crc kubenswrapper[4919]: I0310 21:52:04.958960 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:52:04 crc kubenswrapper[4919]: I0310 21:52:04.958977 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:52:04Z","lastTransitionTime":"2026-03-10T21:52:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:52:04 crc kubenswrapper[4919]: I0310 21:52:04.961046 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed0693f5-4dbc-4621-9cf6-450d64aaea59\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f338ebc5fc228d07415015c51f7ed4fcc24d5bf76a644e491b5c4b9dc51b71f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11
\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5205e89649c7c1e920c967873bb1a404548b539605a180b02a1c249ff499e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5205e89649c7c1e920c967873bb1a404548b539605a180b02a1c249ff499e32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:50:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:04Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:04 crc kubenswrapper[4919]: I0310 21:52:04.997167 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5163ae00-7b50-497d-9770-0d787026b436\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://322af039c736bcf0b853ee5527ebb1b1750484dfab074745abcd75c24fdcccbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bfa2696bd9b6e5d247686e5297b6ae2f49e5b216174391f211cb2a3a4966135\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c23aaef6f076ab2a428323d38fac48e0c55ad52c55a46c942bccad06474fd31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a88714c27822fd18dff500c973b9d548414d59c7666de938e3cb0c6b18e277e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67063544d268ca488af7ae401113e6f35bb48688e50f944cfa03360de376611a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc6d32c621a9826f8539d4c1806a83f694eaa8874384bd615c2fa1cee02d2e98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc6d32c621a9826f8539d4c1806a83f694eaa8874384bd615c2fa1cee02d2e98\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T21:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8aa3a4aef61b7421411f28beecceab476e43c52eb51ac2a257399b01e4c2bcb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8aa3a4aef61b7421411f28beecceab476e43c52eb51ac2a257399b01e4c2bcb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2126875e4bacb36f72de8a3a8bd7c799b89e7691a83fd39d7645a97e95db91b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2126875e4bacb36f72de8a3a8bd7c799b89e7691a83fd39d7645a97e95db91b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:50:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:04Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:05 crc kubenswrapper[4919]: I0310 21:52:05.004119 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-b625p"] Mar 10 21:52:05 crc kubenswrapper[4919]: I0310 21:52:05.006233 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-b625p" Mar 10 21:52:05 crc kubenswrapper[4919]: I0310 21:52:05.008675 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 10 21:52:05 crc kubenswrapper[4919]: I0310 21:52:05.009073 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 10 21:52:05 crc kubenswrapper[4919]: I0310 21:52:05.010587 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 10 21:52:05 crc kubenswrapper[4919]: I0310 21:52:05.011200 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 10 21:52:05 crc kubenswrapper[4919]: I0310 21:52:05.017853 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9ed1501-15da-4419-aa12-171e610438d6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9e6a8efa1e2d16b45fe6362b326e3f89333864dc74f3b298d2e500a90d303b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37d8507fd02b92972ed41aa2c4d53fceb1c9d58864e46ddc7991f94fb4d9b3e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfb03c5f450790952fc7173bc2a6d723c777921f5f74963bfdbc3573ec1d21cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b2adcae0bf01d646b97b915a28921ad4151ca62e8bdd174b42b5e3dff4b27db\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b2adcae0bf01d646b97b915a28921ad4151ca62e8bdd174b42b5e3dff4b27db\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T21:51:14Z\\\",\\\"message\\\":\\\"file observer\\\\nW0310 21:51:14.180982 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 21:51:14.181120 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 21:51:14.182146 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-915015507/tls.crt::/tmp/serving-cert-915015507/tls.key\\\\\\\"\\\\nI0310 21:51:14.490188 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 21:51:14.496972 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 21:51:14.497009 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 21:51:14.497047 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 21:51:14.497058 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 21:51:14.505682 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 21:51:14.505718 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 21:51:14.505729 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 21:51:14.505740 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0310 21:51:14.505737 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 
21:51:14.505748 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 21:51:14.505777 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 21:51:14.505783 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 21:51:14.508219 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T21:51:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47a772db349df6c0c6fe27be93d19e02d66cfaf9739ee12e89730ece1da11473\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15b81f8be7635a10cf516cd17c9ab3d48b74d3421098124fcc47dbab6691e3c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15b81f8be7635a10cf516cd17c9ab3d48b74d3421098124fcc47dbab6691e3c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:50:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:05Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:05 crc kubenswrapper[4919]: I0310 21:52:05.036638 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:05Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:05 crc kubenswrapper[4919]: I0310 21:52:05.049232 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:05Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:05 crc kubenswrapper[4919]: I0310 21:52:05.058730 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/b82448f1-4387-4d1a-a300-29f4b3d86bbf-serviceca\") pod \"node-ca-b625p\" (UID: 
\"b82448f1-4387-4d1a-a300-29f4b3d86bbf\") " pod="openshift-image-registry/node-ca-b625p" Mar 10 21:52:05 crc kubenswrapper[4919]: I0310 21:52:05.058821 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b82448f1-4387-4d1a-a300-29f4b3d86bbf-host\") pod \"node-ca-b625p\" (UID: \"b82448f1-4387-4d1a-a300-29f4b3d86bbf\") " pod="openshift-image-registry/node-ca-b625p" Mar 10 21:52:05 crc kubenswrapper[4919]: I0310 21:52:05.058893 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9q9t\" (UniqueName: \"kubernetes.io/projected/b82448f1-4387-4d1a-a300-29f4b3d86bbf-kube-api-access-s9q9t\") pod \"node-ca-b625p\" (UID: \"b82448f1-4387-4d1a-a300-29f4b3d86bbf\") " pod="openshift-image-registry/node-ca-b625p" Mar 10 21:52:05 crc kubenswrapper[4919]: I0310 21:52:05.062760 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:52:05 crc kubenswrapper[4919]: I0310 21:52:05.062815 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:52:05 crc kubenswrapper[4919]: I0310 21:52:05.062830 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:52:05 crc kubenswrapper[4919]: I0310 21:52:05.062853 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:52:05 crc kubenswrapper[4919]: I0310 21:52:05.062868 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:52:05Z","lastTransitionTime":"2026-03-10T21:52:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 21:52:05 crc kubenswrapper[4919]: I0310 21:52:05.064929 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hzq7c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e92ce303-b70d-4416-b8f1-520b49dca2e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cc1a7ce601001a487303cfae1cef980407a59cc27d02d4f3c4a303b7668639f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-qw7c6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hzq7c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:05Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:05 crc kubenswrapper[4919]: I0310 21:52:05.096752 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4dp67" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2e7c6fb-9e33-441d-9197-719929eb9e21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ca4b1de4425e9aa53f9cf24a56759708297b223771278a445dcf171f4dd6493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://9ca4b1de4425e9aa53f9cf24a56759708297b223771278a445dcf171f4dd6493\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4dp67\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:05Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:05 crc kubenswrapper[4919]: I0310 21:52:05.111699 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:05Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:05 crc kubenswrapper[4919]: I0310 21:52:05.125405 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:05Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:05 crc kubenswrapper[4919]: I0310 21:52:05.139601 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:05Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:05 crc kubenswrapper[4919]: I0310 21:52:05.159257 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/b82448f1-4387-4d1a-a300-29f4b3d86bbf-serviceca\") pod \"node-ca-b625p\" (UID: \"b82448f1-4387-4d1a-a300-29f4b3d86bbf\") " pod="openshift-image-registry/node-ca-b625p" Mar 10 21:52:05 crc kubenswrapper[4919]: I0310 21:52:05.159349 4919 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b82448f1-4387-4d1a-a300-29f4b3d86bbf-host\") pod \"node-ca-b625p\" (UID: \"b82448f1-4387-4d1a-a300-29f4b3d86bbf\") " pod="openshift-image-registry/node-ca-b625p" Mar 10 21:52:05 crc kubenswrapper[4919]: I0310 21:52:05.159522 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s9q9t\" (UniqueName: \"kubernetes.io/projected/b82448f1-4387-4d1a-a300-29f4b3d86bbf-kube-api-access-s9q9t\") pod \"node-ca-b625p\" (UID: \"b82448f1-4387-4d1a-a300-29f4b3d86bbf\") " pod="openshift-image-registry/node-ca-b625p" Mar 10 21:52:05 crc kubenswrapper[4919]: I0310 21:52:05.159688 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b82448f1-4387-4d1a-a300-29f4b3d86bbf-host\") pod \"node-ca-b625p\" (UID: \"b82448f1-4387-4d1a-a300-29f4b3d86bbf\") " pod="openshift-image-registry/node-ca-b625p" Mar 10 21:52:05 crc kubenswrapper[4919]: I0310 21:52:05.160862 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/b82448f1-4387-4d1a-a300-29f4b3d86bbf-serviceca\") pod \"node-ca-b625p\" (UID: \"b82448f1-4387-4d1a-a300-29f4b3d86bbf\") " pod="openshift-image-registry/node-ca-b625p" Mar 10 21:52:05 crc kubenswrapper[4919]: I0310 21:52:05.162735 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-z6pc7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ace27ab-c4c7-412b-9ae8-a3e4ff15faec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5636b870a2c5e13abd72ff1e7c785363d11540245c14af6f5de75a0b400a6aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5636b870a2c5e13abd72ff1e7c785363d11540245c14af6f5de75a0b400a6aa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7cd0e8267d82b65679954a7b935e138a06a6634311569a48268ba5917d6cb7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7cd0e8267d82b65679954a7b935e138a06a6634311569a48268ba5917d6cb7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:52:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4b4f2cd55ec546c6f44490159ec5b303d67b584fce7afb4b6fe3dd93e95d843\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4b4f2cd55ec546c6f44490159ec5b303d67b584fce7afb4b6fe3dd93e95d843\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:52:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cebcf235b45b59a23c89f44855f381557a49fc94ec9332c699274507cda4cd12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cebcf235b45b59a23c89f44855f381557a49fc94ec9332c699274507cda4cd12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:52:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b01604ec0336bae6aa21d9b124596bb46f6351b67c413f2fade87febd0887a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b01604ec0336bae6aa21d9b124596bb46f6351b67c413f2fade87febd0887a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:52:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3c386dab63ce40e10fc9e5617d1e0c1e87eee2a1148cc64d5faa74542014155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3c386dab63ce40e10fc9e5617d1e0c1e87eee2a1148cc64d5faa74542014155\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:52:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-z6pc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:05Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:05 crc kubenswrapper[4919]: I0310 21:52:05.164816 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:52:05 crc kubenswrapper[4919]: I0310 21:52:05.164854 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:52:05 crc kubenswrapper[4919]: I0310 21:52:05.164864 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:52:05 crc kubenswrapper[4919]: I0310 21:52:05.164882 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:52:05 crc kubenswrapper[4919]: I0310 21:52:05.164890 4919 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:52:05Z","lastTransitionTime":"2026-03-10T21:52:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 21:52:05 crc kubenswrapper[4919]: I0310 21:52:05.178261 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac8c9a9627b63f7b2a9c80571ca8f781eec442b1fd148631fa417b2e11943437\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrid
es\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18a390bbc216535df32a4dab5fb983494134d2e9f87a689ea39d3e32592ec663\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:05Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:05 crc kubenswrapper[4919]: I0310 21:52:05.181212 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9q9t\" (UniqueName: \"kubernetes.io/projected/b82448f1-4387-4d1a-a300-29f4b3d86bbf-kube-api-access-s9q9t\") pod 
\"node-ca-b625p\" (UID: \"b82448f1-4387-4d1a-a300-29f4b3d86bbf\") " pod="openshift-image-registry/node-ca-b625p" Mar 10 21:52:05 crc kubenswrapper[4919]: I0310 21:52:05.196435 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"566678d1-f416-4116-ab20-b30dceb86cdc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://603ee76064368a216672f45eb860628d301968c311e0bc75b9a73c01f351c9c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\
\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrhvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b645dc541f9bef5d9710345252c2ff48e91412f10d1c0c1bfaa06cf9e82210f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrhvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-z7v4t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:05Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:05 crc kubenswrapper[4919]: I0310 21:52:05.211737 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hbw8v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a5db7c3-2a96-4030-8c88-5d82d325b62d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abe6d0aa7236ecb1ecf10432a82e6fd0b3103606dbf07a21f54a1908c77ef697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dwtj4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hbw8v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:05Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:05 crc kubenswrapper[4919]: I0310 21:52:05.227704 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac8c9a9627b63f7b2a9c80571ca8f781eec442b1fd148631fa417b2e11943437\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18a390bbc216535df32a4dab5fb983494134d2e9f87a689ea39d3e32592ec663\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1
d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:05Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:05 crc kubenswrapper[4919]: I0310 21:52:05.243687 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"566678d1-f416-4116-ab20-b30dceb86cdc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://603ee76064368a216672f45eb860628d301968c311e0bc75b9a73c01f351c9c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrhvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b645dc541f9bef5d9710345252c2ff48e91412f
10d1c0c1bfaa06cf9e82210f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrhvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-z7v4t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:05Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:05 crc kubenswrapper[4919]: I0310 21:52:05.257228 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hbw8v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a5db7c3-2a96-4030-8c88-5d82d325b62d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abe6d0aa7236ecb1ecf10432a82e6fd0b3103606dbf07a21f54a1908c77ef697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dwtj4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hbw8v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:05Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:05 crc kubenswrapper[4919]: I0310 21:52:05.267288 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:52:05 crc 
kubenswrapper[4919]: I0310 21:52:05.267333 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:52:05 crc kubenswrapper[4919]: I0310 21:52:05.267375 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:52:05 crc kubenswrapper[4919]: I0310 21:52:05.267409 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:52:05 crc kubenswrapper[4919]: I0310 21:52:05.267421 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:52:05Z","lastTransitionTime":"2026-03-10T21:52:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 21:52:05 crc kubenswrapper[4919]: I0310 21:52:05.269688 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed0693f5-4dbc-4621-9cf6-450d64aaea59\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f338ebc5fc228d07415015c51f7ed4fcc24d5bf76a644e491b5c4b9dc51b71f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5205e89649c7c1e920c967873bb1a404548b539605a180b02a1c249ff499e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5205e89649c7c1e920c967873bb1a404548b539605a180b02a1c249ff499e32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:50:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:05Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:05 crc kubenswrapper[4919]: I0310 21:52:05.290651 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5163ae00-7b50-497d-9770-0d787026b436\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://322af039c736bcf0b853ee5527ebb1b1750484dfab074745abcd75c24fdcccbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bfa2696bd9b6e5d247686e5297b6ae2f49e5b216174391f211cb2a3a4966135\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c23aaef6f076ab2a428323d38fac48e0c55ad52c55a46c942bccad06474fd31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a88714c27822fd18dff500c973b9d548414d59c7666de938e3cb0c6b18e277e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67063544d268ca488af7ae401113e6f35bb48688e50f944cfa03360de376611a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc6d32c621a9826f8539d4c1806a83f694eaa8874384bd615c2fa1cee02d2e98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc6d32c621a9826f8539d4c1806a83f694eaa8874384bd615c2fa1cee02d2e98\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T21:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8aa3a4aef61b7421411f28beecceab476e43c52eb51ac2a257399b01e4c2bcb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8aa3a4aef61b7421411f28beecceab476e43c52eb51ac2a257399b01e4c2bcb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2126875e4bacb36f72de8a3a8bd7c799b89e7691a83fd39d7645a97e95db91b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2126875e4bacb36f72de8a3a8bd7c799b89e7691a83fd39d7645a97e95db91b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:50:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:05Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:05 crc kubenswrapper[4919]: I0310 21:52:05.307731 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9ed1501-15da-4419-aa12-171e610438d6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9e6a8efa1e2d16b45fe6362b326e3f89333864dc74f3b298d2e500a90d303b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37d8507fd02b92972ed41aa2c4d53fceb1c9d58864e46ddc7991f94fb4d9b3e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://dfb03c5f450790952fc7173bc2a6d723c777921f5f74963bfdbc3573ec1d21cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b2adcae0bf01d646b97b915a28921ad4151ca62e8bdd174b42b5e3dff4b27db\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b2adcae0bf01d646b97b915a28921ad4151ca62e8bdd174b42b5e3dff4b27db\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T21:51:14Z\\\",\\\"message\\\":\\\"file observer\\\\nW0310 21:51:14.180982 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 21:51:14.181120 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 21:51:14.182146 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-915015507/tls.crt::/tmp/serving-cert-915015507/tls.key\\\\\\\"\\\\nI0310 21:51:14.490188 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 21:51:14.496972 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 21:51:14.497009 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 21:51:14.497047 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 21:51:14.497058 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 21:51:14.505682 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 21:51:14.505718 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 21:51:14.505729 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 21:51:14.505740 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0310 21:51:14.505737 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 21:51:14.505748 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 21:51:14.505777 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 21:51:14.505783 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 21:51:14.508219 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T21:51:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47a772db349df6c0c6fe27be93d19e02d66cfaf9739ee12e89730ece1da11473\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15b81f8be7635a10cf516cd17c9ab3d48b74d3421098124fcc47dbab6691e3c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15b81f8be7635a10cf516cd17c9ab3d48b74d3421098124fcc47dbab6691e3c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:50:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:05Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:05 crc kubenswrapper[4919]: I0310 21:52:05.324612 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:05Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:05 crc kubenswrapper[4919]: I0310 21:52:05.338765 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-b625p" Mar 10 21:52:05 crc kubenswrapper[4919]: I0310 21:52:05.339655 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:05Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:05 crc kubenswrapper[4919]: I0310 21:52:05.359710 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hzq7c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e92ce303-b70d-4416-b8f1-520b49dca2e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cc1a7ce601001a487303cfae1cef980407a59cc27d02d4f3c4a303b7668639f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qw7c6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hzq7c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:05Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:05 crc kubenswrapper[4919]: W0310 21:52:05.363174 4919 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb82448f1_4387_4d1a_a300_29f4b3d86bbf.slice/crio-df200a861e83434addd030db1f2ed47e86e1bcae4a81745babe6c7817f0b5e4b WatchSource:0}: Error finding container df200a861e83434addd030db1f2ed47e86e1bcae4a81745babe6c7817f0b5e4b: Status 404 returned error can't find the container with id df200a861e83434addd030db1f2ed47e86e1bcae4a81745babe6c7817f0b5e4b Mar 10 21:52:05 crc kubenswrapper[4919]: I0310 21:52:05.369863 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:52:05 crc kubenswrapper[4919]: I0310 21:52:05.369907 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:52:05 crc kubenswrapper[4919]: I0310 21:52:05.369924 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:52:05 crc kubenswrapper[4919]: I0310 21:52:05.369946 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:52:05 crc kubenswrapper[4919]: I0310 21:52:05.369963 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:52:05Z","lastTransitionTime":"2026-03-10T21:52:05Z","reason":"KubeletNotReady","message":"container runtime network not 
ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 21:52:05 crc kubenswrapper[4919]: I0310 21:52:05.392463 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4dp67" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2e7c6fb-9e33-441d-9197-719929eb9e21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ca4b1de4425e9aa53f9cf24a56759708297b223771278a445dcf171f4dd6493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ca4b1de4425e9aa53f9cf24a56759708297b223771278a445dcf171f4dd6493\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4dp67\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:05Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:05 crc kubenswrapper[4919]: I0310 21:52:05.412290 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:05Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:05 crc kubenswrapper[4919]: I0310 21:52:05.428678 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b625p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b82448f1-4387-4d1a-a300-29f4b3d86bbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:05Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s9q9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:52:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b625p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:05Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:05 crc kubenswrapper[4919]: I0310 21:52:05.446492 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:05Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:05 crc kubenswrapper[4919]: I0310 21:52:05.463622 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:05Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:05 crc kubenswrapper[4919]: I0310 21:52:05.474128 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:52:05 crc kubenswrapper[4919]: I0310 21:52:05.474370 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:52:05 crc kubenswrapper[4919]: I0310 21:52:05.474386 4919 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:52:05 crc kubenswrapper[4919]: I0310 21:52:05.474450 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:52:05 crc kubenswrapper[4919]: I0310 21:52:05.474471 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:52:05Z","lastTransitionTime":"2026-03-10T21:52:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 21:52:05 crc kubenswrapper[4919]: I0310 21:52:05.489363 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-z6pc7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ace27ab-c4c7-412b-9ae8-a3e4ff15faec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5636b870a2c5e13abd72ff1e7c785363d11540245c14af6f5de75a0b400a6aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5636b870a2c5e13abd72ff1e7c785363d11540245c14af6f5de75a0b400a6aa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7cd0e8267d82b65679954a7b935e138a06a6634311569a48268ba5917d6cb7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7cd0e8267d82b65679954a7b935e138a06a6634311569a48268ba5917d6cb7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:52:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4b4f2cd55ec546c6f44490159ec5b303d67b
584fce7afb4b6fe3dd93e95d843\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4b4f2cd55ec546c6f44490159ec5b303d67b584fce7afb4b6fe3dd93e95d843\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:52:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cebcf235b45b59a23c89f44855f381557a49fc94ec9332c699274507cda4cd12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cebcf235b45b59a23c89f44855f381557a49fc94ec9332c699274507cda4cd12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:52:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-0
3-10T21:52:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b01604ec0336bae6aa21d9b124596bb46f6351b67c413f2fade87febd0887a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b01604ec0336bae6aa21d9b124596bb46f6351b67c413f2fade87febd0887a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:52:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3c386dab63ce40e10fc9e5617d1e0c1e87eee2a1148cc64d5faa74542014155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3c386dab63ce40e10fc9e5617d1e0c1e87eee2a1148cc64d5faa74542014155\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:52:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-z6pc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:05Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:05 crc kubenswrapper[4919]: I0310 21:52:05.577616 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:52:05 crc kubenswrapper[4919]: I0310 21:52:05.577684 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:52:05 crc kubenswrapper[4919]: I0310 21:52:05.577708 4919 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Mar 10 21:52:05 crc kubenswrapper[4919]: I0310 21:52:05.577737 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:52:05 crc kubenswrapper[4919]: I0310 21:52:05.577757 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:52:05Z","lastTransitionTime":"2026-03-10T21:52:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 21:52:05 crc kubenswrapper[4919]: I0310 21:52:05.680406 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:52:05 crc kubenswrapper[4919]: I0310 21:52:05.680459 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:52:05 crc kubenswrapper[4919]: I0310 21:52:05.680472 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:52:05 crc kubenswrapper[4919]: I0310 21:52:05.680490 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:52:05 crc kubenswrapper[4919]: I0310 21:52:05.680502 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:52:05Z","lastTransitionTime":"2026-03-10T21:52:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:52:05 crc kubenswrapper[4919]: I0310 21:52:05.782859 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:52:05 crc kubenswrapper[4919]: I0310 21:52:05.782901 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:52:05 crc kubenswrapper[4919]: I0310 21:52:05.782934 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:52:05 crc kubenswrapper[4919]: I0310 21:52:05.782959 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:52:05 crc kubenswrapper[4919]: I0310 21:52:05.782973 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:52:05Z","lastTransitionTime":"2026-03-10T21:52:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:52:05 crc kubenswrapper[4919]: I0310 21:52:05.884971 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:52:05 crc kubenswrapper[4919]: I0310 21:52:05.885017 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:52:05 crc kubenswrapper[4919]: I0310 21:52:05.885035 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:52:05 crc kubenswrapper[4919]: I0310 21:52:05.885057 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:52:05 crc kubenswrapper[4919]: I0310 21:52:05.885076 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:52:05Z","lastTransitionTime":"2026-03-10T21:52:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:52:05 crc kubenswrapper[4919]: I0310 21:52:05.949016 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-b625p" event={"ID":"b82448f1-4387-4d1a-a300-29f4b3d86bbf","Type":"ContainerStarted","Data":"c5edef5b10597e404ec5599983d07529ee344e7d7f5b1a4a7f589678613034b8"} Mar 10 21:52:05 crc kubenswrapper[4919]: I0310 21:52:05.949066 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-b625p" event={"ID":"b82448f1-4387-4d1a-a300-29f4b3d86bbf","Type":"ContainerStarted","Data":"df200a861e83434addd030db1f2ed47e86e1bcae4a81745babe6c7817f0b5e4b"} Mar 10 21:52:05 crc kubenswrapper[4919]: I0310 21:52:05.956324 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-z6pc7" event={"ID":"2ace27ab-c4c7-412b-9ae8-a3e4ff15faec","Type":"ContainerStarted","Data":"607c25e23101a157124cb81f984fac6d36e71a08b7d990e1d11627f3a7de24b1"} Mar 10 21:52:05 crc kubenswrapper[4919]: I0310 21:52:05.961809 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4dp67" event={"ID":"a2e7c6fb-9e33-441d-9197-719929eb9e21","Type":"ContainerStarted","Data":"6ba4289120330c49ae9af7da81d7a041beb11672e2f1bbe9693a69c0e060b437"} Mar 10 21:52:05 crc kubenswrapper[4919]: I0310 21:52:05.962594 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-4dp67" Mar 10 21:52:05 crc kubenswrapper[4919]: I0310 21:52:05.962654 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-4dp67" Mar 10 21:52:05 crc kubenswrapper[4919]: I0310 21:52:05.962680 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-4dp67" Mar 10 21:52:05 crc kubenswrapper[4919]: I0310 21:52:05.975471 4919 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:05Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:05 crc kubenswrapper[4919]: I0310 21:52:05.987883 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:52:05 crc kubenswrapper[4919]: I0310 21:52:05.987953 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:52:05 crc kubenswrapper[4919]: I0310 21:52:05.987974 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:52:05 crc kubenswrapper[4919]: I0310 21:52:05.988005 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:52:05 crc kubenswrapper[4919]: I0310 21:52:05.988027 4919 setters.go:603] 
"Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:52:05Z","lastTransitionTime":"2026-03-10T21:52:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 21:52:06 crc kubenswrapper[4919]: I0310 21:52:06.025320 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b625p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b82448f1-4387-4d1a-a300-29f4b3d86bbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5edef5b10597e404ec5599983d07529ee344e7d7f5b1a4a7f589678613034b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s9q9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:52:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b625p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:06Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:06 crc kubenswrapper[4919]: I0310 21:52:06.029224 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-4dp67" Mar 10 21:52:06 crc kubenswrapper[4919]: I0310 21:52:06.032573 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-4dp67" Mar 10 21:52:06 crc kubenswrapper[4919]: I0310 21:52:06.040247 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:06Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:06 crc kubenswrapper[4919]: I0310 21:52:06.054863 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:06Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:06 crc kubenswrapper[4919]: I0310 21:52:06.072246 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-z6pc7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ace27ab-c4c7-412b-9ae8-a3e4ff15faec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5636b870a2c5e13abd72ff1e7c785363d11540245c14af6f5de75a0b400a6aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5636b870a2c5e13abd72ff1e7c785363d11540245c14af6f5de75a0b400a6aa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7cd0e8267d82b65679954a7b935e138a06a6634311569a48268ba5917d6cb7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7cd0e8267d82b65679954a7b935e138a06a6634311569a48268ba5917d6cb7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:52:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4b4f2cd55ec546c6f44490159ec5b303d67b584fce7afb4b6fe3dd93e95d843\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4b4f2cd55ec546c6f44490159ec5b303d67b584fce7afb4b6fe3dd93e95d843\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:52:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cebcf235b45b59a23c89f44855f381557a49fc94ec9332c699274507cda4cd12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cebcf235b45b59a23c89f44855f381557a49fc94ec9332c699274507cda4cd12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:52:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b01604ec0336bae6aa21d9b124596bb46f6351b67c413f2fade87febd0887a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b01604ec0336bae6aa21d9b124596bb46f6351b67c413f2fade87febd0887a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:52:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3c386dab63ce40e10fc9e5617d1e0c1e87eee2a1148cc64d5faa74542014155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3c386dab63ce40e10fc9e5617d1e0c1e87eee2a1148cc64d5faa74542014155\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:52:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-z6pc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:06Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:06 crc kubenswrapper[4919]: I0310 21:52:06.091323 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac8c9a9627b63f7b2a9c80571ca8f781eec442b1fd148631fa417b2e11943437\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18a390bbc216535df32a4dab5fb983494134d2e9f87a689ea39d3e32592ec663\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:06Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:06 crc kubenswrapper[4919]: I0310 21:52:06.091469 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:52:06 crc kubenswrapper[4919]: I0310 21:52:06.091542 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:52:06 crc kubenswrapper[4919]: I0310 21:52:06.091554 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:52:06 crc kubenswrapper[4919]: I0310 21:52:06.091596 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:52:06 crc kubenswrapper[4919]: I0310 21:52:06.091610 4919 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:52:06Z","lastTransitionTime":"2026-03-10T21:52:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 21:52:06 crc kubenswrapper[4919]: I0310 21:52:06.105275 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"566678d1-f416-4116-ab20-b30dceb86cdc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://603ee76064368a216672f45eb860628d301968c311e0bc75b9a73c01f351c9c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrhvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b645dc541f9bef5d9710345252c2ff48e91412f10d1c0c1bfaa06cf9e82210f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrhvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-z7v4t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-03-10T21:52:06Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:06 crc kubenswrapper[4919]: I0310 21:52:06.117047 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hbw8v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a5db7c3-2a96-4030-8c88-5d82d325b62d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abe6d0aa7236ecb1ecf10432a82e6fd0b3103606dbf07a21f54a1908c77ef697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/h
ost/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dwtj4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hbw8v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-03-10T21:52:06Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:06 crc kubenswrapper[4919]: I0310 21:52:06.145061 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4dp67" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2e7c6fb-9e33-441d-9197-719929eb9e21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ca4b1de4425e9aa53f9cf24a56759708297b223771278a445dcf171f4dd6493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ca4b1de4425e9aa53f9cf24a56759708297b223771278a445dcf171f4dd6493\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4dp67\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:06Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:06 crc kubenswrapper[4919]: I0310 21:52:06.157945 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed0693f5-4dbc-4621-9cf6-450d64aaea59\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f338ebc5fc228d07415015c51f7ed4fcc24d5bf76a644e491b5c4b9dc51b71f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bde
af3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5205e89649c7c1e920c967873bb1a404548b539605a180b02a1c249ff499e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5205e89649c7c1e920c967873bb1a404548b539605a180b02a1c249ff499e32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:50:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:06Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:06 crc kubenswrapper[4919]: I0310 21:52:06.185733 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5163ae00-7b50-497d-9770-0d787026b436\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://322af039c736bcf0b853ee5527ebb1b1750484dfab074745abcd75c24fdcccbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\
"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bfa2696bd9b6e5d247686e5297b6ae2f49e5b216174391f211cb2a3a4966135\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c23aaef6f076ab2a428323d38fac48e0c55ad52c55a46c942bccad06474fd31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a88714c27822fd18dff500c973b9d548414d59c7666de938e3cb0c6b18e277e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847
b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67063544d268ca488af7ae401113e6f35bb48688e50f944cfa03360de376611a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc6d32c621a9826f8539d4c1806a83f694eaa8874384bd615c2fa1cee02d2e98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e
9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc6d32c621a9826f8539d4c1806a83f694eaa8874384bd615c2fa1cee02d2e98\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8aa3a4aef61b7421411f28beecceab476e43c52eb51ac2a257399b01e4c2bcb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8aa3a4aef61b7421411f28beecceab476e43c52eb51ac2a257399b01e4c2bcb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2126875e4bacb36f72de8a3a8bd7c799b89e7691a83fd39d7645a97e95db91b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2126875e4bacb36f72de8a3a8bd7c799b89e7691a83fd39d7645a97e95db91b6\\\",\\\"exitCode\\\":0,\\
\"finishedAt\\\":\\\"2026-03-10T21:50:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:50:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:06Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:06 crc kubenswrapper[4919]: I0310 21:52:06.194528 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:52:06 crc kubenswrapper[4919]: I0310 21:52:06.194588 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:52:06 crc kubenswrapper[4919]: I0310 21:52:06.194607 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:52:06 crc kubenswrapper[4919]: I0310 21:52:06.194629 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:52:06 crc kubenswrapper[4919]: I0310 21:52:06.194645 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:52:06Z","lastTransitionTime":"2026-03-10T21:52:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 21:52:06 crc kubenswrapper[4919]: I0310 21:52:06.202779 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9ed1501-15da-4419-aa12-171e610438d6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9e6a8efa1e2d16b45fe6362b326e3f89333864dc74f3b298d2e500a90d303b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37d8507fd02b92972ed41aa2c4d53fceb1c9d58864e46ddc7991f94fb4d9b3e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://dfb03c5f450790952fc7173bc2a6d723c777921f5f74963bfdbc3573ec1d21cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b2adcae0bf01d646b97b915a28921ad4151ca62e8bdd174b42b5e3dff4b27db\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b2adcae0bf01d646b97b915a28921ad4151ca62e8bdd174b42b5e3dff4b27db\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T21:51:14Z\\\",\\\"message\\\":\\\"file observer\\\\nW0310 21:51:14.180982 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 21:51:14.181120 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 21:51:14.182146 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-915015507/tls.crt::/tmp/serving-cert-915015507/tls.key\\\\\\\"\\\\nI0310 21:51:14.490188 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 21:51:14.496972 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 21:51:14.497009 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 21:51:14.497047 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 21:51:14.497058 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 21:51:14.505682 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 21:51:14.505718 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 21:51:14.505729 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 21:51:14.505740 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0310 21:51:14.505737 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 21:51:14.505748 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 21:51:14.505777 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 21:51:14.505783 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 21:51:14.508219 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T21:51:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47a772db349df6c0c6fe27be93d19e02d66cfaf9739ee12e89730ece1da11473\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15b81f8be7635a10cf516cd17c9ab3d48b74d3421098124fcc47dbab6691e3c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15b81f8be7635a10cf516cd17c9ab3d48b74d3421098124fcc47dbab6691e3c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:50:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:06Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:06 crc kubenswrapper[4919]: I0310 21:52:06.216035 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:06Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:06 crc kubenswrapper[4919]: I0310 21:52:06.227737 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:06Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:06 crc kubenswrapper[4919]: I0310 21:52:06.240494 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hzq7c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e92ce303-b70d-4416-b8f1-520b49dca2e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cc1a7ce601001a487303cfae1cef980407a59cc27d02d4f3c4a303b7668639f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qw7c6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hzq7c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:06Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:06 crc kubenswrapper[4919]: I0310 21:52:06.252543 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed0693f5-4dbc-4621-9cf6-450d64aaea59\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f338ebc5fc228d07415015c51f7ed4fcc24d5bf76a644e491b5c4b9dc51b71f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac
-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5205e89649c7c1e920c967873bb1a404548b539605a180b02a1c249ff499e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5205e89649c7c1e920c967873bb1a404548b539605a180b02a1c249ff499e32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:50:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:06Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:06 crc kubenswrapper[4919]: I0310 21:52:06.273605 4919 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5163ae00-7b50-497d-9770-0d787026b436\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://322af039c736bcf0b853ee5527ebb1b1750484dfab074745abcd75c24fdcccbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bfa2696bd9b6e5d247686e5297b6ae2f49
e5b216174391f211cb2a3a4966135\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c23aaef6f076ab2a428323d38fac48e0c55ad52c55a46c942bccad06474fd31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a88714c27822fd18dff500c973b9d548414d59c7666de938e3cb0c6b18e277e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"e
tcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67063544d268ca488af7ae401113e6f35bb48688e50f944cfa03360de376611a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc6d32c621a9826f8539d4c1806a83f694eaa8874384bd615c2fa1cee02d2e98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc6d32c621a
9826f8539d4c1806a83f694eaa8874384bd615c2fa1cee02d2e98\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8aa3a4aef61b7421411f28beecceab476e43c52eb51ac2a257399b01e4c2bcb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8aa3a4aef61b7421411f28beecceab476e43c52eb51ac2a257399b01e4c2bcb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2126875e4bacb36f72de8a3a8bd7c799b89e7691a83fd39d7645a97e95db91b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2126875e4bacb36f72de8a3a8bd7c799b89e7691a83fd39d7645a97e95db91b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\
"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:50:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:06Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:06 crc kubenswrapper[4919]: I0310 21:52:06.293746 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9ed1501-15da-4419-aa12-171e610438d6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9e6a8efa1e2d16b45fe6362b326e3f89333864dc74f3b298d2e500a90d303b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37d8507fd02b92972ed41aa2c4d53fceb1c9d58864e46ddc7991f94fb4d9b3e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://dfb03c5f450790952fc7173bc2a6d723c777921f5f74963bfdbc3573ec1d21cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b2adcae0bf01d646b97b915a28921ad4151ca62e8bdd174b42b5e3dff4b27db\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b2adcae0bf01d646b97b915a28921ad4151ca62e8bdd174b42b5e3dff4b27db\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T21:51:14Z\\\",\\\"message\\\":\\\"file observer\\\\nW0310 21:51:14.180982 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 21:51:14.181120 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 21:51:14.182146 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-915015507/tls.crt::/tmp/serving-cert-915015507/tls.key\\\\\\\"\\\\nI0310 21:51:14.490188 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 21:51:14.496972 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 21:51:14.497009 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 21:51:14.497047 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 21:51:14.497058 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 21:51:14.505682 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 21:51:14.505718 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 21:51:14.505729 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 21:51:14.505740 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0310 21:51:14.505737 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 21:51:14.505748 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 21:51:14.505777 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 21:51:14.505783 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 21:51:14.508219 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T21:51:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47a772db349df6c0c6fe27be93d19e02d66cfaf9739ee12e89730ece1da11473\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15b81f8be7635a10cf516cd17c9ab3d48b74d3421098124fcc47dbab6691e3c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15b81f8be7635a10cf516cd17c9ab3d48b74d3421098124fcc47dbab6691e3c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:50:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:06Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:06 crc kubenswrapper[4919]: I0310 21:52:06.297764 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:52:06 crc kubenswrapper[4919]: I0310 21:52:06.297798 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:52:06 crc kubenswrapper[4919]: I0310 21:52:06.297807 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:52:06 crc kubenswrapper[4919]: I0310 21:52:06.297824 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:52:06 crc kubenswrapper[4919]: I0310 21:52:06.297834 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:52:06Z","lastTransitionTime":"2026-03-10T21:52:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:52:06 crc kubenswrapper[4919]: I0310 21:52:06.310635 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:06Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:06 crc kubenswrapper[4919]: I0310 21:52:06.322787 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:06Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:06 crc kubenswrapper[4919]: I0310 21:52:06.333144 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hzq7c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e92ce303-b70d-4416-b8f1-520b49dca2e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cc1a7ce601001a487303cfae1cef980407a59cc27d02d4f3c4a303b7668639f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qw7c6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hzq7c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:06Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:06 crc kubenswrapper[4919]: I0310 21:52:06.359836 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4dp67" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2e7c6fb-9e33-441d-9197-719929eb9e21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad5abb4f4c3ead1b05c1f3efbb6bb59fc71bcf5796e16a558ef7e16375571fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a8f975d3caae031d32429c7fdf67356ff18bae1f0eefa3b99758d6c461a736\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4dd430eed1c68d77dcea63fe5885eec7aae3fee6a0008bd9e145b93aef04aa26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://648a881496440d2732bd49969e863fcde223e95156c280e7699782a5521f1580\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06e9ce48fc10caa34e8b774ce3039117386ffe276cf268d197ac46518a4baf91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9fa1c50eff4c6bb37ea60631a4278c6a15d6398d6fad6fb1a914bd6cccf15b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ba4289120330c49ae9af7da81d7a041beb11672e2f1bbe9693a69c0e060b437\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81583105305b81fde3bde968fbbd5463ad1100d8f444b22f62764c4676b95325\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ca4b1de4425e9aa53f9cf24a56759708297b223771278a445dcf171f4dd6493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ca4b1de4425e9aa53f9cf24a56759708297b223771278a445dcf171f4dd6493\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4dp67\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:06Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:06 crc kubenswrapper[4919]: I0310 21:52:06.376655 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:06Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:06 crc kubenswrapper[4919]: I0310 21:52:06.389363 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b625p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b82448f1-4387-4d1a-a300-29f4b3d86bbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5edef5b10597e404ec5599983d07529ee344e7d7f5b1a4a7f589678613034b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s9q9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:52:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b625p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:06Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:06 crc kubenswrapper[4919]: I0310 21:52:06.399703 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:52:06 crc kubenswrapper[4919]: I0310 21:52:06.399735 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:52:06 crc kubenswrapper[4919]: I0310 21:52:06.399747 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:52:06 crc kubenswrapper[4919]: I0310 21:52:06.399765 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:52:06 crc kubenswrapper[4919]: I0310 21:52:06.399777 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:52:06Z","lastTransitionTime":"2026-03-10T21:52:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:52:06 crc kubenswrapper[4919]: I0310 21:52:06.404956 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:06Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:06 crc kubenswrapper[4919]: I0310 21:52:06.419992 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:06Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:06 crc kubenswrapper[4919]: I0310 21:52:06.442233 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-z6pc7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ace27ab-c4c7-412b-9ae8-a3e4ff15faec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://607c25e23101a157124cb81f984fac6d36e71a08b7d990e1d11627f3a7de24b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5636b870a2c5e13abd72ff1e7c785363d11540245c14af6f5de75a0b400a6aa\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5636b870a2c5e13abd72ff1e7c785363d11540245c14af6f5de75a0b400a6aa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7cd0e8267d82b65679954a7b935e138a06a6634311569a48268ba5917d6cb7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7cd0e8267d82b65679954a7b935e138a06a6634311569a48268ba5917d6cb7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:52:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:00Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4b4f2cd55ec546c6f44490159ec5b303d67b584fce7afb4b6fe3dd93e95d843\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4b4f2cd55ec546c6f44490159ec5b303d67b584fce7afb4b6fe3dd93e95d843\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:52:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cebcf
235b45b59a23c89f44855f381557a49fc94ec9332c699274507cda4cd12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cebcf235b45b59a23c89f44855f381557a49fc94ec9332c699274507cda4cd12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:52:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b01604ec0336bae6aa21d9b124596bb46f6351b67c413f2fade87febd0887a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b01604ec0336bae6aa21d9b124596bb46f6351b67c413f2fade87febd0887a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:52:03Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3c386dab63ce40e10fc9e5617d1e0c1e87eee2a1148cc64d5faa74542014155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3c386dab63ce40e10fc9e5617d1e0c1e87eee2a1148cc64d5faa74542014155\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:52:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-z6pc7\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:06Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:06 crc kubenswrapper[4919]: I0310 21:52:06.458416 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac8c9a9627b63f7b2a9c80571ca8f781eec442b1fd148631fa417b2e11943437\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/ku
bernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18a390bbc216535df32a4dab5fb983494134d2e9f87a689ea39d3e32592ec663\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:06Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:06 crc kubenswrapper[4919]: I0310 21:52:06.471450 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"566678d1-f416-4116-ab20-b30dceb86cdc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://603ee76064368a216672f45eb860628d301968c311e0bc75b9a73c01f351c9c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrhvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b645dc541f9bef5d9710345252c2ff48e91412f
10d1c0c1bfaa06cf9e82210f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrhvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-z7v4t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:06Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:06 crc kubenswrapper[4919]: I0310 21:52:06.479285 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 21:52:06 crc kubenswrapper[4919]: I0310 21:52:06.479316 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 21:52:06 crc kubenswrapper[4919]: I0310 21:52:06.479772 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 21:52:06 crc kubenswrapper[4919]: E0310 21:52:06.479923 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 21:52:06 crc kubenswrapper[4919]: E0310 21:52:06.480299 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 21:52:06 crc kubenswrapper[4919]: E0310 21:52:06.480673 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 21:52:06 crc kubenswrapper[4919]: I0310 21:52:06.480795 4919 scope.go:117] "RemoveContainer" containerID="5b2adcae0bf01d646b97b915a28921ad4151ca62e8bdd174b42b5e3dff4b27db" Mar 10 21:52:06 crc kubenswrapper[4919]: I0310 21:52:06.485562 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hbw8v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a5db7c3-2a96-4030-8c88-5d82d325b62d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abe6d0aa7236ecb1ecf10432a82e6fd0b3103606dbf07a21f54a1908c77ef697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedA
t\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dwtj4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hbw8v\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:06Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:06 crc kubenswrapper[4919]: I0310 21:52:06.502558 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:52:06 crc kubenswrapper[4919]: I0310 21:52:06.502600 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:52:06 crc kubenswrapper[4919]: I0310 21:52:06.502618 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:52:06 crc kubenswrapper[4919]: I0310 21:52:06.502641 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:52:06 crc kubenswrapper[4919]: I0310 21:52:06.502661 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:52:06Z","lastTransitionTime":"2026-03-10T21:52:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:52:06 crc kubenswrapper[4919]: I0310 21:52:06.605596 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:52:06 crc kubenswrapper[4919]: I0310 21:52:06.605948 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:52:06 crc kubenswrapper[4919]: I0310 21:52:06.605960 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:52:06 crc kubenswrapper[4919]: I0310 21:52:06.605976 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:52:06 crc kubenswrapper[4919]: I0310 21:52:06.605985 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:52:06Z","lastTransitionTime":"2026-03-10T21:52:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:52:06 crc kubenswrapper[4919]: I0310 21:52:06.708161 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:52:06 crc kubenswrapper[4919]: I0310 21:52:06.708190 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:52:06 crc kubenswrapper[4919]: I0310 21:52:06.708200 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:52:06 crc kubenswrapper[4919]: I0310 21:52:06.708213 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:52:06 crc kubenswrapper[4919]: I0310 21:52:06.708224 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:52:06Z","lastTransitionTime":"2026-03-10T21:52:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:52:06 crc kubenswrapper[4919]: I0310 21:52:06.810857 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:52:06 crc kubenswrapper[4919]: I0310 21:52:06.810903 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:52:06 crc kubenswrapper[4919]: I0310 21:52:06.810917 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:52:06 crc kubenswrapper[4919]: I0310 21:52:06.810937 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:52:06 crc kubenswrapper[4919]: I0310 21:52:06.810951 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:52:06Z","lastTransitionTime":"2026-03-10T21:52:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:52:06 crc kubenswrapper[4919]: I0310 21:52:06.914259 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:52:06 crc kubenswrapper[4919]: I0310 21:52:06.914299 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:52:06 crc kubenswrapper[4919]: I0310 21:52:06.914317 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:52:06 crc kubenswrapper[4919]: I0310 21:52:06.914340 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:52:06 crc kubenswrapper[4919]: I0310 21:52:06.914357 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:52:06Z","lastTransitionTime":"2026-03-10T21:52:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:52:06 crc kubenswrapper[4919]: I0310 21:52:06.970234 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 10 21:52:06 crc kubenswrapper[4919]: I0310 21:52:06.973185 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"ce192a4f3e94d00998fbfe0948a32765574a9261d22004480dfb54b9bbf9407a"} Mar 10 21:52:06 crc kubenswrapper[4919]: I0310 21:52:06.974021 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 21:52:06 crc kubenswrapper[4919]: I0310 21:52:06.975691 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"937344f02f0259b0d258de35d490545dad0ce084dd49c7584002da0734cc046e"} Mar 10 21:52:06 crc kubenswrapper[4919]: I0310 21:52:06.994492 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:06Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:07 crc kubenswrapper[4919]: I0310 21:52:07.012944 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:07Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:07 crc kubenswrapper[4919]: I0310 21:52:07.017071 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:52:07 crc kubenswrapper[4919]: I0310 21:52:07.017136 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:52:07 crc kubenswrapper[4919]: I0310 21:52:07.017155 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:52:07 crc kubenswrapper[4919]: I0310 21:52:07.017181 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:52:07 crc kubenswrapper[4919]: I0310 21:52:07.017198 4919 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:52:07Z","lastTransitionTime":"2026-03-10T21:52:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 21:52:07 crc kubenswrapper[4919]: I0310 21:52:07.034168 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-z6pc7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ace27ab-c4c7-412b-9ae8-a3e4ff15faec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://607c25e23101a157124cb81f984fac6d36e71a08b7d990e1d11627f3a7de24b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-ad
ditional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5636b870a2c5e13abd72ff1e7c785363d11540245c14af6f5de75a0b400a6aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5636b870a2c5e13abd72ff1e7c785363d11540245c14af6f5de75a0b400a6aa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7cd0e8267d82b65679954a7b935e138a06a6634311569a48268ba5917d6cb7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df31
2ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7cd0e8267d82b65679954a7b935e138a06a6634311569a48268ba5917d6cb7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:52:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4b4f2cd55ec546c6f44490159ec5b303d67b584fce7afb4b6fe3dd93e95d843\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4b4f2cd55ec546c6f44490159ec5b303d67b584fce7afb4b6fe3dd93e95d843\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:52:01Z\\\",\\\"reason\\\":\\\"Complet
ed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cebcf235b45b59a23c89f44855f381557a49fc94ec9332c699274507cda4cd12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cebcf235b45b59a23c89f44855f381557a49fc94ec9332c699274507cda4cd12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:52:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b01604ec0336bae6aa21d9b124596bb46f6351b67c413f2fade87febd0887a6\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b01604ec0336bae6aa21d9b124596bb46f6351b67c413f2fade87febd0887a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:52:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3c386dab63ce40e10fc9e5617d1e0c1e87eee2a1148cc64d5faa74542014155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3c386dab63ce40e10fc9e5617d1e0c1e87eee2a1148cc64d5faa74542014155\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:52:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-z6pc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:07Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:07 crc kubenswrapper[4919]: I0310 21:52:07.052228 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac8c9a9627b63f7b2a9c80571ca8f781eec442b1fd148631fa417b2e11943437\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18a390bbc216535df32a4dab5fb983494134d2e9f87a689ea39d3e32592ec663\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:07Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:07 crc kubenswrapper[4919]: I0310 21:52:07.068053 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"566678d1-f416-4116-ab20-b30dceb86cdc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://603ee76064368a216672f45eb860628d301968c311e0bc75b9a73c01f351c9c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrhvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b645dc541f9bef5d9710345252c2ff48e91412f
10d1c0c1bfaa06cf9e82210f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrhvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-z7v4t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:07Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:07 crc kubenswrapper[4919]: I0310 21:52:07.082249 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hbw8v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a5db7c3-2a96-4030-8c88-5d82d325b62d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abe6d0aa7236ecb1ecf10432a82e6fd0b3103606dbf07a21f54a1908c77ef697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dwtj4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hbw8v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:07Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:07 crc kubenswrapper[4919]: I0310 21:52:07.091419 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hzq7c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e92ce303-b70d-4416-b8f1-520b49dca2e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cc1a7ce601001a487303cfae1cef980407a59cc27d02d4f3c4a303b7668639f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qw7c6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hzq7c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:07Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:07 crc kubenswrapper[4919]: I0310 21:52:07.110012 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4dp67" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2e7c6fb-9e33-441d-9197-719929eb9e21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad5abb4f4c3ead1b05c1f3efbb6bb59fc71bcf5796e16a558ef7e16375571fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a8f975d3caae031d32429c7fdf67356ff18bae1f0eefa3b99758d6c461a736\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4dd430eed1c68d77dcea63fe5885eec7aae3fee6a0008bd9e145b93aef04aa26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://648a881496440d2732bd49969e863fcde223e95156c280e7699782a5521f1580\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06e9ce48fc10caa34e8b774ce3039117386ffe276cf268d197ac46518a4baf91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9fa1c50eff4c6bb37ea60631a4278c6a15d6398d6fad6fb1a914bd6cccf15b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ba4289120330c49ae9af7da81d7a041beb11672e2f1bbe9693a69c0e060b437\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81583105305b81fde3bde968fbbd5463ad1100d8f444b22f62764c4676b95325\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ca4b1de4425e9aa53f9cf24a56759708297b223771278a445dcf171f4dd6493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ca4b1de4425e9aa53f9cf24a56759708297b223771278a445dcf171f4dd6493\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4dp67\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:07Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:07 crc kubenswrapper[4919]: I0310 21:52:07.118800 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed0693f5-4dbc-4621-9cf6-450d64aaea59\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f338ebc5fc228d07415015c51f7ed4fcc24d5bf76a644e491b5c4b9dc51b71f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e1
8d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5205e89649c7c1e920c967873bb1a404548b539605a180b02a1c249ff499e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5205e89649c7c1e920c967873bb1a404548b539605a180b02a1c249ff499e32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:50:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:07Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:07 crc 
kubenswrapper[4919]: I0310 21:52:07.119542 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:52:07 crc kubenswrapper[4919]: I0310 21:52:07.119567 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:52:07 crc kubenswrapper[4919]: I0310 21:52:07.119575 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:52:07 crc kubenswrapper[4919]: I0310 21:52:07.119588 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:52:07 crc kubenswrapper[4919]: I0310 21:52:07.119597 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:52:07Z","lastTransitionTime":"2026-03-10T21:52:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:52:07 crc kubenswrapper[4919]: I0310 21:52:07.137590 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5163ae00-7b50-497d-9770-0d787026b436\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://322af039c736bcf0b853ee5527ebb1b1750484dfab074745abcd75c24fdcccbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bfa2696bd9b6e5d247686e5297b6ae2f49e5b216174391f211cb2a3a4966135\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c23aaef6f076ab2a428323d38fac48e0c55ad52c55a46c942bccad06474fd31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a88714c27822fd18dff500c973b9d548414d59c7666de938e3cb0c6b18e277e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67063544d268ca488af7ae401113e6f35bb48688e50f944cfa03360de376611a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc6d32c621a9826f8539d4c1806a83f694eaa8874384bd615c2fa1cee02d2e98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc6d32c621a9826f8539d4c1806a83f694eaa8874384bd615c2fa1cee02d2e98\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8aa3a4aef61b7421411f28beecceab476e43c52eb51ac2a257399b01e4c2bcb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8aa3a4aef61b7421411f28beecceab476e43c52eb51ac2a257399b01e4c2bcb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2126875e4bacb36f72de8a3a8bd7c799b89e7691a83fd39d7645a97e95db91b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2126875e4bacb36f72de8a3a8bd7c799b89e7691a83fd39d7645a97e95db91b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-03-10T21:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:50:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:07Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:07 crc kubenswrapper[4919]: I0310 21:52:07.154608 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9ed1501-15da-4419-aa12-171e610438d6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9e6a8efa1e2d16b45fe6362b326e3f89333864dc74f3b298d2e500a90d303b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37d8507fd02b92972ed41aa2c4d53fceb1c9d58864e46ddc7991f94fb4d9b3e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://dfb03c5f450790952fc7173bc2a6d723c777921f5f74963bfdbc3573ec1d21cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce192a4f3e94d00998fbfe0948a32765574a9261d22004480dfb54b9bbf9407a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b2adcae0bf01d646b97b915a28921ad4151ca62e8bdd174b42b5e3dff4b27db\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T21:51:14Z\\\",\\\"message\\\":\\\"file observer\\\\nW0310 21:51:14.180982 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 21:51:14.181120 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 21:51:14.182146 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-915015507/tls.crt::/tmp/serving-cert-915015507/tls.key\\\\\\\"\\\\nI0310 21:51:14.490188 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 21:51:14.496972 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 21:51:14.497009 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 21:51:14.497047 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 21:51:14.497058 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 21:51:14.505682 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 21:51:14.505718 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 21:51:14.505729 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 21:51:14.505740 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0310 21:51:14.505737 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 21:51:14.505748 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 21:51:14.505777 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 21:51:14.505783 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 21:51:14.508219 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T21:51:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47a772db349df6c0c6fe27be93d19e02d66cfaf9739ee12e89730ece1da11473\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15b81f8be7635a10cf516cd17c9ab3d48b74d3421098124fcc47dbab6691e3c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15b81f8be7635a10cf516cd17c9ab3d48b74d3421098124fcc47dbab6691e3c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-10T21:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:50:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:07Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:07 crc kubenswrapper[4919]: I0310 21:52:07.167099 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:07Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:07 crc kubenswrapper[4919]: I0310 21:52:07.181188 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:07Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:07 crc kubenswrapper[4919]: I0310 21:52:07.193754 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:07Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:07 crc kubenswrapper[4919]: I0310 21:52:07.203162 4919 
status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b625p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b82448f1-4387-4d1a-a300-29f4b3d86bbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5edef5b10597e404ec5599983d07529ee344e7d7f5b1a4a7f589678613034b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s9q9t\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:52:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b625p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:07Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:07 crc kubenswrapper[4919]: I0310 21:52:07.211974 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b625p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b82448f1-4387-4d1a-a300-29f4b3d86bbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5edef5b10597e404ec5599983d07529ee344e7d7f5b1a4a7f589678613034b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s9q9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:52:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b625p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:07Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:07 crc kubenswrapper[4919]: I0310 21:52:07.221671 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:52:07 crc kubenswrapper[4919]: I0310 21:52:07.221709 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:52:07 crc kubenswrapper[4919]: I0310 21:52:07.221719 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:52:07 crc kubenswrapper[4919]: I0310 21:52:07.221733 4919 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:52:07 crc kubenswrapper[4919]: I0310 21:52:07.221745 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:52:07Z","lastTransitionTime":"2026-03-10T21:52:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 21:52:07 crc kubenswrapper[4919]: I0310 21:52:07.223780 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://937344f02f0259b0d258de35d490545dad0ce084dd49c7584002da0734cc046e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\
\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:07Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:07 crc kubenswrapper[4919]: I0310 21:52:07.239610 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:07Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:07 crc kubenswrapper[4919]: I0310 21:52:07.255055 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:07Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:07 crc kubenswrapper[4919]: I0310 21:52:07.272660 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-z6pc7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ace27ab-c4c7-412b-9ae8-a3e4ff15faec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://607c25e23101a157124cb81f984fac6d36e71a08b7d990e1d11627f3a7de24b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5636b870a2c5e13abd72ff1e7c785363d11540245c14af6f5de75a0b400a6aa\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5636b870a2c5e13abd72ff1e7c785363d11540245c14af6f5de75a0b400a6aa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7cd0e8267d82b65679954a7b935e138a06a6634311569a48268ba5917d6cb7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7cd0e8267d82b65679954a7b935e138a06a6634311569a48268ba5917d6cb7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:52:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:00Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4b4f2cd55ec546c6f44490159ec5b303d67b584fce7afb4b6fe3dd93e95d843\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4b4f2cd55ec546c6f44490159ec5b303d67b584fce7afb4b6fe3dd93e95d843\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:52:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cebcf
235b45b59a23c89f44855f381557a49fc94ec9332c699274507cda4cd12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cebcf235b45b59a23c89f44855f381557a49fc94ec9332c699274507cda4cd12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:52:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b01604ec0336bae6aa21d9b124596bb46f6351b67c413f2fade87febd0887a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b01604ec0336bae6aa21d9b124596bb46f6351b67c413f2fade87febd0887a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:52:03Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3c386dab63ce40e10fc9e5617d1e0c1e87eee2a1148cc64d5faa74542014155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3c386dab63ce40e10fc9e5617d1e0c1e87eee2a1148cc64d5faa74542014155\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:52:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-z6pc7\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:07Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:07 crc kubenswrapper[4919]: I0310 21:52:07.285478 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac8c9a9627b63f7b2a9c80571ca8f781eec442b1fd148631fa417b2e11943437\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/ku
bernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18a390bbc216535df32a4dab5fb983494134d2e9f87a689ea39d3e32592ec663\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:07Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:07 crc kubenswrapper[4919]: I0310 21:52:07.296827 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"566678d1-f416-4116-ab20-b30dceb86cdc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://603ee76064368a216672f45eb860628d301968c311e0bc75b9a73c01f351c9c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrhvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b645dc541f9bef5d9710345252c2ff48e91412f
10d1c0c1bfaa06cf9e82210f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrhvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-z7v4t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:07Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:07 crc kubenswrapper[4919]: I0310 21:52:07.323572 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:52:07 crc kubenswrapper[4919]: I0310 21:52:07.323615 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:52:07 crc kubenswrapper[4919]: I0310 21:52:07.323624 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:52:07 crc 
kubenswrapper[4919]: I0310 21:52:07.323638 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:52:07 crc kubenswrapper[4919]: I0310 21:52:07.323649 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:52:07Z","lastTransitionTime":"2026-03-10T21:52:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 21:52:07 crc kubenswrapper[4919]: I0310 21:52:07.342063 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hbw8v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a5db7c3-2a96-4030-8c88-5d82d325b62d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abe6d0aa7236ecb1ecf10432a82e6fd0b3103606dbf07a21f54a1908c77ef697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413b
dcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dwtj4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Dis
abled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hbw8v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:07Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:07 crc kubenswrapper[4919]: I0310 21:52:07.401057 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5163ae00-7b50-497d-9770-0d787026b436\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://322af039c736bcf0b853ee5527ebb1b1750484dfab074745abcd75c24fdcccbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-de
v/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bfa2696bd9b6e5d247686e5297b6ae2f49e5b216174391f211cb2a3a4966135\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c23aaef6f076ab2a428323d38fac48e0c55ad52c55a46c942bccad06474fd31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"re
startCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a88714c27822fd18dff500c973b9d548414d59c7666de938e3cb0c6b18e277e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67063544d268ca488af7ae401113e6f35bb48688e50f944cfa03360de376611a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\
\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc6d32c621a9826f8539d4c1806a83f694eaa8874384bd615c2fa1cee02d2e98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc6d32c621a9826f8539d4c1806a83f694eaa8874384bd615c2fa1cee02d2e98\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8aa3a4aef61b7421411f28beecceab476e43c52eb51ac2a257399b01e4c2bcb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8aa3a4aef61b7421411f28beecceab476e43c52eb51ac2a257399b01e4c2bcb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2126875e4bacb36f72de8a3a8bd7c799b89e7691a83fd39d7645a97e95db91b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2126875e4bacb36f72de8a3a8bd7c799b89e7691a83fd39d7645a97e95db91b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:50:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:07Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:07 crc kubenswrapper[4919]: I0310 21:52:07.417831 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9ed1501-15da-4419-aa12-171e610438d6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9e6a8efa1e2d16b45fe6362b326e3f89333864dc74f3b298d2e500a90d303b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37d8507fd02b92972ed41aa2c4d53fceb1c9d58864e46ddc7991f94fb4d9b3e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfb03c5f450790952fc7173bc2a6d723c777921f5f74963bfdbc3573ec1d21cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce192a4f3e94d00998fbfe0948a32765574a9261d22004480dfb54b9bbf9407a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b2adcae0bf01d646b97b915a28921ad4151ca62e8bdd174b42b5e3dff4b27db\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T21:51:14Z\\\",\\\"message\\\":\\\"file observer\\\\nW0310 21:51:14.180982 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 21:51:14.181120 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 21:51:14.182146 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-915015507/tls.crt::/tmp/serving-cert-915015507/tls.key\\\\\\\"\\\\nI0310 21:51:14.490188 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 21:51:14.496972 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 21:51:14.497009 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 21:51:14.497047 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 21:51:14.497058 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 21:51:14.505682 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 21:51:14.505718 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 21:51:14.505729 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 21:51:14.505740 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0310 21:51:14.505737 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 
21:51:14.505748 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 21:51:14.505777 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 21:51:14.505783 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 21:51:14.508219 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T21:51:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47a772db349df6c0c6fe27be93d19e02d66cfaf9739ee12e89730ece1da11473\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15b81f8be7635a10cf516cd17c9ab3d48b74d3421098124fcc47dbab6691e3c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15b81f8be7635a10cf516cd17c9ab3d48b74d3421098124fcc47dbab6691e3c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:50:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:07Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:07 crc kubenswrapper[4919]: I0310 21:52:07.426265 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:52:07 crc kubenswrapper[4919]: I0310 21:52:07.426297 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:52:07 crc kubenswrapper[4919]: I0310 21:52:07.426307 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:52:07 crc kubenswrapper[4919]: I0310 21:52:07.426346 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:52:07 crc kubenswrapper[4919]: I0310 21:52:07.426358 4919 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:52:07Z","lastTransitionTime":"2026-03-10T21:52:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 21:52:07 crc kubenswrapper[4919]: I0310 21:52:07.458137 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:07Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:07 crc kubenswrapper[4919]: I0310 21:52:07.498842 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:07Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:07 crc kubenswrapper[4919]: I0310 21:52:07.528412 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:52:07 crc kubenswrapper[4919]: I0310 21:52:07.528435 4919 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:52:07 crc kubenswrapper[4919]: I0310 21:52:07.528444 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:52:07 crc kubenswrapper[4919]: I0310 21:52:07.528457 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:52:07 crc kubenswrapper[4919]: I0310 21:52:07.528466 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:52:07Z","lastTransitionTime":"2026-03-10T21:52:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 21:52:07 crc kubenswrapper[4919]: I0310 21:52:07.540010 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hzq7c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e92ce303-b70d-4416-b8f1-520b49dca2e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cc1a7ce601001a487303cfae1cef980407a59cc27d02d4f3c4a303b7668639f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qw7c6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hzq7c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:07Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:07 crc kubenswrapper[4919]: I0310 21:52:07.585339 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4dp67" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2e7c6fb-9e33-441d-9197-719929eb9e21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad5abb4f4c3ead1b05c1f3efbb6bb59fc71bcf5796e16a558ef7e16375571fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a8f975d3caae031d32429c7fdf67356ff18bae1f0eefa3b99758d6c461a736\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4dd430eed1c68d77dcea63fe5885eec7aae3fee6a0008bd9e145b93aef04aa26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://648a881496440d2732bd49969e863fcde223e95156c280e7699782a5521f1580\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06e9ce48fc10caa34e8b774ce3039117386ffe276cf268d197ac46518a4baf91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9fa1c50eff4c6bb37ea60631a4278c6a15d6398d6fad6fb1a914bd6cccf15b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ba4289120330c49ae9af7da81d7a041beb11672e2f1bbe9693a69c0e060b437\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81583105305b81fde3bde968fbbd5463ad1100d8f444b22f62764c4676b95325\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ca4b1de4425e9aa53f9cf24a56759708297b223771278a445dcf171f4dd6493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ca4b1de4425e9aa53f9cf24a56759708297b223771278a445dcf171f4dd6493\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4dp67\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:07Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:07 crc kubenswrapper[4919]: I0310 21:52:07.615272 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed0693f5-4dbc-4621-9cf6-450d64aaea59\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f338ebc5fc228d07415015c51f7ed4fcc24d5bf76a644e491b5c4b9dc51b71f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e1
8d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5205e89649c7c1e920c967873bb1a404548b539605a180b02a1c249ff499e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5205e89649c7c1e920c967873bb1a404548b539605a180b02a1c249ff499e32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:50:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:07Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:07 crc 
kubenswrapper[4919]: I0310 21:52:07.631088 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:52:07 crc kubenswrapper[4919]: I0310 21:52:07.631112 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:52:07 crc kubenswrapper[4919]: I0310 21:52:07.631120 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:52:07 crc kubenswrapper[4919]: I0310 21:52:07.631132 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:52:07 crc kubenswrapper[4919]: I0310 21:52:07.631141 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:52:07Z","lastTransitionTime":"2026-03-10T21:52:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:52:07 crc kubenswrapper[4919]: I0310 21:52:07.732810 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:52:07 crc kubenswrapper[4919]: I0310 21:52:07.732845 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:52:07 crc kubenswrapper[4919]: I0310 21:52:07.732855 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:52:07 crc kubenswrapper[4919]: I0310 21:52:07.732871 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:52:07 crc kubenswrapper[4919]: I0310 21:52:07.732881 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:52:07Z","lastTransitionTime":"2026-03-10T21:52:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:52:07 crc kubenswrapper[4919]: I0310 21:52:07.835635 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:52:07 crc kubenswrapper[4919]: I0310 21:52:07.835659 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:52:07 crc kubenswrapper[4919]: I0310 21:52:07.835667 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:52:07 crc kubenswrapper[4919]: I0310 21:52:07.835680 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:52:07 crc kubenswrapper[4919]: I0310 21:52:07.835695 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:52:07Z","lastTransitionTime":"2026-03-10T21:52:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:52:07 crc kubenswrapper[4919]: I0310 21:52:07.938435 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:52:07 crc kubenswrapper[4919]: I0310 21:52:07.938476 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:52:07 crc kubenswrapper[4919]: I0310 21:52:07.938486 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:52:07 crc kubenswrapper[4919]: I0310 21:52:07.938500 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:52:07 crc kubenswrapper[4919]: I0310 21:52:07.938508 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:52:07Z","lastTransitionTime":"2026-03-10T21:52:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:52:08 crc kubenswrapper[4919]: I0310 21:52:08.041042 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:52:08 crc kubenswrapper[4919]: I0310 21:52:08.041078 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:52:08 crc kubenswrapper[4919]: I0310 21:52:08.041088 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:52:08 crc kubenswrapper[4919]: I0310 21:52:08.041105 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:52:08 crc kubenswrapper[4919]: I0310 21:52:08.041115 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:52:08Z","lastTransitionTime":"2026-03-10T21:52:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:52:08 crc kubenswrapper[4919]: I0310 21:52:08.144009 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:52:08 crc kubenswrapper[4919]: I0310 21:52:08.144043 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:52:08 crc kubenswrapper[4919]: I0310 21:52:08.144051 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:52:08 crc kubenswrapper[4919]: I0310 21:52:08.144064 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:52:08 crc kubenswrapper[4919]: I0310 21:52:08.144073 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:52:08Z","lastTransitionTime":"2026-03-10T21:52:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:52:08 crc kubenswrapper[4919]: I0310 21:52:08.246888 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:52:08 crc kubenswrapper[4919]: I0310 21:52:08.246948 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:52:08 crc kubenswrapper[4919]: I0310 21:52:08.246965 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:52:08 crc kubenswrapper[4919]: I0310 21:52:08.246988 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:52:08 crc kubenswrapper[4919]: I0310 21:52:08.247005 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:52:08Z","lastTransitionTime":"2026-03-10T21:52:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:52:08 crc kubenswrapper[4919]: I0310 21:52:08.349325 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:52:08 crc kubenswrapper[4919]: I0310 21:52:08.349353 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:52:08 crc kubenswrapper[4919]: I0310 21:52:08.349362 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:52:08 crc kubenswrapper[4919]: I0310 21:52:08.349375 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:52:08 crc kubenswrapper[4919]: I0310 21:52:08.349383 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:52:08Z","lastTransitionTime":"2026-03-10T21:52:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:52:08 crc kubenswrapper[4919]: I0310 21:52:08.451516 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:52:08 crc kubenswrapper[4919]: I0310 21:52:08.451554 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:52:08 crc kubenswrapper[4919]: I0310 21:52:08.451562 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:52:08 crc kubenswrapper[4919]: I0310 21:52:08.451580 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:52:08 crc kubenswrapper[4919]: I0310 21:52:08.451589 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:52:08Z","lastTransitionTime":"2026-03-10T21:52:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 21:52:08 crc kubenswrapper[4919]: I0310 21:52:08.479181 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 21:52:08 crc kubenswrapper[4919]: I0310 21:52:08.479185 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 21:52:08 crc kubenswrapper[4919]: I0310 21:52:08.479191 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 21:52:08 crc kubenswrapper[4919]: E0310 21:52:08.479427 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 21:52:08 crc kubenswrapper[4919]: E0310 21:52:08.479285 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 21:52:08 crc kubenswrapper[4919]: E0310 21:52:08.479586 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 21:52:08 crc kubenswrapper[4919]: I0310 21:52:08.553853 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:52:08 crc kubenswrapper[4919]: I0310 21:52:08.553891 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:52:08 crc kubenswrapper[4919]: I0310 21:52:08.553902 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:52:08 crc kubenswrapper[4919]: I0310 21:52:08.553916 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:52:08 crc kubenswrapper[4919]: I0310 21:52:08.553925 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:52:08Z","lastTransitionTime":"2026-03-10T21:52:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:52:08 crc kubenswrapper[4919]: I0310 21:52:08.655808 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:52:08 crc kubenswrapper[4919]: I0310 21:52:08.655837 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:52:08 crc kubenswrapper[4919]: I0310 21:52:08.655845 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:52:08 crc kubenswrapper[4919]: I0310 21:52:08.655857 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:52:08 crc kubenswrapper[4919]: I0310 21:52:08.655867 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:52:08Z","lastTransitionTime":"2026-03-10T21:52:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:52:08 crc kubenswrapper[4919]: I0310 21:52:08.758344 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:52:08 crc kubenswrapper[4919]: I0310 21:52:08.758383 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:52:08 crc kubenswrapper[4919]: I0310 21:52:08.758404 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:52:08 crc kubenswrapper[4919]: I0310 21:52:08.758418 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:52:08 crc kubenswrapper[4919]: I0310 21:52:08.758427 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:52:08Z","lastTransitionTime":"2026-03-10T21:52:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:52:08 crc kubenswrapper[4919]: I0310 21:52:08.861261 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:52:08 crc kubenswrapper[4919]: I0310 21:52:08.861321 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:52:08 crc kubenswrapper[4919]: I0310 21:52:08.861338 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:52:08 crc kubenswrapper[4919]: I0310 21:52:08.861364 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:52:08 crc kubenswrapper[4919]: I0310 21:52:08.861381 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:52:08Z","lastTransitionTime":"2026-03-10T21:52:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:52:08 crc kubenswrapper[4919]: I0310 21:52:08.964489 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:52:08 crc kubenswrapper[4919]: I0310 21:52:08.964561 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:52:08 crc kubenswrapper[4919]: I0310 21:52:08.964580 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:52:08 crc kubenswrapper[4919]: I0310 21:52:08.964612 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:52:08 crc kubenswrapper[4919]: I0310 21:52:08.964629 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:52:08Z","lastTransitionTime":"2026-03-10T21:52:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:52:08 crc kubenswrapper[4919]: I0310 21:52:08.984526 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"f95c272d21026474ba17da6abc519f0cc1874dbdded3e089a107b23cdd20fa6d"} Mar 10 21:52:08 crc kubenswrapper[4919]: I0310 21:52:08.988239 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4dp67_a2e7c6fb-9e33-441d-9197-719929eb9e21/ovnkube-controller/0.log" Mar 10 21:52:08 crc kubenswrapper[4919]: I0310 21:52:08.993005 4919 generic.go:334] "Generic (PLEG): container finished" podID="a2e7c6fb-9e33-441d-9197-719929eb9e21" containerID="6ba4289120330c49ae9af7da81d7a041beb11672e2f1bbe9693a69c0e060b437" exitCode=1 Mar 10 21:52:08 crc kubenswrapper[4919]: I0310 21:52:08.993090 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4dp67" event={"ID":"a2e7c6fb-9e33-441d-9197-719929eb9e21","Type":"ContainerDied","Data":"6ba4289120330c49ae9af7da81d7a041beb11672e2f1bbe9693a69c0e060b437"} Mar 10 21:52:08 crc kubenswrapper[4919]: I0310 21:52:08.994241 4919 scope.go:117] "RemoveContainer" containerID="6ba4289120330c49ae9af7da81d7a041beb11672e2f1bbe9693a69c0e060b437" Mar 10 21:52:09 crc kubenswrapper[4919]: I0310 21:52:09.009983 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:09Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:09 crc kubenswrapper[4919]: I0310 21:52:09.027894 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:09Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:09 crc kubenswrapper[4919]: I0310 21:52:09.056708 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-z6pc7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ace27ab-c4c7-412b-9ae8-a3e4ff15faec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://607c25e23101a157124cb81f984fac6d36e71a08b7d990e1d11627f3a7de24b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5636b870a2c5e13abd72ff1e7c785363d11540245c14af6f5de75a0b400a6aa\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5636b870a2c5e13abd72ff1e7c785363d11540245c14af6f5de75a0b400a6aa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7cd0e8267d82b65679954a7b935e138a06a6634311569a48268ba5917d6cb7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7cd0e8267d82b65679954a7b935e138a06a6634311569a48268ba5917d6cb7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:52:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:00Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4b4f2cd55ec546c6f44490159ec5b303d67b584fce7afb4b6fe3dd93e95d843\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4b4f2cd55ec546c6f44490159ec5b303d67b584fce7afb4b6fe3dd93e95d843\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:52:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cebcf
235b45b59a23c89f44855f381557a49fc94ec9332c699274507cda4cd12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cebcf235b45b59a23c89f44855f381557a49fc94ec9332c699274507cda4cd12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:52:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b01604ec0336bae6aa21d9b124596bb46f6351b67c413f2fade87febd0887a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b01604ec0336bae6aa21d9b124596bb46f6351b67c413f2fade87febd0887a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:52:03Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3c386dab63ce40e10fc9e5617d1e0c1e87eee2a1148cc64d5faa74542014155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3c386dab63ce40e10fc9e5617d1e0c1e87eee2a1148cc64d5faa74542014155\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:52:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-z6pc7\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:09Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:09 crc kubenswrapper[4919]: I0310 21:52:09.068864 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:52:09 crc kubenswrapper[4919]: I0310 21:52:09.069385 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:52:09 crc kubenswrapper[4919]: I0310 21:52:09.069448 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:52:09 crc kubenswrapper[4919]: I0310 21:52:09.069503 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:52:09 crc kubenswrapper[4919]: I0310 21:52:09.069529 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:52:09Z","lastTransitionTime":"2026-03-10T21:52:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:52:09 crc kubenswrapper[4919]: I0310 21:52:09.085373 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac8c9a9627b63f7b2a9c80571ca8f781eec442b1fd148631fa417b2e11943437\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18a390bbc216535df32a4dab5fb983494134d2e9f87a689ea39d3e32592ec663\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:09Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:09 crc kubenswrapper[4919]: I0310 21:52:09.100537 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"566678d1-f416-4116-ab20-b30dceb86cdc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://603ee76064368a216672f45eb860628d301968c311e0bc75b9a73c01f351c9c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrhvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b645dc541f9bef5d9710345252c2ff48e91412f
10d1c0c1bfaa06cf9e82210f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrhvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-z7v4t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:09Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:09 crc kubenswrapper[4919]: I0310 21:52:09.121147 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hbw8v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a5db7c3-2a96-4030-8c88-5d82d325b62d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abe6d0aa7236ecb1ecf10432a82e6fd0b3103606dbf07a21f54a1908c77ef697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dwtj4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hbw8v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:09Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:09 crc kubenswrapper[4919]: I0310 21:52:09.138609 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hzq7c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e92ce303-b70d-4416-b8f1-520b49dca2e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cc1a7ce601001a487303cfae1cef980407a59cc27d02d4f3c4a303b7668639f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qw7c6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hzq7c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:09Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:09 crc kubenswrapper[4919]: I0310 21:52:09.167870 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4dp67" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2e7c6fb-9e33-441d-9197-719929eb9e21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad5abb4f4c3ead1b05c1f3efbb6bb59fc71bcf5796e16a558ef7e16375571fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a8f975d3caae031d32429c7fdf67356ff18bae1f0eefa3b99758d6c461a736\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4dd430eed1c68d77dcea63fe5885eec7aae3fee6a0008bd9e145b93aef04aa26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://648a881496440d2732bd49969e863fcde223e95156c280e7699782a5521f1580\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06e9ce48fc10caa34e8b774ce3039117386ffe276cf268d197ac46518a4baf91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9fa1c50eff4c6bb37ea60631a4278c6a15d6398d6fad6fb1a914bd6cccf15b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ba4289120330c49ae9af7da81d7a041beb11672e2f1bbe9693a69c0e060b437\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81583105305b81fde3bde968fbbd5463ad1100d8f444b22f62764c4676b95325\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ca4b1de4425e9aa53f9cf24a56759708297b223771278a445dcf171f4dd6493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ca4b1de4425e9aa53f9cf24a56759708297b223771278a445dcf171f4dd6493\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4dp67\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:09Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:09 crc kubenswrapper[4919]: I0310 21:52:09.176313 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:52:09 crc kubenswrapper[4919]: I0310 21:52:09.176353 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:52:09 crc kubenswrapper[4919]: I0310 21:52:09.176366 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:52:09 crc kubenswrapper[4919]: I0310 21:52:09.176409 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:52:09 crc kubenswrapper[4919]: I0310 21:52:09.176423 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:52:09Z","lastTransitionTime":"2026-03-10T21:52:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:52:09 crc kubenswrapper[4919]: I0310 21:52:09.183679 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed0693f5-4dbc-4621-9cf6-450d64aaea59\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f338ebc5fc228d07415015c51f7ed4fcc24d5bf76a644e491b5c4b9dc51b71f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11
\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5205e89649c7c1e920c967873bb1a404548b539605a180b02a1c249ff499e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5205e89649c7c1e920c967873bb1a404548b539605a180b02a1c249ff499e32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:50:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:09Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:09 crc kubenswrapper[4919]: I0310 21:52:09.225684 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5163ae00-7b50-497d-9770-0d787026b436\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://322af039c736bcf0b853ee5527ebb1b1750484dfab074745abcd75c24fdcccbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bfa2696bd9b6e5d247686e5297b6ae2f49e5b216174391f211cb2a3a4966135\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c23aaef6f076ab2a428323d38fac48e0c55ad52c55a46c942bccad06474fd31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a88714c27822fd18dff500c973b9d548414d59c7666de938e3cb0c6b18e277e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67063544d268ca488af7ae401113e6f35bb48688e50f944cfa03360de376611a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc6d32c621a9826f8539d4c1806a83f694eaa8874384bd615c2fa1cee02d2e98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc6d32c621a9826f8539d4c1806a83f694eaa8874384bd615c2fa1cee02d2e98\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T21:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8aa3a4aef61b7421411f28beecceab476e43c52eb51ac2a257399b01e4c2bcb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8aa3a4aef61b7421411f28beecceab476e43c52eb51ac2a257399b01e4c2bcb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2126875e4bacb36f72de8a3a8bd7c799b89e7691a83fd39d7645a97e95db91b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2126875e4bacb36f72de8a3a8bd7c799b89e7691a83fd39d7645a97e95db91b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:50:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:09Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:09 crc kubenswrapper[4919]: I0310 21:52:09.249600 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9ed1501-15da-4419-aa12-171e610438d6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9e6a8efa1e2d16b45fe6362b326e3f89333864dc74f3b298d2e500a90d303b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37d8507fd02b92972ed41aa2c4d53fceb1c9d58864e46ddc7991f94fb4d9b3e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://dfb03c5f450790952fc7173bc2a6d723c777921f5f74963bfdbc3573ec1d21cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce192a4f3e94d00998fbfe0948a32765574a9261d22004480dfb54b9bbf9407a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b2adcae0bf01d646b97b915a28921ad4151ca62e8bdd174b42b5e3dff4b27db\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T21:51:14Z\\\",\\\"message\\\":\\\"file observer\\\\nW0310 21:51:14.180982 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 21:51:14.181120 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 21:51:14.182146 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-915015507/tls.crt::/tmp/serving-cert-915015507/tls.key\\\\\\\"\\\\nI0310 21:51:14.490188 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 21:51:14.496972 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 21:51:14.497009 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 21:51:14.497047 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 21:51:14.497058 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 21:51:14.505682 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 21:51:14.505718 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 21:51:14.505729 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 21:51:14.505740 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0310 21:51:14.505737 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 21:51:14.505748 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 21:51:14.505777 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 21:51:14.505783 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 21:51:14.508219 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T21:51:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47a772db349df6c0c6fe27be93d19e02d66cfaf9739ee12e89730ece1da11473\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15b81f8be7635a10cf516cd17c9ab3d48b74d3421098124fcc47dbab6691e3c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15b81f8be7635a10cf516cd17c9ab3d48b74d3421098124fcc47dbab6691e3c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-10T21:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:50:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:09Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:09 crc kubenswrapper[4919]: I0310 21:52:09.269601 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:09Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:09 crc kubenswrapper[4919]: I0310 21:52:09.278770 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:52:09 crc kubenswrapper[4919]: I0310 21:52:09.278811 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:52:09 crc kubenswrapper[4919]: I0310 21:52:09.278822 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:52:09 crc 
kubenswrapper[4919]: I0310 21:52:09.278839 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:52:09 crc kubenswrapper[4919]: I0310 21:52:09.278852 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:52:09Z","lastTransitionTime":"2026-03-10T21:52:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 21:52:09 crc kubenswrapper[4919]: I0310 21:52:09.287772 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f95c272d21026474ba17da6abc519f0cc1874dbdded3e089a107b23cdd20fa6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:09Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:09 crc kubenswrapper[4919]: I0310 21:52:09.306054 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://937344f02f0259b0d258de35d490545dad0ce084dd49c7584002da0734cc046e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:09Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:09 crc kubenswrapper[4919]: I0310 21:52:09.321479 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b625p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b82448f1-4387-4d1a-a300-29f4b3d86bbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5edef5b10597e404ec5599983d07529ee344e7d7f5b1a4a7f589678613034b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\
",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s9q9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:52:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b625p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:09Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:09 crc kubenswrapper[4919]: I0310 21:52:09.333769 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac8c9a9627b63f7b2a9c80571ca8f781eec442b1fd148631fa417b2e11943437\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18a390bbc216535df32a4dab5fb983494134d2e9f87a689ea39d3e32592ec663\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:09Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:09 crc kubenswrapper[4919]: I0310 21:52:09.345748 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"566678d1-f416-4116-ab20-b30dceb86cdc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://603ee76064368a216672f45eb860628d301968c311e0bc75b9a73c01f351c9c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrhvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b645dc541f9bef5d9710345252c2ff48e91412f
10d1c0c1bfaa06cf9e82210f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrhvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-z7v4t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:09Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:09 crc kubenswrapper[4919]: I0310 21:52:09.358548 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hbw8v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a5db7c3-2a96-4030-8c88-5d82d325b62d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abe6d0aa7236ecb1ecf10432a82e6fd0b3103606dbf07a21f54a1908c77ef697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dwtj4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hbw8v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:09Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:09 crc kubenswrapper[4919]: I0310 21:52:09.381718 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:52:09 crc 
kubenswrapper[4919]: I0310 21:52:09.381778 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:52:09 crc kubenswrapper[4919]: I0310 21:52:09.381795 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:52:09 crc kubenswrapper[4919]: I0310 21:52:09.381818 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:52:09 crc kubenswrapper[4919]: I0310 21:52:09.381834 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:52:09Z","lastTransitionTime":"2026-03-10T21:52:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 21:52:09 crc kubenswrapper[4919]: I0310 21:52:09.382136 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5163ae00-7b50-497d-9770-0d787026b436\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://322af039c736bcf0b853ee5527ebb1b1750484dfab074745abcd75c24fdcccbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bfa2696bd9b6e5d247686e5297b6ae2f49e5b216174391f211cb2a3a4966135\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c23aaef6f076ab2a428323d38fac48e0c55ad52c55a46c942bccad06474fd31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a88714c27822fd18dff500c973b9d548414d59c7666de938e3cb0c6b18e277e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67063544d268ca488af7ae401113e6f35bb48688e50f944cfa03360de376611a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc6d32c621a9826f8539d4c1806a83f694eaa8874384bd615c2fa1cee02d2e98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc6d32c621a9826f8539d4c1806a83f694eaa8874384bd615c2fa1cee02d2e98\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T21:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8aa3a4aef61b7421411f28beecceab476e43c52eb51ac2a257399b01e4c2bcb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8aa3a4aef61b7421411f28beecceab476e43c52eb51ac2a257399b01e4c2bcb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2126875e4bacb36f72de8a3a8bd7c799b89e7691a83fd39d7645a97e95db91b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2126875e4bacb36f72de8a3a8bd7c799b89e7691a83fd39d7645a97e95db91b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:50:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:09Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:09 crc kubenswrapper[4919]: I0310 21:52:09.398596 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9ed1501-15da-4419-aa12-171e610438d6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9e6a8efa1e2d16b45fe6362b326e3f89333864dc74f3b298d2e500a90d303b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37d8507fd02b92972ed41aa2c4d53fceb1c9d58864e46ddc7991f94fb4d9b3e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://dfb03c5f450790952fc7173bc2a6d723c777921f5f74963bfdbc3573ec1d21cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce192a4f3e94d00998fbfe0948a32765574a9261d22004480dfb54b9bbf9407a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b2adcae0bf01d646b97b915a28921ad4151ca62e8bdd174b42b5e3dff4b27db\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T21:51:14Z\\\",\\\"message\\\":\\\"file observer\\\\nW0310 21:51:14.180982 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 21:51:14.181120 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 21:51:14.182146 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-915015507/tls.crt::/tmp/serving-cert-915015507/tls.key\\\\\\\"\\\\nI0310 21:51:14.490188 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 21:51:14.496972 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 21:51:14.497009 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 21:51:14.497047 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 21:51:14.497058 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 21:51:14.505682 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 21:51:14.505718 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 21:51:14.505729 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 21:51:14.505740 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0310 21:51:14.505737 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 21:51:14.505748 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 21:51:14.505777 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 21:51:14.505783 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 21:51:14.508219 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T21:51:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47a772db349df6c0c6fe27be93d19e02d66cfaf9739ee12e89730ece1da11473\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15b81f8be7635a10cf516cd17c9ab3d48b74d3421098124fcc47dbab6691e3c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15b81f8be7635a10cf516cd17c9ab3d48b74d3421098124fcc47dbab6691e3c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-10T21:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:50:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:09Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:09 crc kubenswrapper[4919]: I0310 21:52:09.413128 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:09Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:09 crc kubenswrapper[4919]: I0310 21:52:09.431939 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f95c272d21026474ba17da6abc519f0cc1874dbdded3e089a107b23cdd20fa6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T21:52:09Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:09 crc kubenswrapper[4919]: I0310 21:52:09.444985 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hzq7c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e92ce303-b70d-4416-b8f1-520b49dca2e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cc1a7ce601001a487303cfae1cef980407a59cc27d02d4f3c4a303b7668639f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-qw7c6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hzq7c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:09Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:09 crc kubenswrapper[4919]: I0310 21:52:09.473553 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4dp67" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2e7c6fb-9e33-441d-9197-719929eb9e21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad5abb4f4c3ead1b05c1f3efbb6bb59fc71bcf5796e16a558ef7e16375571fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a8f975d3caae031d32429c7fdf67356ff18bae1f0eefa3b99758d6c461a736\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4dd430eed1c68d77dcea63fe5885eec7aae3fee6a0008bd9e145b93aef04aa26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://648a881496440d2732bd49969e863fcde223e95156c280e7699782a5521f1580\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06e9ce48fc10caa34e8b774ce3039117386ffe276cf268d197ac46518a4baf91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9fa1c50eff4c6bb37ea60631a4278c6a15d6398d6fad6fb1a914bd6cccf15b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ba4289120330c49ae9af7da81d7a041beb11672e2f1bbe9693a69c0e060b437\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ba4289120330c49ae9af7da81d7a041beb11672e2f1bbe9693a69c0e060b437\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T21:52:08Z\\\",\\\"message\\\":\\\"310 21:52:08.649483 6773 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0310 21:52:08.649499 6773 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0310 21:52:08.649511 6773 handler.go:190] 
Sending *v1.Node event handler 2 for removal\\\\nI0310 21:52:08.649515 6773 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0310 21:52:08.649539 6773 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0310 21:52:08.649551 6773 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0310 21:52:08.649555 6773 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0310 21:52:08.649598 6773 factory.go:656] Stopping watch factory\\\\nI0310 21:52:08.649614 6773 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0310 21:52:08.649674 6773 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0310 21:52:08.649684 6773 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0310 21:52:08.649689 6773 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0310 21:52:08.649695 6773 handler.go:208] Removed *v1.Node event handler 2\\\\nI0310 21:52:08.649702 6773 handler.go:208] Removed *v1.Node event handler 7\\\\nI0310 21:52:08.649707 6773 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0310 21:52:08.649713 6773 handler.go:208] Removed *v1.Namespace 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"k
ube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81583105305b81fde3bde968fbbd5463ad1100d8f444b22f62764c4676b95325\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ca4b1de4425e9aa53f9cf24a56759708297b223771278a445dcf171f4dd6493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ca4b1de4425e9aa53f9cf24a56759708297b223771278a445dcf171f4dd6493\
\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4dp67\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:09Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:09 crc kubenswrapper[4919]: I0310 21:52:09.483685 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:52:09 crc kubenswrapper[4919]: I0310 21:52:09.483724 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:52:09 crc kubenswrapper[4919]: I0310 21:52:09.483736 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:52:09 crc kubenswrapper[4919]: I0310 21:52:09.483752 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:52:09 crc kubenswrapper[4919]: I0310 21:52:09.483765 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:52:09Z","lastTransitionTime":"2026-03-10T21:52:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 21:52:09 crc kubenswrapper[4919]: I0310 21:52:09.490967 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed0693f5-4dbc-4621-9cf6-450d64aaea59\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f338ebc5fc228d07415015c51f7ed4fcc24d5bf76a644e491b5c4b9dc51b71f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var
/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5205e89649c7c1e920c967873bb1a404548b539605a180b02a1c249ff499e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5205e89649c7c1e920c967873bb1a404548b539605a180b02a1c249ff499e32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:50:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:09Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:09 crc kubenswrapper[4919]: I0310 21:52:09.519963 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b625p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b82448f1-4387-4d1a-a300-29f4b3d86bbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5edef5b10597e404ec5599983d07529ee344e7d7f5b1a4a7f589678613034b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s9q9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:52:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b625p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:09Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:09 crc kubenswrapper[4919]: I0310 21:52:09.534190 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://937344f02f0259b0d258de35d490545dad0ce084dd49c7584002da0734cc046e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2
026-03-10T21:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:09Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:09 crc kubenswrapper[4919]: I0310 21:52:09.551033 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:09Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:09 crc kubenswrapper[4919]: I0310 21:52:09.571157 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:09Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:09 crc kubenswrapper[4919]: I0310 21:52:09.586677 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:52:09 crc kubenswrapper[4919]: I0310 21:52:09.586720 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:52:09 crc kubenswrapper[4919]: I0310 21:52:09.586730 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:52:09 crc kubenswrapper[4919]: I0310 21:52:09.586745 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:52:09 crc kubenswrapper[4919]: I0310 21:52:09.586754 4919 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:52:09Z","lastTransitionTime":"2026-03-10T21:52:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 21:52:09 crc kubenswrapper[4919]: I0310 21:52:09.591986 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-z6pc7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ace27ab-c4c7-412b-9ae8-a3e4ff15faec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://607c25e23101a157124cb81f984fac6d36e71a08b7d990e1d11627f3a7de24b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-ad
ditional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5636b870a2c5e13abd72ff1e7c785363d11540245c14af6f5de75a0b400a6aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5636b870a2c5e13abd72ff1e7c785363d11540245c14af6f5de75a0b400a6aa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7cd0e8267d82b65679954a7b935e138a06a6634311569a48268ba5917d6cb7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df31
2ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7cd0e8267d82b65679954a7b935e138a06a6634311569a48268ba5917d6cb7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:52:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4b4f2cd55ec546c6f44490159ec5b303d67b584fce7afb4b6fe3dd93e95d843\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4b4f2cd55ec546c6f44490159ec5b303d67b584fce7afb4b6fe3dd93e95d843\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:52:01Z\\\",\\\"reason\\\":\\\"Complet
ed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cebcf235b45b59a23c89f44855f381557a49fc94ec9332c699274507cda4cd12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cebcf235b45b59a23c89f44855f381557a49fc94ec9332c699274507cda4cd12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:52:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b01604ec0336bae6aa21d9b124596bb46f6351b67c413f2fade87febd0887a6\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b01604ec0336bae6aa21d9b124596bb46f6351b67c413f2fade87febd0887a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:52:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3c386dab63ce40e10fc9e5617d1e0c1e87eee2a1148cc64d5faa74542014155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3c386dab63ce40e10fc9e5617d1e0c1e87eee2a1148cc64d5faa74542014155\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:52:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-z6pc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:09Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:09 crc kubenswrapper[4919]: I0310 21:52:09.688711 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:52:09 crc kubenswrapper[4919]: I0310 21:52:09.688751 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:52:09 crc kubenswrapper[4919]: I0310 21:52:09.688762 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:52:09 crc kubenswrapper[4919]: I0310 21:52:09.688778 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:52:09 crc kubenswrapper[4919]: I0310 21:52:09.688790 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:52:09Z","lastTransitionTime":"2026-03-10T21:52:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 21:52:09 crc kubenswrapper[4919]: I0310 21:52:09.792061 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:52:09 crc kubenswrapper[4919]: I0310 21:52:09.792106 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:52:09 crc kubenswrapper[4919]: I0310 21:52:09.792117 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:52:09 crc kubenswrapper[4919]: I0310 21:52:09.792139 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:52:09 crc kubenswrapper[4919]: I0310 21:52:09.792151 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:52:09Z","lastTransitionTime":"2026-03-10T21:52:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:52:09 crc kubenswrapper[4919]: I0310 21:52:09.894889 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:52:09 crc kubenswrapper[4919]: I0310 21:52:09.894946 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:52:09 crc kubenswrapper[4919]: I0310 21:52:09.894965 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:52:09 crc kubenswrapper[4919]: I0310 21:52:09.894991 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:52:09 crc kubenswrapper[4919]: I0310 21:52:09.895009 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:52:09Z","lastTransitionTime":"2026-03-10T21:52:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:52:09 crc kubenswrapper[4919]: I0310 21:52:09.996955 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:52:09 crc kubenswrapper[4919]: I0310 21:52:09.996983 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:52:09 crc kubenswrapper[4919]: I0310 21:52:09.996992 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:52:09 crc kubenswrapper[4919]: I0310 21:52:09.997006 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:52:09 crc kubenswrapper[4919]: I0310 21:52:09.997015 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:52:09Z","lastTransitionTime":"2026-03-10T21:52:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:52:09 crc kubenswrapper[4919]: I0310 21:52:09.999157 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4dp67_a2e7c6fb-9e33-441d-9197-719929eb9e21/ovnkube-controller/0.log" Mar 10 21:52:10 crc kubenswrapper[4919]: I0310 21:52:10.004081 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4dp67" event={"ID":"a2e7c6fb-9e33-441d-9197-719929eb9e21","Type":"ContainerStarted","Data":"47d3324e3001c164e3ec82718c91df17d64d84bbd2a08159c3bcacc8518a18f9"} Mar 10 21:52:10 crc kubenswrapper[4919]: I0310 21:52:10.004553 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-4dp67" Mar 10 21:52:10 crc kubenswrapper[4919]: I0310 21:52:10.018524 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hzq7c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e92ce303-b70d-4416-b8f1-520b49dca2e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cc1a7ce601001a487303cfae1cef980407a
59cc27d02d4f3c4a303b7668639f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qw7c6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hzq7c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:10Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:10 crc kubenswrapper[4919]: I0310 21:52:10.041276 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4dp67" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2e7c6fb-9e33-441d-9197-719929eb9e21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad5abb4f4c3ead1b05c1f3efbb6bb59fc71bcf5796e16a558ef7e16375571fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a8f975d3caae031d32429c7fdf67356ff18bae1f0eefa3b99758d6c461a736\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4dd430eed1c68d77dcea63fe5885eec7aae3fee6a0008bd9e145b93aef04aa26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://648a881496440d2732bd49969e863fcde223e95156c280e7699782a5521f1580\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06e9ce48fc10caa34e8b774ce3039117386ffe276cf268d197ac46518a4baf91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9fa1c50eff4c6bb37ea60631a4278c6a15d6398d6fad6fb1a914bd6cccf15b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47d3324e3001c164e3ec82718c91df17d64d84bbd2a08159c3bcacc8518a18f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ba4289120330c49ae9af7da81d7a041beb11672e2f1bbe9693a69c0e060b437\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T21:52:08Z\\\",\\\"message\\\":\\\"310 21:52:08.649483 6773 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0310 21:52:08.649499 6773 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0310 21:52:08.649511 6773 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0310 21:52:08.649515 6773 handler.go:190] Sending *v1.Node event 
handler 7 for removal\\\\nI0310 21:52:08.649539 6773 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0310 21:52:08.649551 6773 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0310 21:52:08.649555 6773 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0310 21:52:08.649598 6773 factory.go:656] Stopping watch factory\\\\nI0310 21:52:08.649614 6773 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0310 21:52:08.649674 6773 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0310 21:52:08.649684 6773 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0310 21:52:08.649689 6773 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0310 21:52:08.649695 6773 handler.go:208] Removed *v1.Node event handler 2\\\\nI0310 21:52:08.649702 6773 handler.go:208] Removed *v1.Node event handler 7\\\\nI0310 21:52:08.649707 6773 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0310 21:52:08.649713 6773 handler.go:208] Removed *v1.Namespace 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:05Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81583105305b81fde3bde968fbbd5463ad1100d8f444b22f62764c4676b95325\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ca4b1de4425e9aa53f9cf24a56759708297b223771278a445dcf171f4dd6493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ca4b1de4425e9aa53f9cf24a56759708297b223771278a445dcf171f4dd6493\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4dp67\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:10Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:10 crc kubenswrapper[4919]: I0310 21:52:10.056586 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed0693f5-4dbc-4621-9cf6-450d64aaea59\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f338ebc5fc228d07415015c51f7ed4fcc24d5bf76a644e491b5c4b9dc51b71f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5205e89649c7c1e920c967873bb1a404548b539605a180b02a1c249ff499e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5205e89649c7c1e920c967873bb1a404548b539605a180b02a1c249ff499e32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:50:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:10Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:10 crc kubenswrapper[4919]: I0310 21:52:10.084885 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5163ae00-7b50-497d-9770-0d787026b436\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://322af039c736bcf0b853ee5527ebb1b1750484dfab074745abcd75c24fdcccbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bfa2696bd9b6e5d247686e5297b6ae2f49e5b216174391f211cb2a3a4966135\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c23aaef6f076ab2a428323d38fac48e0c55ad52c55a46c942bccad06474fd31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a88714c27822fd18dff500c973b9d548414d59c7666de938e3cb0c6b18e277e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67063544d268ca488af7ae401113e6f35bb48688e50f944cfa03360de376611a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc6d32c621a9826f8539d4c1806a83f694eaa8874384bd615c2fa1cee02d2e98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc6d32c621a9826f8539d4c1806a83f694eaa8874384bd615c2fa1cee02d2e98\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T21:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8aa3a4aef61b7421411f28beecceab476e43c52eb51ac2a257399b01e4c2bcb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8aa3a4aef61b7421411f28beecceab476e43c52eb51ac2a257399b01e4c2bcb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2126875e4bacb36f72de8a3a8bd7c799b89e7691a83fd39d7645a97e95db91b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2126875e4bacb36f72de8a3a8bd7c799b89e7691a83fd39d7645a97e95db91b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:50:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:10Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:10 crc kubenswrapper[4919]: I0310 21:52:10.099556 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:52:10 crc kubenswrapper[4919]: I0310 21:52:10.099603 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:52:10 crc kubenswrapper[4919]: I0310 21:52:10.099620 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:52:10 crc kubenswrapper[4919]: I0310 21:52:10.099639 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:52:10 crc kubenswrapper[4919]: I0310 21:52:10.099654 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:52:10Z","lastTransitionTime":"2026-03-10T21:52:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:52:10 crc kubenswrapper[4919]: I0310 21:52:10.106910 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9ed1501-15da-4419-aa12-171e610438d6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9e6a8efa1e2d16b45fe6362b326e3f89333864dc74f3b298d2e500a90d303b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37d8507fd02b92972ed41aa2c4d53fceb1c9d58864e46ddc7991f94fb4d9b3e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://dfb03c5f450790952fc7173bc2a6d723c777921f5f74963bfdbc3573ec1d21cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce192a4f3e94d00998fbfe0948a32765574a9261d22004480dfb54b9bbf9407a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b2adcae0bf01d646b97b915a28921ad4151ca62e8bdd174b42b5e3dff4b27db\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T21:51:14Z\\\",\\\"message\\\":\\\"file observer\\\\nW0310 21:51:14.180982 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 21:51:14.181120 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 21:51:14.182146 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-915015507/tls.crt::/tmp/serving-cert-915015507/tls.key\\\\\\\"\\\\nI0310 21:51:14.490188 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 21:51:14.496972 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 21:51:14.497009 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 21:51:14.497047 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 21:51:14.497058 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 21:51:14.505682 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 21:51:14.505718 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 21:51:14.505729 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 21:51:14.505740 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0310 21:51:14.505737 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 21:51:14.505748 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 21:51:14.505777 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 21:51:14.505783 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 21:51:14.508219 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T21:51:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47a772db349df6c0c6fe27be93d19e02d66cfaf9739ee12e89730ece1da11473\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15b81f8be7635a10cf516cd17c9ab3d48b74d3421098124fcc47dbab6691e3c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15b81f8be7635a10cf516cd17c9ab3d48b74d3421098124fcc47dbab6691e3c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-10T21:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:50:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:10Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:10 crc kubenswrapper[4919]: I0310 21:52:10.122234 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:10Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:10 crc kubenswrapper[4919]: I0310 21:52:10.134479 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f95c272d21026474ba17da6abc519f0cc1874dbdded3e089a107b23cdd20fa6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T21:52:10Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:10 crc kubenswrapper[4919]: I0310 21:52:10.148108 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://937344f02f0259b0d258de35d490545dad0ce084dd49c7584002da0734cc046e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:10Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:10 crc kubenswrapper[4919]: I0310 21:52:10.158476 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b625p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b82448f1-4387-4d1a-a300-29f4b3d86bbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5edef5b10597e404ec5599983d07529ee344e7d7f5b1a4a7f589678613034b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"n
ame\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s9q9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:52:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b625p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:10Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:10 crc kubenswrapper[4919]: I0310 21:52:10.171615 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:10Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:10 crc kubenswrapper[4919]: I0310 21:52:10.183217 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:10Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:10 crc kubenswrapper[4919]: I0310 21:52:10.200543 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-z6pc7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ace27ab-c4c7-412b-9ae8-a3e4ff15faec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://607c25e23101a157124cb81f984fac6d36e71a08b7d990e1d11627f3a7de24b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5636b870a2c5e13abd72ff1e7c785363d11540245c14af6f5de75a0b400a6aa\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5636b870a2c5e13abd72ff1e7c785363d11540245c14af6f5de75a0b400a6aa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7cd0e8267d82b65679954a7b935e138a06a6634311569a48268ba5917d6cb7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7cd0e8267d82b65679954a7b935e138a06a6634311569a48268ba5917d6cb7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:52:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:00Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4b4f2cd55ec546c6f44490159ec5b303d67b584fce7afb4b6fe3dd93e95d843\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4b4f2cd55ec546c6f44490159ec5b303d67b584fce7afb4b6fe3dd93e95d843\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:52:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cebcf
235b45b59a23c89f44855f381557a49fc94ec9332c699274507cda4cd12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cebcf235b45b59a23c89f44855f381557a49fc94ec9332c699274507cda4cd12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:52:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b01604ec0336bae6aa21d9b124596bb46f6351b67c413f2fade87febd0887a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b01604ec0336bae6aa21d9b124596bb46f6351b67c413f2fade87febd0887a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:52:03Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3c386dab63ce40e10fc9e5617d1e0c1e87eee2a1148cc64d5faa74542014155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3c386dab63ce40e10fc9e5617d1e0c1e87eee2a1148cc64d5faa74542014155\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:52:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-z6pc7\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:10Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:10 crc kubenswrapper[4919]: I0310 21:52:10.202185 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:52:10 crc kubenswrapper[4919]: I0310 21:52:10.202242 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:52:10 crc kubenswrapper[4919]: I0310 21:52:10.202265 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:52:10 crc kubenswrapper[4919]: I0310 21:52:10.202290 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:52:10 crc kubenswrapper[4919]: I0310 21:52:10.202309 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:52:10Z","lastTransitionTime":"2026-03-10T21:52:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:52:10 crc kubenswrapper[4919]: I0310 21:52:10.219066 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac8c9a9627b63f7b2a9c80571ca8f781eec442b1fd148631fa417b2e11943437\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18a390bbc216535df32a4dab5fb983494134d2e9f87a689ea39d3e32592ec663\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:10Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:10 crc kubenswrapper[4919]: I0310 21:52:10.233304 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"566678d1-f416-4116-ab20-b30dceb86cdc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://603ee76064368a216672f45eb860628d301968c311e0bc75b9a73c01f351c9c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrhvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b645dc541f9bef5d9710345252c2ff48e91412f
10d1c0c1bfaa06cf9e82210f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrhvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-z7v4t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:10Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:10 crc kubenswrapper[4919]: I0310 21:52:10.248866 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hbw8v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a5db7c3-2a96-4030-8c88-5d82d325b62d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abe6d0aa7236ecb1ecf10432a82e6fd0b3103606dbf07a21f54a1908c77ef697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dwtj4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hbw8v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:10Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:10 crc kubenswrapper[4919]: I0310 21:52:10.304108 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:52:10 crc 
kubenswrapper[4919]: I0310 21:52:10.304156 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:52:10 crc kubenswrapper[4919]: I0310 21:52:10.304170 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:52:10 crc kubenswrapper[4919]: I0310 21:52:10.304265 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:52:10 crc kubenswrapper[4919]: I0310 21:52:10.304279 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:52:10Z","lastTransitionTime":"2026-03-10T21:52:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 21:52:10 crc kubenswrapper[4919]: I0310 21:52:10.402481 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 21:52:10 crc kubenswrapper[4919]: I0310 21:52:10.402657 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 21:52:10 crc kubenswrapper[4919]: E0310 21:52:10.402699 4919 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 21:52:42.40267288 +0000 UTC m=+149.644553488 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 21:52:10 crc kubenswrapper[4919]: I0310 21:52:10.402739 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 21:52:10 crc kubenswrapper[4919]: I0310 21:52:10.402783 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 21:52:10 crc kubenswrapper[4919]: I0310 21:52:10.402836 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " 
pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 21:52:10 crc kubenswrapper[4919]: E0310 21:52:10.402858 4919 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 21:52:10 crc kubenswrapper[4919]: E0310 21:52:10.402881 4919 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 21:52:10 crc kubenswrapper[4919]: E0310 21:52:10.402911 4919 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 21:52:10 crc kubenswrapper[4919]: E0310 21:52:10.402931 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 21:52:42.402906956 +0000 UTC m=+149.644787594 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 21:52:10 crc kubenswrapper[4919]: E0310 21:52:10.402932 4919 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 21:52:10 crc kubenswrapper[4919]: E0310 21:52:10.402957 4919 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 21:52:10 crc kubenswrapper[4919]: E0310 21:52:10.402997 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-10 21:52:42.402976748 +0000 UTC m=+149.644857386 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 21:52:10 crc kubenswrapper[4919]: E0310 21:52:10.402963 4919 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 21:52:10 crc kubenswrapper[4919]: E0310 21:52:10.403026 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 21:52:42.403013549 +0000 UTC m=+149.644894197 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 21:52:10 crc kubenswrapper[4919]: E0310 21:52:10.403030 4919 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 21:52:10 crc kubenswrapper[4919]: E0310 21:52:10.403049 4919 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 21:52:10 crc kubenswrapper[4919]: E0310 21:52:10.403083 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-10 21:52:42.40307111 +0000 UTC m=+149.644951738 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 21:52:10 crc kubenswrapper[4919]: I0310 21:52:10.407630 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:52:10 crc kubenswrapper[4919]: I0310 21:52:10.407671 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:52:10 crc kubenswrapper[4919]: I0310 21:52:10.407691 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:52:10 crc kubenswrapper[4919]: I0310 21:52:10.407714 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:52:10 crc kubenswrapper[4919]: I0310 21:52:10.407732 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:52:10Z","lastTransitionTime":"2026-03-10T21:52:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 21:52:10 crc kubenswrapper[4919]: I0310 21:52:10.479374 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 21:52:10 crc kubenswrapper[4919]: I0310 21:52:10.479461 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 21:52:10 crc kubenswrapper[4919]: E0310 21:52:10.479493 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 21:52:10 crc kubenswrapper[4919]: I0310 21:52:10.479551 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 21:52:10 crc kubenswrapper[4919]: E0310 21:52:10.479596 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 21:52:10 crc kubenswrapper[4919]: E0310 21:52:10.479674 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 21:52:10 crc kubenswrapper[4919]: I0310 21:52:10.510523 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:52:10 crc kubenswrapper[4919]: I0310 21:52:10.510582 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:52:10 crc kubenswrapper[4919]: I0310 21:52:10.510604 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:52:10 crc kubenswrapper[4919]: I0310 21:52:10.510631 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:52:10 crc kubenswrapper[4919]: I0310 21:52:10.510655 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:52:10Z","lastTransitionTime":"2026-03-10T21:52:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:52:10 crc kubenswrapper[4919]: I0310 21:52:10.613691 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:52:10 crc kubenswrapper[4919]: I0310 21:52:10.613732 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:52:10 crc kubenswrapper[4919]: I0310 21:52:10.613743 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:52:10 crc kubenswrapper[4919]: I0310 21:52:10.613761 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:52:10 crc kubenswrapper[4919]: I0310 21:52:10.613773 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:52:10Z","lastTransitionTime":"2026-03-10T21:52:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:52:10 crc kubenswrapper[4919]: I0310 21:52:10.716160 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:52:10 crc kubenswrapper[4919]: I0310 21:52:10.716188 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:52:10 crc kubenswrapper[4919]: I0310 21:52:10.716197 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:52:10 crc kubenswrapper[4919]: I0310 21:52:10.716209 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:52:10 crc kubenswrapper[4919]: I0310 21:52:10.716220 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:52:10Z","lastTransitionTime":"2026-03-10T21:52:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:52:10 crc kubenswrapper[4919]: I0310 21:52:10.818880 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:52:10 crc kubenswrapper[4919]: I0310 21:52:10.818947 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:52:10 crc kubenswrapper[4919]: I0310 21:52:10.818968 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:52:10 crc kubenswrapper[4919]: I0310 21:52:10.818992 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:52:10 crc kubenswrapper[4919]: I0310 21:52:10.819011 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:52:10Z","lastTransitionTime":"2026-03-10T21:52:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:52:10 crc kubenswrapper[4919]: I0310 21:52:10.922270 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:52:10 crc kubenswrapper[4919]: I0310 21:52:10.922325 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:52:10 crc kubenswrapper[4919]: I0310 21:52:10.922347 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:52:10 crc kubenswrapper[4919]: I0310 21:52:10.922374 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:52:10 crc kubenswrapper[4919]: I0310 21:52:10.922440 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:52:10Z","lastTransitionTime":"2026-03-10T21:52:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:52:11 crc kubenswrapper[4919]: I0310 21:52:11.011073 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4dp67_a2e7c6fb-9e33-441d-9197-719929eb9e21/ovnkube-controller/1.log" Mar 10 21:52:11 crc kubenswrapper[4919]: I0310 21:52:11.012365 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4dp67_a2e7c6fb-9e33-441d-9197-719929eb9e21/ovnkube-controller/0.log" Mar 10 21:52:11 crc kubenswrapper[4919]: I0310 21:52:11.016768 4919 generic.go:334] "Generic (PLEG): container finished" podID="a2e7c6fb-9e33-441d-9197-719929eb9e21" containerID="47d3324e3001c164e3ec82718c91df17d64d84bbd2a08159c3bcacc8518a18f9" exitCode=1 Mar 10 21:52:11 crc kubenswrapper[4919]: I0310 21:52:11.016822 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4dp67" event={"ID":"a2e7c6fb-9e33-441d-9197-719929eb9e21","Type":"ContainerDied","Data":"47d3324e3001c164e3ec82718c91df17d64d84bbd2a08159c3bcacc8518a18f9"} Mar 10 21:52:11 crc kubenswrapper[4919]: I0310 21:52:11.016882 4919 scope.go:117] "RemoveContainer" containerID="6ba4289120330c49ae9af7da81d7a041beb11672e2f1bbe9693a69c0e060b437" Mar 10 21:52:11 crc kubenswrapper[4919]: I0310 21:52:11.018767 4919 scope.go:117] "RemoveContainer" containerID="47d3324e3001c164e3ec82718c91df17d64d84bbd2a08159c3bcacc8518a18f9" Mar 10 21:52:11 crc kubenswrapper[4919]: E0310 21:52:11.019031 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-4dp67_openshift-ovn-kubernetes(a2e7c6fb-9e33-441d-9197-719929eb9e21)\"" pod="openshift-ovn-kubernetes/ovnkube-node-4dp67" podUID="a2e7c6fb-9e33-441d-9197-719929eb9e21" Mar 10 21:52:11 crc kubenswrapper[4919]: I0310 21:52:11.028094 4919 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:52:11 crc kubenswrapper[4919]: I0310 21:52:11.028155 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:52:11 crc kubenswrapper[4919]: I0310 21:52:11.028178 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:52:11 crc kubenswrapper[4919]: I0310 21:52:11.028208 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:52:11 crc kubenswrapper[4919]: I0310 21:52:11.028228 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:52:11Z","lastTransitionTime":"2026-03-10T21:52:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 21:52:11 crc kubenswrapper[4919]: I0310 21:52:11.034892 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zv56q"] Mar 10 21:52:11 crc kubenswrapper[4919]: I0310 21:52:11.035380 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zv56q" Mar 10 21:52:11 crc kubenswrapper[4919]: I0310 21:52:11.038689 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 10 21:52:11 crc kubenswrapper[4919]: I0310 21:52:11.038793 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 10 21:52:11 crc kubenswrapper[4919]: I0310 21:52:11.047934 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac8c9a9627b63f7b2a9c80571ca8f781eec442b1fd148631fa417b2e11943437\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mo
untPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18a390bbc216535df32a4dab5fb983494134d2e9f87a689ea39d3e32592ec663\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:11Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:11 crc kubenswrapper[4919]: I0310 21:52:11.068888 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"566678d1-f416-4116-ab20-b30dceb86cdc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://603ee76064368a216672f45eb860628d301968c311e0bc75b9a73c01f351c9c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrhvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b645dc541f9bef5d9710345252c2ff48
e91412f10d1c0c1bfaa06cf9e82210f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrhvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-z7v4t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:11Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:11 crc kubenswrapper[4919]: I0310 21:52:11.084759 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hbw8v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a5db7c3-2a96-4030-8c88-5d82d325b62d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abe6d0aa7236ecb1ecf10432a82e6fd0b3103606dbf07a21f54a1908c77ef697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dwtj4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hbw8v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:11Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:11 crc kubenswrapper[4919]: I0310 21:52:11.096631 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hzq7c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e92ce303-b70d-4416-b8f1-520b49dca2e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cc1a7ce601001a487303cfae1cef980407a59cc27d02d4f3c4a303b7668639f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qw7c6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hzq7c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:11Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:11 crc kubenswrapper[4919]: I0310 21:52:11.111057 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8c57707d-1414-4a4a-ac8a-0fadb2fbe5f7-env-overrides\") pod \"ovnkube-control-plane-749d76644c-zv56q\" (UID: \"8c57707d-1414-4a4a-ac8a-0fadb2fbe5f7\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zv56q" Mar 10 21:52:11 crc kubenswrapper[4919]: I0310 21:52:11.111189 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8c57707d-1414-4a4a-ac8a-0fadb2fbe5f7-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-zv56q\" (UID: \"8c57707d-1414-4a4a-ac8a-0fadb2fbe5f7\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zv56q" Mar 10 21:52:11 crc kubenswrapper[4919]: I0310 21:52:11.111338 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8c57707d-1414-4a4a-ac8a-0fadb2fbe5f7-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-zv56q\" (UID: \"8c57707d-1414-4a4a-ac8a-0fadb2fbe5f7\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zv56q" Mar 10 21:52:11 crc kubenswrapper[4919]: I0310 21:52:11.111424 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-bm5jv\" (UniqueName: \"kubernetes.io/projected/8c57707d-1414-4a4a-ac8a-0fadb2fbe5f7-kube-api-access-bm5jv\") pod \"ovnkube-control-plane-749d76644c-zv56q\" (UID: \"8c57707d-1414-4a4a-ac8a-0fadb2fbe5f7\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zv56q" Mar 10 21:52:11 crc kubenswrapper[4919]: I0310 21:52:11.123839 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4dp67" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2e7c6fb-9e33-441d-9197-719929eb9e21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad5abb4f4c3ead1b05c1f3efbb6bb59fc71bcf5796e16a558ef7e16375571fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a8f975d3caae031d32429c7fdf67356ff18bae1f0eefa3b99758d6c461a736\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4dd430eed1c68d77dcea63fe5885eec7aae3fee6a0008bd9e145b93aef04aa26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://648a881496440d2732bd49969e863fcde223e95156c280e7699782a5521f1580\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06e9ce48fc10caa34e8b774ce3039117386ffe276cf268d197ac46518a4baf91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9fa1c50eff4c6bb37ea60631a4278c6a15d6398d6fad6fb1a914bd6cccf15b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47d3324e3001c164e3ec82718c91df17d64d84bbd2a08159c3bcacc8518a18f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ba4289120330c49ae9af7da81d7a041beb11672e2f1bbe9693a69c0e060b437\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T21:52:08Z\\\",\\\"message\\\":\\\"310 21:52:08.649483 6773 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0310 21:52:08.649499 6773 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0310 21:52:08.649511 6773 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0310 21:52:08.649515 6773 handler.go:190] Sending *v1.Node event 
handler 7 for removal\\\\nI0310 21:52:08.649539 6773 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0310 21:52:08.649551 6773 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0310 21:52:08.649555 6773 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0310 21:52:08.649598 6773 factory.go:656] Stopping watch factory\\\\nI0310 21:52:08.649614 6773 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0310 21:52:08.649674 6773 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0310 21:52:08.649684 6773 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0310 21:52:08.649689 6773 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0310 21:52:08.649695 6773 handler.go:208] Removed *v1.Node event handler 2\\\\nI0310 21:52:08.649702 6773 handler.go:208] Removed *v1.Node event handler 7\\\\nI0310 21:52:08.649707 6773 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0310 21:52:08.649713 6773 handler.go:208] Removed *v1.Namespace ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:05Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://47d3324e3001c164e3ec82718c91df17d64d84bbd2a08159c3bcacc8518a18f9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T21:52:10Z\\\",\\\"message\\\":\\\"pointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0310 21:52:10.085044 6945 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0310 21:52:10.085924 6945 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0310 21:52:10.085952 6945 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0310 21:52:10.085958 6945 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0310 21:52:10.085978 6945 handler.go:208] Removed *v1.Node event 
handler 7\\\\nI0310 21:52:10.085991 6945 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0310 21:52:10.085999 6945 handler.go:208] Removed *v1.Node event handler 2\\\\nI0310 21:52:10.086373 6945 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0310 21:52:10.086607 6945 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0310 21:52:10.087427 6945 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0310 21:52:10.087806 6945 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0310 21:52:10.087867 6945 factory.go:656] Stopping watch factory\\\\nI0310 21:52:10.087885 6945 handler.go:208] Removed *v1.EgressFirewall ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/l
ib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81583105305b81fde3bde968fbbd5463ad1100d8f444b22f62764c4676b95325\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"
readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ca4b1de4425e9aa53f9cf24a56759708297b223771278a445dcf171f4dd6493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ca4b1de4425e9aa53f9cf24a56759708297b223771278a445dcf171f4dd6493\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4dp67\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:11Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:11 crc kubenswrapper[4919]: I0310 21:52:11.139268 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:52:11 
crc kubenswrapper[4919]: I0310 21:52:11.139326 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:52:11 crc kubenswrapper[4919]: I0310 21:52:11.139344 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:52:11 crc kubenswrapper[4919]: I0310 21:52:11.139369 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:52:11 crc kubenswrapper[4919]: I0310 21:52:11.139423 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:52:11Z","lastTransitionTime":"2026-03-10T21:52:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 21:52:11 crc kubenswrapper[4919]: I0310 21:52:11.143875 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed0693f5-4dbc-4621-9cf6-450d64aaea59\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f338ebc5fc228d07415015c51f7ed4fcc24d5bf76a644e491b5c4b9dc51b71f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5205e89649c7c1e920c967873bb1a404548b539605a180b02a1c249ff499e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5205e89649c7c1e920c967873bb1a404548b539605a180b02a1c249ff499e32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:50:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:11Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:11 crc kubenswrapper[4919]: I0310 21:52:11.178479 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5163ae00-7b50-497d-9770-0d787026b436\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://322af039c736bcf0b853ee5527ebb1b1750484dfab074745abcd75c24fdcccbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bfa2696bd9b6e5d247686e5297b6ae2f49e5b216174391f211cb2a3a4966135\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c23aaef6f076ab2a428323d38fac48e0c55ad52c55a46c942bccad06474fd31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a88714c27822fd18dff500c973b9d548414d59c7666de938e3cb0c6b18e277e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67063544d268ca488af7ae401113e6f35bb48688e50f944cfa03360de376611a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc6d32c621a9826f8539d4c1806a83f694eaa8874384bd615c2fa1cee02d2e98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc6d32c621a9826f8539d4c1806a83f694eaa8874384bd615c2fa1cee02d2e98\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T21:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8aa3a4aef61b7421411f28beecceab476e43c52eb51ac2a257399b01e4c2bcb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8aa3a4aef61b7421411f28beecceab476e43c52eb51ac2a257399b01e4c2bcb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2126875e4bacb36f72de8a3a8bd7c799b89e7691a83fd39d7645a97e95db91b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2126875e4bacb36f72de8a3a8bd7c799b89e7691a83fd39d7645a97e95db91b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:50:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:11Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:11 crc kubenswrapper[4919]: I0310 21:52:11.203119 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9ed1501-15da-4419-aa12-171e610438d6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9e6a8efa1e2d16b45fe6362b326e3f89333864dc74f3b298d2e500a90d303b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37d8507fd02b92972ed41aa2c4d53fceb1c9d58864e46ddc7991f94fb4d9b3e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://dfb03c5f450790952fc7173bc2a6d723c777921f5f74963bfdbc3573ec1d21cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce192a4f3e94d00998fbfe0948a32765574a9261d22004480dfb54b9bbf9407a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b2adcae0bf01d646b97b915a28921ad4151ca62e8bdd174b42b5e3dff4b27db\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T21:51:14Z\\\",\\\"message\\\":\\\"file observer\\\\nW0310 21:51:14.180982 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 21:51:14.181120 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 21:51:14.182146 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-915015507/tls.crt::/tmp/serving-cert-915015507/tls.key\\\\\\\"\\\\nI0310 21:51:14.490188 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 21:51:14.496972 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 21:51:14.497009 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 21:51:14.497047 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 21:51:14.497058 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 21:51:14.505682 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 21:51:14.505718 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 21:51:14.505729 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 21:51:14.505740 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0310 21:51:14.505737 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 21:51:14.505748 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 21:51:14.505777 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 21:51:14.505783 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 21:51:14.508219 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T21:51:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47a772db349df6c0c6fe27be93d19e02d66cfaf9739ee12e89730ece1da11473\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15b81f8be7635a10cf516cd17c9ab3d48b74d3421098124fcc47dbab6691e3c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15b81f8be7635a10cf516cd17c9ab3d48b74d3421098124fcc47dbab6691e3c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-10T21:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:50:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:11Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:11 crc kubenswrapper[4919]: I0310 21:52:11.212451 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8c57707d-1414-4a4a-ac8a-0fadb2fbe5f7-env-overrides\") pod \"ovnkube-control-plane-749d76644c-zv56q\" (UID: \"8c57707d-1414-4a4a-ac8a-0fadb2fbe5f7\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zv56q" Mar 10 21:52:11 crc kubenswrapper[4919]: I0310 21:52:11.212499 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8c57707d-1414-4a4a-ac8a-0fadb2fbe5f7-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-zv56q\" (UID: \"8c57707d-1414-4a4a-ac8a-0fadb2fbe5f7\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zv56q" Mar 10 21:52:11 crc kubenswrapper[4919]: I0310 21:52:11.212583 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8c57707d-1414-4a4a-ac8a-0fadb2fbe5f7-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-zv56q\" (UID: \"8c57707d-1414-4a4a-ac8a-0fadb2fbe5f7\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zv56q" 
Mar 10 21:52:11 crc kubenswrapper[4919]: I0310 21:52:11.212605 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bm5jv\" (UniqueName: \"kubernetes.io/projected/8c57707d-1414-4a4a-ac8a-0fadb2fbe5f7-kube-api-access-bm5jv\") pod \"ovnkube-control-plane-749d76644c-zv56q\" (UID: \"8c57707d-1414-4a4a-ac8a-0fadb2fbe5f7\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zv56q" Mar 10 21:52:11 crc kubenswrapper[4919]: I0310 21:52:11.213577 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8c57707d-1414-4a4a-ac8a-0fadb2fbe5f7-env-overrides\") pod \"ovnkube-control-plane-749d76644c-zv56q\" (UID: \"8c57707d-1414-4a4a-ac8a-0fadb2fbe5f7\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zv56q" Mar 10 21:52:11 crc kubenswrapper[4919]: I0310 21:52:11.214109 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8c57707d-1414-4a4a-ac8a-0fadb2fbe5f7-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-zv56q\" (UID: \"8c57707d-1414-4a4a-ac8a-0fadb2fbe5f7\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zv56q" Mar 10 21:52:11 crc kubenswrapper[4919]: I0310 21:52:11.217811 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with 
unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:11Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:11 crc kubenswrapper[4919]: I0310 21:52:11.223257 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8c57707d-1414-4a4a-ac8a-0fadb2fbe5f7-ovn-control-plane-metrics-cert\") pod 
\"ovnkube-control-plane-749d76644c-zv56q\" (UID: \"8c57707d-1414-4a4a-ac8a-0fadb2fbe5f7\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zv56q" Mar 10 21:52:11 crc kubenswrapper[4919]: I0310 21:52:11.234353 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f95c272d21026474ba17da6abc519f0cc1874dbdded3e089a107b23cdd20fa6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:11Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:11 crc kubenswrapper[4919]: I0310 21:52:11.237227 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bm5jv\" (UniqueName: \"kubernetes.io/projected/8c57707d-1414-4a4a-ac8a-0fadb2fbe5f7-kube-api-access-bm5jv\") pod \"ovnkube-control-plane-749d76644c-zv56q\" (UID: \"8c57707d-1414-4a4a-ac8a-0fadb2fbe5f7\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zv56q" Mar 10 21:52:11 crc kubenswrapper[4919]: I0310 21:52:11.242567 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:52:11 crc kubenswrapper[4919]: I0310 21:52:11.242599 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:52:11 crc kubenswrapper[4919]: I0310 21:52:11.242612 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:52:11 crc kubenswrapper[4919]: I0310 21:52:11.242629 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:52:11 crc kubenswrapper[4919]: I0310 21:52:11.242641 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:52:11Z","lastTransitionTime":"2026-03-10T21:52:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration 
file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 21:52:11 crc kubenswrapper[4919]: I0310 21:52:11.254179 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://937344f02f0259b0d258de35d490545dad0ce084dd49c7584002da0734cc046e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\
"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:11Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:11 crc kubenswrapper[4919]: I0310 21:52:11.266285 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b625p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b82448f1-4387-4d1a-a300-29f4b3d86bbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5edef5b10597e404ec5599983d07529ee344e7d7f5b1a4a7f589678613034b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\
\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s9q9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:52:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b625p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:11Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:11 crc kubenswrapper[4919]: I0310 21:52:11.280714 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:11Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:11 crc kubenswrapper[4919]: I0310 21:52:11.298748 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:11Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:11 crc kubenswrapper[4919]: I0310 21:52:11.316632 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-z6pc7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ace27ab-c4c7-412b-9ae8-a3e4ff15faec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://607c25e23101a157124cb81f984fac6d36e71a08b7d990e1d11627f3a7de24b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5636b870a2c5e13abd72ff1e7c785363d11540245c14af6f5de75a0b400a6aa\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5636b870a2c5e13abd72ff1e7c785363d11540245c14af6f5de75a0b400a6aa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7cd0e8267d82b65679954a7b935e138a06a6634311569a48268ba5917d6cb7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7cd0e8267d82b65679954a7b935e138a06a6634311569a48268ba5917d6cb7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:52:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:00Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4b4f2cd55ec546c6f44490159ec5b303d67b584fce7afb4b6fe3dd93e95d843\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4b4f2cd55ec546c6f44490159ec5b303d67b584fce7afb4b6fe3dd93e95d843\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:52:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cebcf
235b45b59a23c89f44855f381557a49fc94ec9332c699274507cda4cd12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cebcf235b45b59a23c89f44855f381557a49fc94ec9332c699274507cda4cd12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:52:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b01604ec0336bae6aa21d9b124596bb46f6351b67c413f2fade87febd0887a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b01604ec0336bae6aa21d9b124596bb46f6351b67c413f2fade87febd0887a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:52:03Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3c386dab63ce40e10fc9e5617d1e0c1e87eee2a1148cc64d5faa74542014155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3c386dab63ce40e10fc9e5617d1e0c1e87eee2a1148cc64d5faa74542014155\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:52:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-z6pc7\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:11Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:11 crc kubenswrapper[4919]: I0310 21:52:11.331348 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zv56q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c57707d-1414-4a4a-ac8a-0fadb2fbe5f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bm5jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bm5jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:52:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zv56q\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:11Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:11 crc kubenswrapper[4919]: I0310 21:52:11.344994 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:52:11 crc kubenswrapper[4919]: I0310 21:52:11.345041 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:52:11 crc kubenswrapper[4919]: I0310 21:52:11.345057 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:52:11 crc kubenswrapper[4919]: I0310 21:52:11.345079 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:52:11 crc kubenswrapper[4919]: I0310 21:52:11.345095 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:52:11Z","lastTransitionTime":"2026-03-10T21:52:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:52:11 crc kubenswrapper[4919]: I0310 21:52:11.346333 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:11Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:11 crc kubenswrapper[4919]: I0310 21:52:11.354868 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zv56q" Mar 10 21:52:11 crc kubenswrapper[4919]: I0310 21:52:11.358686 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:11Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:11 crc kubenswrapper[4919]: W0310 21:52:11.373570 4919 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8c57707d_1414_4a4a_ac8a_0fadb2fbe5f7.slice/crio-e9c1b652945e7e965a162f78725b6ac4b745a1679c406014f72d7dd57c8bf9a2 WatchSource:0}: Error finding container e9c1b652945e7e965a162f78725b6ac4b745a1679c406014f72d7dd57c8bf9a2: Status 404 returned error can't find the container with id e9c1b652945e7e965a162f78725b6ac4b745a1679c406014f72d7dd57c8bf9a2 Mar 10 21:52:11 crc kubenswrapper[4919]: I0310 21:52:11.380126 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-z6pc7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ace27ab-c4c7-412b-9ae8-a3e4ff15faec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://607c25e23101a157124cb81f984fac6d36e71a08b7d990e1d11627f3a7de24b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5636b870a2c5e13abd72ff1e7c785363d11540245c14af6f5de75a0b400a6aa\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5636b870a2c5e13abd72ff1e7c785363d11540245c14af6f5de75a0b400a6aa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7cd0e8267d82b65679954a7b935e138a06a6634311569a48268ba5917d6cb7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7cd0e8267d82b65679954a7b935e138a06a6634311569a48268ba5917d6cb7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:52:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:00Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4b4f2cd55ec546c6f44490159ec5b303d67b584fce7afb4b6fe3dd93e95d843\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4b4f2cd55ec546c6f44490159ec5b303d67b584fce7afb4b6fe3dd93e95d843\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:52:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cebcf
235b45b59a23c89f44855f381557a49fc94ec9332c699274507cda4cd12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cebcf235b45b59a23c89f44855f381557a49fc94ec9332c699274507cda4cd12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:52:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b01604ec0336bae6aa21d9b124596bb46f6351b67c413f2fade87febd0887a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b01604ec0336bae6aa21d9b124596bb46f6351b67c413f2fade87febd0887a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:52:03Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3c386dab63ce40e10fc9e5617d1e0c1e87eee2a1148cc64d5faa74542014155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3c386dab63ce40e10fc9e5617d1e0c1e87eee2a1148cc64d5faa74542014155\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:52:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-z6pc7\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:11Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:11 crc kubenswrapper[4919]: I0310 21:52:11.397751 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac8c9a9627b63f7b2a9c80571ca8f781eec442b1fd148631fa417b2e11943437\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/ku
bernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18a390bbc216535df32a4dab5fb983494134d2e9f87a689ea39d3e32592ec663\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:11Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:11 crc kubenswrapper[4919]: I0310 21:52:11.414239 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"566678d1-f416-4116-ab20-b30dceb86cdc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://603ee76064368a216672f45eb860628d301968c311e0bc75b9a73c01f351c9c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrhvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b645dc541f9bef5d9710345252c2ff48e91412f
10d1c0c1bfaa06cf9e82210f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrhvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-z7v4t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:11Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:11 crc kubenswrapper[4919]: I0310 21:52:11.434718 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hbw8v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a5db7c3-2a96-4030-8c88-5d82d325b62d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abe6d0aa7236ecb1ecf10432a82e6fd0b3103606dbf07a21f54a1908c77ef697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dwtj4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hbw8v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:11Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:11 crc kubenswrapper[4919]: I0310 21:52:11.446277 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f95c272d21026474ba17da6abc519f0cc1874dbdded3e089a107b23cdd20fa6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-03-10T21:52:11Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:11 crc kubenswrapper[4919]: I0310 21:52:11.447324 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:52:11 crc kubenswrapper[4919]: I0310 21:52:11.447356 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:52:11 crc kubenswrapper[4919]: I0310 21:52:11.447366 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:52:11 crc kubenswrapper[4919]: I0310 21:52:11.447380 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:52:11 crc kubenswrapper[4919]: I0310 21:52:11.447406 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:52:11Z","lastTransitionTime":"2026-03-10T21:52:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:52:11 crc kubenswrapper[4919]: I0310 21:52:11.456952 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hzq7c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e92ce303-b70d-4416-b8f1-520b49dca2e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cc1a7ce601001a487303cfae1cef980407a59cc27d02d4f3c4a303b7668639f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qw7c6\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hzq7c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:11Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:11 crc kubenswrapper[4919]: I0310 21:52:11.477776 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4dp67" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2e7c6fb-9e33-441d-9197-719929eb9e21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad5abb4f4c3ead1b05c1f3efbb6bb59fc71bcf5796e16a558ef7e16375571fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a8f975d3caae031d32429c7fdf67356ff18bae1f0eefa3b99758d6c461a736\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4dd430eed1c68d77dcea63fe5885eec7aae3fee6a0008bd9e145b93aef04aa26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://648a881496440d2732bd49969e863fcde223e95156c280e7699782a5521f1580\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06e9ce48fc10caa34e8b774ce3039117386ffe276cf268d197ac46518a4baf91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9fa1c50eff4c6bb37ea60631a4278c6a15d6398d6fad6fb1a914bd6cccf15b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47d3324e3001c164e3ec82718c91df17d64d84bbd2a08159c3bcacc8518a18f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ba4289120330c49ae9af7da81d7a041beb11672e2f1bbe9693a69c0e060b437\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T21:52:08Z\\\",\\\"message\\\":\\\"310 21:52:08.649483 6773 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0310 21:52:08.649499 6773 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0310 21:52:08.649511 6773 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0310 21:52:08.649515 6773 handler.go:190] Sending *v1.Node event 
handler 7 for removal\\\\nI0310 21:52:08.649539 6773 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0310 21:52:08.649551 6773 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0310 21:52:08.649555 6773 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0310 21:52:08.649598 6773 factory.go:656] Stopping watch factory\\\\nI0310 21:52:08.649614 6773 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0310 21:52:08.649674 6773 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0310 21:52:08.649684 6773 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0310 21:52:08.649689 6773 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0310 21:52:08.649695 6773 handler.go:208] Removed *v1.Node event handler 2\\\\nI0310 21:52:08.649702 6773 handler.go:208] Removed *v1.Node event handler 7\\\\nI0310 21:52:08.649707 6773 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0310 21:52:08.649713 6773 handler.go:208] Removed *v1.Namespace ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:05Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://47d3324e3001c164e3ec82718c91df17d64d84bbd2a08159c3bcacc8518a18f9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T21:52:10Z\\\",\\\"message\\\":\\\"pointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0310 21:52:10.085044 6945 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0310 21:52:10.085924 6945 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0310 21:52:10.085952 6945 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0310 21:52:10.085958 6945 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0310 21:52:10.085978 6945 handler.go:208] Removed *v1.Node event 
handler 7\\\\nI0310 21:52:10.085991 6945 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0310 21:52:10.085999 6945 handler.go:208] Removed *v1.Node event handler 2\\\\nI0310 21:52:10.086373 6945 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0310 21:52:10.086607 6945 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0310 21:52:10.087427 6945 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0310 21:52:10.087806 6945 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0310 21:52:10.087867 6945 factory.go:656] Stopping watch factory\\\\nI0310 21:52:10.087885 6945 handler.go:208] Removed *v1.EgressFirewall ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/l
ib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81583105305b81fde3bde968fbbd5463ad1100d8f444b22f62764c4676b95325\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"
readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ca4b1de4425e9aa53f9cf24a56759708297b223771278a445dcf171f4dd6493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ca4b1de4425e9aa53f9cf24a56759708297b223771278a445dcf171f4dd6493\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4dp67\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:11Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:11 crc kubenswrapper[4919]: I0310 21:52:11.488229 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed0693f5-4dbc-4621-9cf6-450d64aaea59\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f338ebc5fc228d07415015c51f7ed4fcc24d5bf76a644e491b5c4b9dc51b71f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5205e89649c7c1e920c967873bb1a404548b539605a180b02a1c249ff499e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c427
45f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5205e89649c7c1e920c967873bb1a404548b539605a180b02a1c249ff499e32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:50:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:11Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:11 crc kubenswrapper[4919]: I0310 21:52:11.509667 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5163ae00-7b50-497d-9770-0d787026b436\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://322af039c736bcf0b853ee5527ebb1b1750484dfab074745abcd75c24fdcccbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bfa2696bd9b6e5d247686e5297b6ae2f49e5b216174391f211cb2a3a4966135\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c23aaef6f076ab2a428323d38fac48e0c55ad52c55a46c942bccad06474fd31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a88714c27822fd18dff500c973b9d548414d59c7666de938e3cb0c6b18e277e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67063544d268ca488af7ae401113e6f35bb48688e50f944cfa03360de376611a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc6d32c621a9826f8539d4c1806a83f694eaa8874384bd615c2fa1cee02d2e98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc6d32c621a9826f8539d4c1806a83f694eaa8874384bd615c2fa1cee02d2e98\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T21:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8aa3a4aef61b7421411f28beecceab476e43c52eb51ac2a257399b01e4c2bcb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8aa3a4aef61b7421411f28beecceab476e43c52eb51ac2a257399b01e4c2bcb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2126875e4bacb36f72de8a3a8bd7c799b89e7691a83fd39d7645a97e95db91b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2126875e4bacb36f72de8a3a8bd7c799b89e7691a83fd39d7645a97e95db91b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:50:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:11Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:11 crc kubenswrapper[4919]: I0310 21:52:11.530681 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9ed1501-15da-4419-aa12-171e610438d6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9e6a8efa1e2d16b45fe6362b326e3f89333864dc74f3b298d2e500a90d303b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37d8507fd02b92972ed41aa2c4d53fceb1c9d58864e46ddc7991f94fb4d9b3e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://dfb03c5f450790952fc7173bc2a6d723c777921f5f74963bfdbc3573ec1d21cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce192a4f3e94d00998fbfe0948a32765574a9261d22004480dfb54b9bbf9407a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b2adcae0bf01d646b97b915a28921ad4151ca62e8bdd174b42b5e3dff4b27db\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T21:51:14Z\\\",\\\"message\\\":\\\"file observer\\\\nW0310 21:51:14.180982 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 21:51:14.181120 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 21:51:14.182146 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-915015507/tls.crt::/tmp/serving-cert-915015507/tls.key\\\\\\\"\\\\nI0310 21:51:14.490188 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 21:51:14.496972 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 21:51:14.497009 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 21:51:14.497047 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 21:51:14.497058 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 21:51:14.505682 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 21:51:14.505718 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 21:51:14.505729 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 21:51:14.505740 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0310 21:51:14.505737 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 21:51:14.505748 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 21:51:14.505777 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 21:51:14.505783 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 21:51:14.508219 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T21:51:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47a772db349df6c0c6fe27be93d19e02d66cfaf9739ee12e89730ece1da11473\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15b81f8be7635a10cf516cd17c9ab3d48b74d3421098124fcc47dbab6691e3c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15b81f8be7635a10cf516cd17c9ab3d48b74d3421098124fcc47dbab6691e3c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-10T21:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:50:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:11Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:11 crc kubenswrapper[4919]: I0310 21:52:11.542438 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:11Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:11 crc kubenswrapper[4919]: I0310 21:52:11.551440 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:52:11 crc kubenswrapper[4919]: I0310 21:52:11.551466 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:52:11 crc kubenswrapper[4919]: I0310 21:52:11.551479 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:52:11 crc 
kubenswrapper[4919]: I0310 21:52:11.551494 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:52:11 crc kubenswrapper[4919]: I0310 21:52:11.551506 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:52:11Z","lastTransitionTime":"2026-03-10T21:52:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 21:52:11 crc kubenswrapper[4919]: I0310 21:52:11.560469 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://937344f02f0259b0d258de35d490545dad0ce084dd49c7584002da0734cc046e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:11Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:11 crc kubenswrapper[4919]: I0310 21:52:11.572334 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b625p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b82448f1-4387-4d1a-a300-29f4b3d86bbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5edef5b10597e404ec5599983d07529ee344e7d7f5b1a4a7f589678613034b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s9q9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:52:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b625p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:11Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:11 crc kubenswrapper[4919]: I0310 21:52:11.655878 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:52:11 crc kubenswrapper[4919]: I0310 21:52:11.656048 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:52:11 crc kubenswrapper[4919]: I0310 21:52:11.656138 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:52:11 crc kubenswrapper[4919]: I0310 21:52:11.656226 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:52:11 crc kubenswrapper[4919]: I0310 21:52:11.656336 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:52:11Z","lastTransitionTime":"2026-03-10T21:52:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:52:11 crc kubenswrapper[4919]: I0310 21:52:11.759746 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:52:11 crc kubenswrapper[4919]: I0310 21:52:11.759792 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:52:11 crc kubenswrapper[4919]: I0310 21:52:11.759807 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:52:11 crc kubenswrapper[4919]: I0310 21:52:11.759826 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:52:11 crc kubenswrapper[4919]: I0310 21:52:11.759840 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:52:11Z","lastTransitionTime":"2026-03-10T21:52:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 21:52:11 crc kubenswrapper[4919]: I0310 21:52:11.762951 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-ckwhl"] Mar 10 21:52:11 crc kubenswrapper[4919]: I0310 21:52:11.763500 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ckwhl" Mar 10 21:52:11 crc kubenswrapper[4919]: E0310 21:52:11.763577 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ckwhl" podUID="a95e8b73-ffed-4248-b8ba-99fc7c5b900f" Mar 10 21:52:11 crc kubenswrapper[4919]: I0310 21:52:11.778710 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ckwhl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a95e8b73-ffed-4248-b8ba-99fc7c5b900f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnqnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnqnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:52:11Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ckwhl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:11Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:11 crc 
kubenswrapper[4919]: I0310 21:52:11.790227 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed0693f5-4dbc-4621-9cf6-450d64aaea59\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f338ebc5fc228d07415015c51f7ed4fcc24d5bf76a644e491b5c4b9dc51b71f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\
":\\\"cri-o://a5205e89649c7c1e920c967873bb1a404548b539605a180b02a1c249ff499e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5205e89649c7c1e920c967873bb1a404548b539605a180b02a1c249ff499e32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:50:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:11Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:11 crc kubenswrapper[4919]: I0310 21:52:11.813135 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:52:11 crc kubenswrapper[4919]: I0310 21:52:11.813187 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:52:11 crc kubenswrapper[4919]: I0310 21:52:11.813204 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:52:11 crc kubenswrapper[4919]: I0310 
21:52:11.813227 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:52:11 crc kubenswrapper[4919]: I0310 21:52:11.813246 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:52:11Z","lastTransitionTime":"2026-03-10T21:52:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 21:52:11 crc kubenswrapper[4919]: I0310 21:52:11.819236 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a95e8b73-ffed-4248-b8ba-99fc7c5b900f-metrics-certs\") pod \"network-metrics-daemon-ckwhl\" (UID: \"a95e8b73-ffed-4248-b8ba-99fc7c5b900f\") " pod="openshift-multus/network-metrics-daemon-ckwhl" Mar 10 21:52:11 crc kubenswrapper[4919]: I0310 21:52:11.819319 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nnqnh\" (UniqueName: \"kubernetes.io/projected/a95e8b73-ffed-4248-b8ba-99fc7c5b900f-kube-api-access-nnqnh\") pod \"network-metrics-daemon-ckwhl\" (UID: \"a95e8b73-ffed-4248-b8ba-99fc7c5b900f\") " pod="openshift-multus/network-metrics-daemon-ckwhl" Mar 10 21:52:11 crc kubenswrapper[4919]: I0310 21:52:11.829908 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5163ae00-7b50-497d-9770-0d787026b436\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://322af039c736bcf0b853ee5527ebb1b1750484dfab074745abcd75c24fdcccbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bfa2696bd9b6e5d247686e5297b6ae2f49e5b216174391f211cb2a3a4966135\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c23aaef6f076ab2a428323d38fac48e0c55ad52c55a46c942bccad06474fd31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a88714c27822fd18dff500c973b9d548414d59c7666de938e3cb0c6b18e277e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67063544d268ca488af7ae401113e6f35bb48688e50f944cfa03360de376611a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc6d32c621a9826f8539d4c1806a83f694eaa8874384bd615c2fa1cee02d2e98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc6d32c621a9826f8539d4c1806a83f694eaa8874384bd615c2fa1cee02d2e98\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T21:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8aa3a4aef61b7421411f28beecceab476e43c52eb51ac2a257399b01e4c2bcb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8aa3a4aef61b7421411f28beecceab476e43c52eb51ac2a257399b01e4c2bcb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2126875e4bacb36f72de8a3a8bd7c799b89e7691a83fd39d7645a97e95db91b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2126875e4bacb36f72de8a3a8bd7c799b89e7691a83fd39d7645a97e95db91b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:50:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:11Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:11 crc kubenswrapper[4919]: E0310 21:52:11.830893 4919 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:52:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:52:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:52:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:52:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c22d31cd-a51d-4524-bb69-0b454ae09e98\\\",\\\"systemUUID\\\":\\\"eb24d1fd-ecd7-423c-90f7-cacacceb5386\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:11Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:11 crc kubenswrapper[4919]: I0310 21:52:11.837648 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:52:11 crc kubenswrapper[4919]: I0310 21:52:11.837692 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:52:11 crc kubenswrapper[4919]: I0310 21:52:11.837706 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:52:11 crc kubenswrapper[4919]: I0310 21:52:11.837727 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:52:11 crc kubenswrapper[4919]: I0310 21:52:11.837739 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:52:11Z","lastTransitionTime":"2026-03-10T21:52:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:52:11 crc kubenswrapper[4919]: I0310 21:52:11.850228 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9ed1501-15da-4419-aa12-171e610438d6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9e6a8efa1e2d16b45fe6362b326e3f89333864dc74f3b298d2e500a90d303b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37d8507fd02b92972ed41aa2c4d53fceb1c9d58864e46ddc7991f94fb4d9b3e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://dfb03c5f450790952fc7173bc2a6d723c777921f5f74963bfdbc3573ec1d21cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce192a4f3e94d00998fbfe0948a32765574a9261d22004480dfb54b9bbf9407a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b2adcae0bf01d646b97b915a28921ad4151ca62e8bdd174b42b5e3dff4b27db\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T21:51:14Z\\\",\\\"message\\\":\\\"file observer\\\\nW0310 21:51:14.180982 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 21:51:14.181120 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 21:51:14.182146 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-915015507/tls.crt::/tmp/serving-cert-915015507/tls.key\\\\\\\"\\\\nI0310 21:51:14.490188 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 21:51:14.496972 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 21:51:14.497009 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 21:51:14.497047 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 21:51:14.497058 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 21:51:14.505682 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 21:51:14.505718 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 21:51:14.505729 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 21:51:14.505740 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0310 21:51:14.505737 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 21:51:14.505748 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 21:51:14.505777 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 21:51:14.505783 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 21:51:14.508219 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T21:51:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47a772db349df6c0c6fe27be93d19e02d66cfaf9739ee12e89730ece1da11473\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15b81f8be7635a10cf516cd17c9ab3d48b74d3421098124fcc47dbab6691e3c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15b81f8be7635a10cf516cd17c9ab3d48b74d3421098124fcc47dbab6691e3c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-10T21:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:50:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:11Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:11 crc kubenswrapper[4919]: E0310 21:52:11.858758 4919 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:52:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:52:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:11Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:52:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:52:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c22d31cd-a51d-4524-bb69-0b454ae09e98\\\",\\\"systemUUID\\\":\\\"eb24d1fd-ecd7-423c-90f7-cacacceb5386\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:11Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:11 crc kubenswrapper[4919]: I0310 21:52:11.866737 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:52:11 crc kubenswrapper[4919]: I0310 21:52:11.866771 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:52:11 crc kubenswrapper[4919]: I0310 21:52:11.866781 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:52:11 crc kubenswrapper[4919]: I0310 21:52:11.866794 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:52:11 crc kubenswrapper[4919]: I0310 21:52:11.866803 4919 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:52:11Z","lastTransitionTime":"2026-03-10T21:52:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 21:52:11 crc kubenswrapper[4919]: I0310 21:52:11.871441 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:11Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:11 crc kubenswrapper[4919]: E0310 21:52:11.883933 4919 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:52:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:52:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:52:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:52:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c22d31cd-a51d-4524-bb69-0b454ae09e98\\\",\\\"systemUUID\\\":\\\"eb24d1fd-ecd7-423c-90f7-cacacceb5386\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:11Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:11 crc kubenswrapper[4919]: I0310 21:52:11.884789 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f95c272d21026474ba17da6abc519f0cc1874dbdded3e089a107b23cdd20fa6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\
":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:11Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:11 crc kubenswrapper[4919]: I0310 21:52:11.890350 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:52:11 crc kubenswrapper[4919]: I0310 21:52:11.890479 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:52:11 crc kubenswrapper[4919]: I0310 21:52:11.890542 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:52:11 crc kubenswrapper[4919]: I0310 21:52:11.890558 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:52:11 crc kubenswrapper[4919]: I0310 21:52:11.890573 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:52:11Z","lastTransitionTime":"2026-03-10T21:52:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:52:11 crc kubenswrapper[4919]: I0310 21:52:11.904125 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hzq7c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e92ce303-b70d-4416-b8f1-520b49dca2e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cc1a7ce601001a487303cfae1cef980407a59cc27d02d4f3c4a303b7668639f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qw7c6\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hzq7c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:11Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:11 crc kubenswrapper[4919]: E0310 21:52:11.909114 4919 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:52:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:52:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:11Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:52:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:52:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c22d31cd-a51d-4524-bb69-0b454ae09e98\\\",\\\"systemUUID\\\":\\\"eb24d1fd-ecd7-423c-90f7-cacacceb5386\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:11Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:11 crc kubenswrapper[4919]: I0310 21:52:11.912744 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:52:11 crc kubenswrapper[4919]: I0310 21:52:11.912790 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:52:11 crc kubenswrapper[4919]: I0310 21:52:11.912806 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:52:11 crc kubenswrapper[4919]: I0310 21:52:11.912829 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:52:11 crc kubenswrapper[4919]: I0310 21:52:11.912845 4919 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:52:11Z","lastTransitionTime":"2026-03-10T21:52:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 21:52:11 crc kubenswrapper[4919]: I0310 21:52:11.920043 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a95e8b73-ffed-4248-b8ba-99fc7c5b900f-metrics-certs\") pod \"network-metrics-daemon-ckwhl\" (UID: \"a95e8b73-ffed-4248-b8ba-99fc7c5b900f\") " pod="openshift-multus/network-metrics-daemon-ckwhl" Mar 10 21:52:11 crc kubenswrapper[4919]: I0310 21:52:11.920155 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nnqnh\" (UniqueName: \"kubernetes.io/projected/a95e8b73-ffed-4248-b8ba-99fc7c5b900f-kube-api-access-nnqnh\") pod \"network-metrics-daemon-ckwhl\" (UID: \"a95e8b73-ffed-4248-b8ba-99fc7c5b900f\") " pod="openshift-multus/network-metrics-daemon-ckwhl" Mar 10 21:52:11 crc kubenswrapper[4919]: E0310 21:52:11.920275 4919 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 21:52:11 crc kubenswrapper[4919]: E0310 21:52:11.920368 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a95e8b73-ffed-4248-b8ba-99fc7c5b900f-metrics-certs podName:a95e8b73-ffed-4248-b8ba-99fc7c5b900f nodeName:}" failed. No retries permitted until 2026-03-10 21:52:12.420345946 +0000 UTC m=+119.662226564 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a95e8b73-ffed-4248-b8ba-99fc7c5b900f-metrics-certs") pod "network-metrics-daemon-ckwhl" (UID: "a95e8b73-ffed-4248-b8ba-99fc7c5b900f") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 21:52:11 crc kubenswrapper[4919]: I0310 21:52:11.924555 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4dp67" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2e7c6fb-9e33-441d-9197-719929eb9e21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad5abb4f4c3ead1b05c1f3efbb6bb59fc71bcf5796e16a558ef7e16375571fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a8f975d3caae031d32429c7fdf67356ff18bae1f0eefa3b99758d6c461a736\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4dd430eed1c68d77dcea63fe5885eec7aae3fee6a0008bd9e145b93aef04aa26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://648a881496440d2732bd49969e863fcde223e95156c280e7699782a5521f1580\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06e9ce48fc10caa34e8b774ce3039117386ffe276cf268d197ac46518a4baf91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9fa1c50eff4c6bb37ea60631a4278c6a15d6398d6fad6fb1a914bd6cccf15b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47d3324e3001c164e3ec82718c91df17d64d84bbd2a08159c3bcacc8518a18f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ba4289120330c49ae9af7da81d7a041beb11672e2f1bbe9693a69c0e060b437\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T21:52:08Z\\\",\\\"message\\\":\\\"310 21:52:08.649483 6773 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0310 21:52:08.649499 6773 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0310 21:52:08.649511 6773 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0310 21:52:08.649515 6773 handler.go:190] Sending *v1.Node event 
handler 7 for removal\\\\nI0310 21:52:08.649539 6773 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0310 21:52:08.649551 6773 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0310 21:52:08.649555 6773 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0310 21:52:08.649598 6773 factory.go:656] Stopping watch factory\\\\nI0310 21:52:08.649614 6773 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0310 21:52:08.649674 6773 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0310 21:52:08.649684 6773 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0310 21:52:08.649689 6773 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0310 21:52:08.649695 6773 handler.go:208] Removed *v1.Node event handler 2\\\\nI0310 21:52:08.649702 6773 handler.go:208] Removed *v1.Node event handler 7\\\\nI0310 21:52:08.649707 6773 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0310 21:52:08.649713 6773 handler.go:208] Removed *v1.Namespace ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:05Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://47d3324e3001c164e3ec82718c91df17d64d84bbd2a08159c3bcacc8518a18f9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T21:52:10Z\\\",\\\"message\\\":\\\"pointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0310 21:52:10.085044 6945 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0310 21:52:10.085924 6945 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0310 21:52:10.085952 6945 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0310 21:52:10.085958 6945 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0310 21:52:10.085978 6945 handler.go:208] Removed *v1.Node event 
handler 7\\\\nI0310 21:52:10.085991 6945 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0310 21:52:10.085999 6945 handler.go:208] Removed *v1.Node event handler 2\\\\nI0310 21:52:10.086373 6945 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0310 21:52:10.086607 6945 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0310 21:52:10.087427 6945 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0310 21:52:10.087806 6945 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0310 21:52:10.087867 6945 factory.go:656] Stopping watch factory\\\\nI0310 21:52:10.087885 6945 handler.go:208] Removed *v1.EgressFirewall ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/l
ib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81583105305b81fde3bde968fbbd5463ad1100d8f444b22f62764c4676b95325\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"
readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ca4b1de4425e9aa53f9cf24a56759708297b223771278a445dcf171f4dd6493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ca4b1de4425e9aa53f9cf24a56759708297b223771278a445dcf171f4dd6493\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4dp67\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:11Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:11 crc kubenswrapper[4919]: E0310 21:52:11.927130 4919 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:52:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:52:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:52:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:52:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c22d31cd-a51d-4524-bb69-0b454ae09e98\\\",\\\"systemUUID\\\":\\\"eb24d1fd-ecd7-423c-90f7-cacacceb5386\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:11Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:11 crc kubenswrapper[4919]: E0310 21:52:11.927295 4919 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 10 21:52:11 crc kubenswrapper[4919]: I0310 21:52:11.932680 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:52:11 crc kubenswrapper[4919]: I0310 21:52:11.932720 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:52:11 crc kubenswrapper[4919]: I0310 21:52:11.932729 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:52:11 crc kubenswrapper[4919]: I0310 21:52:11.932741 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:52:11 crc kubenswrapper[4919]: I0310 21:52:11.932750 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:52:11Z","lastTransitionTime":"2026-03-10T21:52:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:52:11 crc kubenswrapper[4919]: I0310 21:52:11.942275 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://937344f02f0259b0d258de35d490545dad0ce084dd49c7584002da0734cc046e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:11Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:11 crc kubenswrapper[4919]: I0310 21:52:11.942629 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nnqnh\" (UniqueName: \"kubernetes.io/projected/a95e8b73-ffed-4248-b8ba-99fc7c5b900f-kube-api-access-nnqnh\") pod \"network-metrics-daemon-ckwhl\" (UID: \"a95e8b73-ffed-4248-b8ba-99fc7c5b900f\") " pod="openshift-multus/network-metrics-daemon-ckwhl" Mar 10 21:52:11 crc kubenswrapper[4919]: I0310 21:52:11.953781 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b625p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b82448f1-4387-4d1a-a300-29f4b3d86bbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5edef5b10597e40
4ec5599983d07529ee344e7d7f5b1a4a7f589678613034b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s9q9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:52:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b625p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:11Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:11 crc kubenswrapper[4919]: I0310 21:52:11.965308 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:11Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:11 crc kubenswrapper[4919]: I0310 21:52:11.978242 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:11Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:11 crc kubenswrapper[4919]: I0310 21:52:11.992841 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-z6pc7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ace27ab-c4c7-412b-9ae8-a3e4ff15faec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://607c25e23101a157124cb81f984fac6d36e71a08b7d990e1d11627f3a7de24b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5636b870a2c5e13abd72ff1e7c785363d11540245c14af6f5de75a0b400a6aa\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5636b870a2c5e13abd72ff1e7c785363d11540245c14af6f5de75a0b400a6aa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7cd0e8267d82b65679954a7b935e138a06a6634311569a48268ba5917d6cb7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7cd0e8267d82b65679954a7b935e138a06a6634311569a48268ba5917d6cb7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:52:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:00Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4b4f2cd55ec546c6f44490159ec5b303d67b584fce7afb4b6fe3dd93e95d843\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4b4f2cd55ec546c6f44490159ec5b303d67b584fce7afb4b6fe3dd93e95d843\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:52:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cebcf
235b45b59a23c89f44855f381557a49fc94ec9332c699274507cda4cd12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cebcf235b45b59a23c89f44855f381557a49fc94ec9332c699274507cda4cd12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:52:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b01604ec0336bae6aa21d9b124596bb46f6351b67c413f2fade87febd0887a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b01604ec0336bae6aa21d9b124596bb46f6351b67c413f2fade87febd0887a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:52:03Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3c386dab63ce40e10fc9e5617d1e0c1e87eee2a1148cc64d5faa74542014155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3c386dab63ce40e10fc9e5617d1e0c1e87eee2a1148cc64d5faa74542014155\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:52:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-z6pc7\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:11Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:12 crc kubenswrapper[4919]: I0310 21:52:12.004029 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zv56q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c57707d-1414-4a4a-ac8a-0fadb2fbe5f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bm5jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bm5jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:52:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zv56q\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:12Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:12 crc kubenswrapper[4919]: I0310 21:52:12.017684 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac8c9a9627b63f7b2a9c80571ca8f781eec442b1fd148631fa417b2e11943437\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18a390bbc216535df32a4dab5fb983494134d2e9f87a689ea39d3e32592ec663\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:12Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:12 crc kubenswrapper[4919]: I0310 21:52:12.023194 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4dp67_a2e7c6fb-9e33-441d-9197-719929eb9e21/ovnkube-controller/1.log" Mar 10 21:52:12 crc kubenswrapper[4919]: I0310 21:52:12.026619 4919 scope.go:117] "RemoveContainer" containerID="47d3324e3001c164e3ec82718c91df17d64d84bbd2a08159c3bcacc8518a18f9" Mar 10 21:52:12 
crc kubenswrapper[4919]: E0310 21:52:12.026789 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-4dp67_openshift-ovn-kubernetes(a2e7c6fb-9e33-441d-9197-719929eb9e21)\"" pod="openshift-ovn-kubernetes/ovnkube-node-4dp67" podUID="a2e7c6fb-9e33-441d-9197-719929eb9e21" Mar 10 21:52:12 crc kubenswrapper[4919]: I0310 21:52:12.028615 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zv56q" event={"ID":"8c57707d-1414-4a4a-ac8a-0fadb2fbe5f7","Type":"ContainerStarted","Data":"ac5ed6a131689d58c1d7867655d836f574591f3bc397d05858cbcfb9748c5107"} Mar 10 21:52:12 crc kubenswrapper[4919]: I0310 21:52:12.028650 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zv56q" event={"ID":"8c57707d-1414-4a4a-ac8a-0fadb2fbe5f7","Type":"ContainerStarted","Data":"a4d38859cd120bfb7307a52fd56c1b53490e57164b68da1811475e1046de690a"} Mar 10 21:52:12 crc kubenswrapper[4919]: I0310 21:52:12.028665 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zv56q" event={"ID":"8c57707d-1414-4a4a-ac8a-0fadb2fbe5f7","Type":"ContainerStarted","Data":"e9c1b652945e7e965a162f78725b6ac4b745a1679c406014f72d7dd57c8bf9a2"} Mar 10 21:52:12 crc kubenswrapper[4919]: I0310 21:52:12.034730 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:52:12 crc kubenswrapper[4919]: I0310 21:52:12.034771 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:52:12 crc kubenswrapper[4919]: I0310 21:52:12.034693 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"566678d1-f416-4116-ab20-b30dceb86cdc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://603ee76064368a216672f45eb860628d301968c311e0bc75b9a73c01f351c9c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrhvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b645dc541f9
bef5d9710345252c2ff48e91412f10d1c0c1bfaa06cf9e82210f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrhvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-z7v4t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:12Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:12 crc kubenswrapper[4919]: I0310 21:52:12.034788 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:52:12 crc kubenswrapper[4919]: I0310 21:52:12.034808 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:52:12 crc kubenswrapper[4919]: I0310 21:52:12.034825 4919 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:52:12Z","lastTransitionTime":"2026-03-10T21:52:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 21:52:12 crc kubenswrapper[4919]: I0310 21:52:12.046084 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hbw8v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a5db7c3-2a96-4030-8c88-5d82d325b62d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abe6d0aa7236ecb1ecf10432a82e6fd0b3103606dbf07a21f54a1908c77ef697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dwtj4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:
51:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hbw8v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:12Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:12 crc kubenswrapper[4919]: I0310 21:52:12.057435 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac8c9a9627b63f7b2a9c80571ca8f781eec442b1fd148631fa417b2e11943437\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-c
onfig\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18a390bbc216535df32a4dab5fb983494134d2e9f87a689ea39d3e32592ec663\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:12Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:12 crc kubenswrapper[4919]: I0310 21:52:12.068028 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"566678d1-f416-4116-ab20-b30dceb86cdc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://603ee76064368a216672f45eb860628d301968c311e0bc75b9a73c01f351c9c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrhvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b645dc541f9bef5d9710345252c2ff48e91412f
10d1c0c1bfaa06cf9e82210f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrhvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-z7v4t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:12Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:12 crc kubenswrapper[4919]: I0310 21:52:12.080970 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hbw8v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a5db7c3-2a96-4030-8c88-5d82d325b62d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abe6d0aa7236ecb1ecf10432a82e6fd0b3103606dbf07a21f54a1908c77ef697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dwtj4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hbw8v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:12Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:12 crc kubenswrapper[4919]: I0310 21:52:12.089943 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed0693f5-4dbc-4621-9cf6-450d64aaea59\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f338ebc5fc228d07415015c51f7ed4fcc24d5bf76a644e491b5c4b9dc51b71f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5205e89649c7c1e920c967873bb1a404548b539605a180b02a1c249ff499e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c427
45f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5205e89649c7c1e920c967873bb1a404548b539605a180b02a1c249ff499e32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:50:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:12Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:12 crc kubenswrapper[4919]: I0310 21:52:12.108773 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5163ae00-7b50-497d-9770-0d787026b436\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://322af039c736bcf0b853ee5527ebb1b1750484dfab074745abcd75c24fdcccbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bfa2696bd9b6e5d247686e5297b6ae2f49e5b216174391f211cb2a3a4966135\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c23aaef6f076ab2a428323d38fac48e0c55ad52c55a46c942bccad06474fd31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a88714c27822fd18dff500c973b9d548414d59c7666de938e3cb0c6b18e277e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67063544d268ca488af7ae401113e6f35bb48688e50f944cfa03360de376611a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc6d32c621a9826f8539d4c1806a83f694eaa8874384bd615c2fa1cee02d2e98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc6d32c621a9826f8539d4c1806a83f694eaa8874384bd615c2fa1cee02d2e98\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T21:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8aa3a4aef61b7421411f28beecceab476e43c52eb51ac2a257399b01e4c2bcb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8aa3a4aef61b7421411f28beecceab476e43c52eb51ac2a257399b01e4c2bcb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2126875e4bacb36f72de8a3a8bd7c799b89e7691a83fd39d7645a97e95db91b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2126875e4bacb36f72de8a3a8bd7c799b89e7691a83fd39d7645a97e95db91b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:50:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:12Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:12 crc kubenswrapper[4919]: I0310 21:52:12.128540 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9ed1501-15da-4419-aa12-171e610438d6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9e6a8efa1e2d16b45fe6362b326e3f89333864dc74f3b298d2e500a90d303b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37d8507fd02b92972ed41aa2c4d53fceb1c9d58864e46ddc7991f94fb4d9b3e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://dfb03c5f450790952fc7173bc2a6d723c777921f5f74963bfdbc3573ec1d21cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce192a4f3e94d00998fbfe0948a32765574a9261d22004480dfb54b9bbf9407a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b2adcae0bf01d646b97b915a28921ad4151ca62e8bdd174b42b5e3dff4b27db\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T21:51:14Z\\\",\\\"message\\\":\\\"file observer\\\\nW0310 21:51:14.180982 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 21:51:14.181120 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 21:51:14.182146 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-915015507/tls.crt::/tmp/serving-cert-915015507/tls.key\\\\\\\"\\\\nI0310 21:51:14.490188 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 21:51:14.496972 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 21:51:14.497009 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 21:51:14.497047 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 21:51:14.497058 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 21:51:14.505682 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 21:51:14.505718 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 21:51:14.505729 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 21:51:14.505740 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0310 21:51:14.505737 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 21:51:14.505748 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 21:51:14.505777 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 21:51:14.505783 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 21:51:14.508219 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T21:51:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47a772db349df6c0c6fe27be93d19e02d66cfaf9739ee12e89730ece1da11473\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15b81f8be7635a10cf516cd17c9ab3d48b74d3421098124fcc47dbab6691e3c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15b81f8be7635a10cf516cd17c9ab3d48b74d3421098124fcc47dbab6691e3c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-10T21:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:50:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:12Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:12 crc kubenswrapper[4919]: I0310 21:52:12.137364 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:52:12 crc kubenswrapper[4919]: I0310 21:52:12.137627 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:52:12 crc kubenswrapper[4919]: I0310 21:52:12.137754 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:52:12 crc kubenswrapper[4919]: I0310 21:52:12.137897 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:52:12 crc kubenswrapper[4919]: I0310 21:52:12.138059 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:52:12Z","lastTransitionTime":"2026-03-10T21:52:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:52:12 crc kubenswrapper[4919]: I0310 21:52:12.142599 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:12Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:12 crc kubenswrapper[4919]: I0310 21:52:12.154710 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f95c272d21026474ba17da6abc519f0cc1874dbdded3e089a107b23cdd20fa6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T21:52:12Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:12 crc kubenswrapper[4919]: I0310 21:52:12.167233 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hzq7c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e92ce303-b70d-4416-b8f1-520b49dca2e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cc1a7ce601001a487303cfae1cef980407a59cc27d02d4f3c4a303b7668639f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-qw7c6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hzq7c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:12Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:12 crc kubenswrapper[4919]: I0310 21:52:12.193897 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4dp67" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2e7c6fb-9e33-441d-9197-719929eb9e21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad5abb4f4c3ead1b05c1f3efbb6bb59fc71bcf5796e16a558ef7e16375571fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a8f975d3caae031d32429c7fdf67356ff18bae1f0eefa3b99758d6c461a736\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4dd430eed1c68d77dcea63fe5885eec7aae3fee6a0008bd9e145b93aef04aa26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://648a881496440d2732bd49969e863fcde223e95156c280e7699782a5521f1580\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06e9ce48fc10caa34e8b774ce3039117386ffe276cf268d197ac46518a4baf91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9fa1c50eff4c6bb37ea60631a4278c6a15d6398d6fad6fb1a914bd6cccf15b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47d3324e3001c164e3ec82718c91df17d64d84bbd2a08159c3bcacc8518a18f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://47d3324e3001c164e3ec82718c91df17d64d84bbd2a08159c3bcacc8518a18f9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T21:52:10Z\\\",\\\"message\\\":\\\"pointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0310 21:52:10.085044 6945 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0310 21:52:10.085924 6945 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0310 21:52:10.085952 6945 handler.go:190] 
Sending *v1.Node event handler 2 for removal\\\\nI0310 21:52:10.085958 6945 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0310 21:52:10.085978 6945 handler.go:208] Removed *v1.Node event handler 7\\\\nI0310 21:52:10.085991 6945 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0310 21:52:10.085999 6945 handler.go:208] Removed *v1.Node event handler 2\\\\nI0310 21:52:10.086373 6945 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0310 21:52:10.086607 6945 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0310 21:52:10.087427 6945 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0310 21:52:10.087806 6945 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0310 21:52:10.087867 6945 factory.go:656] Stopping watch factory\\\\nI0310 21:52:10.087885 6945 handler.go:208] Removed *v1.EgressFirewall ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:09Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4dp67_openshift-ovn-kubernetes(a2e7c6fb-9e33-441d-9197-719929eb9e21)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81583105305b81fde3bde968fbbd5463ad1100d8f444b22f62764c4676b95325\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ca4b1de4425e9aa53f9cf24a56759708297b223771278a445dcf171f4dd6493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ca4b1de4425e9aa53
f9cf24a56759708297b223771278a445dcf171f4dd6493\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4dp67\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:12Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:12 crc kubenswrapper[4919]: I0310 21:52:12.203740 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ckwhl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a95e8b73-ffed-4248-b8ba-99fc7c5b900f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:11Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnqnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnqnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:52:11Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ckwhl\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:12Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:12 crc kubenswrapper[4919]: I0310 21:52:12.220770 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://937344f02f0259b0d258de35d490545dad0ce084dd49c7584002da0734cc046e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/se
rving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:12Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:12 crc kubenswrapper[4919]: I0310 21:52:12.231571 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b625p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b82448f1-4387-4d1a-a300-29f4b3d86bbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5edef5b10597e404ec5599983d07529ee344e7d7f5b1a4a7f589678613034b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f
1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s9q9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:52:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b625p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:12Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:12 crc kubenswrapper[4919]: I0310 21:52:12.240128 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:52:12 crc kubenswrapper[4919]: I0310 21:52:12.240157 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:52:12 crc kubenswrapper[4919]: I0310 21:52:12.240168 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:52:12 crc kubenswrapper[4919]: I0310 21:52:12.240184 4919 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeNotReady" Mar 10 21:52:12 crc kubenswrapper[4919]: I0310 21:52:12.240194 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:52:12Z","lastTransitionTime":"2026-03-10T21:52:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 21:52:12 crc kubenswrapper[4919]: I0310 21:52:12.248238 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:12Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:12 crc kubenswrapper[4919]: I0310 21:52:12.266229 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:12Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:12 crc kubenswrapper[4919]: I0310 21:52:12.286237 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-z6pc7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ace27ab-c4c7-412b-9ae8-a3e4ff15faec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://607c25e23101a157124cb81f984fac6d36e71a08b7d990e1d11627f3a7de24b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5636b870a2c5e13abd72ff1e7c785363d11540245c14af6f5de75a0b400a6aa\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5636b870a2c5e13abd72ff1e7c785363d11540245c14af6f5de75a0b400a6aa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7cd0e8267d82b65679954a7b935e138a06a6634311569a48268ba5917d6cb7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7cd0e8267d82b65679954a7b935e138a06a6634311569a48268ba5917d6cb7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:52:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:00Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4b4f2cd55ec546c6f44490159ec5b303d67b584fce7afb4b6fe3dd93e95d843\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4b4f2cd55ec546c6f44490159ec5b303d67b584fce7afb4b6fe3dd93e95d843\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:52:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cebcf
235b45b59a23c89f44855f381557a49fc94ec9332c699274507cda4cd12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cebcf235b45b59a23c89f44855f381557a49fc94ec9332c699274507cda4cd12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:52:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b01604ec0336bae6aa21d9b124596bb46f6351b67c413f2fade87febd0887a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b01604ec0336bae6aa21d9b124596bb46f6351b67c413f2fade87febd0887a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:52:03Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3c386dab63ce40e10fc9e5617d1e0c1e87eee2a1148cc64d5faa74542014155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3c386dab63ce40e10fc9e5617d1e0c1e87eee2a1148cc64d5faa74542014155\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:52:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-z6pc7\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:12Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:12 crc kubenswrapper[4919]: I0310 21:52:12.298524 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zv56q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c57707d-1414-4a4a-ac8a-0fadb2fbe5f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4d38859cd120bfb7307a52fd56c1b53490e57164b68da1811475e1046de690a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bm5jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac5ed6a131689d58c1d7867655d836f574591f3bc397d05858cbcfb9748c5107\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bm5jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:52:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zv56q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-03-10T21:52:12Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:12 crc kubenswrapper[4919]: I0310 21:52:12.342254 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:52:12 crc kubenswrapper[4919]: I0310 21:52:12.342513 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:52:12 crc kubenswrapper[4919]: I0310 21:52:12.342603 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:52:12 crc kubenswrapper[4919]: I0310 21:52:12.342695 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:52:12 crc kubenswrapper[4919]: I0310 21:52:12.342785 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:52:12Z","lastTransitionTime":"2026-03-10T21:52:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:52:12 crc kubenswrapper[4919]: I0310 21:52:12.425047 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a95e8b73-ffed-4248-b8ba-99fc7c5b900f-metrics-certs\") pod \"network-metrics-daemon-ckwhl\" (UID: \"a95e8b73-ffed-4248-b8ba-99fc7c5b900f\") " pod="openshift-multus/network-metrics-daemon-ckwhl" Mar 10 21:52:12 crc kubenswrapper[4919]: E0310 21:52:12.425205 4919 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 21:52:12 crc kubenswrapper[4919]: E0310 21:52:12.425371 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a95e8b73-ffed-4248-b8ba-99fc7c5b900f-metrics-certs podName:a95e8b73-ffed-4248-b8ba-99fc7c5b900f nodeName:}" failed. No retries permitted until 2026-03-10 21:52:13.425353804 +0000 UTC m=+120.667234412 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a95e8b73-ffed-4248-b8ba-99fc7c5b900f-metrics-certs") pod "network-metrics-daemon-ckwhl" (UID: "a95e8b73-ffed-4248-b8ba-99fc7c5b900f") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 21:52:12 crc kubenswrapper[4919]: I0310 21:52:12.444791 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:52:12 crc kubenswrapper[4919]: I0310 21:52:12.444819 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:52:12 crc kubenswrapper[4919]: I0310 21:52:12.444830 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:52:12 crc kubenswrapper[4919]: I0310 21:52:12.444845 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:52:12 crc kubenswrapper[4919]: I0310 21:52:12.444855 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:52:12Z","lastTransitionTime":"2026-03-10T21:52:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 21:52:12 crc kubenswrapper[4919]: I0310 21:52:12.479352 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 21:52:12 crc kubenswrapper[4919]: E0310 21:52:12.479550 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 21:52:12 crc kubenswrapper[4919]: I0310 21:52:12.479378 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 21:52:12 crc kubenswrapper[4919]: I0310 21:52:12.479380 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 21:52:12 crc kubenswrapper[4919]: E0310 21:52:12.479853 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 21:52:12 crc kubenswrapper[4919]: E0310 21:52:12.480040 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 21:52:12 crc kubenswrapper[4919]: I0310 21:52:12.548175 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:52:12 crc kubenswrapper[4919]: I0310 21:52:12.548241 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:52:12 crc kubenswrapper[4919]: I0310 21:52:12.548283 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:52:12 crc kubenswrapper[4919]: I0310 21:52:12.548317 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:52:12 crc kubenswrapper[4919]: I0310 21:52:12.548340 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:52:12Z","lastTransitionTime":"2026-03-10T21:52:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:52:12 crc kubenswrapper[4919]: I0310 21:52:12.651360 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:52:12 crc kubenswrapper[4919]: I0310 21:52:12.651449 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:52:12 crc kubenswrapper[4919]: I0310 21:52:12.651469 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:52:12 crc kubenswrapper[4919]: I0310 21:52:12.651493 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:52:12 crc kubenswrapper[4919]: I0310 21:52:12.651511 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:52:12Z","lastTransitionTime":"2026-03-10T21:52:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:52:12 crc kubenswrapper[4919]: I0310 21:52:12.754493 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:52:12 crc kubenswrapper[4919]: I0310 21:52:12.754530 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:52:12 crc kubenswrapper[4919]: I0310 21:52:12.754538 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:52:12 crc kubenswrapper[4919]: I0310 21:52:12.754551 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:52:12 crc kubenswrapper[4919]: I0310 21:52:12.754561 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:52:12Z","lastTransitionTime":"2026-03-10T21:52:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:52:12 crc kubenswrapper[4919]: I0310 21:52:12.857222 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:52:12 crc kubenswrapper[4919]: I0310 21:52:12.857258 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:52:12 crc kubenswrapper[4919]: I0310 21:52:12.857268 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:52:12 crc kubenswrapper[4919]: I0310 21:52:12.857284 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:52:12 crc kubenswrapper[4919]: I0310 21:52:12.857296 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:52:12Z","lastTransitionTime":"2026-03-10T21:52:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:52:12 crc kubenswrapper[4919]: I0310 21:52:12.960572 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:52:12 crc kubenswrapper[4919]: I0310 21:52:12.960625 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:52:12 crc kubenswrapper[4919]: I0310 21:52:12.960642 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:52:12 crc kubenswrapper[4919]: I0310 21:52:12.960662 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:52:12 crc kubenswrapper[4919]: I0310 21:52:12.960674 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:52:12Z","lastTransitionTime":"2026-03-10T21:52:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:52:13 crc kubenswrapper[4919]: I0310 21:52:13.064044 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:52:13 crc kubenswrapper[4919]: I0310 21:52:13.064121 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:52:13 crc kubenswrapper[4919]: I0310 21:52:13.064150 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:52:13 crc kubenswrapper[4919]: I0310 21:52:13.064182 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:52:13 crc kubenswrapper[4919]: I0310 21:52:13.064207 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:52:13Z","lastTransitionTime":"2026-03-10T21:52:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:52:13 crc kubenswrapper[4919]: I0310 21:52:13.169588 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:52:13 crc kubenswrapper[4919]: I0310 21:52:13.169646 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:52:13 crc kubenswrapper[4919]: I0310 21:52:13.169663 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:52:13 crc kubenswrapper[4919]: I0310 21:52:13.169688 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:52:13 crc kubenswrapper[4919]: I0310 21:52:13.169705 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:52:13Z","lastTransitionTime":"2026-03-10T21:52:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:52:13 crc kubenswrapper[4919]: I0310 21:52:13.272297 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:52:13 crc kubenswrapper[4919]: I0310 21:52:13.272354 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:52:13 crc kubenswrapper[4919]: I0310 21:52:13.272368 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:52:13 crc kubenswrapper[4919]: I0310 21:52:13.272419 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:52:13 crc kubenswrapper[4919]: I0310 21:52:13.272436 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:52:13Z","lastTransitionTime":"2026-03-10T21:52:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:52:13 crc kubenswrapper[4919]: I0310 21:52:13.376030 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:52:13 crc kubenswrapper[4919]: I0310 21:52:13.376093 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:52:13 crc kubenswrapper[4919]: I0310 21:52:13.376114 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:52:13 crc kubenswrapper[4919]: I0310 21:52:13.376139 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:52:13 crc kubenswrapper[4919]: I0310 21:52:13.376158 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:52:13Z","lastTransitionTime":"2026-03-10T21:52:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:52:13 crc kubenswrapper[4919]: I0310 21:52:13.434493 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a95e8b73-ffed-4248-b8ba-99fc7c5b900f-metrics-certs\") pod \"network-metrics-daemon-ckwhl\" (UID: \"a95e8b73-ffed-4248-b8ba-99fc7c5b900f\") " pod="openshift-multus/network-metrics-daemon-ckwhl" Mar 10 21:52:13 crc kubenswrapper[4919]: E0310 21:52:13.434651 4919 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 21:52:13 crc kubenswrapper[4919]: E0310 21:52:13.434718 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a95e8b73-ffed-4248-b8ba-99fc7c5b900f-metrics-certs podName:a95e8b73-ffed-4248-b8ba-99fc7c5b900f nodeName:}" failed. No retries permitted until 2026-03-10 21:52:15.434700332 +0000 UTC m=+122.676580940 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a95e8b73-ffed-4248-b8ba-99fc7c5b900f-metrics-certs") pod "network-metrics-daemon-ckwhl" (UID: "a95e8b73-ffed-4248-b8ba-99fc7c5b900f") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 21:52:13 crc kubenswrapper[4919]: E0310 21:52:13.476370 4919 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Mar 10 21:52:13 crc kubenswrapper[4919]: I0310 21:52:13.479042 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ckwhl" Mar 10 21:52:13 crc kubenswrapper[4919]: E0310 21:52:13.479264 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ckwhl" podUID="a95e8b73-ffed-4248-b8ba-99fc7c5b900f" Mar 10 21:52:13 crc kubenswrapper[4919]: I0310 21:52:13.496072 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zv56q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c57707d-1414-4a4a-ac8a-0fadb2fbe5f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4d38859cd120bfb7307a52fd56c1b53490e57164b68da1811475e1046de690a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\
\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bm5jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac5ed6a131689d58c1d7867655d836f574591f3bc397d05858cbcfb9748c5107\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bm5jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:52:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zv56q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:13Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:13 crc kubenswrapper[4919]: I0310 21:52:13.514212 4919 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:13Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:13 crc kubenswrapper[4919]: I0310 21:52:13.529952 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:13Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:13 crc kubenswrapper[4919]: I0310 21:52:13.546969 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-z6pc7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ace27ab-c4c7-412b-9ae8-a3e4ff15faec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://607c25e23101a157124cb81f984fac6d36e71a08b7d990e1d11627f3a7de24b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5636b870a2c5e13abd72ff1e7c785363d11540245c14af6f5de75a0b400a6aa\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5636b870a2c5e13abd72ff1e7c785363d11540245c14af6f5de75a0b400a6aa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7cd0e8267d82b65679954a7b935e138a06a6634311569a48268ba5917d6cb7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7cd0e8267d82b65679954a7b935e138a06a6634311569a48268ba5917d6cb7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:52:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:00Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4b4f2cd55ec546c6f44490159ec5b303d67b584fce7afb4b6fe3dd93e95d843\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4b4f2cd55ec546c6f44490159ec5b303d67b584fce7afb4b6fe3dd93e95d843\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:52:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cebcf
235b45b59a23c89f44855f381557a49fc94ec9332c699274507cda4cd12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cebcf235b45b59a23c89f44855f381557a49fc94ec9332c699274507cda4cd12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:52:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b01604ec0336bae6aa21d9b124596bb46f6351b67c413f2fade87febd0887a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b01604ec0336bae6aa21d9b124596bb46f6351b67c413f2fade87febd0887a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:52:03Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3c386dab63ce40e10fc9e5617d1e0c1e87eee2a1148cc64d5faa74542014155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3c386dab63ce40e10fc9e5617d1e0c1e87eee2a1148cc64d5faa74542014155\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:52:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-z6pc7\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:13Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:13 crc kubenswrapper[4919]: I0310 21:52:13.560964 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac8c9a9627b63f7b2a9c80571ca8f781eec442b1fd148631fa417b2e11943437\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/ku
bernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18a390bbc216535df32a4dab5fb983494134d2e9f87a689ea39d3e32592ec663\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:13Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:13 crc kubenswrapper[4919]: I0310 21:52:13.577601 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"566678d1-f416-4116-ab20-b30dceb86cdc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://603ee76064368a216672f45eb860628d301968c311e0bc75b9a73c01f351c9c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrhvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b645dc541f9bef5d9710345252c2ff48e91412f
10d1c0c1bfaa06cf9e82210f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrhvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-z7v4t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:13Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:13 crc kubenswrapper[4919]: E0310 21:52:13.579776 4919 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 10 21:52:13 crc kubenswrapper[4919]: I0310 21:52:13.595586 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hbw8v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a5db7c3-2a96-4030-8c88-5d82d325b62d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abe6d0aa7236ecb1ecf10432a82e6fd0b3103606dbf07a21f54a1908c77ef697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"
},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dwtj4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hbw8v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:13Z is after 2025-08-24T17:21:41Z" Mar 10 
21:52:13 crc kubenswrapper[4919]: I0310 21:52:13.612006 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f95c272d21026474ba17da6abc519f0cc1874dbdded3e089a107b23cdd20fa6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:13Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:13 crc kubenswrapper[4919]: I0310 21:52:13.623982 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hzq7c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e92ce303-b70d-4416-b8f1-520b49dca2e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cc1a7ce601001a487303cfae1cef980407a59cc27d02d4f3c4a303b7668639f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"starte
dAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qw7c6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hzq7c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:13Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:13 crc kubenswrapper[4919]: I0310 21:52:13.651949 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4dp67" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2e7c6fb-9e33-441d-9197-719929eb9e21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad5abb4f4c3ead1b05c1f3efbb6bb59fc71bcf5796e16a558ef7e16375571fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a8f975d3caae031d32429c7fdf67356ff18bae1f0eefa3b99758d6c461a736\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4dd430eed1c68d77dcea63fe5885eec7aae3fee6a0008bd9e145b93aef04aa26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://648a881496440d2732bd49969e863fcde223e95156c280e7699782a5521f1580\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06e9ce48fc10caa34e8b774ce3039117386ffe276cf268d197ac46518a4baf91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9fa1c50eff4c6bb37ea60631a4278c6a15d6398d6fad6fb1a914bd6cccf15b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47d3324e3001c164e3ec82718c91df17d64d84bbd2a08159c3bcacc8518a18f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://47d3324e3001c164e3ec82718c91df17d64d84bbd2a08159c3bcacc8518a18f9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T21:52:10Z\\\",\\\"message\\\":\\\"pointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0310 21:52:10.085044 
6945 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0310 21:52:10.085924 6945 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0310 21:52:10.085952 6945 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0310 21:52:10.085958 6945 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0310 21:52:10.085978 6945 handler.go:208] Removed *v1.Node event handler 7\\\\nI0310 21:52:10.085991 6945 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0310 21:52:10.085999 6945 handler.go:208] Removed *v1.Node event handler 2\\\\nI0310 21:52:10.086373 6945 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0310 21:52:10.086607 6945 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0310 21:52:10.087427 6945 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0310 21:52:10.087806 6945 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0310 21:52:10.087867 6945 factory.go:656] Stopping watch factory\\\\nI0310 21:52:10.087885 6945 handler.go:208] Removed *v1.EgressFirewall ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:09Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4dp67_openshift-ovn-kubernetes(a2e7c6fb-9e33-441d-9197-719929eb9e21)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81583105305b81fde3bde968fbbd5463ad1100d8f444b22f62764c4676b95325\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ca4b1de4425e9aa53f9cf24a56759708297b223771278a445dcf171f4dd6493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ca4b1de4425e9aa53
f9cf24a56759708297b223771278a445dcf171f4dd6493\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4dp67\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:13Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:13 crc kubenswrapper[4919]: I0310 21:52:13.668679 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ckwhl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a95e8b73-ffed-4248-b8ba-99fc7c5b900f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:11Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnqnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnqnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:52:11Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ckwhl\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:13Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:13 crc kubenswrapper[4919]: I0310 21:52:13.683353 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed0693f5-4dbc-4621-9cf6-450d64aaea59\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f338ebc5fc228d07415015c51f7ed4fcc24d5bf76a644e491b5c4b9dc51b71f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\
\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5205e89649c7c1e920c967873bb1a404548b539605a180b02a1c249ff499e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5205e89649c7c1e920c967873bb1a404548b539605a180b02a1c249ff499e32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:50:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:13Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:13 crc kubenswrapper[4919]: I0310 21:52:13.714280 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5163ae00-7b50-497d-9770-0d787026b436\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://322af039c736bcf0b853ee5527ebb1b1750484dfab074745abcd75c24fdcccbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bfa2696bd9b6e5d247686e5297b6ae2f49e5b216174391f211cb2a3a4966135\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c23aaef6f076ab2a428323d38fac48e0c55ad52c55a46c942bccad06474fd31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a88714c27822fd18dff500c973b9d548414d59c7666de938e3cb0c6b18e277e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67063544d268ca488af7ae401113e6f35bb48688e50f944cfa03360de376611a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc6d32c621a9826f8539d4c1806a83f694eaa8874384bd615c2fa1cee02d2e98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc6d32c621a9826f8539d4c1806a83f694eaa8874384bd615c2fa1cee02d2e98\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T21:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8aa3a4aef61b7421411f28beecceab476e43c52eb51ac2a257399b01e4c2bcb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8aa3a4aef61b7421411f28beecceab476e43c52eb51ac2a257399b01e4c2bcb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2126875e4bacb36f72de8a3a8bd7c799b89e7691a83fd39d7645a97e95db91b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2126875e4bacb36f72de8a3a8bd7c799b89e7691a83fd39d7645a97e95db91b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:50:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:13Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:13 crc kubenswrapper[4919]: I0310 21:52:13.735490 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9ed1501-15da-4419-aa12-171e610438d6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9e6a8efa1e2d16b45fe6362b326e3f89333864dc74f3b298d2e500a90d303b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37d8507fd02b92972ed41aa2c4d53fceb1c9d58864e46ddc7991f94fb4d9b3e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://dfb03c5f450790952fc7173bc2a6d723c777921f5f74963bfdbc3573ec1d21cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce192a4f3e94d00998fbfe0948a32765574a9261d22004480dfb54b9bbf9407a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b2adcae0bf01d646b97b915a28921ad4151ca62e8bdd174b42b5e3dff4b27db\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T21:51:14Z\\\",\\\"message\\\":\\\"file observer\\\\nW0310 21:51:14.180982 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 21:51:14.181120 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 21:51:14.182146 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-915015507/tls.crt::/tmp/serving-cert-915015507/tls.key\\\\\\\"\\\\nI0310 21:51:14.490188 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 21:51:14.496972 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 21:51:14.497009 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 21:51:14.497047 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 21:51:14.497058 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 21:51:14.505682 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 21:51:14.505718 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 21:51:14.505729 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 21:51:14.505740 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0310 21:51:14.505737 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 21:51:14.505748 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 21:51:14.505777 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 21:51:14.505783 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 21:51:14.508219 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T21:51:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47a772db349df6c0c6fe27be93d19e02d66cfaf9739ee12e89730ece1da11473\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15b81f8be7635a10cf516cd17c9ab3d48b74d3421098124fcc47dbab6691e3c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15b81f8be7635a10cf516cd17c9ab3d48b74d3421098124fcc47dbab6691e3c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-10T21:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:50:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:13Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:13 crc kubenswrapper[4919]: I0310 21:52:13.755887 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:13Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:13 crc kubenswrapper[4919]: I0310 21:52:13.773472 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://937344f02f0259b0d258de35d490545dad0ce084dd49c7584002da0734cc046e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:13Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:13 crc kubenswrapper[4919]: I0310 21:52:13.786446 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b625p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b82448f1-4387-4d1a-a300-29f4b3d86bbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5edef5b10597e404ec5599983d07529ee344e7d7f5b1a4a7f589678613034b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\
",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s9q9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:52:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b625p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:13Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:14 crc kubenswrapper[4919]: I0310 21:52:14.479993 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 21:52:14 crc kubenswrapper[4919]: I0310 21:52:14.480061 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 21:52:14 crc kubenswrapper[4919]: E0310 21:52:14.480325 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 21:52:14 crc kubenswrapper[4919]: I0310 21:52:14.480451 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 21:52:14 crc kubenswrapper[4919]: E0310 21:52:14.480522 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 21:52:14 crc kubenswrapper[4919]: E0310 21:52:14.480667 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 21:52:14 crc kubenswrapper[4919]: I0310 21:52:14.492798 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Mar 10 21:52:15 crc kubenswrapper[4919]: I0310 21:52:15.457031 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a95e8b73-ffed-4248-b8ba-99fc7c5b900f-metrics-certs\") pod \"network-metrics-daemon-ckwhl\" (UID: \"a95e8b73-ffed-4248-b8ba-99fc7c5b900f\") " pod="openshift-multus/network-metrics-daemon-ckwhl" Mar 10 21:52:15 crc kubenswrapper[4919]: E0310 21:52:15.457270 4919 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 21:52:15 crc kubenswrapper[4919]: E0310 21:52:15.457479 4919 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/a95e8b73-ffed-4248-b8ba-99fc7c5b900f-metrics-certs podName:a95e8b73-ffed-4248-b8ba-99fc7c5b900f nodeName:}" failed. No retries permitted until 2026-03-10 21:52:19.457382138 +0000 UTC m=+126.699262816 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a95e8b73-ffed-4248-b8ba-99fc7c5b900f-metrics-certs") pod "network-metrics-daemon-ckwhl" (UID: "a95e8b73-ffed-4248-b8ba-99fc7c5b900f") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 21:52:15 crc kubenswrapper[4919]: I0310 21:52:15.479596 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ckwhl" Mar 10 21:52:15 crc kubenswrapper[4919]: E0310 21:52:15.479776 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ckwhl" podUID="a95e8b73-ffed-4248-b8ba-99fc7c5b900f" Mar 10 21:52:16 crc kubenswrapper[4919]: I0310 21:52:16.479793 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 21:52:16 crc kubenswrapper[4919]: I0310 21:52:16.479865 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 21:52:16 crc kubenswrapper[4919]: I0310 21:52:16.479805 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 21:52:16 crc kubenswrapper[4919]: E0310 21:52:16.479991 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 21:52:16 crc kubenswrapper[4919]: E0310 21:52:16.480169 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 21:52:16 crc kubenswrapper[4919]: E0310 21:52:16.480603 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 21:52:17 crc kubenswrapper[4919]: I0310 21:52:17.479000 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-ckwhl" Mar 10 21:52:17 crc kubenswrapper[4919]: E0310 21:52:17.479506 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ckwhl" podUID="a95e8b73-ffed-4248-b8ba-99fc7c5b900f" Mar 10 21:52:18 crc kubenswrapper[4919]: I0310 21:52:18.479362 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 21:52:18 crc kubenswrapper[4919]: E0310 21:52:18.479570 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 21:52:18 crc kubenswrapper[4919]: I0310 21:52:18.479794 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 21:52:18 crc kubenswrapper[4919]: E0310 21:52:18.479955 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 21:52:18 crc kubenswrapper[4919]: I0310 21:52:18.479797 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 21:52:18 crc kubenswrapper[4919]: E0310 21:52:18.480189 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 21:52:18 crc kubenswrapper[4919]: E0310 21:52:18.581727 4919 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 10 21:52:19 crc kubenswrapper[4919]: I0310 21:52:19.085116 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 21:52:19 crc kubenswrapper[4919]: I0310 21:52:19.108912 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"41f60ccf-532c-42dd-85d3-5cf02206caeb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff32e19d9357f72af1234677cdf04c43d15fcbb5af4faeae6db0aa9fca7e8ef5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41e31e0bed022ff2ce3f4869359dfe2d6a25c0039704f26ef5c9b4be0da5b9fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eed85c384d715caea3fe992e40d62b467c4893c865d792f798254701b15735fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29a0037afef2a95ea444616a0317a23a27f5e093c2083b8e13c6ebcec7cb26f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29a0037afef2a95ea444616a0317a23a27f5e093c2083b8e13c6ebcec7cb26f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:14Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:50:13Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:19Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:19 crc kubenswrapper[4919]: I0310 21:52:19.125970 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac8c9a9627b63f7b2a9c80571ca8f781eec442b1fd148631fa417b2e11943437\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18a390bbc216535df32a4dab5fb983494134d2e9f87a689ea39d3e32592ec663\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:19Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:19 crc kubenswrapper[4919]: I0310 21:52:19.144141 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"566678d1-f416-4116-ab20-b30dceb86cdc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://603ee76064368a216672f45eb860628d301968c311e0bc75b9a73c01f351c9c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrhvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b645dc541f9bef5d9710345252c2ff48e91412f
10d1c0c1bfaa06cf9e82210f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrhvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-z7v4t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:19Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:19 crc kubenswrapper[4919]: I0310 21:52:19.164824 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hbw8v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a5db7c3-2a96-4030-8c88-5d82d325b62d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abe6d0aa7236ecb1ecf10432a82e6fd0b3103606dbf07a21f54a1908c77ef697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dwtj4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hbw8v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:19Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:19 crc kubenswrapper[4919]: I0310 21:52:19.181177 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed0693f5-4dbc-4621-9cf6-450d64aaea59\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f338ebc5fc228d07415015c51f7ed4fcc24d5bf76a644e491b5c4b9dc51b71f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5205e89649c7c1e920c967873bb1a404548b539605a180b02a1c249ff499e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c427
45f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5205e89649c7c1e920c967873bb1a404548b539605a180b02a1c249ff499e32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:50:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:19Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:19 crc kubenswrapper[4919]: I0310 21:52:19.215160 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5163ae00-7b50-497d-9770-0d787026b436\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://322af039c736bcf0b853ee5527ebb1b1750484dfab074745abcd75c24fdcccbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bfa2696bd9b6e5d247686e5297b6ae2f49e5b216174391f211cb2a3a4966135\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c23aaef6f076ab2a428323d38fac48e0c55ad52c55a46c942bccad06474fd31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a88714c27822fd18dff500c973b9d548414d59c7666de938e3cb0c6b18e277e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67063544d268ca488af7ae401113e6f35bb48688e50f944cfa03360de376611a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc6d32c621a9826f8539d4c1806a83f694eaa8874384bd615c2fa1cee02d2e98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc6d32c621a9826f8539d4c1806a83f694eaa8874384bd615c2fa1cee02d2e98\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T21:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8aa3a4aef61b7421411f28beecceab476e43c52eb51ac2a257399b01e4c2bcb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8aa3a4aef61b7421411f28beecceab476e43c52eb51ac2a257399b01e4c2bcb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2126875e4bacb36f72de8a3a8bd7c799b89e7691a83fd39d7645a97e95db91b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2126875e4bacb36f72de8a3a8bd7c799b89e7691a83fd39d7645a97e95db91b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:50:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:19Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:19 crc kubenswrapper[4919]: I0310 21:52:19.236777 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9ed1501-15da-4419-aa12-171e610438d6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9e6a8efa1e2d16b45fe6362b326e3f89333864dc74f3b298d2e500a90d303b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37d8507fd02b92972ed41aa2c4d53fceb1c9d58864e46ddc7991f94fb4d9b3e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfb03c5f450790952fc7173bc2a6d723c777921f5f74963bfdbc3573ec1d21cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce192a4f3e94d00998fbfe0948a32765574a9261d22004480dfb54b9bbf9407a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b2adcae0bf01d646b97b915a28921ad4151ca62e8bdd174b42b5e3dff4b27db\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T21:51:14Z\\\",\\\"message\\\":\\\"file observer\\\\nW0310 21:51:14.180982 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 21:51:14.181120 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 21:51:14.182146 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-915015507/tls.crt::/tmp/serving-cert-915015507/tls.key\\\\\\\"\\\\nI0310 21:51:14.490188 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 21:51:14.496972 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 21:51:14.497009 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 21:51:14.497047 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 21:51:14.497058 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 21:51:14.505682 1 
secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 21:51:14.505718 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 21:51:14.505729 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 21:51:14.505740 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0310 21:51:14.505737 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 21:51:14.505748 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 21:51:14.505777 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 21:51:14.505783 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 21:51:14.508219 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T21:51:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47a772db349df6c0c6fe27be93d19e02d66cfaf9739ee12e89730ece1da11473\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15b81f8be7635a10cf516cd17c9ab3d48b74d3421098124fcc47dbab6691e3c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15b81f8be7635a10cf516cd17c9ab3d48b74d3421098124fcc47dbab6691e3c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:50:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:19Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:19 crc kubenswrapper[4919]: I0310 21:52:19.256439 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:19Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:19 crc kubenswrapper[4919]: I0310 21:52:19.315905 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f95c272d21026474ba17da6abc519f0cc1874dbdded3e089a107b23cdd20fa6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T21:52:19Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:19 crc kubenswrapper[4919]: I0310 21:52:19.329836 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hzq7c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e92ce303-b70d-4416-b8f1-520b49dca2e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cc1a7ce601001a487303cfae1cef980407a59cc27d02d4f3c4a303b7668639f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-qw7c6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hzq7c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:19Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:19 crc kubenswrapper[4919]: I0310 21:52:19.360659 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4dp67" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2e7c6fb-9e33-441d-9197-719929eb9e21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad5abb4f4c3ead1b05c1f3efbb6bb59fc71bcf5796e16a558ef7e16375571fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a8f975d3caae031d32429c7fdf67356ff18bae1f0eefa3b99758d6c461a736\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4dd430eed1c68d77dcea63fe5885eec7aae3fee6a0008bd9e145b93aef04aa26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://648a881496440d2732bd49969e863fcde223e95156c280e7699782a5521f1580\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06e9ce48fc10caa34e8b774ce3039117386ffe276cf268d197ac46518a4baf91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9fa1c50eff4c6bb37ea60631a4278c6a15d6398d6fad6fb1a914bd6cccf15b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47d3324e3001c164e3ec82718c91df17d64d84bbd2a08159c3bcacc8518a18f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://47d3324e3001c164e3ec82718c91df17d64d84bbd2a08159c3bcacc8518a18f9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T21:52:10Z\\\",\\\"message\\\":\\\"pointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0310 21:52:10.085044 6945 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0310 21:52:10.085924 6945 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0310 21:52:10.085952 6945 handler.go:190] 
Sending *v1.Node event handler 2 for removal\\\\nI0310 21:52:10.085958 6945 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0310 21:52:10.085978 6945 handler.go:208] Removed *v1.Node event handler 7\\\\nI0310 21:52:10.085991 6945 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0310 21:52:10.085999 6945 handler.go:208] Removed *v1.Node event handler 2\\\\nI0310 21:52:10.086373 6945 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0310 21:52:10.086607 6945 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0310 21:52:10.087427 6945 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0310 21:52:10.087806 6945 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0310 21:52:10.087867 6945 factory.go:656] Stopping watch factory\\\\nI0310 21:52:10.087885 6945 handler.go:208] Removed *v1.EgressFirewall ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:09Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4dp67_openshift-ovn-kubernetes(a2e7c6fb-9e33-441d-9197-719929eb9e21)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81583105305b81fde3bde968fbbd5463ad1100d8f444b22f62764c4676b95325\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ca4b1de4425e9aa53f9cf24a56759708297b223771278a445dcf171f4dd6493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ca4b1de4425e9aa53
f9cf24a56759708297b223771278a445dcf171f4dd6493\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4dp67\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:19Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:19 crc kubenswrapper[4919]: I0310 21:52:19.376116 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ckwhl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a95e8b73-ffed-4248-b8ba-99fc7c5b900f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:11Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnqnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnqnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:52:11Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ckwhl\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:19Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:19 crc kubenswrapper[4919]: I0310 21:52:19.393178 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://937344f02f0259b0d258de35d490545dad0ce084dd49c7584002da0734cc046e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/se
rving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:19Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:19 crc kubenswrapper[4919]: I0310 21:52:19.407592 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b625p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b82448f1-4387-4d1a-a300-29f4b3d86bbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5edef5b10597e404ec5599983d07529ee344e7d7f5b1a4a7f589678613034b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f
1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s9q9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:52:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b625p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:19Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:19 crc kubenswrapper[4919]: I0310 21:52:19.427563 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:19Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:19 crc kubenswrapper[4919]: I0310 21:52:19.447878 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:19Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:19 crc kubenswrapper[4919]: I0310 21:52:19.474062 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-z6pc7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ace27ab-c4c7-412b-9ae8-a3e4ff15faec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://607c25e23101a157124cb81f984fac6d36e71a08b7d990e1d11627f3a7de24b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5636b870a2c5e13abd72ff1e7c785363d11540245c14af6f5de75a0b400a6aa\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5636b870a2c5e13abd72ff1e7c785363d11540245c14af6f5de75a0b400a6aa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7cd0e8267d82b65679954a7b935e138a06a6634311569a48268ba5917d6cb7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7cd0e8267d82b65679954a7b935e138a06a6634311569a48268ba5917d6cb7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:52:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:00Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4b4f2cd55ec546c6f44490159ec5b303d67b584fce7afb4b6fe3dd93e95d843\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4b4f2cd55ec546c6f44490159ec5b303d67b584fce7afb4b6fe3dd93e95d843\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:52:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cebcf
235b45b59a23c89f44855f381557a49fc94ec9332c699274507cda4cd12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cebcf235b45b59a23c89f44855f381557a49fc94ec9332c699274507cda4cd12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:52:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b01604ec0336bae6aa21d9b124596bb46f6351b67c413f2fade87febd0887a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b01604ec0336bae6aa21d9b124596bb46f6351b67c413f2fade87febd0887a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:52:03Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3c386dab63ce40e10fc9e5617d1e0c1e87eee2a1148cc64d5faa74542014155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3c386dab63ce40e10fc9e5617d1e0c1e87eee2a1148cc64d5faa74542014155\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:52:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-z6pc7\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:19Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:19 crc kubenswrapper[4919]: I0310 21:52:19.479816 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ckwhl" Mar 10 21:52:19 crc kubenswrapper[4919]: E0310 21:52:19.479982 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ckwhl" podUID="a95e8b73-ffed-4248-b8ba-99fc7c5b900f" Mar 10 21:52:19 crc kubenswrapper[4919]: I0310 21:52:19.492299 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zv56q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c57707d-1414-4a4a-ac8a-0fadb2fbe5f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4d38859cd120bfb7307a52fd56c1b53490e57164b68da1811475e1046de690a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bm5jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac5ed6a131689d58c1d7867655d836f574591
f3bc397d05858cbcfb9748c5107\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bm5jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:52:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zv56q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:19Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:19 crc kubenswrapper[4919]: I0310 21:52:19.498866 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a95e8b73-ffed-4248-b8ba-99fc7c5b900f-metrics-certs\") pod \"network-metrics-daemon-ckwhl\" (UID: \"a95e8b73-ffed-4248-b8ba-99fc7c5b900f\") " pod="openshift-multus/network-metrics-daemon-ckwhl" Mar 10 21:52:19 crc kubenswrapper[4919]: E0310 21:52:19.499064 
4919 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 21:52:19 crc kubenswrapper[4919]: E0310 21:52:19.499187 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a95e8b73-ffed-4248-b8ba-99fc7c5b900f-metrics-certs podName:a95e8b73-ffed-4248-b8ba-99fc7c5b900f nodeName:}" failed. No retries permitted until 2026-03-10 21:52:27.499156091 +0000 UTC m=+134.741036739 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a95e8b73-ffed-4248-b8ba-99fc7c5b900f-metrics-certs") pod "network-metrics-daemon-ckwhl" (UID: "a95e8b73-ffed-4248-b8ba-99fc7c5b900f") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 21:52:20 crc kubenswrapper[4919]: I0310 21:52:20.479021 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 21:52:20 crc kubenswrapper[4919]: E0310 21:52:20.479182 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 21:52:20 crc kubenswrapper[4919]: I0310 21:52:20.479416 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 21:52:20 crc kubenswrapper[4919]: E0310 21:52:20.479479 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 21:52:20 crc kubenswrapper[4919]: I0310 21:52:20.479909 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 21:52:20 crc kubenswrapper[4919]: E0310 21:52:20.480073 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 21:52:21 crc kubenswrapper[4919]: I0310 21:52:21.479572 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ckwhl" Mar 10 21:52:21 crc kubenswrapper[4919]: E0310 21:52:21.479868 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ckwhl" podUID="a95e8b73-ffed-4248-b8ba-99fc7c5b900f" Mar 10 21:52:22 crc kubenswrapper[4919]: I0310 21:52:22.193878 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:52:22 crc kubenswrapper[4919]: I0310 21:52:22.193934 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:52:22 crc kubenswrapper[4919]: I0310 21:52:22.193953 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:52:22 crc kubenswrapper[4919]: I0310 21:52:22.193978 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:52:22 crc kubenswrapper[4919]: I0310 21:52:22.193996 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:52:22Z","lastTransitionTime":"2026-03-10T21:52:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:52:22 crc kubenswrapper[4919]: E0310 21:52:22.214215 4919 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:52:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:52:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:52:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:52:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c22d31cd-a51d-4524-bb69-0b454ae09e98\\\",\\\"systemUUID\\\":\\\"eb24d1fd-ecd7-423c-90f7-cacacceb5386\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:22Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:22 crc kubenswrapper[4919]: I0310 21:52:22.219904 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:52:22 crc kubenswrapper[4919]: I0310 21:52:22.219980 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:52:22 crc kubenswrapper[4919]: I0310 21:52:22.219999 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:52:22 crc kubenswrapper[4919]: I0310 21:52:22.220024 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:52:22 crc kubenswrapper[4919]: I0310 21:52:22.220045 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:52:22Z","lastTransitionTime":"2026-03-10T21:52:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:52:22 crc kubenswrapper[4919]: E0310 21:52:22.239499 4919 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:52:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:52:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:52:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:52:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c22d31cd-a51d-4524-bb69-0b454ae09e98\\\",\\\"systemUUID\\\":\\\"eb24d1fd-ecd7-423c-90f7-cacacceb5386\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:22Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:22 crc kubenswrapper[4919]: I0310 21:52:22.244301 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:52:22 crc kubenswrapper[4919]: I0310 21:52:22.244346 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:52:22 crc kubenswrapper[4919]: I0310 21:52:22.244355 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:52:22 crc kubenswrapper[4919]: I0310 21:52:22.244372 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:52:22 crc kubenswrapper[4919]: I0310 21:52:22.244385 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:52:22Z","lastTransitionTime":"2026-03-10T21:52:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:52:22 crc kubenswrapper[4919]: E0310 21:52:22.262144 4919 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:52:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:52:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:52:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:52:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c22d31cd-a51d-4524-bb69-0b454ae09e98\\\",\\\"systemUUID\\\":\\\"eb24d1fd-ecd7-423c-90f7-cacacceb5386\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:22Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:22 crc kubenswrapper[4919]: I0310 21:52:22.267475 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:52:22 crc kubenswrapper[4919]: I0310 21:52:22.267535 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:52:22 crc kubenswrapper[4919]: I0310 21:52:22.267555 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:52:22 crc kubenswrapper[4919]: I0310 21:52:22.267579 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:52:22 crc kubenswrapper[4919]: I0310 21:52:22.267597 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:52:22Z","lastTransitionTime":"2026-03-10T21:52:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:52:22 crc kubenswrapper[4919]: E0310 21:52:22.286215 4919 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:52:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:52:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:52:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:52:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c22d31cd-a51d-4524-bb69-0b454ae09e98\\\",\\\"systemUUID\\\":\\\"eb24d1fd-ecd7-423c-90f7-cacacceb5386\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:22Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:22 crc kubenswrapper[4919]: I0310 21:52:22.290696 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:52:22 crc kubenswrapper[4919]: I0310 21:52:22.290773 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:52:22 crc kubenswrapper[4919]: I0310 21:52:22.290789 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:52:22 crc kubenswrapper[4919]: I0310 21:52:22.290833 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:52:22 crc kubenswrapper[4919]: I0310 21:52:22.290868 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:52:22Z","lastTransitionTime":"2026-03-10T21:52:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:52:22 crc kubenswrapper[4919]: E0310 21:52:22.305327 4919 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:52:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:52:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:52:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:52:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c22d31cd-a51d-4524-bb69-0b454ae09e98\\\",\\\"systemUUID\\\":\\\"eb24d1fd-ecd7-423c-90f7-cacacceb5386\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:22Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:22 crc kubenswrapper[4919]: E0310 21:52:22.305516 4919 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 10 21:52:22 crc kubenswrapper[4919]: I0310 21:52:22.479740 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 21:52:22 crc kubenswrapper[4919]: I0310 21:52:22.479847 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 21:52:22 crc kubenswrapper[4919]: E0310 21:52:22.479935 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 21:52:22 crc kubenswrapper[4919]: I0310 21:52:22.480009 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 21:52:22 crc kubenswrapper[4919]: E0310 21:52:22.480173 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 21:52:22 crc kubenswrapper[4919]: E0310 21:52:22.480460 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 21:52:23 crc kubenswrapper[4919]: I0310 21:52:23.479583 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ckwhl" Mar 10 21:52:23 crc kubenswrapper[4919]: E0310 21:52:23.479789 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ckwhl" podUID="a95e8b73-ffed-4248-b8ba-99fc7c5b900f" Mar 10 21:52:23 crc kubenswrapper[4919]: I0310 21:52:23.499156 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f95c272d21026474ba17da6abc519f0cc1874dbdded3e089a107b23cdd20fa6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\
\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:23Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:23 crc kubenswrapper[4919]: I0310 21:52:23.516125 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hzq7c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e92ce303-b70d-4416-b8f1-520b49dca2e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cc1a7ce601001a487303cfae1cef980407a59cc27d02d4f3c4a303b7668639f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\
\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qw7c6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hzq7c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:23Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:23 crc kubenswrapper[4919]: I0310 21:52:23.551599 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4dp67" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2e7c6fb-9e33-441d-9197-719929eb9e21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad5abb4f4c3ead1b05c1f3efbb6bb59fc71bcf5796e16a558ef7e16375571fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a8f975d3caae031d32429c7fdf67356ff18bae1f0eefa3b99758d6c461a736\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4dd430eed1c68d77dcea63fe5885eec7aae3fee6a0008bd9e145b93aef04aa26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://648a881496440d2732bd49969e863fcde223e95156c280e7699782a5521f1580\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06e9ce48fc10caa34e8b774ce3039117386ffe276cf268d197ac46518a4baf91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9fa1c50eff4c6bb37ea60631a4278c6a15d6398d6fad6fb1a914bd6cccf15b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47d3324e3001c164e3ec82718c91df17d64d84bbd2a08159c3bcacc8518a18f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://47d3324e3001c164e3ec82718c91df17d64d84bbd2a08159c3bcacc8518a18f9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T21:52:10Z\\\",\\\"message\\\":\\\"pointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0310 21:52:10.085044 6945 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0310 21:52:10.085924 6945 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0310 21:52:10.085952 6945 handler.go:190] 
Sending *v1.Node event handler 2 for removal\\\\nI0310 21:52:10.085958 6945 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0310 21:52:10.085978 6945 handler.go:208] Removed *v1.Node event handler 7\\\\nI0310 21:52:10.085991 6945 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0310 21:52:10.085999 6945 handler.go:208] Removed *v1.Node event handler 2\\\\nI0310 21:52:10.086373 6945 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0310 21:52:10.086607 6945 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0310 21:52:10.087427 6945 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0310 21:52:10.087806 6945 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0310 21:52:10.087867 6945 factory.go:656] Stopping watch factory\\\\nI0310 21:52:10.087885 6945 handler.go:208] Removed *v1.EgressFirewall ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:09Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4dp67_openshift-ovn-kubernetes(a2e7c6fb-9e33-441d-9197-719929eb9e21)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81583105305b81fde3bde968fbbd5463ad1100d8f444b22f62764c4676b95325\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ca4b1de4425e9aa53f9cf24a56759708297b223771278a445dcf171f4dd6493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ca4b1de4425e9aa53
f9cf24a56759708297b223771278a445dcf171f4dd6493\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4dp67\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:23Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:23 crc kubenswrapper[4919]: I0310 21:52:23.567944 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ckwhl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a95e8b73-ffed-4248-b8ba-99fc7c5b900f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:11Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnqnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnqnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:52:11Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ckwhl\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:23Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:23 crc kubenswrapper[4919]: I0310 21:52:23.581029 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed0693f5-4dbc-4621-9cf6-450d64aaea59\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f338ebc5fc228d07415015c51f7ed4fcc24d5bf76a644e491b5c4b9dc51b71f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\
\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5205e89649c7c1e920c967873bb1a404548b539605a180b02a1c249ff499e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5205e89649c7c1e920c967873bb1a404548b539605a180b02a1c249ff499e32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:50:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:23Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:23 crc kubenswrapper[4919]: E0310 21:52:23.582163 4919 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 10 21:52:23 crc kubenswrapper[4919]: I0310 21:52:23.608127 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5163ae00-7b50-497d-9770-0d787026b436\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://322af039c736bcf0b853ee5527ebb1b1750484dfab074745abcd75c24fdcccbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"ce
rt-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bfa2696bd9b6e5d247686e5297b6ae2f49e5b216174391f211cb2a3a4966135\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c23aaef6f076ab2a428323d38fac48e0c55ad52c55a46c942bccad06474fd31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a88714c27822fd18dff500c973b9d548414d59c7666de938e3cb0c6b18e277e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io
/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67063544d268ca488af7ae401113e6f35bb48688e50f944cfa03360de376611a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc6d32c621a9826f8539d4c1806a83f694eaa8874384bd615c2fa1cee02d2e98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\
\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc6d32c621a9826f8539d4c1806a83f694eaa8874384bd615c2fa1cee02d2e98\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8aa3a4aef61b7421411f28beecceab476e43c52eb51ac2a257399b01e4c2bcb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8aa3a4aef61b7421411f28beecceab476e43c52eb51ac2a257399b01e4c2bcb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2126875e4bacb36f72de8a3a8bd7c799b89e7691a83fd39d7645a97e95db91b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2126875e4bacb36f72de8a3a8bd7c799b89e7691a83fd39d7645a97e95db91b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:16Z\\\",\\\"reason\\\":\\\"Completed\
\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:50:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:23Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:23 crc kubenswrapper[4919]: I0310 21:52:23.621906 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9ed1501-15da-4419-aa12-171e610438d6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9e6a8efa1e2d16b45fe6362b326e3f89333864dc74f3b298d2e500a90d303b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37d8507fd02b92972ed41aa2c4d53fceb1c9d58864e46ddc7991f94fb4d9b3e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfb03c5f450790952fc7173bc2a6d723c777921f5f74963bfdbc3573ec1d21cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce192a4f3e94d00998fbfe0948a32765574a9261d22004480dfb54b9bbf9407a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b2adcae0bf01d646b97b915a28921ad4151ca62e8bdd174b42b5e3dff4b27db\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T21:51:14Z\\\"
,\\\"message\\\":\\\"file observer\\\\nW0310 21:51:14.180982 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 21:51:14.181120 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 21:51:14.182146 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-915015507/tls.crt::/tmp/serving-cert-915015507/tls.key\\\\\\\"\\\\nI0310 21:51:14.490188 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 21:51:14.496972 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 21:51:14.497009 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 21:51:14.497047 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 21:51:14.497058 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 21:51:14.505682 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 21:51:14.505718 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 21:51:14.505729 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 21:51:14.505740 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0310 21:51:14.505737 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 21:51:14.505748 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 21:51:14.505777 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 21:51:14.505783 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 21:51:14.508219 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T21:51:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47a772db349df6c0c6fe27be93d19e02d66cfaf9739ee12e89730ece1da11473\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15b81f8be7635a10cf516cd17c9ab3d48b74d3421098124fcc47dbab6691e3c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15b81f8be7635a10cf516cd17c9ab3d48b7
4d3421098124fcc47dbab6691e3c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:50:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:23Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:23 crc kubenswrapper[4919]: I0310 21:52:23.639910 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:23Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:23 crc kubenswrapper[4919]: I0310 21:52:23.660123 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://937344f02f0259b0d258de35d490545dad0ce084dd49c7584002da0734cc046e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:23Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:23 crc kubenswrapper[4919]: I0310 21:52:23.670695 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b625p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b82448f1-4387-4d1a-a300-29f4b3d86bbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5edef5b10597e404ec5599983d07529ee344e7d7f5b1a4a7f589678613034b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\
",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s9q9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:52:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b625p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:23Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:23 crc kubenswrapper[4919]: I0310 21:52:23.681886 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zv56q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c57707d-1414-4a4a-ac8a-0fadb2fbe5f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4d38859cd120bfb7307a52fd56c1b53490e57164b68da1811475e1046de690a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bm5jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac5ed6a131689d58c1d7867655d836f574591
f3bc397d05858cbcfb9748c5107\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bm5jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:52:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zv56q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:23Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:23 crc kubenswrapper[4919]: I0310 21:52:23.700739 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:23Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:23 crc kubenswrapper[4919]: I0310 21:52:23.720920 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:23Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:23 crc kubenswrapper[4919]: I0310 21:52:23.744529 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-z6pc7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ace27ab-c4c7-412b-9ae8-a3e4ff15faec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://607c25e23101a157124cb81f984fac6d36e71a08b7d990e1d11627f3a7de24b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5636b870a2c5e13abd72ff1e7c785363d11540245c14af6f5de75a0b400a6aa\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5636b870a2c5e13abd72ff1e7c785363d11540245c14af6f5de75a0b400a6aa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7cd0e8267d82b65679954a7b935e138a06a6634311569a48268ba5917d6cb7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7cd0e8267d82b65679954a7b935e138a06a6634311569a48268ba5917d6cb7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:52:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:00Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4b4f2cd55ec546c6f44490159ec5b303d67b584fce7afb4b6fe3dd93e95d843\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4b4f2cd55ec546c6f44490159ec5b303d67b584fce7afb4b6fe3dd93e95d843\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:52:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cebcf
235b45b59a23c89f44855f381557a49fc94ec9332c699274507cda4cd12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cebcf235b45b59a23c89f44855f381557a49fc94ec9332c699274507cda4cd12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:52:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b01604ec0336bae6aa21d9b124596bb46f6351b67c413f2fade87febd0887a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b01604ec0336bae6aa21d9b124596bb46f6351b67c413f2fade87febd0887a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:52:03Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3c386dab63ce40e10fc9e5617d1e0c1e87eee2a1148cc64d5faa74542014155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3c386dab63ce40e10fc9e5617d1e0c1e87eee2a1148cc64d5faa74542014155\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:52:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-z6pc7\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:23Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:23 crc kubenswrapper[4919]: I0310 21:52:23.763017 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"41f60ccf-532c-42dd-85d3-5cf02206caeb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff32e19d9357f72af1234677cdf04c43d15fcbb5af4faeae6db0aa9fca7e8ef5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mou
ntPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41e31e0bed022ff2ce3f4869359dfe2d6a25c0039704f26ef5c9b4be0da5b9fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eed85c384d715caea3fe992e40d62b467c4893c865d792f798254701b15735fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29a0037afef2a95ea444616a0
317a23a27f5e093c2083b8e13c6ebcec7cb26f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29a0037afef2a95ea444616a0317a23a27f5e093c2083b8e13c6ebcec7cb26f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:14Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:50:13Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:23Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:23 crc kubenswrapper[4919]: I0310 21:52:23.780295 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac8c9a9627b63f7b2a9c80571ca8f781eec442b1fd148631fa417b2e11943437\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18a390bbc216535df32a4dab5fb983494134d2e9f87a689ea39d3e32592ec663\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:23Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:23 crc kubenswrapper[4919]: I0310 21:52:23.794531 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"566678d1-f416-4116-ab20-b30dceb86cdc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://603ee76064368a216672f45eb860628d301968c311e0bc75b9a73c01f351c9c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrhvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b645dc541f9bef5d9710345252c2ff48e91412f
10d1c0c1bfaa06cf9e82210f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrhvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-z7v4t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:23Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:23 crc kubenswrapper[4919]: I0310 21:52:23.810849 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hbw8v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a5db7c3-2a96-4030-8c88-5d82d325b62d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abe6d0aa7236ecb1ecf10432a82e6fd0b3103606dbf07a21f54a1908c77ef697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dwtj4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hbw8v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:23Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:24 crc kubenswrapper[4919]: I0310 21:52:24.478932 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 21:52:24 crc kubenswrapper[4919]: I0310 21:52:24.478987 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 21:52:24 crc kubenswrapper[4919]: I0310 21:52:24.479086 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 21:52:24 crc kubenswrapper[4919]: E0310 21:52:24.479193 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 21:52:24 crc kubenswrapper[4919]: E0310 21:52:24.479326 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 21:52:24 crc kubenswrapper[4919]: E0310 21:52:24.479451 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 21:52:25 crc kubenswrapper[4919]: I0310 21:52:25.479706 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ckwhl" Mar 10 21:52:25 crc kubenswrapper[4919]: E0310 21:52:25.479939 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ckwhl" podUID="a95e8b73-ffed-4248-b8ba-99fc7c5b900f" Mar 10 21:52:26 crc kubenswrapper[4919]: I0310 21:52:26.479608 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 21:52:26 crc kubenswrapper[4919]: I0310 21:52:26.479704 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 21:52:26 crc kubenswrapper[4919]: I0310 21:52:26.480038 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 21:52:26 crc kubenswrapper[4919]: E0310 21:52:26.480255 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 21:52:26 crc kubenswrapper[4919]: E0310 21:52:26.480284 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 21:52:26 crc kubenswrapper[4919]: E0310 21:52:26.480573 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 21:52:26 crc kubenswrapper[4919]: I0310 21:52:26.480667 4919 scope.go:117] "RemoveContainer" containerID="47d3324e3001c164e3ec82718c91df17d64d84bbd2a08159c3bcacc8518a18f9" Mar 10 21:52:27 crc kubenswrapper[4919]: I0310 21:52:27.083644 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4dp67_a2e7c6fb-9e33-441d-9197-719929eb9e21/ovnkube-controller/1.log" Mar 10 21:52:27 crc kubenswrapper[4919]: I0310 21:52:27.086008 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4dp67" event={"ID":"a2e7c6fb-9e33-441d-9197-719929eb9e21","Type":"ContainerStarted","Data":"48b981072e2d3ee5b692dec159c8bcd8cdaba247ff36084b942813175ca23afd"} Mar 10 21:52:27 crc kubenswrapper[4919]: I0310 21:52:27.086357 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-4dp67" Mar 10 
21:52:27 crc kubenswrapper[4919]: I0310 21:52:27.101929 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:27Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:27 crc kubenswrapper[4919]: I0310 21:52:27.116066 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:27Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:27 crc kubenswrapper[4919]: I0310 21:52:27.135517 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-z6pc7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ace27ab-c4c7-412b-9ae8-a3e4ff15faec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://607c25e23101a157124cb81f984fac6d36e71a08b7d990e1d11627f3a7de24b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5636b870a2c5e13abd72ff1e7c785363d11540245c14af6f5de75a0b400a6aa\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5636b870a2c5e13abd72ff1e7c785363d11540245c14af6f5de75a0b400a6aa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7cd0e8267d82b65679954a7b935e138a06a6634311569a48268ba5917d6cb7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7cd0e8267d82b65679954a7b935e138a06a6634311569a48268ba5917d6cb7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:52:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:00Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4b4f2cd55ec546c6f44490159ec5b303d67b584fce7afb4b6fe3dd93e95d843\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4b4f2cd55ec546c6f44490159ec5b303d67b584fce7afb4b6fe3dd93e95d843\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:52:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cebcf
235b45b59a23c89f44855f381557a49fc94ec9332c699274507cda4cd12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cebcf235b45b59a23c89f44855f381557a49fc94ec9332c699274507cda4cd12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:52:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b01604ec0336bae6aa21d9b124596bb46f6351b67c413f2fade87febd0887a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b01604ec0336bae6aa21d9b124596bb46f6351b67c413f2fade87febd0887a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:52:03Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3c386dab63ce40e10fc9e5617d1e0c1e87eee2a1148cc64d5faa74542014155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3c386dab63ce40e10fc9e5617d1e0c1e87eee2a1148cc64d5faa74542014155\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:52:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-z6pc7\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:27Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:27 crc kubenswrapper[4919]: I0310 21:52:27.148588 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zv56q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c57707d-1414-4a4a-ac8a-0fadb2fbe5f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4d38859cd120bfb7307a52fd56c1b53490e57164b68da1811475e1046de690a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bm5jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac5ed6a131689d58c1d7867655d836f574591f3bc397d05858cbcfb9748c5107\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bm5jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:52:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zv56q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-03-10T21:52:27Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:27 crc kubenswrapper[4919]: I0310 21:52:27.164102 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"41f60ccf-532c-42dd-85d3-5cf02206caeb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff32e19d9357f72af1234677cdf04c43d15fcbb5af4faeae6db0aa9fca7e8ef5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41e31e0bed
022ff2ce3f4869359dfe2d6a25c0039704f26ef5c9b4be0da5b9fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eed85c384d715caea3fe992e40d62b467c4893c865d792f798254701b15735fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29a0037afef2a95ea444616a0317a23a27f5e093c2083b8e13c6ebcec7cb26f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29a0037afef2a95ea444616a0317a23a27f5e093c2083b8e13c6ebcec7cb26f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:14Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:50:13Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:27Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:27 crc kubenswrapper[4919]: I0310 21:52:27.179010 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac8c9a9627b63f7b2a9c80571ca8f781eec442b1fd148631fa417b2e11943437\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18a390bbc216535df32a4dab5fb983494134d2e9f87a689ea39d3e32592ec663\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:27Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:27 crc kubenswrapper[4919]: I0310 21:52:27.192459 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"566678d1-f416-4116-ab20-b30dceb86cdc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://603ee76064368a216672f45eb860628d301968c311e0bc75b9a73c01f351c9c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrhvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b645dc541f9bef5d9710345252c2ff48e91412f
10d1c0c1bfaa06cf9e82210f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrhvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-z7v4t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:27Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:27 crc kubenswrapper[4919]: I0310 21:52:27.213502 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hbw8v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a5db7c3-2a96-4030-8c88-5d82d325b62d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abe6d0aa7236ecb1ecf10432a82e6fd0b3103606dbf07a21f54a1908c77ef697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dwtj4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hbw8v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:27Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:27 crc kubenswrapper[4919]: I0310 21:52:27.236354 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4dp67" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2e7c6fb-9e33-441d-9197-719929eb9e21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad5abb4f4c3ead1b05c1f3efbb6bb59fc71bcf5796e16a558ef7e16375571fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a8f975d3caae031d32429c7fdf67356ff18bae1f0eefa3b99758d6c461a736\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4dd430eed1c68d77dcea63fe5885eec7aae3fee6a0008bd9e145b93aef04aa26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://648a881496440d2732bd49969e863fcde223e95156c280e7699782a5521f1580\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06e9ce48fc10caa34e8b774ce3039117386ffe276cf268d197ac46518a4baf91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9fa1c50eff4c6bb37ea60631a4278c6a15d6398d6fad6fb1a914bd6cccf15b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48b981072e2d3ee5b692dec159c8bcd8cdaba247ff36084b942813175ca23afd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://47d3324e3001c164e3ec82718c91df17d64d84bbd2a08159c3bcacc8518a18f9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T21:52:10Z\\\",\\\"message\\\":\\\"pointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0310 21:52:10.085044 6945 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0310 21:52:10.085924 6945 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0310 21:52:10.085952 6945 handler.go:190] 
Sending *v1.Node event handler 2 for removal\\\\nI0310 21:52:10.085958 6945 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0310 21:52:10.085978 6945 handler.go:208] Removed *v1.Node event handler 7\\\\nI0310 21:52:10.085991 6945 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0310 21:52:10.085999 6945 handler.go:208] Removed *v1.Node event handler 2\\\\nI0310 21:52:10.086373 6945 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0310 21:52:10.086607 6945 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0310 21:52:10.087427 6945 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0310 21:52:10.087806 6945 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0310 21:52:10.087867 6945 factory.go:656] Stopping watch factory\\\\nI0310 21:52:10.087885 6945 handler.go:208] Removed *v1.EgressFirewall 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:09Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81583105305b81fde3bde968fbbd5463ad1100d8f444b22f62764c4676b95325\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ca4b1de4425e9aa53f9cf24a56759708297b223771278a445dcf171f4dd6493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ca4b1de4425e9aa53f9cf24a56759708297b223771278a445dcf171f4dd6493\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4dp67\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:27Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:27 crc kubenswrapper[4919]: I0310 21:52:27.259034 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ckwhl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a95e8b73-ffed-4248-b8ba-99fc7c5b900f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnqnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnqnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:52:11Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ckwhl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:27Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:27 crc 
kubenswrapper[4919]: I0310 21:52:27.270501 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed0693f5-4dbc-4621-9cf6-450d64aaea59\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f338ebc5fc228d07415015c51f7ed4fcc24d5bf76a644e491b5c4b9dc51b71f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\
":\\\"cri-o://a5205e89649c7c1e920c967873bb1a404548b539605a180b02a1c249ff499e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5205e89649c7c1e920c967873bb1a404548b539605a180b02a1c249ff499e32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:50:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:27Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:27 crc kubenswrapper[4919]: I0310 21:52:27.294548 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5163ae00-7b50-497d-9770-0d787026b436\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://322af039c736bcf0b853ee5527ebb1b1750484dfab074745abcd75c24fdcccbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bfa2696bd9b6e5d247686e5297b6ae2f49e5b216174391f211cb2a3a4966135\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c23aaef6f076ab2a428323d38fac48e0c55ad52c55a46c942bccad06474fd31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a88714c27822fd18dff500c973b9d548414d59c7666de938e3cb0c6b18e277e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67063544d268ca488af7ae401113e6f35bb48688e50f944cfa03360de376611a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc6d32c621a9826f8539d4c1806a83f694eaa8874384bd615c2fa1cee02d2e98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc6d32c621a9826f8539d4c1806a83f694eaa8874384bd615c2fa1cee02d2e98\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T21:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8aa3a4aef61b7421411f28beecceab476e43c52eb51ac2a257399b01e4c2bcb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8aa3a4aef61b7421411f28beecceab476e43c52eb51ac2a257399b01e4c2bcb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2126875e4bacb36f72de8a3a8bd7c799b89e7691a83fd39d7645a97e95db91b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2126875e4bacb36f72de8a3a8bd7c799b89e7691a83fd39d7645a97e95db91b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:50:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:27Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:27 crc kubenswrapper[4919]: I0310 21:52:27.308797 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9ed1501-15da-4419-aa12-171e610438d6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9e6a8efa1e2d16b45fe6362b326e3f89333864dc74f3b298d2e500a90d303b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37d8507fd02b92972ed41aa2c4d53fceb1c9d58864e46ddc7991f94fb4d9b3e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfb03c5f450790952fc7173bc2a6d723c777921f5f74963bfdbc3573ec1d21cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce192a4f3e94d00998fbfe0948a32765574a9261d22004480dfb54b9bbf9407a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b2adcae0bf01d646b97b915a28921ad4151ca62e8bdd174b42b5e3dff4b27db\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T21:51:14Z\\\",\\\"message\\\":\\\"file observer\\\\nW0310 21:51:14.180982 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 21:51:14.181120 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 21:51:14.182146 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-915015507/tls.crt::/tmp/serving-cert-915015507/tls.key\\\\\\\"\\\\nI0310 21:51:14.490188 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 21:51:14.496972 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 21:51:14.497009 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 21:51:14.497047 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 21:51:14.497058 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 21:51:14.505682 1 
secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 21:51:14.505718 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 21:51:14.505729 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 21:51:14.505740 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0310 21:51:14.505737 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 21:51:14.505748 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 21:51:14.505777 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 21:51:14.505783 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 21:51:14.508219 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T21:51:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47a772db349df6c0c6fe27be93d19e02d66cfaf9739ee12e89730ece1da11473\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15b81f8be7635a10cf516cd17c9ab3d48b74d3421098124fcc47dbab6691e3c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15b81f8be7635a10cf516cd17c9ab3d48b74d3421098124fcc47dbab6691e3c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:50:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:27Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:27 crc kubenswrapper[4919]: I0310 21:52:27.321956 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:27Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:27 crc kubenswrapper[4919]: I0310 21:52:27.336503 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f95c272d21026474ba17da6abc519f0cc1874dbdded3e089a107b23cdd20fa6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T21:52:27Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:27 crc kubenswrapper[4919]: I0310 21:52:27.349366 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hzq7c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e92ce303-b70d-4416-b8f1-520b49dca2e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cc1a7ce601001a487303cfae1cef980407a59cc27d02d4f3c4a303b7668639f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-qw7c6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hzq7c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:27Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:27 crc kubenswrapper[4919]: I0310 21:52:27.365617 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://937344f02f0259b0d258de35d490545dad0ce084dd49c7584002da0734cc046e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be742
1a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:27Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:27 crc kubenswrapper[4919]: I0310 21:52:27.377034 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b625p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b82448f1-4387-4d1a-a300-29f4b3d86bbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5edef5b10597e404ec5599983d07529ee344e7d7f5b1a4a7f589678613034b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s9q9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:52:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b625p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:27Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:27 crc kubenswrapper[4919]: I0310 21:52:27.479901 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ckwhl" Mar 10 21:52:27 crc kubenswrapper[4919]: E0310 21:52:27.480109 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ckwhl" podUID="a95e8b73-ffed-4248-b8ba-99fc7c5b900f" Mar 10 21:52:27 crc kubenswrapper[4919]: I0310 21:52:27.585111 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a95e8b73-ffed-4248-b8ba-99fc7c5b900f-metrics-certs\") pod \"network-metrics-daemon-ckwhl\" (UID: \"a95e8b73-ffed-4248-b8ba-99fc7c5b900f\") " pod="openshift-multus/network-metrics-daemon-ckwhl" Mar 10 21:52:27 crc kubenswrapper[4919]: E0310 21:52:27.585249 4919 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 21:52:27 crc kubenswrapper[4919]: E0310 21:52:27.585300 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a95e8b73-ffed-4248-b8ba-99fc7c5b900f-metrics-certs podName:a95e8b73-ffed-4248-b8ba-99fc7c5b900f nodeName:}" failed. No retries permitted until 2026-03-10 21:52:43.58528685 +0000 UTC m=+150.827167458 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a95e8b73-ffed-4248-b8ba-99fc7c5b900f-metrics-certs") pod "network-metrics-daemon-ckwhl" (UID: "a95e8b73-ffed-4248-b8ba-99fc7c5b900f") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 21:52:28 crc kubenswrapper[4919]: I0310 21:52:28.092344 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4dp67_a2e7c6fb-9e33-441d-9197-719929eb9e21/ovnkube-controller/2.log" Mar 10 21:52:28 crc kubenswrapper[4919]: I0310 21:52:28.093529 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4dp67_a2e7c6fb-9e33-441d-9197-719929eb9e21/ovnkube-controller/1.log" Mar 10 21:52:28 crc kubenswrapper[4919]: I0310 21:52:28.098081 4919 generic.go:334] "Generic (PLEG): container finished" podID="a2e7c6fb-9e33-441d-9197-719929eb9e21" containerID="48b981072e2d3ee5b692dec159c8bcd8cdaba247ff36084b942813175ca23afd" exitCode=1 Mar 10 21:52:28 crc kubenswrapper[4919]: I0310 21:52:28.098139 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4dp67" event={"ID":"a2e7c6fb-9e33-441d-9197-719929eb9e21","Type":"ContainerDied","Data":"48b981072e2d3ee5b692dec159c8bcd8cdaba247ff36084b942813175ca23afd"} Mar 10 21:52:28 crc kubenswrapper[4919]: I0310 21:52:28.098197 4919 scope.go:117] "RemoveContainer" containerID="47d3324e3001c164e3ec82718c91df17d64d84bbd2a08159c3bcacc8518a18f9" Mar 10 21:52:28 crc kubenswrapper[4919]: I0310 21:52:28.099198 4919 scope.go:117] "RemoveContainer" containerID="48b981072e2d3ee5b692dec159c8bcd8cdaba247ff36084b942813175ca23afd" Mar 10 21:52:28 crc kubenswrapper[4919]: E0310 21:52:28.099477 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4dp67_openshift-ovn-kubernetes(a2e7c6fb-9e33-441d-9197-719929eb9e21)\"" pod="openshift-ovn-kubernetes/ovnkube-node-4dp67" podUID="a2e7c6fb-9e33-441d-9197-719929eb9e21" Mar 10 21:52:28 crc kubenswrapper[4919]: I0310 21:52:28.122505 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://937344f02f0259b0d258de35d490545dad0ce084dd49c7584002da0734cc046e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccoun
t\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:28Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:28 crc kubenswrapper[4919]: I0310 21:52:28.140270 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b625p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b82448f1-4387-4d1a-a300-29f4b3d86bbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5edef5b10597e404ec5599983d07529ee344e7d7f5b1a4a7f589678613034b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa2
9d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s9q9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:52:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b625p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:28Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:28 crc kubenswrapper[4919]: I0310 21:52:28.171856 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-z6pc7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ace27ab-c4c7-412b-9ae8-a3e4ff15faec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://607c25e23101a157124cb81f984fac6d36e71a08b7d990e1d11627f3a7de24b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5636b870a2c5e13abd72ff1e7c785363d11540245c14af6f5de75a0b400a6aa\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5636b870a2c5e13abd72ff1e7c785363d11540245c14af6f5de75a0b400a6aa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7cd0e8267d82b65679954a7b935e138a06a6634311569a48268ba5917d6cb7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7cd0e8267d82b65679954a7b935e138a06a6634311569a48268ba5917d6cb7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:52:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:00Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4b4f2cd55ec546c6f44490159ec5b303d67b584fce7afb4b6fe3dd93e95d843\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4b4f2cd55ec546c6f44490159ec5b303d67b584fce7afb4b6fe3dd93e95d843\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:52:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cebcf
235b45b59a23c89f44855f381557a49fc94ec9332c699274507cda4cd12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cebcf235b45b59a23c89f44855f381557a49fc94ec9332c699274507cda4cd12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:52:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b01604ec0336bae6aa21d9b124596bb46f6351b67c413f2fade87febd0887a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b01604ec0336bae6aa21d9b124596bb46f6351b67c413f2fade87febd0887a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:52:03Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3c386dab63ce40e10fc9e5617d1e0c1e87eee2a1148cc64d5faa74542014155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3c386dab63ce40e10fc9e5617d1e0c1e87eee2a1148cc64d5faa74542014155\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:52:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-z6pc7\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:28Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:28 crc kubenswrapper[4919]: I0310 21:52:28.191786 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zv56q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c57707d-1414-4a4a-ac8a-0fadb2fbe5f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4d38859cd120bfb7307a52fd56c1b53490e57164b68da1811475e1046de690a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bm5jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac5ed6a131689d58c1d7867655d836f574591f3bc397d05858cbcfb9748c5107\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bm5jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:52:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zv56q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-03-10T21:52:28Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:28 crc kubenswrapper[4919]: I0310 21:52:28.212129 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:28Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:28 crc kubenswrapper[4919]: I0310 21:52:28.231128 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:28Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:28 crc kubenswrapper[4919]: I0310 21:52:28.250697 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hbw8v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a5db7c3-2a96-4030-8c88-5d82d325b62d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abe6d0aa7236ecb1ecf10432a82e6fd0b3103606dbf07a21f54a1908c77ef697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dwtj4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hbw8v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:28Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:28 crc kubenswrapper[4919]: I0310 21:52:28.269541 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"41f60ccf-532c-42dd-85d3-5cf02206caeb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff32e19d9357f72af1234677cdf04c43d15fcbb5af4faeae6db0aa9fca7e8ef5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41e31e0bed022ff2ce3f4869359dfe2d6a25c0039704f26ef5c9b4be0da5b9fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eed85c384d715caea3fe992e40d62b467c4893c865d792f798254701b15735fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29a0037afef2a95ea444616a0317a23a27f5e093c2083b8e13c6ebcec7cb26f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\
\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29a0037afef2a95ea444616a0317a23a27f5e093c2083b8e13c6ebcec7cb26f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:14Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:50:13Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:28Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:28 crc kubenswrapper[4919]: I0310 21:52:28.289223 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac8c9a9627b63f7b2a9c80571ca8f781eec442b1fd148631fa417b2e11943437\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18a390bbc216535df32a4dab5fb983494134d2e9f87a689ea39d3e32592ec663\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:28Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:28 crc kubenswrapper[4919]: I0310 21:52:28.305272 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"566678d1-f416-4116-ab20-b30dceb86cdc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://603ee76064368a216672f45eb860628d301968c311e0bc75b9a73c01f351c9c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrhvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b645dc541f9bef5d9710345252c2ff48e91412f
10d1c0c1bfaa06cf9e82210f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrhvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-z7v4t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:28Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:28 crc kubenswrapper[4919]: I0310 21:52:28.324046 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:28Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:28 crc kubenswrapper[4919]: I0310 21:52:28.342049 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f95c272d21026474ba17da6abc519f0cc1874dbdded3e089a107b23cdd20fa6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T21:52:28Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:28 crc kubenswrapper[4919]: I0310 21:52:28.359572 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hzq7c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e92ce303-b70d-4416-b8f1-520b49dca2e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cc1a7ce601001a487303cfae1cef980407a59cc27d02d4f3c4a303b7668639f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-qw7c6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hzq7c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:28Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:28 crc kubenswrapper[4919]: I0310 21:52:28.389377 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4dp67" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2e7c6fb-9e33-441d-9197-719929eb9e21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad5abb4f4c3ead1b05c1f3efbb6bb59fc71bcf5796e16a558ef7e16375571fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a8f975d3caae031d32429c7fdf67356ff18bae1f0eefa3b99758d6c461a736\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4dd430eed1c68d77dcea63fe5885eec7aae3fee6a0008bd9e145b93aef04aa26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://648a881496440d2732bd49969e863fcde223e95156c280e7699782a5521f1580\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06e9ce48fc10caa34e8b774ce3039117386ffe276cf268d197ac46518a4baf91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9fa1c50eff4c6bb37ea60631a4278c6a15d6398d6fad6fb1a914bd6cccf15b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48b981072e2d3ee5b692dec159c8bcd8cdaba247ff36084b942813175ca23afd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://47d3324e3001c164e3ec82718c91df17d64d84bbd2a08159c3bcacc8518a18f9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T21:52:10Z\\\",\\\"message\\\":\\\"pointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0310 21:52:10.085044 6945 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0310 21:52:10.085924 6945 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0310 21:52:10.085952 6945 handler.go:190] 
Sending *v1.Node event handler 2 for removal\\\\nI0310 21:52:10.085958 6945 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0310 21:52:10.085978 6945 handler.go:208] Removed *v1.Node event handler 7\\\\nI0310 21:52:10.085991 6945 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0310 21:52:10.085999 6945 handler.go:208] Removed *v1.Node event handler 2\\\\nI0310 21:52:10.086373 6945 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0310 21:52:10.086607 6945 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0310 21:52:10.087427 6945 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0310 21:52:10.087806 6945 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0310 21:52:10.087867 6945 factory.go:656] Stopping watch factory\\\\nI0310 21:52:10.087885 6945 handler.go:208] Removed *v1.EgressFirewall ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:09Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48b981072e2d3ee5b692dec159c8bcd8cdaba247ff36084b942813175ca23afd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T21:52:27Z\\\",\\\"message\\\":\\\"g-operator-metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"78f6184b-c7cf-436d-8cbb-4b31f8af75e8\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-operator-lifecycle-manager/catalog-operator-metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, 
Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-operator-lifecycle-manager/catalog-operator-metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-operator-lifecycle-manager/catalog-operator-metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.204\\\\\\\", Port:8443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF0310 21:52:27.442701 7191 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81583105305b81fde3bde968fbbd5463ad1100d8f444b22f62764c4676b95325\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ca4b1de4425e9aa53f9cf24a56759708297b223771278a445dcf171f4dd6493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ca4b1de4425e9aa53f9cf24a56759708297b223771278a445dcf171f4dd6
493\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4dp67\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:28Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:28 crc kubenswrapper[4919]: I0310 21:52:28.405565 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ckwhl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a95e8b73-ffed-4248-b8ba-99fc7c5b900f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnqnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnqnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:52:11Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ckwhl\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:28Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:28 crc kubenswrapper[4919]: I0310 21:52:28.420834 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed0693f5-4dbc-4621-9cf6-450d64aaea59\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f338ebc5fc228d07415015c51f7ed4fcc24d5bf76a644e491b5c4b9dc51b71f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPat
h\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5205e89649c7c1e920c967873bb1a404548b539605a180b02a1c249ff499e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5205e89649c7c1e920c967873bb1a404548b539605a180b02a1c249ff499e32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:50:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:28Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:28 crc kubenswrapper[4919]: I0310 21:52:28.453536 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5163ae00-7b50-497d-9770-0d787026b436\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://322af039c736bcf0b853ee5527ebb1b1750484dfab074745abcd75c24fdcccbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bfa2696bd9b6e5d247686e5297b6ae2f49e5b216174391f211cb2a3a4966135\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c23aaef6f076ab2a428323d38fac48e0c55ad52c55a46c942bccad06474fd31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a88714c27822fd18dff500c973b9d548414d59c7666de938e3cb0c6b18e277e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67063544d268ca488af7ae401113e6f35bb48688e50f944cfa03360de376611a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc6d32c621a9826f8539d4c1806a83f694eaa8874384bd615c2fa1cee02d2e98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc6d32c621a9826f8539d4c1806a83f694eaa8874384bd615c2fa1cee02d2e98\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T21:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8aa3a4aef61b7421411f28beecceab476e43c52eb51ac2a257399b01e4c2bcb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8aa3a4aef61b7421411f28beecceab476e43c52eb51ac2a257399b01e4c2bcb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2126875e4bacb36f72de8a3a8bd7c799b89e7691a83fd39d7645a97e95db91b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2126875e4bacb36f72de8a3a8bd7c799b89e7691a83fd39d7645a97e95db91b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:50:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:28Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:28 crc kubenswrapper[4919]: I0310 21:52:28.475274 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9ed1501-15da-4419-aa12-171e610438d6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9e6a8efa1e2d16b45fe6362b326e3f89333864dc74f3b298d2e500a90d303b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37d8507fd02b92972ed41aa2c4d53fceb1c9d58864e46ddc7991f94fb4d9b3e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfb03c5f450790952fc7173bc2a6d723c777921f5f74963bfdbc3573ec1d21cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce192a4f3e94d00998fbfe0948a32765574a9261d22004480dfb54b9bbf9407a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b2adcae0bf01d646b97b915a28921ad4151ca62e8bdd174b42b5e3dff4b27db\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T21:51:14Z\\\",\\\"message\\\":\\\"file observer\\\\nW0310 21:51:14.180982 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 21:51:14.181120 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 21:51:14.182146 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-915015507/tls.crt::/tmp/serving-cert-915015507/tls.key\\\\\\\"\\\\nI0310 21:51:14.490188 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 21:51:14.496972 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 21:51:14.497009 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 21:51:14.497047 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 21:51:14.497058 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 21:51:14.505682 1 
secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 21:51:14.505718 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 21:51:14.505729 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 21:51:14.505740 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0310 21:51:14.505737 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 21:51:14.505748 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 21:51:14.505777 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 21:51:14.505783 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 21:51:14.508219 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T21:51:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47a772db349df6c0c6fe27be93d19e02d66cfaf9739ee12e89730ece1da11473\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15b81f8be7635a10cf516cd17c9ab3d48b74d3421098124fcc47dbab6691e3c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15b81f8be7635a10cf516cd17c9ab3d48b74d3421098124fcc47dbab6691e3c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:50:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:28Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:28 crc kubenswrapper[4919]: I0310 21:52:28.479345 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 21:52:28 crc kubenswrapper[4919]: I0310 21:52:28.479374 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 21:52:28 crc kubenswrapper[4919]: I0310 21:52:28.479654 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 21:52:28 crc kubenswrapper[4919]: E0310 21:52:28.479821 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 21:52:28 crc kubenswrapper[4919]: E0310 21:52:28.479990 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 21:52:28 crc kubenswrapper[4919]: E0310 21:52:28.480142 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 21:52:28 crc kubenswrapper[4919]: I0310 21:52:28.493308 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Mar 10 21:52:28 crc kubenswrapper[4919]: E0310 21:52:28.583425 4919 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 10 21:52:29 crc kubenswrapper[4919]: I0310 21:52:29.105122 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4dp67_a2e7c6fb-9e33-441d-9197-719929eb9e21/ovnkube-controller/2.log" Mar 10 21:52:29 crc kubenswrapper[4919]: I0310 21:52:29.111614 4919 scope.go:117] "RemoveContainer" containerID="48b981072e2d3ee5b692dec159c8bcd8cdaba247ff36084b942813175ca23afd" Mar 10 21:52:29 crc kubenswrapper[4919]: E0310 21:52:29.111941 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-4dp67_openshift-ovn-kubernetes(a2e7c6fb-9e33-441d-9197-719929eb9e21)\"" pod="openshift-ovn-kubernetes/ovnkube-node-4dp67" podUID="a2e7c6fb-9e33-441d-9197-719929eb9e21" Mar 10 21:52:29 crc kubenswrapper[4919]: I0310 21:52:29.132778 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9ed1501-15da-4419-aa12-171e610438d6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9e6a8efa1e2d16b45fe6362b326e3f89333864dc74f3b298d2e500a90d303b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37d8507fd02b92972ed41aa2c4d53fceb1c9d58864e46ddc7991f94fb4d9b3e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfb03c5f450790952fc7173bc2a6d723c777921f5f74963bfdbc3573ec1d21cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce192a4f3e94d00998fbfe0948a32765574a9261d22004480dfb54b9bbf9407a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b2adcae0bf01d646b97b915a28921ad4151ca62e8bdd174b42b5e3dff4b27db\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T21:51:14Z\\\"
,\\\"message\\\":\\\"file observer\\\\nW0310 21:51:14.180982 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 21:51:14.181120 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 21:51:14.182146 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-915015507/tls.crt::/tmp/serving-cert-915015507/tls.key\\\\\\\"\\\\nI0310 21:51:14.490188 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 21:51:14.496972 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 21:51:14.497009 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 21:51:14.497047 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 21:51:14.497058 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 21:51:14.505682 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 21:51:14.505718 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 21:51:14.505729 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 21:51:14.505740 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0310 21:51:14.505737 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 21:51:14.505748 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 21:51:14.505777 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 21:51:14.505783 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 21:51:14.508219 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T21:51:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47a772db349df6c0c6fe27be93d19e02d66cfaf9739ee12e89730ece1da11473\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15b81f8be7635a10cf516cd17c9ab3d48b74d3421098124fcc47dbab6691e3c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15b81f8be7635a10cf516cd17c9ab3d48b7
4d3421098124fcc47dbab6691e3c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:50:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:29Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:29 crc kubenswrapper[4919]: I0310 21:52:29.152461 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:29Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:29 crc kubenswrapper[4919]: I0310 21:52:29.170576 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f95c272d21026474ba17da6abc519f0cc1874dbdded3e089a107b23cdd20fa6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T21:52:29Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:29 crc kubenswrapper[4919]: I0310 21:52:29.186850 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hzq7c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e92ce303-b70d-4416-b8f1-520b49dca2e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cc1a7ce601001a487303cfae1cef980407a59cc27d02d4f3c4a303b7668639f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-qw7c6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hzq7c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:29Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:29 crc kubenswrapper[4919]: I0310 21:52:29.222901 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4dp67" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2e7c6fb-9e33-441d-9197-719929eb9e21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad5abb4f4c3ead1b05c1f3efbb6bb59fc71bcf5796e16a558ef7e16375571fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a8f975d3caae031d32429c7fdf67356ff18bae1f0eefa3b99758d6c461a736\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4dd430eed1c68d77dcea63fe5885eec7aae3fee6a0008bd9e145b93aef04aa26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://648a881496440d2732bd49969e863fcde223e95156c280e7699782a5521f1580\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06e9ce48fc10caa34e8b774ce3039117386ffe276cf268d197ac46518a4baf91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9fa1c50eff4c6bb37ea60631a4278c6a15d6398d6fad6fb1a914bd6cccf15b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48b981072e2d3ee5b692dec159c8bcd8cdaba247ff36084b942813175ca23afd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48b981072e2d3ee5b692dec159c8bcd8cdaba247ff36084b942813175ca23afd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T21:52:27Z\\\",\\\"message\\\":\\\"g-operator-metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"78f6184b-c7cf-436d-8cbb-4b31f8af75e8\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-operator-lifecycle-manager/catalog-operator-metrics\\\\\\\"}, 
Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-operator-lifecycle-manager/catalog-operator-metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-operator-lifecycle-manager/catalog-operator-metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.204\\\\\\\", Port:8443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF0310 21:52:27.442701 7191 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:26Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4dp67_openshift-ovn-kubernetes(a2e7c6fb-9e33-441d-9197-719929eb9e21)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81583105305b81fde3bde968fbbd5463ad1100d8f444b22f62764c4676b95325\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ca4b1de4425e9aa53f9cf24a56759708297b223771278a445dcf171f4dd6493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ca4b1de4425e9aa53
f9cf24a56759708297b223771278a445dcf171f4dd6493\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4dp67\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:29Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:29 crc kubenswrapper[4919]: I0310 21:52:29.239314 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ckwhl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a95e8b73-ffed-4248-b8ba-99fc7c5b900f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:11Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnqnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnqnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:52:11Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ckwhl\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:29Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:29 crc kubenswrapper[4919]: I0310 21:52:29.255705 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed0693f5-4dbc-4621-9cf6-450d64aaea59\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f338ebc5fc228d07415015c51f7ed4fcc24d5bf76a644e491b5c4b9dc51b71f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\
\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5205e89649c7c1e920c967873bb1a404548b539605a180b02a1c249ff499e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5205e89649c7c1e920c967873bb1a404548b539605a180b02a1c249ff499e32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:50:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:29Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:29 crc kubenswrapper[4919]: I0310 21:52:29.287628 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5163ae00-7b50-497d-9770-0d787026b436\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://322af039c736bcf0b853ee5527ebb1b1750484dfab074745abcd75c24fdcccbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bfa2696bd9b6e5d247686e5297b6ae2f49e5b216174391f211cb2a3a4966135\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c23aaef6f076ab2a428323d38fac48e0c55ad52c55a46c942bccad06474fd31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a88714c27822fd18dff500c973b9d548414d59c7666de938e3cb0c6b18e277e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67063544d268ca488af7ae401113e6f35bb48688e50f944cfa03360de376611a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc6d32c621a9826f8539d4c1806a83f694eaa8874384bd615c2fa1cee02d2e98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc6d32c621a9826f8539d4c1806a83f694eaa8874384bd615c2fa1cee02d2e98\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T21:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8aa3a4aef61b7421411f28beecceab476e43c52eb51ac2a257399b01e4c2bcb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8aa3a4aef61b7421411f28beecceab476e43c52eb51ac2a257399b01e4c2bcb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2126875e4bacb36f72de8a3a8bd7c799b89e7691a83fd39d7645a97e95db91b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2126875e4bacb36f72de8a3a8bd7c799b89e7691a83fd39d7645a97e95db91b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:50:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:29Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:29 crc kubenswrapper[4919]: I0310 21:52:29.309066 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://937344f02f0259b0d258de35d490545dad0ce084dd49c7584002da0734cc046e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:29Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:29 crc kubenswrapper[4919]: I0310 21:52:29.326144 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b625p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b82448f1-4387-4d1a-a300-29f4b3d86bbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5edef5b10597e404ec5599983d07529ee344e7d7f5b1a4a7f589678613034b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s9q9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:52:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b625p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:29Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:29 crc kubenswrapper[4919]: I0310 21:52:29.345484 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:29Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:29 crc kubenswrapper[4919]: I0310 21:52:29.369856 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-z6pc7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ace27ab-c4c7-412b-9ae8-a3e4ff15faec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://607c25e23101a157124cb81f984fac6d36e71a08b7d990e1d11627f3a7de24b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5636b870a2c5e13abd72ff1e7c785363d11540245c14af6f5de75a0b400a6aa\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5636b870a2c5e13abd72ff1e7c785363d11540245c14af6f5de75a0b400a6aa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7cd0e8267d82b65679954a7b935e138a06a6634311569a48268ba5917d6cb7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7cd0e8267d82b65679954a7b935e138a06a6634311569a48268ba5917d6cb7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:52:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:00Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4b4f2cd55ec546c6f44490159ec5b303d67b584fce7afb4b6fe3dd93e95d843\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4b4f2cd55ec546c6f44490159ec5b303d67b584fce7afb4b6fe3dd93e95d843\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:52:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cebcf
235b45b59a23c89f44855f381557a49fc94ec9332c699274507cda4cd12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cebcf235b45b59a23c89f44855f381557a49fc94ec9332c699274507cda4cd12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:52:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b01604ec0336bae6aa21d9b124596bb46f6351b67c413f2fade87febd0887a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b01604ec0336bae6aa21d9b124596bb46f6351b67c413f2fade87febd0887a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:52:03Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3c386dab63ce40e10fc9e5617d1e0c1e87eee2a1148cc64d5faa74542014155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3c386dab63ce40e10fc9e5617d1e0c1e87eee2a1148cc64d5faa74542014155\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:52:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-z6pc7\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:29Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:29 crc kubenswrapper[4919]: I0310 21:52:29.387737 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zv56q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c57707d-1414-4a4a-ac8a-0fadb2fbe5f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4d38859cd120bfb7307a52fd56c1b53490e57164b68da1811475e1046de690a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bm5jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac5ed6a131689d58c1d7867655d836f574591f3bc397d05858cbcfb9748c5107\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bm5jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:52:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zv56q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-03-10T21:52:29Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:29 crc kubenswrapper[4919]: I0310 21:52:29.406934 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d33de8e-9521-40e1-8dda-051e228ca068\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25d19a7d46abf131e552151e3bbb220e3fdf0a3bdb8ff8ca7b082dcc296408c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f30803ec8ed4cbd053df2777bfe3077a7637972562508205711b357011e453dc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T21:51:10Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec 
cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0310 21:50:40.583344 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0310 21:50:40.584309 1 observer_polling.go:159] Starting file observer\\\\nI0310 21:50:40.585315 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0310 21:50:40.586027 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0310 21:51:05.043465 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0310 21:51:10.132930 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0310 21:51:10.133005 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:40Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:51:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3630a7f175a3275eff39088c20eafd059b205f0ccb36cbba2f09b77468963cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://038d23b7c75ae61b55b3b70b5b70de0ca4f3243d0b0a68f8bd221aff91c2c032\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://22b6b89388d9f9288049474f1f88faad36bcbc05564e7769c9fca8c220847efd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:50:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:29Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:29 crc kubenswrapper[4919]: I0310 21:52:29.427165 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:29Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:29 crc kubenswrapper[4919]: I0310 21:52:29.444597 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"566678d1-f416-4116-ab20-b30dceb86cdc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://603ee76064368a216672f45eb860628d301968c311e0bc75b9a73c01f351c9c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrhvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b645dc541f9bef5d9710345252c2ff48e91412f
10d1c0c1bfaa06cf9e82210f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrhvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-z7v4t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:29Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:29 crc kubenswrapper[4919]: I0310 21:52:29.464733 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hbw8v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a5db7c3-2a96-4030-8c88-5d82d325b62d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abe6d0aa7236ecb1ecf10432a82e6fd0b3103606dbf07a21f54a1908c77ef697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dwtj4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hbw8v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:29Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:29 crc kubenswrapper[4919]: I0310 21:52:29.480178 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-ckwhl" Mar 10 21:52:29 crc kubenswrapper[4919]: E0310 21:52:29.480438 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ckwhl" podUID="a95e8b73-ffed-4248-b8ba-99fc7c5b900f" Mar 10 21:52:29 crc kubenswrapper[4919]: I0310 21:52:29.483969 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"41f60ccf-532c-42dd-85d3-5cf02206caeb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff32e19d9357f72af1234677cdf04c43d15fcbb5af4faeae6db0aa9fca7e8ef5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2
597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41e31e0bed022ff2ce3f4869359dfe2d6a25c0039704f26ef5c9b4be0da5b9fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eed85c384d715caea3fe992e40d62b467c4893c865d792f798254701b15735fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\
"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29a0037afef2a95ea444616a0317a23a27f5e093c2083b8e13c6ebcec7cb26f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29a0037afef2a95ea444616a0317a23a27f5e093c2083b8e13c6ebcec7cb26f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:14Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:50:13Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:29Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:29 crc kubenswrapper[4919]: I0310 21:52:29.504494 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac8c9a9627b63f7b2a9c80571ca8f781eec442b1fd148631fa417b2e11943437\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18a390bbc216535df32a4dab5fb983494134d2e9f87a689ea39d3e32592ec663\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:29Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:30 crc kubenswrapper[4919]: I0310 21:52:30.479935 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 21:52:30 crc kubenswrapper[4919]: I0310 21:52:30.480061 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 21:52:30 crc kubenswrapper[4919]: E0310 21:52:30.480219 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 21:52:30 crc kubenswrapper[4919]: I0310 21:52:30.480551 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 21:52:30 crc kubenswrapper[4919]: E0310 21:52:30.480716 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 21:52:30 crc kubenswrapper[4919]: E0310 21:52:30.480820 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 21:52:31 crc kubenswrapper[4919]: I0310 21:52:31.479689 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ckwhl" Mar 10 21:52:31 crc kubenswrapper[4919]: E0310 21:52:31.479926 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ckwhl" podUID="a95e8b73-ffed-4248-b8ba-99fc7c5b900f" Mar 10 21:52:32 crc kubenswrapper[4919]: I0310 21:52:32.479103 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 21:52:32 crc kubenswrapper[4919]: I0310 21:52:32.479224 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 21:52:32 crc kubenswrapper[4919]: I0310 21:52:32.479098 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 21:52:32 crc kubenswrapper[4919]: E0310 21:52:32.479295 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 21:52:32 crc kubenswrapper[4919]: E0310 21:52:32.479457 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 21:52:32 crc kubenswrapper[4919]: E0310 21:52:32.479669 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 21:52:32 crc kubenswrapper[4919]: I0310 21:52:32.501903 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:52:32 crc kubenswrapper[4919]: I0310 21:52:32.502011 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:52:32 crc kubenswrapper[4919]: I0310 21:52:32.502033 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:52:32 crc kubenswrapper[4919]: I0310 21:52:32.502059 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:52:32 crc kubenswrapper[4919]: I0310 21:52:32.502077 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:52:32Z","lastTransitionTime":"2026-03-10T21:52:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:52:32 crc kubenswrapper[4919]: E0310 21:52:32.522970 4919 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:52:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:52:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:52:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:52:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c22d31cd-a51d-4524-bb69-0b454ae09e98\\\",\\\"systemUUID\\\":\\\"eb24d1fd-ecd7-423c-90f7-cacacceb5386\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:32Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:32 crc kubenswrapper[4919]: I0310 21:52:32.527859 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:52:32 crc kubenswrapper[4919]: I0310 21:52:32.527915 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:52:32 crc kubenswrapper[4919]: I0310 21:52:32.527933 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:52:32 crc kubenswrapper[4919]: I0310 21:52:32.527956 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:52:32 crc kubenswrapper[4919]: I0310 21:52:32.527974 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:52:32Z","lastTransitionTime":"2026-03-10T21:52:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:52:32 crc kubenswrapper[4919]: E0310 21:52:32.547070 4919 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:52:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:52:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:52:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:52:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c22d31cd-a51d-4524-bb69-0b454ae09e98\\\",\\\"systemUUID\\\":\\\"eb24d1fd-ecd7-423c-90f7-cacacceb5386\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:32Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:32 crc kubenswrapper[4919]: I0310 21:52:32.551821 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:52:32 crc kubenswrapper[4919]: I0310 21:52:32.551866 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:52:32 crc kubenswrapper[4919]: I0310 21:52:32.551883 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:52:32 crc kubenswrapper[4919]: I0310 21:52:32.551905 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:52:32 crc kubenswrapper[4919]: I0310 21:52:32.551921 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:52:32Z","lastTransitionTime":"2026-03-10T21:52:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:52:32 crc kubenswrapper[4919]: E0310 21:52:32.573583 4919 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:52:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:52:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:52:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:52:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c22d31cd-a51d-4524-bb69-0b454ae09e98\\\",\\\"systemUUID\\\":\\\"eb24d1fd-ecd7-423c-90f7-cacacceb5386\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:32Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:32 crc kubenswrapper[4919]: I0310 21:52:32.580184 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:52:32 crc kubenswrapper[4919]: I0310 21:52:32.580241 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:52:32 crc kubenswrapper[4919]: I0310 21:52:32.580264 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:52:32 crc kubenswrapper[4919]: I0310 21:52:32.580293 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:52:32 crc kubenswrapper[4919]: I0310 21:52:32.580318 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:52:32Z","lastTransitionTime":"2026-03-10T21:52:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:52:32 crc kubenswrapper[4919]: E0310 21:52:32.605983 4919 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:52:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:52:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:52:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:52:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c22d31cd-a51d-4524-bb69-0b454ae09e98\\\",\\\"systemUUID\\\":\\\"eb24d1fd-ecd7-423c-90f7-cacacceb5386\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:32Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:32 crc kubenswrapper[4919]: I0310 21:52:32.610922 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:52:32 crc kubenswrapper[4919]: I0310 21:52:32.610948 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:52:32 crc kubenswrapper[4919]: I0310 21:52:32.610958 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:52:32 crc kubenswrapper[4919]: I0310 21:52:32.610973 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:52:32 crc kubenswrapper[4919]: I0310 21:52:32.610984 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:52:32Z","lastTransitionTime":"2026-03-10T21:52:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:52:32 crc kubenswrapper[4919]: E0310 21:52:32.631196 4919 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:52:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:52:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:52:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:52:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c22d31cd-a51d-4524-bb69-0b454ae09e98\\\",\\\"systemUUID\\\":\\\"eb24d1fd-ecd7-423c-90f7-cacacceb5386\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:32Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:32 crc kubenswrapper[4919]: E0310 21:52:32.631354 4919 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 10 21:52:33 crc kubenswrapper[4919]: I0310 21:52:33.479361 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ckwhl" Mar 10 21:52:33 crc kubenswrapper[4919]: E0310 21:52:33.479643 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ckwhl" podUID="a95e8b73-ffed-4248-b8ba-99fc7c5b900f" Mar 10 21:52:33 crc kubenswrapper[4919]: I0310 21:52:33.501577 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:33Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:33 crc kubenswrapper[4919]: I0310 21:52:33.521601 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f95c272d21026474ba17da6abc519f0cc1874dbdded3e089a107b23cdd20fa6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T21:52:33Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:33 crc kubenswrapper[4919]: I0310 21:52:33.537705 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hzq7c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e92ce303-b70d-4416-b8f1-520b49dca2e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cc1a7ce601001a487303cfae1cef980407a59cc27d02d4f3c4a303b7668639f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-qw7c6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hzq7c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:33Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:33 crc kubenswrapper[4919]: I0310 21:52:33.569202 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4dp67" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2e7c6fb-9e33-441d-9197-719929eb9e21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad5abb4f4c3ead1b05c1f3efbb6bb59fc71bcf5796e16a558ef7e16375571fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a8f975d3caae031d32429c7fdf67356ff18bae1f0eefa3b99758d6c461a736\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4dd430eed1c68d77dcea63fe5885eec7aae3fee6a0008bd9e145b93aef04aa26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://648a881496440d2732bd49969e863fcde223e95156c280e7699782a5521f1580\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06e9ce48fc10caa34e8b774ce3039117386ffe276cf268d197ac46518a4baf91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9fa1c50eff4c6bb37ea60631a4278c6a15d6398d6fad6fb1a914bd6cccf15b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48b981072e2d3ee5b692dec159c8bcd8cdaba247ff36084b942813175ca23afd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48b981072e2d3ee5b692dec159c8bcd8cdaba247ff36084b942813175ca23afd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T21:52:27Z\\\",\\\"message\\\":\\\"g-operator-metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"78f6184b-c7cf-436d-8cbb-4b31f8af75e8\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-operator-lifecycle-manager/catalog-operator-metrics\\\\\\\"}, 
Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-operator-lifecycle-manager/catalog-operator-metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-operator-lifecycle-manager/catalog-operator-metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.204\\\\\\\", Port:8443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF0310 21:52:27.442701 7191 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:26Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4dp67_openshift-ovn-kubernetes(a2e7c6fb-9e33-441d-9197-719929eb9e21)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81583105305b81fde3bde968fbbd5463ad1100d8f444b22f62764c4676b95325\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ca4b1de4425e9aa53f9cf24a56759708297b223771278a445dcf171f4dd6493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ca4b1de4425e9aa53
f9cf24a56759708297b223771278a445dcf171f4dd6493\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4dp67\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:33Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:33 crc kubenswrapper[4919]: E0310 21:52:33.584349 4919 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 10 21:52:33 crc kubenswrapper[4919]: I0310 21:52:33.590385 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ckwhl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a95e8b73-ffed-4248-b8ba-99fc7c5b900f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnqnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnqnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:52:11Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ckwhl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:33Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:33 crc 
kubenswrapper[4919]: I0310 21:52:33.612341 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed0693f5-4dbc-4621-9cf6-450d64aaea59\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f338ebc5fc228d07415015c51f7ed4fcc24d5bf76a644e491b5c4b9dc51b71f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\
":\\\"cri-o://a5205e89649c7c1e920c967873bb1a404548b539605a180b02a1c249ff499e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5205e89649c7c1e920c967873bb1a404548b539605a180b02a1c249ff499e32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:50:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:33Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:33 crc kubenswrapper[4919]: I0310 21:52:33.648950 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5163ae00-7b50-497d-9770-0d787026b436\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://322af039c736bcf0b853ee5527ebb1b1750484dfab074745abcd75c24fdcccbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bfa2696bd9b6e5d247686e5297b6ae2f49e5b216174391f211cb2a3a4966135\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c23aaef6f076ab2a428323d38fac48e0c55ad52c55a46c942bccad06474fd31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a88714c27822fd18dff500c973b9d548414d59c7666de938e3cb0c6b18e277e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67063544d268ca488af7ae401113e6f35bb48688e50f944cfa03360de376611a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc6d32c621a9826f8539d4c1806a83f694eaa8874384bd615c2fa1cee02d2e98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc6d32c621a9826f8539d4c1806a83f694eaa8874384bd615c2fa1cee02d2e98\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T21:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8aa3a4aef61b7421411f28beecceab476e43c52eb51ac2a257399b01e4c2bcb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8aa3a4aef61b7421411f28beecceab476e43c52eb51ac2a257399b01e4c2bcb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2126875e4bacb36f72de8a3a8bd7c799b89e7691a83fd39d7645a97e95db91b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2126875e4bacb36f72de8a3a8bd7c799b89e7691a83fd39d7645a97e95db91b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:50:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:33Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:33 crc kubenswrapper[4919]: I0310 21:52:33.671448 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9ed1501-15da-4419-aa12-171e610438d6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9e6a8efa1e2d16b45fe6362b326e3f89333864dc74f3b298d2e500a90d303b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37d8507fd02b92972ed41aa2c4d53fceb1c9d58864e46ddc7991f94fb4d9b3e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfb03c5f450790952fc7173bc2a6d723c777921f5f74963bfdbc3573ec1d21cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce192a4f3e94d00998fbfe0948a32765574a9261d22004480dfb54b9bbf9407a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b2adcae0bf01d646b97b915a28921ad4151ca62e8bdd174b42b5e3dff4b27db\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T21:51:14Z\\\",\\\"message\\\":\\\"file observer\\\\nW0310 21:51:14.180982 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 21:51:14.181120 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 21:51:14.182146 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-915015507/tls.crt::/tmp/serving-cert-915015507/tls.key\\\\\\\"\\\\nI0310 21:51:14.490188 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 21:51:14.496972 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 21:51:14.497009 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 21:51:14.497047 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 21:51:14.497058 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 21:51:14.505682 1 
secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 21:51:14.505718 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 21:51:14.505729 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 21:51:14.505740 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0310 21:51:14.505737 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 21:51:14.505748 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 21:51:14.505777 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 21:51:14.505783 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 21:51:14.508219 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T21:51:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47a772db349df6c0c6fe27be93d19e02d66cfaf9739ee12e89730ece1da11473\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15b81f8be7635a10cf516cd17c9ab3d48b74d3421098124fcc47dbab6691e3c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15b81f8be7635a10cf516cd17c9ab3d48b74d3421098124fcc47dbab6691e3c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:50:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:33Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:33 crc kubenswrapper[4919]: I0310 21:52:33.693185 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://937344f02f0259b0d258de35d490545dad0ce084dd49c7584002da0734cc046e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:33Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:33 crc kubenswrapper[4919]: I0310 21:52:33.711272 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b625p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b82448f1-4387-4d1a-a300-29f4b3d86bbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5edef5b10597e404ec5599983d07529ee344e7d7f5b1a4a7f589678613034b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\
",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s9q9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:52:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b625p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:33Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:33 crc kubenswrapper[4919]: I0310 21:52:33.737314 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-z6pc7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ace27ab-c4c7-412b-9ae8-a3e4ff15faec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://607c25e23101a157124cb81f984fac6d36e71a08b7d990e1d11627f3a7de24b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5636b870a2c5e13abd72ff1e7c785363d11540245c14af6f5de75a0b400a6aa\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5636b870a2c5e13abd72ff1e7c785363d11540245c14af6f5de75a0b400a6aa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7cd0e8267d82b65679954a7b935e138a06a6634311569a48268ba5917d6cb7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7cd0e8267d82b65679954a7b935e138a06a6634311569a48268ba5917d6cb7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:52:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:00Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4b4f2cd55ec546c6f44490159ec5b303d67b584fce7afb4b6fe3dd93e95d843\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4b4f2cd55ec546c6f44490159ec5b303d67b584fce7afb4b6fe3dd93e95d843\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:52:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cebcf
235b45b59a23c89f44855f381557a49fc94ec9332c699274507cda4cd12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cebcf235b45b59a23c89f44855f381557a49fc94ec9332c699274507cda4cd12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:52:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b01604ec0336bae6aa21d9b124596bb46f6351b67c413f2fade87febd0887a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b01604ec0336bae6aa21d9b124596bb46f6351b67c413f2fade87febd0887a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:52:03Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3c386dab63ce40e10fc9e5617d1e0c1e87eee2a1148cc64d5faa74542014155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3c386dab63ce40e10fc9e5617d1e0c1e87eee2a1148cc64d5faa74542014155\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:52:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-z6pc7\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:33Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:33 crc kubenswrapper[4919]: I0310 21:52:33.759063 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zv56q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c57707d-1414-4a4a-ac8a-0fadb2fbe5f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4d38859cd120bfb7307a52fd56c1b53490e57164b68da1811475e1046de690a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bm5jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac5ed6a131689d58c1d7867655d836f574591f3bc397d05858cbcfb9748c5107\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bm5jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:52:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zv56q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-03-10T21:52:33Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:33 crc kubenswrapper[4919]: I0310 21:52:33.778284 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d33de8e-9521-40e1-8dda-051e228ca068\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25d19a7d46abf131e552151e3bbb220e3fdf0a3bdb8ff8ca7b082dcc296408c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f30803ec8ed4cbd053df2777bfe3077a7637972562508205711b357011e453dc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T21:51:10Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec 
cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0310 21:50:40.583344 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0310 21:50:40.584309 1 observer_polling.go:159] Starting file observer\\\\nI0310 21:50:40.585315 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0310 21:50:40.586027 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0310 21:51:05.043465 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0310 21:51:10.132930 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0310 21:51:10.133005 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:40Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:51:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3630a7f175a3275eff39088c20eafd059b205f0ccb36cbba2f09b77468963cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://038d23b7c75ae61b55b3b70b5b70de0ca4f3243d0b0a68f8bd221aff91c2c032\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://22b6b89388d9f9288049474f1f88faad36bcbc05564e7769c9fca8c220847efd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:50:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:33Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:33 crc kubenswrapper[4919]: I0310 21:52:33.796881 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:33Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:33 crc kubenswrapper[4919]: I0310 21:52:33.816261 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:33Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:33 crc kubenswrapper[4919]: I0310 21:52:33.838479 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hbw8v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a5db7c3-2a96-4030-8c88-5d82d325b62d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abe6d0aa7236ecb1ecf10432a82e6fd0b3103606dbf07a21f54a1908c77ef697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dwtj4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hbw8v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:33Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:33 crc kubenswrapper[4919]: I0310 21:52:33.857632 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"41f60ccf-532c-42dd-85d3-5cf02206caeb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff32e19d9357f72af1234677cdf04c43d15fcbb5af4faeae6db0aa9fca7e8ef5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41e31e0bed022ff2ce3f4869359dfe2d6a25c0039704f26ef5c9b4be0da5b9fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eed85c384d715caea3fe992e40d62b467c4893c865d792f798254701b15735fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29a0037afef2a95ea444616a0317a23a27f5e093c2083b8e13c6ebcec7cb26f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\
\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29a0037afef2a95ea444616a0317a23a27f5e093c2083b8e13c6ebcec7cb26f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:14Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:50:13Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:33Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:33 crc kubenswrapper[4919]: I0310 21:52:33.879209 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac8c9a9627b63f7b2a9c80571ca8f781eec442b1fd148631fa417b2e11943437\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18a390bbc216535df32a4dab5fb983494134d2e9f87a689ea39d3e32592ec663\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:33Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:33 crc kubenswrapper[4919]: I0310 21:52:33.897921 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"566678d1-f416-4116-ab20-b30dceb86cdc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://603ee76064368a216672f45eb860628d301968c311e0bc75b9a73c01f351c9c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrhvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b645dc541f9bef5d9710345252c2ff48e91412f
10d1c0c1bfaa06cf9e82210f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrhvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-z7v4t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:33Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:34 crc kubenswrapper[4919]: I0310 21:52:34.479494 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 21:52:34 crc kubenswrapper[4919]: I0310 21:52:34.479595 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 21:52:34 crc kubenswrapper[4919]: E0310 21:52:34.479676 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 21:52:34 crc kubenswrapper[4919]: I0310 21:52:34.479733 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 21:52:34 crc kubenswrapper[4919]: E0310 21:52:34.479907 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 21:52:34 crc kubenswrapper[4919]: E0310 21:52:34.479966 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 21:52:35 crc kubenswrapper[4919]: I0310 21:52:35.479599 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-ckwhl" Mar 10 21:52:35 crc kubenswrapper[4919]: E0310 21:52:35.479794 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ckwhl" podUID="a95e8b73-ffed-4248-b8ba-99fc7c5b900f" Mar 10 21:52:36 crc kubenswrapper[4919]: I0310 21:52:36.479316 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 21:52:36 crc kubenswrapper[4919]: I0310 21:52:36.479451 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 21:52:36 crc kubenswrapper[4919]: I0310 21:52:36.479485 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 21:52:36 crc kubenswrapper[4919]: E0310 21:52:36.479560 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 21:52:36 crc kubenswrapper[4919]: E0310 21:52:36.479689 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 21:52:36 crc kubenswrapper[4919]: E0310 21:52:36.479884 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 21:52:37 crc kubenswrapper[4919]: I0310 21:52:37.483671 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ckwhl" Mar 10 21:52:37 crc kubenswrapper[4919]: E0310 21:52:37.483878 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ckwhl" podUID="a95e8b73-ffed-4248-b8ba-99fc7c5b900f" Mar 10 21:52:38 crc kubenswrapper[4919]: I0310 21:52:38.479278 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 21:52:38 crc kubenswrapper[4919]: I0310 21:52:38.479373 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 21:52:38 crc kubenswrapper[4919]: E0310 21:52:38.479493 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 21:52:38 crc kubenswrapper[4919]: I0310 21:52:38.479656 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 21:52:38 crc kubenswrapper[4919]: E0310 21:52:38.479845 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 21:52:38 crc kubenswrapper[4919]: E0310 21:52:38.479969 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 21:52:38 crc kubenswrapper[4919]: E0310 21:52:38.585835 4919 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 10 21:52:39 crc kubenswrapper[4919]: I0310 21:52:39.479731 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ckwhl" Mar 10 21:52:39 crc kubenswrapper[4919]: E0310 21:52:39.479955 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ckwhl" podUID="a95e8b73-ffed-4248-b8ba-99fc7c5b900f" Mar 10 21:52:40 crc kubenswrapper[4919]: I0310 21:52:40.479883 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 21:52:40 crc kubenswrapper[4919]: I0310 21:52:40.479955 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 21:52:40 crc kubenswrapper[4919]: E0310 21:52:40.480066 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 21:52:40 crc kubenswrapper[4919]: I0310 21:52:40.479954 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 21:52:40 crc kubenswrapper[4919]: E0310 21:52:40.480159 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 21:52:40 crc kubenswrapper[4919]: E0310 21:52:40.480451 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 21:52:41 crc kubenswrapper[4919]: I0310 21:52:41.479349 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ckwhl" Mar 10 21:52:41 crc kubenswrapper[4919]: E0310 21:52:41.479589 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ckwhl" podUID="a95e8b73-ffed-4248-b8ba-99fc7c5b900f" Mar 10 21:52:42 crc kubenswrapper[4919]: I0310 21:52:42.416869 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 21:52:42 crc kubenswrapper[4919]: E0310 21:52:42.417037 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 21:53:46.417004271 +0000 UTC m=+213.658884919 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 21:52:42 crc kubenswrapper[4919]: I0310 21:52:42.417093 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 21:52:42 crc kubenswrapper[4919]: I0310 21:52:42.417176 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: 
\"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 21:52:42 crc kubenswrapper[4919]: I0310 21:52:42.417218 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 21:52:42 crc kubenswrapper[4919]: I0310 21:52:42.417259 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 21:52:42 crc kubenswrapper[4919]: E0310 21:52:42.417316 4919 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 21:52:42 crc kubenswrapper[4919]: E0310 21:52:42.417349 4919 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 21:52:42 crc kubenswrapper[4919]: E0310 21:52:42.417368 4919 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 
10 21:52:42 crc kubenswrapper[4919]: E0310 21:52:42.417442 4919 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 21:52:42 crc kubenswrapper[4919]: E0310 21:52:42.417468 4919 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 21:52:42 crc kubenswrapper[4919]: E0310 21:52:42.417520 4919 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 21:52:42 crc kubenswrapper[4919]: E0310 21:52:42.417557 4919 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 21:52:42 crc kubenswrapper[4919]: E0310 21:52:42.417581 4919 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 21:52:42 crc kubenswrapper[4919]: E0310 21:52:42.417479 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-10 21:53:46.417458783 +0000 UTC m=+213.659339431 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 21:52:42 crc kubenswrapper[4919]: E0310 21:52:42.417676 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 21:53:46.417640358 +0000 UTC m=+213.659521006 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 21:52:42 crc kubenswrapper[4919]: E0310 21:52:42.417981 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 21:53:46.417960216 +0000 UTC m=+213.659840864 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 21:52:42 crc kubenswrapper[4919]: E0310 21:52:42.418023 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-10 21:53:46.418006508 +0000 UTC m=+213.659887226 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 21:52:42 crc kubenswrapper[4919]: I0310 21:52:42.479968 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 21:52:42 crc kubenswrapper[4919]: I0310 21:52:42.480064 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 21:52:42 crc kubenswrapper[4919]: I0310 21:52:42.480062 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 21:52:42 crc kubenswrapper[4919]: E0310 21:52:42.480316 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 21:52:42 crc kubenswrapper[4919]: E0310 21:52:42.480538 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 21:52:42 crc kubenswrapper[4919]: E0310 21:52:42.480732 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 21:52:43 crc kubenswrapper[4919]: I0310 21:52:43.029894 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:52:43 crc kubenswrapper[4919]: I0310 21:52:43.029939 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:52:43 crc kubenswrapper[4919]: I0310 21:52:43.029951 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:52:43 crc kubenswrapper[4919]: I0310 21:52:43.029968 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:52:43 crc kubenswrapper[4919]: I0310 21:52:43.029981 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:52:43Z","lastTransitionTime":"2026-03-10T21:52:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:52:43 crc kubenswrapper[4919]: E0310 21:52:43.045875 4919 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:52:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:52:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:52:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:52:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c22d31cd-a51d-4524-bb69-0b454ae09e98\\\",\\\"systemUUID\\\":\\\"eb24d1fd-ecd7-423c-90f7-cacacceb5386\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:43Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:43 crc kubenswrapper[4919]: I0310 21:52:43.050477 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:52:43 crc kubenswrapper[4919]: I0310 21:52:43.050518 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:52:43 crc kubenswrapper[4919]: I0310 21:52:43.050530 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:52:43 crc kubenswrapper[4919]: I0310 21:52:43.050548 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:52:43 crc kubenswrapper[4919]: I0310 21:52:43.050560 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:52:43Z","lastTransitionTime":"2026-03-10T21:52:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:52:43 crc kubenswrapper[4919]: E0310 21:52:43.064119 4919 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:52:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:52:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:52:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:52:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c22d31cd-a51d-4524-bb69-0b454ae09e98\\\",\\\"systemUUID\\\":\\\"eb24d1fd-ecd7-423c-90f7-cacacceb5386\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:43Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:43 crc kubenswrapper[4919]: I0310 21:52:43.069108 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:52:43 crc kubenswrapper[4919]: I0310 21:52:43.069176 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:52:43 crc kubenswrapper[4919]: I0310 21:52:43.069205 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:52:43 crc kubenswrapper[4919]: I0310 21:52:43.069233 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:52:43 crc kubenswrapper[4919]: I0310 21:52:43.069255 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:52:43Z","lastTransitionTime":"2026-03-10T21:52:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:52:43 crc kubenswrapper[4919]: E0310 21:52:43.088972 4919 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:52:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:52:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:52:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:52:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c22d31cd-a51d-4524-bb69-0b454ae09e98\\\",\\\"systemUUID\\\":\\\"eb24d1fd-ecd7-423c-90f7-cacacceb5386\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:43Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:43 crc kubenswrapper[4919]: I0310 21:52:43.093212 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:52:43 crc kubenswrapper[4919]: I0310 21:52:43.093281 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:52:43 crc kubenswrapper[4919]: I0310 21:52:43.093305 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:52:43 crc kubenswrapper[4919]: I0310 21:52:43.093333 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:52:43 crc kubenswrapper[4919]: I0310 21:52:43.093352 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:52:43Z","lastTransitionTime":"2026-03-10T21:52:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:52:43 crc kubenswrapper[4919]: E0310 21:52:43.108895 4919 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:52:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:52:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:52:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:52:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c22d31cd-a51d-4524-bb69-0b454ae09e98\\\",\\\"systemUUID\\\":\\\"eb24d1fd-ecd7-423c-90f7-cacacceb5386\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:43Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:43 crc kubenswrapper[4919]: I0310 21:52:43.114310 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:52:43 crc kubenswrapper[4919]: I0310 21:52:43.114383 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:52:43 crc kubenswrapper[4919]: I0310 21:52:43.114441 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:52:43 crc kubenswrapper[4919]: I0310 21:52:43.114475 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:52:43 crc kubenswrapper[4919]: I0310 21:52:43.114503 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:52:43Z","lastTransitionTime":"2026-03-10T21:52:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:52:43 crc kubenswrapper[4919]: E0310 21:52:43.129965 4919 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:52:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:52:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:52:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:52:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c22d31cd-a51d-4524-bb69-0b454ae09e98\\\",\\\"systemUUID\\\":\\\"eb24d1fd-ecd7-423c-90f7-cacacceb5386\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:43Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:43 crc kubenswrapper[4919]: E0310 21:52:43.130137 4919 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 10 21:52:43 crc kubenswrapper[4919]: I0310 21:52:43.479531 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ckwhl" Mar 10 21:52:43 crc kubenswrapper[4919]: E0310 21:52:43.479924 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ckwhl" podUID="a95e8b73-ffed-4248-b8ba-99fc7c5b900f" Mar 10 21:52:43 crc kubenswrapper[4919]: I0310 21:52:43.480110 4919 scope.go:117] "RemoveContainer" containerID="48b981072e2d3ee5b692dec159c8bcd8cdaba247ff36084b942813175ca23afd" Mar 10 21:52:43 crc kubenswrapper[4919]: E0310 21:52:43.480357 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-4dp67_openshift-ovn-kubernetes(a2e7c6fb-9e33-441d-9197-719929eb9e21)\"" pod="openshift-ovn-kubernetes/ovnkube-node-4dp67" podUID="a2e7c6fb-9e33-441d-9197-719929eb9e21" Mar 10 21:52:43 crc kubenswrapper[4919]: I0310 21:52:43.497470 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d33de8e-9521-40e1-8dda-051e228ca068\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25d19a7d46abf131e552151e3bbb220e3fdf0a3bdb8ff8ca7b082dcc296408c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev
@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f30803ec8ed4cbd053df2777bfe3077a7637972562508205711b357011e453dc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T21:51:10Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0310 21:50:40.583344 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0310 21:50:40.584309 1 observer_polling.go:159] Starting file observer\\\\nI0310 21:50:40.585315 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0310 21:50:40.586027 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0310 21:51:05.043465 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0310 21:51:10.132930 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0310 21:51:10.133005 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:40Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:51:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3630a7f175a3275eff39088c20eafd059b205f0ccb36cbba2f09b77468963cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:14Z\\\"}},\\\"v
olumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://038d23b7c75ae61b55b3b70b5b70de0ca4f3243d0b0a68f8bd221aff91c2c032\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://22b6b89388d9f9288049474f1f88faad36bcbc05564e7769c9fca8c220847efd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"host
IPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:50:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:43Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:43 crc kubenswrapper[4919]: I0310 21:52:43.521931 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:43Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:43 crc kubenswrapper[4919]: I0310 21:52:43.537314 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:43Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:43 crc kubenswrapper[4919]: I0310 21:52:43.560146 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-z6pc7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ace27ab-c4c7-412b-9ae8-a3e4ff15faec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://607c25e23101a157124cb81f984fac6d36e71a08b7d990e1d11627f3a7de24b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5636b870a2c5e13abd72ff1e7c785363d11540245c14af6f5de75a0b400a6aa\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5636b870a2c5e13abd72ff1e7c785363d11540245c14af6f5de75a0b400a6aa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7cd0e8267d82b65679954a7b935e138a06a6634311569a48268ba5917d6cb7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7cd0e8267d82b65679954a7b935e138a06a6634311569a48268ba5917d6cb7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:52:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:00Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4b4f2cd55ec546c6f44490159ec5b303d67b584fce7afb4b6fe3dd93e95d843\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4b4f2cd55ec546c6f44490159ec5b303d67b584fce7afb4b6fe3dd93e95d843\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:52:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cebcf
235b45b59a23c89f44855f381557a49fc94ec9332c699274507cda4cd12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cebcf235b45b59a23c89f44855f381557a49fc94ec9332c699274507cda4cd12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:52:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b01604ec0336bae6aa21d9b124596bb46f6351b67c413f2fade87febd0887a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b01604ec0336bae6aa21d9b124596bb46f6351b67c413f2fade87febd0887a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:52:03Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3c386dab63ce40e10fc9e5617d1e0c1e87eee2a1148cc64d5faa74542014155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3c386dab63ce40e10fc9e5617d1e0c1e87eee2a1148cc64d5faa74542014155\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:52:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-z6pc7\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:43Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:43 crc kubenswrapper[4919]: I0310 21:52:43.577326 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zv56q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c57707d-1414-4a4a-ac8a-0fadb2fbe5f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4d38859cd120bfb7307a52fd56c1b53490e57164b68da1811475e1046de690a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bm5jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac5ed6a131689d58c1d7867655d836f574591f3bc397d05858cbcfb9748c5107\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bm5jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:52:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zv56q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-03-10T21:52:43Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:43 crc kubenswrapper[4919]: E0310 21:52:43.586412 4919 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 10 21:52:43 crc kubenswrapper[4919]: I0310 21:52:43.594785 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"41f60ccf-532c-42dd-85d3-5cf02206caeb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff32e19d9357f72af1234677cdf04c43d15fcbb5af4faeae6db0aa9fca7e8ef5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"run
ning\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41e31e0bed022ff2ce3f4869359dfe2d6a25c0039704f26ef5c9b4be0da5b9fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eed85c384d715caea3fe992e40d62b467c4893c865d792f798254701b15735fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],
\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29a0037afef2a95ea444616a0317a23a27f5e093c2083b8e13c6ebcec7cb26f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29a0037afef2a95ea444616a0317a23a27f5e093c2083b8e13c6ebcec7cb26f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:14Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:50:13Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:43Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:43 crc kubenswrapper[4919]: I0310 21:52:43.615965 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac8c9a9627b63f7b2a9c80571ca8f781eec442b1fd148631fa417b2e11943437\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18a390bbc216535df32a4dab5fb983494134d2e9f87a689ea39d3e32592ec663\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:43Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:43 crc kubenswrapper[4919]: I0310 21:52:43.629797 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a95e8b73-ffed-4248-b8ba-99fc7c5b900f-metrics-certs\") pod \"network-metrics-daemon-ckwhl\" (UID: \"a95e8b73-ffed-4248-b8ba-99fc7c5b900f\") " pod="openshift-multus/network-metrics-daemon-ckwhl" Mar 10 21:52:43 crc kubenswrapper[4919]: E0310 21:52:43.629991 4919 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 21:52:43 crc kubenswrapper[4919]: E0310 21:52:43.630075 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a95e8b73-ffed-4248-b8ba-99fc7c5b900f-metrics-certs podName:a95e8b73-ffed-4248-b8ba-99fc7c5b900f nodeName:}" failed. 
No retries permitted until 2026-03-10 21:53:15.630044155 +0000 UTC m=+182.871924803 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a95e8b73-ffed-4248-b8ba-99fc7c5b900f-metrics-certs") pod "network-metrics-daemon-ckwhl" (UID: "a95e8b73-ffed-4248-b8ba-99fc7c5b900f") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 21:52:43 crc kubenswrapper[4919]: I0310 21:52:43.635796 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"566678d1-f416-4116-ab20-b30dceb86cdc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://603ee76064368a216672f45eb860628d301968c311e0bc75b9a73c01f351c9c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrhvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b645dc541f9bef5d9710345252c2ff48e91412f10d1c0c1bfaa06cf9e82210f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrhvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-z7v4t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is 
not yet valid: current time 2026-03-10T21:52:43Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:43 crc kubenswrapper[4919]: I0310 21:52:43.653718 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hbw8v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a5db7c3-2a96-4030-8c88-5d82d325b62d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abe6d0aa7236ecb1ecf10432a82e6fd0b3103606dbf07a21f54a1908c77ef697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\
\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dwtj4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hbw8v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not 
yet valid: current time 2026-03-10T21:52:43Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:43 crc kubenswrapper[4919]: I0310 21:52:43.681947 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4dp67" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2e7c6fb-9e33-441d-9197-719929eb9e21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad5abb4f4c3ead1b05c1f3efbb6bb59fc71bcf5796e16a558ef7e16375571fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a8f975d3caae031d32429c7fdf67356ff18bae1f0eefa3b99758d6c461a736\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4dd430eed1c68d77dcea63fe5885eec7aae3fee6a0008bd9e145b93aef04aa26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://648a881496440d2732bd49969e863fcde223e95156c280e7699782a5521f1580\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06e9ce48fc10caa34e8b774ce3039117386ffe276cf268d197ac46518a4baf91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9fa1c50eff4c6bb37ea60631a4278c6a15d6398d6fad6fb1a914bd6cccf15b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48b981072e2d3ee5b692dec159c8bcd8cdaba247ff36084b942813175ca23afd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48b981072e2d3ee5b692dec159c8bcd8cdaba247ff36084b942813175ca23afd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T21:52:27Z\\\",\\\"message\\\":\\\"g-operator-metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"78f6184b-c7cf-436d-8cbb-4b31f8af75e8\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-operator-lifecycle-manager/catalog-operator-metrics\\\\\\\"}, 
Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-operator-lifecycle-manager/catalog-operator-metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-operator-lifecycle-manager/catalog-operator-metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.204\\\\\\\", Port:8443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF0310 21:52:27.442701 7191 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:26Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4dp67_openshift-ovn-kubernetes(a2e7c6fb-9e33-441d-9197-719929eb9e21)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81583105305b81fde3bde968fbbd5463ad1100d8f444b22f62764c4676b95325\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ca4b1de4425e9aa53f9cf24a56759708297b223771278a445dcf171f4dd6493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ca4b1de4425e9aa53
f9cf24a56759708297b223771278a445dcf171f4dd6493\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4dp67\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:43Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:43 crc kubenswrapper[4919]: I0310 21:52:43.698317 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ckwhl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a95e8b73-ffed-4248-b8ba-99fc7c5b900f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:11Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnqnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnqnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:52:11Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ckwhl\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:43Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:43 crc kubenswrapper[4919]: I0310 21:52:43.715522 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed0693f5-4dbc-4621-9cf6-450d64aaea59\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f338ebc5fc228d07415015c51f7ed4fcc24d5bf76a644e491b5c4b9dc51b71f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\
\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5205e89649c7c1e920c967873bb1a404548b539605a180b02a1c249ff499e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5205e89649c7c1e920c967873bb1a404548b539605a180b02a1c249ff499e32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:50:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:43Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:43 crc kubenswrapper[4919]: I0310 21:52:43.752081 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5163ae00-7b50-497d-9770-0d787026b436\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://322af039c736bcf0b853ee5527ebb1b1750484dfab074745abcd75c24fdcccbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bfa2696bd9b6e5d247686e5297b6ae2f49e5b216174391f211cb2a3a4966135\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c23aaef6f076ab2a428323d38fac48e0c55ad52c55a46c942bccad06474fd31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a88714c27822fd18dff500c973b9d548414d59c7666de938e3cb0c6b18e277e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67063544d268ca488af7ae401113e6f35bb48688e50f944cfa03360de376611a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc6d32c621a9826f8539d4c1806a83f694eaa8874384bd615c2fa1cee02d2e98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc6d32c621a9826f8539d4c1806a83f694eaa8874384bd615c2fa1cee02d2e98\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T21:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8aa3a4aef61b7421411f28beecceab476e43c52eb51ac2a257399b01e4c2bcb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8aa3a4aef61b7421411f28beecceab476e43c52eb51ac2a257399b01e4c2bcb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2126875e4bacb36f72de8a3a8bd7c799b89e7691a83fd39d7645a97e95db91b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2126875e4bacb36f72de8a3a8bd7c799b89e7691a83fd39d7645a97e95db91b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:50:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:43Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:43 crc kubenswrapper[4919]: I0310 21:52:43.775309 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9ed1501-15da-4419-aa12-171e610438d6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9e6a8efa1e2d16b45fe6362b326e3f89333864dc74f3b298d2e500a90d303b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37d8507fd02b92972ed41aa2c4d53fceb1c9d58864e46ddc7991f94fb4d9b3e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfb03c5f450790952fc7173bc2a6d723c777921f5f74963bfdbc3573ec1d21cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce192a4f3e94d00998fbfe0948a32765574a9261d22004480dfb54b9bbf9407a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b2adcae0bf01d646b97b915a28921ad4151ca62e8bdd174b42b5e3dff4b27db\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T21:51:14Z\\\",\\\"message\\\":\\\"file observer\\\\nW0310 21:51:14.180982 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 21:51:14.181120 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 21:51:14.182146 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-915015507/tls.crt::/tmp/serving-cert-915015507/tls.key\\\\\\\"\\\\nI0310 21:51:14.490188 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 21:51:14.496972 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 21:51:14.497009 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 21:51:14.497047 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 21:51:14.497058 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 21:51:14.505682 1 
secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 21:51:14.505718 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 21:51:14.505729 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 21:51:14.505740 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0310 21:51:14.505737 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 21:51:14.505748 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 21:51:14.505777 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 21:51:14.505783 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 21:51:14.508219 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T21:51:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47a772db349df6c0c6fe27be93d19e02d66cfaf9739ee12e89730ece1da11473\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15b81f8be7635a10cf516cd17c9ab3d48b74d3421098124fcc47dbab6691e3c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15b81f8be7635a10cf516cd17c9ab3d48b74d3421098124fcc47dbab6691e3c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:50:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:43Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:43 crc kubenswrapper[4919]: I0310 21:52:43.799606 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:43Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:43 crc kubenswrapper[4919]: I0310 21:52:43.825426 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f95c272d21026474ba17da6abc519f0cc1874dbdded3e089a107b23cdd20fa6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T21:52:43Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:43 crc kubenswrapper[4919]: I0310 21:52:43.842184 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hzq7c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e92ce303-b70d-4416-b8f1-520b49dca2e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cc1a7ce601001a487303cfae1cef980407a59cc27d02d4f3c4a303b7668639f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-qw7c6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hzq7c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:43Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:43 crc kubenswrapper[4919]: I0310 21:52:43.864162 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://937344f02f0259b0d258de35d490545dad0ce084dd49c7584002da0734cc046e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be742
1a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:43Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:43 crc kubenswrapper[4919]: I0310 21:52:43.880521 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b625p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b82448f1-4387-4d1a-a300-29f4b3d86bbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5edef5b10597e404ec5599983d07529ee344e7d7f5b1a4a7f589678613034b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s9q9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:52:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b625p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:43Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:44 crc kubenswrapper[4919]: I0310 21:52:44.479251 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 21:52:44 crc kubenswrapper[4919]: I0310 21:52:44.479710 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 21:52:44 crc kubenswrapper[4919]: E0310 21:52:44.480587 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 21:52:44 crc kubenswrapper[4919]: I0310 21:52:44.480150 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 21:52:44 crc kubenswrapper[4919]: E0310 21:52:44.481159 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 21:52:44 crc kubenswrapper[4919]: E0310 21:52:44.479899 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 21:52:45 crc kubenswrapper[4919]: I0310 21:52:45.479755 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ckwhl" Mar 10 21:52:45 crc kubenswrapper[4919]: E0310 21:52:45.479949 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ckwhl" podUID="a95e8b73-ffed-4248-b8ba-99fc7c5b900f" Mar 10 21:52:46 crc kubenswrapper[4919]: I0310 21:52:46.174769 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-hbw8v_6a5db7c3-2a96-4030-8c88-5d82d325b62d/kube-multus/0.log" Mar 10 21:52:46 crc kubenswrapper[4919]: I0310 21:52:46.174828 4919 generic.go:334] "Generic (PLEG): container finished" podID="6a5db7c3-2a96-4030-8c88-5d82d325b62d" containerID="abe6d0aa7236ecb1ecf10432a82e6fd0b3103606dbf07a21f54a1908c77ef697" exitCode=1 Mar 10 21:52:46 crc kubenswrapper[4919]: I0310 21:52:46.174864 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-hbw8v" event={"ID":"6a5db7c3-2a96-4030-8c88-5d82d325b62d","Type":"ContainerDied","Data":"abe6d0aa7236ecb1ecf10432a82e6fd0b3103606dbf07a21f54a1908c77ef697"} Mar 10 21:52:46 crc kubenswrapper[4919]: I0310 21:52:46.175348 4919 scope.go:117] "RemoveContainer" containerID="abe6d0aa7236ecb1ecf10432a82e6fd0b3103606dbf07a21f54a1908c77ef697" Mar 10 21:52:46 crc kubenswrapper[4919]: I0310 21:52:46.195841 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-z6pc7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ace27ab-c4c7-412b-9ae8-a3e4ff15faec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://607c25e23101a157124cb81f984fac6d36e71a08b7d990e1d11627f3a7de24b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5636b870a2c5e13abd72ff1e7c785363d11540245c14af6f5de75a0b400a6aa\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5636b870a2c5e13abd72ff1e7c785363d11540245c14af6f5de75a0b400a6aa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7cd0e8267d82b65679954a7b935e138a06a6634311569a48268ba5917d6cb7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7cd0e8267d82b65679954a7b935e138a06a6634311569a48268ba5917d6cb7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:52:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:00Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4b4f2cd55ec546c6f44490159ec5b303d67b584fce7afb4b6fe3dd93e95d843\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4b4f2cd55ec546c6f44490159ec5b303d67b584fce7afb4b6fe3dd93e95d843\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:52:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cebcf
235b45b59a23c89f44855f381557a49fc94ec9332c699274507cda4cd12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cebcf235b45b59a23c89f44855f381557a49fc94ec9332c699274507cda4cd12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:52:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b01604ec0336bae6aa21d9b124596bb46f6351b67c413f2fade87febd0887a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b01604ec0336bae6aa21d9b124596bb46f6351b67c413f2fade87febd0887a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:52:03Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3c386dab63ce40e10fc9e5617d1e0c1e87eee2a1148cc64d5faa74542014155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3c386dab63ce40e10fc9e5617d1e0c1e87eee2a1148cc64d5faa74542014155\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:52:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-z6pc7\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:46Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:46 crc kubenswrapper[4919]: I0310 21:52:46.217929 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zv56q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c57707d-1414-4a4a-ac8a-0fadb2fbe5f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4d38859cd120bfb7307a52fd56c1b53490e57164b68da1811475e1046de690a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bm5jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac5ed6a131689d58c1d7867655d836f574591f3bc397d05858cbcfb9748c5107\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bm5jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:52:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zv56q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-03-10T21:52:46Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:46 crc kubenswrapper[4919]: I0310 21:52:46.239618 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d33de8e-9521-40e1-8dda-051e228ca068\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25d19a7d46abf131e552151e3bbb220e3fdf0a3bdb8ff8ca7b082dcc296408c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f30803ec8ed4cbd053df2777bfe3077a7637972562508205711b357011e453dc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T21:51:10Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec 
cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0310 21:50:40.583344 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0310 21:50:40.584309 1 observer_polling.go:159] Starting file observer\\\\nI0310 21:50:40.585315 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0310 21:50:40.586027 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0310 21:51:05.043465 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0310 21:51:10.132930 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0310 21:51:10.133005 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:40Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:51:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3630a7f175a3275eff39088c20eafd059b205f0ccb36cbba2f09b77468963cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://038d23b7c75ae61b55b3b70b5b70de0ca4f3243d0b0a68f8bd221aff91c2c032\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://22b6b89388d9f9288049474f1f88faad36bcbc05564e7769c9fca8c220847efd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:50:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:46Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:46 crc kubenswrapper[4919]: I0310 21:52:46.257761 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:46Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:46 crc kubenswrapper[4919]: I0310 21:52:46.282741 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:46Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:46 crc kubenswrapper[4919]: I0310 21:52:46.304460 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hbw8v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a5db7c3-2a96-4030-8c88-5d82d325b62d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abe6d0aa7236ecb1ecf10432a82e6fd0b3103606dbf07a21f54a1908c77ef697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abe6d0aa7236ecb1ecf10432a82e6fd0b3103606dbf07a21f54a1908c77ef697\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T21:52:45Z\\\",\\\"message\\\":\\\"2026-03-10T21:52:00+00:00 [cnibincopy] Successfully copied files in 
/usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_93ad3fa3-cc05-440d-a4ee-07253ec4f77d\\\\n2026-03-10T21:52:00+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_93ad3fa3-cc05-440d-a4ee-07253ec4f77d to /host/opt/cni/bin/\\\\n2026-03-10T21:52:00Z [verbose] multus-daemon started\\\\n2026-03-10T21:52:00Z [verbose] Readiness Indicator file check\\\\n2026-03-10T21:52:45Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/c
ni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dwtj4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hbw8v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:46Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:46 crc kubenswrapper[4919]: I0310 21:52:46.321753 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"41f60ccf-532c-42dd-85d3-5cf02206caeb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff32e19d9357f72af1234677cdf04c43d15fcbb5af4faeae6db0aa9fca7e8ef5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41e31e0bed022ff2ce3f4869359dfe2d6a25c0039704f26ef5c9b4be0da5b9fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eed85c384d715caea3fe992e40d62b467c4893c865d792f798254701b15735fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29a0037afef2a95ea444616a0317a23a27f5e093c2083b8e13c6ebcec7cb26f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://29a0037afef2a95ea444616a0317a23a27f5e093c2083b8e13c6ebcec7cb26f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:14Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:50:13Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:46Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:46 crc kubenswrapper[4919]: I0310 21:52:46.340464 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac8c9a9627b63f7b2a9c80571ca8f781eec442b1fd148631fa417b2e11943437\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18a390bbc216535df32a4dab5fb983494134d2e9f87a689ea39d3e32592ec663\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:46Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:46 crc kubenswrapper[4919]: I0310 21:52:46.353723 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"566678d1-f416-4116-ab20-b30dceb86cdc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://603ee76064368a216672f45eb860628d301968c311e0bc75b9a73c01f351c9c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrhvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b645dc541f9bef5d9710345252c2ff48e91412f
10d1c0c1bfaa06cf9e82210f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrhvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-z7v4t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:46Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:46 crc kubenswrapper[4919]: I0310 21:52:46.369445 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:46Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:46 crc kubenswrapper[4919]: I0310 21:52:46.382484 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f95c272d21026474ba17da6abc519f0cc1874dbdded3e089a107b23cdd20fa6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T21:52:46Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:46 crc kubenswrapper[4919]: I0310 21:52:46.394827 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hzq7c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e92ce303-b70d-4416-b8f1-520b49dca2e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cc1a7ce601001a487303cfae1cef980407a59cc27d02d4f3c4a303b7668639f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-qw7c6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hzq7c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:46Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:46 crc kubenswrapper[4919]: I0310 21:52:46.414335 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4dp67" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2e7c6fb-9e33-441d-9197-719929eb9e21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad5abb4f4c3ead1b05c1f3efbb6bb59fc71bcf5796e16a558ef7e16375571fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a8f975d3caae031d32429c7fdf67356ff18bae1f0eefa3b99758d6c461a736\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4dd430eed1c68d77dcea63fe5885eec7aae3fee6a0008bd9e145b93aef04aa26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://648a881496440d2732bd49969e863fcde223e95156c280e7699782a5521f1580\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06e9ce48fc10caa34e8b774ce3039117386ffe276cf268d197ac46518a4baf91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9fa1c50eff4c6bb37ea60631a4278c6a15d6398d6fad6fb1a914bd6cccf15b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48b981072e2d3ee5b692dec159c8bcd8cdaba247ff36084b942813175ca23afd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48b981072e2d3ee5b692dec159c8bcd8cdaba247ff36084b942813175ca23afd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T21:52:27Z\\\",\\\"message\\\":\\\"g-operator-metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"78f6184b-c7cf-436d-8cbb-4b31f8af75e8\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-operator-lifecycle-manager/catalog-operator-metrics\\\\\\\"}, 
Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-operator-lifecycle-manager/catalog-operator-metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-operator-lifecycle-manager/catalog-operator-metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.204\\\\\\\", Port:8443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF0310 21:52:27.442701 7191 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:26Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4dp67_openshift-ovn-kubernetes(a2e7c6fb-9e33-441d-9197-719929eb9e21)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81583105305b81fde3bde968fbbd5463ad1100d8f444b22f62764c4676b95325\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ca4b1de4425e9aa53f9cf24a56759708297b223771278a445dcf171f4dd6493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ca4b1de4425e9aa53
f9cf24a56759708297b223771278a445dcf171f4dd6493\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4dp67\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:46Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:46 crc kubenswrapper[4919]: I0310 21:52:46.432420 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ckwhl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a95e8b73-ffed-4248-b8ba-99fc7c5b900f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:11Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnqnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnqnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:52:11Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ckwhl\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:46Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:46 crc kubenswrapper[4919]: I0310 21:52:46.451657 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed0693f5-4dbc-4621-9cf6-450d64aaea59\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f338ebc5fc228d07415015c51f7ed4fcc24d5bf76a644e491b5c4b9dc51b71f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\
\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5205e89649c7c1e920c967873bb1a404548b539605a180b02a1c249ff499e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5205e89649c7c1e920c967873bb1a404548b539605a180b02a1c249ff499e32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:50:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:46Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:46 crc kubenswrapper[4919]: I0310 21:52:46.479497 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 21:52:46 crc kubenswrapper[4919]: I0310 21:52:46.479589 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 21:52:46 crc kubenswrapper[4919]: I0310 21:52:46.479729 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 21:52:46 crc kubenswrapper[4919]: E0310 21:52:46.479930 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 21:52:46 crc kubenswrapper[4919]: E0310 21:52:46.480099 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 21:52:46 crc kubenswrapper[4919]: E0310 21:52:46.480288 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 21:52:46 crc kubenswrapper[4919]: I0310 21:52:46.483501 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5163ae00-7b50-497d-9770-0d787026b436\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://322af039c736bcf0b853ee5527ebb1b1750484dfab074745abcd75c24fdcccbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"
/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bfa2696bd9b6e5d247686e5297b6ae2f49e5b216174391f211cb2a3a4966135\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c23aaef6f076ab2a428323d38fac48e0c55ad52c55a46c942bccad06474fd31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a88714c27822fd18dff500c973b9d548414d59c7666de938e3cb0c6b18e277e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036
cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67063544d268ca488af7ae401113e6f35bb48688e50f944cfa03360de376611a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc6d32c621a9826f8539d4c1806a83f694eaa8874384bd615c2fa1cee02d2e98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117
b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc6d32c621a9826f8539d4c1806a83f694eaa8874384bd615c2fa1cee02d2e98\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8aa3a4aef61b7421411f28beecceab476e43c52eb51ac2a257399b01e4c2bcb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8aa3a4aef61b7421411f28beecceab476e43c52eb51ac2a257399b01e4c2bcb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2126875e4bacb36f72de8a3a8bd7c799b89e7691a83fd39d7645a97e95db91b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2126875e4bacb36f72de8a3a8bd7c799b89e7691a83fd39d7645a97e95db91b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\
\"2026-03-10T21:50:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:50:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:46Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:46 crc kubenswrapper[4919]: I0310 21:52:46.505107 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9ed1501-15da-4419-aa12-171e610438d6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9e6a8efa1e2d16b45fe6362b326e3f89333864dc74f3b298d2e500a90d303b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37d8507fd02b92972ed41aa2c4d53fceb1c9d58864e46ddc7991f94fb4d9b3e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfb03c5f450790952fc7173bc2a6d723c777921f5f74963bfdbc3573ec1d21cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce192a4f3e94d00998fbfe0948a32765574a9261d22004480dfb54b9bbf9407a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b2adcae0bf01d646b97b915a28921ad4151ca62e8bdd174b42b5e3dff4b27db\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T21:51:14Z\\\"
,\\\"message\\\":\\\"file observer\\\\nW0310 21:51:14.180982 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 21:51:14.181120 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 21:51:14.182146 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-915015507/tls.crt::/tmp/serving-cert-915015507/tls.key\\\\\\\"\\\\nI0310 21:51:14.490188 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 21:51:14.496972 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 21:51:14.497009 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 21:51:14.497047 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 21:51:14.497058 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 21:51:14.505682 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 21:51:14.505718 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 21:51:14.505729 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 21:51:14.505740 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0310 21:51:14.505737 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 21:51:14.505748 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 21:51:14.505777 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 21:51:14.505783 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 21:51:14.508219 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T21:51:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47a772db349df6c0c6fe27be93d19e02d66cfaf9739ee12e89730ece1da11473\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15b81f8be7635a10cf516cd17c9ab3d48b74d3421098124fcc47dbab6691e3c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15b81f8be7635a10cf516cd17c9ab3d48b7
4d3421098124fcc47dbab6691e3c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:50:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:46Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:46 crc kubenswrapper[4919]: I0310 21:52:46.526960 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://937344f02f0259b0d258de35d490545dad0ce084dd49c7584002da0734cc046e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:46Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:46 crc kubenswrapper[4919]: I0310 21:52:46.543271 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b625p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b82448f1-4387-4d1a-a300-29f4b3d86bbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5edef5b10597e404ec5599983d07529ee344e7d7f5b1a4a7f589678613034b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\
",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s9q9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:52:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b625p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:46Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:47 crc kubenswrapper[4919]: I0310 21:52:47.180977 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-hbw8v_6a5db7c3-2a96-4030-8c88-5d82d325b62d/kube-multus/0.log" Mar 10 21:52:47 crc kubenswrapper[4919]: I0310 21:52:47.181043 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-hbw8v" event={"ID":"6a5db7c3-2a96-4030-8c88-5d82d325b62d","Type":"ContainerStarted","Data":"0ea0659cf18bee888c2408100c1de192eb8da3991c3158d708c3083d31a61bdc"} Mar 10 21:52:47 crc kubenswrapper[4919]: I0310 21:52:47.205119 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://937344f02f0259b0d258de35d490545dad0ce084dd49c7584002da0734cc046e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:47Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:47 crc kubenswrapper[4919]: I0310 21:52:47.221412 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b625p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b82448f1-4387-4d1a-a300-29f4b3d86bbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5edef5b10597e404ec5599983d07529ee344e7d7f5b1a4a7f589678613034b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\
",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s9q9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:52:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b625p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:47Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:47 crc kubenswrapper[4919]: I0310 21:52:47.237137 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-z6pc7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ace27ab-c4c7-412b-9ae8-a3e4ff15faec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://607c25e23101a157124cb81f984fac6d36e71a08b7d990e1d11627f3a7de24b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5636b870a2c5e13abd72ff1e7c785363d11540245c14af6f5de75a0b400a6aa\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5636b870a2c5e13abd72ff1e7c785363d11540245c14af6f5de75a0b400a6aa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7cd0e8267d82b65679954a7b935e138a06a6634311569a48268ba5917d6cb7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7cd0e8267d82b65679954a7b935e138a06a6634311569a48268ba5917d6cb7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:52:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:00Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4b4f2cd55ec546c6f44490159ec5b303d67b584fce7afb4b6fe3dd93e95d843\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4b4f2cd55ec546c6f44490159ec5b303d67b584fce7afb4b6fe3dd93e95d843\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:52:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cebcf
235b45b59a23c89f44855f381557a49fc94ec9332c699274507cda4cd12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cebcf235b45b59a23c89f44855f381557a49fc94ec9332c699274507cda4cd12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:52:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b01604ec0336bae6aa21d9b124596bb46f6351b67c413f2fade87febd0887a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b01604ec0336bae6aa21d9b124596bb46f6351b67c413f2fade87febd0887a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:52:03Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3c386dab63ce40e10fc9e5617d1e0c1e87eee2a1148cc64d5faa74542014155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3c386dab63ce40e10fc9e5617d1e0c1e87eee2a1148cc64d5faa74542014155\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:52:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-z6pc7\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:47Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:47 crc kubenswrapper[4919]: I0310 21:52:47.254092 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zv56q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c57707d-1414-4a4a-ac8a-0fadb2fbe5f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4d38859cd120bfb7307a52fd56c1b53490e57164b68da1811475e1046de690a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bm5jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac5ed6a131689d58c1d7867655d836f574591f3bc397d05858cbcfb9748c5107\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bm5jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:52:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zv56q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-03-10T21:52:47Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:47 crc kubenswrapper[4919]: I0310 21:52:47.275586 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d33de8e-9521-40e1-8dda-051e228ca068\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25d19a7d46abf131e552151e3bbb220e3fdf0a3bdb8ff8ca7b082dcc296408c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f30803ec8ed4cbd053df2777bfe3077a7637972562508205711b357011e453dc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T21:51:10Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec 
cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0310 21:50:40.583344 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0310 21:50:40.584309 1 observer_polling.go:159] Starting file observer\\\\nI0310 21:50:40.585315 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0310 21:50:40.586027 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0310 21:51:05.043465 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0310 21:51:10.132930 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0310 21:51:10.133005 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:40Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:51:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3630a7f175a3275eff39088c20eafd059b205f0ccb36cbba2f09b77468963cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://038d23b7c75ae61b55b3b70b5b70de0ca4f3243d0b0a68f8bd221aff91c2c032\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://22b6b89388d9f9288049474f1f88faad36bcbc05564e7769c9fca8c220847efd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:50:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:47Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:47 crc kubenswrapper[4919]: I0310 21:52:47.296476 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:47Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:47 crc kubenswrapper[4919]: I0310 21:52:47.317138 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:47Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:47 crc kubenswrapper[4919]: I0310 21:52:47.338723 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hbw8v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a5db7c3-2a96-4030-8c88-5d82d325b62d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ea0659cf18bee888c2408100c1de192eb8da3991c3158d708c3083d31a61bdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abe6d0aa7236ecb1ecf10432a82e6fd0b3103606dbf07a21f54a1908c77ef697\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T21:52:45Z\\\",\\\"message\\\":\\\"2026-03-10T21:52:00+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_93ad3fa3-cc05-440d-a4ee-07253ec4f77d\\\\n2026-03-10T21:52:00+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_93ad3fa3-cc05-440d-a4ee-07253ec4f77d to /host/opt/cni/bin/\\\\n2026-03-10T21:52:00Z [verbose] multus-daemon started\\\\n2026-03-10T21:52:00Z [verbose] 
Readiness Indicator file check\\\\n2026-03-10T21:52:45Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dwtj4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hbw8v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:47Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:47 crc kubenswrapper[4919]: I0310 21:52:47.357299 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"41f60ccf-532c-42dd-85d3-5cf02206caeb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff32e19d9357f72af1234677cdf04c43d15fcbb5af4faeae6db0aa9fca7e8ef5\\\",\\\"image\\\
":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41e31e0bed022ff2ce3f4869359dfe2d6a25c0039704f26ef5c9b4be0da5b9fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eed85c384d715caea3fe992e40d62b467c4893c865d792f798254701b15735fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29a0037afef2a95ea444616a0317a23a27f5e093c2083b8e13c6ebcec7cb26f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29a0037afef2a95ea444616a0317a23a27f5e093c2083b8e13c6ebcec7cb26f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:14Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:50:13Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:47Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:47 crc kubenswrapper[4919]: I0310 21:52:47.378761 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac8c9a9627b63f7b2a9c80571ca8f781eec442b1fd148631fa417b2e11943437\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18a390bbc216535df32a4dab5fb983494134d2e9f87a689ea39d3e32592ec663\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1
d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:47Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:47 crc kubenswrapper[4919]: I0310 21:52:47.398690 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"566678d1-f416-4116-ab20-b30dceb86cdc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://603ee76064368a216672f45eb860628d301968c311e0bc75b9a73c01f351c9c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrhvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b645dc541f9bef5d9710345252c2ff48e91412f
10d1c0c1bfaa06cf9e82210f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrhvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-z7v4t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:47Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:47 crc kubenswrapper[4919]: I0310 21:52:47.417408 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:47Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:47 crc kubenswrapper[4919]: I0310 21:52:47.431956 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f95c272d21026474ba17da6abc519f0cc1874dbdded3e089a107b23cdd20fa6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T21:52:47Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:47 crc kubenswrapper[4919]: I0310 21:52:47.444334 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hzq7c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e92ce303-b70d-4416-b8f1-520b49dca2e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cc1a7ce601001a487303cfae1cef980407a59cc27d02d4f3c4a303b7668639f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-qw7c6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hzq7c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:47Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:47 crc kubenswrapper[4919]: I0310 21:52:47.467723 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4dp67" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2e7c6fb-9e33-441d-9197-719929eb9e21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad5abb4f4c3ead1b05c1f3efbb6bb59fc71bcf5796e16a558ef7e16375571fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a8f975d3caae031d32429c7fdf67356ff18bae1f0eefa3b99758d6c461a736\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4dd430eed1c68d77dcea63fe5885eec7aae3fee6a0008bd9e145b93aef04aa26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://648a881496440d2732bd49969e863fcde223e95156c280e7699782a5521f1580\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06e9ce48fc10caa34e8b774ce3039117386ffe276cf268d197ac46518a4baf91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9fa1c50eff4c6bb37ea60631a4278c6a15d6398d6fad6fb1a914bd6cccf15b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48b981072e2d3ee5b692dec159c8bcd8cdaba247ff36084b942813175ca23afd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48b981072e2d3ee5b692dec159c8bcd8cdaba247ff36084b942813175ca23afd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T21:52:27Z\\\",\\\"message\\\":\\\"g-operator-metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"78f6184b-c7cf-436d-8cbb-4b31f8af75e8\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-operator-lifecycle-manager/catalog-operator-metrics\\\\\\\"}, 
Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-operator-lifecycle-manager/catalog-operator-metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-operator-lifecycle-manager/catalog-operator-metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.204\\\\\\\", Port:8443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF0310 21:52:27.442701 7191 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:26Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4dp67_openshift-ovn-kubernetes(a2e7c6fb-9e33-441d-9197-719929eb9e21)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81583105305b81fde3bde968fbbd5463ad1100d8f444b22f62764c4676b95325\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ca4b1de4425e9aa53f9cf24a56759708297b223771278a445dcf171f4dd6493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ca4b1de4425e9aa53
f9cf24a56759708297b223771278a445dcf171f4dd6493\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4dp67\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:47Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:47 crc kubenswrapper[4919]: I0310 21:52:47.479420 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ckwhl" Mar 10 21:52:47 crc kubenswrapper[4919]: E0310 21:52:47.479653 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ckwhl" podUID="a95e8b73-ffed-4248-b8ba-99fc7c5b900f" Mar 10 21:52:47 crc kubenswrapper[4919]: I0310 21:52:47.488218 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ckwhl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a95e8b73-ffed-4248-b8ba-99fc7c5b900f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnqnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnqnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:52:11Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ckwhl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:47Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:47 crc 
kubenswrapper[4919]: I0310 21:52:47.506249 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed0693f5-4dbc-4621-9cf6-450d64aaea59\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f338ebc5fc228d07415015c51f7ed4fcc24d5bf76a644e491b5c4b9dc51b71f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\
":\\\"cri-o://a5205e89649c7c1e920c967873bb1a404548b539605a180b02a1c249ff499e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5205e89649c7c1e920c967873bb1a404548b539605a180b02a1c249ff499e32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:50:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:47Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:47 crc kubenswrapper[4919]: I0310 21:52:47.535022 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5163ae00-7b50-497d-9770-0d787026b436\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://322af039c736bcf0b853ee5527ebb1b1750484dfab074745abcd75c24fdcccbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bfa2696bd9b6e5d247686e5297b6ae2f49e5b216174391f211cb2a3a4966135\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c23aaef6f076ab2a428323d38fac48e0c55ad52c55a46c942bccad06474fd31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a88714c27822fd18dff500c973b9d548414d59c7666de938e3cb0c6b18e277e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67063544d268ca488af7ae401113e6f35bb48688e50f944cfa03360de376611a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc6d32c621a9826f8539d4c1806a83f694eaa8874384bd615c2fa1cee02d2e98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc6d32c621a9826f8539d4c1806a83f694eaa8874384bd615c2fa1cee02d2e98\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T21:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8aa3a4aef61b7421411f28beecceab476e43c52eb51ac2a257399b01e4c2bcb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8aa3a4aef61b7421411f28beecceab476e43c52eb51ac2a257399b01e4c2bcb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2126875e4bacb36f72de8a3a8bd7c799b89e7691a83fd39d7645a97e95db91b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2126875e4bacb36f72de8a3a8bd7c799b89e7691a83fd39d7645a97e95db91b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:50:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:47Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:47 crc kubenswrapper[4919]: I0310 21:52:47.557346 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9ed1501-15da-4419-aa12-171e610438d6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9e6a8efa1e2d16b45fe6362b326e3f89333864dc74f3b298d2e500a90d303b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37d8507fd02b92972ed41aa2c4d53fceb1c9d58864e46ddc7991f94fb4d9b3e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfb03c5f450790952fc7173bc2a6d723c777921f5f74963bfdbc3573ec1d21cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce192a4f3e94d00998fbfe0948a32765574a9261d22004480dfb54b9bbf9407a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b2adcae0bf01d646b97b915a28921ad4151ca62e8bdd174b42b5e3dff4b27db\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T21:51:14Z\\\",\\\"message\\\":\\\"file observer\\\\nW0310 21:51:14.180982 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 21:51:14.181120 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 21:51:14.182146 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-915015507/tls.crt::/tmp/serving-cert-915015507/tls.key\\\\\\\"\\\\nI0310 21:51:14.490188 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 21:51:14.496972 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 21:51:14.497009 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 21:51:14.497047 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 21:51:14.497058 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 21:51:14.505682 1 
secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 21:51:14.505718 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 21:51:14.505729 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 21:51:14.505740 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0310 21:51:14.505737 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 21:51:14.505748 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 21:51:14.505777 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 21:51:14.505783 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 21:51:14.508219 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T21:51:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47a772db349df6c0c6fe27be93d19e02d66cfaf9739ee12e89730ece1da11473\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15b81f8be7635a10cf516cd17c9ab3d48b74d3421098124fcc47dbab6691e3c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15b81f8be7635a10cf516cd17c9ab3d48b74d3421098124fcc47dbab6691e3c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:50:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:47Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:48 crc kubenswrapper[4919]: I0310 21:52:48.479105 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 21:52:48 crc kubenswrapper[4919]: I0310 21:52:48.479138 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 21:52:48 crc kubenswrapper[4919]: E0310 21:52:48.479292 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 21:52:48 crc kubenswrapper[4919]: I0310 21:52:48.479426 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 21:52:48 crc kubenswrapper[4919]: E0310 21:52:48.479574 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 21:52:48 crc kubenswrapper[4919]: E0310 21:52:48.479765 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 21:52:48 crc kubenswrapper[4919]: E0310 21:52:48.588608 4919 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 10 21:52:49 crc kubenswrapper[4919]: I0310 21:52:49.479821 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ckwhl" Mar 10 21:52:49 crc kubenswrapper[4919]: E0310 21:52:49.479966 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ckwhl" podUID="a95e8b73-ffed-4248-b8ba-99fc7c5b900f" Mar 10 21:52:50 crc kubenswrapper[4919]: I0310 21:52:50.479515 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 21:52:50 crc kubenswrapper[4919]: I0310 21:52:50.479550 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 21:52:50 crc kubenswrapper[4919]: I0310 21:52:50.479556 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 21:52:50 crc kubenswrapper[4919]: E0310 21:52:50.479728 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 21:52:50 crc kubenswrapper[4919]: E0310 21:52:50.479877 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 21:52:50 crc kubenswrapper[4919]: E0310 21:52:50.479992 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 21:52:51 crc kubenswrapper[4919]: I0310 21:52:51.479378 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-ckwhl" Mar 10 21:52:51 crc kubenswrapper[4919]: E0310 21:52:51.479700 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ckwhl" podUID="a95e8b73-ffed-4248-b8ba-99fc7c5b900f" Mar 10 21:52:52 crc kubenswrapper[4919]: I0310 21:52:52.479381 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 21:52:52 crc kubenswrapper[4919]: I0310 21:52:52.479488 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 21:52:52 crc kubenswrapper[4919]: I0310 21:52:52.479556 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 21:52:52 crc kubenswrapper[4919]: E0310 21:52:52.479680 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 21:52:52 crc kubenswrapper[4919]: E0310 21:52:52.479821 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 21:52:52 crc kubenswrapper[4919]: E0310 21:52:52.480006 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 21:52:53 crc kubenswrapper[4919]: I0310 21:52:53.273830 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:52:53 crc kubenswrapper[4919]: I0310 21:52:53.273880 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:52:53 crc kubenswrapper[4919]: I0310 21:52:53.273899 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:52:53 crc kubenswrapper[4919]: I0310 21:52:53.273929 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:52:53 crc kubenswrapper[4919]: I0310 21:52:53.273947 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:52:53Z","lastTransitionTime":"2026-03-10T21:52:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:52:53 crc kubenswrapper[4919]: E0310 21:52:53.294712 4919 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:52:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:52:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:52:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:52:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c22d31cd-a51d-4524-bb69-0b454ae09e98\\\",\\\"systemUUID\\\":\\\"eb24d1fd-ecd7-423c-90f7-cacacceb5386\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:53Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:53 crc kubenswrapper[4919]: I0310 21:52:53.298969 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:52:53 crc kubenswrapper[4919]: I0310 21:52:53.299023 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:52:53 crc kubenswrapper[4919]: I0310 21:52:53.299040 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:52:53 crc kubenswrapper[4919]: I0310 21:52:53.299065 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:52:53 crc kubenswrapper[4919]: I0310 21:52:53.299082 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:52:53Z","lastTransitionTime":"2026-03-10T21:52:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:52:53 crc kubenswrapper[4919]: E0310 21:52:53.318440 4919 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:52:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:52:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:52:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:52:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c22d31cd-a51d-4524-bb69-0b454ae09e98\\\",\\\"systemUUID\\\":\\\"eb24d1fd-ecd7-423c-90f7-cacacceb5386\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:53Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:53 crc kubenswrapper[4919]: I0310 21:52:53.322747 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:52:53 crc kubenswrapper[4919]: I0310 21:52:53.322961 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:52:53 crc kubenswrapper[4919]: I0310 21:52:53.323100 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:52:53 crc kubenswrapper[4919]: I0310 21:52:53.323259 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:52:53 crc kubenswrapper[4919]: I0310 21:52:53.323438 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:52:53Z","lastTransitionTime":"2026-03-10T21:52:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:52:53 crc kubenswrapper[4919]: E0310 21:52:53.337849 4919 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:52:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:52:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:52:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:52:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c22d31cd-a51d-4524-bb69-0b454ae09e98\\\",\\\"systemUUID\\\":\\\"eb24d1fd-ecd7-423c-90f7-cacacceb5386\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:53Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:53 crc kubenswrapper[4919]: I0310 21:52:53.342380 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:52:53 crc kubenswrapper[4919]: I0310 21:52:53.342463 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:52:53 crc kubenswrapper[4919]: I0310 21:52:53.342483 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:52:53 crc kubenswrapper[4919]: I0310 21:52:53.342868 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:52:53 crc kubenswrapper[4919]: I0310 21:52:53.342927 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:52:53Z","lastTransitionTime":"2026-03-10T21:52:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:52:53 crc kubenswrapper[4919]: E0310 21:52:53.384479 4919 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 10 21:52:53 crc kubenswrapper[4919]: I0310 21:52:53.479534 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ckwhl" Mar 10 21:52:53 crc kubenswrapper[4919]: E0310 21:52:53.479763 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ckwhl" podUID="a95e8b73-ffed-4248-b8ba-99fc7c5b900f" Mar 10 21:52:53 crc kubenswrapper[4919]: I0310 21:52:53.501311 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:53Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:53 crc kubenswrapper[4919]: I0310 21:52:53.520706 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f95c272d21026474ba17da6abc519f0cc1874dbdded3e089a107b23cdd20fa6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T21:52:53Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:53 crc kubenswrapper[4919]: I0310 21:52:53.537929 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hzq7c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e92ce303-b70d-4416-b8f1-520b49dca2e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cc1a7ce601001a487303cfae1cef980407a59cc27d02d4f3c4a303b7668639f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-qw7c6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hzq7c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:53Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:53 crc kubenswrapper[4919]: I0310 21:52:53.570130 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4dp67" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2e7c6fb-9e33-441d-9197-719929eb9e21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad5abb4f4c3ead1b05c1f3efbb6bb59fc71bcf5796e16a558ef7e16375571fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a8f975d3caae031d32429c7fdf67356ff18bae1f0eefa3b99758d6c461a736\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4dd430eed1c68d77dcea63fe5885eec7aae3fee6a0008bd9e145b93aef04aa26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://648a881496440d2732bd49969e863fcde223e95156c280e7699782a5521f1580\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06e9ce48fc10caa34e8b774ce3039117386ffe276cf268d197ac46518a4baf91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9fa1c50eff4c6bb37ea60631a4278c6a15d6398d6fad6fb1a914bd6cccf15b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48b981072e2d3ee5b692dec159c8bcd8cdaba247ff36084b942813175ca23afd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48b981072e2d3ee5b692dec159c8bcd8cdaba247ff36084b942813175ca23afd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T21:52:27Z\\\",\\\"message\\\":\\\"g-operator-metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"78f6184b-c7cf-436d-8cbb-4b31f8af75e8\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-operator-lifecycle-manager/catalog-operator-metrics\\\\\\\"}, 
Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-operator-lifecycle-manager/catalog-operator-metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-operator-lifecycle-manager/catalog-operator-metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.204\\\\\\\", Port:8443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF0310 21:52:27.442701 7191 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:26Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4dp67_openshift-ovn-kubernetes(a2e7c6fb-9e33-441d-9197-719929eb9e21)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81583105305b81fde3bde968fbbd5463ad1100d8f444b22f62764c4676b95325\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ca4b1de4425e9aa53f9cf24a56759708297b223771278a445dcf171f4dd6493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ca4b1de4425e9aa53
f9cf24a56759708297b223771278a445dcf171f4dd6493\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4dp67\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:53Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:53 crc kubenswrapper[4919]: E0310 21:52:53.590225 4919 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 10 21:52:53 crc kubenswrapper[4919]: I0310 21:52:53.592554 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ckwhl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a95e8b73-ffed-4248-b8ba-99fc7c5b900f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnqnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnqnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:52:11Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ckwhl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:53Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:53 crc 
kubenswrapper[4919]: I0310 21:52:53.609552 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed0693f5-4dbc-4621-9cf6-450d64aaea59\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f338ebc5fc228d07415015c51f7ed4fcc24d5bf76a644e491b5c4b9dc51b71f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\
":\\\"cri-o://a5205e89649c7c1e920c967873bb1a404548b539605a180b02a1c249ff499e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5205e89649c7c1e920c967873bb1a404548b539605a180b02a1c249ff499e32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:50:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:53Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:53 crc kubenswrapper[4919]: I0310 21:52:53.644950 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5163ae00-7b50-497d-9770-0d787026b436\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://322af039c736bcf0b853ee5527ebb1b1750484dfab074745abcd75c24fdcccbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bfa2696bd9b6e5d247686e5297b6ae2f49e5b216174391f211cb2a3a4966135\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c23aaef6f076ab2a428323d38fac48e0c55ad52c55a46c942bccad06474fd31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a88714c27822fd18dff500c973b9d548414d59c7666de938e3cb0c6b18e277e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67063544d268ca488af7ae401113e6f35bb48688e50f944cfa03360de376611a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc6d32c621a9826f8539d4c1806a83f694eaa8874384bd615c2fa1cee02d2e98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc6d32c621a9826f8539d4c1806a83f694eaa8874384bd615c2fa1cee02d2e98\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T21:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8aa3a4aef61b7421411f28beecceab476e43c52eb51ac2a257399b01e4c2bcb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8aa3a4aef61b7421411f28beecceab476e43c52eb51ac2a257399b01e4c2bcb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2126875e4bacb36f72de8a3a8bd7c799b89e7691a83fd39d7645a97e95db91b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2126875e4bacb36f72de8a3a8bd7c799b89e7691a83fd39d7645a97e95db91b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:50:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:53Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:53 crc kubenswrapper[4919]: I0310 21:52:53.668031 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9ed1501-15da-4419-aa12-171e610438d6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9e6a8efa1e2d16b45fe6362b326e3f89333864dc74f3b298d2e500a90d303b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37d8507fd02b92972ed41aa2c4d53fceb1c9d58864e46ddc7991f94fb4d9b3e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfb03c5f450790952fc7173bc2a6d723c777921f5f74963bfdbc3573ec1d21cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce192a4f3e94d00998fbfe0948a32765574a9261d22004480dfb54b9bbf9407a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b2adcae0bf01d646b97b915a28921ad4151ca62e8bdd174b42b5e3dff4b27db\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T21:51:14Z\\\",\\\"message\\\":\\\"file observer\\\\nW0310 21:51:14.180982 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 21:51:14.181120 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 21:51:14.182146 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-915015507/tls.crt::/tmp/serving-cert-915015507/tls.key\\\\\\\"\\\\nI0310 21:51:14.490188 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 21:51:14.496972 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 21:51:14.497009 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 21:51:14.497047 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 21:51:14.497058 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 21:51:14.505682 1 
secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 21:51:14.505718 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 21:51:14.505729 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 21:51:14.505740 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0310 21:51:14.505737 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 21:51:14.505748 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 21:51:14.505777 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 21:51:14.505783 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 21:51:14.508219 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T21:51:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47a772db349df6c0c6fe27be93d19e02d66cfaf9739ee12e89730ece1da11473\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15b81f8be7635a10cf516cd17c9ab3d48b74d3421098124fcc47dbab6691e3c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15b81f8be7635a10cf516cd17c9ab3d48b74d3421098124fcc47dbab6691e3c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:50:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:53Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:53 crc kubenswrapper[4919]: I0310 21:52:53.690721 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://937344f02f0259b0d258de35d490545dad0ce084dd49c7584002da0734cc046e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:53Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:53 crc kubenswrapper[4919]: I0310 21:52:53.705735 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b625p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b82448f1-4387-4d1a-a300-29f4b3d86bbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5edef5b10597e404ec5599983d07529ee344e7d7f5b1a4a7f589678613034b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\
",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s9q9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:52:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b625p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:53Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:53 crc kubenswrapper[4919]: I0310 21:52:53.729368 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-z6pc7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ace27ab-c4c7-412b-9ae8-a3e4ff15faec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://607c25e23101a157124cb81f984fac6d36e71a08b7d990e1d11627f3a7de24b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5636b870a2c5e13abd72ff1e7c785363d11540245c14af6f5de75a0b400a6aa\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5636b870a2c5e13abd72ff1e7c785363d11540245c14af6f5de75a0b400a6aa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7cd0e8267d82b65679954a7b935e138a06a6634311569a48268ba5917d6cb7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7cd0e8267d82b65679954a7b935e138a06a6634311569a48268ba5917d6cb7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:52:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:00Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4b4f2cd55ec546c6f44490159ec5b303d67b584fce7afb4b6fe3dd93e95d843\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4b4f2cd55ec546c6f44490159ec5b303d67b584fce7afb4b6fe3dd93e95d843\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:52:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cebcf
235b45b59a23c89f44855f381557a49fc94ec9332c699274507cda4cd12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cebcf235b45b59a23c89f44855f381557a49fc94ec9332c699274507cda4cd12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:52:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b01604ec0336bae6aa21d9b124596bb46f6351b67c413f2fade87febd0887a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b01604ec0336bae6aa21d9b124596bb46f6351b67c413f2fade87febd0887a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:52:03Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3c386dab63ce40e10fc9e5617d1e0c1e87eee2a1148cc64d5faa74542014155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3c386dab63ce40e10fc9e5617d1e0c1e87eee2a1148cc64d5faa74542014155\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:52:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-z6pc7\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:53Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:53 crc kubenswrapper[4919]: I0310 21:52:53.746709 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zv56q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c57707d-1414-4a4a-ac8a-0fadb2fbe5f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4d38859cd120bfb7307a52fd56c1b53490e57164b68da1811475e1046de690a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bm5jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac5ed6a131689d58c1d7867655d836f574591f3bc397d05858cbcfb9748c5107\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bm5jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:52:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zv56q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-03-10T21:52:53Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:53 crc kubenswrapper[4919]: I0310 21:52:53.762879 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d33de8e-9521-40e1-8dda-051e228ca068\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25d19a7d46abf131e552151e3bbb220e3fdf0a3bdb8ff8ca7b082dcc296408c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f30803ec8ed4cbd053df2777bfe3077a7637972562508205711b357011e453dc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T21:51:10Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec 
cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0310 21:50:40.583344 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0310 21:50:40.584309 1 observer_polling.go:159] Starting file observer\\\\nI0310 21:50:40.585315 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0310 21:50:40.586027 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0310 21:51:05.043465 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0310 21:51:10.132930 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0310 21:51:10.133005 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:40Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:51:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3630a7f175a3275eff39088c20eafd059b205f0ccb36cbba2f09b77468963cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://038d23b7c75ae61b55b3b70b5b70de0ca4f3243d0b0a68f8bd221aff91c2c032\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://22b6b89388d9f9288049474f1f88faad36bcbc05564e7769c9fca8c220847efd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:50:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:53Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:53 crc kubenswrapper[4919]: I0310 21:52:53.781134 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:53Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:53 crc kubenswrapper[4919]: I0310 21:52:53.795515 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:53Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:53 crc kubenswrapper[4919]: I0310 21:52:53.815127 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hbw8v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a5db7c3-2a96-4030-8c88-5d82d325b62d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ea0659cf18bee888c2408100c1de192eb8da3991c3158d708c3083d31a61bdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abe6d0aa7236ecb1ecf10432a82e6fd0b3103606dbf07a21f54a1908c77ef697\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T21:52:45Z\\\",\\\"message\\\":\\\"2026-03-10T21:52:00+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_93ad3fa3-cc05-440d-a4ee-07253ec4f77d\\\\n2026-03-10T21:52:00+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_93ad3fa3-cc05-440d-a4ee-07253ec4f77d to /host/opt/cni/bin/\\\\n2026-03-10T21:52:00Z [verbose] multus-daemon started\\\\n2026-03-10T21:52:00Z [verbose] 
Readiness Indicator file check\\\\n2026-03-10T21:52:45Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dwtj4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hbw8v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:53Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:53 crc kubenswrapper[4919]: I0310 21:52:53.831346 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"41f60ccf-532c-42dd-85d3-5cf02206caeb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff32e19d9357f72af1234677cdf04c43d15fcbb5af4faeae6db0aa9fca7e8ef5\\\",\\\"image\\\
":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41e31e0bed022ff2ce3f4869359dfe2d6a25c0039704f26ef5c9b4be0da5b9fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eed85c384d715caea3fe992e40d62b467c4893c865d792f798254701b15735fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29a0037afef2a95ea444616a0317a23a27f5e093c2083b8e13c6ebcec7cb26f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29a0037afef2a95ea444616a0317a23a27f5e093c2083b8e13c6ebcec7cb26f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:14Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:50:13Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:53Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:53 crc kubenswrapper[4919]: I0310 21:52:53.848018 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac8c9a9627b63f7b2a9c80571ca8f781eec442b1fd148631fa417b2e11943437\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18a390bbc216535df32a4dab5fb983494134d2e9f87a689ea39d3e32592ec663\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1
d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:53Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:53 crc kubenswrapper[4919]: I0310 21:52:53.861928 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"566678d1-f416-4116-ab20-b30dceb86cdc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://603ee76064368a216672f45eb860628d301968c311e0bc75b9a73c01f351c9c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrhvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b645dc541f9bef5d9710345252c2ff48e91412f
10d1c0c1bfaa06cf9e82210f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrhvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-z7v4t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:53Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:54 crc kubenswrapper[4919]: I0310 21:52:54.479351 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 21:52:54 crc kubenswrapper[4919]: I0310 21:52:54.479387 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 21:52:54 crc kubenswrapper[4919]: E0310 21:52:54.479993 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 21:52:54 crc kubenswrapper[4919]: I0310 21:52:54.480026 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 21:52:54 crc kubenswrapper[4919]: E0310 21:52:54.480574 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 21:52:54 crc kubenswrapper[4919]: E0310 21:52:54.480514 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 21:52:54 crc kubenswrapper[4919]: I0310 21:52:54.481031 4919 scope.go:117] "RemoveContainer" containerID="48b981072e2d3ee5b692dec159c8bcd8cdaba247ff36084b942813175ca23afd" Mar 10 21:52:55 crc kubenswrapper[4919]: I0310 21:52:55.212810 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4dp67_a2e7c6fb-9e33-441d-9197-719929eb9e21/ovnkube-controller/2.log" Mar 10 21:52:55 crc kubenswrapper[4919]: I0310 21:52:55.215859 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4dp67" event={"ID":"a2e7c6fb-9e33-441d-9197-719929eb9e21","Type":"ContainerStarted","Data":"8424a944ffb95aa4e069024df52cf69f2381dc0498735572e7cf94519fe6d880"} Mar 10 21:52:55 crc kubenswrapper[4919]: I0310 21:52:55.216618 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-4dp67" Mar 10 21:52:55 crc kubenswrapper[4919]: I0310 21:52:55.232689 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ckwhl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a95e8b73-ffed-4248-b8ba-99fc7c5b900f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:11Z\\\",\\\"message\\\":\\\"containers with 
unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnqnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnqnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:52:11Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ckwhl\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:55Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:55 crc kubenswrapper[4919]: I0310 21:52:55.248264 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed0693f5-4dbc-4621-9cf6-450d64aaea59\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f338ebc5fc228d07415015c51f7ed4fcc24d5bf76a644e491b5c4b9dc51b71f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"volumeMo
unts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5205e89649c7c1e920c967873bb1a404548b539605a180b02a1c249ff499e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5205e89649c7c1e920c967873bb1a404548b539605a180b02a1c249ff499e32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:50:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:55Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:55 crc kubenswrapper[4919]: I0310 21:52:55.275044 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5163ae00-7b50-497d-9770-0d787026b436\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://322af039c736bcf0b853ee5527ebb1b1750484dfab074745abcd75c24fdcccbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bfa2696bd9b6e5d247686e5297b6ae2f49e5b216174391f211cb2a3a4966135\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c23aaef6f076ab2a428323d38fac48e0c55ad52c55a46c942bccad06474fd31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a88714c27822fd18dff500c973b9d548414d59c7666de938e3cb0c6b18e277e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67063544d268ca488af7ae401113e6f35bb48688e50f944cfa03360de376611a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc6d32c621a9826f8539d4c1806a83f694eaa8874384bd615c2fa1cee02d2e98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc6d32c621a9826f8539d4c1806a83f694eaa8874384bd615c2fa1cee02d2e98\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T21:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8aa3a4aef61b7421411f28beecceab476e43c52eb51ac2a257399b01e4c2bcb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8aa3a4aef61b7421411f28beecceab476e43c52eb51ac2a257399b01e4c2bcb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2126875e4bacb36f72de8a3a8bd7c799b89e7691a83fd39d7645a97e95db91b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2126875e4bacb36f72de8a3a8bd7c799b89e7691a83fd39d7645a97e95db91b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:50:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:55Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:55 crc kubenswrapper[4919]: I0310 21:52:55.290458 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9ed1501-15da-4419-aa12-171e610438d6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9e6a8efa1e2d16b45fe6362b326e3f89333864dc74f3b298d2e500a90d303b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37d8507fd02b92972ed41aa2c4d53fceb1c9d58864e46ddc7991f94fb4d9b3e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfb03c5f450790952fc7173bc2a6d723c777921f5f74963bfdbc3573ec1d21cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce192a4f3e94d00998fbfe0948a32765574a9261d22004480dfb54b9bbf9407a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b2adcae0bf01d646b97b915a28921ad4151ca62e8bdd174b42b5e3dff4b27db\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T21:51:14Z\\\",\\\"message\\\":\\\"file observer\\\\nW0310 21:51:14.180982 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 21:51:14.181120 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 21:51:14.182146 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-915015507/tls.crt::/tmp/serving-cert-915015507/tls.key\\\\\\\"\\\\nI0310 21:51:14.490188 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 21:51:14.496972 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 21:51:14.497009 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 21:51:14.497047 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 21:51:14.497058 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 21:51:14.505682 1 
secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 21:51:14.505718 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 21:51:14.505729 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 21:51:14.505740 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0310 21:51:14.505737 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 21:51:14.505748 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 21:51:14.505777 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 21:51:14.505783 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 21:51:14.508219 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T21:51:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47a772db349df6c0c6fe27be93d19e02d66cfaf9739ee12e89730ece1da11473\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15b81f8be7635a10cf516cd17c9ab3d48b74d3421098124fcc47dbab6691e3c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15b81f8be7635a10cf516cd17c9ab3d48b74d3421098124fcc47dbab6691e3c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:50:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:55Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:55 crc kubenswrapper[4919]: I0310 21:52:55.304096 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:55Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:55 crc kubenswrapper[4919]: I0310 21:52:55.318342 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f95c272d21026474ba17da6abc519f0cc1874dbdded3e089a107b23cdd20fa6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T21:52:55Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:55 crc kubenswrapper[4919]: I0310 21:52:55.330680 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hzq7c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e92ce303-b70d-4416-b8f1-520b49dca2e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cc1a7ce601001a487303cfae1cef980407a59cc27d02d4f3c4a303b7668639f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-qw7c6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hzq7c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:55Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:55 crc kubenswrapper[4919]: I0310 21:52:55.363652 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4dp67" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2e7c6fb-9e33-441d-9197-719929eb9e21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad5abb4f4c3ead1b05c1f3efbb6bb59fc71bcf5796e16a558ef7e16375571fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a8f975d3caae031d32429c7fdf67356ff18bae1f0eefa3b99758d6c461a736\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4dd430eed1c68d77dcea63fe5885eec7aae3fee6a0008bd9e145b93aef04aa26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://648a881496440d2732bd49969e863fcde223e95156c280e7699782a5521f1580\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06e9ce48fc10caa34e8b774ce3039117386ffe276cf268d197ac46518a4baf91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9fa1c50eff4c6bb37ea60631a4278c6a15d6398d6fad6fb1a914bd6cccf15b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8424a944ffb95aa4e069024df52cf69f2381dc0498735572e7cf94519fe6d880\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48b981072e2d3ee5b692dec159c8bcd8cdaba247ff36084b942813175ca23afd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T21:52:27Z\\\",\\\"message\\\":\\\"g-operator-metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"78f6184b-c7cf-436d-8cbb-4b31f8af75e8\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-operator-lifecycle-manager/catalog-operator-metrics\\\\\\\"}, 
Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-operator-lifecycle-manager/catalog-operator-metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-operator-lifecycle-manager/catalog-operator-metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.204\\\\\\\", Port:8443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF0310 21:52:27.442701 7191 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:26Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"nam
e\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81583105305b81fde3bde968fbbd5463ad1100d8f444b22f62764c4676b95325\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ca4b1de4425e9aa53f9cf24a56759708297b223771278a445dcf171f4dd6493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ca4b1de4425e9aa53f9cf24a56759708297b223771278a445dcf171f4dd6493\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4dp67\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:55Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:55 crc kubenswrapper[4919]: I0310 21:52:55.379469 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://937344f02f0259b0d258de35d490545dad0ce084dd49c7584002da0734cc046e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:55Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:55 crc kubenswrapper[4919]: I0310 21:52:55.393843 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b625p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b82448f1-4387-4d1a-a300-29f4b3d86bbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5edef5b10597e404ec5599983d07529ee344e7d7f5b1a4a7f589678613034b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\
",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s9q9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:52:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b625p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:55Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:55 crc kubenswrapper[4919]: I0310 21:52:55.412796 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d33de8e-9521-40e1-8dda-051e228ca068\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25d19a7d46abf131e552151e3bbb220e3fdf0a3bdb8ff8ca7b082dcc296408c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f30803ec8ed4cbd053df2777bfe3077a7637972562508205711b357011e453dc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T21:51:10Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0310 21:50:40.583344 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0310 21:50:40.584309 1 observer_polling.go:159] Starting file observer\\\\nI0310 21:50:40.585315 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0310 21:50:40.586027 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0310 21:51:05.043465 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0310 21:51:10.132930 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0310 21:51:10.133005 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:40Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:51:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3630a7f175a3275eff39088c20eafd059b205f0ccb36cbba2f09b77468963cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://038d23b7c75ae61b55b3b70b5b70de0ca4f3243d0b0a68f8bd221aff91c2c032\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://22b6b89388d9f9288049474f1f88faad36bcbc05564e7769c9fca8c220847efd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:50:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:55Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:55 crc kubenswrapper[4919]: I0310 21:52:55.429305 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:55Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:55 crc kubenswrapper[4919]: I0310 21:52:55.444138 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:55Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:55 crc kubenswrapper[4919]: I0310 21:52:55.463234 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-z6pc7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ace27ab-c4c7-412b-9ae8-a3e4ff15faec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://607c25e23101a157124cb81f984fac6d36e71a08b7d990e1d11627f3a7de24b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5636b870a2c5e13abd72ff1e7c785363d11540245c14af6f5de75a0b400a6aa\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5636b870a2c5e13abd72ff1e7c785363d11540245c14af6f5de75a0b400a6aa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7cd0e8267d82b65679954a7b935e138a06a6634311569a48268ba5917d6cb7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7cd0e8267d82b65679954a7b935e138a06a6634311569a48268ba5917d6cb7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:52:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:00Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4b4f2cd55ec546c6f44490159ec5b303d67b584fce7afb4b6fe3dd93e95d843\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4b4f2cd55ec546c6f44490159ec5b303d67b584fce7afb4b6fe3dd93e95d843\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:52:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cebcf
235b45b59a23c89f44855f381557a49fc94ec9332c699274507cda4cd12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cebcf235b45b59a23c89f44855f381557a49fc94ec9332c699274507cda4cd12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:52:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b01604ec0336bae6aa21d9b124596bb46f6351b67c413f2fade87febd0887a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b01604ec0336bae6aa21d9b124596bb46f6351b67c413f2fade87febd0887a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:52:03Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3c386dab63ce40e10fc9e5617d1e0c1e87eee2a1148cc64d5faa74542014155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3c386dab63ce40e10fc9e5617d1e0c1e87eee2a1148cc64d5faa74542014155\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:52:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-z6pc7\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:55Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:55 crc kubenswrapper[4919]: I0310 21:52:55.477455 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zv56q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c57707d-1414-4a4a-ac8a-0fadb2fbe5f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4d38859cd120bfb7307a52fd56c1b53490e57164b68da1811475e1046de690a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bm5jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac5ed6a131689d58c1d7867655d836f574591f3bc397d05858cbcfb9748c5107\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bm5jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:52:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zv56q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-03-10T21:52:55Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:55 crc kubenswrapper[4919]: I0310 21:52:55.479761 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ckwhl" Mar 10 21:52:55 crc kubenswrapper[4919]: E0310 21:52:55.479970 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ckwhl" podUID="a95e8b73-ffed-4248-b8ba-99fc7c5b900f" Mar 10 21:52:55 crc kubenswrapper[4919]: I0310 21:52:55.491642 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"41f60ccf-532c-42dd-85d3-5cf02206caeb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff32e19d9357f72af1234677cdf04c43d15fcbb5af4faeae6db0aa9fca7e8ef5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220
d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41e31e0bed022ff2ce3f4869359dfe2d6a25c0039704f26ef5c9b4be0da5b9fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eed85c384d715caea3fe992e40d62b467c4893c865d792f798254701b15735fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\
"2026-03-10T21:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29a0037afef2a95ea444616a0317a23a27f5e093c2083b8e13c6ebcec7cb26f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29a0037afef2a95ea444616a0317a23a27f5e093c2083b8e13c6ebcec7cb26f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:14Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:50:13Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:55Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:55 crc kubenswrapper[4919]: I0310 21:52:55.505503 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac8c9a9627b63f7b2a9c80571ca8f781eec442b1fd148631fa417b2e11943437\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18a390bbc216535df32a4dab5fb983494134d2e9f87a689ea39d3e32592ec663\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:55Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:55 crc kubenswrapper[4919]: I0310 21:52:55.519903 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"566678d1-f416-4116-ab20-b30dceb86cdc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://603ee76064368a216672f45eb860628d301968c311e0bc75b9a73c01f351c9c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrhvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b645dc541f9bef5d9710345252c2ff48e91412f
10d1c0c1bfaa06cf9e82210f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrhvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-z7v4t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:55Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:55 crc kubenswrapper[4919]: I0310 21:52:55.536940 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hbw8v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a5db7c3-2a96-4030-8c88-5d82d325b62d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ea0659cf18bee888c2408100c1de192eb8da3991c3158d708c3083d31a61bdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abe6d0aa7236ecb1ecf10432a82e6fd0b3103606dbf07a21f54a1908c77ef697\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T21:52:45Z\\\",\\\"message\\\":\\\"2026-03-10T21:52:00+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_93ad3fa3-cc05-440d-a4ee-07253ec4f77d\\\\n2026-03-10T21:52:00+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_93ad3fa3-cc05-440d-a4ee-07253ec4f77d to /host/opt/cni/bin/\\\\n2026-03-10T21:52:00Z [verbose] multus-daemon started\\\\n2026-03-10T21:52:00Z [verbose] 
Readiness Indicator file check\\\\n2026-03-10T21:52:45Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dwtj4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hbw8v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:55Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:56 crc kubenswrapper[4919]: I0310 21:52:56.223572 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4dp67_a2e7c6fb-9e33-441d-9197-719929eb9e21/ovnkube-controller/3.log" Mar 10 21:52:56 crc kubenswrapper[4919]: I0310 21:52:56.224749 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4dp67_a2e7c6fb-9e33-441d-9197-719929eb9e21/ovnkube-controller/2.log" Mar 10 21:52:56 crc kubenswrapper[4919]: I0310 21:52:56.230584 4919 generic.go:334] "Generic (PLEG): container finished" podID="a2e7c6fb-9e33-441d-9197-719929eb9e21" containerID="8424a944ffb95aa4e069024df52cf69f2381dc0498735572e7cf94519fe6d880" exitCode=1 Mar 10 21:52:56 crc kubenswrapper[4919]: I0310 21:52:56.230635 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4dp67" event={"ID":"a2e7c6fb-9e33-441d-9197-719929eb9e21","Type":"ContainerDied","Data":"8424a944ffb95aa4e069024df52cf69f2381dc0498735572e7cf94519fe6d880"} Mar 10 21:52:56 crc kubenswrapper[4919]: I0310 21:52:56.230685 4919 scope.go:117] "RemoveContainer" 
containerID="48b981072e2d3ee5b692dec159c8bcd8cdaba247ff36084b942813175ca23afd" Mar 10 21:52:56 crc kubenswrapper[4919]: I0310 21:52:56.232013 4919 scope.go:117] "RemoveContainer" containerID="8424a944ffb95aa4e069024df52cf69f2381dc0498735572e7cf94519fe6d880" Mar 10 21:52:56 crc kubenswrapper[4919]: E0310 21:52:56.232350 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-4dp67_openshift-ovn-kubernetes(a2e7c6fb-9e33-441d-9197-719929eb9e21)\"" pod="openshift-ovn-kubernetes/ovnkube-node-4dp67" podUID="a2e7c6fb-9e33-441d-9197-719929eb9e21" Mar 10 21:52:56 crc kubenswrapper[4919]: I0310 21:52:56.256162 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://937344f02f0259b0d258de35d490545dad0ce084dd49c7584002da0734cc046e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"
lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:56Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:56 crc kubenswrapper[4919]: I0310 21:52:56.273193 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b625p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b82448f1-4387-4d1a-a300-29f4b3d86bbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5edef5b10597e404ec5599983d07529ee344e7d7f5b1a4a7f589678613034b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s9q9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:52:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b625p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:56Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:56 crc kubenswrapper[4919]: I0310 21:52:56.294217 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d33de8e-9521-40e1-8dda-051e228ca068\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25d19a7d46abf131e552151e3bbb220e3fdf0a3bdb8ff8ca7b082dcc296408c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee8
8051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f30803ec8ed4cbd053df2777bfe3077a7637972562508205711b357011e453dc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T21:51:10Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0310 21:50:40.583344 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0310 21:50:40.584309 1 observer_polling.go:159] Starting file observer\\\\nI0310 21:50:40.585315 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0310 21:50:40.586027 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0310 21:51:05.043465 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0310 21:51:10.132930 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0310 21:51:10.133005 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:40Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:51:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3630a7f175a3275eff39088c20eafd059b205f0ccb36cbba2f09b77468963cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://038d23b7c75ae61b55b3b70b5b70de0ca4f3243d0b0a68f8bd221aff91c2c032\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://22b6b89388d9f9288049474f1f88faad36bcbc05564e7769c9fca8c220847efd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:50:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:56Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:56 crc kubenswrapper[4919]: I0310 21:52:56.313618 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:56Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:56 crc kubenswrapper[4919]: I0310 21:52:56.334326 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:56Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:56 crc kubenswrapper[4919]: I0310 21:52:56.358621 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-z6pc7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ace27ab-c4c7-412b-9ae8-a3e4ff15faec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://607c25e23101a157124cb81f984fac6d36e71a08b7d990e1d11627f3a7de24b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5636b870a2c5e13abd72ff1e7c785363d11540245c14af6f5de75a0b400a6aa\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5636b870a2c5e13abd72ff1e7c785363d11540245c14af6f5de75a0b400a6aa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7cd0e8267d82b65679954a7b935e138a06a6634311569a48268ba5917d6cb7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7cd0e8267d82b65679954a7b935e138a06a6634311569a48268ba5917d6cb7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:52:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:00Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4b4f2cd55ec546c6f44490159ec5b303d67b584fce7afb4b6fe3dd93e95d843\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4b4f2cd55ec546c6f44490159ec5b303d67b584fce7afb4b6fe3dd93e95d843\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:52:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cebcf
235b45b59a23c89f44855f381557a49fc94ec9332c699274507cda4cd12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cebcf235b45b59a23c89f44855f381557a49fc94ec9332c699274507cda4cd12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:52:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b01604ec0336bae6aa21d9b124596bb46f6351b67c413f2fade87febd0887a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b01604ec0336bae6aa21d9b124596bb46f6351b67c413f2fade87febd0887a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:52:03Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3c386dab63ce40e10fc9e5617d1e0c1e87eee2a1148cc64d5faa74542014155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3c386dab63ce40e10fc9e5617d1e0c1e87eee2a1148cc64d5faa74542014155\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:52:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-z6pc7\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:56Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:56 crc kubenswrapper[4919]: I0310 21:52:56.376032 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zv56q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c57707d-1414-4a4a-ac8a-0fadb2fbe5f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4d38859cd120bfb7307a52fd56c1b53490e57164b68da1811475e1046de690a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bm5jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac5ed6a131689d58c1d7867655d836f574591f3bc397d05858cbcfb9748c5107\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bm5jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:52:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zv56q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-03-10T21:52:56Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:56 crc kubenswrapper[4919]: I0310 21:52:56.394211 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"41f60ccf-532c-42dd-85d3-5cf02206caeb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff32e19d9357f72af1234677cdf04c43d15fcbb5af4faeae6db0aa9fca7e8ef5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41e31e0bed
022ff2ce3f4869359dfe2d6a25c0039704f26ef5c9b4be0da5b9fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eed85c384d715caea3fe992e40d62b467c4893c865d792f798254701b15735fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29a0037afef2a95ea444616a0317a23a27f5e093c2083b8e13c6ebcec7cb26f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29a0037afef2a95ea444616a0317a23a27f5e093c2083b8e13c6ebcec7cb26f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:14Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:50:13Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:56Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:56 crc kubenswrapper[4919]: I0310 21:52:56.416637 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac8c9a9627b63f7b2a9c80571ca8f781eec442b1fd148631fa417b2e11943437\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18a390bbc216535df32a4dab5fb983494134d2e9f87a689ea39d3e32592ec663\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:56Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:56 crc kubenswrapper[4919]: I0310 21:52:56.435441 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"566678d1-f416-4116-ab20-b30dceb86cdc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://603ee76064368a216672f45eb860628d301968c311e0bc75b9a73c01f351c9c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrhvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b645dc541f9bef5d9710345252c2ff48e91412f
10d1c0c1bfaa06cf9e82210f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrhvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-z7v4t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:56Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:56 crc kubenswrapper[4919]: I0310 21:52:56.455427 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hbw8v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a5db7c3-2a96-4030-8c88-5d82d325b62d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ea0659cf18bee888c2408100c1de192eb8da3991c3158d708c3083d31a61bdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abe6d0aa7236ecb1ecf10432a82e6fd0b3103606dbf07a21f54a1908c77ef697\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T21:52:45Z\\\",\\\"message\\\":\\\"2026-03-10T21:52:00+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_93ad3fa3-cc05-440d-a4ee-07253ec4f77d\\\\n2026-03-10T21:52:00+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_93ad3fa3-cc05-440d-a4ee-07253ec4f77d to /host/opt/cni/bin/\\\\n2026-03-10T21:52:00Z [verbose] multus-daemon started\\\\n2026-03-10T21:52:00Z [verbose] 
Readiness Indicator file check\\\\n2026-03-10T21:52:45Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dwtj4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hbw8v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:56Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:56 crc kubenswrapper[4919]: I0310 21:52:56.471838 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hzq7c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e92ce303-b70d-4416-b8f1-520b49dca2e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cc1a7ce601001a487303c
fae1cef980407a59cc27d02d4f3c4a303b7668639f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qw7c6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hzq7c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:56Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:56 crc kubenswrapper[4919]: I0310 21:52:56.479331 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 21:52:56 crc kubenswrapper[4919]: I0310 21:52:56.479371 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 21:52:56 crc kubenswrapper[4919]: I0310 21:52:56.479423 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 21:52:56 crc kubenswrapper[4919]: E0310 21:52:56.480533 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 21:52:56 crc kubenswrapper[4919]: E0310 21:52:56.480692 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 21:52:56 crc kubenswrapper[4919]: E0310 21:52:56.480908 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 21:52:56 crc kubenswrapper[4919]: I0310 21:52:56.505124 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4dp67" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2e7c6fb-9e33-441d-9197-719929eb9e21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad5abb4f4c3ead1b05c1f3efbb6bb59fc71bcf5796e16a558ef7e16375571fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a8f975d3caae031d32429c7fdf67356ff18bae1f0eefa3b99758d6c461a736\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4dd430eed1c68d77dcea63fe5885eec7aae3fee6a0008bd9e145b93aef04aa26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://648a881496440d2732bd49969e863fcde223e95156c280e7699782a5521f1580\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06e9ce48fc10caa34e8b774ce3039117386ffe276cf268d197ac46518a4baf91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9fa1c50eff4c6bb37ea60631a4278c6a15d6398d6fad6fb1a914bd6cccf15b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8424a944ffb95aa4e069024df52cf69f2381dc0498735572e7cf94519fe6d880\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48b981072e2d3ee5b692dec159c8bcd8cdaba247ff36084b942813175ca23afd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T21:52:27Z\\\",\\\"message\\\":\\\"g-operator-metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"78f6184b-c7cf-436d-8cbb-4b31f8af75e8\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-operator-lifecycle-manager/catalog-operator-metrics\\\\\\\"}, 
Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-operator-lifecycle-manager/catalog-operator-metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-operator-lifecycle-manager/catalog-operator-metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.204\\\\\\\", Port:8443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF0310 21:52:27.442701 7191 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:26Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8424a944ffb95aa4e069024df52cf69f2381dc0498735572e7cf94519fe6d880\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T21:52:55Z\\\",\\\"message\\\":\\\"e:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0310 21:52:55.377474 7514 obj_retry.go:551] Creating *factory.egressNode crc took: 8.606133ms\\\\nI0310 
21:52:55.377502 7514 factory.go:1336] Added *v1.Node event handler 7\\\\nI0310 21:52:55.377532 7514 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0310 21:52:55.377571 7514 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0310 21:52:55.377595 7514 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0310 21:52:55.377635 7514 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0310 21:52:55.377687 7514 factory.go:656] Stopping watch factory\\\\nI0310 21:52:55.377717 7514 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0310 21:52:55.377738 7514 handler.go:208] Removed *v1.Node event handler 2\\\\nI0310 21:52:55.377751 7514 handler.go:208] Removed *v1.Node event handler 7\\\\nI0310 21:52:55.377766 7514 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0310 21:52:55.377839 7514 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0310 21:52:55.377871 7514 ovnkube.go:599] Stopped ovnkube\\\\nI0310 21:52:55.377892 7514 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0310 21:52:55.377995 7514 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\
\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81583105305b81fde3bde968fbbd5463ad1100d8f444b22f62764c4676b95325\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ca4b1de4425e9aa53f9cf24a56759708297b223771278a445dcf171f4dd6493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ca4b1de4425e9aa53f9cf24a56759708297b223771278a445dcf171
f4dd6493\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4dp67\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:56Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:56 crc kubenswrapper[4919]: I0310 21:52:56.522031 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ckwhl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a95e8b73-ffed-4248-b8ba-99fc7c5b900f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnqnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnqnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:52:11Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ckwhl\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:56Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:56 crc kubenswrapper[4919]: I0310 21:52:56.537479 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed0693f5-4dbc-4621-9cf6-450d64aaea59\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f338ebc5fc228d07415015c51f7ed4fcc24d5bf76a644e491b5c4b9dc51b71f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"
mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5205e89649c7c1e920c967873bb1a404548b539605a180b02a1c249ff499e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5205e89649c7c1e920c967873bb1a404548b539605a180b02a1c249ff499e32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:50:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:56Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:56 crc kubenswrapper[4919]: I0310 21:52:56.570822 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5163ae00-7b50-497d-9770-0d787026b436\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://322af039c736bcf0b853ee5527ebb1b1750484dfab074745abcd75c24fdcccbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bfa2696bd9b6e5d247686e5297b6ae2f49e5b216174391f211cb2a3a4966135\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c23aaef6f076ab2a428323d38fac48e0c55ad52c55a46c942bccad06474fd31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a88714c27822fd18dff500c973b9d548414d59c7666de938e3cb0c6b18e277e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67063544d268ca488af7ae401113e6f35bb48688e50f944cfa03360de376611a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc6d32c621a9826f8539d4c1806a83f694eaa8874384bd615c2fa1cee02d2e98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc6d32c621a9826f8539d4c1806a83f694eaa8874384bd615c2fa1cee02d2e98\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T21:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8aa3a4aef61b7421411f28beecceab476e43c52eb51ac2a257399b01e4c2bcb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8aa3a4aef61b7421411f28beecceab476e43c52eb51ac2a257399b01e4c2bcb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2126875e4bacb36f72de8a3a8bd7c799b89e7691a83fd39d7645a97e95db91b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2126875e4bacb36f72de8a3a8bd7c799b89e7691a83fd39d7645a97e95db91b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:50:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:56Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:56 crc kubenswrapper[4919]: I0310 21:52:56.593702 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9ed1501-15da-4419-aa12-171e610438d6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9e6a8efa1e2d16b45fe6362b326e3f89333864dc74f3b298d2e500a90d303b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37d8507fd02b92972ed41aa2c4d53fceb1c9d58864e46ddc7991f94fb4d9b3e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfb03c5f450790952fc7173bc2a6d723c777921f5f74963bfdbc3573ec1d21cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce192a4f3e94d00998fbfe0948a32765574a9261d22004480dfb54b9bbf9407a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b2adcae0bf01d646b97b915a28921ad4151ca62e8bdd174b42b5e3dff4b27db\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T21:51:14Z\\\",\\\"message\\\":\\\"file observer\\\\nW0310 21:51:14.180982 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 21:51:14.181120 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 21:51:14.182146 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-915015507/tls.crt::/tmp/serving-cert-915015507/tls.key\\\\\\\"\\\\nI0310 21:51:14.490188 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 21:51:14.496972 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 21:51:14.497009 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 21:51:14.497047 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 21:51:14.497058 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 21:51:14.505682 1 
secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 21:51:14.505718 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 21:51:14.505729 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 21:51:14.505740 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0310 21:51:14.505737 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 21:51:14.505748 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 21:51:14.505777 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 21:51:14.505783 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 21:51:14.508219 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T21:51:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47a772db349df6c0c6fe27be93d19e02d66cfaf9739ee12e89730ece1da11473\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15b81f8be7635a10cf516cd17c9ab3d48b74d3421098124fcc47dbab6691e3c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15b81f8be7635a10cf516cd17c9ab3d48b74d3421098124fcc47dbab6691e3c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:50:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:56Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:56 crc kubenswrapper[4919]: I0310 21:52:56.613300 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:56Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:56 crc kubenswrapper[4919]: I0310 21:52:56.632941 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f95c272d21026474ba17da6abc519f0cc1874dbdded3e089a107b23cdd20fa6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T21:52:56Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:57 crc kubenswrapper[4919]: I0310 21:52:57.237011 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4dp67_a2e7c6fb-9e33-441d-9197-719929eb9e21/ovnkube-controller/3.log" Mar 10 21:52:57 crc kubenswrapper[4919]: I0310 21:52:57.241601 4919 scope.go:117] "RemoveContainer" containerID="8424a944ffb95aa4e069024df52cf69f2381dc0498735572e7cf94519fe6d880" Mar 10 21:52:57 crc kubenswrapper[4919]: E0310 21:52:57.241935 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-4dp67_openshift-ovn-kubernetes(a2e7c6fb-9e33-441d-9197-719929eb9e21)\"" pod="openshift-ovn-kubernetes/ovnkube-node-4dp67" podUID="a2e7c6fb-9e33-441d-9197-719929eb9e21" Mar 10 21:52:57 crc kubenswrapper[4919]: I0310 21:52:57.262672 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac8c9a9627b63f7b2a9c80571ca8f781eec442b1fd148631fa417b2e11943437\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18a390bbc216535df32a4dab5fb983494134d2e9f87a689ea39d3e32592ec663\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:57Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:57 crc kubenswrapper[4919]: I0310 21:52:57.281350 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"566678d1-f416-4116-ab20-b30dceb86cdc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://603ee76064368a216672f45eb860628d301968c311e0bc75b9a73c01f351c9c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrhvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b645dc541f9bef5d9710345252c2ff48e91412f
10d1c0c1bfaa06cf9e82210f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrhvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-z7v4t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:57Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:57 crc kubenswrapper[4919]: I0310 21:52:57.310051 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hbw8v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a5db7c3-2a96-4030-8c88-5d82d325b62d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ea0659cf18bee888c2408100c1de192eb8da3991c3158d708c3083d31a61bdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abe6d0aa7236ecb1ecf10432a82e6fd0b3103606dbf07a21f54a1908c77ef697\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T21:52:45Z\\\",\\\"message\\\":\\\"2026-03-10T21:52:00+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_93ad3fa3-cc05-440d-a4ee-07253ec4f77d\\\\n2026-03-10T21:52:00+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_93ad3fa3-cc05-440d-a4ee-07253ec4f77d to /host/opt/cni/bin/\\\\n2026-03-10T21:52:00Z [verbose] multus-daemon started\\\\n2026-03-10T21:52:00Z [verbose] 
Readiness Indicator file check\\\\n2026-03-10T21:52:45Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dwtj4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hbw8v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:57Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:57 crc kubenswrapper[4919]: I0310 21:52:57.329286 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"41f60ccf-532c-42dd-85d3-5cf02206caeb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff32e19d9357f72af1234677cdf04c43d15fcbb5af4faeae6db0aa9fca7e8ef5\\\",\\\"image\\\
":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41e31e0bed022ff2ce3f4869359dfe2d6a25c0039704f26ef5c9b4be0da5b9fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eed85c384d715caea3fe992e40d62b467c4893c865d792f798254701b15735fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29a0037afef2a95ea444616a0317a23a27f5e093c2083b8e13c6ebcec7cb26f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29a0037afef2a95ea444616a0317a23a27f5e093c2083b8e13c6ebcec7cb26f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:14Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:50:13Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:57Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:57 crc kubenswrapper[4919]: I0310 21:52:57.360303 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5163ae00-7b50-497d-9770-0d787026b436\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://322af039c736bcf0b853ee5527ebb1b1750484dfab074745abcd75c24fdcccbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bfa2696bd9b6e5d247686e5297b6ae2f49e5b216174391f211cb2a3a4966135\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c23aaef6f076ab2a428323d38fac48e0c55ad52c55a46c942bccad06474fd31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a88714c27822fd18dff500c973b9d548414d59c7666de938e3cb0c6b18e277e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67063544d268ca488af7ae401113e6f35bb48688e50f944cfa03360de376611a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc6d32c621a9826f8539d4c1806a83f694eaa8874384bd615c2fa1cee02d2e98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc6d32c621a9826f8539d4c1806a83f694eaa8874384bd615c2fa1cee02d2e98\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T21:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8aa3a4aef61b7421411f28beecceab476e43c52eb51ac2a257399b01e4c2bcb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8aa3a4aef61b7421411f28beecceab476e43c52eb51ac2a257399b01e4c2bcb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2126875e4bacb36f72de8a3a8bd7c799b89e7691a83fd39d7645a97e95db91b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2126875e4bacb36f72de8a3a8bd7c799b89e7691a83fd39d7645a97e95db91b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:50:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:57Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:57 crc kubenswrapper[4919]: I0310 21:52:57.382132 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9ed1501-15da-4419-aa12-171e610438d6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9e6a8efa1e2d16b45fe6362b326e3f89333864dc74f3b298d2e500a90d303b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37d8507fd02b92972ed41aa2c4d53fceb1c9d58864e46ddc7991f94fb4d9b3e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfb03c5f450790952fc7173bc2a6d723c777921f5f74963bfdbc3573ec1d21cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce192a4f3e94d00998fbfe0948a32765574a9261d22004480dfb54b9bbf9407a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b2adcae0bf01d646b97b915a28921ad4151ca62e8bdd174b42b5e3dff4b27db\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T21:51:14Z\\\",\\\"message\\\":\\\"file observer\\\\nW0310 21:51:14.180982 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 21:51:14.181120 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 21:51:14.182146 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-915015507/tls.crt::/tmp/serving-cert-915015507/tls.key\\\\\\\"\\\\nI0310 21:51:14.490188 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 21:51:14.496972 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 21:51:14.497009 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 21:51:14.497047 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 21:51:14.497058 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 21:51:14.505682 1 
secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 21:51:14.505718 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 21:51:14.505729 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 21:51:14.505740 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0310 21:51:14.505737 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 21:51:14.505748 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 21:51:14.505777 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 21:51:14.505783 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 21:51:14.508219 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T21:51:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47a772db349df6c0c6fe27be93d19e02d66cfaf9739ee12e89730ece1da11473\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15b81f8be7635a10cf516cd17c9ab3d48b74d3421098124fcc47dbab6691e3c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15b81f8be7635a10cf516cd17c9ab3d48b74d3421098124fcc47dbab6691e3c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:50:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:57Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:57 crc kubenswrapper[4919]: I0310 21:52:57.401895 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:57Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:57 crc kubenswrapper[4919]: I0310 21:52:57.419623 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f95c272d21026474ba17da6abc519f0cc1874dbdded3e089a107b23cdd20fa6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T21:52:57Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:57 crc kubenswrapper[4919]: I0310 21:52:57.434822 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hzq7c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e92ce303-b70d-4416-b8f1-520b49dca2e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cc1a7ce601001a487303cfae1cef980407a59cc27d02d4f3c4a303b7668639f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-qw7c6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hzq7c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:57Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:57 crc kubenswrapper[4919]: I0310 21:52:57.464965 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4dp67" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2e7c6fb-9e33-441d-9197-719929eb9e21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad5abb4f4c3ead1b05c1f3efbb6bb59fc71bcf5796e16a558ef7e16375571fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a8f975d3caae031d32429c7fdf67356ff18bae1f0eefa3b99758d6c461a736\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4dd430eed1c68d77dcea63fe5885eec7aae3fee6a0008bd9e145b93aef04aa26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://648a881496440d2732bd49969e863fcde223e95156c280e7699782a5521f1580\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06e9ce48fc10caa34e8b774ce3039117386ffe276cf268d197ac46518a4baf91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9fa1c50eff4c6bb37ea60631a4278c6a15d6398d6fad6fb1a914bd6cccf15b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8424a944ffb95aa4e069024df52cf69f2381dc0498735572e7cf94519fe6d880\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8424a944ffb95aa4e069024df52cf69f2381dc0498735572e7cf94519fe6d880\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T21:52:55Z\\\",\\\"message\\\":\\\"e:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0310 21:52:55.377474 7514 obj_retry.go:551] Creating *factory.egressNode crc took: 8.606133ms\\\\nI0310 21:52:55.377502 7514 factory.go:1336] Added *v1.Node event 
handler 7\\\\nI0310 21:52:55.377532 7514 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0310 21:52:55.377571 7514 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0310 21:52:55.377595 7514 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0310 21:52:55.377635 7514 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0310 21:52:55.377687 7514 factory.go:656] Stopping watch factory\\\\nI0310 21:52:55.377717 7514 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0310 21:52:55.377738 7514 handler.go:208] Removed *v1.Node event handler 2\\\\nI0310 21:52:55.377751 7514 handler.go:208] Removed *v1.Node event handler 7\\\\nI0310 21:52:55.377766 7514 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0310 21:52:55.377839 7514 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0310 21:52:55.377871 7514 ovnkube.go:599] Stopped ovnkube\\\\nI0310 21:52:55.377892 7514 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0310 21:52:55.377995 7514 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:54Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4dp67_openshift-ovn-kubernetes(a2e7c6fb-9e33-441d-9197-719929eb9e21)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81583105305b81fde3bde968fbbd5463ad1100d8f444b22f62764c4676b95325\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ca4b1de4425e9aa53f9cf24a56759708297b223771278a445dcf171f4dd6493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ca4b1de4425e9aa53
f9cf24a56759708297b223771278a445dcf171f4dd6493\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4dp67\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:57Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:57 crc kubenswrapper[4919]: I0310 21:52:57.480144 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ckwhl" Mar 10 21:52:57 crc kubenswrapper[4919]: E0310 21:52:57.480346 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ckwhl" podUID="a95e8b73-ffed-4248-b8ba-99fc7c5b900f" Mar 10 21:52:57 crc kubenswrapper[4919]: I0310 21:52:57.486881 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ckwhl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a95e8b73-ffed-4248-b8ba-99fc7c5b900f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnqnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnqnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:52:11Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ckwhl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:57Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:57 crc 
kubenswrapper[4919]: I0310 21:52:57.503682 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed0693f5-4dbc-4621-9cf6-450d64aaea59\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f338ebc5fc228d07415015c51f7ed4fcc24d5bf76a644e491b5c4b9dc51b71f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\
":\\\"cri-o://a5205e89649c7c1e920c967873bb1a404548b539605a180b02a1c249ff499e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5205e89649c7c1e920c967873bb1a404548b539605a180b02a1c249ff499e32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:50:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:57Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:57 crc kubenswrapper[4919]: I0310 21:52:57.521160 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b625p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b82448f1-4387-4d1a-a300-29f4b3d86bbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5edef5b10597e404ec5599983d07529ee344e7d7f5b1a4a7f589678613034b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s9q9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:52:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b625p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:57Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:57 crc kubenswrapper[4919]: I0310 21:52:57.541373 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://937344f02f0259b0d258de35d490545dad0ce084dd49c7584002da0734cc046e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2
026-03-10T21:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:57Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:57 crc kubenswrapper[4919]: I0310 21:52:57.561884 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:57Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:57 crc kubenswrapper[4919]: I0310 21:52:57.581885 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:57Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:57 crc kubenswrapper[4919]: I0310 21:52:57.604672 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-z6pc7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ace27ab-c4c7-412b-9ae8-a3e4ff15faec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://607c25e23101a157124cb81f984fac6d36e71a08b7d990e1d11627f3a7de24b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5636b870a2c5e13abd72ff1e7c785363d11540245c14af6f5de75a0b400a6aa\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5636b870a2c5e13abd72ff1e7c785363d11540245c14af6f5de75a0b400a6aa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7cd0e8267d82b65679954a7b935e138a06a6634311569a48268ba5917d6cb7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7cd0e8267d82b65679954a7b935e138a06a6634311569a48268ba5917d6cb7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:52:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:00Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4b4f2cd55ec546c6f44490159ec5b303d67b584fce7afb4b6fe3dd93e95d843\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4b4f2cd55ec546c6f44490159ec5b303d67b584fce7afb4b6fe3dd93e95d843\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:52:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cebcf
235b45b59a23c89f44855f381557a49fc94ec9332c699274507cda4cd12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cebcf235b45b59a23c89f44855f381557a49fc94ec9332c699274507cda4cd12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:52:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b01604ec0336bae6aa21d9b124596bb46f6351b67c413f2fade87febd0887a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b01604ec0336bae6aa21d9b124596bb46f6351b67c413f2fade87febd0887a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:52:03Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3c386dab63ce40e10fc9e5617d1e0c1e87eee2a1148cc64d5faa74542014155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3c386dab63ce40e10fc9e5617d1e0c1e87eee2a1148cc64d5faa74542014155\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:52:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-z6pc7\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:57Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:57 crc kubenswrapper[4919]: I0310 21:52:57.622710 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zv56q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c57707d-1414-4a4a-ac8a-0fadb2fbe5f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4d38859cd120bfb7307a52fd56c1b53490e57164b68da1811475e1046de690a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bm5jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac5ed6a131689d58c1d7867655d836f574591f3bc397d05858cbcfb9748c5107\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bm5jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:52:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zv56q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-03-10T21:52:57Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:57 crc kubenswrapper[4919]: I0310 21:52:57.644817 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d33de8e-9521-40e1-8dda-051e228ca068\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25d19a7d46abf131e552151e3bbb220e3fdf0a3bdb8ff8ca7b082dcc296408c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f30803ec8ed4cbd053df2777bfe3077a7637972562508205711b357011e453dc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T21:51:10Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec 
cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0310 21:50:40.583344 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0310 21:50:40.584309 1 observer_polling.go:159] Starting file observer\\\\nI0310 21:50:40.585315 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0310 21:50:40.586027 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0310 21:51:05.043465 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0310 21:51:10.132930 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0310 21:51:10.133005 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:40Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:51:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3630a7f175a3275eff39088c20eafd059b205f0ccb36cbba2f09b77468963cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://038d23b7c75ae61b55b3b70b5b70de0ca4f3243d0b0a68f8bd221aff91c2c032\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://22b6b89388d9f9288049474f1f88faad36bcbc05564e7769c9fca8c220847efd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:50:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:52:57Z is after 2025-08-24T17:21:41Z" Mar 10 21:52:58 crc kubenswrapper[4919]: I0310 21:52:58.479508 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 21:52:58 crc kubenswrapper[4919]: I0310 21:52:58.479644 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 21:52:58 crc kubenswrapper[4919]: E0310 21:52:58.479982 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 21:52:58 crc kubenswrapper[4919]: E0310 21:52:58.480122 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 21:52:58 crc kubenswrapper[4919]: I0310 21:52:58.479668 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 21:52:58 crc kubenswrapper[4919]: E0310 21:52:58.480314 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 21:52:58 crc kubenswrapper[4919]: E0310 21:52:58.591461 4919 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 10 21:52:59 crc kubenswrapper[4919]: I0310 21:52:59.479243 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ckwhl" Mar 10 21:52:59 crc kubenswrapper[4919]: E0310 21:52:59.479493 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ckwhl" podUID="a95e8b73-ffed-4248-b8ba-99fc7c5b900f" Mar 10 21:53:00 crc kubenswrapper[4919]: I0310 21:53:00.479298 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 21:53:00 crc kubenswrapper[4919]: I0310 21:53:00.479332 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 21:53:00 crc kubenswrapper[4919]: E0310 21:53:00.479458 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 21:53:00 crc kubenswrapper[4919]: I0310 21:53:00.479348 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 21:53:00 crc kubenswrapper[4919]: E0310 21:53:00.479576 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 21:53:00 crc kubenswrapper[4919]: E0310 21:53:00.479686 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 21:53:01 crc kubenswrapper[4919]: I0310 21:53:01.479559 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ckwhl" Mar 10 21:53:01 crc kubenswrapper[4919]: E0310 21:53:01.480710 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ckwhl" podUID="a95e8b73-ffed-4248-b8ba-99fc7c5b900f" Mar 10 21:53:02 crc kubenswrapper[4919]: I0310 21:53:02.479606 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 21:53:02 crc kubenswrapper[4919]: I0310 21:53:02.479635 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 21:53:02 crc kubenswrapper[4919]: I0310 21:53:02.479632 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 21:53:02 crc kubenswrapper[4919]: E0310 21:53:02.479780 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 21:53:02 crc kubenswrapper[4919]: E0310 21:53:02.479939 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 21:53:02 crc kubenswrapper[4919]: E0310 21:53:02.480080 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 21:53:03 crc kubenswrapper[4919]: I0310 21:53:03.479311 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ckwhl" Mar 10 21:53:03 crc kubenswrapper[4919]: E0310 21:53:03.479549 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ckwhl" podUID="a95e8b73-ffed-4248-b8ba-99fc7c5b900f" Mar 10 21:53:03 crc kubenswrapper[4919]: I0310 21:53:03.502323 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"41f60ccf-532c-42dd-85d3-5cf02206caeb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff32e19d9357f72af1234677cdf04c43d15fcbb5af4faeae6db0aa9fca7e8ef5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\
\\"containerID\\\":\\\"cri-o://41e31e0bed022ff2ce3f4869359dfe2d6a25c0039704f26ef5c9b4be0da5b9fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eed85c384d715caea3fe992e40d62b467c4893c865d792f798254701b15735fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29a0037afef2a95ea444616a0317a23a27f5e093c2083b8e13c6ebcec7cb26f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\
\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29a0037afef2a95ea444616a0317a23a27f5e093c2083b8e13c6ebcec7cb26f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:14Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:50:13Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:53:03Z is after 2025-08-24T17:21:41Z" Mar 10 21:53:03 crc kubenswrapper[4919]: I0310 21:53:03.523570 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac8c9a9627b63f7b2a9c80571ca8f781eec442b1fd148631fa417b2e11943437\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18a390bbc216535df32a4dab5fb983494134d2e9f87a689ea39d3e32592ec663\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:53:03Z is after 2025-08-24T17:21:41Z" Mar 10 21:53:03 crc kubenswrapper[4919]: I0310 21:53:03.539418 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"566678d1-f416-4116-ab20-b30dceb86cdc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://603ee76064368a216672f45eb860628d301968c311e0bc75b9a73c01f351c9c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrhvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b645dc541f9bef5d9710345252c2ff48e91412f
10d1c0c1bfaa06cf9e82210f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrhvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-z7v4t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:53:03Z is after 2025-08-24T17:21:41Z" Mar 10 21:53:03 crc kubenswrapper[4919]: I0310 21:53:03.551564 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hbw8v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a5db7c3-2a96-4030-8c88-5d82d325b62d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ea0659cf18bee888c2408100c1de192eb8da3991c3158d708c3083d31a61bdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abe6d0aa7236ecb1ecf10432a82e6fd0b3103606dbf07a21f54a1908c77ef697\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T21:52:45Z\\\",\\\"message\\\":\\\"2026-03-10T21:52:00+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_93ad3fa3-cc05-440d-a4ee-07253ec4f77d\\\\n2026-03-10T21:52:00+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_93ad3fa3-cc05-440d-a4ee-07253ec4f77d to /host/opt/cni/bin/\\\\n2026-03-10T21:52:00Z [verbose] multus-daemon started\\\\n2026-03-10T21:52:00Z [verbose] 
Readiness Indicator file check\\\\n2026-03-10T21:52:45Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dwtj4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hbw8v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:53:03Z is after 2025-08-24T17:21:41Z" Mar 10 21:53:03 crc kubenswrapper[4919]: I0310 21:53:03.561232 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed0693f5-4dbc-4621-9cf6-450d64aaea59\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f338ebc5fc228d07415015c51f7ed4fcc24d5bf76a644e491b5c4b9dc51b71f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5205e89649c7c1e920c967873bb1a404548b539605a180b02a1c249ff499e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5205e89649c7c1e920c967873bb1a404548b539605a180b02a1c249ff499e32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:50:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:53:03Z is after 2025-08-24T17:21:41Z" Mar 10 21:53:03 crc kubenswrapper[4919]: I0310 21:53:03.578767 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5163ae00-7b50-497d-9770-0d787026b436\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://322af039c736bcf0b853ee5527ebb1b1750484dfab074745abcd75c24fdcccbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bfa2696bd9b6e5d247686e5297b6ae2f49e5b216174391f211cb2a3a4966135\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c23aaef6f076ab2a428323d38fac48e0c55ad52c55a46c942bccad06474fd31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a88714c27822fd18dff500c973b9d548414d59c7666de938e3cb0c6b18e277e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67063544d268ca488af7ae401113e6f35bb48688e50f944cfa03360de376611a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc6d32c621a9826f8539d4c1806a83f694eaa8874384bd615c2fa1cee02d2e98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc6d32c621a9826f8539d4c1806a83f694eaa8874384bd615c2fa1cee02d2e98\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T21:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8aa3a4aef61b7421411f28beecceab476e43c52eb51ac2a257399b01e4c2bcb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8aa3a4aef61b7421411f28beecceab476e43c52eb51ac2a257399b01e4c2bcb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2126875e4bacb36f72de8a3a8bd7c799b89e7691a83fd39d7645a97e95db91b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2126875e4bacb36f72de8a3a8bd7c799b89e7691a83fd39d7645a97e95db91b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:50:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:53:03Z is after 2025-08-24T17:21:41Z" Mar 10 21:53:03 crc kubenswrapper[4919]: E0310 21:53:03.591835 4919 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 10 21:53:03 crc kubenswrapper[4919]: I0310 21:53:03.595001 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9ed1501-15da-4419-aa12-171e610438d6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9e6a8efa1e2d16b45fe6362b326e3f89333864dc74f3b298d2e500a90d303b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37d8507fd02b92972ed41aa2c4d53fceb1c9d58864e46ddc7991f94fb4d9b3e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfb03c5f450790952fc7173bc2a6d723c777921f5f74963bfdbc3573ec1d21cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce192a4f3e94d00998fbfe0948a32765574a9261d22004480dfb54b9bbf9407a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b2adcae0bf01d646b97b915a28921ad4151ca62e8bdd174b42b5e3dff4b27db\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T21:51:14Z\\\"
,\\\"message\\\":\\\"file observer\\\\nW0310 21:51:14.180982 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 21:51:14.181120 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 21:51:14.182146 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-915015507/tls.crt::/tmp/serving-cert-915015507/tls.key\\\\\\\"\\\\nI0310 21:51:14.490188 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 21:51:14.496972 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 21:51:14.497009 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 21:51:14.497047 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 21:51:14.497058 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 21:51:14.505682 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 21:51:14.505718 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 21:51:14.505729 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 21:51:14.505740 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0310 21:51:14.505737 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 21:51:14.505748 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 21:51:14.505777 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 21:51:14.505783 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 21:51:14.508219 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T21:51:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47a772db349df6c0c6fe27be93d19e02d66cfaf9739ee12e89730ece1da11473\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15b81f8be7635a10cf516cd17c9ab3d48b74d3421098124fcc47dbab6691e3c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15b81f8be7635a10cf516cd17c9ab3d48b7
4d3421098124fcc47dbab6691e3c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:50:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:53:03Z is after 2025-08-24T17:21:41Z" Mar 10 21:53:03 crc kubenswrapper[4919]: I0310 21:53:03.609265 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:53:03Z is after 2025-08-24T17:21:41Z" Mar 10 21:53:03 crc kubenswrapper[4919]: I0310 21:53:03.623213 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f95c272d21026474ba17da6abc519f0cc1874dbdded3e089a107b23cdd20fa6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T21:53:03Z is after 2025-08-24T17:21:41Z" Mar 10 21:53:03 crc kubenswrapper[4919]: I0310 21:53:03.633894 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hzq7c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e92ce303-b70d-4416-b8f1-520b49dca2e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cc1a7ce601001a487303cfae1cef980407a59cc27d02d4f3c4a303b7668639f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-qw7c6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hzq7c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:53:03Z is after 2025-08-24T17:21:41Z" Mar 10 21:53:03 crc kubenswrapper[4919]: I0310 21:53:03.651610 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4dp67" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2e7c6fb-9e33-441d-9197-719929eb9e21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad5abb4f4c3ead1b05c1f3efbb6bb59fc71bcf5796e16a558ef7e16375571fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a8f975d3caae031d32429c7fdf67356ff18bae1f0eefa3b99758d6c461a736\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4dd430eed1c68d77dcea63fe5885eec7aae3fee6a0008bd9e145b93aef04aa26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://648a881496440d2732bd49969e863fcde223e95156c280e7699782a5521f1580\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06e9ce48fc10caa34e8b774ce3039117386ffe276cf268d197ac46518a4baf91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9fa1c50eff4c6bb37ea60631a4278c6a15d6398d6fad6fb1a914bd6cccf15b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8424a944ffb95aa4e069024df52cf69f2381dc0498735572e7cf94519fe6d880\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8424a944ffb95aa4e069024df52cf69f2381dc0498735572e7cf94519fe6d880\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T21:52:55Z\\\",\\\"message\\\":\\\"e:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0310 21:52:55.377474 7514 obj_retry.go:551] Creating *factory.egressNode crc took: 8.606133ms\\\\nI0310 21:52:55.377502 7514 factory.go:1336] Added *v1.Node event 
handler 7\\\\nI0310 21:52:55.377532 7514 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0310 21:52:55.377571 7514 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0310 21:52:55.377595 7514 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0310 21:52:55.377635 7514 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0310 21:52:55.377687 7514 factory.go:656] Stopping watch factory\\\\nI0310 21:52:55.377717 7514 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0310 21:52:55.377738 7514 handler.go:208] Removed *v1.Node event handler 2\\\\nI0310 21:52:55.377751 7514 handler.go:208] Removed *v1.Node event handler 7\\\\nI0310 21:52:55.377766 7514 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0310 21:52:55.377839 7514 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0310 21:52:55.377871 7514 ovnkube.go:599] Stopped ovnkube\\\\nI0310 21:52:55.377892 7514 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0310 21:52:55.377995 7514 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:54Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4dp67_openshift-ovn-kubernetes(a2e7c6fb-9e33-441d-9197-719929eb9e21)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81583105305b81fde3bde968fbbd5463ad1100d8f444b22f62764c4676b95325\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ca4b1de4425e9aa53f9cf24a56759708297b223771278a445dcf171f4dd6493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ca4b1de4425e9aa53
f9cf24a56759708297b223771278a445dcf171f4dd6493\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4dp67\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:53:03Z is after 2025-08-24T17:21:41Z" Mar 10 21:53:03 crc kubenswrapper[4919]: I0310 21:53:03.661324 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ckwhl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a95e8b73-ffed-4248-b8ba-99fc7c5b900f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:11Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnqnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnqnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:52:11Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ckwhl\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:53:03Z is after 2025-08-24T17:21:41Z" Mar 10 21:53:03 crc kubenswrapper[4919]: I0310 21:53:03.673638 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://937344f02f0259b0d258de35d490545dad0ce084dd49c7584002da0734cc046e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/se
rving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:53:03Z is after 2025-08-24T17:21:41Z" Mar 10 21:53:03 crc kubenswrapper[4919]: I0310 21:53:03.685065 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b625p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b82448f1-4387-4d1a-a300-29f4b3d86bbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5edef5b10597e404ec5599983d07529ee344e7d7f5b1a4a7f589678613034b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f
1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s9q9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:52:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b625p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:53:03Z is after 2025-08-24T17:21:41Z" Mar 10 21:53:03 crc kubenswrapper[4919]: I0310 21:53:03.702331 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d33de8e-9521-40e1-8dda-051e228ca068\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25d19a7d46abf131e552151e3bbb220e3fdf0a3bdb8ff8ca7b082dcc296408c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f30803ec8ed4cbd053df2777bfe3077a7637972562508205711b357011e453dc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T21:51:10Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0310 21:50:40.583344 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0310 21:50:40.584309 1 observer_polling.go:159] Starting file observer\\\\nI0310 21:50:40.585315 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0310 21:50:40.586027 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0310 21:51:05.043465 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0310 21:51:10.132930 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0310 21:51:10.133005 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:40Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:51:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3630a7f175a3275eff39088c20eafd059b205f0ccb36cbba2f09b77468963cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://038d23b7c75ae61b55b3b70b5b70de0ca4f3243d0b0a68f8bd221aff91c2c032\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://22b6b89388d9f9288049474f1f88faad36bcbc05564e7769c9fca8c220847efd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:50:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:53:03Z is after 2025-08-24T17:21:41Z" Mar 10 21:53:03 crc kubenswrapper[4919]: I0310 21:53:03.712573 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:53:03 crc kubenswrapper[4919]: I0310 21:53:03.712611 4919 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:53:03 crc kubenswrapper[4919]: I0310 21:53:03.712622 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:53:03 crc kubenswrapper[4919]: I0310 21:53:03.712635 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:53:03 crc kubenswrapper[4919]: I0310 21:53:03.712647 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:53:03Z","lastTransitionTime":"2026-03-10T21:53:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 21:53:03 crc kubenswrapper[4919]: I0310 21:53:03.726526 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:53:03Z is after 2025-08-24T17:21:41Z" Mar 10 21:53:03 crc kubenswrapper[4919]: E0310 21:53:03.729797 4919 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:53:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:53:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:53:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:53:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:53:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:53:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:53:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:53:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c22d31cd-a51d-4524-bb69-0b454ae09e98\\\",\\\"systemUUID\\\":\\\"eb24d1fd-ecd7-423c-90f7-cacacceb5386\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:53:03Z is after 2025-08-24T17:21:41Z" Mar 10 21:53:03 crc kubenswrapper[4919]: I0310 21:53:03.734995 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:53:03 crc kubenswrapper[4919]: I0310 21:53:03.735037 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:53:03 crc kubenswrapper[4919]: I0310 21:53:03.735055 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:53:03 crc kubenswrapper[4919]: I0310 21:53:03.735082 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:53:03 crc kubenswrapper[4919]: I0310 21:53:03.735101 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:53:03Z","lastTransitionTime":"2026-03-10T21:53:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:53:03 crc kubenswrapper[4919]: I0310 21:53:03.744844 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:53:03Z is after 2025-08-24T17:21:41Z" Mar 10 21:53:03 crc kubenswrapper[4919]: E0310 21:53:03.752792 4919 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:53:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:53:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:53:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:53:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:53:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:53:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:53:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:53:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c22d31cd-a51d-4524-bb69-0b454ae09e98\\\",\\\"systemUUID\\\":\\\"eb24d1fd-ecd7-423c-90f7-cacacceb5386\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:53:03Z is after 2025-08-24T17:21:41Z" Mar 10 21:53:03 crc kubenswrapper[4919]: I0310 21:53:03.756843 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:53:03 crc kubenswrapper[4919]: I0310 21:53:03.756879 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:53:03 crc kubenswrapper[4919]: I0310 21:53:03.756893 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:53:03 crc kubenswrapper[4919]: I0310 21:53:03.756909 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:53:03 crc kubenswrapper[4919]: I0310 21:53:03.756920 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:53:03Z","lastTransitionTime":"2026-03-10T21:53:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:53:03 crc kubenswrapper[4919]: I0310 21:53:03.765668 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-z6pc7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ace27ab-c4c7-412b-9ae8-a3e4ff15faec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://607c25e23101a157124cb81f984fac6d36e71a08b7d990e1d11627f3a7de24b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5636b870a2c5e13abd72ff1e7c785363d11540245c14af6f5de75a0b400a6aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5636b870a2c5e13abd72ff1e7c785363d11540245c14af6f5de75a0b400a6aa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7cd0e8267d82b65679954a7b935e138a06a6634311569a48268ba5917d6cb7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://a7cd0e8267d82b65679954a7b935e138a06a6634311569a48268ba5917d6cb7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:52:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4b4f2cd55ec546c6f44490159ec5b303d67b584fce7afb4b6fe3dd93e95d843\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4b4f2cd55ec546c6f44490159ec5b303d67b584fce7afb4b6fe3dd93e95d843\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:52:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cebcf235b45b59a23c89f44855f381557a49fc94ec9332c699274507cda4cd12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cebcf235b45b59a23c89f44855f381557a49fc94ec9332c699274507cda4cd12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:52:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b01604ec0336bae6aa21d9b124596bb46f6351b67c413f2fade87febd0887a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b01604ec0336bae6aa21d9b124596bb46f6351b67c413f2fade87febd0887a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:52:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3c386dab63ce40e10fc9e5617d1e0c1e87eee2a1148cc64d5faa74542014155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3c386dab63ce40e10fc9e5617d1e0c1e87eee2a1148cc64d5faa74542014155\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:52:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5jdcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-z6pc7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:53:03Z is after 2025-08-24T17:21:41Z" Mar 10 21:53:03 crc kubenswrapper[4919]: E0310 21:53:03.771565 4919 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:53:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:53:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:53:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:53:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:53:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:53:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:53:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:53:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c22d31cd-a51d-4524-bb69-0b454ae09e98\\\",\\\"systemUUID\\\":\\\"eb24d1fd-ecd7-423c-90f7-cacacceb5386\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:53:03Z is after 2025-08-24T17:21:41Z" Mar 10 21:53:03 crc kubenswrapper[4919]: I0310 21:53:03.776871 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:53:03 crc kubenswrapper[4919]: I0310 21:53:03.776934 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:53:03 crc kubenswrapper[4919]: I0310 21:53:03.776972 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:53:03 crc kubenswrapper[4919]: I0310 21:53:03.776997 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:53:03 crc kubenswrapper[4919]: I0310 21:53:03.777017 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:53:03Z","lastTransitionTime":"2026-03-10T21:53:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:53:03 crc kubenswrapper[4919]: I0310 21:53:03.782202 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zv56q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c57707d-1414-4a4a-ac8a-0fadb2fbe5f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4d38859cd120bfb7307a52fd56c1b53490e57164b68da1811475e1046de690a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bm5jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac5ed6a131689d58c1d7867655d836f574591f3bc397d05858cbcfb9748c5107\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bm5jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:52:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zv56q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:53:03Z is after 2025-08-24T17:21:41Z" Mar 10 21:53:03 crc kubenswrapper[4919]: E0310 21:53:03.797901 4919 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:53:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:53:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:53:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:53:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:53:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:53:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:53:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:53:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c22d31cd-a51d-4524-bb69-0b454ae09e98\\\",\\\"systemUUID\\\":\\\"eb24d1fd-ecd7-423c-90f7-cacacceb5386\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:53:03Z is after 2025-08-24T17:21:41Z" Mar 10 21:53:03 crc kubenswrapper[4919]: I0310 21:53:03.802476 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:53:03 crc kubenswrapper[4919]: I0310 21:53:03.802527 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:53:03 crc kubenswrapper[4919]: I0310 21:53:03.802543 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:53:03 crc kubenswrapper[4919]: I0310 21:53:03.802564 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:53:03 crc kubenswrapper[4919]: I0310 21:53:03.802582 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:53:03Z","lastTransitionTime":"2026-03-10T21:53:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 21:53:03 crc kubenswrapper[4919]: E0310 21:53:03.823026 4919 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:53:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:53:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:53:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:53:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:53:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:53:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:53:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T21:53:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c22d31cd-a51d-4524-bb69-0b454ae09e98\\\",\\\"systemUUID\\\":\\\"eb24d1fd-ecd7-423c-90f7-cacacceb5386\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:53:03Z is after 2025-08-24T17:21:41Z" Mar 10 21:53:03 crc kubenswrapper[4919]: E0310 21:53:03.823136 4919 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 10 21:53:04 crc kubenswrapper[4919]: I0310 21:53:04.479539 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 21:53:04 crc kubenswrapper[4919]: I0310 21:53:04.479591 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 21:53:04 crc kubenswrapper[4919]: I0310 21:53:04.479647 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 21:53:04 crc kubenswrapper[4919]: E0310 21:53:04.479782 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 21:53:04 crc kubenswrapper[4919]: E0310 21:53:04.479873 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 21:53:04 crc kubenswrapper[4919]: E0310 21:53:04.480038 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 21:53:05 crc kubenswrapper[4919]: I0310 21:53:05.480761 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ckwhl" Mar 10 21:53:05 crc kubenswrapper[4919]: E0310 21:53:05.480966 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ckwhl" podUID="a95e8b73-ffed-4248-b8ba-99fc7c5b900f" Mar 10 21:53:06 crc kubenswrapper[4919]: I0310 21:53:06.478933 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 21:53:06 crc kubenswrapper[4919]: I0310 21:53:06.479001 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 21:53:06 crc kubenswrapper[4919]: I0310 21:53:06.479012 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 21:53:06 crc kubenswrapper[4919]: E0310 21:53:06.479109 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 21:53:06 crc kubenswrapper[4919]: E0310 21:53:06.479273 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 21:53:06 crc kubenswrapper[4919]: E0310 21:53:06.479444 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 21:53:07 crc kubenswrapper[4919]: I0310 21:53:07.479807 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-ckwhl" Mar 10 21:53:07 crc kubenswrapper[4919]: E0310 21:53:07.480030 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ckwhl" podUID="a95e8b73-ffed-4248-b8ba-99fc7c5b900f" Mar 10 21:53:08 crc kubenswrapper[4919]: I0310 21:53:08.479233 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 21:53:08 crc kubenswrapper[4919]: I0310 21:53:08.479347 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 21:53:08 crc kubenswrapper[4919]: E0310 21:53:08.479514 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 21:53:08 crc kubenswrapper[4919]: I0310 21:53:08.479266 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 21:53:08 crc kubenswrapper[4919]: E0310 21:53:08.479637 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 21:53:08 crc kubenswrapper[4919]: E0310 21:53:08.479889 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 21:53:08 crc kubenswrapper[4919]: E0310 21:53:08.593960 4919 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 10 21:53:09 crc kubenswrapper[4919]: I0310 21:53:09.479484 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ckwhl" Mar 10 21:53:09 crc kubenswrapper[4919]: E0310 21:53:09.479928 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ckwhl" podUID="a95e8b73-ffed-4248-b8ba-99fc7c5b900f" Mar 10 21:53:10 crc kubenswrapper[4919]: I0310 21:53:10.478966 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 21:53:10 crc kubenswrapper[4919]: I0310 21:53:10.478991 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 21:53:10 crc kubenswrapper[4919]: I0310 21:53:10.479129 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 21:53:10 crc kubenswrapper[4919]: E0310 21:53:10.479166 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 21:53:10 crc kubenswrapper[4919]: E0310 21:53:10.479265 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 21:53:10 crc kubenswrapper[4919]: E0310 21:53:10.479379 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 21:53:11 crc kubenswrapper[4919]: I0310 21:53:11.479489 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-ckwhl" Mar 10 21:53:11 crc kubenswrapper[4919]: E0310 21:53:11.479722 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ckwhl" podUID="a95e8b73-ffed-4248-b8ba-99fc7c5b900f" Mar 10 21:53:11 crc kubenswrapper[4919]: I0310 21:53:11.481085 4919 scope.go:117] "RemoveContainer" containerID="8424a944ffb95aa4e069024df52cf69f2381dc0498735572e7cf94519fe6d880" Mar 10 21:53:11 crc kubenswrapper[4919]: E0310 21:53:11.481370 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-4dp67_openshift-ovn-kubernetes(a2e7c6fb-9e33-441d-9197-719929eb9e21)\"" pod="openshift-ovn-kubernetes/ovnkube-node-4dp67" podUID="a2e7c6fb-9e33-441d-9197-719929eb9e21" Mar 10 21:53:12 crc kubenswrapper[4919]: I0310 21:53:12.478888 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 21:53:12 crc kubenswrapper[4919]: I0310 21:53:12.478959 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 21:53:12 crc kubenswrapper[4919]: E0310 21:53:12.479011 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 21:53:12 crc kubenswrapper[4919]: I0310 21:53:12.479093 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 21:53:12 crc kubenswrapper[4919]: E0310 21:53:12.479155 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 21:53:12 crc kubenswrapper[4919]: E0310 21:53:12.479285 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 21:53:13 crc kubenswrapper[4919]: I0310 21:53:13.479123 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ckwhl" Mar 10 21:53:13 crc kubenswrapper[4919]: E0310 21:53:13.479270 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ckwhl" podUID="a95e8b73-ffed-4248-b8ba-99fc7c5b900f" Mar 10 21:53:13 crc kubenswrapper[4919]: I0310 21:53:13.500941 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"41f60ccf-532c-42dd-85d3-5cf02206caeb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff32e19d9357f72af1234677cdf04c43d15fcbb5af4faeae6db0aa9fca7e8ef5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\
\\"containerID\\\":\\\"cri-o://41e31e0bed022ff2ce3f4869359dfe2d6a25c0039704f26ef5c9b4be0da5b9fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eed85c384d715caea3fe992e40d62b467c4893c865d792f798254701b15735fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29a0037afef2a95ea444616a0317a23a27f5e093c2083b8e13c6ebcec7cb26f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\
\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29a0037afef2a95ea444616a0317a23a27f5e093c2083b8e13c6ebcec7cb26f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:14Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:50:13Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:53:13Z is after 2025-08-24T17:21:41Z" Mar 10 21:53:13 crc kubenswrapper[4919]: I0310 21:53:13.516543 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac8c9a9627b63f7b2a9c80571ca8f781eec442b1fd148631fa417b2e11943437\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18a390bbc216535df32a4dab5fb983494134d2e9f87a689ea39d3e32592ec663\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:53:13Z is after 2025-08-24T17:21:41Z" Mar 10 21:53:13 crc kubenswrapper[4919]: I0310 21:53:13.529060 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"566678d1-f416-4116-ab20-b30dceb86cdc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://603ee76064368a216672f45eb860628d301968c311e0bc75b9a73c01f351c9c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrhvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b645dc541f9bef5d9710345252c2ff48e91412f
10d1c0c1bfaa06cf9e82210f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrhvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-z7v4t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:53:13Z is after 2025-08-24T17:21:41Z" Mar 10 21:53:13 crc kubenswrapper[4919]: I0310 21:53:13.544932 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hbw8v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a5db7c3-2a96-4030-8c88-5d82d325b62d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ea0659cf18bee888c2408100c1de192eb8da3991c3158d708c3083d31a61bdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abe6d0aa7236ecb1ecf10432a82e6fd0b3103606dbf07a21f54a1908c77ef697\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T21:52:45Z\\\",\\\"message\\\":\\\"2026-03-10T21:52:00+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_93ad3fa3-cc05-440d-a4ee-07253ec4f77d\\\\n2026-03-10T21:52:00+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_93ad3fa3-cc05-440d-a4ee-07253ec4f77d to /host/opt/cni/bin/\\\\n2026-03-10T21:52:00Z [verbose] multus-daemon started\\\\n2026-03-10T21:52:00Z [verbose] 
Readiness Indicator file check\\\\n2026-03-10T21:52:45Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dwtj4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:58Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hbw8v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:53:13Z is after 2025-08-24T17:21:41Z" Mar 10 21:53:13 crc kubenswrapper[4919]: I0310 21:53:13.555854 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ckwhl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a95e8b73-ffed-4248-b8ba-99fc7c5b900f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnqnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnqnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:52:11Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ckwhl\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:53:13Z is after 2025-08-24T17:21:41Z" Mar 10 21:53:13 crc kubenswrapper[4919]: I0310 21:53:13.567623 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed0693f5-4dbc-4621-9cf6-450d64aaea59\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f338ebc5fc228d07415015c51f7ed4fcc24d5bf76a644e491b5c4b9dc51b71f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPat
h\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5205e89649c7c1e920c967873bb1a404548b539605a180b02a1c249ff499e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5205e89649c7c1e920c967873bb1a404548b539605a180b02a1c249ff499e32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:50:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:53:13Z is after 2025-08-24T17:21:41Z" Mar 10 21:53:13 crc kubenswrapper[4919]: I0310 21:53:13.587854 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5163ae00-7b50-497d-9770-0d787026b436\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://322af039c736bcf0b853ee5527ebb1b1750484dfab074745abcd75c24fdcccbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bfa2696bd9b6e5d247686e5297b6ae2f49e5b216174391f211cb2a3a4966135\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c23aaef6f076ab2a428323d38fac48e0c55ad52c55a46c942bccad06474fd31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a88714c27822fd18dff500c973b9d548414d59c7666de938e3cb0c6b18e277e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67063544d268ca488af7ae401113e6f35bb48688e50f944cfa03360de376611a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc6d32c621a9826f8539d4c1806a83f694eaa8874384bd615c2fa1cee02d2e98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc6d32c621a9826f8539d4c1806a83f694eaa8874384bd615c2fa1cee02d2e98\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T21:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8aa3a4aef61b7421411f28beecceab476e43c52eb51ac2a257399b01e4c2bcb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8aa3a4aef61b7421411f28beecceab476e43c52eb51ac2a257399b01e4c2bcb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2126875e4bacb36f72de8a3a8bd7c799b89e7691a83fd39d7645a97e95db91b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2126875e4bacb36f72de8a3a8bd7c799b89e7691a83fd39d7645a97e95db91b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:50:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:53:13Z is after 2025-08-24T17:21:41Z" Mar 10 21:53:13 crc kubenswrapper[4919]: E0310 21:53:13.595108 4919 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 10 21:53:13 crc kubenswrapper[4919]: I0310 21:53:13.602570 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9ed1501-15da-4419-aa12-171e610438d6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:50:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9e6a8efa1e2d16b45fe6362b326e3f89333864dc74f3b298d2e500a90d303b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37d8507fd02b92972ed41aa2c4d53fceb1c9d58864e46ddc7991f94fb4d9b3e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfb03c5f450790952fc7173bc2a6d723c777921f5f74963bfdbc3573ec1d21cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce192a4f3e94d00998fbfe0948a32765574a9261d22004480dfb54b9bbf9407a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b2adcae0bf01d646b97b915a28921ad4151ca62e8bdd174b42b5e3dff4b27db\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T21:51:14Z\\\"
,\\\"message\\\":\\\"file observer\\\\nW0310 21:51:14.180982 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 21:51:14.181120 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 21:51:14.182146 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-915015507/tls.crt::/tmp/serving-cert-915015507/tls.key\\\\\\\"\\\\nI0310 21:51:14.490188 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 21:51:14.496972 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 21:51:14.497009 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 21:51:14.497047 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 21:51:14.497058 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 21:51:14.505682 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 21:51:14.505718 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 21:51:14.505729 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 21:51:14.505740 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0310 21:51:14.505737 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 21:51:14.505748 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 21:51:14.505777 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 21:51:14.505783 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 21:51:14.508219 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T21:51:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47a772db349df6c0c6fe27be93d19e02d66cfaf9739ee12e89730ece1da11473\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:50:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15b81f8be7635a10cf516cd17c9ab3d48b74d3421098124fcc47dbab6691e3c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15b81f8be7635a10cf516cd17c9ab3d48b7
4d3421098124fcc47dbab6691e3c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:50:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:50:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:50:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:53:13Z is after 2025-08-24T17:21:41Z" Mar 10 21:53:13 crc kubenswrapper[4919]: I0310 21:53:13.615688 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:53:13Z is after 2025-08-24T17:21:41Z" Mar 10 21:53:13 crc kubenswrapper[4919]: I0310 21:53:13.627288 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f95c272d21026474ba17da6abc519f0cc1874dbdded3e089a107b23cdd20fa6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T21:53:13Z is after 2025-08-24T17:21:41Z" Mar 10 21:53:13 crc kubenswrapper[4919]: I0310 21:53:13.638171 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hzq7c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e92ce303-b70d-4416-b8f1-520b49dca2e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cc1a7ce601001a487303cfae1cef980407a59cc27d02d4f3c4a303b7668639f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-qw7c6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hzq7c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:53:13Z is after 2025-08-24T17:21:41Z" Mar 10 21:53:13 crc kubenswrapper[4919]: I0310 21:53:13.661047 4919 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4dp67" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2e7c6fb-9e33-441d-9197-719929eb9e21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:52:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad5abb4f4c3ead1b05c1f3efbb6bb59fc71bcf5796e16a558ef7e16375571fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a8f975d3caae031d32429c7fdf67356ff18bae1f0eefa3b99758d6c461a736\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4dd430eed1c68d77dcea63fe5885eec7aae3fee6a0008bd9e145b93aef04aa26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://648a881496440d2732bd49969e863fcde223e95156c280e7699782a5521f1580\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06e9ce48fc10caa34e8b774ce3039117386ffe276cf268d197ac46518a4baf91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9fa1c50eff4c6bb37ea60631a4278c6a15d6398d6fad6fb1a914bd6cccf15b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8424a944ffb95aa4e069024df52cf69f2381dc0498735572e7cf94519fe6d880\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8424a944ffb95aa4e069024df52cf69f2381dc0498735572e7cf94519fe6d880\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T21:52:55Z\\\",\\\"message\\\":\\\"e:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0310 21:52:55.377474 7514 obj_retry.go:551] Creating *factory.egressNode crc took: 8.606133ms\\\\nI0310 21:52:55.377502 7514 factory.go:1336] Added *v1.Node event 
handler 7\\\\nI0310 21:52:55.377532 7514 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0310 21:52:55.377571 7514 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0310 21:52:55.377595 7514 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0310 21:52:55.377635 7514 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0310 21:52:55.377687 7514 factory.go:656] Stopping watch factory\\\\nI0310 21:52:55.377717 7514 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0310 21:52:55.377738 7514 handler.go:208] Removed *v1.Node event handler 2\\\\nI0310 21:52:55.377751 7514 handler.go:208] Removed *v1.Node event handler 7\\\\nI0310 21:52:55.377766 7514 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0310 21:52:55.377839 7514 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0310 21:52:55.377871 7514 ovnkube.go:599] Stopped ovnkube\\\\nI0310 21:52:55.377892 7514 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0310 21:52:55.377995 7514 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T21:52:54Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4dp67_openshift-ovn-kubernetes(a2e7c6fb-9e33-441d-9197-719929eb9e21)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81583105305b81fde3bde968fbbd5463ad1100d8f444b22f62764c4676b95325\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T21:52:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ca4b1de4425e9aa53f9cf24a56759708297b223771278a445dcf171f4dd6493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ca4b1de4425e9aa53
f9cf24a56759708297b223771278a445dcf171f4dd6493\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T21:51:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T21:51:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5rvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T21:51:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4dp67\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T21:53:13Z is after 2025-08-24T17:21:41Z" Mar 10 21:53:13 crc kubenswrapper[4919]: I0310 21:53:13.741421 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=45.741384108 podStartE2EDuration="45.741384108s" podCreationTimestamp="2026-03-10 21:52:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 21:53:13.741333037 +0000 UTC m=+180.983213655" watchObservedRunningTime="2026-03-10 21:53:13.741384108 +0000 UTC m=+180.983264716" Mar 10 21:53:13 crc kubenswrapper[4919]: I0310 21:53:13.741692 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-b625p" podStartSLOduration=118.741684616 podStartE2EDuration="1m58.741684616s" podCreationTimestamp="2026-03-10 21:51:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 21:53:13.726822203 +0000 UTC m=+180.968702811" watchObservedRunningTime="2026-03-10 21:53:13.741684616 +0000 UTC m=+180.983565224" Mar 10 21:53:13 crc kubenswrapper[4919]: I0310 21:53:13.782353 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-z6pc7" podStartSLOduration=118.782334876 podStartE2EDuration="1m58.782334876s" podCreationTimestamp="2026-03-10 21:51:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 21:53:13.781949476 +0000 UTC m=+181.023830104" watchObservedRunningTime="2026-03-10 21:53:13.782334876 +0000 UTC m=+181.024215484" Mar 10 21:53:13 crc kubenswrapper[4919]: I0310 21:53:13.794532 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zv56q" podStartSLOduration=118.794516856 podStartE2EDuration="1m58.794516856s" podCreationTimestamp="2026-03-10 21:51:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 21:53:13.794464404 +0000 UTC m=+181.036345032" watchObservedRunningTime="2026-03-10 21:53:13.794516856 +0000 UTC m=+181.036397454" Mar 10 21:53:14 crc kubenswrapper[4919]: I0310 21:53:14.009471 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 21:53:14 crc kubenswrapper[4919]: I0310 21:53:14.009547 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 21:53:14 crc kubenswrapper[4919]: I0310 21:53:14.009565 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 21:53:14 crc kubenswrapper[4919]: I0310 21:53:14.009591 4919 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 21:53:14 crc kubenswrapper[4919]: I0310 21:53:14.009610 4919 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T21:53:14Z","lastTransitionTime":"2026-03-10T21:53:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 21:53:14 crc kubenswrapper[4919]: I0310 21:53:14.067613 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-mdszb"] Mar 10 21:53:14 crc kubenswrapper[4919]: I0310 21:53:14.068144 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-mdszb" Mar 10 21:53:14 crc kubenswrapper[4919]: I0310 21:53:14.069817 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 10 21:53:14 crc kubenswrapper[4919]: I0310 21:53:14.069951 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 10 21:53:14 crc kubenswrapper[4919]: I0310 21:53:14.071673 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 10 21:53:14 crc kubenswrapper[4919]: I0310 21:53:14.071828 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 10 21:53:14 crc kubenswrapper[4919]: I0310 21:53:14.127734 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=60.127717692 
podStartE2EDuration="1m0.127717692s" podCreationTimestamp="2026-03-10 21:52:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 21:53:14.110999509 +0000 UTC m=+181.352880157" watchObservedRunningTime="2026-03-10 21:53:14.127717692 +0000 UTC m=+181.369598300" Mar 10 21:53:14 crc kubenswrapper[4919]: I0310 21:53:14.143122 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podStartSLOduration=119.143104808 podStartE2EDuration="1m59.143104808s" podCreationTimestamp="2026-03-10 21:51:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 21:53:14.143043417 +0000 UTC m=+181.384924025" watchObservedRunningTime="2026-03-10 21:53:14.143104808 +0000 UTC m=+181.384985416" Mar 10 21:53:14 crc kubenswrapper[4919]: I0310 21:53:14.168193 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f7ddfab0-25ee-490a-b992-f61de946a502-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-mdszb\" (UID: \"f7ddfab0-25ee-490a-b992-f61de946a502\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-mdszb" Mar 10 21:53:14 crc kubenswrapper[4919]: I0310 21:53:14.168231 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/f7ddfab0-25ee-490a-b992-f61de946a502-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-mdszb\" (UID: \"f7ddfab0-25ee-490a-b992-f61de946a502\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-mdszb" Mar 10 21:53:14 crc kubenswrapper[4919]: I0310 21:53:14.168251 4919 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f7ddfab0-25ee-490a-b992-f61de946a502-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-mdszb\" (UID: \"f7ddfab0-25ee-490a-b992-f61de946a502\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-mdszb" Mar 10 21:53:14 crc kubenswrapper[4919]: I0310 21:53:14.168381 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f7ddfab0-25ee-490a-b992-f61de946a502-service-ca\") pod \"cluster-version-operator-5c965bbfc6-mdszb\" (UID: \"f7ddfab0-25ee-490a-b992-f61de946a502\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-mdszb" Mar 10 21:53:14 crc kubenswrapper[4919]: I0310 21:53:14.168445 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/f7ddfab0-25ee-490a-b992-f61de946a502-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-mdszb\" (UID: \"f7ddfab0-25ee-490a-b992-f61de946a502\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-mdszb" Mar 10 21:53:14 crc kubenswrapper[4919]: I0310 21:53:14.175765 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-hbw8v" podStartSLOduration=119.175748761 podStartE2EDuration="1m59.175748761s" podCreationTimestamp="2026-03-10 21:51:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 21:53:14.1605506 +0000 UTC m=+181.402431208" watchObservedRunningTime="2026-03-10 21:53:14.175748761 +0000 UTC m=+181.417629369" Mar 10 21:53:14 crc kubenswrapper[4919]: I0310 21:53:14.187176 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=89.187162 podStartE2EDuration="1m29.187162s" podCreationTimestamp="2026-03-10 21:51:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 21:53:14.187101778 +0000 UTC m=+181.428982426" watchObservedRunningTime="2026-03-10 21:53:14.187162 +0000 UTC m=+181.429042608" Mar 10 21:53:14 crc kubenswrapper[4919]: I0310 21:53:14.214371 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=84.214348636 podStartE2EDuration="1m24.214348636s" podCreationTimestamp="2026-03-10 21:51:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 21:53:14.21302227 +0000 UTC m=+181.454902878" watchObservedRunningTime="2026-03-10 21:53:14.214348636 +0000 UTC m=+181.456229264" Mar 10 21:53:14 crc kubenswrapper[4919]: I0310 21:53:14.243837 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=83.243819494 podStartE2EDuration="1m23.243819494s" podCreationTimestamp="2026-03-10 21:51:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 21:53:14.231943352 +0000 UTC m=+181.473823960" watchObservedRunningTime="2026-03-10 21:53:14.243819494 +0000 UTC m=+181.485700102" Mar 10 21:53:14 crc kubenswrapper[4919]: I0310 21:53:14.267815 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-hzq7c" podStartSLOduration=119.267795592 podStartE2EDuration="1m59.267795592s" podCreationTimestamp="2026-03-10 21:51:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-10 21:53:14.267091923 +0000 UTC m=+181.508972551" watchObservedRunningTime="2026-03-10 21:53:14.267795592 +0000 UTC m=+181.509676210" Mar 10 21:53:14 crc kubenswrapper[4919]: I0310 21:53:14.269303 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f7ddfab0-25ee-490a-b992-f61de946a502-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-mdszb\" (UID: \"f7ddfab0-25ee-490a-b992-f61de946a502\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-mdszb" Mar 10 21:53:14 crc kubenswrapper[4919]: I0310 21:53:14.269346 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/f7ddfab0-25ee-490a-b992-f61de946a502-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-mdszb\" (UID: \"f7ddfab0-25ee-490a-b992-f61de946a502\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-mdszb" Mar 10 21:53:14 crc kubenswrapper[4919]: I0310 21:53:14.269365 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f7ddfab0-25ee-490a-b992-f61de946a502-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-mdszb\" (UID: \"f7ddfab0-25ee-490a-b992-f61de946a502\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-mdszb" Mar 10 21:53:14 crc kubenswrapper[4919]: I0310 21:53:14.269435 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f7ddfab0-25ee-490a-b992-f61de946a502-service-ca\") pod \"cluster-version-operator-5c965bbfc6-mdszb\" (UID: \"f7ddfab0-25ee-490a-b992-f61de946a502\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-mdszb" Mar 10 21:53:14 crc kubenswrapper[4919]: I0310 21:53:14.269460 4919 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/f7ddfab0-25ee-490a-b992-f61de946a502-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-mdszb\" (UID: \"f7ddfab0-25ee-490a-b992-f61de946a502\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-mdszb" Mar 10 21:53:14 crc kubenswrapper[4919]: I0310 21:53:14.269468 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/f7ddfab0-25ee-490a-b992-f61de946a502-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-mdszb\" (UID: \"f7ddfab0-25ee-490a-b992-f61de946a502\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-mdszb" Mar 10 21:53:14 crc kubenswrapper[4919]: I0310 21:53:14.269512 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/f7ddfab0-25ee-490a-b992-f61de946a502-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-mdszb\" (UID: \"f7ddfab0-25ee-490a-b992-f61de946a502\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-mdszb" Mar 10 21:53:14 crc kubenswrapper[4919]: I0310 21:53:14.270522 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f7ddfab0-25ee-490a-b992-f61de946a502-service-ca\") pod \"cluster-version-operator-5c965bbfc6-mdszb\" (UID: \"f7ddfab0-25ee-490a-b992-f61de946a502\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-mdszb" Mar 10 21:53:14 crc kubenswrapper[4919]: I0310 21:53:14.276057 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f7ddfab0-25ee-490a-b992-f61de946a502-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-mdszb\" (UID: \"f7ddfab0-25ee-490a-b992-f61de946a502\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-mdszb" Mar 10 21:53:14 crc kubenswrapper[4919]: I0310 21:53:14.292530 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f7ddfab0-25ee-490a-b992-f61de946a502-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-mdszb\" (UID: \"f7ddfab0-25ee-490a-b992-f61de946a502\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-mdszb" Mar 10 21:53:14 crc kubenswrapper[4919]: I0310 21:53:14.382095 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-mdszb" Mar 10 21:53:14 crc kubenswrapper[4919]: I0310 21:53:14.479717 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 21:53:14 crc kubenswrapper[4919]: I0310 21:53:14.479786 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 21:53:14 crc kubenswrapper[4919]: I0310 21:53:14.479716 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 21:53:14 crc kubenswrapper[4919]: E0310 21:53:14.479844 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 21:53:14 crc kubenswrapper[4919]: E0310 21:53:14.479887 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 21:53:14 crc kubenswrapper[4919]: E0310 21:53:14.480062 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 21:53:14 crc kubenswrapper[4919]: I0310 21:53:14.515852 4919 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Mar 10 21:53:14 crc kubenswrapper[4919]: I0310 21:53:14.524519 4919 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 10 21:53:15 crc kubenswrapper[4919]: I0310 21:53:15.302974 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-mdszb" event={"ID":"f7ddfab0-25ee-490a-b992-f61de946a502","Type":"ContainerStarted","Data":"d3b5bc7a59623ab6ed63c056ccf81f32955f922a4dac56f526fe34afe6fbea07"} Mar 10 21:53:15 crc kubenswrapper[4919]: I0310 21:53:15.303052 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-mdszb" 
event={"ID":"f7ddfab0-25ee-490a-b992-f61de946a502","Type":"ContainerStarted","Data":"97c585b47dd603697e789d36ab59fae5f7342e433d699f0b054e5158d02fa4c7"} Mar 10 21:53:15 crc kubenswrapper[4919]: I0310 21:53:15.323007 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-mdszb" podStartSLOduration=120.322990044 podStartE2EDuration="2m0.322990044s" podCreationTimestamp="2026-03-10 21:51:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 21:53:15.322746778 +0000 UTC m=+182.564627426" watchObservedRunningTime="2026-03-10 21:53:15.322990044 +0000 UTC m=+182.564870672" Mar 10 21:53:15 crc kubenswrapper[4919]: I0310 21:53:15.479774 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ckwhl" Mar 10 21:53:15 crc kubenswrapper[4919]: E0310 21:53:15.480019 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ckwhl" podUID="a95e8b73-ffed-4248-b8ba-99fc7c5b900f" Mar 10 21:53:15 crc kubenswrapper[4919]: I0310 21:53:15.687533 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a95e8b73-ffed-4248-b8ba-99fc7c5b900f-metrics-certs\") pod \"network-metrics-daemon-ckwhl\" (UID: \"a95e8b73-ffed-4248-b8ba-99fc7c5b900f\") " pod="openshift-multus/network-metrics-daemon-ckwhl" Mar 10 21:53:15 crc kubenswrapper[4919]: E0310 21:53:15.687671 4919 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 21:53:15 crc kubenswrapper[4919]: E0310 21:53:15.687724 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a95e8b73-ffed-4248-b8ba-99fc7c5b900f-metrics-certs podName:a95e8b73-ffed-4248-b8ba-99fc7c5b900f nodeName:}" failed. No retries permitted until 2026-03-10 21:54:19.687711603 +0000 UTC m=+246.929592211 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a95e8b73-ffed-4248-b8ba-99fc7c5b900f-metrics-certs") pod "network-metrics-daemon-ckwhl" (UID: "a95e8b73-ffed-4248-b8ba-99fc7c5b900f") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 21:53:16 crc kubenswrapper[4919]: I0310 21:53:16.479038 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 21:53:16 crc kubenswrapper[4919]: I0310 21:53:16.479065 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 21:53:16 crc kubenswrapper[4919]: I0310 21:53:16.479073 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 21:53:16 crc kubenswrapper[4919]: E0310 21:53:16.479215 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 21:53:16 crc kubenswrapper[4919]: E0310 21:53:16.479356 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 21:53:16 crc kubenswrapper[4919]: E0310 21:53:16.479494 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 21:53:17 crc kubenswrapper[4919]: I0310 21:53:17.479525 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-ckwhl" Mar 10 21:53:17 crc kubenswrapper[4919]: E0310 21:53:17.479680 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ckwhl" podUID="a95e8b73-ffed-4248-b8ba-99fc7c5b900f" Mar 10 21:53:18 crc kubenswrapper[4919]: I0310 21:53:18.479230 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 21:53:18 crc kubenswrapper[4919]: I0310 21:53:18.479266 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 21:53:18 crc kubenswrapper[4919]: I0310 21:53:18.479305 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 21:53:18 crc kubenswrapper[4919]: E0310 21:53:18.479450 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 21:53:18 crc kubenswrapper[4919]: E0310 21:53:18.479527 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 21:53:18 crc kubenswrapper[4919]: E0310 21:53:18.479805 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 21:53:18 crc kubenswrapper[4919]: E0310 21:53:18.596572 4919 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 10 21:53:19 crc kubenswrapper[4919]: I0310 21:53:19.478963 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ckwhl" Mar 10 21:53:19 crc kubenswrapper[4919]: E0310 21:53:19.479112 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ckwhl" podUID="a95e8b73-ffed-4248-b8ba-99fc7c5b900f" Mar 10 21:53:20 crc kubenswrapper[4919]: I0310 21:53:20.479843 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 21:53:20 crc kubenswrapper[4919]: I0310 21:53:20.479942 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 21:53:20 crc kubenswrapper[4919]: E0310 21:53:20.480028 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 21:53:20 crc kubenswrapper[4919]: E0310 21:53:20.480301 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 21:53:20 crc kubenswrapper[4919]: I0310 21:53:20.480636 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 21:53:20 crc kubenswrapper[4919]: E0310 21:53:20.480793 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 21:53:21 crc kubenswrapper[4919]: I0310 21:53:21.479359 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-ckwhl" Mar 10 21:53:21 crc kubenswrapper[4919]: E0310 21:53:21.479498 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ckwhl" podUID="a95e8b73-ffed-4248-b8ba-99fc7c5b900f" Mar 10 21:53:22 crc kubenswrapper[4919]: I0310 21:53:22.479173 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 21:53:22 crc kubenswrapper[4919]: I0310 21:53:22.479186 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 21:53:22 crc kubenswrapper[4919]: E0310 21:53:22.479533 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 21:53:22 crc kubenswrapper[4919]: E0310 21:53:22.479870 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 21:53:22 crc kubenswrapper[4919]: I0310 21:53:22.481136 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 21:53:22 crc kubenswrapper[4919]: E0310 21:53:22.481320 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 21:53:23 crc kubenswrapper[4919]: I0310 21:53:23.479552 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ckwhl" Mar 10 21:53:23 crc kubenswrapper[4919]: E0310 21:53:23.481722 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ckwhl" podUID="a95e8b73-ffed-4248-b8ba-99fc7c5b900f" Mar 10 21:53:23 crc kubenswrapper[4919]: E0310 21:53:23.597349 4919 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 10 21:53:24 crc kubenswrapper[4919]: I0310 21:53:24.479251 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 21:53:24 crc kubenswrapper[4919]: E0310 21:53:24.479452 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 21:53:24 crc kubenswrapper[4919]: I0310 21:53:24.479520 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 21:53:24 crc kubenswrapper[4919]: I0310 21:53:24.479650 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 21:53:24 crc kubenswrapper[4919]: E0310 21:53:24.479706 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 21:53:24 crc kubenswrapper[4919]: E0310 21:53:24.479830 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 21:53:25 crc kubenswrapper[4919]: I0310 21:53:25.479323 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ckwhl" Mar 10 21:53:25 crc kubenswrapper[4919]: E0310 21:53:25.480652 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ckwhl" podUID="a95e8b73-ffed-4248-b8ba-99fc7c5b900f" Mar 10 21:53:25 crc kubenswrapper[4919]: I0310 21:53:25.481174 4919 scope.go:117] "RemoveContainer" containerID="8424a944ffb95aa4e069024df52cf69f2381dc0498735572e7cf94519fe6d880" Mar 10 21:53:25 crc kubenswrapper[4919]: E0310 21:53:25.481558 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-4dp67_openshift-ovn-kubernetes(a2e7c6fb-9e33-441d-9197-719929eb9e21)\"" pod="openshift-ovn-kubernetes/ovnkube-node-4dp67" podUID="a2e7c6fb-9e33-441d-9197-719929eb9e21" Mar 10 21:53:26 crc kubenswrapper[4919]: I0310 21:53:26.479818 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 21:53:26 crc kubenswrapper[4919]: I0310 21:53:26.479830 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 21:53:26 crc kubenswrapper[4919]: E0310 21:53:26.479958 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 21:53:26 crc kubenswrapper[4919]: I0310 21:53:26.480028 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 21:53:26 crc kubenswrapper[4919]: E0310 21:53:26.480284 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 21:53:26 crc kubenswrapper[4919]: E0310 21:53:26.480366 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 21:53:27 crc kubenswrapper[4919]: I0310 21:53:27.479680 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-ckwhl" Mar 10 21:53:27 crc kubenswrapper[4919]: E0310 21:53:27.479877 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ckwhl" podUID="a95e8b73-ffed-4248-b8ba-99fc7c5b900f" Mar 10 21:53:28 crc kubenswrapper[4919]: I0310 21:53:28.479978 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 21:53:28 crc kubenswrapper[4919]: I0310 21:53:28.480044 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 21:53:28 crc kubenswrapper[4919]: E0310 21:53:28.480215 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 21:53:28 crc kubenswrapper[4919]: I0310 21:53:28.480252 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 21:53:28 crc kubenswrapper[4919]: E0310 21:53:28.480427 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 21:53:28 crc kubenswrapper[4919]: E0310 21:53:28.480561 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 21:53:28 crc kubenswrapper[4919]: E0310 21:53:28.599522 4919 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 10 21:53:29 crc kubenswrapper[4919]: I0310 21:53:29.479363 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ckwhl" Mar 10 21:53:29 crc kubenswrapper[4919]: E0310 21:53:29.479927 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ckwhl" podUID="a95e8b73-ffed-4248-b8ba-99fc7c5b900f" Mar 10 21:53:30 crc kubenswrapper[4919]: I0310 21:53:30.479743 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 21:53:30 crc kubenswrapper[4919]: I0310 21:53:30.479816 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 21:53:30 crc kubenswrapper[4919]: I0310 21:53:30.479844 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 21:53:30 crc kubenswrapper[4919]: E0310 21:53:30.479978 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 21:53:30 crc kubenswrapper[4919]: E0310 21:53:30.480083 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 21:53:30 crc kubenswrapper[4919]: E0310 21:53:30.480477 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 21:53:31 crc kubenswrapper[4919]: I0310 21:53:31.478916 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-ckwhl" Mar 10 21:53:31 crc kubenswrapper[4919]: E0310 21:53:31.479136 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ckwhl" podUID="a95e8b73-ffed-4248-b8ba-99fc7c5b900f" Mar 10 21:53:32 crc kubenswrapper[4919]: I0310 21:53:32.378081 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-hbw8v_6a5db7c3-2a96-4030-8c88-5d82d325b62d/kube-multus/1.log" Mar 10 21:53:32 crc kubenswrapper[4919]: I0310 21:53:32.378837 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-hbw8v_6a5db7c3-2a96-4030-8c88-5d82d325b62d/kube-multus/0.log" Mar 10 21:53:32 crc kubenswrapper[4919]: I0310 21:53:32.378906 4919 generic.go:334] "Generic (PLEG): container finished" podID="6a5db7c3-2a96-4030-8c88-5d82d325b62d" containerID="0ea0659cf18bee888c2408100c1de192eb8da3991c3158d708c3083d31a61bdc" exitCode=1 Mar 10 21:53:32 crc kubenswrapper[4919]: I0310 21:53:32.379002 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-hbw8v" event={"ID":"6a5db7c3-2a96-4030-8c88-5d82d325b62d","Type":"ContainerDied","Data":"0ea0659cf18bee888c2408100c1de192eb8da3991c3158d708c3083d31a61bdc"} Mar 10 21:53:32 crc kubenswrapper[4919]: I0310 21:53:32.379135 4919 scope.go:117] "RemoveContainer" containerID="abe6d0aa7236ecb1ecf10432a82e6fd0b3103606dbf07a21f54a1908c77ef697" Mar 10 21:53:32 crc kubenswrapper[4919]: I0310 21:53:32.379721 4919 scope.go:117] "RemoveContainer" containerID="0ea0659cf18bee888c2408100c1de192eb8da3991c3158d708c3083d31a61bdc" Mar 10 21:53:32 crc kubenswrapper[4919]: E0310 21:53:32.379939 4919 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-hbw8v_openshift-multus(6a5db7c3-2a96-4030-8c88-5d82d325b62d)\"" pod="openshift-multus/multus-hbw8v" podUID="6a5db7c3-2a96-4030-8c88-5d82d325b62d" Mar 10 21:53:32 crc kubenswrapper[4919]: I0310 21:53:32.478911 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 21:53:32 crc kubenswrapper[4919]: I0310 21:53:32.478933 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 21:53:32 crc kubenswrapper[4919]: E0310 21:53:32.479052 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 21:53:32 crc kubenswrapper[4919]: I0310 21:53:32.478912 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 21:53:32 crc kubenswrapper[4919]: E0310 21:53:32.479275 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 21:53:32 crc kubenswrapper[4919]: E0310 21:53:32.479370 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 21:53:33 crc kubenswrapper[4919]: I0310 21:53:33.383998 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-hbw8v_6a5db7c3-2a96-4030-8c88-5d82d325b62d/kube-multus/1.log" Mar 10 21:53:33 crc kubenswrapper[4919]: I0310 21:53:33.478969 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ckwhl" Mar 10 21:53:33 crc kubenswrapper[4919]: E0310 21:53:33.481601 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ckwhl" podUID="a95e8b73-ffed-4248-b8ba-99fc7c5b900f" Mar 10 21:53:33 crc kubenswrapper[4919]: E0310 21:53:33.600518 4919 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 10 21:53:34 crc kubenswrapper[4919]: I0310 21:53:34.479310 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 21:53:34 crc kubenswrapper[4919]: I0310 21:53:34.479422 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 21:53:34 crc kubenswrapper[4919]: E0310 21:53:34.479482 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 21:53:34 crc kubenswrapper[4919]: I0310 21:53:34.479343 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 21:53:34 crc kubenswrapper[4919]: E0310 21:53:34.479701 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 21:53:34 crc kubenswrapper[4919]: E0310 21:53:34.479817 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 21:53:35 crc kubenswrapper[4919]: I0310 21:53:35.479308 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ckwhl" Mar 10 21:53:35 crc kubenswrapper[4919]: E0310 21:53:35.479471 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ckwhl" podUID="a95e8b73-ffed-4248-b8ba-99fc7c5b900f" Mar 10 21:53:36 crc kubenswrapper[4919]: I0310 21:53:36.479271 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 21:53:36 crc kubenswrapper[4919]: I0310 21:53:36.479292 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 21:53:36 crc kubenswrapper[4919]: E0310 21:53:36.479425 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 21:53:36 crc kubenswrapper[4919]: I0310 21:53:36.479512 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 21:53:36 crc kubenswrapper[4919]: E0310 21:53:36.479625 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 21:53:36 crc kubenswrapper[4919]: E0310 21:53:36.479722 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 21:53:37 crc kubenswrapper[4919]: I0310 21:53:37.479361 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ckwhl" Mar 10 21:53:37 crc kubenswrapper[4919]: E0310 21:53:37.479575 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ckwhl" podUID="a95e8b73-ffed-4248-b8ba-99fc7c5b900f" Mar 10 21:53:38 crc kubenswrapper[4919]: I0310 21:53:38.480018 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 21:53:38 crc kubenswrapper[4919]: I0310 21:53:38.480026 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 21:53:38 crc kubenswrapper[4919]: I0310 21:53:38.480143 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 21:53:38 crc kubenswrapper[4919]: E0310 21:53:38.480887 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 21:53:38 crc kubenswrapper[4919]: E0310 21:53:38.481212 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 21:53:38 crc kubenswrapper[4919]: E0310 21:53:38.481360 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 21:53:38 crc kubenswrapper[4919]: E0310 21:53:38.601737 4919 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 10 21:53:39 crc kubenswrapper[4919]: I0310 21:53:39.479498 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ckwhl" Mar 10 21:53:39 crc kubenswrapper[4919]: E0310 21:53:39.479688 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ckwhl" podUID="a95e8b73-ffed-4248-b8ba-99fc7c5b900f" Mar 10 21:53:40 crc kubenswrapper[4919]: I0310 21:53:40.479996 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 21:53:40 crc kubenswrapper[4919]: I0310 21:53:40.480136 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 21:53:40 crc kubenswrapper[4919]: E0310 21:53:40.480791 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 21:53:40 crc kubenswrapper[4919]: I0310 21:53:40.480191 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 21:53:40 crc kubenswrapper[4919]: I0310 21:53:40.481254 4919 scope.go:117] "RemoveContainer" containerID="8424a944ffb95aa4e069024df52cf69f2381dc0498735572e7cf94519fe6d880" Mar 10 21:53:40 crc kubenswrapper[4919]: E0310 21:53:40.481282 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 21:53:40 crc kubenswrapper[4919]: E0310 21:53:40.481171 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 21:53:41 crc kubenswrapper[4919]: I0310 21:53:41.415587 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4dp67_a2e7c6fb-9e33-441d-9197-719929eb9e21/ovnkube-controller/3.log" Mar 10 21:53:41 crc kubenswrapper[4919]: I0310 21:53:41.418903 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4dp67" event={"ID":"a2e7c6fb-9e33-441d-9197-719929eb9e21","Type":"ContainerStarted","Data":"55060f2f273513318e33f5f284462012a35e322decd9cdaeeea0602acc036b23"} Mar 10 21:53:41 crc kubenswrapper[4919]: I0310 21:53:41.419321 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-4dp67" Mar 10 21:53:41 crc kubenswrapper[4919]: I0310 21:53:41.463845 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-4dp67" podStartSLOduration=146.463827631 podStartE2EDuration="2m26.463827631s" podCreationTimestamp="2026-03-10 21:51:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 21:53:41.459282957 +0000 UTC m=+208.701163575" watchObservedRunningTime="2026-03-10 21:53:41.463827631 +0000 UTC m=+208.705708239" Mar 10 21:53:41 crc kubenswrapper[4919]: I0310 21:53:41.467646 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-ckwhl"] Mar 10 21:53:41 crc kubenswrapper[4919]: I0310 21:53:41.467778 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-ckwhl" Mar 10 21:53:41 crc kubenswrapper[4919]: E0310 21:53:41.467883 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ckwhl" podUID="a95e8b73-ffed-4248-b8ba-99fc7c5b900f" Mar 10 21:53:42 crc kubenswrapper[4919]: I0310 21:53:42.479552 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 21:53:42 crc kubenswrapper[4919]: I0310 21:53:42.479609 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 21:53:42 crc kubenswrapper[4919]: E0310 21:53:42.479723 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 21:53:42 crc kubenswrapper[4919]: I0310 21:53:42.479554 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 21:53:42 crc kubenswrapper[4919]: E0310 21:53:42.479917 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 21:53:42 crc kubenswrapper[4919]: E0310 21:53:42.480022 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 21:53:43 crc kubenswrapper[4919]: I0310 21:53:43.479617 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ckwhl" Mar 10 21:53:43 crc kubenswrapper[4919]: E0310 21:53:43.481690 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ckwhl" podUID="a95e8b73-ffed-4248-b8ba-99fc7c5b900f" Mar 10 21:53:43 crc kubenswrapper[4919]: E0310 21:53:43.602340 4919 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 10 21:53:44 crc kubenswrapper[4919]: I0310 21:53:44.479968 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 21:53:44 crc kubenswrapper[4919]: I0310 21:53:44.480043 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 21:53:44 crc kubenswrapper[4919]: I0310 21:53:44.481043 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 21:53:44 crc kubenswrapper[4919]: E0310 21:53:44.481251 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 21:53:44 crc kubenswrapper[4919]: E0310 21:53:44.481478 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 21:53:44 crc kubenswrapper[4919]: E0310 21:53:44.481459 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 21:53:45 crc kubenswrapper[4919]: I0310 21:53:45.479664 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-ckwhl" Mar 10 21:53:45 crc kubenswrapper[4919]: E0310 21:53:45.481255 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ckwhl" podUID="a95e8b73-ffed-4248-b8ba-99fc7c5b900f" Mar 10 21:53:46 crc kubenswrapper[4919]: I0310 21:53:46.474663 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 21:53:46 crc kubenswrapper[4919]: E0310 21:53:46.474880 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 21:55:48.474838078 +0000 UTC m=+335.716718736 (durationBeforeRetry 2m2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 21:53:46 crc kubenswrapper[4919]: I0310 21:53:46.475651 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 21:53:46 crc kubenswrapper[4919]: E0310 21:53:46.475932 4919 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 21:53:46 crc kubenswrapper[4919]: E0310 21:53:46.475986 4919 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 21:53:46 crc kubenswrapper[4919]: E0310 21:53:46.476016 4919 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 21:53:46 crc kubenswrapper[4919]: E0310 21:53:46.476099 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. 
No retries permitted until 2026-03-10 21:55:48.476073401 +0000 UTC m=+335.717954059 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 21:53:46 crc kubenswrapper[4919]: E0310 21:53:46.476283 4919 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 21:53:46 crc kubenswrapper[4919]: E0310 21:53:46.476336 4919 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 21:53:46 crc kubenswrapper[4919]: E0310 21:53:46.476370 4919 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 21:53:46 crc kubenswrapper[4919]: E0310 21:53:46.476505 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-10 21:55:48.476477783 +0000 UTC m=+335.718358431 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 21:53:46 crc kubenswrapper[4919]: I0310 21:53:46.476519 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 21:53:46 crc kubenswrapper[4919]: I0310 21:53:46.476692 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 21:53:46 crc kubenswrapper[4919]: I0310 21:53:46.476759 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 21:53:46 crc kubenswrapper[4919]: E0310 21:53:46.476809 4919 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 21:53:46 
crc kubenswrapper[4919]: E0310 21:53:46.476866 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 21:55:48.476852473 +0000 UTC m=+335.718733091 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 21:53:46 crc kubenswrapper[4919]: E0310 21:53:46.476961 4919 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 21:53:46 crc kubenswrapper[4919]: E0310 21:53:46.477063 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 21:55:48.477032138 +0000 UTC m=+335.718912916 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 21:53:46 crc kubenswrapper[4919]: I0310 21:53:46.479698 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 10 21:53:46 crc kubenswrapper[4919]: E0310 21:53:46.479828 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 10 21:53:46 crc kubenswrapper[4919]: I0310 21:53:46.479988 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 10 21:53:46 crc kubenswrapper[4919]: I0310 21:53:46.480005 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 10 21:53:46 crc kubenswrapper[4919]: E0310 21:53:46.480187 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 10 21:53:46 crc kubenswrapper[4919]: E0310 21:53:46.480353 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 10 21:53:46 crc kubenswrapper[4919]: I0310 21:53:46.480517 4919 scope.go:117] "RemoveContainer" containerID="0ea0659cf18bee888c2408100c1de192eb8da3991c3158d708c3083d31a61bdc"
Mar 10 21:53:47 crc kubenswrapper[4919]: I0310 21:53:47.443107 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-hbw8v_6a5db7c3-2a96-4030-8c88-5d82d325b62d/kube-multus/1.log"
Mar 10 21:53:47 crc kubenswrapper[4919]: I0310 21:53:47.443193 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-hbw8v" event={"ID":"6a5db7c3-2a96-4030-8c88-5d82d325b62d","Type":"ContainerStarted","Data":"45570c9a8a9f7a51b2de68cee5cd8f8ae4fc089c9db6203a5d0b78f77094f15a"}
Mar 10 21:53:47 crc kubenswrapper[4919]: I0310 21:53:47.479252 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ckwhl"
Mar 10 21:53:47 crc kubenswrapper[4919]: E0310 21:53:47.479569 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ckwhl" podUID="a95e8b73-ffed-4248-b8ba-99fc7c5b900f"
Mar 10 21:53:48 crc kubenswrapper[4919]: I0310 21:53:48.479214 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 10 21:53:48 crc kubenswrapper[4919]: I0310 21:53:48.479268 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 10 21:53:48 crc kubenswrapper[4919]: E0310 21:53:48.479591 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 10 21:53:48 crc kubenswrapper[4919]: I0310 21:53:48.479619 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 10 21:53:48 crc kubenswrapper[4919]: E0310 21:53:48.479831 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 10 21:53:48 crc kubenswrapper[4919]: E0310 21:53:48.479926 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 10 21:53:49 crc kubenswrapper[4919]: I0310 21:53:49.479960 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ckwhl"
Mar 10 21:53:49 crc kubenswrapper[4919]: I0310 21:53:49.488301 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Mar 10 21:53:49 crc kubenswrapper[4919]: I0310 21:53:49.488447 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Mar 10 21:53:50 crc kubenswrapper[4919]: I0310 21:53:50.479889 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 10 21:53:50 crc kubenswrapper[4919]: I0310 21:53:50.479927 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 10 21:53:50 crc kubenswrapper[4919]: I0310 21:53:50.480020 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 10 21:53:50 crc kubenswrapper[4919]: I0310 21:53:50.482999 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Mar 10 21:53:50 crc kubenswrapper[4919]: I0310 21:53:50.483074 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin"
Mar 10 21:53:50 crc kubenswrapper[4919]: I0310 21:53:50.483272 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Mar 10 21:53:50 crc kubenswrapper[4919]: I0310 21:53:50.485783 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert"
Mar 10 21:53:54 crc kubenswrapper[4919]: I0310 21:53:54.916251 4919 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady"
Mar 10 21:53:54 crc kubenswrapper[4919]: I0310 21:53:54.970335 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-zbds9"]
Mar 10 21:53:54 crc kubenswrapper[4919]: I0310 21:53:54.971190 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-zbds9"
Mar 10 21:53:54 crc kubenswrapper[4919]: I0310 21:53:54.972946 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-2dzqk"]
Mar 10 21:53:54 crc kubenswrapper[4919]: I0310 21:53:54.973751 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-2dzqk"
Mar 10 21:53:54 crc kubenswrapper[4919]: I0310 21:53:54.975528 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-ls7k6"]
Mar 10 21:53:54 crc kubenswrapper[4919]: I0310 21:53:54.975862 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 10 21:53:54 crc kubenswrapper[4919]: I0310 21:53:54.976274 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ls7k6"
Mar 10 21:53:54 crc kubenswrapper[4919]: I0310 21:53:54.977225 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 10 21:53:54 crc kubenswrapper[4919]: I0310 21:53:54.978772 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Mar 10 21:53:54 crc kubenswrapper[4919]: I0310 21:53:54.987023 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Mar 10 21:53:54 crc kubenswrapper[4919]: I0310 21:53:54.988354 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 10 21:53:54 crc kubenswrapper[4919]: I0310 21:53:54.988799 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 10 21:53:54 crc kubenswrapper[4919]: I0310 21:53:54.996471 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-7skkx"]
Mar 10 21:53:54 crc kubenswrapper[4919]: I0310 21:53:54.996713 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Mar 10 21:53:54 crc kubenswrapper[4919]: I0310 21:53:54.997819 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-7skkx"
Mar 10 21:53:54 crc kubenswrapper[4919]: I0310 21:53:54.999851 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.000537 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.000567 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.000723 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.000815 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-qrdwk"]
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.000940 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.001111 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.001290 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.001383 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.001500 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.001671 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.001687 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.002057 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qrdwk"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.003642 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ttjzg"]
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.004640 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ttjzg"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.009386 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.009381 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-w5wb5"]
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.010297 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-w5wb5"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.013466 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-mvwrl"]
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.014291 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-mvwrl"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.018531 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.019034 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.019254 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.019357 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.019513 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.019873 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.031214 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.033758 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-h9zcs"]
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.035829 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-fb6tt"]
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.039671 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.040313 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.040448 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.040566 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.040608 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.043758 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.043800 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.056570 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.056835 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.057027 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.057846 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-h9zcs"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.058446 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.059659 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.059884 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.060115 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.060321 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.060567 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.060973 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.062780 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.062941 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.063059 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.063112 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.063175 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.063057 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.063822 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-6nwch"]
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.064368 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-6nwch"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.064686 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-fb6tt"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.065448 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zchq4"]
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.066054 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zchq4"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.069417 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.069645 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.071889 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ckbrj\" (UniqueName: \"kubernetes.io/projected/d809a4c1-5e06-46b7-a39c-466b694361ce-kube-api-access-ckbrj\") pod \"controller-manager-879f6c89f-zbds9\" (UID: \"d809a4c1-5e06-46b7-a39c-466b694361ce\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zbds9"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.071945 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d809a4c1-5e06-46b7-a39c-466b694361ce-config\") pod \"controller-manager-879f6c89f-zbds9\" (UID: \"d809a4c1-5e06-46b7-a39c-466b694361ce\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zbds9"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.071972 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bbf77df7-a88b-49cb-9d1f-10cdac32f499-config\") pod \"authentication-operator-69f744f599-2dzqk\" (UID: \"bbf77df7-a88b-49cb-9d1f-10cdac32f499\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-2dzqk"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.071997 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bbf77df7-a88b-49cb-9d1f-10cdac32f499-serving-cert\") pod \"authentication-operator-69f744f599-2dzqk\" (UID: \"bbf77df7-a88b-49cb-9d1f-10cdac32f499\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-2dzqk"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.072042 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thbnh\" (UniqueName: \"kubernetes.io/projected/492e2f21-90e0-4073-b4ad-b562bcf62486-kube-api-access-thbnh\") pod \"route-controller-manager-6576b87f9c-ls7k6\" (UID: \"492e2f21-90e0-4073-b4ad-b562bcf62486\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ls7k6"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.072065 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d809a4c1-5e06-46b7-a39c-466b694361ce-serving-cert\") pod \"controller-manager-879f6c89f-zbds9\" (UID: \"d809a4c1-5e06-46b7-a39c-466b694361ce\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zbds9"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.072085 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d809a4c1-5e06-46b7-a39c-466b694361ce-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-zbds9\" (UID: \"d809a4c1-5e06-46b7-a39c-466b694361ce\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zbds9"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.072105 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/492e2f21-90e0-4073-b4ad-b562bcf62486-serving-cert\") pod \"route-controller-manager-6576b87f9c-ls7k6\" (UID: \"492e2f21-90e0-4073-b4ad-b562bcf62486\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ls7k6"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.072116 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.072131 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d809a4c1-5e06-46b7-a39c-466b694361ce-client-ca\") pod \"controller-manager-879f6c89f-zbds9\" (UID: \"d809a4c1-5e06-46b7-a39c-466b694361ce\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zbds9"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.072153 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/492e2f21-90e0-4073-b4ad-b562bcf62486-config\") pod \"route-controller-manager-6576b87f9c-ls7k6\" (UID: \"492e2f21-90e0-4073-b4ad-b562bcf62486\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ls7k6"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.072173 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/492e2f21-90e0-4073-b4ad-b562bcf62486-client-ca\") pod \"route-controller-manager-6576b87f9c-ls7k6\" (UID: \"492e2f21-90e0-4073-b4ad-b562bcf62486\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ls7k6"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.072202 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmggh\" (UniqueName: \"kubernetes.io/projected/bbf77df7-a88b-49cb-9d1f-10cdac32f499-kube-api-access-pmggh\") pod \"authentication-operator-69f744f599-2dzqk\" (UID: \"bbf77df7-a88b-49cb-9d1f-10cdac32f499\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-2dzqk"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.072249 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bbf77df7-a88b-49cb-9d1f-10cdac32f499-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-2dzqk\" (UID: \"bbf77df7-a88b-49cb-9d1f-10cdac32f499\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-2dzqk"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.072272 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bbf77df7-a88b-49cb-9d1f-10cdac32f499-service-ca-bundle\") pod \"authentication-operator-69f744f599-2dzqk\" (UID: \"bbf77df7-a88b-49cb-9d1f-10cdac32f499\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-2dzqk"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.072556 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.072655 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.072806 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.074339 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-49z6f"]
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.075097 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-49z6f"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.075176 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-m9qd4"]
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.075616 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-m9qd4"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.077813 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7g7s2"]
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.078356 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7g7s2"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.086379 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.086769 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.086989 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.087048 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.087093 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.087178 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.087244 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.087280 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.087372 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.087458 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.087528 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.088052 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.088077 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.088415 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.088441 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.089202 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-58nxf"]
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.089518 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.090415 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-kjw8s"]
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.090501 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-58nxf"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.091198 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.091267 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-kjw8s"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.091402 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.091617 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.093698 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.093713 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.093733 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.093824 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.093860 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.093986 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.094038 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.093988 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.094124 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.094176 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.094212 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.094237 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.094237 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.094299 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.094459 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.094874 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.095163 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.101344 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2xrj5"]
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.102054 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2xrj5"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.102572 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-zbds9"]
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.106830 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-2dzqk"]
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.116919 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.117786 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zchq4"]
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.119379 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.127800 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.127935 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.128162 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.128504 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.128701 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.131563 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.128509 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.151525 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.151868 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.151927 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.152189 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.153551 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-hmjhm"]
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.153782 4919 reflector.go:368] Caches populated for *v1.ConfigMap from
object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.153987 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-hmjhm" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.154027 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-c62fh"] Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.155372 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.161677 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-lxrsj"] Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.161898 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-c62fh" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.162101 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-f48z9"] Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.162247 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-lxrsj" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.162764 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kgftl"] Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.162905 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-f48z9" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.163046 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bftfp"] Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.163382 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-qrdwk"] Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.163417 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ttjzg"] Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.163464 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bftfp" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.163608 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kgftl" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.163931 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qnjs8"] Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.164348 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qnjs8" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.165491 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-tk7xs"] Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.166064 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-tk7xs" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.166377 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bbh8l"] Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.166948 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bbh8l" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.172194 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-hgx8b"] Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.172724 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bbf77df7-a88b-49cb-9d1f-10cdac32f499-service-ca-bundle\") pod \"authentication-operator-69f744f599-2dzqk\" (UID: \"bbf77df7-a88b-49cb-9d1f-10cdac32f499\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-2dzqk" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.172750 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/b2c5effa-6a05-4978-b4a9-1daada6b2465-etcd-serving-ca\") pod \"apiserver-76f77b778f-fb6tt\" (UID: \"b2c5effa-6a05-4978-b4a9-1daada6b2465\") " pod="openshift-apiserver/apiserver-76f77b778f-fb6tt" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.172770 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-hgx8b" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.172777 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b2c5effa-6a05-4978-b4a9-1daada6b2465-etcd-client\") pod \"apiserver-76f77b778f-fb6tt\" (UID: \"b2c5effa-6a05-4978-b4a9-1daada6b2465\") " pod="openshift-apiserver/apiserver-76f77b778f-fb6tt" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.172794 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/b00c04d4-1287-409a-8e67-2edb888bf832-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-m9qd4\" (UID: \"b00c04d4-1287-409a-8e67-2edb888bf832\") " pod="openshift-authentication/oauth-openshift-558db77b4-m9qd4" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.172810 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69fc2ea4-6491-4109-8f4a-8b8fb369dcce-config\") pod \"machine-api-operator-5694c8668f-mvwrl\" (UID: \"69fc2ea4-6491-4109-8f4a-8b8fb369dcce\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mvwrl" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.172831 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/b00c04d4-1287-409a-8e67-2edb888bf832-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-m9qd4\" (UID: \"b00c04d4-1287-409a-8e67-2edb888bf832\") " pod="openshift-authentication/oauth-openshift-558db77b4-m9qd4" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.172849 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9657873d-9275-4945-9e91-0b2c2844ae5d-oauth-serving-cert\") pod \"console-f9d7485db-58nxf\" (UID: \"9657873d-9275-4945-9e91-0b2c2844ae5d\") " pod="openshift-console/console-f9d7485db-58nxf" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.172866 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rh86p\" (UniqueName: \"kubernetes.io/projected/077de36b-affe-4c2b-905d-38ae64514274-kube-api-access-rh86p\") pod \"openshift-apiserver-operator-796bbdcf4f-ttjzg\" (UID: \"077de36b-affe-4c2b-905d-38ae64514274\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ttjzg" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.172881 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/b00c04d4-1287-409a-8e67-2edb888bf832-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-m9qd4\" (UID: \"b00c04d4-1287-409a-8e67-2edb888bf832\") " pod="openshift-authentication/oauth-openshift-558db77b4-m9qd4" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.172897 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9657873d-9275-4945-9e91-0b2c2844ae5d-console-config\") pod \"console-f9d7485db-58nxf\" (UID: \"9657873d-9275-4945-9e91-0b2c2844ae5d\") " pod="openshift-console/console-f9d7485db-58nxf" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.172913 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ckbrj\" (UniqueName: \"kubernetes.io/projected/d809a4c1-5e06-46b7-a39c-466b694361ce-kube-api-access-ckbrj\") pod \"controller-manager-879f6c89f-zbds9\" (UID: \"d809a4c1-5e06-46b7-a39c-466b694361ce\") " 
pod="openshift-controller-manager/controller-manager-879f6c89f-zbds9" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.172928 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e327c09-06d0-420c-b749-50306c5336b3-config\") pod \"console-operator-58897d9998-h9zcs\" (UID: \"5e327c09-06d0-420c-b749-50306c5336b3\") " pod="openshift-console-operator/console-operator-58897d9998-h9zcs" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.172947 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/b5a5c848-4844-4280-a444-9173fff0b8e1-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-qrdwk\" (UID: \"b5a5c848-4844-4280-a444-9173fff0b8e1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qrdwk" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.172961 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dtpc6\" (UniqueName: \"kubernetes.io/projected/69fc2ea4-6491-4109-8f4a-8b8fb369dcce-kube-api-access-dtpc6\") pod \"machine-api-operator-5694c8668f-mvwrl\" (UID: \"69fc2ea4-6491-4109-8f4a-8b8fb369dcce\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mvwrl" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.173008 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dg7bw\" (UniqueName: \"kubernetes.io/projected/fb7623ea-ec67-4061-82a6-4099e52fa3b9-kube-api-access-dg7bw\") pod \"downloads-7954f5f757-49z6f\" (UID: \"fb7623ea-ec67-4061-82a6-4099e52fa3b9\") " pod="openshift-console/downloads-7954f5f757-49z6f" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.173023 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8thhg\" (UniqueName: 
\"kubernetes.io/projected/9657873d-9275-4945-9e91-0b2c2844ae5d-kube-api-access-8thhg\") pod \"console-f9d7485db-58nxf\" (UID: \"9657873d-9275-4945-9e91-0b2c2844ae5d\") " pod="openshift-console/console-f9d7485db-58nxf" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.173074 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brnxv\" (UniqueName: \"kubernetes.io/projected/c18a5bd2-fc75-4dce-8ffa-0ba191c52064-kube-api-access-brnxv\") pod \"machine-approver-56656f9798-7skkx\" (UID: \"c18a5bd2-fc75-4dce-8ffa-0ba191c52064\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-7skkx" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.173109 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b2c5effa-6a05-4978-b4a9-1daada6b2465-audit-dir\") pod \"apiserver-76f77b778f-fb6tt\" (UID: \"b2c5effa-6a05-4978-b4a9-1daada6b2465\") " pod="openshift-apiserver/apiserver-76f77b778f-fb6tt" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.173129 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/d8b972b3-02f6-4c31-bb8e-0c229ea48621-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-zchq4\" (UID: \"d8b972b3-02f6-4c31-bb8e-0c229ea48621\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zchq4" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.173168 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xzm74\" (UniqueName: \"kubernetes.io/projected/5e327c09-06d0-420c-b749-50306c5336b3-kube-api-access-xzm74\") pod \"console-operator-58897d9998-h9zcs\" (UID: \"5e327c09-06d0-420c-b749-50306c5336b3\") " pod="openshift-console-operator/console-operator-58897d9998-h9zcs" Mar 
10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.173191 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b5a5c848-4844-4280-a444-9173fff0b8e1-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-qrdwk\" (UID: \"b5a5c848-4844-4280-a444-9173fff0b8e1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qrdwk" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.173225 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d809a4c1-5e06-46b7-a39c-466b694361ce-config\") pod \"controller-manager-879f6c89f-zbds9\" (UID: \"d809a4c1-5e06-46b7-a39c-466b694361ce\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zbds9" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.173262 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/b00c04d4-1287-409a-8e67-2edb888bf832-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-m9qd4\" (UID: \"b00c04d4-1287-409a-8e67-2edb888bf832\") " pod="openshift-authentication/oauth-openshift-558db77b4-m9qd4" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.173287 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6350fa6a-9381-4de8-8a8e-4a14b9d253bc-serving-cert\") pod \"etcd-operator-b45778765-kjw8s\" (UID: \"6350fa6a-9381-4de8-8a8e-4a14b9d253bc\") " pod="openshift-etcd-operator/etcd-operator-b45778765-kjw8s" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.173318 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bbf77df7-a88b-49cb-9d1f-10cdac32f499-config\") pod \"authentication-operator-69f744f599-2dzqk\" (UID: 
\"bbf77df7-a88b-49cb-9d1f-10cdac32f499\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-2dzqk" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.173349 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b2c5effa-6a05-4978-b4a9-1daada6b2465-trusted-ca-bundle\") pod \"apiserver-76f77b778f-fb6tt\" (UID: \"b2c5effa-6a05-4978-b4a9-1daada6b2465\") " pod="openshift-apiserver/apiserver-76f77b778f-fb6tt" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.173364 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c95v2\" (UniqueName: \"kubernetes.io/projected/b2c5effa-6a05-4978-b4a9-1daada6b2465-kube-api-access-c95v2\") pod \"apiserver-76f77b778f-fb6tt\" (UID: \"b2c5effa-6a05-4978-b4a9-1daada6b2465\") " pod="openshift-apiserver/apiserver-76f77b778f-fb6tt" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.173382 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9657873d-9275-4945-9e91-0b2c2844ae5d-trusted-ca-bundle\") pod \"console-f9d7485db-58nxf\" (UID: \"9657873d-9275-4945-9e91-0b2c2844ae5d\") " pod="openshift-console/console-f9d7485db-58nxf" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.173417 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c18a5bd2-fc75-4dce-8ffa-0ba191c52064-config\") pod \"machine-approver-56656f9798-7skkx\" (UID: \"c18a5bd2-fc75-4dce-8ffa-0ba191c52064\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-7skkx" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.173434 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/b00c04d4-1287-409a-8e67-2edb888bf832-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-m9qd4\" (UID: \"b00c04d4-1287-409a-8e67-2edb888bf832\") " pod="openshift-authentication/oauth-openshift-558db77b4-m9qd4" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.173434 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.173451 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/69fc2ea4-6491-4109-8f4a-8b8fb369dcce-images\") pod \"machine-api-operator-5694c8668f-mvwrl\" (UID: \"69fc2ea4-6491-4109-8f4a-8b8fb369dcce\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mvwrl" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.173569 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-j6q5w"] Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.173599 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/b00c04d4-1287-409a-8e67-2edb888bf832-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-m9qd4\" (UID: \"b00c04d4-1287-409a-8e67-2edb888bf832\") " pod="openshift-authentication/oauth-openshift-558db77b4-m9qd4" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.173641 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/19dc609c-7c35-46d2-b621-34faa138eedd-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-7g7s2\" (UID: \"19dc609c-7c35-46d2-b621-34faa138eedd\") " 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7g7s2" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.173662 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b00c04d4-1287-409a-8e67-2edb888bf832-audit-policies\") pod \"oauth-openshift-558db77b4-m9qd4\" (UID: \"b00c04d4-1287-409a-8e67-2edb888bf832\") " pod="openshift-authentication/oauth-openshift-558db77b4-m9qd4" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.173683 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/b00c04d4-1287-409a-8e67-2edb888bf832-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-m9qd4\" (UID: \"b00c04d4-1287-409a-8e67-2edb888bf832\") " pod="openshift-authentication/oauth-openshift-558db77b4-m9qd4" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.173733 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bbf77df7-a88b-49cb-9d1f-10cdac32f499-serving-cert\") pod \"authentication-operator-69f744f599-2dzqk\" (UID: \"bbf77df7-a88b-49cb-9d1f-10cdac32f499\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-2dzqk" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.173776 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4w2zr\" (UniqueName: \"kubernetes.io/projected/23546924-f010-4ffb-8e0a-cde77e6b086f-kube-api-access-4w2zr\") pod \"dns-operator-744455d44c-w5wb5\" (UID: \"23546924-f010-4ffb-8e0a-cde77e6b086f\") " pod="openshift-dns-operator/dns-operator-744455d44c-w5wb5" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.173791 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-9mbw4\" (UniqueName: \"kubernetes.io/projected/b00c04d4-1287-409a-8e67-2edb888bf832-kube-api-access-9mbw4\") pod \"oauth-openshift-558db77b4-m9qd4\" (UID: \"b00c04d4-1287-409a-8e67-2edb888bf832\") " pod="openshift-authentication/oauth-openshift-558db77b4-m9qd4" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.174130 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gk858\" (UniqueName: \"kubernetes.io/projected/d8b972b3-02f6-4c31-bb8e-0c229ea48621-kube-api-access-gk858\") pod \"cluster-samples-operator-665b6dd947-zchq4\" (UID: \"d8b972b3-02f6-4c31-bb8e-0c229ea48621\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zchq4" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.174169 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-thbnh\" (UniqueName: \"kubernetes.io/projected/492e2f21-90e0-4073-b4ad-b562bcf62486-kube-api-access-thbnh\") pod \"route-controller-manager-6576b87f9c-ls7k6\" (UID: \"492e2f21-90e0-4073-b4ad-b562bcf62486\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ls7k6" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.174209 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b2c5effa-6a05-4978-b4a9-1daada6b2465-serving-cert\") pod \"apiserver-76f77b778f-fb6tt\" (UID: \"b2c5effa-6a05-4978-b4a9-1daada6b2465\") " pod="openshift-apiserver/apiserver-76f77b778f-fb6tt" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.174226 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9657873d-9275-4945-9e91-0b2c2844ae5d-console-serving-cert\") pod \"console-f9d7485db-58nxf\" (UID: \"9657873d-9275-4945-9e91-0b2c2844ae5d\") " 
pod="openshift-console/console-f9d7485db-58nxf" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.174237 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d809a4c1-5e06-46b7-a39c-466b694361ce-config\") pod \"controller-manager-879f6c89f-zbds9\" (UID: \"d809a4c1-5e06-46b7-a39c-466b694361ce\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zbds9" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.174241 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/b00c04d4-1287-409a-8e67-2edb888bf832-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-m9qd4\" (UID: \"b00c04d4-1287-409a-8e67-2edb888bf832\") " pod="openshift-authentication/oauth-openshift-558db77b4-m9qd4" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.174277 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/6350fa6a-9381-4de8-8a8e-4a14b9d253bc-etcd-ca\") pod \"etcd-operator-b45778765-kjw8s\" (UID: \"6350fa6a-9381-4de8-8a8e-4a14b9d253bc\") " pod="openshift-etcd-operator/etcd-operator-b45778765-kjw8s" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.174317 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d809a4c1-5e06-46b7-a39c-466b694361ce-serving-cert\") pod \"controller-manager-879f6c89f-zbds9\" (UID: \"d809a4c1-5e06-46b7-a39c-466b694361ce\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zbds9" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.174335 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/23546924-f010-4ffb-8e0a-cde77e6b086f-metrics-tls\") pod 
\"dns-operator-744455d44c-w5wb5\" (UID: \"23546924-f010-4ffb-8e0a-cde77e6b086f\") " pod="openshift-dns-operator/dns-operator-744455d44c-w5wb5" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.174354 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/077de36b-affe-4c2b-905d-38ae64514274-config\") pod \"openshift-apiserver-operator-796bbdcf4f-ttjzg\" (UID: \"077de36b-affe-4c2b-905d-38ae64514274\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ttjzg" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.174365 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-v2l77"] Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.174405 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d809a4c1-5e06-46b7-a39c-466b694361ce-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-zbds9\" (UID: \"d809a4c1-5e06-46b7-a39c-466b694361ce\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zbds9" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.174425 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b2c5effa-6a05-4978-b4a9-1daada6b2465-config\") pod \"apiserver-76f77b778f-fb6tt\" (UID: \"b2c5effa-6a05-4978-b4a9-1daada6b2465\") " pod="openshift-apiserver/apiserver-76f77b778f-fb6tt" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.174446 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/19dc609c-7c35-46d2-b621-34faa138eedd-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-7g7s2\" (UID: \"19dc609c-7c35-46d2-b621-34faa138eedd\") " 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7g7s2" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.174596 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b5a5c848-4844-4280-a444-9173fff0b8e1-audit-policies\") pod \"apiserver-7bbb656c7d-qrdwk\" (UID: \"b5a5c848-4844-4280-a444-9173fff0b8e1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qrdwk" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.174615 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/6350fa6a-9381-4de8-8a8e-4a14b9d253bc-etcd-client\") pod \"etcd-operator-b45778765-kjw8s\" (UID: \"6350fa6a-9381-4de8-8a8e-4a14b9d253bc\") " pod="openshift-etcd-operator/etcd-operator-b45778765-kjw8s" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.174635 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/492e2f21-90e0-4073-b4ad-b562bcf62486-serving-cert\") pod \"route-controller-manager-6576b87f9c-ls7k6\" (UID: \"492e2f21-90e0-4073-b4ad-b562bcf62486\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ls7k6" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.174669 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/f779c56d-d87a-4e36-8a8d-f9d93bb2b7e0-available-featuregates\") pod \"openshift-config-operator-7777fb866f-6nwch\" (UID: \"f779c56d-d87a-4e36-8a8d-f9d93bb2b7e0\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-6nwch" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.174688 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b00c04d4-1287-409a-8e67-2edb888bf832-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-m9qd4\" (UID: \"b00c04d4-1287-409a-8e67-2edb888bf832\") " pod="openshift-authentication/oauth-openshift-558db77b4-m9qd4" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.174706 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/b5a5c848-4844-4280-a444-9173fff0b8e1-encryption-config\") pod \"apiserver-7bbb656c7d-qrdwk\" (UID: \"b5a5c848-4844-4280-a444-9173fff0b8e1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qrdwk" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.174725 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d809a4c1-5e06-46b7-a39c-466b694361ce-client-ca\") pod \"controller-manager-879f6c89f-zbds9\" (UID: \"d809a4c1-5e06-46b7-a39c-466b694361ce\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zbds9" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.174755 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/492e2f21-90e0-4073-b4ad-b562bcf62486-config\") pod \"route-controller-manager-6576b87f9c-ls7k6\" (UID: \"492e2f21-90e0-4073-b4ad-b562bcf62486\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ls7k6" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.174772 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/492e2f21-90e0-4073-b4ad-b562bcf62486-client-ca\") pod \"route-controller-manager-6576b87f9c-ls7k6\" (UID: \"492e2f21-90e0-4073-b4ad-b562bcf62486\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ls7k6" 
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.174789 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/b2c5effa-6a05-4978-b4a9-1daada6b2465-node-pullsecrets\") pod \"apiserver-76f77b778f-fb6tt\" (UID: \"b2c5effa-6a05-4978-b4a9-1daada6b2465\") " pod="openshift-apiserver/apiserver-76f77b778f-fb6tt" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.174806 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pfbc7\" (UniqueName: \"kubernetes.io/projected/f779c56d-d87a-4e36-8a8d-f9d93bb2b7e0-kube-api-access-pfbc7\") pod \"openshift-config-operator-7777fb866f-6nwch\" (UID: \"f779c56d-d87a-4e36-8a8d-f9d93bb2b7e0\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-6nwch" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.174821 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/b00c04d4-1287-409a-8e67-2edb888bf832-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-m9qd4\" (UID: \"b00c04d4-1287-409a-8e67-2edb888bf832\") " pod="openshift-authentication/oauth-openshift-558db77b4-m9qd4" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.174836 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b5a5c848-4844-4280-a444-9173fff0b8e1-etcd-client\") pod \"apiserver-7bbb656c7d-qrdwk\" (UID: \"b5a5c848-4844-4280-a444-9173fff0b8e1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qrdwk" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.174852 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/b5a5c848-4844-4280-a444-9173fff0b8e1-serving-cert\") pod \"apiserver-7bbb656c7d-qrdwk\" (UID: \"b5a5c848-4844-4280-a444-9173fff0b8e1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qrdwk" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.174865 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2nvl\" (UniqueName: \"kubernetes.io/projected/b5a5c848-4844-4280-a444-9173fff0b8e1-kube-api-access-m2nvl\") pod \"apiserver-7bbb656c7d-qrdwk\" (UID: \"b5a5c848-4844-4280-a444-9173fff0b8e1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qrdwk" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.175260 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-dwg72"] Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.175333 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d809a4c1-5e06-46b7-a39c-466b694361ce-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-zbds9\" (UID: \"d809a4c1-5e06-46b7-a39c-466b694361ce\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zbds9" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.175481 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-j6q5w" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.175491 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d809a4c1-5e06-46b7-a39c-466b694361ce-client-ca\") pod \"controller-manager-879f6c89f-zbds9\" (UID: \"d809a4c1-5e06-46b7-a39c-466b694361ce\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zbds9" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.175586 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-v2l77" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.175605 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/077de36b-affe-4c2b-905d-38ae64514274-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-ttjzg\" (UID: \"077de36b-affe-4c2b-905d-38ae64514274\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ttjzg" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.175649 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bs7pm\" (UniqueName: \"kubernetes.io/projected/6350fa6a-9381-4de8-8a8e-4a14b9d253bc-kube-api-access-bs7pm\") pod \"etcd-operator-b45778765-kjw8s\" (UID: \"6350fa6a-9381-4de8-8a8e-4a14b9d253bc\") " pod="openshift-etcd-operator/etcd-operator-b45778765-kjw8s" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.175691 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pmggh\" (UniqueName: \"kubernetes.io/projected/bbf77df7-a88b-49cb-9d1f-10cdac32f499-kube-api-access-pmggh\") pod \"authentication-operator-69f744f599-2dzqk\" (UID: \"bbf77df7-a88b-49cb-9d1f-10cdac32f499\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-2dzqk" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.175738 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/b2c5effa-6a05-4978-b4a9-1daada6b2465-image-import-ca\") pod \"apiserver-76f77b778f-fb6tt\" (UID: \"b2c5effa-6a05-4978-b4a9-1daada6b2465\") " pod="openshift-apiserver/apiserver-76f77b778f-fb6tt" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.175760 4919 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/19dc609c-7c35-46d2-b621-34faa138eedd-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-7g7s2\" (UID: \"19dc609c-7c35-46d2-b621-34faa138eedd\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7g7s2" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.175794 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ggmjx\" (UniqueName: \"kubernetes.io/projected/19dc609c-7c35-46d2-b621-34faa138eedd-kube-api-access-ggmjx\") pod \"cluster-image-registry-operator-dc59b4c8b-7g7s2\" (UID: \"19dc609c-7c35-46d2-b621-34faa138eedd\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7g7s2" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.175816 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9657873d-9275-4945-9e91-0b2c2844ae5d-console-oauth-config\") pod \"console-f9d7485db-58nxf\" (UID: \"9657873d-9275-4945-9e91-0b2c2844ae5d\") " pod="openshift-console/console-f9d7485db-58nxf" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.175838 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9657873d-9275-4945-9e91-0b2c2844ae5d-service-ca\") pod \"console-f9d7485db-58nxf\" (UID: \"9657873d-9275-4945-9e91-0b2c2844ae5d\") " pod="openshift-console/console-f9d7485db-58nxf" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.175860 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/c18a5bd2-fc75-4dce-8ffa-0ba191c52064-machine-approver-tls\") pod \"machine-approver-56656f9798-7skkx\" (UID: 
\"c18a5bd2-fc75-4dce-8ffa-0ba191c52064\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-7skkx" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.175902 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b00c04d4-1287-409a-8e67-2edb888bf832-audit-dir\") pod \"oauth-openshift-558db77b4-m9qd4\" (UID: \"b00c04d4-1287-409a-8e67-2edb888bf832\") " pod="openshift-authentication/oauth-openshift-558db77b4-m9qd4" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.175923 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/6350fa6a-9381-4de8-8a8e-4a14b9d253bc-etcd-service-ca\") pod \"etcd-operator-b45778765-kjw8s\" (UID: \"6350fa6a-9381-4de8-8a8e-4a14b9d253bc\") " pod="openshift-etcd-operator/etcd-operator-b45778765-kjw8s" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.175963 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f779c56d-d87a-4e36-8a8d-f9d93bb2b7e0-serving-cert\") pod \"openshift-config-operator-7777fb866f-6nwch\" (UID: \"f779c56d-d87a-4e36-8a8d-f9d93bb2b7e0\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-6nwch" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.175982 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6350fa6a-9381-4de8-8a8e-4a14b9d253bc-config\") pod \"etcd-operator-b45778765-kjw8s\" (UID: \"6350fa6a-9381-4de8-8a8e-4a14b9d253bc\") " pod="openshift-etcd-operator/etcd-operator-b45778765-kjw8s" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.176103 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" 
(UniqueName: \"kubernetes.io/configmap/b2c5effa-6a05-4978-b4a9-1daada6b2465-audit\") pod \"apiserver-76f77b778f-fb6tt\" (UID: \"b2c5effa-6a05-4978-b4a9-1daada6b2465\") " pod="openshift-apiserver/apiserver-76f77b778f-fb6tt" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.176129 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5e327c09-06d0-420c-b749-50306c5336b3-serving-cert\") pod \"console-operator-58897d9998-h9zcs\" (UID: \"5e327c09-06d0-420c-b749-50306c5336b3\") " pod="openshift-console-operator/console-operator-58897d9998-h9zcs" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.176162 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c18a5bd2-fc75-4dce-8ffa-0ba191c52064-auth-proxy-config\") pod \"machine-approver-56656f9798-7skkx\" (UID: \"c18a5bd2-fc75-4dce-8ffa-0ba191c52064\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-7skkx" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.176185 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/b00c04d4-1287-409a-8e67-2edb888bf832-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-m9qd4\" (UID: \"b00c04d4-1287-409a-8e67-2edb888bf832\") " pod="openshift-authentication/oauth-openshift-558db77b4-m9qd4" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.176224 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b5a5c848-4844-4280-a444-9173fff0b8e1-audit-dir\") pod \"apiserver-7bbb656c7d-qrdwk\" (UID: \"b5a5c848-4844-4280-a444-9173fff0b8e1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qrdwk" Mar 10 21:53:55 crc 
kubenswrapper[4919]: I0310 21:53:55.176246 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/b2c5effa-6a05-4978-b4a9-1daada6b2465-encryption-config\") pod \"apiserver-76f77b778f-fb6tt\" (UID: \"b2c5effa-6a05-4978-b4a9-1daada6b2465\") " pod="openshift-apiserver/apiserver-76f77b778f-fb6tt" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.176266 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/69fc2ea4-6491-4109-8f4a-8b8fb369dcce-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-mvwrl\" (UID: \"69fc2ea4-6491-4109-8f4a-8b8fb369dcce\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mvwrl" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.176307 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bbf77df7-a88b-49cb-9d1f-10cdac32f499-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-2dzqk\" (UID: \"bbf77df7-a88b-49cb-9d1f-10cdac32f499\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-2dzqk" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.176347 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5e327c09-06d0-420c-b749-50306c5336b3-trusted-ca\") pod \"console-operator-58897d9998-h9zcs\" (UID: \"5e327c09-06d0-420c-b749-50306c5336b3\") " pod="openshift-console-operator/console-operator-58897d9998-h9zcs" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.176404 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/492e2f21-90e0-4073-b4ad-b562bcf62486-client-ca\") pod 
\"route-controller-manager-6576b87f9c-ls7k6\" (UID: \"492e2f21-90e0-4073-b4ad-b562bcf62486\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ls7k6" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.176504 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-6nwch"] Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.176606 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-dwg72" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.176688 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/492e2f21-90e0-4073-b4ad-b562bcf62486-config\") pod \"route-controller-manager-6576b87f9c-ls7k6\" (UID: \"492e2f21-90e0-4073-b4ad-b562bcf62486\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ls7k6" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.177296 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bbf77df7-a88b-49cb-9d1f-10cdac32f499-config\") pod \"authentication-operator-69f744f599-2dzqk\" (UID: \"bbf77df7-a88b-49cb-9d1f-10cdac32f499\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-2dzqk" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.177341 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bbf77df7-a88b-49cb-9d1f-10cdac32f499-service-ca-bundle\") pod \"authentication-operator-69f744f599-2dzqk\" (UID: \"bbf77df7-a88b-49cb-9d1f-10cdac32f499\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-2dzqk" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.177356 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-console/downloads-7954f5f757-49z6f"] Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.177787 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bbf77df7-a88b-49cb-9d1f-10cdac32f499-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-2dzqk\" (UID: \"bbf77df7-a88b-49cb-9d1f-10cdac32f499\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-2dzqk" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.178146 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-ls7k6"] Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.179208 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-kjw8s"] Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.180125 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bbf77df7-a88b-49cb-9d1f-10cdac32f499-serving-cert\") pod \"authentication-operator-69f744f599-2dzqk\" (UID: \"bbf77df7-a88b-49cb-9d1f-10cdac32f499\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-2dzqk" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.180180 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4wvzh"] Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.180709 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4wvzh" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.181078 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-cb8tv"] Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.181292 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d809a4c1-5e06-46b7-a39c-466b694361ce-serving-cert\") pod \"controller-manager-879f6c89f-zbds9\" (UID: \"d809a4c1-5e06-46b7-a39c-466b694361ce\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zbds9" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.181654 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-cb8tv" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.181943 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-n8bb5"] Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.182665 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-n8bb5" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.182827 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/492e2f21-90e0-4073-b4ad-b562bcf62486-serving-cert\") pod \"route-controller-manager-6576b87f9c-ls7k6\" (UID: \"492e2f21-90e0-4073-b4ad-b562bcf62486\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ls7k6" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.182861 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-htbkb"] Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.183221 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-htbkb" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.183902 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-26f88"] Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.184444 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-26f88" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.184905 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-74h67"] Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.185298 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-74h67" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.185899 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-77k8x"] Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.186584 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-77k8x" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.190372 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552992-kp6wz"] Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.190846 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552985-n8r4j"] Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.194048 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7g7s2"] Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.195024 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552992-kp6wz" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.196542 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552985-n8r4j" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.196742 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.199907 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-h9zcs"] Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.202210 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-m9qd4"] Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.203354 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-f48z9"] Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.205624 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-tk7xs"] Mar 10 21:53:55 crc 
kubenswrapper[4919]: I0310 21:53:55.206808 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2xrj5"] Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.210602 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-lxrsj"] Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.211805 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kgftl"] Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.212990 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-w5wb5"] Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.213344 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.214110 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qnjs8"] Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.215162 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-fb6tt"] Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.216242 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-mvwrl"] Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.217271 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4wvzh"] Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.218362 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-htbkb"] Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.219314 4919 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-ingress-canary/ingress-canary-hfqfv"] Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.219994 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-hfqfv" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.220677 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-nv6qp"] Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.221662 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bbh8l"] Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.221766 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-nv6qp" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.222815 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bftfp"] Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.223832 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-dwg72"] Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.224859 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-c62fh"] Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.225839 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-58nxf"] Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.226840 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-cb8tv"] Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.227821 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-hgx8b"] Mar 10 
21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.229060 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-n8bb5"]
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.230097 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-26f88"]
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.231124 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-v2l77"]
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.232123 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552985-n8r4j"]
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.233517 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-hfqfv"]
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.234627 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-74h67"]
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.235903 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-j6q5w"]
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.237011 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552992-kp6wz"]
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.238188 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-77k8x"]
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.238230 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.239167 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-nv6qp"]
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.240155 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-hkjkr"]
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.240777 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-hkjkr"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.253715 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.273273 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.276933 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4w2zr\" (UniqueName: \"kubernetes.io/projected/23546924-f010-4ffb-8e0a-cde77e6b086f-kube-api-access-4w2zr\") pod \"dns-operator-744455d44c-w5wb5\" (UID: \"23546924-f010-4ffb-8e0a-cde77e6b086f\") " pod="openshift-dns-operator/dns-operator-744455d44c-w5wb5"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.277021 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9mbw4\" (UniqueName: \"kubernetes.io/projected/b00c04d4-1287-409a-8e67-2edb888bf832-kube-api-access-9mbw4\") pod \"oauth-openshift-558db77b4-m9qd4\" (UID: \"b00c04d4-1287-409a-8e67-2edb888bf832\") " pod="openshift-authentication/oauth-openshift-558db77b4-m9qd4"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.277077 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dsbkv\" (UniqueName: \"kubernetes.io/projected/32d0aafd-3bc5-4173-86df-ce624028b1a2-kube-api-access-dsbkv\") pod \"service-ca-9c57cc56f-74h67\" (UID: \"32d0aafd-3bc5-4173-86df-ce624028b1a2\") " pod="openshift-service-ca/service-ca-9c57cc56f-74h67"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.277107 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gk858\" (UniqueName: \"kubernetes.io/projected/d8b972b3-02f6-4c31-bb8e-0c229ea48621-kube-api-access-gk858\") pod \"cluster-samples-operator-665b6dd947-zchq4\" (UID: \"d8b972b3-02f6-4c31-bb8e-0c229ea48621\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zchq4"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.277160 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b2c5effa-6a05-4978-b4a9-1daada6b2465-serving-cert\") pod \"apiserver-76f77b778f-fb6tt\" (UID: \"b2c5effa-6a05-4978-b4a9-1daada6b2465\") " pod="openshift-apiserver/apiserver-76f77b778f-fb6tt"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.277191 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9657873d-9275-4945-9e91-0b2c2844ae5d-console-serving-cert\") pod \"console-f9d7485db-58nxf\" (UID: \"9657873d-9275-4945-9e91-0b2c2844ae5d\") " pod="openshift-console/console-f9d7485db-58nxf"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.277219 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/b00c04d4-1287-409a-8e67-2edb888bf832-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-m9qd4\" (UID: \"b00c04d4-1287-409a-8e67-2edb888bf832\") " pod="openshift-authentication/oauth-openshift-558db77b4-m9qd4"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.277248 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/6350fa6a-9381-4de8-8a8e-4a14b9d253bc-etcd-ca\") pod \"etcd-operator-b45778765-kjw8s\" (UID: \"6350fa6a-9381-4de8-8a8e-4a14b9d253bc\") " pod="openshift-etcd-operator/etcd-operator-b45778765-kjw8s"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.277271 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/23546924-f010-4ffb-8e0a-cde77e6b086f-metrics-tls\") pod \"dns-operator-744455d44c-w5wb5\" (UID: \"23546924-f010-4ffb-8e0a-cde77e6b086f\") " pod="openshift-dns-operator/dns-operator-744455d44c-w5wb5"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.277296 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/077de36b-affe-4c2b-905d-38ae64514274-config\") pod \"openshift-apiserver-operator-796bbdcf4f-ttjzg\" (UID: \"077de36b-affe-4c2b-905d-38ae64514274\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ttjzg"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.277319 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/19dc609c-7c35-46d2-b621-34faa138eedd-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-7g7s2\" (UID: \"19dc609c-7c35-46d2-b621-34faa138eedd\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7g7s2"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.277341 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b5a5c848-4844-4280-a444-9173fff0b8e1-audit-policies\") pod \"apiserver-7bbb656c7d-qrdwk\" (UID: \"b5a5c848-4844-4280-a444-9173fff0b8e1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qrdwk"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.277362 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/6350fa6a-9381-4de8-8a8e-4a14b9d253bc-etcd-client\") pod \"etcd-operator-b45778765-kjw8s\" (UID: \"6350fa6a-9381-4de8-8a8e-4a14b9d253bc\") " pod="openshift-etcd-operator/etcd-operator-b45778765-kjw8s"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.277403 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b2c5effa-6a05-4978-b4a9-1daada6b2465-config\") pod \"apiserver-76f77b778f-fb6tt\" (UID: \"b2c5effa-6a05-4978-b4a9-1daada6b2465\") " pod="openshift-apiserver/apiserver-76f77b778f-fb6tt"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.277429 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/f779c56d-d87a-4e36-8a8d-f9d93bb2b7e0-available-featuregates\") pod \"openshift-config-operator-7777fb866f-6nwch\" (UID: \"f779c56d-d87a-4e36-8a8d-f9d93bb2b7e0\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-6nwch"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.277456 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b00c04d4-1287-409a-8e67-2edb888bf832-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-m9qd4\" (UID: \"b00c04d4-1287-409a-8e67-2edb888bf832\") " pod="openshift-authentication/oauth-openshift-558db77b4-m9qd4"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.277543 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/b5a5c848-4844-4280-a444-9173fff0b8e1-encryption-config\") pod \"apiserver-7bbb656c7d-qrdwk\" (UID: \"b5a5c848-4844-4280-a444-9173fff0b8e1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qrdwk"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.277568 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/b2c5effa-6a05-4978-b4a9-1daada6b2465-node-pullsecrets\") pod \"apiserver-76f77b778f-fb6tt\" (UID: \"b2c5effa-6a05-4978-b4a9-1daada6b2465\") " pod="openshift-apiserver/apiserver-76f77b778f-fb6tt"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.277593 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pfbc7\" (UniqueName: \"kubernetes.io/projected/f779c56d-d87a-4e36-8a8d-f9d93bb2b7e0-kube-api-access-pfbc7\") pod \"openshift-config-operator-7777fb866f-6nwch\" (UID: \"f779c56d-d87a-4e36-8a8d-f9d93bb2b7e0\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-6nwch"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.277616 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/b00c04d4-1287-409a-8e67-2edb888bf832-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-m9qd4\" (UID: \"b00c04d4-1287-409a-8e67-2edb888bf832\") " pod="openshift-authentication/oauth-openshift-558db77b4-m9qd4"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.277672 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b5a5c848-4844-4280-a444-9173fff0b8e1-etcd-client\") pod \"apiserver-7bbb656c7d-qrdwk\" (UID: \"b5a5c848-4844-4280-a444-9173fff0b8e1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qrdwk"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.277698 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0c56f0ba-c8bc-41be-8a14-3c57051ebfda-proxy-tls\") pod \"machine-config-controller-84d6567774-j6q5w\" (UID: \"0c56f0ba-c8bc-41be-8a14-3c57051ebfda\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-j6q5w"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.277722 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/32d0aafd-3bc5-4173-86df-ce624028b1a2-signing-cabundle\") pod \"service-ca-9c57cc56f-74h67\" (UID: \"32d0aafd-3bc5-4173-86df-ce624028b1a2\") " pod="openshift-service-ca/service-ca-9c57cc56f-74h67"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.277748 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b5a5c848-4844-4280-a444-9173fff0b8e1-serving-cert\") pod \"apiserver-7bbb656c7d-qrdwk\" (UID: \"b5a5c848-4844-4280-a444-9173fff0b8e1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qrdwk"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.277772 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m2nvl\" (UniqueName: \"kubernetes.io/projected/b5a5c848-4844-4280-a444-9173fff0b8e1-kube-api-access-m2nvl\") pod \"apiserver-7bbb656c7d-qrdwk\" (UID: \"b5a5c848-4844-4280-a444-9173fff0b8e1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qrdwk"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.277796 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kc24k\" (UniqueName: \"kubernetes.io/projected/99f42fb7-eaa5-46d2-9443-81ad7a563cec-kube-api-access-kc24k\") pod \"auto-csr-approver-29552992-kp6wz\" (UID: \"99f42fb7-eaa5-46d2-9443-81ad7a563cec\") " pod="openshift-infra/auto-csr-approver-29552992-kp6wz"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.277819 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/077de36b-affe-4c2b-905d-38ae64514274-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-ttjzg\" (UID: \"077de36b-affe-4c2b-905d-38ae64514274\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ttjzg"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.277841 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bs7pm\" (UniqueName: \"kubernetes.io/projected/6350fa6a-9381-4de8-8a8e-4a14b9d253bc-kube-api-access-bs7pm\") pod \"etcd-operator-b45778765-kjw8s\" (UID: \"6350fa6a-9381-4de8-8a8e-4a14b9d253bc\") " pod="openshift-etcd-operator/etcd-operator-b45778765-kjw8s"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.277866 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/0aa7f443-95c5-43ae-bd23-f4204d5f8778-profile-collector-cert\") pod \"olm-operator-6b444d44fb-cb8tv\" (UID: \"0aa7f443-95c5-43ae-bd23-f4204d5f8778\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-cb8tv"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.277888 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/6ace15bc-71c5-45e7-8791-5d59045c73b9-srv-cert\") pod \"catalog-operator-68c6474976-4wvzh\" (UID: \"6ace15bc-71c5-45e7-8791-5d59045c73b9\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4wvzh"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.277913 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ggmjx\" (UniqueName: \"kubernetes.io/projected/19dc609c-7c35-46d2-b621-34faa138eedd-kube-api-access-ggmjx\") pod \"cluster-image-registry-operator-dc59b4c8b-7g7s2\" (UID: \"19dc609c-7c35-46d2-b621-34faa138eedd\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7g7s2"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.277936 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9657873d-9275-4945-9e91-0b2c2844ae5d-console-oauth-config\") pod \"console-f9d7485db-58nxf\" (UID: \"9657873d-9275-4945-9e91-0b2c2844ae5d\") " pod="openshift-console/console-f9d7485db-58nxf"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.277958 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9657873d-9275-4945-9e91-0b2c2844ae5d-service-ca\") pod \"console-f9d7485db-58nxf\" (UID: \"9657873d-9275-4945-9e91-0b2c2844ae5d\") " pod="openshift-console/console-f9d7485db-58nxf"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.277980 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/c18a5bd2-fc75-4dce-8ffa-0ba191c52064-machine-approver-tls\") pod \"machine-approver-56656f9798-7skkx\" (UID: \"c18a5bd2-fc75-4dce-8ffa-0ba191c52064\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-7skkx"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.278008 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/b2c5effa-6a05-4978-b4a9-1daada6b2465-image-import-ca\") pod \"apiserver-76f77b778f-fb6tt\" (UID: \"b2c5effa-6a05-4978-b4a9-1daada6b2465\") " pod="openshift-apiserver/apiserver-76f77b778f-fb6tt"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.278033 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/19dc609c-7c35-46d2-b621-34faa138eedd-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-7g7s2\" (UID: \"19dc609c-7c35-46d2-b621-34faa138eedd\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7g7s2"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.278054 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b00c04d4-1287-409a-8e67-2edb888bf832-audit-dir\") pod \"oauth-openshift-558db77b4-m9qd4\" (UID: \"b00c04d4-1287-409a-8e67-2edb888bf832\") " pod="openshift-authentication/oauth-openshift-558db77b4-m9qd4"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.278082 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/6350fa6a-9381-4de8-8a8e-4a14b9d253bc-etcd-service-ca\") pod \"etcd-operator-b45778765-kjw8s\" (UID: \"6350fa6a-9381-4de8-8a8e-4a14b9d253bc\") " pod="openshift-etcd-operator/etcd-operator-b45778765-kjw8s"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.278106 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/be41b09e-a8ff-4367-a68d-865f047e2549-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-tk7xs\" (UID: \"be41b09e-a8ff-4367-a68d-865f047e2549\") " pod="openshift-marketplace/marketplace-operator-79b997595-tk7xs"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.278133 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0c56f0ba-c8bc-41be-8a14-3c57051ebfda-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-j6q5w\" (UID: \"0c56f0ba-c8bc-41be-8a14-3c57051ebfda\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-j6q5w"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.278157 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/55ff2223-69b6-4b72-9413-fce0c37ae2b2-config-volume\") pod \"collect-profiles-29552985-n8r4j\" (UID: \"55ff2223-69b6-4b72-9413-fce0c37ae2b2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552985-n8r4j"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.278181 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58cpc\" (UniqueName: \"kubernetes.io/projected/55ff2223-69b6-4b72-9413-fce0c37ae2b2-kube-api-access-58cpc\") pod \"collect-profiles-29552985-n8r4j\" (UID: \"55ff2223-69b6-4b72-9413-fce0c37ae2b2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552985-n8r4j"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.278225 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f779c56d-d87a-4e36-8a8d-f9d93bb2b7e0-serving-cert\") pod \"openshift-config-operator-7777fb866f-6nwch\" (UID: \"f779c56d-d87a-4e36-8a8d-f9d93bb2b7e0\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-6nwch"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.278251 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6350fa6a-9381-4de8-8a8e-4a14b9d253bc-config\") pod \"etcd-operator-b45778765-kjw8s\" (UID: \"6350fa6a-9381-4de8-8a8e-4a14b9d253bc\") " pod="openshift-etcd-operator/etcd-operator-b45778765-kjw8s"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.278275 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nd55m\" (UniqueName: \"kubernetes.io/projected/c097294b-dec5-400e-ba02-ccee86fcdb90-kube-api-access-nd55m\") pod \"machine-config-operator-74547568cd-v2l77\" (UID: \"c097294b-dec5-400e-ba02-ccee86fcdb90\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-v2l77"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.278297 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/32d0aafd-3bc5-4173-86df-ce624028b1a2-signing-key\") pod \"service-ca-9c57cc56f-74h67\" (UID: \"32d0aafd-3bc5-4173-86df-ce624028b1a2\") " pod="openshift-service-ca/service-ca-9c57cc56f-74h67"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.278324 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5e327c09-06d0-420c-b749-50306c5336b3-serving-cert\") pod \"console-operator-58897d9998-h9zcs\" (UID: \"5e327c09-06d0-420c-b749-50306c5336b3\") " pod="openshift-console-operator/console-operator-58897d9998-h9zcs"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.278342 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b2c5effa-6a05-4978-b4a9-1daada6b2465-config\") pod \"apiserver-76f77b778f-fb6tt\" (UID: \"b2c5effa-6a05-4978-b4a9-1daada6b2465\") " pod="openshift-apiserver/apiserver-76f77b778f-fb6tt"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.278348 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/b2c5effa-6a05-4978-b4a9-1daada6b2465-audit\") pod \"apiserver-76f77b778f-fb6tt\" (UID: \"b2c5effa-6a05-4978-b4a9-1daada6b2465\") " pod="openshift-apiserver/apiserver-76f77b778f-fb6tt"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.278421 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/c097294b-dec5-400e-ba02-ccee86fcdb90-images\") pod \"machine-config-operator-74547568cd-v2l77\" (UID: \"c097294b-dec5-400e-ba02-ccee86fcdb90\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-v2l77"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.278456 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c18a5bd2-fc75-4dce-8ffa-0ba191c52064-auth-proxy-config\") pod \"machine-approver-56656f9798-7skkx\" (UID: \"c18a5bd2-fc75-4dce-8ffa-0ba191c52064\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-7skkx"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.278477 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/b00c04d4-1287-409a-8e67-2edb888bf832-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-m9qd4\" (UID: \"b00c04d4-1287-409a-8e67-2edb888bf832\") " pod="openshift-authentication/oauth-openshift-558db77b4-m9qd4"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.278500 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b5a5c848-4844-4280-a444-9173fff0b8e1-audit-dir\") pod \"apiserver-7bbb656c7d-qrdwk\" (UID: \"b5a5c848-4844-4280-a444-9173fff0b8e1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qrdwk"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.278518 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/be41b09e-a8ff-4367-a68d-865f047e2549-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-tk7xs\" (UID: \"be41b09e-a8ff-4367-a68d-865f047e2549\") " pod="openshift-marketplace/marketplace-operator-79b997595-tk7xs"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.278541 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/b2c5effa-6a05-4978-b4a9-1daada6b2465-encryption-config\") pod \"apiserver-76f77b778f-fb6tt\" (UID: \"b2c5effa-6a05-4978-b4a9-1daada6b2465\") " pod="openshift-apiserver/apiserver-76f77b778f-fb6tt"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.278558 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/69fc2ea4-6491-4109-8f4a-8b8fb369dcce-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-mvwrl\" (UID: \"69fc2ea4-6491-4109-8f4a-8b8fb369dcce\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mvwrl"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.278575 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/0aa7f443-95c5-43ae-bd23-f4204d5f8778-srv-cert\") pod \"olm-operator-6b444d44fb-cb8tv\" (UID: \"0aa7f443-95c5-43ae-bd23-f4204d5f8778\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-cb8tv"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.278592 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nrrpd\" (UniqueName: \"kubernetes.io/projected/be41b09e-a8ff-4367-a68d-865f047e2549-kube-api-access-nrrpd\") pod \"marketplace-operator-79b997595-tk7xs\" (UID: \"be41b09e-a8ff-4367-a68d-865f047e2549\") " pod="openshift-marketplace/marketplace-operator-79b997595-tk7xs"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.278616 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5e327c09-06d0-420c-b749-50306c5336b3-trusted-ca\") pod \"console-operator-58897d9998-h9zcs\" (UID: \"5e327c09-06d0-420c-b749-50306c5336b3\") " pod="openshift-console-operator/console-operator-58897d9998-h9zcs"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.278653 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/b2c5effa-6a05-4978-b4a9-1daada6b2465-etcd-serving-ca\") pod \"apiserver-76f77b778f-fb6tt\" (UID: \"b2c5effa-6a05-4978-b4a9-1daada6b2465\") " pod="openshift-apiserver/apiserver-76f77b778f-fb6tt"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.278672 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4p5c7\" (UniqueName: \"kubernetes.io/projected/ca9a516e-afc2-4475-8af8-23504d17f9a9-kube-api-access-4p5c7\") pod \"router-default-5444994796-hmjhm\" (UID: \"ca9a516e-afc2-4475-8af8-23504d17f9a9\") " pod="openshift-ingress/router-default-5444994796-hmjhm"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.278691 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b2c5effa-6a05-4978-b4a9-1daada6b2465-etcd-client\") pod \"apiserver-76f77b778f-fb6tt\" (UID: \"b2c5effa-6a05-4978-b4a9-1daada6b2465\") " pod="openshift-apiserver/apiserver-76f77b778f-fb6tt"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.278712 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/b00c04d4-1287-409a-8e67-2edb888bf832-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-m9qd4\" (UID: \"b00c04d4-1287-409a-8e67-2edb888bf832\") " pod="openshift-authentication/oauth-openshift-558db77b4-m9qd4"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.278730 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69fc2ea4-6491-4109-8f4a-8b8fb369dcce-config\") pod \"machine-api-operator-5694c8668f-mvwrl\" (UID: \"69fc2ea4-6491-4109-8f4a-8b8fb369dcce\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mvwrl"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.278758 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c097294b-dec5-400e-ba02-ccee86fcdb90-proxy-tls\") pod \"machine-config-operator-74547568cd-v2l77\" (UID: \"c097294b-dec5-400e-ba02-ccee86fcdb90\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-v2l77"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.278777 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/b00c04d4-1287-409a-8e67-2edb888bf832-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-m9qd4\" (UID: \"b00c04d4-1287-409a-8e67-2edb888bf832\") " pod="openshift-authentication/oauth-openshift-558db77b4-m9qd4"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.278796 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/6ace15bc-71c5-45e7-8791-5d59045c73b9-profile-collector-cert\") pod \"catalog-operator-68c6474976-4wvzh\" (UID: \"6ace15bc-71c5-45e7-8791-5d59045c73b9\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4wvzh"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.278824 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9657873d-9275-4945-9e91-0b2c2844ae5d-oauth-serving-cert\") pod \"console-f9d7485db-58nxf\" (UID: \"9657873d-9275-4945-9e91-0b2c2844ae5d\") " pod="openshift-console/console-f9d7485db-58nxf"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.278847 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b5a5c848-4844-4280-a444-9173fff0b8e1-audit-policies\") pod \"apiserver-7bbb656c7d-qrdwk\" (UID: \"b5a5c848-4844-4280-a444-9173fff0b8e1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qrdwk"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.278860 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rh86p\" (UniqueName: \"kubernetes.io/projected/077de36b-affe-4c2b-905d-38ae64514274-kube-api-access-rh86p\") pod \"openshift-apiserver-operator-796bbdcf4f-ttjzg\" (UID: \"077de36b-affe-4c2b-905d-38ae64514274\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ttjzg"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.278927 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/b00c04d4-1287-409a-8e67-2edb888bf832-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-m9qd4\" (UID: \"b00c04d4-1287-409a-8e67-2edb888bf832\") " pod="openshift-authentication/oauth-openshift-558db77b4-m9qd4"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.278959 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9657873d-9275-4945-9e91-0b2c2844ae5d-console-config\") pod \"console-f9d7485db-58nxf\" (UID: \"9657873d-9275-4945-9e91-0b2c2844ae5d\") " pod="openshift-console/console-f9d7485db-58nxf"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.278996 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-njmn2\" (UniqueName: \"kubernetes.io/projected/0c56f0ba-c8bc-41be-8a14-3c57051ebfda-kube-api-access-njmn2\") pod \"machine-config-controller-84d6567774-j6q5w\" (UID: \"0c56f0ba-c8bc-41be-8a14-3c57051ebfda\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-j6q5w"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.279027 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ca9a516e-afc2-4475-8af8-23504d17f9a9-service-ca-bundle\") pod \"router-default-5444994796-hmjhm\" (UID: \"ca9a516e-afc2-4475-8af8-23504d17f9a9\") " pod="openshift-ingress/router-default-5444994796-hmjhm"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.279055 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/55ff2223-69b6-4b72-9413-fce0c37ae2b2-secret-volume\") pod \"collect-profiles-29552985-n8r4j\" (UID: \"55ff2223-69b6-4b72-9413-fce0c37ae2b2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552985-n8r4j"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.279090 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e327c09-06d0-420c-b749-50306c5336b3-config\") pod \"console-operator-58897d9998-h9zcs\" (UID: \"5e327c09-06d0-420c-b749-50306c5336b3\") " pod="openshift-console-operator/console-operator-58897d9998-h9zcs"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.279113 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/b5a5c848-4844-4280-a444-9173fff0b8e1-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-qrdwk\" (UID: \"b5a5c848-4844-4280-a444-9173fff0b8e1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qrdwk"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.279136 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dtpc6\" (UniqueName: \"kubernetes.io/projected/69fc2ea4-6491-4109-8f4a-8b8fb369dcce-kube-api-access-dtpc6\") pod \"machine-api-operator-5694c8668f-mvwrl\" (UID: \"69fc2ea4-6491-4109-8f4a-8b8fb369dcce\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mvwrl"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.279160 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ca9a516e-afc2-4475-8af8-23504d17f9a9-metrics-certs\") pod \"router-default-5444994796-hmjhm\" (UID: \"ca9a516e-afc2-4475-8af8-23504d17f9a9\") " pod="openshift-ingress/router-default-5444994796-hmjhm"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.279187 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dg7bw\" (UniqueName: \"kubernetes.io/projected/fb7623ea-ec67-4061-82a6-4099e52fa3b9-kube-api-access-dg7bw\") pod \"downloads-7954f5f757-49z6f\" (UID: \"fb7623ea-ec67-4061-82a6-4099e52fa3b9\") " pod="openshift-console/downloads-7954f5f757-49z6f"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.279211 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8thhg\" (UniqueName: \"kubernetes.io/projected/9657873d-9275-4945-9e91-0b2c2844ae5d-kube-api-access-8thhg\") pod \"console-f9d7485db-58nxf\" (UID: \"9657873d-9275-4945-9e91-0b2c2844ae5d\") " pod="openshift-console/console-f9d7485db-58nxf"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.279237 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-brnxv\" (UniqueName: \"kubernetes.io/projected/c18a5bd2-fc75-4dce-8ffa-0ba191c52064-kube-api-access-brnxv\") pod \"machine-approver-56656f9798-7skkx\" (UID: \"c18a5bd2-fc75-4dce-8ffa-0ba191c52064\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-7skkx"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.279290 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b2c5effa-6a05-4978-b4a9-1daada6b2465-audit-dir\") pod \"apiserver-76f77b778f-fb6tt\" (UID: \"b2c5effa-6a05-4978-b4a9-1daada6b2465\") " pod="openshift-apiserver/apiserver-76f77b778f-fb6tt"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.279316 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/d8b972b3-02f6-4c31-bb8e-0c229ea48621-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-zchq4\" (UID: \"d8b972b3-02f6-4c31-bb8e-0c229ea48621\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zchq4"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.279343 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7kzz4\" (UniqueName: \"kubernetes.io/projected/6ace15bc-71c5-45e7-8791-5d59045c73b9-kube-api-access-7kzz4\") pod \"catalog-operator-68c6474976-4wvzh\" (UID: \"6ace15bc-71c5-45e7-8791-5d59045c73b9\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4wvzh"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.279368 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b5a5c848-4844-4280-a444-9173fff0b8e1-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-qrdwk\" (UID: \"b5a5c848-4844-4280-a444-9173fff0b8e1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qrdwk"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.279409 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c097294b-dec5-400e-ba02-ccee86fcdb90-auth-proxy-config\") pod \"machine-config-operator-74547568cd-v2l77\" (UID: \"c097294b-dec5-400e-ba02-ccee86fcdb90\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-v2l77"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.279432 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/ca9a516e-afc2-4475-8af8-23504d17f9a9-stats-auth\") pod \"router-default-5444994796-hmjhm\" (UID: \"ca9a516e-afc2-4475-8af8-23504d17f9a9\") " pod="openshift-ingress/router-default-5444994796-hmjhm"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.279459 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xzm74\" (UniqueName: \"kubernetes.io/projected/5e327c09-06d0-420c-b749-50306c5336b3-kube-api-access-xzm74\") pod \"console-operator-58897d9998-h9zcs\" (UID: \"5e327c09-06d0-420c-b749-50306c5336b3\") " pod="openshift-console-operator/console-operator-58897d9998-h9zcs"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.279485 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6350fa6a-9381-4de8-8a8e-4a14b9d253bc-serving-cert\") pod \"etcd-operator-b45778765-kjw8s\" (UID: \"6350fa6a-9381-4de8-8a8e-4a14b9d253bc\") " pod="openshift-etcd-operator/etcd-operator-b45778765-kjw8s"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.279509 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n2jfl\" (UniqueName: \"kubernetes.io/projected/0aa7f443-95c5-43ae-bd23-f4204d5f8778-kube-api-access-n2jfl\") pod \"olm-operator-6b444d44fb-cb8tv\" (UID: \"0aa7f443-95c5-43ae-bd23-f4204d5f8778\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-cb8tv"
Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.279517 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName:
\"kubernetes.io/configmap/c18a5bd2-fc75-4dce-8ffa-0ba191c52064-auth-proxy-config\") pod \"machine-approver-56656f9798-7skkx\" (UID: \"c18a5bd2-fc75-4dce-8ffa-0ba191c52064\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-7skkx" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.279535 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/b00c04d4-1287-409a-8e67-2edb888bf832-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-m9qd4\" (UID: \"b00c04d4-1287-409a-8e67-2edb888bf832\") " pod="openshift-authentication/oauth-openshift-558db77b4-m9qd4" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.279564 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b2c5effa-6a05-4978-b4a9-1daada6b2465-trusted-ca-bundle\") pod \"apiserver-76f77b778f-fb6tt\" (UID: \"b2c5effa-6a05-4978-b4a9-1daada6b2465\") " pod="openshift-apiserver/apiserver-76f77b778f-fb6tt" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.279551 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b00c04d4-1287-409a-8e67-2edb888bf832-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-m9qd4\" (UID: \"b00c04d4-1287-409a-8e67-2edb888bf832\") " pod="openshift-authentication/oauth-openshift-558db77b4-m9qd4" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.279589 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c95v2\" (UniqueName: \"kubernetes.io/projected/b2c5effa-6a05-4978-b4a9-1daada6b2465-kube-api-access-c95v2\") pod \"apiserver-76f77b778f-fb6tt\" (UID: \"b2c5effa-6a05-4978-b4a9-1daada6b2465\") " pod="openshift-apiserver/apiserver-76f77b778f-fb6tt" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 
21:53:55.279670 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9657873d-9275-4945-9e91-0b2c2844ae5d-trusted-ca-bundle\") pod \"console-f9d7485db-58nxf\" (UID: \"9657873d-9275-4945-9e91-0b2c2844ae5d\") " pod="openshift-console/console-f9d7485db-58nxf" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.279712 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c18a5bd2-fc75-4dce-8ffa-0ba191c52064-config\") pod \"machine-approver-56656f9798-7skkx\" (UID: \"c18a5bd2-fc75-4dce-8ffa-0ba191c52064\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-7skkx" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.279776 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/b00c04d4-1287-409a-8e67-2edb888bf832-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-m9qd4\" (UID: \"b00c04d4-1287-409a-8e67-2edb888bf832\") " pod="openshift-authentication/oauth-openshift-558db77b4-m9qd4" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.279855 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/69fc2ea4-6491-4109-8f4a-8b8fb369dcce-images\") pod \"machine-api-operator-5694c8668f-mvwrl\" (UID: \"69fc2ea4-6491-4109-8f4a-8b8fb369dcce\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mvwrl" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.279934 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/ca9a516e-afc2-4475-8af8-23504d17f9a9-default-certificate\") pod \"router-default-5444994796-hmjhm\" (UID: \"ca9a516e-afc2-4475-8af8-23504d17f9a9\") " 
pod="openshift-ingress/router-default-5444994796-hmjhm" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.279966 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/b00c04d4-1287-409a-8e67-2edb888bf832-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-m9qd4\" (UID: \"b00c04d4-1287-409a-8e67-2edb888bf832\") " pod="openshift-authentication/oauth-openshift-558db77b4-m9qd4" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.279990 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/19dc609c-7c35-46d2-b621-34faa138eedd-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-7g7s2\" (UID: \"19dc609c-7c35-46d2-b621-34faa138eedd\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7g7s2" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.280012 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b00c04d4-1287-409a-8e67-2edb888bf832-audit-policies\") pod \"oauth-openshift-558db77b4-m9qd4\" (UID: \"b00c04d4-1287-409a-8e67-2edb888bf832\") " pod="openshift-authentication/oauth-openshift-558db77b4-m9qd4" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.280034 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/b00c04d4-1287-409a-8e67-2edb888bf832-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-m9qd4\" (UID: \"b00c04d4-1287-409a-8e67-2edb888bf832\") " pod="openshift-authentication/oauth-openshift-558db77b4-m9qd4" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.281223 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: 
\"kubernetes.io/host-path/b2c5effa-6a05-4978-b4a9-1daada6b2465-node-pullsecrets\") pod \"apiserver-76f77b778f-fb6tt\" (UID: \"b2c5effa-6a05-4978-b4a9-1daada6b2465\") " pod="openshift-apiserver/apiserver-76f77b778f-fb6tt" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.281414 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/b2c5effa-6a05-4978-b4a9-1daada6b2465-etcd-serving-ca\") pod \"apiserver-76f77b778f-fb6tt\" (UID: \"b2c5effa-6a05-4978-b4a9-1daada6b2465\") " pod="openshift-apiserver/apiserver-76f77b778f-fb6tt" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.281453 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/6350fa6a-9381-4de8-8a8e-4a14b9d253bc-etcd-ca\") pod \"etcd-operator-b45778765-kjw8s\" (UID: \"6350fa6a-9381-4de8-8a8e-4a14b9d253bc\") " pod="openshift-etcd-operator/etcd-operator-b45778765-kjw8s" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.281484 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b5a5c848-4844-4280-a444-9173fff0b8e1-audit-dir\") pod \"apiserver-7bbb656c7d-qrdwk\" (UID: \"b5a5c848-4844-4280-a444-9173fff0b8e1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qrdwk" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.281718 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9657873d-9275-4945-9e91-0b2c2844ae5d-console-serving-cert\") pod \"console-f9d7485db-58nxf\" (UID: \"9657873d-9275-4945-9e91-0b2c2844ae5d\") " pod="openshift-console/console-f9d7485db-58nxf" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.282002 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/b5a5c848-4844-4280-a444-9173fff0b8e1-serving-cert\") pod \"apiserver-7bbb656c7d-qrdwk\" (UID: \"b5a5c848-4844-4280-a444-9173fff0b8e1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qrdwk" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.282320 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/6350fa6a-9381-4de8-8a8e-4a14b9d253bc-etcd-client\") pod \"etcd-operator-b45778765-kjw8s\" (UID: \"6350fa6a-9381-4de8-8a8e-4a14b9d253bc\") " pod="openshift-etcd-operator/etcd-operator-b45778765-kjw8s" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.282435 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/69fc2ea4-6491-4109-8f4a-8b8fb369dcce-images\") pod \"machine-api-operator-5694c8668f-mvwrl\" (UID: \"69fc2ea4-6491-4109-8f4a-8b8fb369dcce\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mvwrl" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.282611 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/23546924-f010-4ffb-8e0a-cde77e6b086f-metrics-tls\") pod \"dns-operator-744455d44c-w5wb5\" (UID: \"23546924-f010-4ffb-8e0a-cde77e6b086f\") " pod="openshift-dns-operator/dns-operator-744455d44c-w5wb5" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.282679 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b00c04d4-1287-409a-8e67-2edb888bf832-audit-dir\") pod \"oauth-openshift-558db77b4-m9qd4\" (UID: \"b00c04d4-1287-409a-8e67-2edb888bf832\") " pod="openshift-authentication/oauth-openshift-558db77b4-m9qd4" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.282817 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: 
\"kubernetes.io/secret/b5a5c848-4844-4280-a444-9173fff0b8e1-encryption-config\") pod \"apiserver-7bbb656c7d-qrdwk\" (UID: \"b5a5c848-4844-4280-a444-9173fff0b8e1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qrdwk" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.282950 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/b00c04d4-1287-409a-8e67-2edb888bf832-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-m9qd4\" (UID: \"b00c04d4-1287-409a-8e67-2edb888bf832\") " pod="openshift-authentication/oauth-openshift-558db77b4-m9qd4" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.283042 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/077de36b-affe-4c2b-905d-38ae64514274-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-ttjzg\" (UID: \"077de36b-affe-4c2b-905d-38ae64514274\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ttjzg" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.283374 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/b00c04d4-1287-409a-8e67-2edb888bf832-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-m9qd4\" (UID: \"b00c04d4-1287-409a-8e67-2edb888bf832\") " pod="openshift-authentication/oauth-openshift-558db77b4-m9qd4" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.283606 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/b00c04d4-1287-409a-8e67-2edb888bf832-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-m9qd4\" (UID: \"b00c04d4-1287-409a-8e67-2edb888bf832\") " pod="openshift-authentication/oauth-openshift-558db77b4-m9qd4" Mar 10 21:53:55 crc kubenswrapper[4919]: 
I0310 21:53:55.279031 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/b2c5effa-6a05-4978-b4a9-1daada6b2465-audit\") pod \"apiserver-76f77b778f-fb6tt\" (UID: \"b2c5effa-6a05-4978-b4a9-1daada6b2465\") " pod="openshift-apiserver/apiserver-76f77b778f-fb6tt" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.283607 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/6350fa6a-9381-4de8-8a8e-4a14b9d253bc-etcd-service-ca\") pod \"etcd-operator-b45778765-kjw8s\" (UID: \"6350fa6a-9381-4de8-8a8e-4a14b9d253bc\") " pod="openshift-etcd-operator/etcd-operator-b45778765-kjw8s" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.283940 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/b00c04d4-1287-409a-8e67-2edb888bf832-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-m9qd4\" (UID: \"b00c04d4-1287-409a-8e67-2edb888bf832\") " pod="openshift-authentication/oauth-openshift-558db77b4-m9qd4" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.284658 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/f779c56d-d87a-4e36-8a8d-f9d93bb2b7e0-available-featuregates\") pod \"openshift-config-operator-7777fb866f-6nwch\" (UID: \"f779c56d-d87a-4e36-8a8d-f9d93bb2b7e0\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-6nwch" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.284732 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/69fc2ea4-6491-4109-8f4a-8b8fb369dcce-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-mvwrl\" (UID: \"69fc2ea4-6491-4109-8f4a-8b8fb369dcce\") " 
pod="openshift-machine-api/machine-api-operator-5694c8668f-mvwrl" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.284853 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9657873d-9275-4945-9e91-0b2c2844ae5d-console-config\") pod \"console-f9d7485db-58nxf\" (UID: \"9657873d-9275-4945-9e91-0b2c2844ae5d\") " pod="openshift-console/console-f9d7485db-58nxf" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.285328 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b2c5effa-6a05-4978-b4a9-1daada6b2465-etcd-client\") pod \"apiserver-76f77b778f-fb6tt\" (UID: \"b2c5effa-6a05-4978-b4a9-1daada6b2465\") " pod="openshift-apiserver/apiserver-76f77b778f-fb6tt" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.285374 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/b00c04d4-1287-409a-8e67-2edb888bf832-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-m9qd4\" (UID: \"b00c04d4-1287-409a-8e67-2edb888bf832\") " pod="openshift-authentication/oauth-openshift-558db77b4-m9qd4" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.285416 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b00c04d4-1287-409a-8e67-2edb888bf832-audit-policies\") pod \"oauth-openshift-558db77b4-m9qd4\" (UID: \"b00c04d4-1287-409a-8e67-2edb888bf832\") " pod="openshift-authentication/oauth-openshift-558db77b4-m9qd4" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.285474 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b2c5effa-6a05-4978-b4a9-1daada6b2465-audit-dir\") pod \"apiserver-76f77b778f-fb6tt\" (UID: \"b2c5effa-6a05-4978-b4a9-1daada6b2465\") " 
pod="openshift-apiserver/apiserver-76f77b778f-fb6tt" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.286089 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/077de36b-affe-4c2b-905d-38ae64514274-config\") pod \"openshift-apiserver-operator-796bbdcf4f-ttjzg\" (UID: \"077de36b-affe-4c2b-905d-38ae64514274\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ttjzg" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.286430 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e327c09-06d0-420c-b749-50306c5336b3-config\") pod \"console-operator-58897d9998-h9zcs\" (UID: \"5e327c09-06d0-420c-b749-50306c5336b3\") " pod="openshift-console-operator/console-operator-58897d9998-h9zcs" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.286534 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6350fa6a-9381-4de8-8a8e-4a14b9d253bc-config\") pod \"etcd-operator-b45778765-kjw8s\" (UID: \"6350fa6a-9381-4de8-8a8e-4a14b9d253bc\") " pod="openshift-etcd-operator/etcd-operator-b45778765-kjw8s" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.286694 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9657873d-9275-4945-9e91-0b2c2844ae5d-console-oauth-config\") pod \"console-f9d7485db-58nxf\" (UID: \"9657873d-9275-4945-9e91-0b2c2844ae5d\") " pod="openshift-console/console-f9d7485db-58nxf" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.286986 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b5a5c848-4844-4280-a444-9173fff0b8e1-etcd-client\") pod \"apiserver-7bbb656c7d-qrdwk\" (UID: \"b5a5c848-4844-4280-a444-9173fff0b8e1\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qrdwk" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.287081 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/b5a5c848-4844-4280-a444-9173fff0b8e1-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-qrdwk\" (UID: \"b5a5c848-4844-4280-a444-9173fff0b8e1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qrdwk" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.287092 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9657873d-9275-4945-9e91-0b2c2844ae5d-oauth-serving-cert\") pod \"console-f9d7485db-58nxf\" (UID: \"9657873d-9275-4945-9e91-0b2c2844ae5d\") " pod="openshift-console/console-f9d7485db-58nxf" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.287210 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/b2c5effa-6a05-4978-b4a9-1daada6b2465-encryption-config\") pod \"apiserver-76f77b778f-fb6tt\" (UID: \"b2c5effa-6a05-4978-b4a9-1daada6b2465\") " pod="openshift-apiserver/apiserver-76f77b778f-fb6tt" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.287223 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/b00c04d4-1287-409a-8e67-2edb888bf832-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-m9qd4\" (UID: \"b00c04d4-1287-409a-8e67-2edb888bf832\") " pod="openshift-authentication/oauth-openshift-558db77b4-m9qd4" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.287815 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/c18a5bd2-fc75-4dce-8ffa-0ba191c52064-machine-approver-tls\") pod \"machine-approver-56656f9798-7skkx\" (UID: 
\"c18a5bd2-fc75-4dce-8ffa-0ba191c52064\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-7skkx" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.288309 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b5a5c848-4844-4280-a444-9173fff0b8e1-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-qrdwk\" (UID: \"b5a5c848-4844-4280-a444-9173fff0b8e1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qrdwk" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.288526 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/19dc609c-7c35-46d2-b621-34faa138eedd-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-7g7s2\" (UID: \"19dc609c-7c35-46d2-b621-34faa138eedd\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7g7s2" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.288542 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9657873d-9275-4945-9e91-0b2c2844ae5d-service-ca\") pod \"console-f9d7485db-58nxf\" (UID: \"9657873d-9275-4945-9e91-0b2c2844ae5d\") " pod="openshift-console/console-f9d7485db-58nxf" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.289021 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/b00c04d4-1287-409a-8e67-2edb888bf832-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-m9qd4\" (UID: \"b00c04d4-1287-409a-8e67-2edb888bf832\") " pod="openshift-authentication/oauth-openshift-558db77b4-m9qd4" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.289181 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/b00c04d4-1287-409a-8e67-2edb888bf832-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-m9qd4\" (UID: \"b00c04d4-1287-409a-8e67-2edb888bf832\") " pod="openshift-authentication/oauth-openshift-558db77b4-m9qd4" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.289446 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/b2c5effa-6a05-4978-b4a9-1daada6b2465-image-import-ca\") pod \"apiserver-76f77b778f-fb6tt\" (UID: \"b2c5effa-6a05-4978-b4a9-1daada6b2465\") " pod="openshift-apiserver/apiserver-76f77b778f-fb6tt" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.289682 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/19dc609c-7c35-46d2-b621-34faa138eedd-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-7g7s2\" (UID: \"19dc609c-7c35-46d2-b621-34faa138eedd\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7g7s2" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.289858 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b2c5effa-6a05-4978-b4a9-1daada6b2465-trusted-ca-bundle\") pod \"apiserver-76f77b778f-fb6tt\" (UID: \"b2c5effa-6a05-4978-b4a9-1daada6b2465\") " pod="openshift-apiserver/apiserver-76f77b778f-fb6tt" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.289979 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69fc2ea4-6491-4109-8f4a-8b8fb369dcce-config\") pod \"machine-api-operator-5694c8668f-mvwrl\" (UID: \"69fc2ea4-6491-4109-8f4a-8b8fb369dcce\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mvwrl" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.290114 4919 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5e327c09-06d0-420c-b749-50306c5336b3-serving-cert\") pod \"console-operator-58897d9998-h9zcs\" (UID: \"5e327c09-06d0-420c-b749-50306c5336b3\") " pod="openshift-console-operator/console-operator-58897d9998-h9zcs" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.290234 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5e327c09-06d0-420c-b749-50306c5336b3-trusted-ca\") pod \"console-operator-58897d9998-h9zcs\" (UID: \"5e327c09-06d0-420c-b749-50306c5336b3\") " pod="openshift-console-operator/console-operator-58897d9998-h9zcs" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.290278 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c18a5bd2-fc75-4dce-8ffa-0ba191c52064-config\") pod \"machine-approver-56656f9798-7skkx\" (UID: \"c18a5bd2-fc75-4dce-8ffa-0ba191c52064\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-7skkx" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.290433 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/d8b972b3-02f6-4c31-bb8e-0c229ea48621-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-zchq4\" (UID: \"d8b972b3-02f6-4c31-bb8e-0c229ea48621\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zchq4" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.294829 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/b00c04d4-1287-409a-8e67-2edb888bf832-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-m9qd4\" (UID: \"b00c04d4-1287-409a-8e67-2edb888bf832\") " pod="openshift-authentication/oauth-openshift-558db77b4-m9qd4" Mar 10 21:53:55 
crc kubenswrapper[4919]: I0310 21:53:55.296688 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f779c56d-d87a-4e36-8a8d-f9d93bb2b7e0-serving-cert\") pod \"openshift-config-operator-7777fb866f-6nwch\" (UID: \"f779c56d-d87a-4e36-8a8d-f9d93bb2b7e0\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-6nwch" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.296950 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/b00c04d4-1287-409a-8e67-2edb888bf832-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-m9qd4\" (UID: \"b00c04d4-1287-409a-8e67-2edb888bf832\") " pod="openshift-authentication/oauth-openshift-558db77b4-m9qd4" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.297357 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b2c5effa-6a05-4978-b4a9-1daada6b2465-serving-cert\") pod \"apiserver-76f77b778f-fb6tt\" (UID: \"b2c5effa-6a05-4978-b4a9-1daada6b2465\") " pod="openshift-apiserver/apiserver-76f77b778f-fb6tt" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.297761 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6350fa6a-9381-4de8-8a8e-4a14b9d253bc-serving-cert\") pod \"etcd-operator-b45778765-kjw8s\" (UID: \"6350fa6a-9381-4de8-8a8e-4a14b9d253bc\") " pod="openshift-etcd-operator/etcd-operator-b45778765-kjw8s" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.298630 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9657873d-9275-4945-9e91-0b2c2844ae5d-trusted-ca-bundle\") pod \"console-f9d7485db-58nxf\" (UID: \"9657873d-9275-4945-9e91-0b2c2844ae5d\") " pod="openshift-console/console-f9d7485db-58nxf" Mar 10 
21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.299840 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.314702 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.333216 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.353879 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.380937 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/0aa7f443-95c5-43ae-bd23-f4204d5f8778-profile-collector-cert\") pod \"olm-operator-6b444d44fb-cb8tv\" (UID: \"0aa7f443-95c5-43ae-bd23-f4204d5f8778\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-cb8tv" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.380972 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/6ace15bc-71c5-45e7-8791-5d59045c73b9-srv-cert\") pod \"catalog-operator-68c6474976-4wvzh\" (UID: \"6ace15bc-71c5-45e7-8791-5d59045c73b9\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4wvzh" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.381017 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/be41b09e-a8ff-4367-a68d-865f047e2549-marketplace-operator-metrics\") pod 
\"marketplace-operator-79b997595-tk7xs\" (UID: \"be41b09e-a8ff-4367-a68d-865f047e2549\") " pod="openshift-marketplace/marketplace-operator-79b997595-tk7xs" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.381044 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0c56f0ba-c8bc-41be-8a14-3c57051ebfda-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-j6q5w\" (UID: \"0c56f0ba-c8bc-41be-8a14-3c57051ebfda\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-j6q5w" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.381065 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/55ff2223-69b6-4b72-9413-fce0c37ae2b2-config-volume\") pod \"collect-profiles-29552985-n8r4j\" (UID: \"55ff2223-69b6-4b72-9413-fce0c37ae2b2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552985-n8r4j" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.381089 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-58cpc\" (UniqueName: \"kubernetes.io/projected/55ff2223-69b6-4b72-9413-fce0c37ae2b2-kube-api-access-58cpc\") pod \"collect-profiles-29552985-n8r4j\" (UID: \"55ff2223-69b6-4b72-9413-fce0c37ae2b2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552985-n8r4j" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.381148 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nd55m\" (UniqueName: \"kubernetes.io/projected/c097294b-dec5-400e-ba02-ccee86fcdb90-kube-api-access-nd55m\") pod \"machine-config-operator-74547568cd-v2l77\" (UID: \"c097294b-dec5-400e-ba02-ccee86fcdb90\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-v2l77" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.381177 4919 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/32d0aafd-3bc5-4173-86df-ce624028b1a2-signing-key\") pod \"service-ca-9c57cc56f-74h67\" (UID: \"32d0aafd-3bc5-4173-86df-ce624028b1a2\") " pod="openshift-service-ca/service-ca-9c57cc56f-74h67" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.381204 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/c097294b-dec5-400e-ba02-ccee86fcdb90-images\") pod \"machine-config-operator-74547568cd-v2l77\" (UID: \"c097294b-dec5-400e-ba02-ccee86fcdb90\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-v2l77" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.381232 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/be41b09e-a8ff-4367-a68d-865f047e2549-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-tk7xs\" (UID: \"be41b09e-a8ff-4367-a68d-865f047e2549\") " pod="openshift-marketplace/marketplace-operator-79b997595-tk7xs" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.381291 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/0aa7f443-95c5-43ae-bd23-f4204d5f8778-srv-cert\") pod \"olm-operator-6b444d44fb-cb8tv\" (UID: \"0aa7f443-95c5-43ae-bd23-f4204d5f8778\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-cb8tv" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.381315 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nrrpd\" (UniqueName: \"kubernetes.io/projected/be41b09e-a8ff-4367-a68d-865f047e2549-kube-api-access-nrrpd\") pod \"marketplace-operator-79b997595-tk7xs\" (UID: \"be41b09e-a8ff-4367-a68d-865f047e2549\") " 
pod="openshift-marketplace/marketplace-operator-79b997595-tk7xs" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.381342 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4p5c7\" (UniqueName: \"kubernetes.io/projected/ca9a516e-afc2-4475-8af8-23504d17f9a9-kube-api-access-4p5c7\") pod \"router-default-5444994796-hmjhm\" (UID: \"ca9a516e-afc2-4475-8af8-23504d17f9a9\") " pod="openshift-ingress/router-default-5444994796-hmjhm" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.381401 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c097294b-dec5-400e-ba02-ccee86fcdb90-proxy-tls\") pod \"machine-config-operator-74547568cd-v2l77\" (UID: \"c097294b-dec5-400e-ba02-ccee86fcdb90\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-v2l77" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.381439 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/6ace15bc-71c5-45e7-8791-5d59045c73b9-profile-collector-cert\") pod \"catalog-operator-68c6474976-4wvzh\" (UID: \"6ace15bc-71c5-45e7-8791-5d59045c73b9\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4wvzh" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.381485 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-njmn2\" (UniqueName: \"kubernetes.io/projected/0c56f0ba-c8bc-41be-8a14-3c57051ebfda-kube-api-access-njmn2\") pod \"machine-config-controller-84d6567774-j6q5w\" (UID: \"0c56f0ba-c8bc-41be-8a14-3c57051ebfda\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-j6q5w" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.381517 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/ca9a516e-afc2-4475-8af8-23504d17f9a9-service-ca-bundle\") pod \"router-default-5444994796-hmjhm\" (UID: \"ca9a516e-afc2-4475-8af8-23504d17f9a9\") " pod="openshift-ingress/router-default-5444994796-hmjhm" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.381540 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/55ff2223-69b6-4b72-9413-fce0c37ae2b2-secret-volume\") pod \"collect-profiles-29552985-n8r4j\" (UID: \"55ff2223-69b6-4b72-9413-fce0c37ae2b2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552985-n8r4j" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.381579 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ca9a516e-afc2-4475-8af8-23504d17f9a9-metrics-certs\") pod \"router-default-5444994796-hmjhm\" (UID: \"ca9a516e-afc2-4475-8af8-23504d17f9a9\") " pod="openshift-ingress/router-default-5444994796-hmjhm" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.381631 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7kzz4\" (UniqueName: \"kubernetes.io/projected/6ace15bc-71c5-45e7-8791-5d59045c73b9-kube-api-access-7kzz4\") pod \"catalog-operator-68c6474976-4wvzh\" (UID: \"6ace15bc-71c5-45e7-8791-5d59045c73b9\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4wvzh" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.381671 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c097294b-dec5-400e-ba02-ccee86fcdb90-auth-proxy-config\") pod \"machine-config-operator-74547568cd-v2l77\" (UID: \"c097294b-dec5-400e-ba02-ccee86fcdb90\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-v2l77" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.381696 4919 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/ca9a516e-afc2-4475-8af8-23504d17f9a9-stats-auth\") pod \"router-default-5444994796-hmjhm\" (UID: \"ca9a516e-afc2-4475-8af8-23504d17f9a9\") " pod="openshift-ingress/router-default-5444994796-hmjhm" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.381719 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n2jfl\" (UniqueName: \"kubernetes.io/projected/0aa7f443-95c5-43ae-bd23-f4204d5f8778-kube-api-access-n2jfl\") pod \"olm-operator-6b444d44fb-cb8tv\" (UID: \"0aa7f443-95c5-43ae-bd23-f4204d5f8778\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-cb8tv" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.381751 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/ca9a516e-afc2-4475-8af8-23504d17f9a9-default-certificate\") pod \"router-default-5444994796-hmjhm\" (UID: \"ca9a516e-afc2-4475-8af8-23504d17f9a9\") " pod="openshift-ingress/router-default-5444994796-hmjhm" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.381792 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dsbkv\" (UniqueName: \"kubernetes.io/projected/32d0aafd-3bc5-4173-86df-ce624028b1a2-kube-api-access-dsbkv\") pod \"service-ca-9c57cc56f-74h67\" (UID: \"32d0aafd-3bc5-4173-86df-ce624028b1a2\") " pod="openshift-service-ca/service-ca-9c57cc56f-74h67" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.381868 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0c56f0ba-c8bc-41be-8a14-3c57051ebfda-proxy-tls\") pod \"machine-config-controller-84d6567774-j6q5w\" (UID: \"0c56f0ba-c8bc-41be-8a14-3c57051ebfda\") " 
pod="openshift-machine-config-operator/machine-config-controller-84d6567774-j6q5w" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.381905 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kc24k\" (UniqueName: \"kubernetes.io/projected/99f42fb7-eaa5-46d2-9443-81ad7a563cec-kube-api-access-kc24k\") pod \"auto-csr-approver-29552992-kp6wz\" (UID: \"99f42fb7-eaa5-46d2-9443-81ad7a563cec\") " pod="openshift-infra/auto-csr-approver-29552992-kp6wz" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.381929 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/32d0aafd-3bc5-4173-86df-ce624028b1a2-signing-cabundle\") pod \"service-ca-9c57cc56f-74h67\" (UID: \"32d0aafd-3bc5-4173-86df-ce624028b1a2\") " pod="openshift-service-ca/service-ca-9c57cc56f-74h67" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.382176 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0c56f0ba-c8bc-41be-8a14-3c57051ebfda-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-j6q5w\" (UID: \"0c56f0ba-c8bc-41be-8a14-3c57051ebfda\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-j6q5w" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.382289 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c097294b-dec5-400e-ba02-ccee86fcdb90-auth-proxy-config\") pod \"machine-config-operator-74547568cd-v2l77\" (UID: \"c097294b-dec5-400e-ba02-ccee86fcdb90\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-v2l77" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.393617 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 10 21:53:55 crc 
kubenswrapper[4919]: I0310 21:53:55.413495 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.426290 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/ca9a516e-afc2-4475-8af8-23504d17f9a9-default-certificate\") pod \"router-default-5444994796-hmjhm\" (UID: \"ca9a516e-afc2-4475-8af8-23504d17f9a9\") " pod="openshift-ingress/router-default-5444994796-hmjhm" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.434041 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.445920 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/ca9a516e-afc2-4475-8af8-23504d17f9a9-stats-auth\") pod \"router-default-5444994796-hmjhm\" (UID: \"ca9a516e-afc2-4475-8af8-23504d17f9a9\") " pod="openshift-ingress/router-default-5444994796-hmjhm" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.453609 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.464071 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ca9a516e-afc2-4475-8af8-23504d17f9a9-metrics-certs\") pod \"router-default-5444994796-hmjhm\" (UID: \"ca9a516e-afc2-4475-8af8-23504d17f9a9\") " pod="openshift-ingress/router-default-5444994796-hmjhm" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.474282 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.494497 4919 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-ingress"/"service-ca-bundle" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.502258 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ca9a516e-afc2-4475-8af8-23504d17f9a9-service-ca-bundle\") pod \"router-default-5444994796-hmjhm\" (UID: \"ca9a516e-afc2-4475-8af8-23504d17f9a9\") " pod="openshift-ingress/router-default-5444994796-hmjhm" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.513416 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.533410 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.553654 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.574000 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.594968 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.613790 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.633759 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.654015 4919 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.675356 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.694667 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.717240 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.733686 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.754095 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.774100 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.794771 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.814141 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.834191 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.854425 4919 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.884484 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.893758 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.914113 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.933429 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.953479 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.973641 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 10 21:53:55 crc kubenswrapper[4919]: I0310 21:53:55.993966 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 10 21:53:56 crc kubenswrapper[4919]: I0310 21:53:56.005214 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/be41b09e-a8ff-4367-a68d-865f047e2549-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-tk7xs\" (UID: \"be41b09e-a8ff-4367-a68d-865f047e2549\") " pod="openshift-marketplace/marketplace-operator-79b997595-tk7xs" Mar 10 21:53:56 crc kubenswrapper[4919]: I0310 21:53:56.014077 4919 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-marketplace"/"kube-root-ca.crt" Mar 10 21:53:56 crc kubenswrapper[4919]: I0310 21:53:56.045039 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 10 21:53:56 crc kubenswrapper[4919]: I0310 21:53:56.054708 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 10 21:53:56 crc kubenswrapper[4919]: I0310 21:53:56.056718 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/be41b09e-a8ff-4367-a68d-865f047e2549-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-tk7xs\" (UID: \"be41b09e-a8ff-4367-a68d-865f047e2549\") " pod="openshift-marketplace/marketplace-operator-79b997595-tk7xs" Mar 10 21:53:56 crc kubenswrapper[4919]: I0310 21:53:56.074018 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 10 21:53:56 crc kubenswrapper[4919]: I0310 21:53:56.094360 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 10 21:53:56 crc kubenswrapper[4919]: I0310 21:53:56.114523 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 10 21:53:56 crc kubenswrapper[4919]: I0310 21:53:56.134963 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 10 21:53:56 crc kubenswrapper[4919]: I0310 21:53:56.154198 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 10 21:53:56 crc kubenswrapper[4919]: I0310 21:53:56.174858 4919 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 10 21:53:56 crc kubenswrapper[4919]: I0310 21:53:56.191687 4919 request.go:700] Waited for 1.018531546s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/serviceaccounts/openshift-controller-manager-sa/token Mar 10 21:53:56 crc kubenswrapper[4919]: I0310 21:53:56.221853 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ckbrj\" (UniqueName: \"kubernetes.io/projected/d809a4c1-5e06-46b7-a39c-466b694361ce-kube-api-access-ckbrj\") pod \"controller-manager-879f6c89f-zbds9\" (UID: \"d809a4c1-5e06-46b7-a39c-466b694361ce\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zbds9" Mar 10 21:53:56 crc kubenswrapper[4919]: I0310 21:53:56.233557 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 10 21:53:56 crc kubenswrapper[4919]: I0310 21:53:56.236555 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-thbnh\" (UniqueName: \"kubernetes.io/projected/492e2f21-90e0-4073-b4ad-b562bcf62486-kube-api-access-thbnh\") pod \"route-controller-manager-6576b87f9c-ls7k6\" (UID: \"492e2f21-90e0-4073-b4ad-b562bcf62486\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ls7k6" Mar 10 21:53:56 crc kubenswrapper[4919]: I0310 21:53:56.254494 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 10 21:53:56 crc kubenswrapper[4919]: I0310 21:53:56.274653 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 10 21:53:56 crc kubenswrapper[4919]: I0310 21:53:56.282862 4919 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/c097294b-dec5-400e-ba02-ccee86fcdb90-images\") pod \"machine-config-operator-74547568cd-v2l77\" (UID: \"c097294b-dec5-400e-ba02-ccee86fcdb90\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-v2l77" Mar 10 21:53:56 crc kubenswrapper[4919]: I0310 21:53:56.286885 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ls7k6" Mar 10 21:53:56 crc kubenswrapper[4919]: I0310 21:53:56.314451 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 10 21:53:56 crc kubenswrapper[4919]: I0310 21:53:56.327554 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmggh\" (UniqueName: \"kubernetes.io/projected/bbf77df7-a88b-49cb-9d1f-10cdac32f499-kube-api-access-pmggh\") pod \"authentication-operator-69f744f599-2dzqk\" (UID: \"bbf77df7-a88b-49cb-9d1f-10cdac32f499\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-2dzqk" Mar 10 21:53:56 crc kubenswrapper[4919]: I0310 21:53:56.333663 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 10 21:53:56 crc kubenswrapper[4919]: I0310 21:53:56.346466 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c097294b-dec5-400e-ba02-ccee86fcdb90-proxy-tls\") pod \"machine-config-operator-74547568cd-v2l77\" (UID: \"c097294b-dec5-400e-ba02-ccee86fcdb90\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-v2l77" Mar 10 21:53:56 crc kubenswrapper[4919]: I0310 21:53:56.354887 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 10 21:53:56 
crc kubenswrapper[4919]: I0310 21:53:56.368208 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0c56f0ba-c8bc-41be-8a14-3c57051ebfda-proxy-tls\") pod \"machine-config-controller-84d6567774-j6q5w\" (UID: \"0c56f0ba-c8bc-41be-8a14-3c57051ebfda\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-j6q5w" Mar 10 21:53:56 crc kubenswrapper[4919]: I0310 21:53:56.374495 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 10 21:53:56 crc kubenswrapper[4919]: E0310 21:53:56.381261 4919 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/pprof-cert: failed to sync secret cache: timed out waiting for the condition Mar 10 21:53:56 crc kubenswrapper[4919]: E0310 21:53:56.381561 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0aa7f443-95c5-43ae-bd23-f4204d5f8778-profile-collector-cert podName:0aa7f443-95c5-43ae-bd23-f4204d5f8778 nodeName:}" failed. No retries permitted until 2026-03-10 21:53:56.881527618 +0000 UTC m=+224.123408256 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "profile-collector-cert" (UniqueName: "kubernetes.io/secret/0aa7f443-95c5-43ae-bd23-f4204d5f8778-profile-collector-cert") pod "olm-operator-6b444d44fb-cb8tv" (UID: "0aa7f443-95c5-43ae-bd23-f4204d5f8778") : failed to sync secret cache: timed out waiting for the condition Mar 10 21:53:56 crc kubenswrapper[4919]: E0310 21:53:56.381283 4919 secret.go:188] Couldn't get secret openshift-service-ca/signing-key: failed to sync secret cache: timed out waiting for the condition Mar 10 21:53:56 crc kubenswrapper[4919]: E0310 21:53:56.381841 4919 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/pprof-cert: failed to sync secret cache: timed out waiting for the condition Mar 10 21:53:56 crc kubenswrapper[4919]: E0310 21:53:56.381858 4919 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/pprof-cert: failed to sync secret cache: timed out waiting for the condition Mar 10 21:53:56 crc kubenswrapper[4919]: E0310 21:53:56.381350 4919 configmap.go:193] Couldn't get configMap openshift-operator-lifecycle-manager/collect-profiles-config: failed to sync configmap cache: timed out waiting for the condition Mar 10 21:53:56 crc kubenswrapper[4919]: E0310 21:53:56.381918 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6ace15bc-71c5-45e7-8791-5d59045c73b9-profile-collector-cert podName:6ace15bc-71c5-45e7-8791-5d59045c73b9 nodeName:}" failed. No retries permitted until 2026-03-10 21:53:56.881897378 +0000 UTC m=+224.123778006 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "profile-collector-cert" (UniqueName: "kubernetes.io/secret/6ace15bc-71c5-45e7-8791-5d59045c73b9-profile-collector-cert") pod "catalog-operator-68c6474976-4wvzh" (UID: "6ace15bc-71c5-45e7-8791-5d59045c73b9") : failed to sync secret cache: timed out waiting for the condition Mar 10 21:53:56 crc kubenswrapper[4919]: E0310 21:53:56.381964 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/55ff2223-69b6-4b72-9413-fce0c37ae2b2-secret-volume podName:55ff2223-69b6-4b72-9413-fce0c37ae2b2 nodeName:}" failed. No retries permitted until 2026-03-10 21:53:56.8819539 +0000 UTC m=+224.123834518 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "secret-volume" (UniqueName: "kubernetes.io/secret/55ff2223-69b6-4b72-9413-fce0c37ae2b2-secret-volume") pod "collect-profiles-29552985-n8r4j" (UID: "55ff2223-69b6-4b72-9413-fce0c37ae2b2") : failed to sync secret cache: timed out waiting for the condition Mar 10 21:53:56 crc kubenswrapper[4919]: E0310 21:53:56.381978 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/55ff2223-69b6-4b72-9413-fce0c37ae2b2-config-volume podName:55ff2223-69b6-4b72-9413-fce0c37ae2b2 nodeName:}" failed. No retries permitted until 2026-03-10 21:53:56.88197158 +0000 UTC m=+224.123852198 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config-volume" (UniqueName: "kubernetes.io/configmap/55ff2223-69b6-4b72-9413-fce0c37ae2b2-config-volume") pod "collect-profiles-29552985-n8r4j" (UID: "55ff2223-69b6-4b72-9413-fce0c37ae2b2") : failed to sync configmap cache: timed out waiting for the condition
Mar 10 21:53:56 crc kubenswrapper[4919]: E0310 21:53:56.381344 4919 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: failed to sync secret cache: timed out waiting for the condition
Mar 10 21:53:56 crc kubenswrapper[4919]: E0310 21:53:56.382037 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6ace15bc-71c5-45e7-8791-5d59045c73b9-srv-cert podName:6ace15bc-71c5-45e7-8791-5d59045c73b9 nodeName:}" failed. No retries permitted until 2026-03-10 21:53:56.882028252 +0000 UTC m=+224.123908870 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/6ace15bc-71c5-45e7-8791-5d59045c73b9-srv-cert") pod "catalog-operator-68c6474976-4wvzh" (UID: "6ace15bc-71c5-45e7-8791-5d59045c73b9") : failed to sync secret cache: timed out waiting for the condition
Mar 10 21:53:56 crc kubenswrapper[4919]: E0310 21:53:56.382076 4919 configmap.go:193] Couldn't get configMap openshift-service-ca/signing-cabundle: failed to sync configmap cache: timed out waiting for the condition
Mar 10 21:53:56 crc kubenswrapper[4919]: E0310 21:53:56.381373 4919 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: failed to sync secret cache: timed out waiting for the condition
Mar 10 21:53:56 crc kubenswrapper[4919]: E0310 21:53:56.382142 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/32d0aafd-3bc5-4173-86df-ce624028b1a2-signing-cabundle podName:32d0aafd-3bc5-4173-86df-ce624028b1a2 nodeName:}" failed. No retries permitted until 2026-03-10 21:53:56.882132004 +0000 UTC m=+224.124012622 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "signing-cabundle" (UniqueName: "kubernetes.io/configmap/32d0aafd-3bc5-4173-86df-ce624028b1a2-signing-cabundle") pod "service-ca-9c57cc56f-74h67" (UID: "32d0aafd-3bc5-4173-86df-ce624028b1a2") : failed to sync configmap cache: timed out waiting for the condition
Mar 10 21:53:56 crc kubenswrapper[4919]: E0310 21:53:56.382250 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0aa7f443-95c5-43ae-bd23-f4204d5f8778-srv-cert podName:0aa7f443-95c5-43ae-bd23-f4204d5f8778 nodeName:}" failed. No retries permitted until 2026-03-10 21:53:56.882204796 +0000 UTC m=+224.124085444 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/0aa7f443-95c5-43ae-bd23-f4204d5f8778-srv-cert") pod "olm-operator-6b444d44fb-cb8tv" (UID: "0aa7f443-95c5-43ae-bd23-f4204d5f8778") : failed to sync secret cache: timed out waiting for the condition
Mar 10 21:53:56 crc kubenswrapper[4919]: E0310 21:53:56.382770 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/32d0aafd-3bc5-4173-86df-ce624028b1a2-signing-key podName:32d0aafd-3bc5-4173-86df-ce624028b1a2 nodeName:}" failed. No retries permitted until 2026-03-10 21:53:56.882712091 +0000 UTC m=+224.124592739 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "signing-key" (UniqueName: "kubernetes.io/secret/32d0aafd-3bc5-4173-86df-ce624028b1a2-signing-key") pod "service-ca-9c57cc56f-74h67" (UID: "32d0aafd-3bc5-4173-86df-ce624028b1a2") : failed to sync secret cache: timed out waiting for the condition
Mar 10 21:53:56 crc kubenswrapper[4919]: I0310 21:53:56.394814 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf"
Mar 10 21:53:56 crc kubenswrapper[4919]: I0310 21:53:56.416463 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret"
Mar 10 21:53:56 crc kubenswrapper[4919]: I0310 21:53:56.433902 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk"
Mar 10 21:53:56 crc kubenswrapper[4919]: I0310 21:53:56.455039 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert"
Mar 10 21:53:56 crc kubenswrapper[4919]: I0310 21:53:56.473975 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Mar 10 21:53:56 crc kubenswrapper[4919]: I0310 21:53:56.494954 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Mar 10 21:53:56 crc kubenswrapper[4919]: I0310 21:53:56.509904 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-zbds9"
Mar 10 21:53:56 crc kubenswrapper[4919]: I0310 21:53:56.515176 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Mar 10 21:53:56 crc kubenswrapper[4919]: I0310 21:53:56.534234 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Mar 10 21:53:56 crc kubenswrapper[4919]: I0310 21:53:56.554328 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Mar 10 21:53:56 crc kubenswrapper[4919]: I0310 21:53:56.566037 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-2dzqk"
Mar 10 21:53:56 crc kubenswrapper[4919]: I0310 21:53:56.574846 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Mar 10 21:53:56 crc kubenswrapper[4919]: I0310 21:53:56.587218 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-ls7k6"]
Mar 10 21:53:56 crc kubenswrapper[4919]: I0310 21:53:56.595667 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl"
Mar 10 21:53:56 crc kubenswrapper[4919]: I0310 21:53:56.615240 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Mar 10 21:53:56 crc kubenswrapper[4919]: I0310 21:53:56.635770 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Mar 10 21:53:56 crc kubenswrapper[4919]: I0310 21:53:56.654839 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Mar 10 21:53:56 crc kubenswrapper[4919]: I0310 21:53:56.673975 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Mar 10 21:53:56 crc kubenswrapper[4919]: I0310 21:53:56.694178 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh"
Mar 10 21:53:56 crc kubenswrapper[4919]: I0310 21:53:56.713926 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Mar 10 21:53:56 crc kubenswrapper[4919]: I0310 21:53:56.737321 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt"
Mar 10 21:53:56 crc kubenswrapper[4919]: I0310 21:53:56.757789 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c"
Mar 10 21:53:56 crc kubenswrapper[4919]: I0310 21:53:56.759485 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-zbds9"]
Mar 10 21:53:56 crc kubenswrapper[4919]: I0310 21:53:56.774281 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key"
Mar 10 21:53:56 crc kubenswrapper[4919]: I0310 21:53:56.786932 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-2dzqk"]
Mar 10 21:53:56 crc kubenswrapper[4919]: I0310 21:53:56.796183 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Mar 10 21:53:56 crc kubenswrapper[4919]: I0310 21:53:56.814433 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Mar 10 21:53:56 crc kubenswrapper[4919]: I0310 21:53:56.849460 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Mar 10 21:53:56 crc kubenswrapper[4919]: I0310 21:53:56.853028 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 10 21:53:56 crc kubenswrapper[4919]: I0310 21:53:56.875404 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 10 21:53:56 crc kubenswrapper[4919]: I0310 21:53:56.893275 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Mar 10 21:53:56 crc kubenswrapper[4919]: I0310 21:53:56.905725 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/32d0aafd-3bc5-4173-86df-ce624028b1a2-signing-cabundle\") pod \"service-ca-9c57cc56f-74h67\" (UID: \"32d0aafd-3bc5-4173-86df-ce624028b1a2\") " pod="openshift-service-ca/service-ca-9c57cc56f-74h67"
Mar 10 21:53:56 crc kubenswrapper[4919]: I0310 21:53:56.905809 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/6ace15bc-71c5-45e7-8791-5d59045c73b9-srv-cert\") pod \"catalog-operator-68c6474976-4wvzh\" (UID: \"6ace15bc-71c5-45e7-8791-5d59045c73b9\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4wvzh"
Mar 10 21:53:56 crc kubenswrapper[4919]: I0310 21:53:56.905834 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/0aa7f443-95c5-43ae-bd23-f4204d5f8778-profile-collector-cert\") pod \"olm-operator-6b444d44fb-cb8tv\" (UID: \"0aa7f443-95c5-43ae-bd23-f4204d5f8778\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-cb8tv"
Mar 10 21:53:56 crc kubenswrapper[4919]: I0310 21:53:56.905867 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/55ff2223-69b6-4b72-9413-fce0c37ae2b2-config-volume\") pod \"collect-profiles-29552985-n8r4j\" (UID: \"55ff2223-69b6-4b72-9413-fce0c37ae2b2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552985-n8r4j"
Mar 10 21:53:56 crc kubenswrapper[4919]: I0310 21:53:56.905903 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/32d0aafd-3bc5-4173-86df-ce624028b1a2-signing-key\") pod \"service-ca-9c57cc56f-74h67\" (UID: \"32d0aafd-3bc5-4173-86df-ce624028b1a2\") " pod="openshift-service-ca/service-ca-9c57cc56f-74h67"
Mar 10 21:53:56 crc kubenswrapper[4919]: I0310 21:53:56.905926 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/0aa7f443-95c5-43ae-bd23-f4204d5f8778-srv-cert\") pod \"olm-operator-6b444d44fb-cb8tv\" (UID: \"0aa7f443-95c5-43ae-bd23-f4204d5f8778\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-cb8tv"
Mar 10 21:53:56 crc kubenswrapper[4919]: I0310 21:53:56.905975 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/6ace15bc-71c5-45e7-8791-5d59045c73b9-profile-collector-cert\") pod \"catalog-operator-68c6474976-4wvzh\" (UID: \"6ace15bc-71c5-45e7-8791-5d59045c73b9\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4wvzh"
Mar 10 21:53:56 crc kubenswrapper[4919]: I0310 21:53:56.906047 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/55ff2223-69b6-4b72-9413-fce0c37ae2b2-secret-volume\") pod \"collect-profiles-29552985-n8r4j\" (UID: \"55ff2223-69b6-4b72-9413-fce0c37ae2b2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552985-n8r4j"
Mar 10 21:53:56 crc kubenswrapper[4919]: I0310 21:53:56.908774 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/32d0aafd-3bc5-4173-86df-ce624028b1a2-signing-cabundle\") pod \"service-ca-9c57cc56f-74h67\" (UID: \"32d0aafd-3bc5-4173-86df-ce624028b1a2\") " pod="openshift-service-ca/service-ca-9c57cc56f-74h67"
Mar 10 21:53:56 crc kubenswrapper[4919]: I0310 21:53:56.911916 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/6ace15bc-71c5-45e7-8791-5d59045c73b9-profile-collector-cert\") pod \"catalog-operator-68c6474976-4wvzh\" (UID: \"6ace15bc-71c5-45e7-8791-5d59045c73b9\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4wvzh"
Mar 10 21:53:56 crc kubenswrapper[4919]: I0310 21:53:56.911993 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/0aa7f443-95c5-43ae-bd23-f4204d5f8778-profile-collector-cert\") pod \"olm-operator-6b444d44fb-cb8tv\" (UID: \"0aa7f443-95c5-43ae-bd23-f4204d5f8778\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-cb8tv"
Mar 10 21:53:56 crc kubenswrapper[4919]: I0310 21:53:56.912192 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/55ff2223-69b6-4b72-9413-fce0c37ae2b2-secret-volume\") pod \"collect-profiles-29552985-n8r4j\" (UID: \"55ff2223-69b6-4b72-9413-fce0c37ae2b2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552985-n8r4j"
Mar 10 21:53:56 crc kubenswrapper[4919]: I0310 21:53:56.912275 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/0aa7f443-95c5-43ae-bd23-f4204d5f8778-srv-cert\") pod \"olm-operator-6b444d44fb-cb8tv\" (UID: \"0aa7f443-95c5-43ae-bd23-f4204d5f8778\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-cb8tv"
Mar 10 21:53:56 crc kubenswrapper[4919]: I0310 21:53:56.912554 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/32d0aafd-3bc5-4173-86df-ce624028b1a2-signing-key\") pod \"service-ca-9c57cc56f-74h67\" (UID: \"32d0aafd-3bc5-4173-86df-ce624028b1a2\") " pod="openshift-service-ca/service-ca-9c57cc56f-74h67"
Mar 10 21:53:56 crc kubenswrapper[4919]: I0310 21:53:56.912610 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/6ace15bc-71c5-45e7-8791-5d59045c73b9-srv-cert\") pod \"catalog-operator-68c6474976-4wvzh\" (UID: \"6ace15bc-71c5-45e7-8791-5d59045c73b9\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4wvzh"
Mar 10 21:53:56 crc kubenswrapper[4919]: I0310 21:53:56.913207 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Mar 10 21:53:56 crc kubenswrapper[4919]: I0310 21:53:56.916907 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/55ff2223-69b6-4b72-9413-fce0c37ae2b2-config-volume\") pod \"collect-profiles-29552985-n8r4j\" (UID: \"55ff2223-69b6-4b72-9413-fce0c37ae2b2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552985-n8r4j"
Mar 10 21:53:56 crc kubenswrapper[4919]: I0310 21:53:56.954379 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Mar 10 21:53:56 crc kubenswrapper[4919]: I0310 21:53:56.973831 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Mar 10 21:53:56 crc kubenswrapper[4919]: I0310 21:53:56.994625 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx"
Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.014150 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt"
Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.036476 4919 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k"
Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.054348 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt"
Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.074295 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt"
Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.093121 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd"
Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.114604 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token"
Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.133574 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.167601 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4w2zr\" (UniqueName: \"kubernetes.io/projected/23546924-f010-4ffb-8e0a-cde77e6b086f-kube-api-access-4w2zr\") pod \"dns-operator-744455d44c-w5wb5\" (UID: \"23546924-f010-4ffb-8e0a-cde77e6b086f\") " pod="openshift-dns-operator/dns-operator-744455d44c-w5wb5"
Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.192074 4919 request.go:700] Waited for 1.914807073s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-cluster-samples-operator/serviceaccounts/cluster-samples-operator/token
Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.196841 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9mbw4\" (UniqueName: \"kubernetes.io/projected/b00c04d4-1287-409a-8e67-2edb888bf832-kube-api-access-9mbw4\") pod \"oauth-openshift-558db77b4-m9qd4\" (UID: \"b00c04d4-1287-409a-8e67-2edb888bf832\") " pod="openshift-authentication/oauth-openshift-558db77b4-m9qd4"
Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.209579 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gk858\" (UniqueName: \"kubernetes.io/projected/d8b972b3-02f6-4c31-bb8e-0c229ea48621-kube-api-access-gk858\") pod \"cluster-samples-operator-665b6dd947-zchq4\" (UID: \"d8b972b3-02f6-4c31-bb8e-0c229ea48621\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zchq4"
Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.239481 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/19dc609c-7c35-46d2-b621-34faa138eedd-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-7g7s2\" (UID: \"19dc609c-7c35-46d2-b621-34faa138eedd\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7g7s2"
Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.256662 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-w5wb5"
Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.263081 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2nvl\" (UniqueName: \"kubernetes.io/projected/b5a5c848-4844-4280-a444-9173fff0b8e1-kube-api-access-m2nvl\") pod \"apiserver-7bbb656c7d-qrdwk\" (UID: \"b5a5c848-4844-4280-a444-9173fff0b8e1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qrdwk"
Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.271100 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rh86p\" (UniqueName: \"kubernetes.io/projected/077de36b-affe-4c2b-905d-38ae64514274-kube-api-access-rh86p\") pod \"openshift-apiserver-operator-796bbdcf4f-ttjzg\" (UID: \"077de36b-affe-4c2b-905d-38ae64514274\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ttjzg"
Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.301079 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c95v2\" (UniqueName: \"kubernetes.io/projected/b2c5effa-6a05-4978-b4a9-1daada6b2465-kube-api-access-c95v2\") pod \"apiserver-76f77b778f-fb6tt\" (UID: \"b2c5effa-6a05-4978-b4a9-1daada6b2465\") " pod="openshift-apiserver/apiserver-76f77b778f-fb6tt"
Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.311208 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-fb6tt"
Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.319091 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zchq4"
Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.326764 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pfbc7\" (UniqueName: \"kubernetes.io/projected/f779c56d-d87a-4e36-8a8d-f9d93bb2b7e0-kube-api-access-pfbc7\") pod \"openshift-config-operator-7777fb866f-6nwch\" (UID: \"f779c56d-d87a-4e36-8a8d-f9d93bb2b7e0\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-6nwch"
Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.331571 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-m9qd4"
Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.345028 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bs7pm\" (UniqueName: \"kubernetes.io/projected/6350fa6a-9381-4de8-8a8e-4a14b9d253bc-kube-api-access-bs7pm\") pod \"etcd-operator-b45778765-kjw8s\" (UID: \"6350fa6a-9381-4de8-8a8e-4a14b9d253bc\") " pod="openshift-etcd-operator/etcd-operator-b45778765-kjw8s"
Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.357160 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ggmjx\" (UniqueName: \"kubernetes.io/projected/19dc609c-7c35-46d2-b621-34faa138eedd-kube-api-access-ggmjx\") pod \"cluster-image-registry-operator-dc59b4c8b-7g7s2\" (UID: \"19dc609c-7c35-46d2-b621-34faa138eedd\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7g7s2"
Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.383036 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dtpc6\" (UniqueName: \"kubernetes.io/projected/69fc2ea4-6491-4109-8f4a-8b8fb369dcce-kube-api-access-dtpc6\") pod \"machine-api-operator-5694c8668f-mvwrl\" (UID: \"69fc2ea4-6491-4109-8f4a-8b8fb369dcce\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mvwrl"
Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.406737 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dg7bw\" (UniqueName: \"kubernetes.io/projected/fb7623ea-ec67-4061-82a6-4099e52fa3b9-kube-api-access-dg7bw\") pod \"downloads-7954f5f757-49z6f\" (UID: \"fb7623ea-ec67-4061-82a6-4099e52fa3b9\") " pod="openshift-console/downloads-7954f5f757-49z6f"
Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.417648 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8thhg\" (UniqueName: \"kubernetes.io/projected/9657873d-9275-4945-9e91-0b2c2844ae5d-kube-api-access-8thhg\") pod \"console-f9d7485db-58nxf\" (UID: \"9657873d-9275-4945-9e91-0b2c2844ae5d\") " pod="openshift-console/console-f9d7485db-58nxf"
Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.445626 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-brnxv\" (UniqueName: \"kubernetes.io/projected/c18a5bd2-fc75-4dce-8ffa-0ba191c52064-kube-api-access-brnxv\") pod \"machine-approver-56656f9798-7skkx\" (UID: \"c18a5bd2-fc75-4dce-8ffa-0ba191c52064\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-7skkx"
Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.453909 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xzm74\" (UniqueName: \"kubernetes.io/projected/5e327c09-06d0-420c-b749-50306c5336b3-kube-api-access-xzm74\") pod \"console-operator-58897d9998-h9zcs\" (UID: \"5e327c09-06d0-420c-b749-50306c5336b3\") " pod="openshift-console-operator/console-operator-58897d9998-h9zcs"
Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.472610 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nd55m\" (UniqueName: \"kubernetes.io/projected/c097294b-dec5-400e-ba02-ccee86fcdb90-kube-api-access-nd55m\") pod \"machine-config-operator-74547568cd-v2l77\" (UID: \"c097294b-dec5-400e-ba02-ccee86fcdb90\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-v2l77"
Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.479099 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ls7k6" event={"ID":"492e2f21-90e0-4073-b4ad-b562bcf62486","Type":"ContainerStarted","Data":"82736eb7c81aad2652a4c08aa003a166ebbe6309ec0b3d4bce360792e944093d"}
Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.479141 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ls7k6" event={"ID":"492e2f21-90e0-4073-b4ad-b562bcf62486","Type":"ContainerStarted","Data":"b822a29adc63b2ffc38354a6a6a9c1220e089b9bf21baafb801c26afc300396f"}
Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.479275 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-v2l77"
Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.489862 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ls7k6"
Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.489902 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-zbds9"
Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.489925 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-2dzqk" event={"ID":"bbf77df7-a88b-49cb-9d1f-10cdac32f499","Type":"ContainerStarted","Data":"b792e5d48fa5cbeaeb22d83178152b3ed6340ccbf0b118e9fe5dc558a4eb71a4"}
Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.489949 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-2dzqk" event={"ID":"bbf77df7-a88b-49cb-9d1f-10cdac32f499","Type":"ContainerStarted","Data":"e2dc709cbab2f43e451c941317df2831944d500e269cafd9f6cabdb6fe71dc21"}
Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.489966 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-zbds9" event={"ID":"d809a4c1-5e06-46b7-a39c-466b694361ce","Type":"ContainerStarted","Data":"7daa24c1b8a8fe8086f04d6485368449fdbcfc28d3dd69f60d84d54068cb501f"}
Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.489981 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-zbds9" event={"ID":"d809a4c1-5e06-46b7-a39c-466b694361ce","Type":"ContainerStarted","Data":"8f224cce93fd6990db903a4372ed65a00bf121c1cb5fc85d994d71b934e7eb0d"}
Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.492088 4919 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-zbds9 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" start-of-body=
Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.492148 4919 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-zbds9" podUID="d809a4c1-5e06-46b7-a39c-466b694361ce" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused"
Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.492741 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-w5wb5"]
Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.496932 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-58cpc\" (UniqueName: \"kubernetes.io/projected/55ff2223-69b6-4b72-9413-fce0c37ae2b2-kube-api-access-58cpc\") pod \"collect-profiles-29552985-n8r4j\" (UID: \"55ff2223-69b6-4b72-9413-fce0c37ae2b2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552985-n8r4j"
Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.500643 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-7skkx"
Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.512284 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nrrpd\" (UniqueName: \"kubernetes.io/projected/be41b09e-a8ff-4367-a68d-865f047e2549-kube-api-access-nrrpd\") pod \"marketplace-operator-79b997595-tk7xs\" (UID: \"be41b09e-a8ff-4367-a68d-865f047e2549\") " pod="openshift-marketplace/marketplace-operator-79b997595-tk7xs"
Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.512826 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qrdwk"
Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.529319 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4p5c7\" (UniqueName: \"kubernetes.io/projected/ca9a516e-afc2-4475-8af8-23504d17f9a9-kube-api-access-4p5c7\") pod \"router-default-5444994796-hmjhm\" (UID: \"ca9a516e-afc2-4475-8af8-23504d17f9a9\") " pod="openshift-ingress/router-default-5444994796-hmjhm"
Mar 10 21:53:57 crc kubenswrapper[4919]: W0310 21:53:57.532235 4919 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod23546924_f010_4ffb_8e0a_cde77e6b086f.slice/crio-4959f0246cbc3395fdde26a39c68b4e163e2bca5b9e246b7a439e8fbd86d0fb1 WatchSource:0}: Error finding container 4959f0246cbc3395fdde26a39c68b4e163e2bca5b9e246b7a439e8fbd86d0fb1: Status 404 returned error can't find the container with id 4959f0246cbc3395fdde26a39c68b4e163e2bca5b9e246b7a439e8fbd86d0fb1
Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.534729 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ttjzg"
Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.552061 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-njmn2\" (UniqueName: \"kubernetes.io/projected/0c56f0ba-c8bc-41be-8a14-3c57051ebfda-kube-api-access-njmn2\") pod \"machine-config-controller-84d6567774-j6q5w\" (UID: \"0c56f0ba-c8bc-41be-8a14-3c57051ebfda\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-j6q5w"
Mar 10 21:53:57 crc kubenswrapper[4919]: W0310 21:53:57.559728 4919 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc18a5bd2_fc75_4dce_8ffa_0ba191c52064.slice/crio-265e6c2eed301e94c67b3252344eda07c0a29370bc8444f1502d4a4c2ecac2a7 WatchSource:0}: Error finding container 265e6c2eed301e94c67b3252344eda07c0a29370bc8444f1502d4a4c2ecac2a7: Status 404 returned error can't find the container with id 265e6c2eed301e94c67b3252344eda07c0a29370bc8444f1502d4a4c2ecac2a7
Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.569198 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7kzz4\" (UniqueName: \"kubernetes.io/projected/6ace15bc-71c5-45e7-8791-5d59045c73b9-kube-api-access-7kzz4\") pod \"catalog-operator-68c6474976-4wvzh\" (UID: \"6ace15bc-71c5-45e7-8791-5d59045c73b9\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4wvzh"
Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.572731 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552985-n8r4j"
Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.572745 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-mvwrl"
Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.579309 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-h9zcs"
Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.584350 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-fb6tt"]
Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.591693 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n2jfl\" (UniqueName: \"kubernetes.io/projected/0aa7f443-95c5-43ae-bd23-f4204d5f8778-kube-api-access-n2jfl\") pod \"olm-operator-6b444d44fb-cb8tv\" (UID: \"0aa7f443-95c5-43ae-bd23-f4204d5f8778\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-cb8tv"
Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.607744 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-6nwch"
Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.623926 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-49z6f"
Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.624727 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dsbkv\" (UniqueName: \"kubernetes.io/projected/32d0aafd-3bc5-4173-86df-ce624028b1a2-kube-api-access-dsbkv\") pod \"service-ca-9c57cc56f-74h67\" (UID: \"32d0aafd-3bc5-4173-86df-ce624028b1a2\") " pod="openshift-service-ca/service-ca-9c57cc56f-74h67"
Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.627887 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kc24k\" (UniqueName: \"kubernetes.io/projected/99f42fb7-eaa5-46d2-9443-81ad7a563cec-kube-api-access-kc24k\") pod \"auto-csr-approver-29552992-kp6wz\" (UID: \"99f42fb7-eaa5-46d2-9443-81ad7a563cec\") " pod="openshift-infra/auto-csr-approver-29552992-kp6wz"
Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.637949 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7g7s2"
Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.644299 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-kjw8s"
Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.651061 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-58nxf"
Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.662963 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-hmjhm"
Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.696940 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-v2l77"]
Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.716202 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hjlh7\" (UniqueName: \"kubernetes.io/projected/decc4e59-7a45-4f03-bf5e-3ae08d304c61-kube-api-access-hjlh7\") pod \"kube-storage-version-migrator-operator-b67b599dd-bbh8l\" (UID: \"decc4e59-7a45-4f03-bf5e-3ae08d304c61\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bbh8l"
Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.716235 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3a6010be-2af4-4ced-97bd-bff2e07722a9-apiservice-cert\") pod \"packageserver-d55dfcdfc-77k8x\" (UID: \"3a6010be-2af4-4ced-97bd-bff2e07722a9\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-77k8x"
Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.716285 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lxrsj\" (UID: \"8383a8d8-69ec-4706-8ea3-99ce91e5200c\") " pod="openshift-image-registry/image-registry-697d97f7c8-lxrsj"
Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.716301 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gb2hf\" (UniqueName: \"kubernetes.io/projected/705cef55-4e4d-402a-93e5-d5e880c6424e-kube-api-access-gb2hf\") pod \"multus-admission-controller-857f4d67dd-dwg72\" (UID: \"705cef55-4e4d-402a-93e5-d5e880c6424e\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-dwg72"
Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.716335 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/3a6010be-2af4-4ced-97bd-bff2e07722a9-tmpfs\") pod \"packageserver-d55dfcdfc-77k8x\" (UID: \"3a6010be-2af4-4ced-97bd-bff2e07722a9\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-77k8x"
Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.716358 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqpbs\" (UniqueName: \"kubernetes.io/projected/8383a8d8-69ec-4706-8ea3-99ce91e5200c-kube-api-access-rqpbs\") pod \"image-registry-697d97f7c8-lxrsj\" (UID: \"8383a8d8-69ec-4706-8ea3-99ce91e5200c\") " pod="openshift-image-registry/image-registry-697d97f7c8-lxrsj"
Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.716413 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3a6010be-2af4-4ced-97bd-bff2e07722a9-webhook-cert\") pod \"packageserver-d55dfcdfc-77k8x\" (UID: \"3a6010be-2af4-4ced-97bd-bff2e07722a9\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-77k8x"
Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.716430 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8383a8d8-69ec-4706-8ea3-99ce91e5200c-trusted-ca\") pod \"image-registry-697d97f7c8-lxrsj\" (UID: \"8383a8d8-69ec-4706-8ea3-99ce91e5200c\") " pod="openshift-image-registry/image-registry-697d97f7c8-lxrsj"
Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.716446 4919 reconciler_common.go:245]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3fcbd6a7-5b9f-4a6a-8705-a2110bc0d7e0-bound-sa-token\") pod \"ingress-operator-5b745b69d9-f48z9\" (UID: \"3fcbd6a7-5b9f-4a6a-8705-a2110bc0d7e0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-f48z9" Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.716463 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/27275499-8d56-431b-9f77-928a700c1dad-serving-cert\") pod \"service-ca-operator-777779d784-htbkb\" (UID: \"27275499-8d56-431b-9f77-928a700c1dad\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-htbkb" Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.716478 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6d9b96a2-533c-4a0e-bab8-e63340713c3b-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-2xrj5\" (UID: \"6d9b96a2-533c-4a0e-bab8-e63340713c3b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2xrj5" Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.716527 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/decc4e59-7a45-4f03-bf5e-3ae08d304c61-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-bbh8l\" (UID: \"decc4e59-7a45-4f03-bf5e-3ae08d304c61\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bbh8l" Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.716559 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c9grm\" (UniqueName: 
\"kubernetes.io/projected/24606597-7caf-479b-b81f-5042f2ba3027-kube-api-access-c9grm\") pod \"migrator-59844c95c7-hgx8b\" (UID: \"24606597-7caf-479b-b81f-5042f2ba3027\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-hgx8b" Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.716572 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/decc4e59-7a45-4f03-bf5e-3ae08d304c61-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-bbh8l\" (UID: \"decc4e59-7a45-4f03-bf5e-3ae08d304c61\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bbh8l" Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.716587 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rdgb2\" (UniqueName: \"kubernetes.io/projected/3fcbd6a7-5b9f-4a6a-8705-a2110bc0d7e0-kube-api-access-rdgb2\") pod \"ingress-operator-5b745b69d9-f48z9\" (UID: \"3fcbd6a7-5b9f-4a6a-8705-a2110bc0d7e0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-f48z9" Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.716602 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6d9b96a2-533c-4a0e-bab8-e63340713c3b-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-2xrj5\" (UID: \"6d9b96a2-533c-4a0e-bab8-e63340713c3b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2xrj5" Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.716619 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d47e6447-a8e0-4bf3-8317-5c23cff4e6a7-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-c62fh\" (UID: 
\"d47e6447-a8e0-4bf3-8317-5c23cff4e6a7\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-c62fh" Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.716644 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d403c55b-6082-4056-8111-63f5833b28a6-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-bftfp\" (UID: \"d403c55b-6082-4056-8111-63f5833b28a6\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bftfp" Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.716661 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hsnwh\" (UniqueName: \"kubernetes.io/projected/1c07a718-9218-4471-b909-03975021d691-kube-api-access-hsnwh\") pod \"package-server-manager-789f6589d5-n8bb5\" (UID: \"1c07a718-9218-4471-b909-03975021d691\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-n8bb5" Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.716684 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d403c55b-6082-4056-8111-63f5833b28a6-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-bftfp\" (UID: \"d403c55b-6082-4056-8111-63f5833b28a6\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bftfp" Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.716700 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4q8pr\" (UniqueName: \"kubernetes.io/projected/3a6010be-2af4-4ced-97bd-bff2e07722a9-kube-api-access-4q8pr\") pod \"packageserver-d55dfcdfc-77k8x\" (UID: \"3a6010be-2af4-4ced-97bd-bff2e07722a9\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-77k8x" Mar 10 21:53:57 crc 
kubenswrapper[4919]: I0310 21:53:57.716739 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8383a8d8-69ec-4706-8ea3-99ce91e5200c-installation-pull-secrets\") pod \"image-registry-697d97f7c8-lxrsj\" (UID: \"8383a8d8-69ec-4706-8ea3-99ce91e5200c\") " pod="openshift-image-registry/image-registry-697d97f7c8-lxrsj" Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.716755 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nvrg5\" (UniqueName: \"kubernetes.io/projected/6d9b96a2-533c-4a0e-bab8-e63340713c3b-kube-api-access-nvrg5\") pod \"openshift-controller-manager-operator-756b6f6bc6-2xrj5\" (UID: \"6d9b96a2-533c-4a0e-bab8-e63340713c3b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2xrj5" Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.716771 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kpjrn\" (UniqueName: \"kubernetes.io/projected/d7afb497-bec3-4211-a9d9-c914438bdf59-kube-api-access-kpjrn\") pod \"dns-default-26f88\" (UID: \"d7afb497-bec3-4211-a9d9-c914438bdf59\") " pod="openshift-dns/dns-default-26f88" Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.716787 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3fcbd6a7-5b9f-4a6a-8705-a2110bc0d7e0-trusted-ca\") pod \"ingress-operator-5b745b69d9-f48z9\" (UID: \"3fcbd6a7-5b9f-4a6a-8705-a2110bc0d7e0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-f48z9" Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.716812 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/8383a8d8-69ec-4706-8ea3-99ce91e5200c-registry-tls\") pod \"image-registry-697d97f7c8-lxrsj\" (UID: \"8383a8d8-69ec-4706-8ea3-99ce91e5200c\") " pod="openshift-image-registry/image-registry-697d97f7c8-lxrsj" Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.716827 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3fcbd6a7-5b9f-4a6a-8705-a2110bc0d7e0-metrics-tls\") pod \"ingress-operator-5b745b69d9-f48z9\" (UID: \"3fcbd6a7-5b9f-4a6a-8705-a2110bc0d7e0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-f48z9" Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.716842 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d47e6447-a8e0-4bf3-8317-5c23cff4e6a7-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-c62fh\" (UID: \"d47e6447-a8e0-4bf3-8317-5c23cff4e6a7\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-c62fh" Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.716896 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q9686\" (UniqueName: \"kubernetes.io/projected/27275499-8d56-431b-9f77-928a700c1dad-kube-api-access-q9686\") pod \"service-ca-operator-777779d784-htbkb\" (UID: \"27275499-8d56-431b-9f77-928a700c1dad\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-htbkb" Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.716914 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8a917335-6837-4601-a4e5-0c41252e4d83-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-kgftl\" (UID: \"8a917335-6837-4601-a4e5-0c41252e4d83\") " 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kgftl" Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.716974 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/705cef55-4e4d-402a-93e5-d5e880c6424e-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-dwg72\" (UID: \"705cef55-4e4d-402a-93e5-d5e880c6424e\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-dwg72" Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.717012 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/27275499-8d56-431b-9f77-928a700c1dad-config\") pod \"service-ca-operator-777779d784-htbkb\" (UID: \"27275499-8d56-431b-9f77-928a700c1dad\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-htbkb" Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.717087 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d403c55b-6082-4056-8111-63f5833b28a6-config\") pod \"kube-apiserver-operator-766d6c64bb-bftfp\" (UID: \"d403c55b-6082-4056-8111-63f5833b28a6\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bftfp" Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.717110 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8383a8d8-69ec-4706-8ea3-99ce91e5200c-ca-trust-extracted\") pod \"image-registry-697d97f7c8-lxrsj\" (UID: \"8383a8d8-69ec-4706-8ea3-99ce91e5200c\") " pod="openshift-image-registry/image-registry-697d97f7c8-lxrsj" Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.717143 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8a917335-6837-4601-a4e5-0c41252e4d83-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-kgftl\" (UID: \"8a917335-6837-4601-a4e5-0c41252e4d83\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kgftl" Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.717176 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/425d1938-0668-4f53-aaee-dbc4a93297c7-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-qnjs8\" (UID: \"425d1938-0668-4f53-aaee-dbc4a93297c7\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qnjs8" Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.717201 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8a917335-6837-4601-a4e5-0c41252e4d83-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-kgftl\" (UID: \"8a917335-6837-4601-a4e5-0c41252e4d83\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kgftl" Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.717216 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4927\" (UniqueName: \"kubernetes.io/projected/425d1938-0668-4f53-aaee-dbc4a93297c7-kube-api-access-d4927\") pod \"control-plane-machine-set-operator-78cbb6b69f-qnjs8\" (UID: \"425d1938-0668-4f53-aaee-dbc4a93297c7\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qnjs8" Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.717230 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/d7afb497-bec3-4211-a9d9-c914438bdf59-config-volume\") pod \"dns-default-26f88\" (UID: \"d7afb497-bec3-4211-a9d9-c914438bdf59\") " pod="openshift-dns/dns-default-26f88" Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.717257 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8383a8d8-69ec-4706-8ea3-99ce91e5200c-registry-certificates\") pod \"image-registry-697d97f7c8-lxrsj\" (UID: \"8383a8d8-69ec-4706-8ea3-99ce91e5200c\") " pod="openshift-image-registry/image-registry-697d97f7c8-lxrsj" Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.717315 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d7afb497-bec3-4211-a9d9-c914438bdf59-metrics-tls\") pod \"dns-default-26f88\" (UID: \"d7afb497-bec3-4211-a9d9-c914438bdf59\") " pod="openshift-dns/dns-default-26f88" Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.717349 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d47e6447-a8e0-4bf3-8317-5c23cff4e6a7-config\") pod \"kube-controller-manager-operator-78b949d7b-c62fh\" (UID: \"d47e6447-a8e0-4bf3-8317-5c23cff4e6a7\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-c62fh" Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.717441 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8383a8d8-69ec-4706-8ea3-99ce91e5200c-bound-sa-token\") pod \"image-registry-697d97f7c8-lxrsj\" (UID: \"8383a8d8-69ec-4706-8ea3-99ce91e5200c\") " pod="openshift-image-registry/image-registry-697d97f7c8-lxrsj" Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.717551 4919 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/1c07a718-9218-4471-b909-03975021d691-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-n8bb5\" (UID: \"1c07a718-9218-4471-b909-03975021d691\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-n8bb5" Mar 10 21:53:57 crc kubenswrapper[4919]: E0310 21:53:57.718753 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 21:53:58.218742155 +0000 UTC m=+225.460622753 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lxrsj" (UID: "8383a8d8-69ec-4706-8ea3-99ce91e5200c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.750515 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-tk7xs" Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.757634 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-qrdwk"] Mar 10 21:53:57 crc kubenswrapper[4919]: W0310 21:53:57.764669 4919 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc097294b_dec5_400e_ba02_ccee86fcdb90.slice/crio-8d02bde085ffa032589fde934c0b3095d9fe0584107803451126670684a5defc WatchSource:0}: Error finding container 8d02bde085ffa032589fde934c0b3095d9fe0584107803451126670684a5defc: Status 404 returned error can't find the container with id 8d02bde085ffa032589fde934c0b3095d9fe0584107803451126670684a5defc Mar 10 21:53:57 crc kubenswrapper[4919]: W0310 21:53:57.767309 4919 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podca9a516e_afc2_4475_8af8_23504d17f9a9.slice/crio-16da5256f1356ff22aef93360d167c62c1b1e0963172a03486b553920168b7f3 WatchSource:0}: Error finding container 16da5256f1356ff22aef93360d167c62c1b1e0963172a03486b553920168b7f3: Status 404 returned error can't find the container with id 16da5256f1356ff22aef93360d167c62c1b1e0963172a03486b553920168b7f3 Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.771688 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-j6q5w" Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.778750 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ttjzg"] Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.794929 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4wvzh" Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.803887 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-cb8tv" Mar 10 21:53:57 crc kubenswrapper[4919]: E0310 21:53:57.818439 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 21:53:58.318415444 +0000 UTC m=+225.560296052 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.818470 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.821766 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3fcbd6a7-5b9f-4a6a-8705-a2110bc0d7e0-metrics-tls\") pod \"ingress-operator-5b745b69d9-f48z9\" (UID: \"3fcbd6a7-5b9f-4a6a-8705-a2110bc0d7e0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-f48z9" Mar 10 21:53:57 crc 
kubenswrapper[4919]: I0310 21:53:57.821812 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8383a8d8-69ec-4706-8ea3-99ce91e5200c-registry-tls\") pod \"image-registry-697d97f7c8-lxrsj\" (UID: \"8383a8d8-69ec-4706-8ea3-99ce91e5200c\") " pod="openshift-image-registry/image-registry-697d97f7c8-lxrsj" Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.821830 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d47e6447-a8e0-4bf3-8317-5c23cff4e6a7-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-c62fh\" (UID: \"d47e6447-a8e0-4bf3-8317-5c23cff4e6a7\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-c62fh" Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.821901 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f3993e7f-64ea-43e3-a867-7465178fdf99-cert\") pod \"ingress-canary-hfqfv\" (UID: \"f3993e7f-64ea-43e3-a867-7465178fdf99\") " pod="openshift-ingress-canary/ingress-canary-hfqfv" Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.821925 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q9686\" (UniqueName: \"kubernetes.io/projected/27275499-8d56-431b-9f77-928a700c1dad-kube-api-access-q9686\") pod \"service-ca-operator-777779d784-htbkb\" (UID: \"27275499-8d56-431b-9f77-928a700c1dad\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-htbkb" Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.821972 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8a917335-6837-4601-a4e5-0c41252e4d83-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-kgftl\" (UID: 
\"8a917335-6837-4601-a4e5-0c41252e4d83\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kgftl" Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.821997 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/705cef55-4e4d-402a-93e5-d5e880c6424e-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-dwg72\" (UID: \"705cef55-4e4d-402a-93e5-d5e880c6424e\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-dwg72" Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.822015 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/62cf0e58-2480-43ff-a9aa-f8543fefd9f9-csi-data-dir\") pod \"csi-hostpathplugin-nv6qp\" (UID: \"62cf0e58-2480-43ff-a9aa-f8543fefd9f9\") " pod="hostpath-provisioner/csi-hostpathplugin-nv6qp" Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.822033 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/27275499-8d56-431b-9f77-928a700c1dad-config\") pod \"service-ca-operator-777779d784-htbkb\" (UID: \"27275499-8d56-431b-9f77-928a700c1dad\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-htbkb" Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.822065 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/62cf0e58-2480-43ff-a9aa-f8543fefd9f9-mountpoint-dir\") pod \"csi-hostpathplugin-nv6qp\" (UID: \"62cf0e58-2480-43ff-a9aa-f8543fefd9f9\") " pod="hostpath-provisioner/csi-hostpathplugin-nv6qp" Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.822107 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/d403c55b-6082-4056-8111-63f5833b28a6-config\") pod \"kube-apiserver-operator-766d6c64bb-bftfp\" (UID: \"d403c55b-6082-4056-8111-63f5833b28a6\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bftfp" Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.822134 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8383a8d8-69ec-4706-8ea3-99ce91e5200c-ca-trust-extracted\") pod \"image-registry-697d97f7c8-lxrsj\" (UID: \"8383a8d8-69ec-4706-8ea3-99ce91e5200c\") " pod="openshift-image-registry/image-registry-697d97f7c8-lxrsj" Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.822150 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/62cf0e58-2480-43ff-a9aa-f8543fefd9f9-plugins-dir\") pod \"csi-hostpathplugin-nv6qp\" (UID: \"62cf0e58-2480-43ff-a9aa-f8543fefd9f9\") " pod="hostpath-provisioner/csi-hostpathplugin-nv6qp" Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.822189 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8a917335-6837-4601-a4e5-0c41252e4d83-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-kgftl\" (UID: \"8a917335-6837-4601-a4e5-0c41252e4d83\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kgftl" Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.822220 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/425d1938-0668-4f53-aaee-dbc4a93297c7-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-qnjs8\" (UID: \"425d1938-0668-4f53-aaee-dbc4a93297c7\") " 
pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qnjs8" Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.822261 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8a917335-6837-4601-a4e5-0c41252e4d83-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-kgftl\" (UID: \"8a917335-6837-4601-a4e5-0c41252e4d83\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kgftl" Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.822279 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d4927\" (UniqueName: \"kubernetes.io/projected/425d1938-0668-4f53-aaee-dbc4a93297c7-kube-api-access-d4927\") pod \"control-plane-machine-set-operator-78cbb6b69f-qnjs8\" (UID: \"425d1938-0668-4f53-aaee-dbc4a93297c7\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qnjs8" Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.822295 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d7afb497-bec3-4211-a9d9-c914438bdf59-config-volume\") pod \"dns-default-26f88\" (UID: \"d7afb497-bec3-4211-a9d9-c914438bdf59\") " pod="openshift-dns/dns-default-26f88" Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.822339 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8383a8d8-69ec-4706-8ea3-99ce91e5200c-registry-certificates\") pod \"image-registry-697d97f7c8-lxrsj\" (UID: \"8383a8d8-69ec-4706-8ea3-99ce91e5200c\") " pod="openshift-image-registry/image-registry-697d97f7c8-lxrsj" Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.822381 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: 
\"kubernetes.io/host-path/62cf0e58-2480-43ff-a9aa-f8543fefd9f9-socket-dir\") pod \"csi-hostpathplugin-nv6qp\" (UID: \"62cf0e58-2480-43ff-a9aa-f8543fefd9f9\") " pod="hostpath-provisioner/csi-hostpathplugin-nv6qp" Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.822413 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d7afb497-bec3-4211-a9d9-c914438bdf59-metrics-tls\") pod \"dns-default-26f88\" (UID: \"d7afb497-bec3-4211-a9d9-c914438bdf59\") " pod="openshift-dns/dns-default-26f88" Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.822444 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d47e6447-a8e0-4bf3-8317-5c23cff4e6a7-config\") pod \"kube-controller-manager-operator-78b949d7b-c62fh\" (UID: \"d47e6447-a8e0-4bf3-8317-5c23cff4e6a7\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-c62fh" Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.822468 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/62cf0e58-2480-43ff-a9aa-f8543fefd9f9-registration-dir\") pod \"csi-hostpathplugin-nv6qp\" (UID: \"62cf0e58-2480-43ff-a9aa-f8543fefd9f9\") " pod="hostpath-provisioner/csi-hostpathplugin-nv6qp" Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.822549 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8383a8d8-69ec-4706-8ea3-99ce91e5200c-bound-sa-token\") pod \"image-registry-697d97f7c8-lxrsj\" (UID: \"8383a8d8-69ec-4706-8ea3-99ce91e5200c\") " pod="openshift-image-registry/image-registry-697d97f7c8-lxrsj" Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.822588 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-99fsq\" (UniqueName: \"kubernetes.io/projected/f3993e7f-64ea-43e3-a867-7465178fdf99-kube-api-access-99fsq\") pod \"ingress-canary-hfqfv\" (UID: \"f3993e7f-64ea-43e3-a867-7465178fdf99\") " pod="openshift-ingress-canary/ingress-canary-hfqfv" Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.822622 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56ps7\" (UniqueName: \"kubernetes.io/projected/62cf0e58-2480-43ff-a9aa-f8543fefd9f9-kube-api-access-56ps7\") pod \"csi-hostpathplugin-nv6qp\" (UID: \"62cf0e58-2480-43ff-a9aa-f8543fefd9f9\") " pod="hostpath-provisioner/csi-hostpathplugin-nv6qp" Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.822655 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/1c07a718-9218-4471-b909-03975021d691-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-n8bb5\" (UID: \"1c07a718-9218-4471-b909-03975021d691\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-n8bb5" Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.822689 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hjlh7\" (UniqueName: \"kubernetes.io/projected/decc4e59-7a45-4f03-bf5e-3ae08d304c61-kube-api-access-hjlh7\") pod \"kube-storage-version-migrator-operator-b67b599dd-bbh8l\" (UID: \"decc4e59-7a45-4f03-bf5e-3ae08d304c61\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bbh8l" Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.822713 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3a6010be-2af4-4ced-97bd-bff2e07722a9-apiservice-cert\") pod \"packageserver-d55dfcdfc-77k8x\" (UID: 
\"3a6010be-2af4-4ced-97bd-bff2e07722a9\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-77k8x" Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.822736 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lxrsj\" (UID: \"8383a8d8-69ec-4706-8ea3-99ce91e5200c\") " pod="openshift-image-registry/image-registry-697d97f7c8-lxrsj" Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.822756 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gb2hf\" (UniqueName: \"kubernetes.io/projected/705cef55-4e4d-402a-93e5-d5e880c6424e-kube-api-access-gb2hf\") pod \"multus-admission-controller-857f4d67dd-dwg72\" (UID: \"705cef55-4e4d-402a-93e5-d5e880c6424e\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-dwg72" Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.822783 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/3a6010be-2af4-4ced-97bd-bff2e07722a9-tmpfs\") pod \"packageserver-d55dfcdfc-77k8x\" (UID: \"3a6010be-2af4-4ced-97bd-bff2e07722a9\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-77k8x" Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.822803 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rqpbs\" (UniqueName: \"kubernetes.io/projected/8383a8d8-69ec-4706-8ea3-99ce91e5200c-kube-api-access-rqpbs\") pod \"image-registry-697d97f7c8-lxrsj\" (UID: \"8383a8d8-69ec-4706-8ea3-99ce91e5200c\") " pod="openshift-image-registry/image-registry-697d97f7c8-lxrsj" Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.822837 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" 
(UniqueName: \"kubernetes.io/secret/3a6010be-2af4-4ced-97bd-bff2e07722a9-webhook-cert\") pod \"packageserver-d55dfcdfc-77k8x\" (UID: \"3a6010be-2af4-4ced-97bd-bff2e07722a9\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-77k8x" Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.822865 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8383a8d8-69ec-4706-8ea3-99ce91e5200c-trusted-ca\") pod \"image-registry-697d97f7c8-lxrsj\" (UID: \"8383a8d8-69ec-4706-8ea3-99ce91e5200c\") " pod="openshift-image-registry/image-registry-697d97f7c8-lxrsj" Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.822888 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3fcbd6a7-5b9f-4a6a-8705-a2110bc0d7e0-bound-sa-token\") pod \"ingress-operator-5b745b69d9-f48z9\" (UID: \"3fcbd6a7-5b9f-4a6a-8705-a2110bc0d7e0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-f48z9" Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.822905 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/27275499-8d56-431b-9f77-928a700c1dad-serving-cert\") pod \"service-ca-operator-777779d784-htbkb\" (UID: \"27275499-8d56-431b-9f77-928a700c1dad\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-htbkb" Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.822929 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6d9b96a2-533c-4a0e-bab8-e63340713c3b-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-2xrj5\" (UID: \"6d9b96a2-533c-4a0e-bab8-e63340713c3b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2xrj5" Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 
21:53:57.822950 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/decc4e59-7a45-4f03-bf5e-3ae08d304c61-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-bbh8l\" (UID: \"decc4e59-7a45-4f03-bf5e-3ae08d304c61\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bbh8l" Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.822976 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/c72b7da1-06f0-4785-9ea0-713fc758d759-node-bootstrap-token\") pod \"machine-config-server-hkjkr\" (UID: \"c72b7da1-06f0-4785-9ea0-713fc758d759\") " pod="openshift-machine-config-operator/machine-config-server-hkjkr" Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.823029 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/c72b7da1-06f0-4785-9ea0-713fc758d759-certs\") pod \"machine-config-server-hkjkr\" (UID: \"c72b7da1-06f0-4785-9ea0-713fc758d759\") " pod="openshift-machine-config-operator/machine-config-server-hkjkr" Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.823047 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-llmgl\" (UniqueName: \"kubernetes.io/projected/c72b7da1-06f0-4785-9ea0-713fc758d759-kube-api-access-llmgl\") pod \"machine-config-server-hkjkr\" (UID: \"c72b7da1-06f0-4785-9ea0-713fc758d759\") " pod="openshift-machine-config-operator/machine-config-server-hkjkr" Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.823071 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c9grm\" (UniqueName: \"kubernetes.io/projected/24606597-7caf-479b-b81f-5042f2ba3027-kube-api-access-c9grm\") pod 
\"migrator-59844c95c7-hgx8b\" (UID: \"24606597-7caf-479b-b81f-5042f2ba3027\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-hgx8b" Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.823087 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/decc4e59-7a45-4f03-bf5e-3ae08d304c61-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-bbh8l\" (UID: \"decc4e59-7a45-4f03-bf5e-3ae08d304c61\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bbh8l" Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.823105 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdgb2\" (UniqueName: \"kubernetes.io/projected/3fcbd6a7-5b9f-4a6a-8705-a2110bc0d7e0-kube-api-access-rdgb2\") pod \"ingress-operator-5b745b69d9-f48z9\" (UID: \"3fcbd6a7-5b9f-4a6a-8705-a2110bc0d7e0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-f48z9" Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.823121 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6d9b96a2-533c-4a0e-bab8-e63340713c3b-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-2xrj5\" (UID: \"6d9b96a2-533c-4a0e-bab8-e63340713c3b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2xrj5" Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.823142 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d47e6447-a8e0-4bf3-8317-5c23cff4e6a7-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-c62fh\" (UID: \"d47e6447-a8e0-4bf3-8317-5c23cff4e6a7\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-c62fh" Mar 10 21:53:57 crc 
kubenswrapper[4919]: I0310 21:53:57.823184 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d403c55b-6082-4056-8111-63f5833b28a6-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-bftfp\" (UID: \"d403c55b-6082-4056-8111-63f5833b28a6\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bftfp" Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.823215 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hsnwh\" (UniqueName: \"kubernetes.io/projected/1c07a718-9218-4471-b909-03975021d691-kube-api-access-hsnwh\") pod \"package-server-manager-789f6589d5-n8bb5\" (UID: \"1c07a718-9218-4471-b909-03975021d691\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-n8bb5" Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.823254 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d403c55b-6082-4056-8111-63f5833b28a6-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-bftfp\" (UID: \"d403c55b-6082-4056-8111-63f5833b28a6\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bftfp" Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.823271 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4q8pr\" (UniqueName: \"kubernetes.io/projected/3a6010be-2af4-4ced-97bd-bff2e07722a9-kube-api-access-4q8pr\") pod \"packageserver-d55dfcdfc-77k8x\" (UID: \"3a6010be-2af4-4ced-97bd-bff2e07722a9\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-77k8x" Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.823307 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/8383a8d8-69ec-4706-8ea3-99ce91e5200c-installation-pull-secrets\") pod \"image-registry-697d97f7c8-lxrsj\" (UID: \"8383a8d8-69ec-4706-8ea3-99ce91e5200c\") " pod="openshift-image-registry/image-registry-697d97f7c8-lxrsj" Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.823323 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nvrg5\" (UniqueName: \"kubernetes.io/projected/6d9b96a2-533c-4a0e-bab8-e63340713c3b-kube-api-access-nvrg5\") pod \"openshift-controller-manager-operator-756b6f6bc6-2xrj5\" (UID: \"6d9b96a2-533c-4a0e-bab8-e63340713c3b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2xrj5" Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.823342 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kpjrn\" (UniqueName: \"kubernetes.io/projected/d7afb497-bec3-4211-a9d9-c914438bdf59-kube-api-access-kpjrn\") pod \"dns-default-26f88\" (UID: \"d7afb497-bec3-4211-a9d9-c914438bdf59\") " pod="openshift-dns/dns-default-26f88" Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.823381 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3fcbd6a7-5b9f-4a6a-8705-a2110bc0d7e0-trusted-ca\") pod \"ingress-operator-5b745b69d9-f48z9\" (UID: \"3fcbd6a7-5b9f-4a6a-8705-a2110bc0d7e0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-f48z9" Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.824271 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3fcbd6a7-5b9f-4a6a-8705-a2110bc0d7e0-trusted-ca\") pod \"ingress-operator-5b745b69d9-f48z9\" (UID: \"3fcbd6a7-5b9f-4a6a-8705-a2110bc0d7e0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-f48z9" Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.825310 4919 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/decc4e59-7a45-4f03-bf5e-3ae08d304c61-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-bbh8l\" (UID: \"decc4e59-7a45-4f03-bf5e-3ae08d304c61\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bbh8l" Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.825876 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/3a6010be-2af4-4ced-97bd-bff2e07722a9-tmpfs\") pod \"packageserver-d55dfcdfc-77k8x\" (UID: \"3a6010be-2af4-4ced-97bd-bff2e07722a9\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-77k8x" Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.827209 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6d9b96a2-533c-4a0e-bab8-e63340713c3b-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-2xrj5\" (UID: \"6d9b96a2-533c-4a0e-bab8-e63340713c3b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2xrj5" Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.829291 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8383a8d8-69ec-4706-8ea3-99ce91e5200c-trusted-ca\") pod \"image-registry-697d97f7c8-lxrsj\" (UID: \"8383a8d8-69ec-4706-8ea3-99ce91e5200c\") " pod="openshift-image-registry/image-registry-697d97f7c8-lxrsj" Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.831290 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8383a8d8-69ec-4706-8ea3-99ce91e5200c-ca-trust-extracted\") pod \"image-registry-697d97f7c8-lxrsj\" (UID: \"8383a8d8-69ec-4706-8ea3-99ce91e5200c\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-lxrsj" Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.833343 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-m9qd4"] Mar 10 21:53:57 crc kubenswrapper[4919]: E0310 21:53:57.841483 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 21:53:58.3414528 +0000 UTC m=+225.583333408 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lxrsj" (UID: "8383a8d8-69ec-4706-8ea3-99ce91e5200c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.843485 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8a917335-6837-4601-a4e5-0c41252e4d83-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-kgftl\" (UID: \"8a917335-6837-4601-a4e5-0c41252e4d83\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kgftl" Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.844062 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d7afb497-bec3-4211-a9d9-c914438bdf59-config-volume\") pod \"dns-default-26f88\" (UID: \"d7afb497-bec3-4211-a9d9-c914438bdf59\") " pod="openshift-dns/dns-default-26f88" Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.844812 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/decc4e59-7a45-4f03-bf5e-3ae08d304c61-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-bbh8l\" (UID: \"decc4e59-7a45-4f03-bf5e-3ae08d304c61\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bbh8l" Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.844838 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/27275499-8d56-431b-9f77-928a700c1dad-config\") pod \"service-ca-operator-777779d784-htbkb\" (UID: \"27275499-8d56-431b-9f77-928a700c1dad\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-htbkb" Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.845323 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8383a8d8-69ec-4706-8ea3-99ce91e5200c-installation-pull-secrets\") pod \"image-registry-697d97f7c8-lxrsj\" (UID: \"8383a8d8-69ec-4706-8ea3-99ce91e5200c\") " pod="openshift-image-registry/image-registry-697d97f7c8-lxrsj" Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.845632 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d47e6447-a8e0-4bf3-8317-5c23cff4e6a7-config\") pod \"kube-controller-manager-operator-78b949d7b-c62fh\" (UID: \"d47e6447-a8e0-4bf3-8317-5c23cff4e6a7\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-c62fh" Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.846944 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8383a8d8-69ec-4706-8ea3-99ce91e5200c-registry-certificates\") pod \"image-registry-697d97f7c8-lxrsj\" (UID: \"8383a8d8-69ec-4706-8ea3-99ce91e5200c\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-lxrsj" Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.848152 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d403c55b-6082-4056-8111-63f5833b28a6-config\") pod \"kube-apiserver-operator-766d6c64bb-bftfp\" (UID: \"d403c55b-6082-4056-8111-63f5833b28a6\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bftfp" Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.851872 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8a917335-6837-4601-a4e5-0c41252e4d83-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-kgftl\" (UID: \"8a917335-6837-4601-a4e5-0c41252e4d83\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kgftl" Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.853312 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-74h67" Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.859265 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ls7k6" Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.861847 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6d9b96a2-533c-4a0e-bab8-e63340713c3b-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-2xrj5\" (UID: \"6d9b96a2-533c-4a0e-bab8-e63340713c3b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2xrj5" Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.862174 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3fcbd6a7-5b9f-4a6a-8705-a2110bc0d7e0-metrics-tls\") pod \"ingress-operator-5b745b69d9-f48z9\" (UID: \"3fcbd6a7-5b9f-4a6a-8705-a2110bc0d7e0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-f48z9" Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.862300 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8383a8d8-69ec-4706-8ea3-99ce91e5200c-registry-tls\") pod \"image-registry-697d97f7c8-lxrsj\" (UID: \"8383a8d8-69ec-4706-8ea3-99ce91e5200c\") " pod="openshift-image-registry/image-registry-697d97f7c8-lxrsj" Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.862707 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/425d1938-0668-4f53-aaee-dbc4a93297c7-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-qnjs8\" (UID: \"425d1938-0668-4f53-aaee-dbc4a93297c7\") " 
pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qnjs8" Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.863305 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zchq4"] Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.865038 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552992-kp6wz" Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.867871 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3a6010be-2af4-4ced-97bd-bff2e07722a9-webhook-cert\") pod \"packageserver-d55dfcdfc-77k8x\" (UID: \"3a6010be-2af4-4ced-97bd-bff2e07722a9\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-77k8x" Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.868693 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-mvwrl"] Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.869959 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/705cef55-4e4d-402a-93e5-d5e880c6424e-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-dwg72\" (UID: \"705cef55-4e4d-402a-93e5-d5e880c6424e\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-dwg72" Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.870529 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3a6010be-2af4-4ced-97bd-bff2e07722a9-apiservice-cert\") pod \"packageserver-d55dfcdfc-77k8x\" (UID: \"3a6010be-2af4-4ced-97bd-bff2e07722a9\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-77k8x" Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.870702 4919 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/27275499-8d56-431b-9f77-928a700c1dad-serving-cert\") pod \"service-ca-operator-777779d784-htbkb\" (UID: \"27275499-8d56-431b-9f77-928a700c1dad\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-htbkb" Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.872064 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d403c55b-6082-4056-8111-63f5833b28a6-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-bftfp\" (UID: \"d403c55b-6082-4056-8111-63f5833b28a6\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bftfp" Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.872274 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/1c07a718-9218-4471-b909-03975021d691-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-n8bb5\" (UID: \"1c07a718-9218-4471-b909-03975021d691\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-n8bb5" Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.874603 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d7afb497-bec3-4211-a9d9-c914438bdf59-metrics-tls\") pod \"dns-default-26f88\" (UID: \"d7afb497-bec3-4211-a9d9-c914438bdf59\") " pod="openshift-dns/dns-default-26f88" Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.875167 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c9grm\" (UniqueName: \"kubernetes.io/projected/24606597-7caf-479b-b81f-5042f2ba3027-kube-api-access-c9grm\") pod \"migrator-59844c95c7-hgx8b\" (UID: \"24606597-7caf-479b-b81f-5042f2ba3027\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-hgx8b" Mar 10 21:53:57 crc 
kubenswrapper[4919]: I0310 21:53:57.876605 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d47e6447-a8e0-4bf3-8317-5c23cff4e6a7-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-c62fh\" (UID: \"d47e6447-a8e0-4bf3-8317-5c23cff4e6a7\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-c62fh" Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.888081 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rqpbs\" (UniqueName: \"kubernetes.io/projected/8383a8d8-69ec-4706-8ea3-99ce91e5200c-kube-api-access-rqpbs\") pod \"image-registry-697d97f7c8-lxrsj\" (UID: \"8383a8d8-69ec-4706-8ea3-99ce91e5200c\") " pod="openshift-image-registry/image-registry-697d97f7c8-lxrsj" Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.908630 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3fcbd6a7-5b9f-4a6a-8705-a2110bc0d7e0-bound-sa-token\") pod \"ingress-operator-5b745b69d9-f48z9\" (UID: \"3fcbd6a7-5b9f-4a6a-8705-a2110bc0d7e0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-f48z9" Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.923862 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.924059 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/62cf0e58-2480-43ff-a9aa-f8543fefd9f9-socket-dir\") pod \"csi-hostpathplugin-nv6qp\" (UID: \"62cf0e58-2480-43ff-a9aa-f8543fefd9f9\") " 
pod="hostpath-provisioner/csi-hostpathplugin-nv6qp" Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.924091 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/62cf0e58-2480-43ff-a9aa-f8543fefd9f9-registration-dir\") pod \"csi-hostpathplugin-nv6qp\" (UID: \"62cf0e58-2480-43ff-a9aa-f8543fefd9f9\") " pod="hostpath-provisioner/csi-hostpathplugin-nv6qp" Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.924119 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-99fsq\" (UniqueName: \"kubernetes.io/projected/f3993e7f-64ea-43e3-a867-7465178fdf99-kube-api-access-99fsq\") pod \"ingress-canary-hfqfv\" (UID: \"f3993e7f-64ea-43e3-a867-7465178fdf99\") " pod="openshift-ingress-canary/ingress-canary-hfqfv" Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.924140 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-56ps7\" (UniqueName: \"kubernetes.io/projected/62cf0e58-2480-43ff-a9aa-f8543fefd9f9-kube-api-access-56ps7\") pod \"csi-hostpathplugin-nv6qp\" (UID: \"62cf0e58-2480-43ff-a9aa-f8543fefd9f9\") " pod="hostpath-provisioner/csi-hostpathplugin-nv6qp" Mar 10 21:53:57 crc kubenswrapper[4919]: E0310 21:53:57.924186 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 21:53:58.424162628 +0000 UTC m=+225.666043236 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.924227 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lxrsj\" (UID: \"8383a8d8-69ec-4706-8ea3-99ce91e5200c\") " pod="openshift-image-registry/image-registry-697d97f7c8-lxrsj" Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.924310 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/c72b7da1-06f0-4785-9ea0-713fc758d759-node-bootstrap-token\") pod \"machine-config-server-hkjkr\" (UID: \"c72b7da1-06f0-4785-9ea0-713fc758d759\") " pod="openshift-machine-config-operator/machine-config-server-hkjkr" Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.924359 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/c72b7da1-06f0-4785-9ea0-713fc758d759-certs\") pod \"machine-config-server-hkjkr\" (UID: \"c72b7da1-06f0-4785-9ea0-713fc758d759\") " pod="openshift-machine-config-operator/machine-config-server-hkjkr" Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.924384 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-llmgl\" (UniqueName: \"kubernetes.io/projected/c72b7da1-06f0-4785-9ea0-713fc758d759-kube-api-access-llmgl\") pod 
\"machine-config-server-hkjkr\" (UID: \"c72b7da1-06f0-4785-9ea0-713fc758d759\") " pod="openshift-machine-config-operator/machine-config-server-hkjkr" Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.924479 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/62cf0e58-2480-43ff-a9aa-f8543fefd9f9-socket-dir\") pod \"csi-hostpathplugin-nv6qp\" (UID: \"62cf0e58-2480-43ff-a9aa-f8543fefd9f9\") " pod="hostpath-provisioner/csi-hostpathplugin-nv6qp" Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.924525 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/62cf0e58-2480-43ff-a9aa-f8543fefd9f9-registration-dir\") pod \"csi-hostpathplugin-nv6qp\" (UID: \"62cf0e58-2480-43ff-a9aa-f8543fefd9f9\") " pod="hostpath-provisioner/csi-hostpathplugin-nv6qp" Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.924525 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f3993e7f-64ea-43e3-a867-7465178fdf99-cert\") pod \"ingress-canary-hfqfv\" (UID: \"f3993e7f-64ea-43e3-a867-7465178fdf99\") " pod="openshift-ingress-canary/ingress-canary-hfqfv" Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.924577 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/62cf0e58-2480-43ff-a9aa-f8543fefd9f9-csi-data-dir\") pod \"csi-hostpathplugin-nv6qp\" (UID: \"62cf0e58-2480-43ff-a9aa-f8543fefd9f9\") " pod="hostpath-provisioner/csi-hostpathplugin-nv6qp" Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.924595 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/62cf0e58-2480-43ff-a9aa-f8543fefd9f9-mountpoint-dir\") pod \"csi-hostpathplugin-nv6qp\" (UID: 
\"62cf0e58-2480-43ff-a9aa-f8543fefd9f9\") " pod="hostpath-provisioner/csi-hostpathplugin-nv6qp" Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.924616 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/62cf0e58-2480-43ff-a9aa-f8543fefd9f9-plugins-dir\") pod \"csi-hostpathplugin-nv6qp\" (UID: \"62cf0e58-2480-43ff-a9aa-f8543fefd9f9\") " pod="hostpath-provisioner/csi-hostpathplugin-nv6qp" Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.924693 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/62cf0e58-2480-43ff-a9aa-f8543fefd9f9-plugins-dir\") pod \"csi-hostpathplugin-nv6qp\" (UID: \"62cf0e58-2480-43ff-a9aa-f8543fefd9f9\") " pod="hostpath-provisioner/csi-hostpathplugin-nv6qp" Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.924798 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/62cf0e58-2480-43ff-a9aa-f8543fefd9f9-mountpoint-dir\") pod \"csi-hostpathplugin-nv6qp\" (UID: \"62cf0e58-2480-43ff-a9aa-f8543fefd9f9\") " pod="hostpath-provisioner/csi-hostpathplugin-nv6qp" Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.924888 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/62cf0e58-2480-43ff-a9aa-f8543fefd9f9-csi-data-dir\") pod \"csi-hostpathplugin-nv6qp\" (UID: \"62cf0e58-2480-43ff-a9aa-f8543fefd9f9\") " pod="hostpath-provisioner/csi-hostpathplugin-nv6qp" Mar 10 21:53:57 crc kubenswrapper[4919]: E0310 21:53:57.925161 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 21:53:58.425148354 +0000 UTC m=+225.667028952 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lxrsj" (UID: "8383a8d8-69ec-4706-8ea3-99ce91e5200c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.928649 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/c72b7da1-06f0-4785-9ea0-713fc758d759-certs\") pod \"machine-config-server-hkjkr\" (UID: \"c72b7da1-06f0-4785-9ea0-713fc758d759\") " pod="openshift-machine-config-operator/machine-config-server-hkjkr" Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.937175 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f3993e7f-64ea-43e3-a867-7465178fdf99-cert\") pod \"ingress-canary-hfqfv\" (UID: \"f3993e7f-64ea-43e3-a867-7465178fdf99\") " pod="openshift-ingress-canary/ingress-canary-hfqfv" Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.945819 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/c72b7da1-06f0-4785-9ea0-713fc758d759-node-bootstrap-token\") pod \"machine-config-server-hkjkr\" (UID: \"c72b7da1-06f0-4785-9ea0-713fc758d759\") " pod="openshift-machine-config-operator/machine-config-server-hkjkr" Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.953148 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8383a8d8-69ec-4706-8ea3-99ce91e5200c-bound-sa-token\") pod \"image-registry-697d97f7c8-lxrsj\" (UID: \"8383a8d8-69ec-4706-8ea3-99ce91e5200c\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-lxrsj" Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.967994 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d47e6447-a8e0-4bf3-8317-5c23cff4e6a7-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-c62fh\" (UID: \"d47e6447-a8e0-4bf3-8317-5c23cff4e6a7\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-c62fh" Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.970489 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-c62fh" Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.978295 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kpjrn\" (UniqueName: \"kubernetes.io/projected/d7afb497-bec3-4211-a9d9-c914438bdf59-kube-api-access-kpjrn\") pod \"dns-default-26f88\" (UID: \"d7afb497-bec3-4211-a9d9-c914438bdf59\") " pod="openshift-dns/dns-default-26f88" Mar 10 21:53:57 crc kubenswrapper[4919]: I0310 21:53:57.986246 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nvrg5\" (UniqueName: \"kubernetes.io/projected/6d9b96a2-533c-4a0e-bab8-e63340713c3b-kube-api-access-nvrg5\") pod \"openshift-controller-manager-operator-756b6f6bc6-2xrj5\" (UID: \"6d9b96a2-533c-4a0e-bab8-e63340713c3b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2xrj5" Mar 10 21:53:58 crc kubenswrapper[4919]: I0310 21:53:58.014708 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-h9zcs"] Mar 10 21:53:58 crc kubenswrapper[4919]: I0310 21:53:58.025976 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 21:53:58 crc kubenswrapper[4919]: E0310 21:53:58.026372 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 21:53:58.526356065 +0000 UTC m=+225.768236673 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 21:53:58 crc kubenswrapper[4919]: I0310 21:53:58.033186 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hjlh7\" (UniqueName: \"kubernetes.io/projected/decc4e59-7a45-4f03-bf5e-3ae08d304c61-kube-api-access-hjlh7\") pod \"kube-storage-version-migrator-operator-b67b599dd-bbh8l\" (UID: \"decc4e59-7a45-4f03-bf5e-3ae08d304c61\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bbh8l" Mar 10 21:53:58 crc kubenswrapper[4919]: I0310 21:53:58.034655 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q9686\" (UniqueName: \"kubernetes.io/projected/27275499-8d56-431b-9f77-928a700c1dad-kube-api-access-q9686\") pod \"service-ca-operator-777779d784-htbkb\" (UID: \"27275499-8d56-431b-9f77-928a700c1dad\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-htbkb" Mar 10 21:53:58 crc kubenswrapper[4919]: 
I0310 21:53:58.050334 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8a917335-6837-4601-a4e5-0c41252e4d83-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-kgftl\" (UID: \"8a917335-6837-4601-a4e5-0c41252e4d83\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kgftl" Mar 10 21:53:58 crc kubenswrapper[4919]: I0310 21:53:58.057869 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bbh8l" Mar 10 21:53:58 crc kubenswrapper[4919]: I0310 21:53:58.064196 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-hgx8b" Mar 10 21:53:58 crc kubenswrapper[4919]: I0310 21:53:58.070189 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4927\" (UniqueName: \"kubernetes.io/projected/425d1938-0668-4f53-aaee-dbc4a93297c7-kube-api-access-d4927\") pod \"control-plane-machine-set-operator-78cbb6b69f-qnjs8\" (UID: \"425d1938-0668-4f53-aaee-dbc4a93297c7\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qnjs8" Mar 10 21:53:58 crc kubenswrapper[4919]: I0310 21:53:58.094358 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d403c55b-6082-4056-8111-63f5833b28a6-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-bftfp\" (UID: \"d403c55b-6082-4056-8111-63f5833b28a6\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bftfp" Mar 10 21:53:58 crc kubenswrapper[4919]: I0310 21:53:58.107911 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gb2hf\" (UniqueName: 
\"kubernetes.io/projected/705cef55-4e4d-402a-93e5-d5e880c6424e-kube-api-access-gb2hf\") pod \"multus-admission-controller-857f4d67dd-dwg72\" (UID: \"705cef55-4e4d-402a-93e5-d5e880c6424e\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-dwg72" Mar 10 21:53:58 crc kubenswrapper[4919]: I0310 21:53:58.116768 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-htbkb" Mar 10 21:53:58 crc kubenswrapper[4919]: I0310 21:53:58.122763 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-26f88" Mar 10 21:53:58 crc kubenswrapper[4919]: I0310 21:53:58.127264 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lxrsj\" (UID: \"8383a8d8-69ec-4706-8ea3-99ce91e5200c\") " pod="openshift-image-registry/image-registry-697d97f7c8-lxrsj" Mar 10 21:53:58 crc kubenswrapper[4919]: E0310 21:53:58.127629 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 21:53:58.627613206 +0000 UTC m=+225.869493814 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lxrsj" (UID: "8383a8d8-69ec-4706-8ea3-99ce91e5200c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 21:53:58 crc kubenswrapper[4919]: I0310 21:53:58.137987 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hsnwh\" (UniqueName: \"kubernetes.io/projected/1c07a718-9218-4471-b909-03975021d691-kube-api-access-hsnwh\") pod \"package-server-manager-789f6589d5-n8bb5\" (UID: \"1c07a718-9218-4471-b909-03975021d691\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-n8bb5" Mar 10 21:53:58 crc kubenswrapper[4919]: I0310 21:53:58.180714 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdgb2\" (UniqueName: \"kubernetes.io/projected/3fcbd6a7-5b9f-4a6a-8705-a2110bc0d7e0-kube-api-access-rdgb2\") pod \"ingress-operator-5b745b69d9-f48z9\" (UID: \"3fcbd6a7-5b9f-4a6a-8705-a2110bc0d7e0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-f48z9" Mar 10 21:53:58 crc kubenswrapper[4919]: I0310 21:53:58.228191 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 21:53:58 crc kubenswrapper[4919]: I0310 21:53:58.228926 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-56ps7\" (UniqueName: \"kubernetes.io/projected/62cf0e58-2480-43ff-a9aa-f8543fefd9f9-kube-api-access-56ps7\") pod 
\"csi-hostpathplugin-nv6qp\" (UID: \"62cf0e58-2480-43ff-a9aa-f8543fefd9f9\") " pod="hostpath-provisioner/csi-hostpathplugin-nv6qp" Mar 10 21:53:58 crc kubenswrapper[4919]: E0310 21:53:58.229077 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 21:53:58.729065283 +0000 UTC m=+225.970945891 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 21:53:58 crc kubenswrapper[4919]: I0310 21:53:58.229155 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lxrsj\" (UID: \"8383a8d8-69ec-4706-8ea3-99ce91e5200c\") " pod="openshift-image-registry/image-registry-697d97f7c8-lxrsj" Mar 10 21:53:58 crc kubenswrapper[4919]: E0310 21:53:58.229577 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 21:53:58.729569787 +0000 UTC m=+225.971450385 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lxrsj" (UID: "8383a8d8-69ec-4706-8ea3-99ce91e5200c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 21:53:58 crc kubenswrapper[4919]: I0310 21:53:58.253277 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-99fsq\" (UniqueName: \"kubernetes.io/projected/f3993e7f-64ea-43e3-a867-7465178fdf99-kube-api-access-99fsq\") pod \"ingress-canary-hfqfv\" (UID: \"f3993e7f-64ea-43e3-a867-7465178fdf99\") " pod="openshift-ingress-canary/ingress-canary-hfqfv" Mar 10 21:53:58 crc kubenswrapper[4919]: I0310 21:53:58.259879 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4q8pr\" (UniqueName: \"kubernetes.io/projected/3a6010be-2af4-4ced-97bd-bff2e07722a9-kube-api-access-4q8pr\") pod \"packageserver-d55dfcdfc-77k8x\" (UID: \"3a6010be-2af4-4ced-97bd-bff2e07722a9\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-77k8x" Mar 10 21:53:58 crc kubenswrapper[4919]: I0310 21:53:58.260032 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2xrj5" Mar 10 21:53:58 crc kubenswrapper[4919]: I0310 21:53:58.281382 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-llmgl\" (UniqueName: \"kubernetes.io/projected/c72b7da1-06f0-4785-9ea0-713fc758d759-kube-api-access-llmgl\") pod \"machine-config-server-hkjkr\" (UID: \"c72b7da1-06f0-4785-9ea0-713fc758d759\") " pod="openshift-machine-config-operator/machine-config-server-hkjkr" Mar 10 21:53:58 crc kubenswrapper[4919]: I0310 21:53:58.321007 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-f48z9" Mar 10 21:53:58 crc kubenswrapper[4919]: I0310 21:53:58.328682 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bftfp" Mar 10 21:53:58 crc kubenswrapper[4919]: I0310 21:53:58.330654 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 21:53:58 crc kubenswrapper[4919]: E0310 21:53:58.330898 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 21:53:58.830881759 +0000 UTC m=+226.072762367 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 21:53:58 crc kubenswrapper[4919]: I0310 21:53:58.331198 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lxrsj\" (UID: \"8383a8d8-69ec-4706-8ea3-99ce91e5200c\") " pod="openshift-image-registry/image-registry-697d97f7c8-lxrsj" Mar 10 21:53:58 crc kubenswrapper[4919]: E0310 21:53:58.331675 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 21:53:58.831666101 +0000 UTC m=+226.073546709 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lxrsj" (UID: "8383a8d8-69ec-4706-8ea3-99ce91e5200c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 21:53:58 crc kubenswrapper[4919]: I0310 21:53:58.334441 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552985-n8r4j"] Mar 10 21:53:58 crc kubenswrapper[4919]: I0310 21:53:58.334741 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kgftl" Mar 10 21:53:58 crc kubenswrapper[4919]: I0310 21:53:58.343510 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qnjs8" Mar 10 21:53:58 crc kubenswrapper[4919]: I0310 21:53:58.388318 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-dwg72" Mar 10 21:53:58 crc kubenswrapper[4919]: I0310 21:53:58.424980 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-n8bb5" Mar 10 21:53:58 crc kubenswrapper[4919]: I0310 21:53:58.438343 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 21:53:58 crc kubenswrapper[4919]: E0310 21:53:58.438748 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 21:53:58.93873365 +0000 UTC m=+226.180614258 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 21:53:58 crc kubenswrapper[4919]: I0310 21:53:58.443795 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-77k8x" Mar 10 21:53:58 crc kubenswrapper[4919]: I0310 21:53:58.483699 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-hfqfv" Mar 10 21:53:58 crc kubenswrapper[4919]: I0310 21:53:58.495307 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-hkjkr" Mar 10 21:53:58 crc kubenswrapper[4919]: I0310 21:53:58.525278 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-nv6qp" Mar 10 21:53:58 crc kubenswrapper[4919]: E0310 21:53:58.568853 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 21:53:59.068831755 +0000 UTC m=+226.310712363 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lxrsj" (UID: "8383a8d8-69ec-4706-8ea3-99ce91e5200c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 21:53:58 crc kubenswrapper[4919]: I0310 21:53:58.564078 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lxrsj\" (UID: \"8383a8d8-69ec-4706-8ea3-99ce91e5200c\") " pod="openshift-image-registry/image-registry-697d97f7c8-lxrsj" Mar 10 21:53:58 crc kubenswrapper[4919]: I0310 21:53:58.583588 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-fb6tt" event={"ID":"b2c5effa-6a05-4978-b4a9-1daada6b2465","Type":"ContainerStarted","Data":"08f1062ce6a3f1d652d4c837882d352e48444bb8bee80245d71118283a022a14"} Mar 10 21:53:58 crc kubenswrapper[4919]: I0310 21:53:58.602780 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ingress/router-default-5444994796-hmjhm" event={"ID":"ca9a516e-afc2-4475-8af8-23504d17f9a9","Type":"ContainerStarted","Data":"16da5256f1356ff22aef93360d167c62c1b1e0963172a03486b553920168b7f3"} Mar 10 21:53:58 crc kubenswrapper[4919]: I0310 21:53:58.614966 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-7skkx" event={"ID":"c18a5bd2-fc75-4dce-8ffa-0ba191c52064","Type":"ContainerStarted","Data":"265e6c2eed301e94c67b3252344eda07c0a29370bc8444f1502d4a4c2ecac2a7"} Mar 10 21:53:58 crc kubenswrapper[4919]: I0310 21:53:58.620248 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-49z6f"] Mar 10 21:53:58 crc kubenswrapper[4919]: I0310 21:53:58.636963 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7g7s2"] Mar 10 21:53:58 crc kubenswrapper[4919]: I0310 21:53:58.640342 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552985-n8r4j" event={"ID":"55ff2223-69b6-4b72-9413-fce0c37ae2b2","Type":"ContainerStarted","Data":"d9950d42990d6edde09e55358f5f86128c06d97f8ca78c8a2b63d88cf90f7d6d"} Mar 10 21:53:58 crc kubenswrapper[4919]: I0310 21:53:58.642173 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-tk7xs"] Mar 10 21:53:58 crc kubenswrapper[4919]: I0310 21:53:58.651512 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-kjw8s"] Mar 10 21:53:58 crc kubenswrapper[4919]: I0310 21:53:58.654469 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-v2l77" event={"ID":"c097294b-dec5-400e-ba02-ccee86fcdb90","Type":"ContainerStarted","Data":"3a7762f688479711d1a8296d32cc182cb4260763a0b5d24ee584e65e7497c075"} Mar 10 21:53:58 crc 
kubenswrapper[4919]: I0310 21:53:58.654512 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-v2l77" event={"ID":"c097294b-dec5-400e-ba02-ccee86fcdb90","Type":"ContainerStarted","Data":"8d02bde085ffa032589fde934c0b3095d9fe0584107803451126670684a5defc"} Mar 10 21:53:58 crc kubenswrapper[4919]: I0310 21:53:58.662243 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-58nxf"] Mar 10 21:53:58 crc kubenswrapper[4919]: I0310 21:53:58.684260 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 21:53:58 crc kubenswrapper[4919]: E0310 21:53:58.684777 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 21:53:59.184762656 +0000 UTC m=+226.426643264 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 21:53:58 crc kubenswrapper[4919]: I0310 21:53:58.711552 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-zbds9" podStartSLOduration=163.711532493 podStartE2EDuration="2m43.711532493s" podCreationTimestamp="2026-03-10 21:51:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 21:53:58.711098012 +0000 UTC m=+225.952978620" watchObservedRunningTime="2026-03-10 21:53:58.711532493 +0000 UTC m=+225.953413101" Mar 10 21:53:58 crc kubenswrapper[4919]: I0310 21:53:58.745195 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-mvwrl" event={"ID":"69fc2ea4-6491-4109-8f4a-8b8fb369dcce","Type":"ContainerStarted","Data":"90cbfe332da978fc9bbce8a578c516d1e205c5b9622b9562e9d308f78a01a463"} Mar 10 21:53:58 crc kubenswrapper[4919]: I0310 21:53:58.759411 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552992-kp6wz"] Mar 10 21:53:58 crc kubenswrapper[4919]: I0310 21:53:58.766572 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-w5wb5" event={"ID":"23546924-f010-4ffb-8e0a-cde77e6b086f","Type":"ContainerStarted","Data":"50892d36d0ced7283c0d3f5adb75e405d4287509ab25dc86d59073a53855e1e3"} Mar 10 21:53:58 crc kubenswrapper[4919]: I0310 21:53:58.766609 4919 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-dns-operator/dns-operator-744455d44c-w5wb5" event={"ID":"23546924-f010-4ffb-8e0a-cde77e6b086f","Type":"ContainerStarted","Data":"4959f0246cbc3395fdde26a39c68b4e163e2bca5b9e246b7a439e8fbd86d0fb1"} Mar 10 21:53:58 crc kubenswrapper[4919]: I0310 21:53:58.773774 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qrdwk" event={"ID":"b5a5c848-4844-4280-a444-9173fff0b8e1","Type":"ContainerStarted","Data":"8ec33f4ae8fa521a7da9f222895fe7f6e99722a4507b598f83154fd42b7c1d64"} Mar 10 21:53:58 crc kubenswrapper[4919]: I0310 21:53:58.775454 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-h9zcs" event={"ID":"5e327c09-06d0-420c-b749-50306c5336b3","Type":"ContainerStarted","Data":"ddd4c112e2926e2e2b951c795d7a8e90fcaa888eef92d8bc3113ac3e2e3aee6a"} Mar 10 21:53:58 crc kubenswrapper[4919]: I0310 21:53:58.776486 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-m9qd4" event={"ID":"b00c04d4-1287-409a-8e67-2edb888bf832","Type":"ContainerStarted","Data":"ce5d5f8dcf9a26afe7efcd91551d2cc675f135275680a3d9f398dbd0932c21ac"} Mar 10 21:53:58 crc kubenswrapper[4919]: I0310 21:53:58.777745 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zchq4" event={"ID":"d8b972b3-02f6-4c31-bb8e-0c229ea48621","Type":"ContainerStarted","Data":"5e26e357d54bff94fa88f442d3bf2242c3cce3ac7ecf25833f8336f90b4d2ce2"} Mar 10 21:53:58 crc kubenswrapper[4919]: I0310 21:53:58.781752 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ttjzg" event={"ID":"077de36b-affe-4c2b-905d-38ae64514274","Type":"ContainerStarted","Data":"9b56724e0b3c7beef373a5d05631e61241f799af9d2ce26d437b33224ea1c05d"} Mar 10 21:53:58 crc kubenswrapper[4919]: I0310 21:53:58.786290 4919 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lxrsj\" (UID: \"8383a8d8-69ec-4706-8ea3-99ce91e5200c\") " pod="openshift-image-registry/image-registry-697d97f7c8-lxrsj" Mar 10 21:53:58 crc kubenswrapper[4919]: E0310 21:53:58.786754 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 21:53:59.286730347 +0000 UTC m=+226.528611155 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lxrsj" (UID: "8383a8d8-69ec-4706-8ea3-99ce91e5200c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 21:53:58 crc kubenswrapper[4919]: I0310 21:53:58.788419 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-zbds9" Mar 10 21:53:58 crc kubenswrapper[4919]: I0310 21:53:58.832238 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4wvzh"] Mar 10 21:53:58 crc kubenswrapper[4919]: I0310 21:53:58.846352 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-j6q5w"] Mar 10 21:53:58 crc kubenswrapper[4919]: I0310 21:53:58.849248 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-6nwch"] 
Mar 10 21:53:58 crc kubenswrapper[4919]: I0310 21:53:58.887537 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 21:53:58 crc kubenswrapper[4919]: E0310 21:53:58.888821 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 21:53:59.388806751 +0000 UTC m=+226.630687359 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 21:53:58 crc kubenswrapper[4919]: I0310 21:53:58.919783 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-2dzqk" podStartSLOduration=163.919764482 podStartE2EDuration="2m43.919764482s" podCreationTimestamp="2026-03-10 21:51:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 21:53:58.918820996 +0000 UTC m=+226.160701604" watchObservedRunningTime="2026-03-10 21:53:58.919764482 +0000 UTC m=+226.161645080" Mar 10 21:53:58 crc kubenswrapper[4919]: I0310 21:53:58.989409 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lxrsj\" (UID: \"8383a8d8-69ec-4706-8ea3-99ce91e5200c\") " pod="openshift-image-registry/image-registry-697d97f7c8-lxrsj" Mar 10 21:53:58 crc kubenswrapper[4919]: E0310 21:53:58.990250 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 21:53:59.490238346 +0000 UTC m=+226.732118954 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lxrsj" (UID: "8383a8d8-69ec-4706-8ea3-99ce91e5200c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 21:53:59 crc kubenswrapper[4919]: W0310 21:53:59.002426 4919 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod99f42fb7_eaa5_46d2_9443_81ad7a563cec.slice/crio-25c432ccaf55afa18c9c52acdd64b3e00f55f480c278c8fe53ec4ac7febf5842 WatchSource:0}: Error finding container 25c432ccaf55afa18c9c52acdd64b3e00f55f480c278c8fe53ec4ac7febf5842: Status 404 returned error can't find the container with id 25c432ccaf55afa18c9c52acdd64b3e00f55f480c278c8fe53ec4ac7febf5842 Mar 10 21:53:59 crc kubenswrapper[4919]: W0310 21:53:59.004218 4919 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6350fa6a_9381_4de8_8a8e_4a14b9d253bc.slice/crio-4a72fb071645574f8bbf169c77e00153fd3f0f9dd4e66b1ecb5f59bf7d69ba14 WatchSource:0}: Error 
finding container 4a72fb071645574f8bbf169c77e00153fd3f0f9dd4e66b1ecb5f59bf7d69ba14: Status 404 returned error can't find the container with id 4a72fb071645574f8bbf169c77e00153fd3f0f9dd4e66b1ecb5f59bf7d69ba14 Mar 10 21:53:59 crc kubenswrapper[4919]: I0310 21:53:59.005582 4919 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 10 21:53:59 crc kubenswrapper[4919]: I0310 21:53:59.090606 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 21:53:59 crc kubenswrapper[4919]: E0310 21:53:59.090950 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 21:53:59.590933603 +0000 UTC m=+226.832814211 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 21:53:59 crc kubenswrapper[4919]: I0310 21:53:59.187868 4919 patch_prober.go:28] interesting pod/machine-config-daemon-z7v4t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 21:53:59 crc kubenswrapper[4919]: I0310 21:53:59.188234 4919 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 21:53:59 crc kubenswrapper[4919]: I0310 21:53:59.192920 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lxrsj\" (UID: \"8383a8d8-69ec-4706-8ea3-99ce91e5200c\") " pod="openshift-image-registry/image-registry-697d97f7c8-lxrsj" Mar 10 21:53:59 crc kubenswrapper[4919]: E0310 21:53:59.193938 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-10 21:53:59.693925672 +0000 UTC m=+226.935806280 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lxrsj" (UID: "8383a8d8-69ec-4706-8ea3-99ce91e5200c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 21:53:59 crc kubenswrapper[4919]: I0310 21:53:59.267516 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-cb8tv"] Mar 10 21:53:59 crc kubenswrapper[4919]: I0310 21:53:59.300958 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 21:53:59 crc kubenswrapper[4919]: E0310 21:53:59.301737 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 21:53:59.801719951 +0000 UTC m=+227.043600559 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 21:53:59 crc kubenswrapper[4919]: I0310 21:53:59.322998 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-hgx8b"] Mar 10 21:53:59 crc kubenswrapper[4919]: I0310 21:53:59.337590 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-htbkb"] Mar 10 21:53:59 crc kubenswrapper[4919]: I0310 21:53:59.339899 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bbh8l"] Mar 10 21:53:59 crc kubenswrapper[4919]: I0310 21:53:59.340900 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-74h67"] Mar 10 21:53:59 crc kubenswrapper[4919]: I0310 21:53:59.409055 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lxrsj\" (UID: \"8383a8d8-69ec-4706-8ea3-99ce91e5200c\") " pod="openshift-image-registry/image-registry-697d97f7c8-lxrsj" Mar 10 21:53:59 crc kubenswrapper[4919]: E0310 21:53:59.409514 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-10 21:53:59.909497 +0000 UTC m=+227.151377618 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lxrsj" (UID: "8383a8d8-69ec-4706-8ea3-99ce91e5200c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 21:53:59 crc kubenswrapper[4919]: I0310 21:53:59.472180 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-26f88"] Mar 10 21:53:59 crc kubenswrapper[4919]: I0310 21:53:59.510213 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 21:53:59 crc kubenswrapper[4919]: E0310 21:53:59.510661 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 21:54:00.010641518 +0000 UTC m=+227.252522126 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 21:53:59 crc kubenswrapper[4919]: I0310 21:53:59.618867 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lxrsj\" (UID: \"8383a8d8-69ec-4706-8ea3-99ce91e5200c\") " pod="openshift-image-registry/image-registry-697d97f7c8-lxrsj" Mar 10 21:53:59 crc kubenswrapper[4919]: E0310 21:53:59.619321 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 21:54:00.11930604 +0000 UTC m=+227.361186648 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lxrsj" (UID: "8383a8d8-69ec-4706-8ea3-99ce91e5200c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 21:53:59 crc kubenswrapper[4919]: I0310 21:53:59.643266 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-c62fh"] Mar 10 21:53:59 crc kubenswrapper[4919]: I0310 21:53:59.722527 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 21:53:59 crc kubenswrapper[4919]: E0310 21:53:59.722674 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 21:54:00.222651429 +0000 UTC m=+227.464532037 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 21:53:59 crc kubenswrapper[4919]: I0310 21:53:59.723194 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lxrsj\" (UID: \"8383a8d8-69ec-4706-8ea3-99ce91e5200c\") " pod="openshift-image-registry/image-registry-697d97f7c8-lxrsj" Mar 10 21:53:59 crc kubenswrapper[4919]: E0310 21:53:59.723630 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 21:54:00.223618535 +0000 UTC m=+227.465499143 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lxrsj" (UID: "8383a8d8-69ec-4706-8ea3-99ce91e5200c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 21:53:59 crc kubenswrapper[4919]: I0310 21:53:59.767732 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ls7k6" podStartSLOduration=164.767714823 podStartE2EDuration="2m44.767714823s" podCreationTimestamp="2026-03-10 21:51:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 21:53:59.767239971 +0000 UTC m=+227.009120579" watchObservedRunningTime="2026-03-10 21:53:59.767714823 +0000 UTC m=+227.009595421" Mar 10 21:53:59 crc kubenswrapper[4919]: I0310 21:53:59.807657 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-m9qd4" event={"ID":"b00c04d4-1287-409a-8e67-2edb888bf832","Type":"ContainerStarted","Data":"3182fbba7215923d1c2dd9566bc190ce9c0ce691eebc62e48da0e1b854745d0e"} Mar 10 21:53:59 crc kubenswrapper[4919]: I0310 21:53:59.808753 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-m9qd4" Mar 10 21:53:59 crc kubenswrapper[4919]: I0310 21:53:59.818442 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-49z6f" event={"ID":"fb7623ea-ec67-4061-82a6-4099e52fa3b9","Type":"ContainerStarted","Data":"d44ca9a728443fc70d6624c52dedc11c82e0f00913209abd10b7e0794a053738"} Mar 10 21:53:59 crc kubenswrapper[4919]: I0310 21:53:59.818484 4919 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-49z6f" event={"ID":"fb7623ea-ec67-4061-82a6-4099e52fa3b9","Type":"ContainerStarted","Data":"bab7cc555a1825bd928c8305127bbca22a13e45943ffeae2d181b8e4110381fa"} Mar 10 21:53:59 crc kubenswrapper[4919]: I0310 21:53:59.822220 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-49z6f" Mar 10 21:53:59 crc kubenswrapper[4919]: I0310 21:53:59.823923 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 21:53:59 crc kubenswrapper[4919]: E0310 21:53:59.824345 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 21:54:00.324326752 +0000 UTC m=+227.566207360 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 21:53:59 crc kubenswrapper[4919]: I0310 21:53:59.828292 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bbh8l" event={"ID":"decc4e59-7a45-4f03-bf5e-3ae08d304c61","Type":"ContainerStarted","Data":"9579de8e59c8519d70d90d70ebe852157cb25f5f0659194a6ab0bbde3ebc1cf5"} Mar 10 21:53:59 crc kubenswrapper[4919]: I0310 21:53:59.841275 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552992-kp6wz" event={"ID":"99f42fb7-eaa5-46d2-9443-81ad7a563cec","Type":"ContainerStarted","Data":"25c432ccaf55afa18c9c52acdd64b3e00f55f480c278c8fe53ec4ac7febf5842"} Mar 10 21:53:59 crc kubenswrapper[4919]: I0310 21:53:59.856445 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-c62fh" event={"ID":"d47e6447-a8e0-4bf3-8317-5c23cff4e6a7","Type":"ContainerStarted","Data":"32f8a440d801f9394e17e485e40fd31313bfd40a02eac650fb42b7d6c64a602a"} Mar 10 21:53:59 crc kubenswrapper[4919]: I0310 21:53:59.866265 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-7skkx" event={"ID":"c18a5bd2-fc75-4dce-8ffa-0ba191c52064","Type":"ContainerStarted","Data":"f9e6df98448cb1345593fd184b81345d2aa3852b60826384b8a1b68c514476a8"} Mar 10 21:53:59 crc kubenswrapper[4919]: I0310 21:53:59.882662 4919 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ttjzg" podStartSLOduration=164.882645217 podStartE2EDuration="2m44.882645217s" podCreationTimestamp="2026-03-10 21:51:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 21:53:59.881840595 +0000 UTC m=+227.123721203" watchObservedRunningTime="2026-03-10 21:53:59.882645217 +0000 UTC m=+227.124525825" Mar 10 21:53:59 crc kubenswrapper[4919]: I0310 21:53:59.888119 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-v2l77" event={"ID":"c097294b-dec5-400e-ba02-ccee86fcdb90","Type":"ContainerStarted","Data":"e4e0f8b19db3e34d950adc1cfd57f7457069dbbd6e44da8e488306e23877bab8"} Mar 10 21:53:59 crc kubenswrapper[4919]: I0310 21:53:59.906829 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qnjs8"] Mar 10 21:53:59 crc kubenswrapper[4919]: I0310 21:53:59.912854 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-m9qd4" podStartSLOduration=164.912839187 podStartE2EDuration="2m44.912839187s" podCreationTimestamp="2026-03-10 21:51:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 21:53:59.910169474 +0000 UTC m=+227.152050082" watchObservedRunningTime="2026-03-10 21:53:59.912839187 +0000 UTC m=+227.154719795" Mar 10 21:53:59 crc kubenswrapper[4919]: I0310 21:53:59.912885 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-mvwrl" event={"ID":"69fc2ea4-6491-4109-8f4a-8b8fb369dcce","Type":"ContainerStarted","Data":"39c7f42b9f9d12179f59769147f9cb7d428ff7b3e4b59dbec3c7b954b17cc1c1"} Mar 10 21:53:59 crc 
kubenswrapper[4919]: I0310 21:53:59.912922 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-mvwrl" event={"ID":"69fc2ea4-6491-4109-8f4a-8b8fb369dcce","Type":"ContainerStarted","Data":"82482334cd5c0a34117e7beaee642a979917a4d668e8be2f6f7fcc8c0b9b13d6"} Mar 10 21:53:59 crc kubenswrapper[4919]: I0310 21:53:59.925573 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zchq4" event={"ID":"d8b972b3-02f6-4c31-bb8e-0c229ea48621","Type":"ContainerStarted","Data":"aeff175ea13d7c637c4c53af04d7fcdc763cfe1ec977a71203bf0b5f60d44545"} Mar 10 21:53:59 crc kubenswrapper[4919]: I0310 21:53:59.926581 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lxrsj\" (UID: \"8383a8d8-69ec-4706-8ea3-99ce91e5200c\") " pod="openshift-image-registry/image-registry-697d97f7c8-lxrsj" Mar 10 21:53:59 crc kubenswrapper[4919]: E0310 21:53:59.926868 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 21:54:00.426855868 +0000 UTC m=+227.668736476 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lxrsj" (UID: "8383a8d8-69ec-4706-8ea3-99ce91e5200c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 21:53:59 crc kubenswrapper[4919]: I0310 21:53:59.927981 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-4dp67" Mar 10 21:53:59 crc kubenswrapper[4919]: I0310 21:53:59.942000 4919 generic.go:334] "Generic (PLEG): container finished" podID="b2c5effa-6a05-4978-b4a9-1daada6b2465" containerID="9f1306115b3ba9f712d12f77658cc55f5fd8f632397fcdb32dde2659b13c5b38" exitCode=0 Mar 10 21:53:59 crc kubenswrapper[4919]: I0310 21:53:59.942115 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-fb6tt" event={"ID":"b2c5effa-6a05-4978-b4a9-1daada6b2465","Type":"ContainerDied","Data":"9f1306115b3ba9f712d12f77658cc55f5fd8f632397fcdb32dde2659b13c5b38"} Mar 10 21:53:59 crc kubenswrapper[4919]: I0310 21:53:59.947455 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-w5wb5" event={"ID":"23546924-f010-4ffb-8e0a-cde77e6b086f","Type":"ContainerStarted","Data":"8b06f2a119b4085be708ad9c2d5818ca6834fca7f071fd61b560a1aadaa648b5"} Mar 10 21:53:59 crc kubenswrapper[4919]: I0310 21:53:59.956024 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-hgx8b" event={"ID":"24606597-7caf-479b-b81f-5042f2ba3027","Type":"ContainerStarted","Data":"e2721cf145bd55bbc83a120c8f4924a5057efcae81bd81b49d6c4716aca39c67"} Mar 10 21:53:59 crc kubenswrapper[4919]: I0310 21:53:59.956956 4919 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-49z6f" podStartSLOduration=164.956935306 podStartE2EDuration="2m44.956935306s" podCreationTimestamp="2026-03-10 21:51:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 21:53:59.950901871 +0000 UTC m=+227.192782469" watchObservedRunningTime="2026-03-10 21:53:59.956935306 +0000 UTC m=+227.198815934" Mar 10 21:53:59 crc kubenswrapper[4919]: I0310 21:53:59.978287 4919 generic.go:334] "Generic (PLEG): container finished" podID="b5a5c848-4844-4280-a444-9173fff0b8e1" containerID="5100429aabf971aaf3f72dc6593cfc7a7cfa2c43b509aea35371f2ac0a040c3b" exitCode=0 Mar 10 21:53:59 crc kubenswrapper[4919]: I0310 21:53:59.978465 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qrdwk" event={"ID":"b5a5c848-4844-4280-a444-9173fff0b8e1","Type":"ContainerDied","Data":"5100429aabf971aaf3f72dc6593cfc7a7cfa2c43b509aea35371f2ac0a040c3b"} Mar 10 21:53:59 crc kubenswrapper[4919]: I0310 21:53:59.984544 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-kjw8s" event={"ID":"6350fa6a-9381-4de8-8a8e-4a14b9d253bc","Type":"ContainerStarted","Data":"4a72fb071645574f8bbf169c77e00153fd3f0f9dd4e66b1ecb5f59bf7d69ba14"} Mar 10 21:54:00 crc kubenswrapper[4919]: I0310 21:54:00.006020 4919 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-m9qd4 container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.13:6443/healthz\": dial tcp 10.217.0.13:6443: connect: connection refused" start-of-body= Mar 10 21:54:00 crc kubenswrapper[4919]: I0310 21:54:00.006055 4919 patch_prober.go:28] interesting pod/downloads-7954f5f757-49z6f container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": 
dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Mar 10 21:54:00 crc kubenswrapper[4919]: I0310 21:54:00.006071 4919 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-m9qd4" podUID="b00c04d4-1287-409a-8e67-2edb888bf832" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.13:6443/healthz\": dial tcp 10.217.0.13:6443: connect: connection refused" Mar 10 21:54:00 crc kubenswrapper[4919]: I0310 21:54:00.006109 4919 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-49z6f" podUID="fb7623ea-ec67-4061-82a6-4099e52fa3b9" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Mar 10 21:54:00 crc kubenswrapper[4919]: I0310 21:54:00.017466 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-htbkb" event={"ID":"27275499-8d56-431b-9f77-928a700c1dad","Type":"ContainerStarted","Data":"2b3acfe23e3a1b8c714ac3db234dd17517706b991879c9cb8d4e263f101b7524"} Mar 10 21:54:00 crc kubenswrapper[4919]: I0310 21:54:00.028956 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 21:54:00 crc kubenswrapper[4919]: E0310 21:54:00.029847 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 21:54:00.529818286 +0000 UTC m=+227.771698894 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 21:54:00 crc kubenswrapper[4919]: I0310 21:54:00.034341 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lxrsj\" (UID: \"8383a8d8-69ec-4706-8ea3-99ce91e5200c\") " pod="openshift-image-registry/image-registry-697d97f7c8-lxrsj" Mar 10 21:54:00 crc kubenswrapper[4919]: E0310 21:54:00.034718 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 21:54:00.534690218 +0000 UTC m=+227.776570826 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lxrsj" (UID: "8383a8d8-69ec-4706-8ea3-99ce91e5200c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 21:54:00 crc kubenswrapper[4919]: I0310 21:54:00.040570 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-j6q5w" event={"ID":"0c56f0ba-c8bc-41be-8a14-3c57051ebfda","Type":"ContainerStarted","Data":"7c942366f5dd61f8deb0c85058a7cb7d020101e255af930f2c81b0469ba4fd92"} Mar 10 21:54:00 crc kubenswrapper[4919]: I0310 21:54:00.043240 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-v2l77" podStartSLOduration=165.04323018 podStartE2EDuration="2m45.04323018s" podCreationTimestamp="2026-03-10 21:51:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 21:54:00.04286132 +0000 UTC m=+227.284741928" watchObservedRunningTime="2026-03-10 21:54:00.04323018 +0000 UTC m=+227.285110788" Mar 10 21:54:00 crc kubenswrapper[4919]: I0310 21:54:00.055718 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-tk7xs" event={"ID":"be41b09e-a8ff-4367-a68d-865f047e2549","Type":"ContainerStarted","Data":"1a9115de1b9b20728ad7a9875a64a22506575a7e4340c4e5055cee337bbf568d"} Mar 10 21:54:00 crc kubenswrapper[4919]: I0310 21:54:00.072452 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-hkjkr" 
event={"ID":"c72b7da1-06f0-4785-9ea0-713fc758d759","Type":"ContainerStarted","Data":"0c8553a24693534dc549ebdb48cece7916e153fdaf71f214779af58dd966d871"} Mar 10 21:54:00 crc kubenswrapper[4919]: I0310 21:54:00.099655 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-74h67" event={"ID":"32d0aafd-3bc5-4173-86df-ce624028b1a2","Type":"ContainerStarted","Data":"d9f592a0023f1416e0811c2fac9307dab26a6d97cbcef18f6695610252dc98bf"} Mar 10 21:54:00 crc kubenswrapper[4919]: I0310 21:54:00.116495 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-26f88" event={"ID":"d7afb497-bec3-4211-a9d9-c914438bdf59","Type":"ContainerStarted","Data":"95f5229f94bf848174db21fd22183efdec9d556cf75419c792c06ddb45da8c83"} Mar 10 21:54:00 crc kubenswrapper[4919]: I0310 21:54:00.128626 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4wvzh" event={"ID":"6ace15bc-71c5-45e7-8791-5d59045c73b9","Type":"ContainerStarted","Data":"ae337511d41748acb8463b24741355fc20d13976a1ccbfcfd28752a91f776286"} Mar 10 21:54:00 crc kubenswrapper[4919]: I0310 21:54:00.135400 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 21:54:00 crc kubenswrapper[4919]: E0310 21:54:00.136226 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 21:54:00.636202847 +0000 UTC m=+227.878083465 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 21:54:00 crc kubenswrapper[4919]: I0310 21:54:00.140373 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lxrsj\" (UID: \"8383a8d8-69ec-4706-8ea3-99ce91e5200c\") " pod="openshift-image-registry/image-registry-697d97f7c8-lxrsj" Mar 10 21:54:00 crc kubenswrapper[4919]: E0310 21:54:00.145915 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 21:54:00.645901491 +0000 UTC m=+227.887782099 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lxrsj" (UID: "8383a8d8-69ec-4706-8ea3-99ce91e5200c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 21:54:00 crc kubenswrapper[4919]: I0310 21:54:00.146727 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-h9zcs" event={"ID":"5e327c09-06d0-420c-b749-50306c5336b3","Type":"ContainerStarted","Data":"5439290f33ce4f979e777820e28b97c50d90fcf16920f81c42fd87153d191928"} Mar 10 21:54:00 crc kubenswrapper[4919]: I0310 21:54:00.148027 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-h9zcs" Mar 10 21:54:00 crc kubenswrapper[4919]: I0310 21:54:00.159705 4919 patch_prober.go:28] interesting pod/console-operator-58897d9998-h9zcs container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.10:8443/readyz\": dial tcp 10.217.0.10:8443: connect: connection refused" start-of-body= Mar 10 21:54:00 crc kubenswrapper[4919]: I0310 21:54:00.159769 4919 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-h9zcs" podUID="5e327c09-06d0-420c-b749-50306c5336b3" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.10:8443/readyz\": dial tcp 10.217.0.10:8443: connect: connection refused" Mar 10 21:54:00 crc kubenswrapper[4919]: I0310 21:54:00.161514 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-6nwch" 
event={"ID":"f779c56d-d87a-4e36-8a8d-f9d93bb2b7e0","Type":"ContainerStarted","Data":"3cb7873b5b3854c8fc0cfdcf22d52a7793ae72dfa12bd59e2a1c1999bb056308"} Mar 10 21:54:00 crc kubenswrapper[4919]: I0310 21:54:00.173686 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-w5wb5" podStartSLOduration=165.173666614 podStartE2EDuration="2m45.173666614s" podCreationTimestamp="2026-03-10 21:51:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 21:54:00.160539408 +0000 UTC m=+227.402420016" watchObservedRunningTime="2026-03-10 21:54:00.173666614 +0000 UTC m=+227.415547222" Mar 10 21:54:00 crc kubenswrapper[4919]: I0310 21:54:00.175748 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552994-rvxmh"] Mar 10 21:54:00 crc kubenswrapper[4919]: I0310 21:54:00.176301 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552994-rvxmh" Mar 10 21:54:00 crc kubenswrapper[4919]: I0310 21:54:00.182678 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wbvtv" Mar 10 21:54:00 crc kubenswrapper[4919]: I0310 21:54:00.191495 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552994-rvxmh"] Mar 10 21:54:00 crc kubenswrapper[4919]: I0310 21:54:00.213928 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ttjzg" event={"ID":"077de36b-affe-4c2b-905d-38ae64514274","Type":"ContainerStarted","Data":"df27bf433ebb32ce34fb33bc3703869945e3fe5c790d79ea7891c2411140ebf8"} Mar 10 21:54:00 crc kubenswrapper[4919]: I0310 21:54:00.240449 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-mvwrl" podStartSLOduration=165.240433369 podStartE2EDuration="2m45.240433369s" podCreationTimestamp="2026-03-10 21:51:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 21:54:00.235891006 +0000 UTC m=+227.477771614" watchObservedRunningTime="2026-03-10 21:54:00.240433369 +0000 UTC m=+227.482313977" Mar 10 21:54:00 crc kubenswrapper[4919]: I0310 21:54:00.244975 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552985-n8r4j" event={"ID":"55ff2223-69b6-4b72-9413-fce0c37ae2b2","Type":"ContainerStarted","Data":"ff6c5d94828d4986d3c2921507606c68566c8b8f1db53d6e9faed67db575e663"} Mar 10 21:54:00 crc kubenswrapper[4919]: I0310 21:54:00.246775 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 21:54:00 crc kubenswrapper[4919]: E0310 21:54:00.247735 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 21:54:00.747722737 +0000 UTC m=+227.989603345 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 21:54:00 crc kubenswrapper[4919]: I0310 21:54:00.263261 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7g7s2" event={"ID":"19dc609c-7c35-46d2-b621-34faa138eedd","Type":"ContainerStarted","Data":"16c5f356ac1ae9007ec3ba1ac4f6f6069dbd29840d0a1234461d9f5e5db8bc5c"} Mar 10 21:54:00 crc kubenswrapper[4919]: I0310 21:54:00.263300 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7g7s2" event={"ID":"19dc609c-7c35-46d2-b621-34faa138eedd","Type":"ContainerStarted","Data":"f23da023ee368118aa7548c5d72bb2089a2637ef6403eec46fe99fa54d7fc23e"} Mar 10 21:54:00 crc kubenswrapper[4919]: I0310 21:54:00.287668 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-h9zcs" podStartSLOduration=165.287654392 
podStartE2EDuration="2m45.287654392s" podCreationTimestamp="2026-03-10 21:51:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 21:54:00.285826512 +0000 UTC m=+227.527707130" watchObservedRunningTime="2026-03-10 21:54:00.287654392 +0000 UTC m=+227.529534990" Mar 10 21:54:00 crc kubenswrapper[4919]: I0310 21:54:00.298956 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-58nxf" event={"ID":"9657873d-9275-4945-9e91-0b2c2844ae5d","Type":"ContainerStarted","Data":"955ca8edae6c79d599f2bf02aecbfedc235309a0fb0e56cd8be38b4fb21c310e"} Mar 10 21:54:00 crc kubenswrapper[4919]: I0310 21:54:00.302677 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-hmjhm" event={"ID":"ca9a516e-afc2-4475-8af8-23504d17f9a9","Type":"ContainerStarted","Data":"a38e5af1d0c7d64477ffa2081f4f5bb3fcbe087bcf6e4513223ee4e66cd57074"} Mar 10 21:54:00 crc kubenswrapper[4919]: I0310 21:54:00.326413 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-cb8tv" event={"ID":"0aa7f443-95c5-43ae-bd23-f4204d5f8778","Type":"ContainerStarted","Data":"85b4b1461deb2e3b3b365245c8a398b48fd9041e5adacf77786e2f5154280bd7"} Mar 10 21:54:00 crc kubenswrapper[4919]: I0310 21:54:00.343537 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-hmjhm" podStartSLOduration=165.34351649 podStartE2EDuration="2m45.34351649s" podCreationTimestamp="2026-03-10 21:51:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 21:54:00.341108385 +0000 UTC m=+227.582988993" watchObservedRunningTime="2026-03-10 21:54:00.34351649 +0000 UTC m=+227.585397118" Mar 10 21:54:00 crc kubenswrapper[4919]: I0310 21:54:00.348551 4919 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rp685\" (UniqueName: \"kubernetes.io/projected/cac0bc08-6186-43fb-bebe-036c98331599-kube-api-access-rp685\") pod \"auto-csr-approver-29552994-rvxmh\" (UID: \"cac0bc08-6186-43fb-bebe-036c98331599\") " pod="openshift-infra/auto-csr-approver-29552994-rvxmh" Mar 10 21:54:00 crc kubenswrapper[4919]: I0310 21:54:00.348608 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lxrsj\" (UID: \"8383a8d8-69ec-4706-8ea3-99ce91e5200c\") " pod="openshift-image-registry/image-registry-697d97f7c8-lxrsj" Mar 10 21:54:00 crc kubenswrapper[4919]: E0310 21:54:00.348973 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 21:54:00.848943508 +0000 UTC m=+228.090824116 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lxrsj" (UID: "8383a8d8-69ec-4706-8ea3-99ce91e5200c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 21:54:00 crc kubenswrapper[4919]: I0310 21:54:00.409440 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7g7s2" podStartSLOduration=165.409323749 podStartE2EDuration="2m45.409323749s" podCreationTimestamp="2026-03-10 21:51:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 21:54:00.404772514 +0000 UTC m=+227.646653122" watchObservedRunningTime="2026-03-10 21:54:00.409323749 +0000 UTC m=+227.651204357" Mar 10 21:54:00 crc kubenswrapper[4919]: I0310 21:54:00.410411 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29552985-n8r4j" podStartSLOduration=165.410405317 podStartE2EDuration="2m45.410405317s" podCreationTimestamp="2026-03-10 21:51:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 21:54:00.369932438 +0000 UTC m=+227.611813046" watchObservedRunningTime="2026-03-10 21:54:00.410405317 +0000 UTC m=+227.652285915" Mar 10 21:54:00 crc kubenswrapper[4919]: I0310 21:54:00.449451 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 21:54:00 crc kubenswrapper[4919]: I0310 21:54:00.449674 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rp685\" (UniqueName: \"kubernetes.io/projected/cac0bc08-6186-43fb-bebe-036c98331599-kube-api-access-rp685\") pod \"auto-csr-approver-29552994-rvxmh\" (UID: \"cac0bc08-6186-43fb-bebe-036c98331599\") " pod="openshift-infra/auto-csr-approver-29552994-rvxmh" Mar 10 21:54:00 crc kubenswrapper[4919]: E0310 21:54:00.450563 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 21:54:00.950549348 +0000 UTC m=+228.192429956 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 21:54:00 crc kubenswrapper[4919]: I0310 21:54:00.472592 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-n8bb5"] Mar 10 21:54:00 crc kubenswrapper[4919]: I0310 21:54:00.534197 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rp685\" (UniqueName: \"kubernetes.io/projected/cac0bc08-6186-43fb-bebe-036c98331599-kube-api-access-rp685\") pod \"auto-csr-approver-29552994-rvxmh\" (UID: \"cac0bc08-6186-43fb-bebe-036c98331599\") " pod="openshift-infra/auto-csr-approver-29552994-rvxmh" Mar 10 21:54:00 crc 
kubenswrapper[4919]: I0310 21:54:00.551680 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lxrsj\" (UID: \"8383a8d8-69ec-4706-8ea3-99ce91e5200c\") " pod="openshift-image-registry/image-registry-697d97f7c8-lxrsj" Mar 10 21:54:00 crc kubenswrapper[4919]: E0310 21:54:00.551976 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 21:54:01.051965455 +0000 UTC m=+228.293846063 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lxrsj" (UID: "8383a8d8-69ec-4706-8ea3-99ce91e5200c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 21:54:00 crc kubenswrapper[4919]: I0310 21:54:00.641596 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bftfp"] Mar 10 21:54:00 crc kubenswrapper[4919]: I0310 21:54:00.653552 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 21:54:00 crc kubenswrapper[4919]: E0310 21:54:00.654002 4919 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 21:54:01.153987696 +0000 UTC m=+228.395868304 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 21:54:00 crc kubenswrapper[4919]: I0310 21:54:00.658484 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552994-rvxmh" Mar 10 21:54:00 crc kubenswrapper[4919]: I0310 21:54:00.663941 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-hmjhm" Mar 10 21:54:00 crc kubenswrapper[4919]: I0310 21:54:00.698698 4919 patch_prober.go:28] interesting pod/router-default-5444994796-hmjhm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 10 21:54:00 crc kubenswrapper[4919]: [-]has-synced failed: reason withheld Mar 10 21:54:00 crc kubenswrapper[4919]: [+]process-running ok Mar 10 21:54:00 crc kubenswrapper[4919]: healthz check failed Mar 10 21:54:00 crc kubenswrapper[4919]: I0310 21:54:00.698764 4919 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-hmjhm" podUID="ca9a516e-afc2-4475-8af8-23504d17f9a9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 10 21:54:00 crc kubenswrapper[4919]: I0310 
21:54:00.701624 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-f48z9"] Mar 10 21:54:00 crc kubenswrapper[4919]: I0310 21:54:00.708061 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-77k8x"] Mar 10 21:54:00 crc kubenswrapper[4919]: I0310 21:54:00.758241 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lxrsj\" (UID: \"8383a8d8-69ec-4706-8ea3-99ce91e5200c\") " pod="openshift-image-registry/image-registry-697d97f7c8-lxrsj" Mar 10 21:54:00 crc kubenswrapper[4919]: E0310 21:54:00.758279 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 21:54:01.25826378 +0000 UTC m=+228.500144388 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lxrsj" (UID: "8383a8d8-69ec-4706-8ea3-99ce91e5200c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 21:54:00 crc kubenswrapper[4919]: I0310 21:54:00.763340 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2xrj5"] Mar 10 21:54:00 crc kubenswrapper[4919]: I0310 21:54:00.773527 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-nv6qp"] Mar 10 21:54:00 crc kubenswrapper[4919]: I0310 21:54:00.773566 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-dwg72"] Mar 10 21:54:00 crc kubenswrapper[4919]: I0310 21:54:00.776058 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-hfqfv"] Mar 10 21:54:00 crc kubenswrapper[4919]: I0310 21:54:00.827005 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kgftl"] Mar 10 21:54:00 crc kubenswrapper[4919]: I0310 21:54:00.858859 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 21:54:00 crc kubenswrapper[4919]: E0310 21:54:00.859600 4919 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 21:54:01.359585213 +0000 UTC m=+228.601465821 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 21:54:00 crc kubenswrapper[4919]: I0310 21:54:00.962006 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lxrsj\" (UID: \"8383a8d8-69ec-4706-8ea3-99ce91e5200c\") " pod="openshift-image-registry/image-registry-697d97f7c8-lxrsj" Mar 10 21:54:00 crc kubenswrapper[4919]: E0310 21:54:00.962344 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 21:54:01.462331096 +0000 UTC m=+228.704211704 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lxrsj" (UID: "8383a8d8-69ec-4706-8ea3-99ce91e5200c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 21:54:01 crc kubenswrapper[4919]: W0310 21:54:01.008863 4919 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8a917335_6837_4601_a4e5_0c41252e4d83.slice/crio-1a5d49c8bca486ab8e6a444fe41388761efcc204e378e76ec92bf83d58bf8394 WatchSource:0}: Error finding container 1a5d49c8bca486ab8e6a444fe41388761efcc204e378e76ec92bf83d58bf8394: Status 404 returned error can't find the container with id 1a5d49c8bca486ab8e6a444fe41388761efcc204e378e76ec92bf83d58bf8394 Mar 10 21:54:01 crc kubenswrapper[4919]: I0310 21:54:01.063455 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 21:54:01 crc kubenswrapper[4919]: E0310 21:54:01.064743 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 21:54:01.564717118 +0000 UTC m=+228.806597736 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 21:54:01 crc kubenswrapper[4919]: I0310 21:54:01.169672 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lxrsj\" (UID: \"8383a8d8-69ec-4706-8ea3-99ce91e5200c\") " pod="openshift-image-registry/image-registry-697d97f7c8-lxrsj" Mar 10 21:54:01 crc kubenswrapper[4919]: E0310 21:54:01.169982 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 21:54:01.669972538 +0000 UTC m=+228.911853146 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lxrsj" (UID: "8383a8d8-69ec-4706-8ea3-99ce91e5200c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 21:54:01 crc kubenswrapper[4919]: I0310 21:54:01.274629 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 21:54:01 crc kubenswrapper[4919]: E0310 21:54:01.275887 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 21:54:01.775870696 +0000 UTC m=+229.017751304 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 21:54:01 crc kubenswrapper[4919]: I0310 21:54:01.377830 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lxrsj\" (UID: \"8383a8d8-69ec-4706-8ea3-99ce91e5200c\") " pod="openshift-image-registry/image-registry-697d97f7c8-lxrsj" Mar 10 21:54:01 crc kubenswrapper[4919]: E0310 21:54:01.378366 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 21:54:01.87834314 +0000 UTC m=+229.120223748 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lxrsj" (UID: "8383a8d8-69ec-4706-8ea3-99ce91e5200c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 21:54:01 crc kubenswrapper[4919]: I0310 21:54:01.382195 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-j6q5w" event={"ID":"0c56f0ba-c8bc-41be-8a14-3c57051ebfda","Type":"ContainerStarted","Data":"9e4d702c32ed4bcd3e705dbbd476a9636924bf466a0a6572393874768639697f"} Mar 10 21:54:01 crc kubenswrapper[4919]: I0310 21:54:01.430891 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552994-rvxmh"] Mar 10 21:54:01 crc kubenswrapper[4919]: I0310 21:54:01.440853 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-7skkx" event={"ID":"c18a5bd2-fc75-4dce-8ffa-0ba191c52064","Type":"ContainerStarted","Data":"960e8e2518e2dfb46a410ffa36c2bff4b423fcf86dbb5972e6bf2faea8952919"} Mar 10 21:54:01 crc kubenswrapper[4919]: I0310 21:54:01.465192 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-htbkb" event={"ID":"27275499-8d56-431b-9f77-928a700c1dad","Type":"ContainerStarted","Data":"6f0c6ef1f8c539c8039ce81cecbd4a753a7c80a9c0141edafbc53ff445dfa3e5"} Mar 10 21:54:01 crc kubenswrapper[4919]: I0310 21:54:01.478932 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 21:54:01 crc kubenswrapper[4919]: W0310 21:54:01.479050 4919 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcac0bc08_6186_43fb_bebe_036c98331599.slice/crio-616e74b4ce6da29f29710110acc0f985b9f975a960bd4183894373f41cc06a5a WatchSource:0}: Error finding container 616e74b4ce6da29f29710110acc0f985b9f975a960bd4183894373f41cc06a5a: Status 404 returned error can't find the container with id 616e74b4ce6da29f29710110acc0f985b9f975a960bd4183894373f41cc06a5a Mar 10 21:54:01 crc kubenswrapper[4919]: E0310 21:54:01.479228 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 21:54:01.979214061 +0000 UTC m=+229.221094669 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 21:54:01 crc kubenswrapper[4919]: I0310 21:54:01.480444 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bftfp" event={"ID":"d403c55b-6082-4056-8111-63f5833b28a6","Type":"ContainerStarted","Data":"a8c03fbac5ed7083f1ae50404829bc15dceef7106d1850df90b11bd10f005696"} Mar 10 21:54:01 crc kubenswrapper[4919]: I0310 21:54:01.497439 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-htbkb" podStartSLOduration=166.497421296 podStartE2EDuration="2m46.497421296s" podCreationTimestamp="2026-03-10 21:51:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 21:54:01.497326684 +0000 UTC m=+228.739207292" watchObservedRunningTime="2026-03-10 21:54:01.497421296 +0000 UTC m=+228.739301904" Mar 10 21:54:01 crc kubenswrapper[4919]: I0310 21:54:01.499097 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-7skkx" podStartSLOduration=166.499091431 podStartE2EDuration="2m46.499091431s" podCreationTimestamp="2026-03-10 21:51:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 21:54:01.46369793 +0000 UTC m=+228.705578538" watchObservedRunningTime="2026-03-10 21:54:01.499091431 +0000 UTC 
m=+228.740972039" Mar 10 21:54:01 crc kubenswrapper[4919]: I0310 21:54:01.516038 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-nv6qp" event={"ID":"62cf0e58-2480-43ff-a9aa-f8543fefd9f9","Type":"ContainerStarted","Data":"247e35cc87bb88de78af073a1b9d333404dc2738c3d1ea60bb8d6514bc27f502"} Mar 10 21:54:01 crc kubenswrapper[4919]: I0310 21:54:01.523169 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-hgx8b" event={"ID":"24606597-7caf-479b-b81f-5042f2ba3027","Type":"ContainerStarted","Data":"629a94f3f85e325d454beda9f2357bc287c4a147f94d9b694e5e30c5f2c321fa"} Mar 10 21:54:01 crc kubenswrapper[4919]: I0310 21:54:01.523213 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-hgx8b" event={"ID":"24606597-7caf-479b-b81f-5042f2ba3027","Type":"ContainerStarted","Data":"2333aa678945ae0051c0bb25ebaa83969423743e3ce51498c2e53d0c1af2d240"} Mar 10 21:54:01 crc kubenswrapper[4919]: I0310 21:54:01.580790 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lxrsj\" (UID: \"8383a8d8-69ec-4706-8ea3-99ce91e5200c\") " pod="openshift-image-registry/image-registry-697d97f7c8-lxrsj" Mar 10 21:54:01 crc kubenswrapper[4919]: E0310 21:54:01.582107 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 21:54:02.082090026 +0000 UTC m=+229.323970634 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lxrsj" (UID: "8383a8d8-69ec-4706-8ea3-99ce91e5200c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 21:54:01 crc kubenswrapper[4919]: I0310 21:54:01.589750 4919 generic.go:334] "Generic (PLEG): container finished" podID="f779c56d-d87a-4e36-8a8d-f9d93bb2b7e0" containerID="5cce7e857e09c2c86e515d74f91505a3e9694c8c4b829def88d58df9f979ad59" exitCode=0 Mar 10 21:54:01 crc kubenswrapper[4919]: I0310 21:54:01.590017 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-6nwch" event={"ID":"f779c56d-d87a-4e36-8a8d-f9d93bb2b7e0","Type":"ContainerDied","Data":"5cce7e857e09c2c86e515d74f91505a3e9694c8c4b829def88d58df9f979ad59"} Mar 10 21:54:01 crc kubenswrapper[4919]: I0310 21:54:01.615384 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-fb6tt" event={"ID":"b2c5effa-6a05-4978-b4a9-1daada6b2465","Type":"ContainerStarted","Data":"7bc77267815fa5f2d6aa5886a599f07c9c7a937313b1b497905307cd608436f7"} Mar 10 21:54:01 crc kubenswrapper[4919]: I0310 21:54:01.618376 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-hgx8b" podStartSLOduration=166.618365902 podStartE2EDuration="2m46.618365902s" podCreationTimestamp="2026-03-10 21:51:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 21:54:01.550751725 +0000 UTC m=+228.792632333" watchObservedRunningTime="2026-03-10 21:54:01.618365902 +0000 UTC m=+228.860246510" Mar 10 21:54:01 crc 
kubenswrapper[4919]: I0310 21:54:01.648098 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-n8bb5" event={"ID":"1c07a718-9218-4471-b909-03975021d691","Type":"ContainerStarted","Data":"f6a7c3dc325b066136476bd023c45253a9d3886f5031d7f4109e00f0553ac384"} Mar 10 21:54:01 crc kubenswrapper[4919]: I0310 21:54:01.648145 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-n8bb5" event={"ID":"1c07a718-9218-4471-b909-03975021d691","Type":"ContainerStarted","Data":"ff862c9068e31fe11981e2d32e6391aa28650e423e824ceb945d9ccf72535946"} Mar 10 21:54:01 crc kubenswrapper[4919]: I0310 21:54:01.666896 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-hkjkr" event={"ID":"c72b7da1-06f0-4785-9ea0-713fc758d759","Type":"ContainerStarted","Data":"591b5650df7443e17bd5328ffb2bd5c675b9a5ac9efde50c5acae9ada6f1518e"} Mar 10 21:54:01 crc kubenswrapper[4919]: I0310 21:54:01.667678 4919 patch_prober.go:28] interesting pod/router-default-5444994796-hmjhm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 10 21:54:01 crc kubenswrapper[4919]: [-]has-synced failed: reason withheld Mar 10 21:54:01 crc kubenswrapper[4919]: [+]process-running ok Mar 10 21:54:01 crc kubenswrapper[4919]: healthz check failed Mar 10 21:54:01 crc kubenswrapper[4919]: I0310 21:54:01.667706 4919 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-hmjhm" podUID="ca9a516e-afc2-4475-8af8-23504d17f9a9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 10 21:54:01 crc kubenswrapper[4919]: I0310 21:54:01.672775 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-77k8x" event={"ID":"3a6010be-2af4-4ced-97bd-bff2e07722a9","Type":"ContainerStarted","Data":"9f2edd416ca46aaf49216b970130a31c78aec387a335fb00c752e75be1ed3693"} Mar 10 21:54:01 crc kubenswrapper[4919]: I0310 21:54:01.674071 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-77k8x" Mar 10 21:54:01 crc kubenswrapper[4919]: I0310 21:54:01.675627 4919 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-77k8x container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.43:5443/healthz\": dial tcp 10.217.0.43:5443: connect: connection refused" start-of-body= Mar 10 21:54:01 crc kubenswrapper[4919]: I0310 21:54:01.675654 4919 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-77k8x" podUID="3a6010be-2af4-4ced-97bd-bff2e07722a9" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.43:5443/healthz\": dial tcp 10.217.0.43:5443: connect: connection refused" Mar 10 21:54:01 crc kubenswrapper[4919]: I0310 21:54:01.677796 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-dwg72" event={"ID":"705cef55-4e4d-402a-93e5-d5e880c6424e","Type":"ContainerStarted","Data":"fcb60181e592519d59439f75a2b9661d8a114723124834fba43a289b0a27d736"} Mar 10 21:54:01 crc kubenswrapper[4919]: I0310 21:54:01.689807 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 21:54:01 crc kubenswrapper[4919]: I0310 21:54:01.693184 4919 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-hkjkr" podStartSLOduration=6.693169995 podStartE2EDuration="6.693169995s" podCreationTimestamp="2026-03-10 21:53:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 21:54:01.689276299 +0000 UTC m=+228.931156897" watchObservedRunningTime="2026-03-10 21:54:01.693169995 +0000 UTC m=+228.935050603" Mar 10 21:54:01 crc kubenswrapper[4919]: E0310 21:54:01.695171 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 21:54:02.195155659 +0000 UTC m=+229.437036267 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 21:54:01 crc kubenswrapper[4919]: I0310 21:54:01.696533 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lxrsj\" (UID: \"8383a8d8-69ec-4706-8ea3-99ce91e5200c\") " pod="openshift-image-registry/image-registry-697d97f7c8-lxrsj" Mar 10 21:54:01 crc kubenswrapper[4919]: E0310 21:54:01.698748 4919 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 21:54:02.198734817 +0000 UTC m=+229.440615425 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lxrsj" (UID: "8383a8d8-69ec-4706-8ea3-99ce91e5200c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 21:54:01 crc kubenswrapper[4919]: I0310 21:54:01.699587 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-hfqfv" event={"ID":"f3993e7f-64ea-43e3-a867-7465178fdf99","Type":"ContainerStarted","Data":"2e63421734e3d7f53ba3f809efe67ee791d240f806565a015f0fc93c24f165a0"} Mar 10 21:54:01 crc kubenswrapper[4919]: I0310 21:54:01.715063 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bbh8l" event={"ID":"decc4e59-7a45-4f03-bf5e-3ae08d304c61","Type":"ContainerStarted","Data":"f6d64686fec32b52772e1f198cd063471e030a041976e0c873c0d4836479ab64"} Mar 10 21:54:01 crc kubenswrapper[4919]: I0310 21:54:01.740603 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-74h67" event={"ID":"32d0aafd-3bc5-4173-86df-ce624028b1a2","Type":"ContainerStarted","Data":"3ff1cd78e8de8372e8f6af1decc46e3c88633f034dd2a40dae8cfd296c351844"} Mar 10 21:54:01 crc kubenswrapper[4919]: I0310 21:54:01.751105 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-77k8x" podStartSLOduration=166.751087049 
podStartE2EDuration="2m46.751087049s" podCreationTimestamp="2026-03-10 21:51:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 21:54:01.715823841 +0000 UTC m=+228.957704459" watchObservedRunningTime="2026-03-10 21:54:01.751087049 +0000 UTC m=+228.992967647" Mar 10 21:54:01 crc kubenswrapper[4919]: I0310 21:54:01.755287 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-cb8tv" event={"ID":"0aa7f443-95c5-43ae-bd23-f4204d5f8778","Type":"ContainerStarted","Data":"d9151d37b1867bb6abfd3c7cb09c2e35b7b1cfeb506ec76e5f20dfc023e79df5"} Mar 10 21:54:01 crc kubenswrapper[4919]: I0310 21:54:01.756726 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-cb8tv" Mar 10 21:54:01 crc kubenswrapper[4919]: I0310 21:54:01.759468 4919 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-cb8tv container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.33:8443/healthz\": dial tcp 10.217.0.33:8443: connect: connection refused" start-of-body= Mar 10 21:54:01 crc kubenswrapper[4919]: I0310 21:54:01.759534 4919 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-cb8tv" podUID="0aa7f443-95c5-43ae-bd23-f4204d5f8778" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.33:8443/healthz\": dial tcp 10.217.0.33:8443: connect: connection refused" Mar 10 21:54:01 crc kubenswrapper[4919]: I0310 21:54:01.771458 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bbh8l" podStartSLOduration=166.771443643 podStartE2EDuration="2m46.771443643s" 
podCreationTimestamp="2026-03-10 21:51:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 21:54:01.751550581 +0000 UTC m=+228.993431189" watchObservedRunningTime="2026-03-10 21:54:01.771443643 +0000 UTC m=+229.013324251" Mar 10 21:54:01 crc kubenswrapper[4919]: I0310 21:54:01.798477 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 21:54:01 crc kubenswrapper[4919]: I0310 21:54:01.798799 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zchq4" event={"ID":"d8b972b3-02f6-4c31-bb8e-0c229ea48621","Type":"ContainerStarted","Data":"934570154ff86d132346522cc4c7f2e00cd3ee925a64052a8fe9fb6aa2383fe3"} Mar 10 21:54:01 crc kubenswrapper[4919]: E0310 21:54:01.799902 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 21:54:02.299885005 +0000 UTC m=+229.541765613 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 21:54:01 crc kubenswrapper[4919]: I0310 21:54:01.812526 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-cb8tv" podStartSLOduration=166.812509388 podStartE2EDuration="2m46.812509388s" podCreationTimestamp="2026-03-10 21:51:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 21:54:01.80594899 +0000 UTC m=+229.047829618" watchObservedRunningTime="2026-03-10 21:54:01.812509388 +0000 UTC m=+229.054389996" Mar 10 21:54:01 crc kubenswrapper[4919]: I0310 21:54:01.815720 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2xrj5" event={"ID":"6d9b96a2-533c-4a0e-bab8-e63340713c3b","Type":"ContainerStarted","Data":"91fd8deb700b206d56958214ef14990702cb212cfff12e5e5d580a7fc268a62c"} Mar 10 21:54:01 crc kubenswrapper[4919]: I0310 21:54:01.819463 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-kjw8s" event={"ID":"6350fa6a-9381-4de8-8a8e-4a14b9d253bc","Type":"ContainerStarted","Data":"39a156cf4cc9b5d7ee50cc964655a21aa951e1d66f9519c4b7e39a0ed1d4cde3"} Mar 10 21:54:01 crc kubenswrapper[4919]: I0310 21:54:01.825127 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-74h67" podStartSLOduration=166.82510756 
podStartE2EDuration="2m46.82510756s" podCreationTimestamp="2026-03-10 21:51:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 21:54:01.774551027 +0000 UTC m=+229.016431635" watchObservedRunningTime="2026-03-10 21:54:01.82510756 +0000 UTC m=+229.066988168" Mar 10 21:54:01 crc kubenswrapper[4919]: I0310 21:54:01.830243 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-tk7xs" event={"ID":"be41b09e-a8ff-4367-a68d-865f047e2549","Type":"ContainerStarted","Data":"79e6104fadf71887acd4c73fd5b0783822885bc0c62332193a11ef5b30efabf3"} Mar 10 21:54:01 crc kubenswrapper[4919]: I0310 21:54:01.830409 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-tk7xs" Mar 10 21:54:01 crc kubenswrapper[4919]: I0310 21:54:01.835492 4919 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-tk7xs container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.25:8080/healthz\": dial tcp 10.217.0.25:8080: connect: connection refused" start-of-body= Mar 10 21:54:01 crc kubenswrapper[4919]: I0310 21:54:01.835546 4919 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-tk7xs" podUID="be41b09e-a8ff-4367-a68d-865f047e2549" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.25:8080/healthz\": dial tcp 10.217.0.25:8080: connect: connection refused" Mar 10 21:54:01 crc kubenswrapper[4919]: I0310 21:54:01.841368 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zchq4" podStartSLOduration=166.841355241 podStartE2EDuration="2m46.841355241s" podCreationTimestamp="2026-03-10 21:51:15 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 21:54:01.839556913 +0000 UTC m=+229.081437531" watchObservedRunningTime="2026-03-10 21:54:01.841355241 +0000 UTC m=+229.083235839"
Mar 10 21:54:01 crc kubenswrapper[4919]: I0310 21:54:01.846843 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kgftl" event={"ID":"8a917335-6837-4601-a4e5-0c41252e4d83","Type":"ContainerStarted","Data":"1a5d49c8bca486ab8e6a444fe41388761efcc204e378e76ec92bf83d58bf8394"}
Mar 10 21:54:01 crc kubenswrapper[4919]: I0310 21:54:01.849453 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-f48z9" event={"ID":"3fcbd6a7-5b9f-4a6a-8705-a2110bc0d7e0","Type":"ContainerStarted","Data":"94a898011860749996809b4bfa4d8b1d5f2ca89a401a80bbf3b3b6cb71d391cc"}
Mar 10 21:54:01 crc kubenswrapper[4919]: I0310 21:54:01.849490 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-f48z9" event={"ID":"3fcbd6a7-5b9f-4a6a-8705-a2110bc0d7e0","Type":"ContainerStarted","Data":"d1f16f288e8f058d95d986c4a66af6334020e60ca4e26971f3e6354b30824490"}
Mar 10 21:54:01 crc kubenswrapper[4919]: I0310 21:54:01.857804 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2xrj5" podStartSLOduration=166.857782008 podStartE2EDuration="2m46.857782008s" podCreationTimestamp="2026-03-10 21:51:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 21:54:01.856738119 +0000 UTC m=+229.098618727" watchObservedRunningTime="2026-03-10 21:54:01.857782008 +0000 UTC m=+229.099662626"
Mar 10 21:54:01 crc kubenswrapper[4919]: I0310 21:54:01.858757 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qrdwk" event={"ID":"b5a5c848-4844-4280-a444-9173fff0b8e1","Type":"ContainerStarted","Data":"3bcf48befcabc8a0c23ebdd631768363a86c345674aa5d4ba28b7a0471a37334"}
Mar 10 21:54:01 crc kubenswrapper[4919]: I0310 21:54:01.877571 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4wvzh" event={"ID":"6ace15bc-71c5-45e7-8791-5d59045c73b9","Type":"ContainerStarted","Data":"aeac0e4210c63b5764aef355878854deb012bde7ea734c27031274243759213c"}
Mar 10 21:54:01 crc kubenswrapper[4919]: I0310 21:54:01.878126 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4wvzh"
Mar 10 21:54:01 crc kubenswrapper[4919]: I0310 21:54:01.894619 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-58nxf" event={"ID":"9657873d-9275-4945-9e91-0b2c2844ae5d","Type":"ContainerStarted","Data":"49dea5373bf4345ccbd7205fbe849d1a9cb7190d51663ab0cba4be26e3915360"}
Mar 10 21:54:01 crc kubenswrapper[4919]: I0310 21:54:01.895543 4919 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-4wvzh container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.26:8443/healthz\": dial tcp 10.217.0.26:8443: connect: connection refused" start-of-body=
Mar 10 21:54:01 crc kubenswrapper[4919]: I0310 21:54:01.895578 4919 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4wvzh" podUID="6ace15bc-71c5-45e7-8791-5d59045c73b9" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.26:8443/healthz\": dial tcp 10.217.0.26:8443: connect: connection refused"
Mar 10 21:54:01 crc kubenswrapper[4919]: I0310 21:54:01.895600 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-kjw8s" podStartSLOduration=166.895589115 podStartE2EDuration="2m46.895589115s" podCreationTimestamp="2026-03-10 21:51:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 21:54:01.895123363 +0000 UTC m=+229.137003971" watchObservedRunningTime="2026-03-10 21:54:01.895589115 +0000 UTC m=+229.137469733"
Mar 10 21:54:01 crc kubenswrapper[4919]: I0310 21:54:01.900338 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lxrsj\" (UID: \"8383a8d8-69ec-4706-8ea3-99ce91e5200c\") " pod="openshift-image-registry/image-registry-697d97f7c8-lxrsj"
Mar 10 21:54:01 crc kubenswrapper[4919]: E0310 21:54:01.901796 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 21:54:02.401785114 +0000 UTC m=+229.643665722 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lxrsj" (UID: "8383a8d8-69ec-4706-8ea3-99ce91e5200c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 21:54:01 crc kubenswrapper[4919]: I0310 21:54:01.905645 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-26f88" event={"ID":"d7afb497-bec3-4211-a9d9-c914438bdf59","Type":"ContainerStarted","Data":"4c2405bc9a877164b2a18da2fda5481a458bc9d8c78c0e7906641733a16e50f1"}
Mar 10 21:54:01 crc kubenswrapper[4919]: I0310 21:54:01.951662 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-c62fh" event={"ID":"d47e6447-a8e0-4bf3-8317-5c23cff4e6a7","Type":"ContainerStarted","Data":"d9ab2bd588f6966eda0bf1993f4bfaf3d88033f1b82c1a7372eb130b50293b0f"}
Mar 10 21:54:01 crc kubenswrapper[4919]: I0310 21:54:01.951781 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-tk7xs" podStartSLOduration=166.951769522 podStartE2EDuration="2m46.951769522s" podCreationTimestamp="2026-03-10 21:51:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 21:54:01.921735636 +0000 UTC m=+229.163616254" watchObservedRunningTime="2026-03-10 21:54:01.951769522 +0000 UTC m=+229.193650130"
Mar 10 21:54:01 crc kubenswrapper[4919]: I0310 21:54:01.975169 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qnjs8" event={"ID":"425d1938-0668-4f53-aaee-dbc4a93297c7","Type":"ContainerStarted","Data":"9faa19b627ff790b710212f848e41727b16cc287a68cab851d6417c61400e13f"}
Mar 10 21:54:01 crc kubenswrapper[4919]: I0310 21:54:01.975207 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qnjs8" event={"ID":"425d1938-0668-4f53-aaee-dbc4a93297c7","Type":"ContainerStarted","Data":"92d0ceb14758cb7ea45204d2e5fc3832bcd41759fc854c0529f6012494822059"}
Mar 10 21:54:01 crc kubenswrapper[4919]: I0310 21:54:01.977028 4919 patch_prober.go:28] interesting pod/downloads-7954f5f757-49z6f container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body=
Mar 10 21:54:01 crc kubenswrapper[4919]: I0310 21:54:01.977055 4919 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-49z6f" podUID="fb7623ea-ec67-4061-82a6-4099e52fa3b9" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused"
Mar 10 21:54:01 crc kubenswrapper[4919]: I0310 21:54:01.984986 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-58nxf" podStartSLOduration=166.984969424 podStartE2EDuration="2m46.984969424s" podCreationTimestamp="2026-03-10 21:51:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 21:54:01.98440744 +0000 UTC m=+229.226288048" watchObservedRunningTime="2026-03-10 21:54:01.984969424 +0000 UTC m=+229.226850032"
Mar 10 21:54:01 crc kubenswrapper[4919]: I0310 21:54:01.986184 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qrdwk" podStartSLOduration=166.986180027 podStartE2EDuration="2m46.986180027s" podCreationTimestamp="2026-03-10 21:51:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 21:54:01.953531311 +0000 UTC m=+229.195411909" watchObservedRunningTime="2026-03-10 21:54:01.986180027 +0000 UTC m=+229.228060635"
Mar 10 21:54:01 crc kubenswrapper[4919]: I0310 21:54:01.986726 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-m9qd4"
Mar 10 21:54:01 crc kubenswrapper[4919]: I0310 21:54:01.998757 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-h9zcs"
Mar 10 21:54:02 crc kubenswrapper[4919]: I0310 21:54:02.003495 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 21:54:02 crc kubenswrapper[4919]: E0310 21:54:02.005775 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 21:54:02.505750479 +0000 UTC m=+229.747631087 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 21:54:02 crc kubenswrapper[4919]: I0310 21:54:02.019564 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4wvzh" podStartSLOduration=167.019535294 podStartE2EDuration="2m47.019535294s" podCreationTimestamp="2026-03-10 21:51:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 21:54:02.011477965 +0000 UTC m=+229.253358573" watchObservedRunningTime="2026-03-10 21:54:02.019535294 +0000 UTC m=+229.261415902"
Mar 10 21:54:02 crc kubenswrapper[4919]: I0310 21:54:02.075696 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qnjs8" podStartSLOduration=167.075679499 podStartE2EDuration="2m47.075679499s" podCreationTimestamp="2026-03-10 21:51:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 21:54:02.074316342 +0000 UTC m=+229.316196950" watchObservedRunningTime="2026-03-10 21:54:02.075679499 +0000 UTC m=+229.317560107"
Mar 10 21:54:02 crc kubenswrapper[4919]: I0310 21:54:02.106989 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lxrsj\" (UID: \"8383a8d8-69ec-4706-8ea3-99ce91e5200c\") " pod="openshift-image-registry/image-registry-697d97f7c8-lxrsj"
Mar 10 21:54:02 crc kubenswrapper[4919]: E0310 21:54:02.107346 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 21:54:02.607334929 +0000 UTC m=+229.849215537 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lxrsj" (UID: "8383a8d8-69ec-4706-8ea3-99ce91e5200c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 21:54:02 crc kubenswrapper[4919]: I0310 21:54:02.148909 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-c62fh" podStartSLOduration=167.148876118 podStartE2EDuration="2m47.148876118s" podCreationTimestamp="2026-03-10 21:51:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 21:54:02.105703465 +0000 UTC m=+229.347584073" watchObservedRunningTime="2026-03-10 21:54:02.148876118 +0000 UTC m=+229.390756726"
Mar 10 21:54:02 crc kubenswrapper[4919]: I0310 21:54:02.213341 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 21:54:02 crc kubenswrapper[4919]: E0310 21:54:02.213705 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 21:54:02.713687769 +0000 UTC m=+229.955568377 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 21:54:02 crc kubenswrapper[4919]: I0310 21:54:02.275740 4919 ???:1] "http: TLS handshake error from 192.168.126.11:58506: no serving certificate available for the kubelet"
Mar 10 21:54:02 crc kubenswrapper[4919]: I0310 21:54:02.315314 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lxrsj\" (UID: \"8383a8d8-69ec-4706-8ea3-99ce91e5200c\") " pod="openshift-image-registry/image-registry-697d97f7c8-lxrsj"
Mar 10 21:54:02 crc kubenswrapper[4919]: E0310 21:54:02.315677 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 21:54:02.81566459 +0000 UTC m=+230.057545198 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lxrsj" (UID: "8383a8d8-69ec-4706-8ea3-99ce91e5200c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 21:54:02 crc kubenswrapper[4919]: I0310 21:54:02.373305 4919 ???:1] "http: TLS handshake error from 192.168.126.11:58514: no serving certificate available for the kubelet"
Mar 10 21:54:02 crc kubenswrapper[4919]: I0310 21:54:02.416365 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 21:54:02 crc kubenswrapper[4919]: E0310 21:54:02.416565 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 21:54:02.916546732 +0000 UTC m=+230.158427340 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 21:54:02 crc kubenswrapper[4919]: I0310 21:54:02.416626 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lxrsj\" (UID: \"8383a8d8-69ec-4706-8ea3-99ce91e5200c\") " pod="openshift-image-registry/image-registry-697d97f7c8-lxrsj"
Mar 10 21:54:02 crc kubenswrapper[4919]: E0310 21:54:02.417075 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 21:54:02.917056026 +0000 UTC m=+230.158936634 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lxrsj" (UID: "8383a8d8-69ec-4706-8ea3-99ce91e5200c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 21:54:02 crc kubenswrapper[4919]: I0310 21:54:02.492313 4919 ???:1] "http: TLS handshake error from 192.168.126.11:58528: no serving certificate available for the kubelet"
Mar 10 21:54:02 crc kubenswrapper[4919]: I0310 21:54:02.513349 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qrdwk"
Mar 10 21:54:02 crc kubenswrapper[4919]: I0310 21:54:02.513413 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qrdwk"
Mar 10 21:54:02 crc kubenswrapper[4919]: I0310 21:54:02.518188 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 21:54:02 crc kubenswrapper[4919]: E0310 21:54:02.518429 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 21:54:03.018349328 +0000 UTC m=+230.260229936 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 21:54:02 crc kubenswrapper[4919]: I0310 21:54:02.518718 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lxrsj\" (UID: \"8383a8d8-69ec-4706-8ea3-99ce91e5200c\") " pod="openshift-image-registry/image-registry-697d97f7c8-lxrsj"
Mar 10 21:54:02 crc kubenswrapper[4919]: E0310 21:54:02.519060 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 21:54:03.019043457 +0000 UTC m=+230.260924065 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lxrsj" (UID: "8383a8d8-69ec-4706-8ea3-99ce91e5200c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 21:54:02 crc kubenswrapper[4919]: I0310 21:54:02.590667 4919 ???:1] "http: TLS handshake error from 192.168.126.11:58532: no serving certificate available for the kubelet"
Mar 10 21:54:02 crc kubenswrapper[4919]: I0310 21:54:02.620308 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 21:54:02 crc kubenswrapper[4919]: E0310 21:54:02.620497 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 21:54:03.120455623 +0000 UTC m=+230.362336231 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 21:54:02 crc kubenswrapper[4919]: I0310 21:54:02.620594 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lxrsj\" (UID: \"8383a8d8-69ec-4706-8ea3-99ce91e5200c\") " pod="openshift-image-registry/image-registry-697d97f7c8-lxrsj"
Mar 10 21:54:02 crc kubenswrapper[4919]: E0310 21:54:02.620955 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 21:54:03.120947186 +0000 UTC m=+230.362827794 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lxrsj" (UID: "8383a8d8-69ec-4706-8ea3-99ce91e5200c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 21:54:02 crc kubenswrapper[4919]: I0310 21:54:02.668287 4919 patch_prober.go:28] interesting pod/router-default-5444994796-hmjhm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 10 21:54:02 crc kubenswrapper[4919]: [-]has-synced failed: reason withheld
Mar 10 21:54:02 crc kubenswrapper[4919]: [+]process-running ok
Mar 10 21:54:02 crc kubenswrapper[4919]: healthz check failed
Mar 10 21:54:02 crc kubenswrapper[4919]: I0310 21:54:02.668376 4919 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-hmjhm" podUID="ca9a516e-afc2-4475-8af8-23504d17f9a9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 10 21:54:02 crc kubenswrapper[4919]: I0310 21:54:02.721972 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 21:54:02 crc kubenswrapper[4919]: E0310 21:54:02.722294 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 21:54:03.222249679 +0000 UTC m=+230.464130287 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 21:54:02 crc kubenswrapper[4919]: I0310 21:54:02.751029 4919 ???:1] "http: TLS handshake error from 192.168.126.11:58534: no serving certificate available for the kubelet"
Mar 10 21:54:02 crc kubenswrapper[4919]: I0310 21:54:02.823852 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lxrsj\" (UID: \"8383a8d8-69ec-4706-8ea3-99ce91e5200c\") " pod="openshift-image-registry/image-registry-697d97f7c8-lxrsj"
Mar 10 21:54:02 crc kubenswrapper[4919]: E0310 21:54:02.824490 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 21:54:03.324443186 +0000 UTC m=+230.566323994 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lxrsj" (UID: "8383a8d8-69ec-4706-8ea3-99ce91e5200c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 21:54:02 crc kubenswrapper[4919]: I0310 21:54:02.924679 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 21:54:02 crc kubenswrapper[4919]: E0310 21:54:02.925215 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 21:54:03.425199204 +0000 UTC m=+230.667079812 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 21:54:02 crc kubenswrapper[4919]: I0310 21:54:02.931639 4919 ???:1] "http: TLS handshake error from 192.168.126.11:58540: no serving certificate available for the kubelet"
Mar 10 21:54:02 crc kubenswrapper[4919]: I0310 21:54:02.997819 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-fb6tt" event={"ID":"b2c5effa-6a05-4978-b4a9-1daada6b2465","Type":"ContainerStarted","Data":"d4c7308328b0e62afcb5794f0055fa0c31014a801fa59e646ad2b72bb19789b8"}
Mar 10 21:54:03 crc kubenswrapper[4919]: I0310 21:54:03.018657 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-26f88" event={"ID":"d7afb497-bec3-4211-a9d9-c914438bdf59","Type":"ContainerStarted","Data":"32d56e32e5c7dedf606080defe18c01531ee6d4cecd206f7b170c8e0a7da1f4b"}
Mar 10 21:54:03 crc kubenswrapper[4919]: I0310 21:54:03.019568 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-26f88"
Mar 10 21:54:03 crc kubenswrapper[4919]: I0310 21:54:03.026783 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lxrsj\" (UID: \"8383a8d8-69ec-4706-8ea3-99ce91e5200c\") " pod="openshift-image-registry/image-registry-697d97f7c8-lxrsj"
Mar 10 21:54:03 crc kubenswrapper[4919]: E0310 21:54:03.028075 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 21:54:03.528062779 +0000 UTC m=+230.769943387 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lxrsj" (UID: "8383a8d8-69ec-4706-8ea3-99ce91e5200c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 21:54:03 crc kubenswrapper[4919]: I0310 21:54:03.036322 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-hfqfv" event={"ID":"f3993e7f-64ea-43e3-a867-7465178fdf99","Type":"ContainerStarted","Data":"7b510f194055bc37820ef84aa6b5a55bfd14e07045412812bf7b8607e158dbdd"}
Mar 10 21:54:03 crc kubenswrapper[4919]: I0310 21:54:03.061154 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2xrj5" event={"ID":"6d9b96a2-533c-4a0e-bab8-e63340713c3b","Type":"ContainerStarted","Data":"90bfb8ecdf535cb4fe50221f399c6822b38d160c7d78b78650dbc1365a60e526"}
Mar 10 21:54:03 crc kubenswrapper[4919]: I0310 21:54:03.069596 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-fb6tt" podStartSLOduration=168.069581708 podStartE2EDuration="2m48.069581708s" podCreationTimestamp="2026-03-10 21:51:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 21:54:03.043275233 +0000 UTC m=+230.285155841" watchObservedRunningTime="2026-03-10 21:54:03.069581708 +0000 UTC m=+230.311462566"
Mar 10 21:54:03 crc kubenswrapper[4919]: I0310 21:54:03.084681 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552994-rvxmh" event={"ID":"cac0bc08-6186-43fb-bebe-036c98331599","Type":"ContainerStarted","Data":"616e74b4ce6da29f29710110acc0f985b9f975a960bd4183894373f41cc06a5a"}
Mar 10 21:54:03 crc kubenswrapper[4919]: I0310 21:54:03.098431 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-6nwch" event={"ID":"f779c56d-d87a-4e36-8a8d-f9d93bb2b7e0","Type":"ContainerStarted","Data":"e585758351c437e0569d182809eb9ed50644253555ba6e6b7c005ba27916515a"}
Mar 10 21:54:03 crc kubenswrapper[4919]: I0310 21:54:03.100834 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-dwg72" event={"ID":"705cef55-4e4d-402a-93e5-d5e880c6424e","Type":"ContainerStarted","Data":"543072bc45008c493a9939ef43d81fe214f79b1d1827803efdc2ceab1fc1d657"}
Mar 10 21:54:03 crc kubenswrapper[4919]: I0310 21:54:03.100871 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-dwg72" event={"ID":"705cef55-4e4d-402a-93e5-d5e880c6424e","Type":"ContainerStarted","Data":"0854b11ccf09b3733008e28be1c37c0a5b2063aed24d0f9a0ca73e8dcb8c6723"}
Mar 10 21:54:03 crc kubenswrapper[4919]: I0310 21:54:03.111213 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kgftl" event={"ID":"8a917335-6837-4601-a4e5-0c41252e4d83","Type":"ContainerStarted","Data":"d12ffee8a4d49eee5bc602b4c9dedd4210b705ce271eeb6939b8317f9ce39408"}
Mar 10 21:54:03 crc kubenswrapper[4919]: I0310 21:54:03.112905 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-hfqfv" podStartSLOduration=8.112892635 podStartE2EDuration="8.112892635s" podCreationTimestamp="2026-03-10 21:53:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 21:54:03.112806832 +0000 UTC m=+230.354687440" watchObservedRunningTime="2026-03-10 21:54:03.112892635 +0000 UTC m=+230.354773233"
Mar 10 21:54:03 crc kubenswrapper[4919]: I0310 21:54:03.113349 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-26f88" podStartSLOduration=9.113344837 podStartE2EDuration="9.113344837s" podCreationTimestamp="2026-03-10 21:53:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 21:54:03.068554259 +0000 UTC m=+230.310434867" watchObservedRunningTime="2026-03-10 21:54:03.113344837 +0000 UTC m=+230.355225445"
Mar 10 21:54:03 crc kubenswrapper[4919]: I0310 21:54:03.119003 4919 generic.go:334] "Generic (PLEG): container finished" podID="55ff2223-69b6-4b72-9413-fce0c37ae2b2" containerID="ff6c5d94828d4986d3c2921507606c68566c8b8f1db53d6e9faed67db575e663" exitCode=0
Mar 10 21:54:03 crc kubenswrapper[4919]: I0310 21:54:03.119180 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552985-n8r4j" event={"ID":"55ff2223-69b6-4b72-9413-fce0c37ae2b2","Type":"ContainerDied","Data":"ff6c5d94828d4986d3c2921507606c68566c8b8f1db53d6e9faed67db575e663"}
Mar 10 21:54:03 crc kubenswrapper[4919]: I0310 21:54:03.129067 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 21:54:03 crc kubenswrapper[4919]: E0310 21:54:03.129335 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 21:54:03.62930753 +0000 UTC m=+230.871188138 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 21:54:03 crc kubenswrapper[4919]: I0310 21:54:03.129712 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lxrsj\" (UID: \"8383a8d8-69ec-4706-8ea3-99ce91e5200c\") " pod="openshift-image-registry/image-registry-697d97f7c8-lxrsj"
Mar 10 21:54:03 crc kubenswrapper[4919]: E0310 21:54:03.130114 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 21:54:03.630101672 +0000 UTC m=+230.871982280 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lxrsj" (UID: "8383a8d8-69ec-4706-8ea3-99ce91e5200c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 21:54:03 crc kubenswrapper[4919]: I0310 21:54:03.131641 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-nv6qp" event={"ID":"62cf0e58-2480-43ff-a9aa-f8543fefd9f9","Type":"ContainerStarted","Data":"dbe58077bee4a6aec0637492f00be33cbceef76b895084e082f7fa218a1f60c3"}
Mar 10 21:54:03 crc kubenswrapper[4919]: I0310 21:54:03.153038 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-j6q5w" event={"ID":"0c56f0ba-c8bc-41be-8a14-3c57051ebfda","Type":"ContainerStarted","Data":"a390842c80cffeff63f4e2af00bf1d3d784c64fdfb6c97359150fb582d1ee74f"}
Mar 10 21:54:03 crc kubenswrapper[4919]: I0310 21:54:03.164593 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-77k8x" event={"ID":"3a6010be-2af4-4ced-97bd-bff2e07722a9","Type":"ContainerStarted","Data":"34694dafbbf68fb9493936fb120cdcc574f38a404afad1154224e94d6942c184"}
Mar 10 21:54:03 crc kubenswrapper[4919]: I0310 21:54:03.178717 4919 ???:1] "http: TLS handshake error from 192.168.126.11:58556: no serving certificate available for the kubelet"
Mar 10 21:54:03 crc kubenswrapper[4919]: I0310 21:54:03.209287 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-n8bb5"
event={"ID":"1c07a718-9218-4471-b909-03975021d691","Type":"ContainerStarted","Data":"02709cdcc208a9776623b6b51f3c431a2d9539866791c76cbc5124b60f949951"} Mar 10 21:54:03 crc kubenswrapper[4919]: I0310 21:54:03.209929 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-n8bb5" Mar 10 21:54:03 crc kubenswrapper[4919]: I0310 21:54:03.210369 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-dwg72" podStartSLOduration=168.210350902 podStartE2EDuration="2m48.210350902s" podCreationTimestamp="2026-03-10 21:51:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 21:54:03.210076925 +0000 UTC m=+230.451957523" watchObservedRunningTime="2026-03-10 21:54:03.210350902 +0000 UTC m=+230.452231510" Mar 10 21:54:03 crc kubenswrapper[4919]: I0310 21:54:03.211065 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kgftl" podStartSLOduration=168.211057662 podStartE2EDuration="2m48.211057662s" podCreationTimestamp="2026-03-10 21:51:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 21:54:03.170666134 +0000 UTC m=+230.412546742" watchObservedRunningTime="2026-03-10 21:54:03.211057662 +0000 UTC m=+230.452938270" Mar 10 21:54:03 crc kubenswrapper[4919]: I0310 21:54:03.230731 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 21:54:03 crc 
kubenswrapper[4919]: E0310 21:54:03.231465 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 21:54:03.731438505 +0000 UTC m=+230.973319113 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 21:54:03 crc kubenswrapper[4919]: I0310 21:54:03.244747 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-6nwch" podStartSLOduration=168.244699856 podStartE2EDuration="2m48.244699856s" podCreationTimestamp="2026-03-10 21:51:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 21:54:03.242612879 +0000 UTC m=+230.484493487" watchObservedRunningTime="2026-03-10 21:54:03.244699856 +0000 UTC m=+230.486580464" Mar 10 21:54:03 crc kubenswrapper[4919]: I0310 21:54:03.245848 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bftfp" event={"ID":"d403c55b-6082-4056-8111-63f5833b28a6","Type":"ContainerStarted","Data":"1efb1eea7f8a2e5570223c524d8ff4af48e06713f2981c6d77f6e989d763964b"} Mar 10 21:54:03 crc kubenswrapper[4919]: I0310 21:54:03.254397 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-f48z9" 
event={"ID":"3fcbd6a7-5b9f-4a6a-8705-a2110bc0d7e0","Type":"ContainerStarted","Data":"795676a687d9355d3a0318608f38116641d05a9b140a9d1d03494cd602c3dd08"} Mar 10 21:54:03 crc kubenswrapper[4919]: I0310 21:54:03.279767 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4wvzh" Mar 10 21:54:03 crc kubenswrapper[4919]: I0310 21:54:03.298338 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-cb8tv" Mar 10 21:54:03 crc kubenswrapper[4919]: I0310 21:54:03.298454 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-tk7xs" Mar 10 21:54:03 crc kubenswrapper[4919]: I0310 21:54:03.312932 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-j6q5w" podStartSLOduration=168.312914599 podStartE2EDuration="2m48.312914599s" podCreationTimestamp="2026-03-10 21:51:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 21:54:03.309861626 +0000 UTC m=+230.551742234" watchObservedRunningTime="2026-03-10 21:54:03.312914599 +0000 UTC m=+230.554795207" Mar 10 21:54:03 crc kubenswrapper[4919]: I0310 21:54:03.317099 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qrdwk" Mar 10 21:54:03 crc kubenswrapper[4919]: I0310 21:54:03.336162 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lxrsj\" (UID: \"8383a8d8-69ec-4706-8ea3-99ce91e5200c\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-lxrsj" Mar 10 21:54:03 crc kubenswrapper[4919]: E0310 21:54:03.343966 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 21:54:03.843951323 +0000 UTC m=+231.085831931 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lxrsj" (UID: "8383a8d8-69ec-4706-8ea3-99ce91e5200c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 21:54:03 crc kubenswrapper[4919]: I0310 21:54:03.385982 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-n8bb5" podStartSLOduration=168.385968145 podStartE2EDuration="2m48.385968145s" podCreationTimestamp="2026-03-10 21:51:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 21:54:03.363903885 +0000 UTC m=+230.605784493" watchObservedRunningTime="2026-03-10 21:54:03.385968145 +0000 UTC m=+230.627848753" Mar 10 21:54:03 crc kubenswrapper[4919]: I0310 21:54:03.422462 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-f48z9" podStartSLOduration=168.422406415 podStartE2EDuration="2m48.422406415s" podCreationTimestamp="2026-03-10 21:51:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 21:54:03.421789758 +0000 UTC 
m=+230.663670356" watchObservedRunningTime="2026-03-10 21:54:03.422406415 +0000 UTC m=+230.664287023" Mar 10 21:54:03 crc kubenswrapper[4919]: I0310 21:54:03.439853 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 21:54:03 crc kubenswrapper[4919]: E0310 21:54:03.441567 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 21:54:03.941551955 +0000 UTC m=+231.183432553 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 21:54:03 crc kubenswrapper[4919]: I0310 21:54:03.548406 4919 ???:1] "http: TLS handshake error from 192.168.126.11:58562: no serving certificate available for the kubelet" Mar 10 21:54:03 crc kubenswrapper[4919]: I0310 21:54:03.548484 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bftfp" podStartSLOduration=168.548466711 podStartE2EDuration="2m48.548466711s" podCreationTimestamp="2026-03-10 21:51:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-10 21:54:03.519031571 +0000 UTC m=+230.760912189" watchObservedRunningTime="2026-03-10 21:54:03.548466711 +0000 UTC m=+230.790347319" Mar 10 21:54:03 crc kubenswrapper[4919]: I0310 21:54:03.549115 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-77k8x" Mar 10 21:54:03 crc kubenswrapper[4919]: I0310 21:54:03.549802 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lxrsj\" (UID: \"8383a8d8-69ec-4706-8ea3-99ce91e5200c\") " pod="openshift-image-registry/image-registry-697d97f7c8-lxrsj" Mar 10 21:54:03 crc kubenswrapper[4919]: E0310 21:54:03.550206 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 21:54:04.050181127 +0000 UTC m=+231.292061725 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lxrsj" (UID: "8383a8d8-69ec-4706-8ea3-99ce91e5200c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 21:54:03 crc kubenswrapper[4919]: I0310 21:54:03.619288 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-6nwch" Mar 10 21:54:03 crc kubenswrapper[4919]: I0310 21:54:03.651618 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 21:54:03 crc kubenswrapper[4919]: E0310 21:54:03.651988 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 21:54:04.151974003 +0000 UTC m=+231.393854611 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 21:54:03 crc kubenswrapper[4919]: I0310 21:54:03.669380 4919 patch_prober.go:28] interesting pod/router-default-5444994796-hmjhm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 10 21:54:03 crc kubenswrapper[4919]: [-]has-synced failed: reason withheld Mar 10 21:54:03 crc kubenswrapper[4919]: [+]process-running ok Mar 10 21:54:03 crc kubenswrapper[4919]: healthz check failed Mar 10 21:54:03 crc kubenswrapper[4919]: I0310 21:54:03.669460 4919 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-hmjhm" podUID="ca9a516e-afc2-4475-8af8-23504d17f9a9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 10 21:54:03 crc kubenswrapper[4919]: I0310 21:54:03.752683 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lxrsj\" (UID: \"8383a8d8-69ec-4706-8ea3-99ce91e5200c\") " pod="openshift-image-registry/image-registry-697d97f7c8-lxrsj" Mar 10 21:54:03 crc kubenswrapper[4919]: E0310 21:54:03.753323 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-10 21:54:04.253305606 +0000 UTC m=+231.495186214 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lxrsj" (UID: "8383a8d8-69ec-4706-8ea3-99ce91e5200c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 21:54:03 crc kubenswrapper[4919]: I0310 21:54:03.787234 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-pw22n"] Mar 10 21:54:03 crc kubenswrapper[4919]: I0310 21:54:03.788329 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-pw22n" Mar 10 21:54:03 crc kubenswrapper[4919]: I0310 21:54:03.792887 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 10 21:54:03 crc kubenswrapper[4919]: I0310 21:54:03.809312 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-pw22n"] Mar 10 21:54:03 crc kubenswrapper[4919]: I0310 21:54:03.860931 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 21:54:03 crc kubenswrapper[4919]: E0310 21:54:03.861110 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-10 21:54:04.361069495 +0000 UTC m=+231.602950103 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 21:54:03 crc kubenswrapper[4919]: I0310 21:54:03.861193 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lxrsj\" (UID: \"8383a8d8-69ec-4706-8ea3-99ce91e5200c\") " pod="openshift-image-registry/image-registry-697d97f7c8-lxrsj" Mar 10 21:54:03 crc kubenswrapper[4919]: I0310 21:54:03.861375 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ccd7b53d-726b-444f-be0f-4eb2655eb35d-utilities\") pod \"community-operators-pw22n\" (UID: \"ccd7b53d-726b-444f-be0f-4eb2655eb35d\") " pod="openshift-marketplace/community-operators-pw22n" Mar 10 21:54:03 crc kubenswrapper[4919]: I0310 21:54:03.861487 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ccd7b53d-726b-444f-be0f-4eb2655eb35d-catalog-content\") pod \"community-operators-pw22n\" (UID: \"ccd7b53d-726b-444f-be0f-4eb2655eb35d\") " pod="openshift-marketplace/community-operators-pw22n" Mar 10 21:54:03 crc kubenswrapper[4919]: I0310 21:54:03.861555 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-grznq\" (UniqueName: \"kubernetes.io/projected/ccd7b53d-726b-444f-be0f-4eb2655eb35d-kube-api-access-grznq\") pod \"community-operators-pw22n\" (UID: \"ccd7b53d-726b-444f-be0f-4eb2655eb35d\") " pod="openshift-marketplace/community-operators-pw22n" Mar 10 21:54:03 crc kubenswrapper[4919]: E0310 21:54:03.861759 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 21:54:04.361715562 +0000 UTC m=+231.603596170 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lxrsj" (UID: "8383a8d8-69ec-4706-8ea3-99ce91e5200c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 21:54:03 crc kubenswrapper[4919]: I0310 21:54:03.962216 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 21:54:03 crc kubenswrapper[4919]: I0310 21:54:03.962436 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-grznq\" (UniqueName: \"kubernetes.io/projected/ccd7b53d-726b-444f-be0f-4eb2655eb35d-kube-api-access-grznq\") pod \"community-operators-pw22n\" (UID: \"ccd7b53d-726b-444f-be0f-4eb2655eb35d\") " pod="openshift-marketplace/community-operators-pw22n" Mar 10 21:54:03 crc kubenswrapper[4919]: I0310 21:54:03.962544 4919 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ccd7b53d-726b-444f-be0f-4eb2655eb35d-utilities\") pod \"community-operators-pw22n\" (UID: \"ccd7b53d-726b-444f-be0f-4eb2655eb35d\") " pod="openshift-marketplace/community-operators-pw22n" Mar 10 21:54:03 crc kubenswrapper[4919]: I0310 21:54:03.962606 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ccd7b53d-726b-444f-be0f-4eb2655eb35d-catalog-content\") pod \"community-operators-pw22n\" (UID: \"ccd7b53d-726b-444f-be0f-4eb2655eb35d\") " pod="openshift-marketplace/community-operators-pw22n" Mar 10 21:54:03 crc kubenswrapper[4919]: E0310 21:54:03.962961 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 21:54:04.462944653 +0000 UTC m=+231.704825261 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 21:54:03 crc kubenswrapper[4919]: I0310 21:54:03.963532 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ccd7b53d-726b-444f-be0f-4eb2655eb35d-catalog-content\") pod \"community-operators-pw22n\" (UID: \"ccd7b53d-726b-444f-be0f-4eb2655eb35d\") " pod="openshift-marketplace/community-operators-pw22n" Mar 10 21:54:03 crc kubenswrapper[4919]: I0310 21:54:03.963629 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ccd7b53d-726b-444f-be0f-4eb2655eb35d-utilities\") pod \"community-operators-pw22n\" (UID: \"ccd7b53d-726b-444f-be0f-4eb2655eb35d\") " pod="openshift-marketplace/community-operators-pw22n" Mar 10 21:54:03 crc kubenswrapper[4919]: I0310 21:54:03.972341 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-s8qvz"] Mar 10 21:54:03 crc kubenswrapper[4919]: I0310 21:54:03.973244 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-s8qvz" Mar 10 21:54:03 crc kubenswrapper[4919]: I0310 21:54:03.983104 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 10 21:54:03 crc kubenswrapper[4919]: I0310 21:54:03.987584 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-s8qvz"] Mar 10 21:54:03 crc kubenswrapper[4919]: I0310 21:54:03.998653 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-grznq\" (UniqueName: \"kubernetes.io/projected/ccd7b53d-726b-444f-be0f-4eb2655eb35d-kube-api-access-grznq\") pod \"community-operators-pw22n\" (UID: \"ccd7b53d-726b-444f-be0f-4eb2655eb35d\") " pod="openshift-marketplace/community-operators-pw22n" Mar 10 21:54:04 crc kubenswrapper[4919]: I0310 21:54:04.070816 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cbtv9\" (UniqueName: \"kubernetes.io/projected/b8a6c263-cf9a-41f9-8ea0-fb07b0596a35-kube-api-access-cbtv9\") pod \"certified-operators-s8qvz\" (UID: \"b8a6c263-cf9a-41f9-8ea0-fb07b0596a35\") " pod="openshift-marketplace/certified-operators-s8qvz" Mar 10 21:54:04 crc kubenswrapper[4919]: I0310 21:54:04.070954 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8a6c263-cf9a-41f9-8ea0-fb07b0596a35-utilities\") pod \"certified-operators-s8qvz\" (UID: \"b8a6c263-cf9a-41f9-8ea0-fb07b0596a35\") " pod="openshift-marketplace/certified-operators-s8qvz" Mar 10 21:54:04 crc kubenswrapper[4919]: I0310 21:54:04.071019 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8a6c263-cf9a-41f9-8ea0-fb07b0596a35-catalog-content\") pod \"certified-operators-s8qvz\" (UID: 
\"b8a6c263-cf9a-41f9-8ea0-fb07b0596a35\") " pod="openshift-marketplace/certified-operators-s8qvz" Mar 10 21:54:04 crc kubenswrapper[4919]: I0310 21:54:04.071047 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lxrsj\" (UID: \"8383a8d8-69ec-4706-8ea3-99ce91e5200c\") " pod="openshift-image-registry/image-registry-697d97f7c8-lxrsj" Mar 10 21:54:04 crc kubenswrapper[4919]: E0310 21:54:04.071537 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 21:54:04.571518543 +0000 UTC m=+231.813399151 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lxrsj" (UID: "8383a8d8-69ec-4706-8ea3-99ce91e5200c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 21:54:04 crc kubenswrapper[4919]: I0310 21:54:04.139732 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-pw22n"
Mar 10 21:54:04 crc kubenswrapper[4919]: I0310 21:54:04.174053 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 21:54:04 crc kubenswrapper[4919]: I0310 21:54:04.174459 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8a6c263-cf9a-41f9-8ea0-fb07b0596a35-utilities\") pod \"certified-operators-s8qvz\" (UID: \"b8a6c263-cf9a-41f9-8ea0-fb07b0596a35\") " pod="openshift-marketplace/certified-operators-s8qvz"
Mar 10 21:54:04 crc kubenswrapper[4919]: I0310 21:54:04.174506 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8a6c263-cf9a-41f9-8ea0-fb07b0596a35-catalog-content\") pod \"certified-operators-s8qvz\" (UID: \"b8a6c263-cf9a-41f9-8ea0-fb07b0596a35\") " pod="openshift-marketplace/certified-operators-s8qvz"
Mar 10 21:54:04 crc kubenswrapper[4919]: I0310 21:54:04.174556 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cbtv9\" (UniqueName: \"kubernetes.io/projected/b8a6c263-cf9a-41f9-8ea0-fb07b0596a35-kube-api-access-cbtv9\") pod \"certified-operators-s8qvz\" (UID: \"b8a6c263-cf9a-41f9-8ea0-fb07b0596a35\") " pod="openshift-marketplace/certified-operators-s8qvz"
Mar 10 21:54:04 crc kubenswrapper[4919]: E0310 21:54:04.174875 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 21:54:04.674860892 +0000 UTC m=+231.916741500 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 21:54:04 crc kubenswrapper[4919]: I0310 21:54:04.175204 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8a6c263-cf9a-41f9-8ea0-fb07b0596a35-utilities\") pod \"certified-operators-s8qvz\" (UID: \"b8a6c263-cf9a-41f9-8ea0-fb07b0596a35\") " pod="openshift-marketplace/certified-operators-s8qvz"
Mar 10 21:54:04 crc kubenswrapper[4919]: I0310 21:54:04.175431 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8a6c263-cf9a-41f9-8ea0-fb07b0596a35-catalog-content\") pod \"certified-operators-s8qvz\" (UID: \"b8a6c263-cf9a-41f9-8ea0-fb07b0596a35\") " pod="openshift-marketplace/certified-operators-s8qvz"
Mar 10 21:54:04 crc kubenswrapper[4919]: I0310 21:54:04.182187 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-bx4lk"]
Mar 10 21:54:04 crc kubenswrapper[4919]: I0310 21:54:04.183102 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bx4lk"
Mar 10 21:54:04 crc kubenswrapper[4919]: I0310 21:54:04.195976 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bx4lk"]
Mar 10 21:54:04 crc kubenswrapper[4919]: I0310 21:54:04.199038 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cbtv9\" (UniqueName: \"kubernetes.io/projected/b8a6c263-cf9a-41f9-8ea0-fb07b0596a35-kube-api-access-cbtv9\") pod \"certified-operators-s8qvz\" (UID: \"b8a6c263-cf9a-41f9-8ea0-fb07b0596a35\") " pod="openshift-marketplace/certified-operators-s8qvz"
Mar 10 21:54:04 crc kubenswrapper[4919]: I0310 21:54:04.245255 4919 ???:1] "http: TLS handshake error from 192.168.126.11:58574: no serving certificate available for the kubelet"
Mar 10 21:54:04 crc kubenswrapper[4919]: I0310 21:54:04.267945 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qrdwk"
Mar 10 21:54:04 crc kubenswrapper[4919]: I0310 21:54:04.275710 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb39d6ee-8d9d-4cd6-811a-fbbb8d7e018c-catalog-content\") pod \"community-operators-bx4lk\" (UID: \"fb39d6ee-8d9d-4cd6-811a-fbbb8d7e018c\") " pod="openshift-marketplace/community-operators-bx4lk"
Mar 10 21:54:04 crc kubenswrapper[4919]: I0310 21:54:04.275793 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb39d6ee-8d9d-4cd6-811a-fbbb8d7e018c-utilities\") pod \"community-operators-bx4lk\" (UID: \"fb39d6ee-8d9d-4cd6-811a-fbbb8d7e018c\") " pod="openshift-marketplace/community-operators-bx4lk"
Mar 10 21:54:04 crc kubenswrapper[4919]: I0310 21:54:04.275825 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lxrsj\" (UID: \"8383a8d8-69ec-4706-8ea3-99ce91e5200c\") " pod="openshift-image-registry/image-registry-697d97f7c8-lxrsj"
Mar 10 21:54:04 crc kubenswrapper[4919]: I0310 21:54:04.276008 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4g7mn\" (UniqueName: \"kubernetes.io/projected/fb39d6ee-8d9d-4cd6-811a-fbbb8d7e018c-kube-api-access-4g7mn\") pod \"community-operators-bx4lk\" (UID: \"fb39d6ee-8d9d-4cd6-811a-fbbb8d7e018c\") " pod="openshift-marketplace/community-operators-bx4lk"
Mar 10 21:54:04 crc kubenswrapper[4919]: E0310 21:54:04.280556 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 21:54:04.780535614 +0000 UTC m=+232.022416212 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lxrsj" (UID: "8383a8d8-69ec-4706-8ea3-99ce91e5200c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 21:54:04 crc kubenswrapper[4919]: I0310 21:54:04.285377 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-s8qvz"
Mar 10 21:54:04 crc kubenswrapper[4919]: I0310 21:54:04.297707 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-6nwch"
Mar 10 21:54:04 crc kubenswrapper[4919]: I0310 21:54:04.381108 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 21:54:04 crc kubenswrapper[4919]: I0310 21:54:04.381279 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb39d6ee-8d9d-4cd6-811a-fbbb8d7e018c-catalog-content\") pod \"community-operators-bx4lk\" (UID: \"fb39d6ee-8d9d-4cd6-811a-fbbb8d7e018c\") " pod="openshift-marketplace/community-operators-bx4lk"
Mar 10 21:54:04 crc kubenswrapper[4919]: I0310 21:54:04.381612 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb39d6ee-8d9d-4cd6-811a-fbbb8d7e018c-utilities\") pod \"community-operators-bx4lk\" (UID: \"fb39d6ee-8d9d-4cd6-811a-fbbb8d7e018c\") " pod="openshift-marketplace/community-operators-bx4lk"
Mar 10 21:54:04 crc kubenswrapper[4919]: I0310 21:54:04.381671 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4g7mn\" (UniqueName: \"kubernetes.io/projected/fb39d6ee-8d9d-4cd6-811a-fbbb8d7e018c-kube-api-access-4g7mn\") pod \"community-operators-bx4lk\" (UID: \"fb39d6ee-8d9d-4cd6-811a-fbbb8d7e018c\") " pod="openshift-marketplace/community-operators-bx4lk"
Mar 10 21:54:04 crc kubenswrapper[4919]: E0310 21:54:04.382053 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 21:54:04.882038281 +0000 UTC m=+232.123918889 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 21:54:04 crc kubenswrapper[4919]: I0310 21:54:04.382727 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb39d6ee-8d9d-4cd6-811a-fbbb8d7e018c-catalog-content\") pod \"community-operators-bx4lk\" (UID: \"fb39d6ee-8d9d-4cd6-811a-fbbb8d7e018c\") " pod="openshift-marketplace/community-operators-bx4lk"
Mar 10 21:54:04 crc kubenswrapper[4919]: I0310 21:54:04.382921 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb39d6ee-8d9d-4cd6-811a-fbbb8d7e018c-utilities\") pod \"community-operators-bx4lk\" (UID: \"fb39d6ee-8d9d-4cd6-811a-fbbb8d7e018c\") " pod="openshift-marketplace/community-operators-bx4lk"
Mar 10 21:54:04 crc kubenswrapper[4919]: I0310 21:54:04.411547 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-74hb6"]
Mar 10 21:54:04 crc kubenswrapper[4919]: I0310 21:54:04.448820 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4g7mn\" (UniqueName: \"kubernetes.io/projected/fb39d6ee-8d9d-4cd6-811a-fbbb8d7e018c-kube-api-access-4g7mn\") pod \"community-operators-bx4lk\" (UID: \"fb39d6ee-8d9d-4cd6-811a-fbbb8d7e018c\") " pod="openshift-marketplace/community-operators-bx4lk"
Mar 10 21:54:04 crc kubenswrapper[4919]: I0310 21:54:04.449087 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-74hb6"
Mar 10 21:54:04 crc kubenswrapper[4919]: I0310 21:54:04.488555 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a979e53c-0904-4fc0-9ef4-16706a351785-utilities\") pod \"certified-operators-74hb6\" (UID: \"a979e53c-0904-4fc0-9ef4-16706a351785\") " pod="openshift-marketplace/certified-operators-74hb6"
Mar 10 21:54:04 crc kubenswrapper[4919]: I0310 21:54:04.488883 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lxrsj\" (UID: \"8383a8d8-69ec-4706-8ea3-99ce91e5200c\") " pod="openshift-image-registry/image-registry-697d97f7c8-lxrsj"
Mar 10 21:54:04 crc kubenswrapper[4919]: I0310 21:54:04.489258 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a979e53c-0904-4fc0-9ef4-16706a351785-catalog-content\") pod \"certified-operators-74hb6\" (UID: \"a979e53c-0904-4fc0-9ef4-16706a351785\") " pod="openshift-marketplace/certified-operators-74hb6"
Mar 10 21:54:04 crc kubenswrapper[4919]: I0310 21:54:04.489402 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sz2bm\" (UniqueName: \"kubernetes.io/projected/a979e53c-0904-4fc0-9ef4-16706a351785-kube-api-access-sz2bm\") pod \"certified-operators-74hb6\" (UID: \"a979e53c-0904-4fc0-9ef4-16706a351785\") " pod="openshift-marketplace/certified-operators-74hb6"
Mar 10 21:54:04 crc kubenswrapper[4919]: E0310 21:54:04.489905 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 21:54:04.989885752 +0000 UTC m=+232.231766360 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lxrsj" (UID: "8383a8d8-69ec-4706-8ea3-99ce91e5200c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 21:54:04 crc kubenswrapper[4919]: I0310 21:54:04.515776 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-74hb6"]
Mar 10 21:54:04 crc kubenswrapper[4919]: I0310 21:54:04.568599 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bx4lk"
Mar 10 21:54:04 crc kubenswrapper[4919]: I0310 21:54:04.594900 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 21:54:04 crc kubenswrapper[4919]: I0310 21:54:04.595060 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a979e53c-0904-4fc0-9ef4-16706a351785-utilities\") pod \"certified-operators-74hb6\" (UID: \"a979e53c-0904-4fc0-9ef4-16706a351785\") " pod="openshift-marketplace/certified-operators-74hb6"
Mar 10 21:54:04 crc kubenswrapper[4919]: I0310 21:54:04.595119 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a979e53c-0904-4fc0-9ef4-16706a351785-catalog-content\") pod \"certified-operators-74hb6\" (UID: \"a979e53c-0904-4fc0-9ef4-16706a351785\") " pod="openshift-marketplace/certified-operators-74hb6"
Mar 10 21:54:04 crc kubenswrapper[4919]: I0310 21:54:04.595162 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sz2bm\" (UniqueName: \"kubernetes.io/projected/a979e53c-0904-4fc0-9ef4-16706a351785-kube-api-access-sz2bm\") pod \"certified-operators-74hb6\" (UID: \"a979e53c-0904-4fc0-9ef4-16706a351785\") " pod="openshift-marketplace/certified-operators-74hb6"
Mar 10 21:54:04 crc kubenswrapper[4919]: E0310 21:54:04.597943 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 21:54:05.097925488 +0000 UTC m=+232.339806096 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 21:54:04 crc kubenswrapper[4919]: I0310 21:54:04.598274 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a979e53c-0904-4fc0-9ef4-16706a351785-utilities\") pod \"certified-operators-74hb6\" (UID: \"a979e53c-0904-4fc0-9ef4-16706a351785\") " pod="openshift-marketplace/certified-operators-74hb6"
Mar 10 21:54:04 crc kubenswrapper[4919]: I0310 21:54:04.599926 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a979e53c-0904-4fc0-9ef4-16706a351785-catalog-content\") pod \"certified-operators-74hb6\" (UID: \"a979e53c-0904-4fc0-9ef4-16706a351785\") " pod="openshift-marketplace/certified-operators-74hb6"
Mar 10 21:54:04 crc kubenswrapper[4919]: I0310 21:54:04.634675 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sz2bm\" (UniqueName: \"kubernetes.io/projected/a979e53c-0904-4fc0-9ef4-16706a351785-kube-api-access-sz2bm\") pod \"certified-operators-74hb6\" (UID: \"a979e53c-0904-4fc0-9ef4-16706a351785\") " pod="openshift-marketplace/certified-operators-74hb6"
Mar 10 21:54:04 crc kubenswrapper[4919]: I0310 21:54:04.674562 4919 patch_prober.go:28] interesting pod/router-default-5444994796-hmjhm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 10 21:54:04 crc kubenswrapper[4919]: [-]has-synced failed: reason withheld
Mar 10 21:54:04 crc kubenswrapper[4919]: [+]process-running ok
Mar 10 21:54:04 crc kubenswrapper[4919]: healthz check failed
Mar 10 21:54:04 crc kubenswrapper[4919]: I0310 21:54:04.674619 4919 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-hmjhm" podUID="ca9a516e-afc2-4475-8af8-23504d17f9a9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 10 21:54:04 crc kubenswrapper[4919]: I0310 21:54:04.696022 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lxrsj\" (UID: \"8383a8d8-69ec-4706-8ea3-99ce91e5200c\") " pod="openshift-image-registry/image-registry-697d97f7c8-lxrsj"
Mar 10 21:54:04 crc kubenswrapper[4919]: E0310 21:54:04.696690 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 21:54:05.196671601 +0000 UTC m=+232.438552209 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lxrsj" (UID: "8383a8d8-69ec-4706-8ea3-99ce91e5200c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 21:54:04 crc kubenswrapper[4919]: I0310 21:54:04.797223 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 21:54:04 crc kubenswrapper[4919]: E0310 21:54:04.797649 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 21:54:05.297631365 +0000 UTC m=+232.539511973 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 21:54:04 crc kubenswrapper[4919]: I0310 21:54:04.893787 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-74hb6"
Mar 10 21:54:04 crc kubenswrapper[4919]: I0310 21:54:04.899016 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lxrsj\" (UID: \"8383a8d8-69ec-4706-8ea3-99ce91e5200c\") " pod="openshift-image-registry/image-registry-697d97f7c8-lxrsj"
Mar 10 21:54:04 crc kubenswrapper[4919]: E0310 21:54:04.899360 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 21:54:05.399347609 +0000 UTC m=+232.641228217 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lxrsj" (UID: "8383a8d8-69ec-4706-8ea3-99ce91e5200c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 21:54:04 crc kubenswrapper[4919]: I0310 21:54:04.913439 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-pw22n"]
Mar 10 21:54:04 crc kubenswrapper[4919]: I0310 21:54:04.928560 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-zbds9"]
Mar 10 21:54:04 crc kubenswrapper[4919]: I0310 21:54:04.928787 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-zbds9" podUID="d809a4c1-5e06-46b7-a39c-466b694361ce" containerName="controller-manager" containerID="cri-o://7daa24c1b8a8fe8086f04d6485368449fdbcfc28d3dd69f60d84d54068cb501f" gracePeriod=30
Mar 10 21:54:04 crc kubenswrapper[4919]: I0310 21:54:04.944073 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-ls7k6"]
Mar 10 21:54:04 crc kubenswrapper[4919]: I0310 21:54:04.944791 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ls7k6" podUID="492e2f21-90e0-4073-b4ad-b562bcf62486" containerName="route-controller-manager" containerID="cri-o://82736eb7c81aad2652a4c08aa003a166ebbe6309ec0b3d4bce360792e944093d" gracePeriod=30
Mar 10 21:54:05 crc kubenswrapper[4919]: E0310 21:54:05.002076 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 21:54:05.50205926 +0000 UTC m=+232.743939868 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 21:54:05 crc kubenswrapper[4919]: I0310 21:54:05.001811 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 21:54:05 crc kubenswrapper[4919]: I0310 21:54:05.004236 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lxrsj\" (UID: \"8383a8d8-69ec-4706-8ea3-99ce91e5200c\") " pod="openshift-image-registry/image-registry-697d97f7c8-lxrsj"
Mar 10 21:54:05 crc kubenswrapper[4919]: E0310 21:54:05.005083 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 21:54:05.505072752 +0000 UTC m=+232.746953360 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lxrsj" (UID: "8383a8d8-69ec-4706-8ea3-99ce91e5200c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 21:54:05 crc kubenswrapper[4919]: I0310 21:54:05.051383 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bx4lk"]
Mar 10 21:54:05 crc kubenswrapper[4919]: I0310 21:54:05.057654 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-s8qvz"]
Mar 10 21:54:05 crc kubenswrapper[4919]: I0310 21:54:05.118148 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 21:54:05 crc kubenswrapper[4919]: E0310 21:54:05.118492 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 21:54:05.618466163 +0000 UTC m=+232.860346771 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 21:54:05 crc kubenswrapper[4919]: I0310 21:54:05.202479 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552985-n8r4j"
Mar 10 21:54:05 crc kubenswrapper[4919]: I0310 21:54:05.222221 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lxrsj\" (UID: \"8383a8d8-69ec-4706-8ea3-99ce91e5200c\") " pod="openshift-image-registry/image-registry-697d97f7c8-lxrsj"
Mar 10 21:54:05 crc kubenswrapper[4919]: E0310 21:54:05.222536 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 21:54:05.7225231 +0000 UTC m=+232.964403708 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lxrsj" (UID: "8383a8d8-69ec-4706-8ea3-99ce91e5200c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 21:54:05 crc kubenswrapper[4919]: I0310 21:54:05.268913 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pw22n" event={"ID":"ccd7b53d-726b-444f-be0f-4eb2655eb35d","Type":"ContainerStarted","Data":"f4a975618ed9f405ef7cc99d497ec3e1aaee50b9ea6a3215811b178c30828a92"}
Mar 10 21:54:05 crc kubenswrapper[4919]: I0310 21:54:05.271028 4919 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock"
Mar 10 21:54:05 crc kubenswrapper[4919]: I0310 21:54:05.271060 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bx4lk" event={"ID":"fb39d6ee-8d9d-4cd6-811a-fbbb8d7e018c","Type":"ContainerStarted","Data":"2431ebd0434b16faef3c5e6f004040d9545a75d04c022bbfb7669d4e1432d044"}
Mar 10 21:54:05 crc kubenswrapper[4919]: I0310 21:54:05.272893 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552985-n8r4j" event={"ID":"55ff2223-69b6-4b72-9413-fce0c37ae2b2","Type":"ContainerDied","Data":"d9950d42990d6edde09e55358f5f86128c06d97f8ca78c8a2b63d88cf90f7d6d"}
Mar 10 21:54:05 crc kubenswrapper[4919]: I0310 21:54:05.272932 4919 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d9950d42990d6edde09e55358f5f86128c06d97f8ca78c8a2b63d88cf90f7d6d"
Mar 10 21:54:05 crc kubenswrapper[4919]: I0310 21:54:05.273050 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552985-n8r4j"
Mar 10 21:54:05 crc kubenswrapper[4919]: I0310 21:54:05.276092 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-nv6qp" event={"ID":"62cf0e58-2480-43ff-a9aa-f8543fefd9f9","Type":"ContainerStarted","Data":"80d189c910a36c1af04fc387721a23536a519842ab7da8e0a2e9c7bfcfeb402c"}
Mar 10 21:54:05 crc kubenswrapper[4919]: I0310 21:54:05.277526 4919 generic.go:334] "Generic (PLEG): container finished" podID="d809a4c1-5e06-46b7-a39c-466b694361ce" containerID="7daa24c1b8a8fe8086f04d6485368449fdbcfc28d3dd69f60d84d54068cb501f" exitCode=0
Mar 10 21:54:05 crc kubenswrapper[4919]: I0310 21:54:05.277582 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-zbds9" event={"ID":"d809a4c1-5e06-46b7-a39c-466b694361ce","Type":"ContainerDied","Data":"7daa24c1b8a8fe8086f04d6485368449fdbcfc28d3dd69f60d84d54068cb501f"}
Mar 10 21:54:05 crc kubenswrapper[4919]: I0310 21:54:05.278758 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s8qvz" event={"ID":"b8a6c263-cf9a-41f9-8ea0-fb07b0596a35","Type":"ContainerStarted","Data":"4fa9169080c49963d92bfa35ee923d40cf27f01251463c2e09e01ccb6c22d580"}
Mar 10 21:54:05 crc kubenswrapper[4919]: I0310 21:54:05.280476 4919 generic.go:334] "Generic (PLEG): container finished" podID="492e2f21-90e0-4073-b4ad-b562bcf62486" containerID="82736eb7c81aad2652a4c08aa003a166ebbe6309ec0b3d4bce360792e944093d" exitCode=0
Mar 10 21:54:05 crc kubenswrapper[4919]: I0310 21:54:05.280984 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ls7k6" event={"ID":"492e2f21-90e0-4073-b4ad-b562bcf62486","Type":"ContainerDied","Data":"82736eb7c81aad2652a4c08aa003a166ebbe6309ec0b3d4bce360792e944093d"}
Mar 10 21:54:05 crc kubenswrapper[4919]: I0310 21:54:05.322876 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/55ff2223-69b6-4b72-9413-fce0c37ae2b2-config-volume\") pod \"55ff2223-69b6-4b72-9413-fce0c37ae2b2\" (UID: \"55ff2223-69b6-4b72-9413-fce0c37ae2b2\") "
Mar 10 21:54:05 crc kubenswrapper[4919]: I0310 21:54:05.323082 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/55ff2223-69b6-4b72-9413-fce0c37ae2b2-secret-volume\") pod \"55ff2223-69b6-4b72-9413-fce0c37ae2b2\" (UID: \"55ff2223-69b6-4b72-9413-fce0c37ae2b2\") "
Mar 10 21:54:05 crc kubenswrapper[4919]: I0310 21:54:05.323211 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 21:54:05 crc kubenswrapper[4919]: I0310 21:54:05.323247 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-58cpc\" (UniqueName: \"kubernetes.io/projected/55ff2223-69b6-4b72-9413-fce0c37ae2b2-kube-api-access-58cpc\") pod \"55ff2223-69b6-4b72-9413-fce0c37ae2b2\" (UID: \"55ff2223-69b6-4b72-9413-fce0c37ae2b2\") "
Mar 10 21:54:05 crc kubenswrapper[4919]: I0310 21:54:05.323924 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/55ff2223-69b6-4b72-9413-fce0c37ae2b2-config-volume" (OuterVolumeSpecName: "config-volume") pod "55ff2223-69b6-4b72-9413-fce0c37ae2b2" (UID: "55ff2223-69b6-4b72-9413-fce0c37ae2b2"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 21:54:05 crc kubenswrapper[4919]: E0310 21:54:05.325834 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 21:54:05.825818237 +0000 UTC m=+233.067698845 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 21:54:05 crc kubenswrapper[4919]: I0310 21:54:05.342268 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55ff2223-69b6-4b72-9413-fce0c37ae2b2-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "55ff2223-69b6-4b72-9413-fce0c37ae2b2" (UID: "55ff2223-69b6-4b72-9413-fce0c37ae2b2"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 21:54:05 crc kubenswrapper[4919]: I0310 21:54:05.347043 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55ff2223-69b6-4b72-9413-fce0c37ae2b2-kube-api-access-58cpc" (OuterVolumeSpecName: "kube-api-access-58cpc") pod "55ff2223-69b6-4b72-9413-fce0c37ae2b2" (UID: "55ff2223-69b6-4b72-9413-fce0c37ae2b2"). InnerVolumeSpecName "kube-api-access-58cpc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 21:54:05 crc kubenswrapper[4919]: I0310 21:54:05.426002 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lxrsj\" (UID: \"8383a8d8-69ec-4706-8ea3-99ce91e5200c\") " pod="openshift-image-registry/image-registry-697d97f7c8-lxrsj"
Mar 10 21:54:05 crc kubenswrapper[4919]: I0310 21:54:05.426203 4919 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/55ff2223-69b6-4b72-9413-fce0c37ae2b2-config-volume\") on node \"crc\" DevicePath \"\""
Mar 10 21:54:05 crc kubenswrapper[4919]: I0310 21:54:05.426222 4919 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/55ff2223-69b6-4b72-9413-fce0c37ae2b2-secret-volume\") on node \"crc\" DevicePath \"\""
Mar 10 21:54:05 crc kubenswrapper[4919]: I0310 21:54:05.426235 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-58cpc\" (UniqueName: \"kubernetes.io/projected/55ff2223-69b6-4b72-9413-fce0c37ae2b2-kube-api-access-58cpc\") on node \"crc\" DevicePath \"\""
Mar 10 21:54:05 crc kubenswrapper[4919]: E0310 21:54:05.426717 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 21:54:05.926699069 +0000 UTC m=+233.168579677 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lxrsj" (UID: "8383a8d8-69ec-4706-8ea3-99ce91e5200c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 21:54:05 crc kubenswrapper[4919]: I0310 21:54:05.479201 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-74hb6"]
Mar 10 21:54:05 crc kubenswrapper[4919]: W0310 21:54:05.513982 4919 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda979e53c_0904_4fc0_9ef4_16706a351785.slice/crio-b33a730df7ddb558fae74044065c6398a2adf5d959ddeb4b1104ee7d1c126c8f WatchSource:0}: Error finding container b33a730df7ddb558fae74044065c6398a2adf5d959ddeb4b1104ee7d1c126c8f: Status 404 returned error can't find the container with id b33a730df7ddb558fae74044065c6398a2adf5d959ddeb4b1104ee7d1c126c8f
Mar 10 21:54:05 crc kubenswrapper[4919]: I0310 21:54:05.523027 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-zbds9"
Mar 10 21:54:05 crc kubenswrapper[4919]: I0310 21:54:05.526262 4919 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ls7k6" Mar 10 21:54:05 crc kubenswrapper[4919]: I0310 21:54:05.526650 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 21:54:05 crc kubenswrapper[4919]: E0310 21:54:05.527230 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 21:54:06.027178139 +0000 UTC m=+233.269058937 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 21:54:05 crc kubenswrapper[4919]: E0310 21:54:05.527798 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 21:54:06.027778255 +0000 UTC m=+233.269658863 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lxrsj" (UID: "8383a8d8-69ec-4706-8ea3-99ce91e5200c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 21:54:05 crc kubenswrapper[4919]: I0310 21:54:05.529456 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lxrsj\" (UID: \"8383a8d8-69ec-4706-8ea3-99ce91e5200c\") " pod="openshift-image-registry/image-registry-697d97f7c8-lxrsj" Mar 10 21:54:05 crc kubenswrapper[4919]: I0310 21:54:05.553473 4919 ???:1] "http: TLS handshake error from 192.168.126.11:58578: no serving certificate available for the kubelet" Mar 10 21:54:05 crc kubenswrapper[4919]: I0310 21:54:05.632863 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d809a4c1-5e06-46b7-a39c-466b694361ce-client-ca\") pod \"d809a4c1-5e06-46b7-a39c-466b694361ce\" (UID: \"d809a4c1-5e06-46b7-a39c-466b694361ce\") " Mar 10 21:54:05 crc kubenswrapper[4919]: I0310 21:54:05.633230 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/492e2f21-90e0-4073-b4ad-b562bcf62486-client-ca\") pod \"492e2f21-90e0-4073-b4ad-b562bcf62486\" (UID: \"492e2f21-90e0-4073-b4ad-b562bcf62486\") " Mar 10 21:54:05 crc kubenswrapper[4919]: I0310 21:54:05.633255 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/d809a4c1-5e06-46b7-a39c-466b694361ce-serving-cert\") pod \"d809a4c1-5e06-46b7-a39c-466b694361ce\" (UID: \"d809a4c1-5e06-46b7-a39c-466b694361ce\") " Mar 10 21:54:05 crc kubenswrapper[4919]: I0310 21:54:05.633306 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d809a4c1-5e06-46b7-a39c-466b694361ce-config\") pod \"d809a4c1-5e06-46b7-a39c-466b694361ce\" (UID: \"d809a4c1-5e06-46b7-a39c-466b694361ce\") " Mar 10 21:54:05 crc kubenswrapper[4919]: I0310 21:54:05.633360 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ckbrj\" (UniqueName: \"kubernetes.io/projected/d809a4c1-5e06-46b7-a39c-466b694361ce-kube-api-access-ckbrj\") pod \"d809a4c1-5e06-46b7-a39c-466b694361ce\" (UID: \"d809a4c1-5e06-46b7-a39c-466b694361ce\") " Mar 10 21:54:05 crc kubenswrapper[4919]: I0310 21:54:05.633383 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-thbnh\" (UniqueName: \"kubernetes.io/projected/492e2f21-90e0-4073-b4ad-b562bcf62486-kube-api-access-thbnh\") pod \"492e2f21-90e0-4073-b4ad-b562bcf62486\" (UID: \"492e2f21-90e0-4073-b4ad-b562bcf62486\") " Mar 10 21:54:05 crc kubenswrapper[4919]: I0310 21:54:05.633731 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d809a4c1-5e06-46b7-a39c-466b694361ce-client-ca" (OuterVolumeSpecName: "client-ca") pod "d809a4c1-5e06-46b7-a39c-466b694361ce" (UID: "d809a4c1-5e06-46b7-a39c-466b694361ce"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 21:54:05 crc kubenswrapper[4919]: I0310 21:54:05.634160 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/492e2f21-90e0-4073-b4ad-b562bcf62486-config\") pod \"492e2f21-90e0-4073-b4ad-b562bcf62486\" (UID: \"492e2f21-90e0-4073-b4ad-b562bcf62486\") " Mar 10 21:54:05 crc kubenswrapper[4919]: I0310 21:54:05.634241 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d809a4c1-5e06-46b7-a39c-466b694361ce-proxy-ca-bundles\") pod \"d809a4c1-5e06-46b7-a39c-466b694361ce\" (UID: \"d809a4c1-5e06-46b7-a39c-466b694361ce\") " Mar 10 21:54:05 crc kubenswrapper[4919]: I0310 21:54:05.634285 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/492e2f21-90e0-4073-b4ad-b562bcf62486-serving-cert\") pod \"492e2f21-90e0-4073-b4ad-b562bcf62486\" (UID: \"492e2f21-90e0-4073-b4ad-b562bcf62486\") " Mar 10 21:54:05 crc kubenswrapper[4919]: I0310 21:54:05.634387 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 21:54:05 crc kubenswrapper[4919]: I0310 21:54:05.634378 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d809a4c1-5e06-46b7-a39c-466b694361ce-config" (OuterVolumeSpecName: "config") pod "d809a4c1-5e06-46b7-a39c-466b694361ce" (UID: "d809a4c1-5e06-46b7-a39c-466b694361ce"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 21:54:05 crc kubenswrapper[4919]: I0310 21:54:05.634997 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d809a4c1-5e06-46b7-a39c-466b694361ce-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "d809a4c1-5e06-46b7-a39c-466b694361ce" (UID: "d809a4c1-5e06-46b7-a39c-466b694361ce"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 21:54:05 crc kubenswrapper[4919]: E0310 21:54:05.634927 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 21:54:06.134905806 +0000 UTC m=+233.376786414 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 21:54:05 crc kubenswrapper[4919]: I0310 21:54:05.635637 4919 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d809a4c1-5e06-46b7-a39c-466b694361ce-config\") on node \"crc\" DevicePath \"\"" Mar 10 21:54:05 crc kubenswrapper[4919]: I0310 21:54:05.635650 4919 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d809a4c1-5e06-46b7-a39c-466b694361ce-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 10 21:54:05 crc kubenswrapper[4919]: I0310 21:54:05.635660 4919 reconciler_common.go:293] "Volume detached for volume 
\"client-ca\" (UniqueName: \"kubernetes.io/configmap/d809a4c1-5e06-46b7-a39c-466b694361ce-client-ca\") on node \"crc\" DevicePath \"\"" Mar 10 21:54:05 crc kubenswrapper[4919]: I0310 21:54:05.635689 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/492e2f21-90e0-4073-b4ad-b562bcf62486-config" (OuterVolumeSpecName: "config") pod "492e2f21-90e0-4073-b4ad-b562bcf62486" (UID: "492e2f21-90e0-4073-b4ad-b562bcf62486"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 21:54:05 crc kubenswrapper[4919]: I0310 21:54:05.635381 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/492e2f21-90e0-4073-b4ad-b562bcf62486-client-ca" (OuterVolumeSpecName: "client-ca") pod "492e2f21-90e0-4073-b4ad-b562bcf62486" (UID: "492e2f21-90e0-4073-b4ad-b562bcf62486"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 21:54:05 crc kubenswrapper[4919]: I0310 21:54:05.642288 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/492e2f21-90e0-4073-b4ad-b562bcf62486-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "492e2f21-90e0-4073-b4ad-b562bcf62486" (UID: "492e2f21-90e0-4073-b4ad-b562bcf62486"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 21:54:05 crc kubenswrapper[4919]: I0310 21:54:05.642665 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d809a4c1-5e06-46b7-a39c-466b694361ce-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "d809a4c1-5e06-46b7-a39c-466b694361ce" (UID: "d809a4c1-5e06-46b7-a39c-466b694361ce"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 21:54:05 crc kubenswrapper[4919]: I0310 21:54:05.644028 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d809a4c1-5e06-46b7-a39c-466b694361ce-kube-api-access-ckbrj" (OuterVolumeSpecName: "kube-api-access-ckbrj") pod "d809a4c1-5e06-46b7-a39c-466b694361ce" (UID: "d809a4c1-5e06-46b7-a39c-466b694361ce"). InnerVolumeSpecName "kube-api-access-ckbrj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 21:54:05 crc kubenswrapper[4919]: I0310 21:54:05.646135 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/492e2f21-90e0-4073-b4ad-b562bcf62486-kube-api-access-thbnh" (OuterVolumeSpecName: "kube-api-access-thbnh") pod "492e2f21-90e0-4073-b4ad-b562bcf62486" (UID: "492e2f21-90e0-4073-b4ad-b562bcf62486"). InnerVolumeSpecName "kube-api-access-thbnh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 21:54:05 crc kubenswrapper[4919]: I0310 21:54:05.668485 4919 patch_prober.go:28] interesting pod/router-default-5444994796-hmjhm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 10 21:54:05 crc kubenswrapper[4919]: [-]has-synced failed: reason withheld Mar 10 21:54:05 crc kubenswrapper[4919]: [+]process-running ok Mar 10 21:54:05 crc kubenswrapper[4919]: healthz check failed Mar 10 21:54:05 crc kubenswrapper[4919]: I0310 21:54:05.668535 4919 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-hmjhm" podUID="ca9a516e-afc2-4475-8af8-23504d17f9a9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 10 21:54:05 crc kubenswrapper[4919]: I0310 21:54:05.737855 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lxrsj\" (UID: \"8383a8d8-69ec-4706-8ea3-99ce91e5200c\") " pod="openshift-image-registry/image-registry-697d97f7c8-lxrsj" Mar 10 21:54:05 crc kubenswrapper[4919]: I0310 21:54:05.738151 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ckbrj\" (UniqueName: \"kubernetes.io/projected/d809a4c1-5e06-46b7-a39c-466b694361ce-kube-api-access-ckbrj\") on node \"crc\" DevicePath \"\"" Mar 10 21:54:05 crc kubenswrapper[4919]: I0310 21:54:05.738198 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-thbnh\" (UniqueName: \"kubernetes.io/projected/492e2f21-90e0-4073-b4ad-b562bcf62486-kube-api-access-thbnh\") on node \"crc\" DevicePath \"\"" Mar 10 21:54:05 crc kubenswrapper[4919]: I0310 21:54:05.738231 4919 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/492e2f21-90e0-4073-b4ad-b562bcf62486-config\") on node \"crc\" DevicePath \"\"" Mar 10 21:54:05 crc kubenswrapper[4919]: E0310 21:54:05.738247 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 21:54:06.238230884 +0000 UTC m=+233.480111492 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lxrsj" (UID: "8383a8d8-69ec-4706-8ea3-99ce91e5200c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 21:54:05 crc kubenswrapper[4919]: I0310 21:54:05.738284 4919 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/492e2f21-90e0-4073-b4ad-b562bcf62486-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 21:54:05 crc kubenswrapper[4919]: I0310 21:54:05.738298 4919 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/492e2f21-90e0-4073-b4ad-b562bcf62486-client-ca\") on node \"crc\" DevicePath \"\"" Mar 10 21:54:05 crc kubenswrapper[4919]: I0310 21:54:05.738307 4919 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d809a4c1-5e06-46b7-a39c-466b694361ce-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 21:54:05 crc kubenswrapper[4919]: I0310 21:54:05.775916 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-lnq5q"] Mar 10 21:54:05 crc kubenswrapper[4919]: E0310 21:54:05.776377 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="492e2f21-90e0-4073-b4ad-b562bcf62486" containerName="route-controller-manager" Mar 10 21:54:05 crc kubenswrapper[4919]: I0310 21:54:05.776521 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="492e2f21-90e0-4073-b4ad-b562bcf62486" containerName="route-controller-manager" Mar 10 21:54:05 crc kubenswrapper[4919]: E0310 21:54:05.776583 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d809a4c1-5e06-46b7-a39c-466b694361ce" 
containerName="controller-manager" Mar 10 21:54:05 crc kubenswrapper[4919]: I0310 21:54:05.776646 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="d809a4c1-5e06-46b7-a39c-466b694361ce" containerName="controller-manager" Mar 10 21:54:05 crc kubenswrapper[4919]: E0310 21:54:05.776699 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55ff2223-69b6-4b72-9413-fce0c37ae2b2" containerName="collect-profiles" Mar 10 21:54:05 crc kubenswrapper[4919]: I0310 21:54:05.776747 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="55ff2223-69b6-4b72-9413-fce0c37ae2b2" containerName="collect-profiles" Mar 10 21:54:05 crc kubenswrapper[4919]: I0310 21:54:05.776898 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="492e2f21-90e0-4073-b4ad-b562bcf62486" containerName="route-controller-manager" Mar 10 21:54:05 crc kubenswrapper[4919]: I0310 21:54:05.776971 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="d809a4c1-5e06-46b7-a39c-466b694361ce" containerName="controller-manager" Mar 10 21:54:05 crc kubenswrapper[4919]: I0310 21:54:05.777032 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="55ff2223-69b6-4b72-9413-fce0c37ae2b2" containerName="collect-profiles" Mar 10 21:54:05 crc kubenswrapper[4919]: I0310 21:54:05.777787 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lnq5q" Mar 10 21:54:05 crc kubenswrapper[4919]: I0310 21:54:05.784612 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-lnq5q"] Mar 10 21:54:05 crc kubenswrapper[4919]: I0310 21:54:05.785186 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 10 21:54:05 crc kubenswrapper[4919]: I0310 21:54:05.839630 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 21:54:05 crc kubenswrapper[4919]: I0310 21:54:05.839995 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88pcz\" (UniqueName: \"kubernetes.io/projected/df08dbe0-09b0-4d23-b99b-95b65818f84e-kube-api-access-88pcz\") pod \"redhat-marketplace-lnq5q\" (UID: \"df08dbe0-09b0-4d23-b99b-95b65818f84e\") " pod="openshift-marketplace/redhat-marketplace-lnq5q" Mar 10 21:54:05 crc kubenswrapper[4919]: I0310 21:54:05.840067 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df08dbe0-09b0-4d23-b99b-95b65818f84e-utilities\") pod \"redhat-marketplace-lnq5q\" (UID: \"df08dbe0-09b0-4d23-b99b-95b65818f84e\") " pod="openshift-marketplace/redhat-marketplace-lnq5q" Mar 10 21:54:05 crc kubenswrapper[4919]: E0310 21:54:05.840162 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-10 21:54:06.340124952 +0000 UTC m=+233.582005570 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 21:54:05 crc kubenswrapper[4919]: I0310 21:54:05.840345 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df08dbe0-09b0-4d23-b99b-95b65818f84e-catalog-content\") pod \"redhat-marketplace-lnq5q\" (UID: \"df08dbe0-09b0-4d23-b99b-95b65818f84e\") " pod="openshift-marketplace/redhat-marketplace-lnq5q" Mar 10 21:54:05 crc kubenswrapper[4919]: I0310 21:54:05.840565 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lxrsj\" (UID: \"8383a8d8-69ec-4706-8ea3-99ce91e5200c\") " pod="openshift-image-registry/image-registry-697d97f7c8-lxrsj" Mar 10 21:54:05 crc kubenswrapper[4919]: E0310 21:54:05.840919 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 21:54:06.340908754 +0000 UTC m=+233.582789372 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lxrsj" (UID: "8383a8d8-69ec-4706-8ea3-99ce91e5200c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 21:54:05 crc kubenswrapper[4919]: I0310 21:54:05.942232 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 21:54:05 crc kubenswrapper[4919]: E0310 21:54:05.942475 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 21:54:06.442443643 +0000 UTC m=+233.684324271 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 21:54:05 crc kubenswrapper[4919]: I0310 21:54:05.942607 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lxrsj\" (UID: \"8383a8d8-69ec-4706-8ea3-99ce91e5200c\") " pod="openshift-image-registry/image-registry-697d97f7c8-lxrsj" Mar 10 21:54:05 crc kubenswrapper[4919]: I0310 21:54:05.942684 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-88pcz\" (UniqueName: \"kubernetes.io/projected/df08dbe0-09b0-4d23-b99b-95b65818f84e-kube-api-access-88pcz\") pod \"redhat-marketplace-lnq5q\" (UID: \"df08dbe0-09b0-4d23-b99b-95b65818f84e\") " pod="openshift-marketplace/redhat-marketplace-lnq5q" Mar 10 21:54:05 crc kubenswrapper[4919]: I0310 21:54:05.942741 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df08dbe0-09b0-4d23-b99b-95b65818f84e-utilities\") pod \"redhat-marketplace-lnq5q\" (UID: \"df08dbe0-09b0-4d23-b99b-95b65818f84e\") " pod="openshift-marketplace/redhat-marketplace-lnq5q" Mar 10 21:54:05 crc kubenswrapper[4919]: I0310 21:54:05.942829 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df08dbe0-09b0-4d23-b99b-95b65818f84e-catalog-content\") pod \"redhat-marketplace-lnq5q\" (UID: 
\"df08dbe0-09b0-4d23-b99b-95b65818f84e\") " pod="openshift-marketplace/redhat-marketplace-lnq5q" Mar 10 21:54:05 crc kubenswrapper[4919]: I0310 21:54:05.943372 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df08dbe0-09b0-4d23-b99b-95b65818f84e-catalog-content\") pod \"redhat-marketplace-lnq5q\" (UID: \"df08dbe0-09b0-4d23-b99b-95b65818f84e\") " pod="openshift-marketplace/redhat-marketplace-lnq5q" Mar 10 21:54:05 crc kubenswrapper[4919]: I0310 21:54:05.943304 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df08dbe0-09b0-4d23-b99b-95b65818f84e-utilities\") pod \"redhat-marketplace-lnq5q\" (UID: \"df08dbe0-09b0-4d23-b99b-95b65818f84e\") " pod="openshift-marketplace/redhat-marketplace-lnq5q" Mar 10 21:54:05 crc kubenswrapper[4919]: E0310 21:54:05.942971 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 21:54:06.442960147 +0000 UTC m=+233.684840765 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lxrsj" (UID: "8383a8d8-69ec-4706-8ea3-99ce91e5200c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 21:54:05 crc kubenswrapper[4919]: I0310 21:54:05.960686 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-88pcz\" (UniqueName: \"kubernetes.io/projected/df08dbe0-09b0-4d23-b99b-95b65818f84e-kube-api-access-88pcz\") pod \"redhat-marketplace-lnq5q\" (UID: \"df08dbe0-09b0-4d23-b99b-95b65818f84e\") " pod="openshift-marketplace/redhat-marketplace-lnq5q"
Mar 10 21:54:05 crc kubenswrapper[4919]: I0310 21:54:05.980872 4919 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-03-10T21:54:05.271053029Z","Handler":null,"Name":""}
Mar 10 21:54:05 crc kubenswrapper[4919]: I0310 21:54:05.986054 4919 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0
Mar 10 21:54:05 crc kubenswrapper[4919]: I0310 21:54:05.986089 4919 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock
Mar 10 21:54:06 crc kubenswrapper[4919]: I0310 21:54:06.044844 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 21:54:06 crc kubenswrapper[4919]: I0310 21:54:06.048309 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue ""
Mar 10 21:54:06 crc kubenswrapper[4919]: I0310 21:54:06.104881 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lnq5q"
Mar 10 21:54:06 crc kubenswrapper[4919]: I0310 21:54:06.146041 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lxrsj\" (UID: \"8383a8d8-69ec-4706-8ea3-99ce91e5200c\") " pod="openshift-image-registry/image-registry-697d97f7c8-lxrsj"
Mar 10 21:54:06 crc kubenswrapper[4919]: I0310 21:54:06.148581 4919 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Mar 10 21:54:06 crc kubenswrapper[4919]: I0310 21:54:06.148620 4919 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lxrsj\" (UID: \"8383a8d8-69ec-4706-8ea3-99ce91e5200c\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-lxrsj"
Mar 10 21:54:06 crc kubenswrapper[4919]: I0310 21:54:06.171929 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-gd58j"]
Mar 10 21:54:06 crc kubenswrapper[4919]: I0310 21:54:06.173256 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gd58j"
Mar 10 21:54:06 crc kubenswrapper[4919]: I0310 21:54:06.182874 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gd58j"]
Mar 10 21:54:06 crc kubenswrapper[4919]: I0310 21:54:06.183381 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lxrsj\" (UID: \"8383a8d8-69ec-4706-8ea3-99ce91e5200c\") " pod="openshift-image-registry/image-registry-697d97f7c8-lxrsj"
Mar 10 21:54:06 crc kubenswrapper[4919]: I0310 21:54:06.247483 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c5f7639-6abe-4578-81f0-17691f1ad5ef-catalog-content\") pod \"redhat-marketplace-gd58j\" (UID: \"0c5f7639-6abe-4578-81f0-17691f1ad5ef\") " pod="openshift-marketplace/redhat-marketplace-gd58j"
Mar 10 21:54:06 crc kubenswrapper[4919]: I0310 21:54:06.247523 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4gzzm\" (UniqueName: \"kubernetes.io/projected/0c5f7639-6abe-4578-81f0-17691f1ad5ef-kube-api-access-4gzzm\") pod \"redhat-marketplace-gd58j\" (UID: \"0c5f7639-6abe-4578-81f0-17691f1ad5ef\") " pod="openshift-marketplace/redhat-marketplace-gd58j"
Mar 10 21:54:06 crc kubenswrapper[4919]: I0310 21:54:06.247561 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c5f7639-6abe-4578-81f0-17691f1ad5ef-utilities\") pod \"redhat-marketplace-gd58j\" (UID: \"0c5f7639-6abe-4578-81f0-17691f1ad5ef\") " pod="openshift-marketplace/redhat-marketplace-gd58j"
Mar 10 21:54:06 crc kubenswrapper[4919]: I0310 21:54:06.294064 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-nv6qp" event={"ID":"62cf0e58-2480-43ff-a9aa-f8543fefd9f9","Type":"ContainerStarted","Data":"33fc949cb04fe5c126f95c9e44acebab5536aaab5c08e5481ca16390e8ee348a"}
Mar 10 21:54:06 crc kubenswrapper[4919]: I0310 21:54:06.294108 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-nv6qp" event={"ID":"62cf0e58-2480-43ff-a9aa-f8543fefd9f9","Type":"ContainerStarted","Data":"f84e74f8a913fa6c7266f7d17824f56a9bc17ae8926ad102dcb99225c4a5e0b9"}
Mar 10 21:54:06 crc kubenswrapper[4919]: I0310 21:54:06.296304 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-zbds9" event={"ID":"d809a4c1-5e06-46b7-a39c-466b694361ce","Type":"ContainerDied","Data":"8f224cce93fd6990db903a4372ed65a00bf121c1cb5fc85d994d71b934e7eb0d"}
Mar 10 21:54:06 crc kubenswrapper[4919]: I0310 21:54:06.296332 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-zbds9"
Mar 10 21:54:06 crc kubenswrapper[4919]: I0310 21:54:06.296369 4919 scope.go:117] "RemoveContainer" containerID="7daa24c1b8a8fe8086f04d6485368449fdbcfc28d3dd69f60d84d54068cb501f"
Mar 10 21:54:06 crc kubenswrapper[4919]: I0310 21:54:06.313358 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-nv6qp" podStartSLOduration=11.313342851 podStartE2EDuration="11.313342851s" podCreationTimestamp="2026-03-10 21:53:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 21:54:06.311886872 +0000 UTC m=+233.553767500" watchObservedRunningTime="2026-03-10 21:54:06.313342851 +0000 UTC m=+233.555223459"
Mar 10 21:54:06 crc kubenswrapper[4919]: I0310 21:54:06.315543 4919 generic.go:334] "Generic (PLEG): container finished" podID="b8a6c263-cf9a-41f9-8ea0-fb07b0596a35" containerID="06a16c299397b98c6a44c20dce3634f6cff08dcaaba9370c80d0da732dbff08e" exitCode=0
Mar 10 21:54:06 crc kubenswrapper[4919]: I0310 21:54:06.315618 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s8qvz" event={"ID":"b8a6c263-cf9a-41f9-8ea0-fb07b0596a35","Type":"ContainerDied","Data":"06a16c299397b98c6a44c20dce3634f6cff08dcaaba9370c80d0da732dbff08e"}
Mar 10 21:54:06 crc kubenswrapper[4919]: I0310 21:54:06.321581 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-lnq5q"]
Mar 10 21:54:06 crc kubenswrapper[4919]: I0310 21:54:06.322583 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ls7k6" event={"ID":"492e2f21-90e0-4073-b4ad-b562bcf62486","Type":"ContainerDied","Data":"b822a29adc63b2ffc38354a6a6a9c1220e089b9bf21baafb801c26afc300396f"}
Mar 10 21:54:06 crc kubenswrapper[4919]: I0310 21:54:06.322665 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ls7k6"
Mar 10 21:54:06 crc kubenswrapper[4919]: I0310 21:54:06.324587 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-zbds9"]
Mar 10 21:54:06 crc kubenswrapper[4919]: I0310 21:54:06.326733 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-zbds9"]
Mar 10 21:54:06 crc kubenswrapper[4919]: I0310 21:54:06.329757 4919 generic.go:334] "Generic (PLEG): container finished" podID="ccd7b53d-726b-444f-be0f-4eb2655eb35d" containerID="b531133482d76a52b6832a37a5b7850b4438892b3a655d3ef66716ff4e48bb55" exitCode=0
Mar 10 21:54:06 crc kubenswrapper[4919]: I0310 21:54:06.329803 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pw22n" event={"ID":"ccd7b53d-726b-444f-be0f-4eb2655eb35d","Type":"ContainerDied","Data":"b531133482d76a52b6832a37a5b7850b4438892b3a655d3ef66716ff4e48bb55"}
Mar 10 21:54:06 crc kubenswrapper[4919]: I0310 21:54:06.334103 4919 generic.go:334] "Generic (PLEG): container finished" podID="a979e53c-0904-4fc0-9ef4-16706a351785" containerID="a660cc17c504e1444f4dd457f548fffa6b0b5534922fc54ba3f482c9a3bca649" exitCode=0
Mar 10 21:54:06 crc kubenswrapper[4919]: I0310 21:54:06.334166 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-74hb6" event={"ID":"a979e53c-0904-4fc0-9ef4-16706a351785","Type":"ContainerDied","Data":"a660cc17c504e1444f4dd457f548fffa6b0b5534922fc54ba3f482c9a3bca649"}
Mar 10 21:54:06 crc kubenswrapper[4919]: I0310 21:54:06.334184 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-74hb6" event={"ID":"a979e53c-0904-4fc0-9ef4-16706a351785","Type":"ContainerStarted","Data":"b33a730df7ddb558fae74044065c6398a2adf5d959ddeb4b1104ee7d1c126c8f"}
Mar 10 21:54:06 crc kubenswrapper[4919]: I0310 21:54:06.335519 4919 generic.go:334] "Generic (PLEG): container finished" podID="fb39d6ee-8d9d-4cd6-811a-fbbb8d7e018c" containerID="bd75b3e108084779c92a9028faaa7cb89e4c5173c5911c2ae9f686335fc6e377" exitCode=0
Mar 10 21:54:06 crc kubenswrapper[4919]: I0310 21:54:06.336225 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bx4lk" event={"ID":"fb39d6ee-8d9d-4cd6-811a-fbbb8d7e018c","Type":"ContainerDied","Data":"bd75b3e108084779c92a9028faaa7cb89e4c5173c5911c2ae9f686335fc6e377"}
Mar 10 21:54:06 crc kubenswrapper[4919]: I0310 21:54:06.342493 4919 scope.go:117] "RemoveContainer" containerID="82736eb7c81aad2652a4c08aa003a166ebbe6309ec0b3d4bce360792e944093d"
Mar 10 21:54:06 crc kubenswrapper[4919]: I0310 21:54:06.348645 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c5f7639-6abe-4578-81f0-17691f1ad5ef-catalog-content\") pod \"redhat-marketplace-gd58j\" (UID: \"0c5f7639-6abe-4578-81f0-17691f1ad5ef\") " pod="openshift-marketplace/redhat-marketplace-gd58j"
Mar 10 21:54:06 crc kubenswrapper[4919]: I0310 21:54:06.348689 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4gzzm\" (UniqueName: \"kubernetes.io/projected/0c5f7639-6abe-4578-81f0-17691f1ad5ef-kube-api-access-4gzzm\") pod \"redhat-marketplace-gd58j\" (UID: \"0c5f7639-6abe-4578-81f0-17691f1ad5ef\") " pod="openshift-marketplace/redhat-marketplace-gd58j"
Mar 10 21:54:06 crc kubenswrapper[4919]: I0310 21:54:06.348722 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c5f7639-6abe-4578-81f0-17691f1ad5ef-utilities\") pod \"redhat-marketplace-gd58j\" (UID: \"0c5f7639-6abe-4578-81f0-17691f1ad5ef\") " pod="openshift-marketplace/redhat-marketplace-gd58j"
Mar 10 21:54:06 crc kubenswrapper[4919]: I0310 21:54:06.349214 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c5f7639-6abe-4578-81f0-17691f1ad5ef-utilities\") pod \"redhat-marketplace-gd58j\" (UID: \"0c5f7639-6abe-4578-81f0-17691f1ad5ef\") " pod="openshift-marketplace/redhat-marketplace-gd58j"
Mar 10 21:54:06 crc kubenswrapper[4919]: I0310 21:54:06.350125 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c5f7639-6abe-4578-81f0-17691f1ad5ef-catalog-content\") pod \"redhat-marketplace-gd58j\" (UID: \"0c5f7639-6abe-4578-81f0-17691f1ad5ef\") " pod="openshift-marketplace/redhat-marketplace-gd58j"
Mar 10 21:54:06 crc kubenswrapper[4919]: I0310 21:54:06.368556 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4gzzm\" (UniqueName: \"kubernetes.io/projected/0c5f7639-6abe-4578-81f0-17691f1ad5ef-kube-api-access-4gzzm\") pod \"redhat-marketplace-gd58j\" (UID: \"0c5f7639-6abe-4578-81f0-17691f1ad5ef\") " pod="openshift-marketplace/redhat-marketplace-gd58j"
Mar 10 21:54:06 crc kubenswrapper[4919]: I0310 21:54:06.377208 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-lxrsj"
Mar 10 21:54:06 crc kubenswrapper[4919]: I0310 21:54:06.409369 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-ls7k6"]
Mar 10 21:54:06 crc kubenswrapper[4919]: I0310 21:54:06.411784 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-ls7k6"]
Mar 10 21:54:06 crc kubenswrapper[4919]: I0310 21:54:06.510539 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gd58j"
Mar 10 21:54:06 crc kubenswrapper[4919]: I0310 21:54:06.607497 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-lxrsj"]
Mar 10 21:54:06 crc kubenswrapper[4919]: I0310 21:54:06.614911 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"]
Mar 10 21:54:06 crc kubenswrapper[4919]: I0310 21:54:06.615556 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 10 21:54:06 crc kubenswrapper[4919]: I0310 21:54:06.618113 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n"
Mar 10 21:54:06 crc kubenswrapper[4919]: I0310 21:54:06.618358 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt"
Mar 10 21:54:06 crc kubenswrapper[4919]: W0310 21:54:06.621153 4919 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8383a8d8_69ec_4706_8ea3_99ce91e5200c.slice/crio-34f11be4efe908c491f9f4d1c271e148443346071203a63fc067bb252478c2d8 WatchSource:0}: Error finding container 34f11be4efe908c491f9f4d1c271e148443346071203a63fc067bb252478c2d8: Status 404 returned error can't find the container with id 34f11be4efe908c491f9f4d1c271e148443346071203a63fc067bb252478c2d8
Mar 10 21:54:06 crc kubenswrapper[4919]: I0310 21:54:06.626443 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"]
Mar 10 21:54:06 crc kubenswrapper[4919]: I0310 21:54:06.653414 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/07397f12-36c1-45ae-9d6d-e51e344cf3bf-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"07397f12-36c1-45ae-9d6d-e51e344cf3bf\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 10 21:54:06 crc kubenswrapper[4919]: I0310 21:54:06.653520 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/07397f12-36c1-45ae-9d6d-e51e344cf3bf-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"07397f12-36c1-45ae-9d6d-e51e344cf3bf\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 10 21:54:06 crc kubenswrapper[4919]: I0310 21:54:06.666734 4919 patch_prober.go:28] interesting pod/router-default-5444994796-hmjhm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 10 21:54:06 crc kubenswrapper[4919]: [-]has-synced failed: reason withheld
Mar 10 21:54:06 crc kubenswrapper[4919]: [+]process-running ok
Mar 10 21:54:06 crc kubenswrapper[4919]: healthz check failed
Mar 10 21:54:06 crc kubenswrapper[4919]: I0310 21:54:06.666992 4919 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-hmjhm" podUID="ca9a516e-afc2-4475-8af8-23504d17f9a9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 10 21:54:06 crc kubenswrapper[4919]: I0310 21:54:06.755118 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/07397f12-36c1-45ae-9d6d-e51e344cf3bf-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"07397f12-36c1-45ae-9d6d-e51e344cf3bf\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 10 21:54:06 crc kubenswrapper[4919]: I0310 21:54:06.755232 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/07397f12-36c1-45ae-9d6d-e51e344cf3bf-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"07397f12-36c1-45ae-9d6d-e51e344cf3bf\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 10 21:54:06 crc kubenswrapper[4919]: I0310 21:54:06.755346 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/07397f12-36c1-45ae-9d6d-e51e344cf3bf-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"07397f12-36c1-45ae-9d6d-e51e344cf3bf\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 10 21:54:06 crc kubenswrapper[4919]: I0310 21:54:06.780265 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/07397f12-36c1-45ae-9d6d-e51e344cf3bf-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"07397f12-36c1-45ae-9d6d-e51e344cf3bf\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 10 21:54:06 crc kubenswrapper[4919]: I0310 21:54:06.786115 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6c7cb9d945-w94dp"]
Mar 10 21:54:06 crc kubenswrapper[4919]: I0310 21:54:06.789739 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6c7cb9d945-w94dp"
Mar 10 21:54:06 crc kubenswrapper[4919]: I0310 21:54:06.792462 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-76b5f4f6f4-cchk5"]
Mar 10 21:54:06 crc kubenswrapper[4919]: I0310 21:54:06.793466 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-76b5f4f6f4-cchk5"
Mar 10 21:54:06 crc kubenswrapper[4919]: I0310 21:54:06.794483 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6c7cb9d945-w94dp"]
Mar 10 21:54:06 crc kubenswrapper[4919]: I0310 21:54:06.794651 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Mar 10 21:54:06 crc kubenswrapper[4919]: I0310 21:54:06.794787 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Mar 10 21:54:06 crc kubenswrapper[4919]: I0310 21:54:06.794831 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 10 21:54:06 crc kubenswrapper[4919]: I0310 21:54:06.795131 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Mar 10 21:54:06 crc kubenswrapper[4919]: I0310 21:54:06.795171 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Mar 10 21:54:06 crc kubenswrapper[4919]: I0310 21:54:06.795235 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Mar 10 21:54:06 crc kubenswrapper[4919]: I0310 21:54:06.795489 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 10 21:54:06 crc kubenswrapper[4919]: I0310 21:54:06.796655 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 10 21:54:06 crc kubenswrapper[4919]: I0310 21:54:06.796827 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 10 21:54:06 crc kubenswrapper[4919]: I0310 21:54:06.797311 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 10 21:54:06 crc kubenswrapper[4919]: I0310 21:54:06.798019 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Mar 10 21:54:06 crc kubenswrapper[4919]: I0310 21:54:06.798194 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 10 21:54:06 crc kubenswrapper[4919]: I0310 21:54:06.799841 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-76b5f4f6f4-cchk5"]
Mar 10 21:54:06 crc kubenswrapper[4919]: I0310 21:54:06.808617 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 10 21:54:06 crc kubenswrapper[4919]: I0310 21:54:06.856950 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/683e42c0-cac1-4698-b66c-c2f75a53f388-serving-cert\") pod \"controller-manager-76b5f4f6f4-cchk5\" (UID: \"683e42c0-cac1-4698-b66c-c2f75a53f388\") " pod="openshift-controller-manager/controller-manager-76b5f4f6f4-cchk5"
Mar 10 21:54:06 crc kubenswrapper[4919]: I0310 21:54:06.857003 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rhzfq\" (UniqueName: \"kubernetes.io/projected/a2559788-98b3-4c30-8959-25b0ceb594a5-kube-api-access-rhzfq\") pod \"route-controller-manager-6c7cb9d945-w94dp\" (UID: \"a2559788-98b3-4c30-8959-25b0ceb594a5\") " pod="openshift-route-controller-manager/route-controller-manager-6c7cb9d945-w94dp"
Mar 10 21:54:06 crc kubenswrapper[4919]: I0310 21:54:06.857043 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a2559788-98b3-4c30-8959-25b0ceb594a5-config\") pod \"route-controller-manager-6c7cb9d945-w94dp\" (UID: \"a2559788-98b3-4c30-8959-25b0ceb594a5\") " pod="openshift-route-controller-manager/route-controller-manager-6c7cb9d945-w94dp"
Mar 10 21:54:06 crc kubenswrapper[4919]: I0310 21:54:06.857071 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a2559788-98b3-4c30-8959-25b0ceb594a5-client-ca\") pod \"route-controller-manager-6c7cb9d945-w94dp\" (UID: \"a2559788-98b3-4c30-8959-25b0ceb594a5\") " pod="openshift-route-controller-manager/route-controller-manager-6c7cb9d945-w94dp"
Mar 10 21:54:06 crc kubenswrapper[4919]: I0310 21:54:06.857093 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/683e42c0-cac1-4698-b66c-c2f75a53f388-config\") pod \"controller-manager-76b5f4f6f4-cchk5\" (UID: \"683e42c0-cac1-4698-b66c-c2f75a53f388\") " pod="openshift-controller-manager/controller-manager-76b5f4f6f4-cchk5"
Mar 10 21:54:06 crc kubenswrapper[4919]: I0310 21:54:06.857144 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdqxs\" (UniqueName: \"kubernetes.io/projected/683e42c0-cac1-4698-b66c-c2f75a53f388-kube-api-access-kdqxs\") pod \"controller-manager-76b5f4f6f4-cchk5\" (UID: \"683e42c0-cac1-4698-b66c-c2f75a53f388\") " pod="openshift-controller-manager/controller-manager-76b5f4f6f4-cchk5"
Mar 10 21:54:06 crc kubenswrapper[4919]: I0310 21:54:06.857173 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/683e42c0-cac1-4698-b66c-c2f75a53f388-proxy-ca-bundles\") pod \"controller-manager-76b5f4f6f4-cchk5\" (UID: \"683e42c0-cac1-4698-b66c-c2f75a53f388\") " pod="openshift-controller-manager/controller-manager-76b5f4f6f4-cchk5"
Mar 10 21:54:06 crc kubenswrapper[4919]: I0310 21:54:06.857245 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a2559788-98b3-4c30-8959-25b0ceb594a5-serving-cert\") pod \"route-controller-manager-6c7cb9d945-w94dp\" (UID: \"a2559788-98b3-4c30-8959-25b0ceb594a5\") " pod="openshift-route-controller-manager/route-controller-manager-6c7cb9d945-w94dp"
Mar 10 21:54:06 crc kubenswrapper[4919]: I0310 21:54:06.857279 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/683e42c0-cac1-4698-b66c-c2f75a53f388-client-ca\") pod \"controller-manager-76b5f4f6f4-cchk5\" (UID: \"683e42c0-cac1-4698-b66c-c2f75a53f388\") " pod="openshift-controller-manager/controller-manager-76b5f4f6f4-cchk5"
Mar 10 21:54:06 crc kubenswrapper[4919]: I0310 21:54:06.938910 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 10 21:54:06 crc kubenswrapper[4919]: I0310 21:54:06.940084 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gd58j"]
Mar 10 21:54:06 crc kubenswrapper[4919]: W0310 21:54:06.953798 4919 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0c5f7639_6abe_4578_81f0_17691f1ad5ef.slice/crio-477a30eb110821e742b920f7a6eca55e0ffcb85c3897be76e5867cea56215a7c WatchSource:0}: Error finding container 477a30eb110821e742b920f7a6eca55e0ffcb85c3897be76e5867cea56215a7c: Status 404 returned error can't find the container with id 477a30eb110821e742b920f7a6eca55e0ffcb85c3897be76e5867cea56215a7c
Mar 10 21:54:06 crc kubenswrapper[4919]: I0310 21:54:06.958614 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/683e42c0-cac1-4698-b66c-c2f75a53f388-serving-cert\") pod \"controller-manager-76b5f4f6f4-cchk5\" (UID: \"683e42c0-cac1-4698-b66c-c2f75a53f388\") " pod="openshift-controller-manager/controller-manager-76b5f4f6f4-cchk5"
Mar 10 21:54:06 crc kubenswrapper[4919]: I0310 21:54:06.958694 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rhzfq\" (UniqueName: \"kubernetes.io/projected/a2559788-98b3-4c30-8959-25b0ceb594a5-kube-api-access-rhzfq\") pod \"route-controller-manager-6c7cb9d945-w94dp\" (UID: \"a2559788-98b3-4c30-8959-25b0ceb594a5\") " pod="openshift-route-controller-manager/route-controller-manager-6c7cb9d945-w94dp"
Mar 10 21:54:06 crc kubenswrapper[4919]: I0310 21:54:06.958739 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a2559788-98b3-4c30-8959-25b0ceb594a5-config\") pod \"route-controller-manager-6c7cb9d945-w94dp\" (UID: \"a2559788-98b3-4c30-8959-25b0ceb594a5\") " pod="openshift-route-controller-manager/route-controller-manager-6c7cb9d945-w94dp"
Mar 10 21:54:06 crc kubenswrapper[4919]: I0310 21:54:06.958799 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a2559788-98b3-4c30-8959-25b0ceb594a5-client-ca\") pod \"route-controller-manager-6c7cb9d945-w94dp\" (UID: \"a2559788-98b3-4c30-8959-25b0ceb594a5\") " pod="openshift-route-controller-manager/route-controller-manager-6c7cb9d945-w94dp"
Mar 10 21:54:06 crc kubenswrapper[4919]: I0310 21:54:06.958828 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/683e42c0-cac1-4698-b66c-c2f75a53f388-config\") pod \"controller-manager-76b5f4f6f4-cchk5\" (UID: \"683e42c0-cac1-4698-b66c-c2f75a53f388\") " pod="openshift-controller-manager/controller-manager-76b5f4f6f4-cchk5"
Mar 10 21:54:06 crc kubenswrapper[4919]: I0310 21:54:06.958883 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kdqxs\" (UniqueName: \"kubernetes.io/projected/683e42c0-cac1-4698-b66c-c2f75a53f388-kube-api-access-kdqxs\") pod \"controller-manager-76b5f4f6f4-cchk5\" (UID: \"683e42c0-cac1-4698-b66c-c2f75a53f388\") " pod="openshift-controller-manager/controller-manager-76b5f4f6f4-cchk5"
Mar 10 21:54:06 crc kubenswrapper[4919]: I0310 21:54:06.958930 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/683e42c0-cac1-4698-b66c-c2f75a53f388-proxy-ca-bundles\") pod \"controller-manager-76b5f4f6f4-cchk5\" (UID: \"683e42c0-cac1-4698-b66c-c2f75a53f388\") " pod="openshift-controller-manager/controller-manager-76b5f4f6f4-cchk5"
Mar 10 21:54:06 crc kubenswrapper[4919]: I0310 21:54:06.959004 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a2559788-98b3-4c30-8959-25b0ceb594a5-serving-cert\") pod \"route-controller-manager-6c7cb9d945-w94dp\" (UID: \"a2559788-98b3-4c30-8959-25b0ceb594a5\") " pod="openshift-route-controller-manager/route-controller-manager-6c7cb9d945-w94dp"
Mar 10 21:54:06 crc kubenswrapper[4919]: I0310 21:54:06.959038 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/683e42c0-cac1-4698-b66c-c2f75a53f388-client-ca\") pod \"controller-manager-76b5f4f6f4-cchk5\" (UID: \"683e42c0-cac1-4698-b66c-c2f75a53f388\") " pod="openshift-controller-manager/controller-manager-76b5f4f6f4-cchk5"
Mar 10 21:54:06 crc kubenswrapper[4919]: I0310 21:54:06.960424 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/683e42c0-cac1-4698-b66c-c2f75a53f388-client-ca\") pod \"controller-manager-76b5f4f6f4-cchk5\" (UID: \"683e42c0-cac1-4698-b66c-c2f75a53f388\") " pod="openshift-controller-manager/controller-manager-76b5f4f6f4-cchk5"
Mar 10 21:54:06 crc kubenswrapper[4919]: I0310 21:54:06.960639 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/683e42c0-cac1-4698-b66c-c2f75a53f388-config\") pod \"controller-manager-76b5f4f6f4-cchk5\" (UID: \"683e42c0-cac1-4698-b66c-c2f75a53f388\") " pod="openshift-controller-manager/controller-manager-76b5f4f6f4-cchk5"
Mar 10 21:54:06 crc kubenswrapper[4919]: I0310 21:54:06.961330 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a2559788-98b3-4c30-8959-25b0ceb594a5-client-ca\") pod \"route-controller-manager-6c7cb9d945-w94dp\" (UID: \"a2559788-98b3-4c30-8959-25b0ceb594a5\") " pod="openshift-route-controller-manager/route-controller-manager-6c7cb9d945-w94dp"
Mar 10 21:54:06 crc kubenswrapper[4919]: I0310 21:54:06.961692 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/683e42c0-cac1-4698-b66c-c2f75a53f388-proxy-ca-bundles\") pod \"controller-manager-76b5f4f6f4-cchk5\" (UID: \"683e42c0-cac1-4698-b66c-c2f75a53f388\") " pod="openshift-controller-manager/controller-manager-76b5f4f6f4-cchk5"
Mar 10 21:54:06 crc kubenswrapper[4919]: I0310 21:54:06.961817 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a2559788-98b3-4c30-8959-25b0ceb594a5-config\") pod \"route-controller-manager-6c7cb9d945-w94dp\" (UID: \"a2559788-98b3-4c30-8959-25b0ceb594a5\") " pod="openshift-route-controller-manager/route-controller-manager-6c7cb9d945-w94dp"
Mar 10 21:54:06 crc kubenswrapper[4919]: I0310 21:54:06.963139 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a2559788-98b3-4c30-8959-25b0ceb594a5-serving-cert\") pod \"route-controller-manager-6c7cb9d945-w94dp\" (UID: \"a2559788-98b3-4c30-8959-25b0ceb594a5\") " pod="openshift-route-controller-manager/route-controller-manager-6c7cb9d945-w94dp"
Mar 10 21:54:06 crc kubenswrapper[4919]: I0310 21:54:06.966509 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/683e42c0-cac1-4698-b66c-c2f75a53f388-serving-cert\") pod \"controller-manager-76b5f4f6f4-cchk5\" (UID: \"683e42c0-cac1-4698-b66c-c2f75a53f388\") " pod="openshift-controller-manager/controller-manager-76b5f4f6f4-cchk5"
Mar 10 21:54:06 crc kubenswrapper[4919]: I0310 21:54:06.979268 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdqxs\" (UniqueName: \"kubernetes.io/projected/683e42c0-cac1-4698-b66c-c2f75a53f388-kube-api-access-kdqxs\") pod \"controller-manager-76b5f4f6f4-cchk5\" (UID: \"683e42c0-cac1-4698-b66c-c2f75a53f388\") " pod="openshift-controller-manager/controller-manager-76b5f4f6f4-cchk5"
Mar 10 21:54:06 crc kubenswrapper[4919]: I0310 21:54:06.979990 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rhzfq\" (UniqueName: \"kubernetes.io/projected/a2559788-98b3-4c30-8959-25b0ceb594a5-kube-api-access-rhzfq\") pod \"route-controller-manager-6c7cb9d945-w94dp\" (UID: \"a2559788-98b3-4c30-8959-25b0ceb594a5\") " pod="openshift-route-controller-manager/route-controller-manager-6c7cb9d945-w94dp"
Mar 10 21:54:07 crc kubenswrapper[4919]: I0310 21:54:07.116937 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6c7cb9d945-w94dp"
Mar 10 21:54:07 crc kubenswrapper[4919]: I0310 21:54:07.124160 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-76b5f4f6f4-cchk5"
Mar 10 21:54:07 crc kubenswrapper[4919]: I0310 21:54:07.165276 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"]
Mar 10 21:54:07 crc kubenswrapper[4919]: I0310 21:54:07.170496 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-f4nt4"]
Mar 10 21:54:07 crc kubenswrapper[4919]: I0310 21:54:07.171558 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-f4nt4"
Mar 10 21:54:07 crc kubenswrapper[4919]: I0310 21:54:07.174875 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Mar 10 21:54:07 crc kubenswrapper[4919]: I0310 21:54:07.179774 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-f4nt4"]
Mar 10 21:54:07 crc kubenswrapper[4919]: I0310 21:54:07.266614 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28b0abdd-217d-42f6-80fb-b270be44700e-catalog-content\") pod \"redhat-operators-f4nt4\" (UID: \"28b0abdd-217d-42f6-80fb-b270be44700e\") " pod="openshift-marketplace/redhat-operators-f4nt4"
Mar 10 21:54:07 crc kubenswrapper[4919]: I0310 21:54:07.266689 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bgvkg\" (UniqueName: \"kubernetes.io/projected/28b0abdd-217d-42f6-80fb-b270be44700e-kube-api-access-bgvkg\") pod \"redhat-operators-f4nt4\" (UID: \"28b0abdd-217d-42f6-80fb-b270be44700e\") " pod="openshift-marketplace/redhat-operators-f4nt4"
Mar 10 21:54:07 crc kubenswrapper[4919]: I0310 21:54:07.266914 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28b0abdd-217d-42f6-80fb-b270be44700e-utilities\") pod \"redhat-operators-f4nt4\" (UID: \"28b0abdd-217d-42f6-80fb-b270be44700e\") " pod="openshift-marketplace/redhat-operators-f4nt4"
Mar 10 21:54:07 crc kubenswrapper[4919]: I0310 21:54:07.312063 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-fb6tt"
Mar 10 21:54:07 crc kubenswrapper[4919]: I0310 21:54:07.313021 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status=""
pod="openshift-apiserver/apiserver-76f77b778f-fb6tt" Mar 10 21:54:07 crc kubenswrapper[4919]: I0310 21:54:07.325831 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-fb6tt" Mar 10 21:54:07 crc kubenswrapper[4919]: I0310 21:54:07.368420 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28b0abdd-217d-42f6-80fb-b270be44700e-utilities\") pod \"redhat-operators-f4nt4\" (UID: \"28b0abdd-217d-42f6-80fb-b270be44700e\") " pod="openshift-marketplace/redhat-operators-f4nt4" Mar 10 21:54:07 crc kubenswrapper[4919]: I0310 21:54:07.368556 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28b0abdd-217d-42f6-80fb-b270be44700e-catalog-content\") pod \"redhat-operators-f4nt4\" (UID: \"28b0abdd-217d-42f6-80fb-b270be44700e\") " pod="openshift-marketplace/redhat-operators-f4nt4" Mar 10 21:54:07 crc kubenswrapper[4919]: I0310 21:54:07.368618 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bgvkg\" (UniqueName: \"kubernetes.io/projected/28b0abdd-217d-42f6-80fb-b270be44700e-kube-api-access-bgvkg\") pod \"redhat-operators-f4nt4\" (UID: \"28b0abdd-217d-42f6-80fb-b270be44700e\") " pod="openshift-marketplace/redhat-operators-f4nt4" Mar 10 21:54:07 crc kubenswrapper[4919]: I0310 21:54:07.369113 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28b0abdd-217d-42f6-80fb-b270be44700e-utilities\") pod \"redhat-operators-f4nt4\" (UID: \"28b0abdd-217d-42f6-80fb-b270be44700e\") " pod="openshift-marketplace/redhat-operators-f4nt4" Mar 10 21:54:07 crc kubenswrapper[4919]: I0310 21:54:07.370745 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/28b0abdd-217d-42f6-80fb-b270be44700e-catalog-content\") pod \"redhat-operators-f4nt4\" (UID: \"28b0abdd-217d-42f6-80fb-b270be44700e\") " pod="openshift-marketplace/redhat-operators-f4nt4" Mar 10 21:54:07 crc kubenswrapper[4919]: I0310 21:54:07.398690 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bgvkg\" (UniqueName: \"kubernetes.io/projected/28b0abdd-217d-42f6-80fb-b270be44700e-kube-api-access-bgvkg\") pod \"redhat-operators-f4nt4\" (UID: \"28b0abdd-217d-42f6-80fb-b270be44700e\") " pod="openshift-marketplace/redhat-operators-f4nt4" Mar 10 21:54:07 crc kubenswrapper[4919]: I0310 21:54:07.407156 4919 generic.go:334] "Generic (PLEG): container finished" podID="df08dbe0-09b0-4d23-b99b-95b65818f84e" containerID="8e48b76e7b85e1c154707155fdb33a2e17370b61feababa3d099aad57c1b2fb1" exitCode=0 Mar 10 21:54:07 crc kubenswrapper[4919]: I0310 21:54:07.407232 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lnq5q" event={"ID":"df08dbe0-09b0-4d23-b99b-95b65818f84e","Type":"ContainerDied","Data":"8e48b76e7b85e1c154707155fdb33a2e17370b61feababa3d099aad57c1b2fb1"} Mar 10 21:54:07 crc kubenswrapper[4919]: I0310 21:54:07.407259 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lnq5q" event={"ID":"df08dbe0-09b0-4d23-b99b-95b65818f84e","Type":"ContainerStarted","Data":"392945f53aec564b04401d2f253b4e68fe7aa032fac26801a08555b4f158ad50"} Mar 10 21:54:07 crc kubenswrapper[4919]: I0310 21:54:07.410992 4919 generic.go:334] "Generic (PLEG): container finished" podID="0c5f7639-6abe-4578-81f0-17691f1ad5ef" containerID="9d2e9cf8fbf79ba030395b11dd87382f6914cabf8a0d93f64da8aa8899bdb84e" exitCode=0 Mar 10 21:54:07 crc kubenswrapper[4919]: I0310 21:54:07.411103 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gd58j" 
event={"ID":"0c5f7639-6abe-4578-81f0-17691f1ad5ef","Type":"ContainerDied","Data":"9d2e9cf8fbf79ba030395b11dd87382f6914cabf8a0d93f64da8aa8899bdb84e"} Mar 10 21:54:07 crc kubenswrapper[4919]: I0310 21:54:07.411139 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gd58j" event={"ID":"0c5f7639-6abe-4578-81f0-17691f1ad5ef","Type":"ContainerStarted","Data":"477a30eb110821e742b920f7a6eca55e0ffcb85c3897be76e5867cea56215a7c"} Mar 10 21:54:07 crc kubenswrapper[4919]: I0310 21:54:07.414100 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"07397f12-36c1-45ae-9d6d-e51e344cf3bf","Type":"ContainerStarted","Data":"0e5f22ca5ab49565f40bea53ef5cc3a2c56c9089b9dfe6fae06b6b54f8a3be27"} Mar 10 21:54:07 crc kubenswrapper[4919]: I0310 21:54:07.418437 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-lxrsj" event={"ID":"8383a8d8-69ec-4706-8ea3-99ce91e5200c","Type":"ContainerStarted","Data":"6e025e0257d442e99fc3a627cf5828eae99cb82fa63b9f3eb91087181f10f013"} Mar 10 21:54:07 crc kubenswrapper[4919]: I0310 21:54:07.418470 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-lxrsj" event={"ID":"8383a8d8-69ec-4706-8ea3-99ce91e5200c","Type":"ContainerStarted","Data":"34f11be4efe908c491f9f4d1c271e148443346071203a63fc067bb252478c2d8"} Mar 10 21:54:07 crc kubenswrapper[4919]: I0310 21:54:07.418688 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-lxrsj" Mar 10 21:54:07 crc kubenswrapper[4919]: I0310 21:54:07.431002 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-fb6tt" Mar 10 21:54:07 crc kubenswrapper[4919]: I0310 21:54:07.466111 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-image-registry/image-registry-697d97f7c8-lxrsj" podStartSLOduration=172.466092626 podStartE2EDuration="2m52.466092626s" podCreationTimestamp="2026-03-10 21:51:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 21:54:07.463727632 +0000 UTC m=+234.705608240" watchObservedRunningTime="2026-03-10 21:54:07.466092626 +0000 UTC m=+234.707973234" Mar 10 21:54:07 crc kubenswrapper[4919]: I0310 21:54:07.487461 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="492e2f21-90e0-4073-b4ad-b562bcf62486" path="/var/lib/kubelet/pods/492e2f21-90e0-4073-b4ad-b562bcf62486/volumes" Mar 10 21:54:07 crc kubenswrapper[4919]: I0310 21:54:07.488375 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Mar 10 21:54:07 crc kubenswrapper[4919]: I0310 21:54:07.502896 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d809a4c1-5e06-46b7-a39c-466b694361ce" path="/var/lib/kubelet/pods/d809a4c1-5e06-46b7-a39c-466b694361ce/volumes" Mar 10 21:54:07 crc kubenswrapper[4919]: I0310 21:54:07.537292 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-f4nt4" Mar 10 21:54:07 crc kubenswrapper[4919]: I0310 21:54:07.600020 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-4l8gq"] Mar 10 21:54:07 crc kubenswrapper[4919]: I0310 21:54:07.601439 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-4l8gq" Mar 10 21:54:07 crc kubenswrapper[4919]: I0310 21:54:07.618493 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4l8gq"] Mar 10 21:54:07 crc kubenswrapper[4919]: I0310 21:54:07.625150 4919 patch_prober.go:28] interesting pod/downloads-7954f5f757-49z6f container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Mar 10 21:54:07 crc kubenswrapper[4919]: I0310 21:54:07.625210 4919 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-49z6f" podUID="fb7623ea-ec67-4061-82a6-4099e52fa3b9" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Mar 10 21:54:07 crc kubenswrapper[4919]: I0310 21:54:07.626318 4919 patch_prober.go:28] interesting pod/downloads-7954f5f757-49z6f container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Mar 10 21:54:07 crc kubenswrapper[4919]: I0310 21:54:07.626342 4919 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-49z6f" podUID="fb7623ea-ec67-4061-82a6-4099e52fa3b9" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Mar 10 21:54:07 crc kubenswrapper[4919]: I0310 21:54:07.670685 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-58nxf" Mar 10 21:54:07 crc kubenswrapper[4919]: I0310 21:54:07.670724 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-console/console-f9d7485db-58nxf" Mar 10 21:54:07 crc kubenswrapper[4919]: I0310 21:54:07.670736 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-hmjhm" Mar 10 21:54:07 crc kubenswrapper[4919]: I0310 21:54:07.684829 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-76b5f4f6f4-cchk5"] Mar 10 21:54:07 crc kubenswrapper[4919]: I0310 21:54:07.685052 4919 patch_prober.go:28] interesting pod/console-f9d7485db-58nxf container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.28:8443/health\": dial tcp 10.217.0.28:8443: connect: connection refused" start-of-body= Mar 10 21:54:07 crc kubenswrapper[4919]: I0310 21:54:07.685131 4919 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-58nxf" podUID="9657873d-9275-4945-9e91-0b2c2844ae5d" containerName="console" probeResult="failure" output="Get \"https://10.217.0.28:8443/health\": dial tcp 10.217.0.28:8443: connect: connection refused" Mar 10 21:54:07 crc kubenswrapper[4919]: I0310 21:54:07.703551 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wn87z\" (UniqueName: \"kubernetes.io/projected/e2db42cd-0c43-41be-a881-199c82f703bd-kube-api-access-wn87z\") pod \"redhat-operators-4l8gq\" (UID: \"e2db42cd-0c43-41be-a881-199c82f703bd\") " pod="openshift-marketplace/redhat-operators-4l8gq" Mar 10 21:54:07 crc kubenswrapper[4919]: I0310 21:54:07.703597 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e2db42cd-0c43-41be-a881-199c82f703bd-utilities\") pod \"redhat-operators-4l8gq\" (UID: \"e2db42cd-0c43-41be-a881-199c82f703bd\") " pod="openshift-marketplace/redhat-operators-4l8gq" Mar 10 21:54:07 crc kubenswrapper[4919]: I0310 21:54:07.703676 
4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e2db42cd-0c43-41be-a881-199c82f703bd-catalog-content\") pod \"redhat-operators-4l8gq\" (UID: \"e2db42cd-0c43-41be-a881-199c82f703bd\") " pod="openshift-marketplace/redhat-operators-4l8gq" Mar 10 21:54:07 crc kubenswrapper[4919]: I0310 21:54:07.714701 4919 patch_prober.go:28] interesting pod/router-default-5444994796-hmjhm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 10 21:54:07 crc kubenswrapper[4919]: [-]has-synced failed: reason withheld Mar 10 21:54:07 crc kubenswrapper[4919]: [+]process-running ok Mar 10 21:54:07 crc kubenswrapper[4919]: healthz check failed Mar 10 21:54:07 crc kubenswrapper[4919]: I0310 21:54:07.714767 4919 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-hmjhm" podUID="ca9a516e-afc2-4475-8af8-23504d17f9a9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 10 21:54:07 crc kubenswrapper[4919]: I0310 21:54:07.744282 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6c7cb9d945-w94dp"] Mar 10 21:54:07 crc kubenswrapper[4919]: W0310 21:54:07.789240 4919 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod683e42c0_cac1_4698_b66c_c2f75a53f388.slice/crio-6a2e367ad449083bc4a89e6d76a42fe401c6a828d02c9f58a5c9f77911d80dc5 WatchSource:0}: Error finding container 6a2e367ad449083bc4a89e6d76a42fe401c6a828d02c9f58a5c9f77911d80dc5: Status 404 returned error can't find the container with id 6a2e367ad449083bc4a89e6d76a42fe401c6a828d02c9f58a5c9f77911d80dc5 Mar 10 21:54:07 crc kubenswrapper[4919]: I0310 21:54:07.808778 4919 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wn87z\" (UniqueName: \"kubernetes.io/projected/e2db42cd-0c43-41be-a881-199c82f703bd-kube-api-access-wn87z\") pod \"redhat-operators-4l8gq\" (UID: \"e2db42cd-0c43-41be-a881-199c82f703bd\") " pod="openshift-marketplace/redhat-operators-4l8gq" Mar 10 21:54:07 crc kubenswrapper[4919]: I0310 21:54:07.808827 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e2db42cd-0c43-41be-a881-199c82f703bd-utilities\") pod \"redhat-operators-4l8gq\" (UID: \"e2db42cd-0c43-41be-a881-199c82f703bd\") " pod="openshift-marketplace/redhat-operators-4l8gq" Mar 10 21:54:07 crc kubenswrapper[4919]: I0310 21:54:07.808867 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e2db42cd-0c43-41be-a881-199c82f703bd-catalog-content\") pod \"redhat-operators-4l8gq\" (UID: \"e2db42cd-0c43-41be-a881-199c82f703bd\") " pod="openshift-marketplace/redhat-operators-4l8gq" Mar 10 21:54:07 crc kubenswrapper[4919]: I0310 21:54:07.809365 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e2db42cd-0c43-41be-a881-199c82f703bd-utilities\") pod \"redhat-operators-4l8gq\" (UID: \"e2db42cd-0c43-41be-a881-199c82f703bd\") " pod="openshift-marketplace/redhat-operators-4l8gq" Mar 10 21:54:07 crc kubenswrapper[4919]: I0310 21:54:07.809466 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e2db42cd-0c43-41be-a881-199c82f703bd-catalog-content\") pod \"redhat-operators-4l8gq\" (UID: \"e2db42cd-0c43-41be-a881-199c82f703bd\") " pod="openshift-marketplace/redhat-operators-4l8gq" Mar 10 21:54:07 crc kubenswrapper[4919]: I0310 21:54:07.833233 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-wn87z\" (UniqueName: \"kubernetes.io/projected/e2db42cd-0c43-41be-a881-199c82f703bd-kube-api-access-wn87z\") pod \"redhat-operators-4l8gq\" (UID: \"e2db42cd-0c43-41be-a881-199c82f703bd\") " pod="openshift-marketplace/redhat-operators-4l8gq" Mar 10 21:54:07 crc kubenswrapper[4919]: I0310 21:54:07.933014 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4l8gq" Mar 10 21:54:08 crc kubenswrapper[4919]: I0310 21:54:08.134668 4919 ???:1] "http: TLS handshake error from 192.168.126.11:58588: no serving certificate available for the kubelet" Mar 10 21:54:08 crc kubenswrapper[4919]: I0310 21:54:08.343964 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-f4nt4"] Mar 10 21:54:08 crc kubenswrapper[4919]: I0310 21:54:08.449226 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-76b5f4f6f4-cchk5" event={"ID":"683e42c0-cac1-4698-b66c-c2f75a53f388","Type":"ContainerStarted","Data":"e21b3baef7a6eb690aa102df2c67fd107710166d35eac4361de59904d5868d20"} Mar 10 21:54:08 crc kubenswrapper[4919]: I0310 21:54:08.449791 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-76b5f4f6f4-cchk5" event={"ID":"683e42c0-cac1-4698-b66c-c2f75a53f388","Type":"ContainerStarted","Data":"6a2e367ad449083bc4a89e6d76a42fe401c6a828d02c9f58a5c9f77911d80dc5"} Mar 10 21:54:08 crc kubenswrapper[4919]: I0310 21:54:08.451472 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-76b5f4f6f4-cchk5" Mar 10 21:54:08 crc kubenswrapper[4919]: I0310 21:54:08.457546 4919 generic.go:334] "Generic (PLEG): container finished" podID="07397f12-36c1-45ae-9d6d-e51e344cf3bf" containerID="7099388b6b0fc25b3438ad5a59fc5f384bb925ef3589399bb4a34e69def28c02" exitCode=0 Mar 10 21:54:08 crc kubenswrapper[4919]: I0310 
21:54:08.457775 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"07397f12-36c1-45ae-9d6d-e51e344cf3bf","Type":"ContainerDied","Data":"7099388b6b0fc25b3438ad5a59fc5f384bb925ef3589399bb4a34e69def28c02"} Mar 10 21:54:08 crc kubenswrapper[4919]: I0310 21:54:08.474225 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6c7cb9d945-w94dp" event={"ID":"a2559788-98b3-4c30-8959-25b0ceb594a5","Type":"ContainerStarted","Data":"850124690cfa9ed81e7274bb1be644d11f220a650a0b8912069775db1723934c"} Mar 10 21:54:08 crc kubenswrapper[4919]: I0310 21:54:08.475132 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-76b5f4f6f4-cchk5" Mar 10 21:54:08 crc kubenswrapper[4919]: I0310 21:54:08.491634 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-76b5f4f6f4-cchk5" podStartSLOduration=3.491615543 podStartE2EDuration="3.491615543s" podCreationTimestamp="2026-03-10 21:54:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 21:54:08.472532885 +0000 UTC m=+235.714413493" watchObservedRunningTime="2026-03-10 21:54:08.491615543 +0000 UTC m=+235.733496141" Mar 10 21:54:08 crc kubenswrapper[4919]: I0310 21:54:08.666517 4919 patch_prober.go:28] interesting pod/router-default-5444994796-hmjhm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 10 21:54:08 crc kubenswrapper[4919]: [-]has-synced failed: reason withheld Mar 10 21:54:08 crc kubenswrapper[4919]: [+]process-running ok Mar 10 21:54:08 crc kubenswrapper[4919]: healthz check failed Mar 10 21:54:08 crc kubenswrapper[4919]: 
I0310 21:54:08.666569 4919 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-hmjhm" podUID="ca9a516e-afc2-4475-8af8-23504d17f9a9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 10 21:54:08 crc kubenswrapper[4919]: I0310 21:54:08.869102 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 10 21:54:08 crc kubenswrapper[4919]: I0310 21:54:08.869922 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 10 21:54:08 crc kubenswrapper[4919]: I0310 21:54:08.877863 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 10 21:54:08 crc kubenswrapper[4919]: I0310 21:54:08.880051 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 10 21:54:08 crc kubenswrapper[4919]: I0310 21:54:08.885257 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 10 21:54:08 crc kubenswrapper[4919]: I0310 21:54:08.926193 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/37fff766-35dd-471f-a73b-12d85f5845b0-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"37fff766-35dd-471f-a73b-12d85f5845b0\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 10 21:54:08 crc kubenswrapper[4919]: I0310 21:54:08.926334 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/37fff766-35dd-471f-a73b-12d85f5845b0-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"37fff766-35dd-471f-a73b-12d85f5845b0\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 10 
21:54:09 crc kubenswrapper[4919]: I0310 21:54:09.027426 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/37fff766-35dd-471f-a73b-12d85f5845b0-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"37fff766-35dd-471f-a73b-12d85f5845b0\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 10 21:54:09 crc kubenswrapper[4919]: I0310 21:54:09.027495 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/37fff766-35dd-471f-a73b-12d85f5845b0-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"37fff766-35dd-471f-a73b-12d85f5845b0\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 10 21:54:09 crc kubenswrapper[4919]: I0310 21:54:09.027822 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/37fff766-35dd-471f-a73b-12d85f5845b0-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"37fff766-35dd-471f-a73b-12d85f5845b0\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 10 21:54:09 crc kubenswrapper[4919]: I0310 21:54:09.061370 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/37fff766-35dd-471f-a73b-12d85f5845b0-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"37fff766-35dd-471f-a73b-12d85f5845b0\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 10 21:54:09 crc kubenswrapper[4919]: I0310 21:54:09.080734 4919 ???:1] "http: TLS handshake error from 192.168.126.11:58592: no serving certificate available for the kubelet" Mar 10 21:54:09 crc kubenswrapper[4919]: I0310 21:54:09.218688 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 10 21:54:09 crc kubenswrapper[4919]: I0310 21:54:09.666938 4919 patch_prober.go:28] interesting pod/router-default-5444994796-hmjhm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 10 21:54:09 crc kubenswrapper[4919]: [-]has-synced failed: reason withheld Mar 10 21:54:09 crc kubenswrapper[4919]: [+]process-running ok Mar 10 21:54:09 crc kubenswrapper[4919]: healthz check failed Mar 10 21:54:09 crc kubenswrapper[4919]: I0310 21:54:09.666999 4919 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-hmjhm" podUID="ca9a516e-afc2-4475-8af8-23504d17f9a9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 10 21:54:10 crc kubenswrapper[4919]: I0310 21:54:10.665540 4919 patch_prober.go:28] interesting pod/router-default-5444994796-hmjhm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 10 21:54:10 crc kubenswrapper[4919]: [-]has-synced failed: reason withheld Mar 10 21:54:10 crc kubenswrapper[4919]: [+]process-running ok Mar 10 21:54:10 crc kubenswrapper[4919]: healthz check failed Mar 10 21:54:10 crc kubenswrapper[4919]: I0310 21:54:10.665805 4919 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-hmjhm" podUID="ca9a516e-afc2-4475-8af8-23504d17f9a9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 10 21:54:11 crc kubenswrapper[4919]: I0310 21:54:11.667554 4919 patch_prober.go:28] interesting pod/router-default-5444994796-hmjhm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" 
start-of-body=[-]backend-http failed: reason withheld Mar 10 21:54:11 crc kubenswrapper[4919]: [-]has-synced failed: reason withheld Mar 10 21:54:11 crc kubenswrapper[4919]: [+]process-running ok Mar 10 21:54:11 crc kubenswrapper[4919]: healthz check failed Mar 10 21:54:11 crc kubenswrapper[4919]: I0310 21:54:11.667875 4919 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-hmjhm" podUID="ca9a516e-afc2-4475-8af8-23504d17f9a9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 10 21:54:12 crc kubenswrapper[4919]: I0310 21:54:12.666094 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-hmjhm" Mar 10 21:54:12 crc kubenswrapper[4919]: I0310 21:54:12.669019 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-hmjhm" Mar 10 21:54:13 crc kubenswrapper[4919]: I0310 21:54:13.126070 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-26f88" Mar 10 21:54:13 crc kubenswrapper[4919]: I0310 21:54:13.279157 4919 ???:1] "http: TLS handshake error from 192.168.126.11:60538: no serving certificate available for the kubelet" Mar 10 21:54:14 crc kubenswrapper[4919]: W0310 21:54:14.319503 4919 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod28b0abdd_217d_42f6_80fb_b270be44700e.slice/crio-f510b9e8c3ce17c9809ee1c110751a266af6add07e24854fbeb29359742a2325 WatchSource:0}: Error finding container f510b9e8c3ce17c9809ee1c110751a266af6add07e24854fbeb29359742a2325: Status 404 returned error can't find the container with id f510b9e8c3ce17c9809ee1c110751a266af6add07e24854fbeb29359742a2325 Mar 10 21:54:14 crc kubenswrapper[4919]: I0310 21:54:14.517455 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-f4nt4" event={"ID":"28b0abdd-217d-42f6-80fb-b270be44700e","Type":"ContainerStarted","Data":"f510b9e8c3ce17c9809ee1c110751a266af6add07e24854fbeb29359742a2325"} Mar 10 21:54:14 crc kubenswrapper[4919]: I0310 21:54:14.521667 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6c7cb9d945-w94dp" event={"ID":"a2559788-98b3-4c30-8959-25b0ceb594a5","Type":"ContainerStarted","Data":"2532eabb47c9d3e36269dc0fa91947417402628bd25ca1ca4484cb4b1b2bfe72"} Mar 10 21:54:14 crc kubenswrapper[4919]: I0310 21:54:14.522208 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6c7cb9d945-w94dp" Mar 10 21:54:14 crc kubenswrapper[4919]: I0310 21:54:14.810466 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 10 21:54:14 crc kubenswrapper[4919]: I0310 21:54:14.825895 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6c7cb9d945-w94dp" podStartSLOduration=9.825876698 podStartE2EDuration="9.825876698s" podCreationTimestamp="2026-03-10 21:54:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 21:54:14.538718055 +0000 UTC m=+241.780598663" watchObservedRunningTime="2026-03-10 21:54:14.825876698 +0000 UTC m=+242.067757306" Mar 10 21:54:14 crc kubenswrapper[4919]: I0310 21:54:14.918663 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/07397f12-36c1-45ae-9d6d-e51e344cf3bf-kube-api-access\") pod \"07397f12-36c1-45ae-9d6d-e51e344cf3bf\" (UID: \"07397f12-36c1-45ae-9d6d-e51e344cf3bf\") " Mar 10 21:54:14 crc kubenswrapper[4919]: I0310 
21:54:14.918721 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/07397f12-36c1-45ae-9d6d-e51e344cf3bf-kubelet-dir\") pod \"07397f12-36c1-45ae-9d6d-e51e344cf3bf\" (UID: \"07397f12-36c1-45ae-9d6d-e51e344cf3bf\") " Mar 10 21:54:14 crc kubenswrapper[4919]: I0310 21:54:14.918893 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/07397f12-36c1-45ae-9d6d-e51e344cf3bf-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "07397f12-36c1-45ae-9d6d-e51e344cf3bf" (UID: "07397f12-36c1-45ae-9d6d-e51e344cf3bf"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 21:54:14 crc kubenswrapper[4919]: I0310 21:54:14.919157 4919 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/07397f12-36c1-45ae-9d6d-e51e344cf3bf-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 10 21:54:14 crc kubenswrapper[4919]: I0310 21:54:14.927014 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07397f12-36c1-45ae-9d6d-e51e344cf3bf-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "07397f12-36c1-45ae-9d6d-e51e344cf3bf" (UID: "07397f12-36c1-45ae-9d6d-e51e344cf3bf"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 21:54:15 crc kubenswrapper[4919]: I0310 21:54:15.017316 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6c7cb9d945-w94dp" Mar 10 21:54:15 crc kubenswrapper[4919]: I0310 21:54:15.020038 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/07397f12-36c1-45ae-9d6d-e51e344cf3bf-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 10 21:54:15 crc kubenswrapper[4919]: I0310 21:54:15.528843 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 10 21:54:15 crc kubenswrapper[4919]: I0310 21:54:15.528874 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"07397f12-36c1-45ae-9d6d-e51e344cf3bf","Type":"ContainerDied","Data":"0e5f22ca5ab49565f40bea53ef5cc3a2c56c9089b9dfe6fae06b6b54f8a3be27"} Mar 10 21:54:15 crc kubenswrapper[4919]: I0310 21:54:15.528938 4919 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0e5f22ca5ab49565f40bea53ef5cc3a2c56c9089b9dfe6fae06b6b54f8a3be27" Mar 10 21:54:17 crc kubenswrapper[4919]: I0310 21:54:17.644219 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-49z6f" Mar 10 21:54:17 crc kubenswrapper[4919]: I0310 21:54:17.665846 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-58nxf" Mar 10 21:54:17 crc kubenswrapper[4919]: I0310 21:54:17.675136 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-58nxf" Mar 10 21:54:19 crc kubenswrapper[4919]: I0310 21:54:19.718108 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a95e8b73-ffed-4248-b8ba-99fc7c5b900f-metrics-certs\") pod \"network-metrics-daemon-ckwhl\" (UID: \"a95e8b73-ffed-4248-b8ba-99fc7c5b900f\") " pod="openshift-multus/network-metrics-daemon-ckwhl" Mar 10 21:54:19 crc kubenswrapper[4919]: I0310 21:54:19.723116 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 10 21:54:19 crc kubenswrapper[4919]: I0310 21:54:19.740521 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a95e8b73-ffed-4248-b8ba-99fc7c5b900f-metrics-certs\") pod \"network-metrics-daemon-ckwhl\" (UID: \"a95e8b73-ffed-4248-b8ba-99fc7c5b900f\") " pod="openshift-multus/network-metrics-daemon-ckwhl" Mar 10 21:54:19 crc kubenswrapper[4919]: I0310 21:54:19.807622 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 10 21:54:19 crc kubenswrapper[4919]: I0310 21:54:19.816262 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-ckwhl" Mar 10 21:54:23 crc kubenswrapper[4919]: E0310 21:54:23.347156 4919 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/openshift4/ose-cli:latest" Mar 10 21:54:23 crc kubenswrapper[4919]: E0310 21:54:23.347596 4919 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 10 21:54:23 crc kubenswrapper[4919]: container &Container{Name:oc,Image:registry.redhat.io/openshift4/ose-cli:latest,Command:[/bin/bash -c oc get csr -o go-template='{{range .items}}{{if not .status}}{{.metadata.name}}{{"\n"}}{{end}}{{end}}' | xargs --no-run-if-empty oc adm certificate approve Mar 10 21:54:23 crc kubenswrapper[4919]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-rp685,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod auto-csr-approver-29552994-rvxmh_openshift-infra(cac0bc08-6186-43fb-bebe-036c98331599): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled Mar 10 21:54:23 crc kubenswrapper[4919]: > logger="UnhandledError" Mar 10 21:54:23 crc kubenswrapper[4919]: E0310 21:54:23.348809 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with 
ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-infra/auto-csr-approver-29552994-rvxmh" podUID="cac0bc08-6186-43fb-bebe-036c98331599" Mar 10 21:54:23 crc kubenswrapper[4919]: E0310 21:54:23.651704 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" pod="openshift-infra/auto-csr-approver-29552994-rvxmh" podUID="cac0bc08-6186-43fb-bebe-036c98331599" Mar 10 21:54:24 crc kubenswrapper[4919]: I0310 21:54:24.742528 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-76b5f4f6f4-cchk5"] Mar 10 21:54:24 crc kubenswrapper[4919]: I0310 21:54:24.743118 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-76b5f4f6f4-cchk5" podUID="683e42c0-cac1-4698-b66c-c2f75a53f388" containerName="controller-manager" containerID="cri-o://e21b3baef7a6eb690aa102df2c67fd107710166d35eac4361de59904d5868d20" gracePeriod=30 Mar 10 21:54:24 crc kubenswrapper[4919]: I0310 21:54:24.754187 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6c7cb9d945-w94dp"] Mar 10 21:54:24 crc kubenswrapper[4919]: I0310 21:54:24.754408 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6c7cb9d945-w94dp" podUID="a2559788-98b3-4c30-8959-25b0ceb594a5" containerName="route-controller-manager" containerID="cri-o://2532eabb47c9d3e36269dc0fa91947417402628bd25ca1ca4484cb4b1b2bfe72" gracePeriod=30 Mar 10 21:54:24 crc kubenswrapper[4919]: E0310 21:54:24.932605 4919 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda2559788_98b3_4c30_8959_25b0ceb594a5.slice/crio-conmon-2532eabb47c9d3e36269dc0fa91947417402628bd25ca1ca4484cb4b1b2bfe72.scope\": RecentStats: unable to find data in memory cache]" Mar 10 21:54:25 crc kubenswrapper[4919]: I0310 21:54:25.585471 4919 generic.go:334] "Generic (PLEG): container finished" podID="683e42c0-cac1-4698-b66c-c2f75a53f388" containerID="e21b3baef7a6eb690aa102df2c67fd107710166d35eac4361de59904d5868d20" exitCode=0 Mar 10 21:54:25 crc kubenswrapper[4919]: I0310 21:54:25.585550 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-76b5f4f6f4-cchk5" event={"ID":"683e42c0-cac1-4698-b66c-c2f75a53f388","Type":"ContainerDied","Data":"e21b3baef7a6eb690aa102df2c67fd107710166d35eac4361de59904d5868d20"} Mar 10 21:54:25 crc kubenswrapper[4919]: I0310 21:54:25.587556 4919 generic.go:334] "Generic (PLEG): container finished" podID="a2559788-98b3-4c30-8959-25b0ceb594a5" containerID="2532eabb47c9d3e36269dc0fa91947417402628bd25ca1ca4484cb4b1b2bfe72" exitCode=0 Mar 10 21:54:25 crc kubenswrapper[4919]: I0310 21:54:25.587579 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6c7cb9d945-w94dp" event={"ID":"a2559788-98b3-4c30-8959-25b0ceb594a5","Type":"ContainerDied","Data":"2532eabb47c9d3e36269dc0fa91947417402628bd25ca1ca4484cb4b1b2bfe72"} Mar 10 21:54:26 crc kubenswrapper[4919]: I0310 21:54:26.383352 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-lxrsj" Mar 10 21:54:27 crc kubenswrapper[4919]: I0310 21:54:27.069462 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4l8gq"] Mar 10 21:54:27 crc kubenswrapper[4919]: I0310 21:54:27.077893 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 10 
21:54:28 crc kubenswrapper[4919]: I0310 21:54:28.118115 4919 patch_prober.go:28] interesting pod/route-controller-manager-6c7cb9d945-w94dp container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.53:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 10 21:54:28 crc kubenswrapper[4919]: I0310 21:54:28.118681 4919 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6c7cb9d945-w94dp" podUID="a2559788-98b3-4c30-8959-25b0ceb594a5" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.53:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 10 21:54:28 crc kubenswrapper[4919]: I0310 21:54:28.125440 4919 patch_prober.go:28] interesting pod/controller-manager-76b5f4f6f4-cchk5 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.54:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 10 21:54:28 crc kubenswrapper[4919]: I0310 21:54:28.125498 4919 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-76b5f4f6f4-cchk5" podUID="683e42c0-cac1-4698-b66c-c2f75a53f388" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.54:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 10 21:54:28 crc kubenswrapper[4919]: E0310 21:54:28.365718 4919 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" 
image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 10 21:54:28 crc kubenswrapper[4919]: E0310 21:54:28.365886 4919 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4g7mn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-bx4lk_openshift-marketplace(fb39d6ee-8d9d-4cd6-811a-fbbb8d7e018c): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 10 
21:54:28 crc kubenswrapper[4919]: E0310 21:54:28.366358 4919 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 10 21:54:28 crc kubenswrapper[4919]: E0310 21:54:28.366522 4919 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-grznq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
community-operators-pw22n_openshift-marketplace(ccd7b53d-726b-444f-be0f-4eb2655eb35d): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 10 21:54:28 crc kubenswrapper[4919]: E0310 21:54:28.367456 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-bx4lk" podUID="fb39d6ee-8d9d-4cd6-811a-fbbb8d7e018c" Mar 10 21:54:28 crc kubenswrapper[4919]: E0310 21:54:28.368536 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-pw22n" podUID="ccd7b53d-726b-444f-be0f-4eb2655eb35d" Mar 10 21:54:29 crc kubenswrapper[4919]: I0310 21:54:29.176001 4919 patch_prober.go:28] interesting pod/machine-config-daemon-z7v4t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 21:54:29 crc kubenswrapper[4919]: I0310 21:54:29.176254 4919 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 21:54:30 crc kubenswrapper[4919]: E0310 21:54:30.038699 4919 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: 
copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Mar 10 21:54:30 crc kubenswrapper[4919]: E0310 21:54:30.054741 4919 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cbtv9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-s8qvz_openshift-marketplace(b8a6c263-cf9a-41f9-8ea0-fb07b0596a35): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" 
logger="UnhandledError" Mar 10 21:54:30 crc kubenswrapper[4919]: E0310 21:54:30.056489 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-s8qvz" podUID="b8a6c263-cf9a-41f9-8ea0-fb07b0596a35" Mar 10 21:54:30 crc kubenswrapper[4919]: I0310 21:54:30.081350 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6c7cb9d945-w94dp" Mar 10 21:54:30 crc kubenswrapper[4919]: E0310 21:54:30.083511 4919 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Mar 10 21:54:30 crc kubenswrapper[4919]: E0310 21:54:30.083748 4919 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-sz2bm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-74hb6_openshift-marketplace(a979e53c-0904-4fc0-9ef4-16706a351785): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 10 21:54:30 crc kubenswrapper[4919]: E0310 21:54:30.085139 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-74hb6" podUID="a979e53c-0904-4fc0-9ef4-16706a351785" Mar 10 21:54:30 crc 
kubenswrapper[4919]: I0310 21:54:30.113603 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-76b5f4f6f4-cchk5" Mar 10 21:54:30 crc kubenswrapper[4919]: I0310 21:54:30.114023 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6d9c7cf599-w5hkh"] Mar 10 21:54:30 crc kubenswrapper[4919]: E0310 21:54:30.114660 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="683e42c0-cac1-4698-b66c-c2f75a53f388" containerName="controller-manager" Mar 10 21:54:30 crc kubenswrapper[4919]: I0310 21:54:30.114676 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="683e42c0-cac1-4698-b66c-c2f75a53f388" containerName="controller-manager" Mar 10 21:54:30 crc kubenswrapper[4919]: E0310 21:54:30.114686 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2559788-98b3-4c30-8959-25b0ceb594a5" containerName="route-controller-manager" Mar 10 21:54:30 crc kubenswrapper[4919]: I0310 21:54:30.114694 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2559788-98b3-4c30-8959-25b0ceb594a5" containerName="route-controller-manager" Mar 10 21:54:30 crc kubenswrapper[4919]: E0310 21:54:30.114710 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07397f12-36c1-45ae-9d6d-e51e344cf3bf" containerName="pruner" Mar 10 21:54:30 crc kubenswrapper[4919]: I0310 21:54:30.114718 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="07397f12-36c1-45ae-9d6d-e51e344cf3bf" containerName="pruner" Mar 10 21:54:30 crc kubenswrapper[4919]: I0310 21:54:30.114891 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="07397f12-36c1-45ae-9d6d-e51e344cf3bf" containerName="pruner" Mar 10 21:54:30 crc kubenswrapper[4919]: I0310 21:54:30.114909 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2559788-98b3-4c30-8959-25b0ceb594a5" containerName="route-controller-manager" 
Mar 10 21:54:30 crc kubenswrapper[4919]: I0310 21:54:30.114918 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="683e42c0-cac1-4698-b66c-c2f75a53f388" containerName="controller-manager" Mar 10 21:54:30 crc kubenswrapper[4919]: I0310 21:54:30.115472 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6d9c7cf599-w5hkh" Mar 10 21:54:30 crc kubenswrapper[4919]: I0310 21:54:30.124098 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6d9c7cf599-w5hkh"] Mar 10 21:54:30 crc kubenswrapper[4919]: I0310 21:54:30.162674 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-ckwhl"] Mar 10 21:54:30 crc kubenswrapper[4919]: I0310 21:54:30.167581 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rhzfq\" (UniqueName: \"kubernetes.io/projected/a2559788-98b3-4c30-8959-25b0ceb594a5-kube-api-access-rhzfq\") pod \"a2559788-98b3-4c30-8959-25b0ceb594a5\" (UID: \"a2559788-98b3-4c30-8959-25b0ceb594a5\") " Mar 10 21:54:30 crc kubenswrapper[4919]: I0310 21:54:30.167632 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/683e42c0-cac1-4698-b66c-c2f75a53f388-client-ca\") pod \"683e42c0-cac1-4698-b66c-c2f75a53f388\" (UID: \"683e42c0-cac1-4698-b66c-c2f75a53f388\") " Mar 10 21:54:30 crc kubenswrapper[4919]: I0310 21:54:30.167660 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/683e42c0-cac1-4698-b66c-c2f75a53f388-config\") pod \"683e42c0-cac1-4698-b66c-c2f75a53f388\" (UID: \"683e42c0-cac1-4698-b66c-c2f75a53f388\") " Mar 10 21:54:30 crc kubenswrapper[4919]: I0310 21:54:30.167769 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ec35e4f4-8013-49e0-ad6e-efaede5a41ca-serving-cert\") pod \"route-controller-manager-6d9c7cf599-w5hkh\" (UID: \"ec35e4f4-8013-49e0-ad6e-efaede5a41ca\") " pod="openshift-route-controller-manager/route-controller-manager-6d9c7cf599-w5hkh" Mar 10 21:54:30 crc kubenswrapper[4919]: I0310 21:54:30.167835 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec35e4f4-8013-49e0-ad6e-efaede5a41ca-config\") pod \"route-controller-manager-6d9c7cf599-w5hkh\" (UID: \"ec35e4f4-8013-49e0-ad6e-efaede5a41ca\") " pod="openshift-route-controller-manager/route-controller-manager-6d9c7cf599-w5hkh" Mar 10 21:54:30 crc kubenswrapper[4919]: I0310 21:54:30.167878 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94shs\" (UniqueName: \"kubernetes.io/projected/ec35e4f4-8013-49e0-ad6e-efaede5a41ca-kube-api-access-94shs\") pod \"route-controller-manager-6d9c7cf599-w5hkh\" (UID: \"ec35e4f4-8013-49e0-ad6e-efaede5a41ca\") " pod="openshift-route-controller-manager/route-controller-manager-6d9c7cf599-w5hkh" Mar 10 21:54:30 crc kubenswrapper[4919]: I0310 21:54:30.167961 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ec35e4f4-8013-49e0-ad6e-efaede5a41ca-client-ca\") pod \"route-controller-manager-6d9c7cf599-w5hkh\" (UID: \"ec35e4f4-8013-49e0-ad6e-efaede5a41ca\") " pod="openshift-route-controller-manager/route-controller-manager-6d9c7cf599-w5hkh" Mar 10 21:54:30 crc kubenswrapper[4919]: I0310 21:54:30.168585 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/683e42c0-cac1-4698-b66c-c2f75a53f388-client-ca" (OuterVolumeSpecName: "client-ca") pod "683e42c0-cac1-4698-b66c-c2f75a53f388" (UID: 
"683e42c0-cac1-4698-b66c-c2f75a53f388"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 21:54:30 crc kubenswrapper[4919]: I0310 21:54:30.168653 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/683e42c0-cac1-4698-b66c-c2f75a53f388-config" (OuterVolumeSpecName: "config") pod "683e42c0-cac1-4698-b66c-c2f75a53f388" (UID: "683e42c0-cac1-4698-b66c-c2f75a53f388"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 21:54:30 crc kubenswrapper[4919]: I0310 21:54:30.172784 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2559788-98b3-4c30-8959-25b0ceb594a5-kube-api-access-rhzfq" (OuterVolumeSpecName: "kube-api-access-rhzfq") pod "a2559788-98b3-4c30-8959-25b0ceb594a5" (UID: "a2559788-98b3-4c30-8959-25b0ceb594a5"). InnerVolumeSpecName "kube-api-access-rhzfq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 21:54:30 crc kubenswrapper[4919]: I0310 21:54:30.268490 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kdqxs\" (UniqueName: \"kubernetes.io/projected/683e42c0-cac1-4698-b66c-c2f75a53f388-kube-api-access-kdqxs\") pod \"683e42c0-cac1-4698-b66c-c2f75a53f388\" (UID: \"683e42c0-cac1-4698-b66c-c2f75a53f388\") "
Mar 10 21:54:30 crc kubenswrapper[4919]: I0310 21:54:30.268527 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a2559788-98b3-4c30-8959-25b0ceb594a5-client-ca\") pod \"a2559788-98b3-4c30-8959-25b0ceb594a5\" (UID: \"a2559788-98b3-4c30-8959-25b0ceb594a5\") "
Mar 10 21:54:30 crc kubenswrapper[4919]: I0310 21:54:30.268560 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/683e42c0-cac1-4698-b66c-c2f75a53f388-serving-cert\") pod \"683e42c0-cac1-4698-b66c-c2f75a53f388\" (UID: \"683e42c0-cac1-4698-b66c-c2f75a53f388\") "
Mar 10 21:54:30 crc kubenswrapper[4919]: I0310 21:54:30.268578 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a2559788-98b3-4c30-8959-25b0ceb594a5-config\") pod \"a2559788-98b3-4c30-8959-25b0ceb594a5\" (UID: \"a2559788-98b3-4c30-8959-25b0ceb594a5\") "
Mar 10 21:54:30 crc kubenswrapper[4919]: I0310 21:54:30.268608 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/683e42c0-cac1-4698-b66c-c2f75a53f388-proxy-ca-bundles\") pod \"683e42c0-cac1-4698-b66c-c2f75a53f388\" (UID: \"683e42c0-cac1-4698-b66c-c2f75a53f388\") "
Mar 10 21:54:30 crc kubenswrapper[4919]: I0310 21:54:30.268627 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a2559788-98b3-4c30-8959-25b0ceb594a5-serving-cert\") pod \"a2559788-98b3-4c30-8959-25b0ceb594a5\" (UID: \"a2559788-98b3-4c30-8959-25b0ceb594a5\") "
Mar 10 21:54:30 crc kubenswrapper[4919]: I0310 21:54:30.268706 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ec35e4f4-8013-49e0-ad6e-efaede5a41ca-serving-cert\") pod \"route-controller-manager-6d9c7cf599-w5hkh\" (UID: \"ec35e4f4-8013-49e0-ad6e-efaede5a41ca\") " pod="openshift-route-controller-manager/route-controller-manager-6d9c7cf599-w5hkh"
Mar 10 21:54:30 crc kubenswrapper[4919]: I0310 21:54:30.268753 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec35e4f4-8013-49e0-ad6e-efaede5a41ca-config\") pod \"route-controller-manager-6d9c7cf599-w5hkh\" (UID: \"ec35e4f4-8013-49e0-ad6e-efaede5a41ca\") " pod="openshift-route-controller-manager/route-controller-manager-6d9c7cf599-w5hkh"
Mar 10 21:54:30 crc kubenswrapper[4919]: I0310 21:54:30.268780 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-94shs\" (UniqueName: \"kubernetes.io/projected/ec35e4f4-8013-49e0-ad6e-efaede5a41ca-kube-api-access-94shs\") pod \"route-controller-manager-6d9c7cf599-w5hkh\" (UID: \"ec35e4f4-8013-49e0-ad6e-efaede5a41ca\") " pod="openshift-route-controller-manager/route-controller-manager-6d9c7cf599-w5hkh"
Mar 10 21:54:30 crc kubenswrapper[4919]: I0310 21:54:30.268835 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ec35e4f4-8013-49e0-ad6e-efaede5a41ca-client-ca\") pod \"route-controller-manager-6d9c7cf599-w5hkh\" (UID: \"ec35e4f4-8013-49e0-ad6e-efaede5a41ca\") " pod="openshift-route-controller-manager/route-controller-manager-6d9c7cf599-w5hkh"
Mar 10 21:54:30 crc kubenswrapper[4919]: I0310 21:54:30.268888 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rhzfq\" (UniqueName: \"kubernetes.io/projected/a2559788-98b3-4c30-8959-25b0ceb594a5-kube-api-access-rhzfq\") on node \"crc\" DevicePath \"\""
Mar 10 21:54:30 crc kubenswrapper[4919]: I0310 21:54:30.268898 4919 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/683e42c0-cac1-4698-b66c-c2f75a53f388-client-ca\") on node \"crc\" DevicePath \"\""
Mar 10 21:54:30 crc kubenswrapper[4919]: I0310 21:54:30.268907 4919 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/683e42c0-cac1-4698-b66c-c2f75a53f388-config\") on node \"crc\" DevicePath \"\""
Mar 10 21:54:30 crc kubenswrapper[4919]: I0310 21:54:30.269158 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a2559788-98b3-4c30-8959-25b0ceb594a5-client-ca" (OuterVolumeSpecName: "client-ca") pod "a2559788-98b3-4c30-8959-25b0ceb594a5" (UID: "a2559788-98b3-4c30-8959-25b0ceb594a5"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 21:54:30 crc kubenswrapper[4919]: I0310 21:54:30.269199 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/683e42c0-cac1-4698-b66c-c2f75a53f388-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "683e42c0-cac1-4698-b66c-c2f75a53f388" (UID: "683e42c0-cac1-4698-b66c-c2f75a53f388"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 21:54:30 crc kubenswrapper[4919]: I0310 21:54:30.269614 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ec35e4f4-8013-49e0-ad6e-efaede5a41ca-client-ca\") pod \"route-controller-manager-6d9c7cf599-w5hkh\" (UID: \"ec35e4f4-8013-49e0-ad6e-efaede5a41ca\") " pod="openshift-route-controller-manager/route-controller-manager-6d9c7cf599-w5hkh"
Mar 10 21:54:30 crc kubenswrapper[4919]: I0310 21:54:30.269892 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec35e4f4-8013-49e0-ad6e-efaede5a41ca-config\") pod \"route-controller-manager-6d9c7cf599-w5hkh\" (UID: \"ec35e4f4-8013-49e0-ad6e-efaede5a41ca\") " pod="openshift-route-controller-manager/route-controller-manager-6d9c7cf599-w5hkh"
Mar 10 21:54:30 crc kubenswrapper[4919]: I0310 21:54:30.271409 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a2559788-98b3-4c30-8959-25b0ceb594a5-config" (OuterVolumeSpecName: "config") pod "a2559788-98b3-4c30-8959-25b0ceb594a5" (UID: "a2559788-98b3-4c30-8959-25b0ceb594a5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 21:54:30 crc kubenswrapper[4919]: I0310 21:54:30.273917 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2559788-98b3-4c30-8959-25b0ceb594a5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "a2559788-98b3-4c30-8959-25b0ceb594a5" (UID: "a2559788-98b3-4c30-8959-25b0ceb594a5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 21:54:30 crc kubenswrapper[4919]: I0310 21:54:30.273943 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/683e42c0-cac1-4698-b66c-c2f75a53f388-kube-api-access-kdqxs" (OuterVolumeSpecName: "kube-api-access-kdqxs") pod "683e42c0-cac1-4698-b66c-c2f75a53f388" (UID: "683e42c0-cac1-4698-b66c-c2f75a53f388"). InnerVolumeSpecName "kube-api-access-kdqxs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 21:54:30 crc kubenswrapper[4919]: I0310 21:54:30.274994 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ec35e4f4-8013-49e0-ad6e-efaede5a41ca-serving-cert\") pod \"route-controller-manager-6d9c7cf599-w5hkh\" (UID: \"ec35e4f4-8013-49e0-ad6e-efaede5a41ca\") " pod="openshift-route-controller-manager/route-controller-manager-6d9c7cf599-w5hkh"
Mar 10 21:54:30 crc kubenswrapper[4919]: I0310 21:54:30.280361 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/683e42c0-cac1-4698-b66c-c2f75a53f388-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "683e42c0-cac1-4698-b66c-c2f75a53f388" (UID: "683e42c0-cac1-4698-b66c-c2f75a53f388"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 21:54:30 crc kubenswrapper[4919]: I0310 21:54:30.283088 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-94shs\" (UniqueName: \"kubernetes.io/projected/ec35e4f4-8013-49e0-ad6e-efaede5a41ca-kube-api-access-94shs\") pod \"route-controller-manager-6d9c7cf599-w5hkh\" (UID: \"ec35e4f4-8013-49e0-ad6e-efaede5a41ca\") " pod="openshift-route-controller-manager/route-controller-manager-6d9c7cf599-w5hkh"
Mar 10 21:54:30 crc kubenswrapper[4919]: I0310 21:54:30.369541 4919 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a2559788-98b3-4c30-8959-25b0ceb594a5-client-ca\") on node \"crc\" DevicePath \"\""
Mar 10 21:54:30 crc kubenswrapper[4919]: I0310 21:54:30.369570 4919 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/683e42c0-cac1-4698-b66c-c2f75a53f388-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 10 21:54:30 crc kubenswrapper[4919]: I0310 21:54:30.369579 4919 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a2559788-98b3-4c30-8959-25b0ceb594a5-config\") on node \"crc\" DevicePath \"\""
Mar 10 21:54:30 crc kubenswrapper[4919]: I0310 21:54:30.369589 4919 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/683e42c0-cac1-4698-b66c-c2f75a53f388-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 10 21:54:30 crc kubenswrapper[4919]: I0310 21:54:30.369598 4919 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a2559788-98b3-4c30-8959-25b0ceb594a5-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 10 21:54:30 crc kubenswrapper[4919]: I0310 21:54:30.369610 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kdqxs\" (UniqueName: \"kubernetes.io/projected/683e42c0-cac1-4698-b66c-c2f75a53f388-kube-api-access-kdqxs\") on node \"crc\" DevicePath \"\""
Mar 10 21:54:30 crc kubenswrapper[4919]: I0310 21:54:30.445669 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6d9c7cf599-w5hkh"
Mar 10 21:54:30 crc kubenswrapper[4919]: I0310 21:54:30.611525 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4l8gq" event={"ID":"e2db42cd-0c43-41be-a881-199c82f703bd","Type":"ContainerStarted","Data":"8e130c7ff4578bb6de415aa1ab4f4c9cfac5a99971a54d041724826a529fd399"}
Mar 10 21:54:30 crc kubenswrapper[4919]: I0310 21:54:30.613012 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"37fff766-35dd-471f-a73b-12d85f5845b0","Type":"ContainerStarted","Data":"7d7b70e9d254587ba6394813ab5db336d019673e5db6ced154be8e374da1d405"}
Mar 10 21:54:30 crc kubenswrapper[4919]: I0310 21:54:30.614351 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6c7cb9d945-w94dp" event={"ID":"a2559788-98b3-4c30-8959-25b0ceb594a5","Type":"ContainerDied","Data":"850124690cfa9ed81e7274bb1be644d11f220a650a0b8912069775db1723934c"}
Mar 10 21:54:30 crc kubenswrapper[4919]: I0310 21:54:30.614385 4919 scope.go:117] "RemoveContainer" containerID="2532eabb47c9d3e36269dc0fa91947417402628bd25ca1ca4484cb4b1b2bfe72"
Mar 10 21:54:30 crc kubenswrapper[4919]: I0310 21:54:30.614431 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6c7cb9d945-w94dp"
Mar 10 21:54:30 crc kubenswrapper[4919]: I0310 21:54:30.615908 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-76b5f4f6f4-cchk5" event={"ID":"683e42c0-cac1-4698-b66c-c2f75a53f388","Type":"ContainerDied","Data":"6a2e367ad449083bc4a89e6d76a42fe401c6a828d02c9f58a5c9f77911d80dc5"}
Mar 10 21:54:30 crc kubenswrapper[4919]: I0310 21:54:30.615921 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-76b5f4f6f4-cchk5"
Mar 10 21:54:30 crc kubenswrapper[4919]: I0310 21:54:30.619446 4919 generic.go:334] "Generic (PLEG): container finished" podID="28b0abdd-217d-42f6-80fb-b270be44700e" containerID="3f4a81ec14abfa74518b7051132da3a4ad2c7106815e295f014892ed1e40fccb" exitCode=0
Mar 10 21:54:30 crc kubenswrapper[4919]: I0310 21:54:30.620328 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f4nt4" event={"ID":"28b0abdd-217d-42f6-80fb-b270be44700e","Type":"ContainerDied","Data":"3f4a81ec14abfa74518b7051132da3a4ad2c7106815e295f014892ed1e40fccb"}
Mar 10 21:54:30 crc kubenswrapper[4919]: I0310 21:54:30.677183 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6c7cb9d945-w94dp"]
Mar 10 21:54:30 crc kubenswrapper[4919]: I0310 21:54:30.679516 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6c7cb9d945-w94dp"]
Mar 10 21:54:30 crc kubenswrapper[4919]: I0310 21:54:30.687786 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-76b5f4f6f4-cchk5"]
Mar 10 21:54:30 crc kubenswrapper[4919]: I0310 21:54:30.693142 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-76b5f4f6f4-cchk5"]
Mar 10 21:54:31 crc kubenswrapper[4919]: I0310 21:54:31.486946 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="683e42c0-cac1-4698-b66c-c2f75a53f388" path="/var/lib/kubelet/pods/683e42c0-cac1-4698-b66c-c2f75a53f388/volumes"
Mar 10 21:54:31 crc kubenswrapper[4919]: I0310 21:54:31.487920 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a2559788-98b3-4c30-8959-25b0ceb594a5" path="/var/lib/kubelet/pods/a2559788-98b3-4c30-8959-25b0ceb594a5/volumes"
Mar 10 21:54:31 crc kubenswrapper[4919]: E0310 21:54:31.489977 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-74hb6" podUID="a979e53c-0904-4fc0-9ef4-16706a351785"
Mar 10 21:54:31 crc kubenswrapper[4919]: E0310 21:54:31.490026 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-bx4lk" podUID="fb39d6ee-8d9d-4cd6-811a-fbbb8d7e018c"
Mar 10 21:54:31 crc kubenswrapper[4919]: E0310 21:54:31.490033 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-pw22n" podUID="ccd7b53d-726b-444f-be0f-4eb2655eb35d"
Mar 10 21:54:31 crc kubenswrapper[4919]: E0310 21:54:31.490054 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-s8qvz" podUID="b8a6c263-cf9a-41f9-8ea0-fb07b0596a35"
Mar 10 21:54:31 crc kubenswrapper[4919]: I0310 21:54:31.497080 4919 scope.go:117] "RemoveContainer" containerID="e21b3baef7a6eb690aa102df2c67fd107710166d35eac4361de59904d5868d20"
Mar 10 21:54:31 crc kubenswrapper[4919]: I0310 21:54:31.651075 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-ckwhl" event={"ID":"a95e8b73-ffed-4248-b8ba-99fc7c5b900f","Type":"ContainerStarted","Data":"79f80c84bb4b074120c42f79c8b7257103f372a0f4c75caa817de1646e491782"}
Mar 10 21:54:31 crc kubenswrapper[4919]: I0310 21:54:31.916123 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6d9c7cf599-w5hkh"]
Mar 10 21:54:31 crc kubenswrapper[4919]: W0310 21:54:31.929640 4919 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podec35e4f4_8013_49e0_ad6e_efaede5a41ca.slice/crio-60f6d73f868fa840720239c52f519c36bd0147b7eb48d6c420e27b51b29e5d60 WatchSource:0}: Error finding container 60f6d73f868fa840720239c52f519c36bd0147b7eb48d6c420e27b51b29e5d60: Status 404 returned error can't find the container with id 60f6d73f868fa840720239c52f519c36bd0147b7eb48d6c420e27b51b29e5d60
Mar 10 21:54:32 crc kubenswrapper[4919]: I0310 21:54:32.217018 4919 csr.go:261] certificate signing request csr-5wj4j is approved, waiting to be issued
Mar 10 21:54:32 crc kubenswrapper[4919]: I0310 21:54:32.225065 4919 csr.go:257] certificate signing request csr-5wj4j is issued
Mar 10 21:54:32 crc kubenswrapper[4919]: I0310 21:54:32.664352 4919 generic.go:334] "Generic (PLEG): container finished" podID="99f42fb7-eaa5-46d2-9443-81ad7a563cec" containerID="55595d6a43e0adabea5d56c99bc7518e004a79b1114f60522b987df8fb6a712c" exitCode=0
Mar 10 21:54:32 crc kubenswrapper[4919]: I0310 21:54:32.664451 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552992-kp6wz" event={"ID":"99f42fb7-eaa5-46d2-9443-81ad7a563cec","Type":"ContainerDied","Data":"55595d6a43e0adabea5d56c99bc7518e004a79b1114f60522b987df8fb6a712c"}
Mar 10 21:54:32 crc kubenswrapper[4919]: I0310 21:54:32.666629 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-ckwhl" event={"ID":"a95e8b73-ffed-4248-b8ba-99fc7c5b900f","Type":"ContainerStarted","Data":"0d43ef4ccc3ea161815d81311fbe64c9b4b1c7cd9881264c06b36f62257fb0b9"}
Mar 10 21:54:32 crc kubenswrapper[4919]: I0310 21:54:32.666656 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-ckwhl" event={"ID":"a95e8b73-ffed-4248-b8ba-99fc7c5b900f","Type":"ContainerStarted","Data":"381cd469bea04350ba1a7e5f30e83b6986501e3966e827cd281bb51027eb8ca2"}
Mar 10 21:54:32 crc kubenswrapper[4919]: I0310 21:54:32.668234 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6d9c7cf599-w5hkh" event={"ID":"ec35e4f4-8013-49e0-ad6e-efaede5a41ca","Type":"ContainerStarted","Data":"9dc2a9d491c74c8f020a2b5e5b86ca4947f66441f6260e315a7d800e1ab5fc4b"}
Mar 10 21:54:32 crc kubenswrapper[4919]: I0310 21:54:32.668268 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6d9c7cf599-w5hkh" event={"ID":"ec35e4f4-8013-49e0-ad6e-efaede5a41ca","Type":"ContainerStarted","Data":"60f6d73f868fa840720239c52f519c36bd0147b7eb48d6c420e27b51b29e5d60"}
Mar 10 21:54:32 crc kubenswrapper[4919]: I0310 21:54:32.668512 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6d9c7cf599-w5hkh"
Mar 10 21:54:32 crc kubenswrapper[4919]: I0310 21:54:32.670851 4919 generic.go:334] "Generic (PLEG): container finished" podID="df08dbe0-09b0-4d23-b99b-95b65818f84e" containerID="b084e7f86183c0b0f5efc539c2ea8a877ad9f32b20da9de9012d85755165062e" exitCode=0
Mar 10 21:54:32 crc kubenswrapper[4919]: I0310 21:54:32.670929 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lnq5q" event={"ID":"df08dbe0-09b0-4d23-b99b-95b65818f84e","Type":"ContainerDied","Data":"b084e7f86183c0b0f5efc539c2ea8a877ad9f32b20da9de9012d85755165062e"}
Mar 10 21:54:32 crc kubenswrapper[4919]: I0310 21:54:32.673287 4919 generic.go:334] "Generic (PLEG): container finished" podID="0c5f7639-6abe-4578-81f0-17691f1ad5ef" containerID="b18e2171dfd5e3b855b2022831d88f1d7da29e7b880185af0e35dd72216c8d31" exitCode=0
Mar 10 21:54:32 crc kubenswrapper[4919]: I0310 21:54:32.673444 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gd58j" event={"ID":"0c5f7639-6abe-4578-81f0-17691f1ad5ef","Type":"ContainerDied","Data":"b18e2171dfd5e3b855b2022831d88f1d7da29e7b880185af0e35dd72216c8d31"}
Mar 10 21:54:32 crc kubenswrapper[4919]: I0310 21:54:32.682726 4919 generic.go:334] "Generic (PLEG): container finished" podID="e2db42cd-0c43-41be-a881-199c82f703bd" containerID="3959afa3c73c8cd3ff60bc04e866a4d6631d97110554248f6ff8df122b309484" exitCode=0
Mar 10 21:54:32 crc kubenswrapper[4919]: I0310 21:54:32.682793 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4l8gq" event={"ID":"e2db42cd-0c43-41be-a881-199c82f703bd","Type":"ContainerDied","Data":"3959afa3c73c8cd3ff60bc04e866a4d6631d97110554248f6ff8df122b309484"}
Mar 10 21:54:32 crc kubenswrapper[4919]: I0310 21:54:32.686068 4919 generic.go:334] "Generic (PLEG): container finished" podID="37fff766-35dd-471f-a73b-12d85f5845b0" containerID="926cf423bbbde977585cc1477d2071c6029c5057320664640628badac0cfc73a" exitCode=0
Mar 10 21:54:32 crc kubenswrapper[4919]: I0310 21:54:32.686144 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"37fff766-35dd-471f-a73b-12d85f5845b0","Type":"ContainerDied","Data":"926cf423bbbde977585cc1477d2071c6029c5057320664640628badac0cfc73a"}
Mar 10 21:54:32 crc kubenswrapper[4919]: I0310 21:54:32.695176 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6d9c7cf599-w5hkh" podStartSLOduration=8.695159252 podStartE2EDuration="8.695159252s" podCreationTimestamp="2026-03-10 21:54:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 21:54:32.69362374 +0000 UTC m=+259.935504368" watchObservedRunningTime="2026-03-10 21:54:32.695159252 +0000 UTC m=+259.937039860"
Mar 10 21:54:32 crc kubenswrapper[4919]: I0310 21:54:32.707561 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-ckwhl" podStartSLOduration=197.707543698 podStartE2EDuration="3m17.707543698s" podCreationTimestamp="2026-03-10 21:51:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 21:54:32.707050865 +0000 UTC m=+259.948931473" watchObservedRunningTime="2026-03-10 21:54:32.707543698 +0000 UTC m=+259.949424306"
Mar 10 21:54:32 crc kubenswrapper[4919]: I0310 21:54:32.786682 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6d9c7cf599-w5hkh"
Mar 10 21:54:32 crc kubenswrapper[4919]: I0310 21:54:32.798825 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-84779b596d-djnnm"]
Mar 10 21:54:32 crc kubenswrapper[4919]: I0310 21:54:32.799506 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-84779b596d-djnnm"
Mar 10 21:54:32 crc kubenswrapper[4919]: I0310 21:54:32.803947 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Mar 10 21:54:32 crc kubenswrapper[4919]: I0310 21:54:32.804294 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 10 21:54:32 crc kubenswrapper[4919]: I0310 21:54:32.804507 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 10 21:54:32 crc kubenswrapper[4919]: I0310 21:54:32.804668 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 10 21:54:32 crc kubenswrapper[4919]: I0310 21:54:32.804869 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 10 21:54:32 crc kubenswrapper[4919]: I0310 21:54:32.806604 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 10 21:54:32 crc kubenswrapper[4919]: I0310 21:54:32.814908 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 10 21:54:32 crc kubenswrapper[4919]: I0310 21:54:32.816156 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-84779b596d-djnnm"]
Mar 10 21:54:32 crc kubenswrapper[4919]: I0310 21:54:32.901067 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3d63dfa4-4174-484d-a783-cb65de9f56e0-serving-cert\") pod \"controller-manager-84779b596d-djnnm\" (UID: \"3d63dfa4-4174-484d-a783-cb65de9f56e0\") " pod="openshift-controller-manager/controller-manager-84779b596d-djnnm"
Mar 10 21:54:32 crc kubenswrapper[4919]: I0310 21:54:32.901118 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3d63dfa4-4174-484d-a783-cb65de9f56e0-client-ca\") pod \"controller-manager-84779b596d-djnnm\" (UID: \"3d63dfa4-4174-484d-a783-cb65de9f56e0\") " pod="openshift-controller-manager/controller-manager-84779b596d-djnnm"
Mar 10 21:54:32 crc kubenswrapper[4919]: I0310 21:54:32.901138 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c75jq\" (UniqueName: \"kubernetes.io/projected/3d63dfa4-4174-484d-a783-cb65de9f56e0-kube-api-access-c75jq\") pod \"controller-manager-84779b596d-djnnm\" (UID: \"3d63dfa4-4174-484d-a783-cb65de9f56e0\") " pod="openshift-controller-manager/controller-manager-84779b596d-djnnm"
Mar 10 21:54:32 crc kubenswrapper[4919]: I0310 21:54:32.901193 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d63dfa4-4174-484d-a783-cb65de9f56e0-config\") pod \"controller-manager-84779b596d-djnnm\" (UID: \"3d63dfa4-4174-484d-a783-cb65de9f56e0\") " pod="openshift-controller-manager/controller-manager-84779b596d-djnnm"
Mar 10 21:54:32 crc kubenswrapper[4919]: I0310 21:54:32.901235 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3d63dfa4-4174-484d-a783-cb65de9f56e0-proxy-ca-bundles\") pod \"controller-manager-84779b596d-djnnm\" (UID: \"3d63dfa4-4174-484d-a783-cb65de9f56e0\") " pod="openshift-controller-manager/controller-manager-84779b596d-djnnm"
Mar 10 21:54:33 crc kubenswrapper[4919]: I0310 21:54:33.002434 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3d63dfa4-4174-484d-a783-cb65de9f56e0-serving-cert\") pod \"controller-manager-84779b596d-djnnm\" (UID: \"3d63dfa4-4174-484d-a783-cb65de9f56e0\") " pod="openshift-controller-manager/controller-manager-84779b596d-djnnm"
Mar 10 21:54:33 crc kubenswrapper[4919]: I0310 21:54:33.002513 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3d63dfa4-4174-484d-a783-cb65de9f56e0-client-ca\") pod \"controller-manager-84779b596d-djnnm\" (UID: \"3d63dfa4-4174-484d-a783-cb65de9f56e0\") " pod="openshift-controller-manager/controller-manager-84779b596d-djnnm"
Mar 10 21:54:33 crc kubenswrapper[4919]: I0310 21:54:33.002538 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c75jq\" (UniqueName: \"kubernetes.io/projected/3d63dfa4-4174-484d-a783-cb65de9f56e0-kube-api-access-c75jq\") pod \"controller-manager-84779b596d-djnnm\" (UID: \"3d63dfa4-4174-484d-a783-cb65de9f56e0\") " pod="openshift-controller-manager/controller-manager-84779b596d-djnnm"
Mar 10 21:54:33 crc kubenswrapper[4919]: I0310 21:54:33.002617 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d63dfa4-4174-484d-a783-cb65de9f56e0-config\") pod \"controller-manager-84779b596d-djnnm\" (UID: \"3d63dfa4-4174-484d-a783-cb65de9f56e0\") " pod="openshift-controller-manager/controller-manager-84779b596d-djnnm"
Mar 10 21:54:33 crc kubenswrapper[4919]: I0310 21:54:33.002668 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3d63dfa4-4174-484d-a783-cb65de9f56e0-proxy-ca-bundles\") pod \"controller-manager-84779b596d-djnnm\" (UID: \"3d63dfa4-4174-484d-a783-cb65de9f56e0\") " pod="openshift-controller-manager/controller-manager-84779b596d-djnnm"
Mar 10 21:54:33 crc kubenswrapper[4919]: I0310 21:54:33.003913 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3d63dfa4-4174-484d-a783-cb65de9f56e0-client-ca\") pod \"controller-manager-84779b596d-djnnm\" (UID: \"3d63dfa4-4174-484d-a783-cb65de9f56e0\") " pod="openshift-controller-manager/controller-manager-84779b596d-djnnm"
Mar 10 21:54:33 crc kubenswrapper[4919]: I0310 21:54:33.004317 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3d63dfa4-4174-484d-a783-cb65de9f56e0-proxy-ca-bundles\") pod \"controller-manager-84779b596d-djnnm\" (UID: \"3d63dfa4-4174-484d-a783-cb65de9f56e0\") " pod="openshift-controller-manager/controller-manager-84779b596d-djnnm"
Mar 10 21:54:33 crc kubenswrapper[4919]: I0310 21:54:33.005334 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d63dfa4-4174-484d-a783-cb65de9f56e0-config\") pod \"controller-manager-84779b596d-djnnm\" (UID: \"3d63dfa4-4174-484d-a783-cb65de9f56e0\") " pod="openshift-controller-manager/controller-manager-84779b596d-djnnm"
Mar 10 21:54:33 crc kubenswrapper[4919]: I0310 21:54:33.011476 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3d63dfa4-4174-484d-a783-cb65de9f56e0-serving-cert\") pod \"controller-manager-84779b596d-djnnm\" (UID: \"3d63dfa4-4174-484d-a783-cb65de9f56e0\") " pod="openshift-controller-manager/controller-manager-84779b596d-djnnm"
Mar 10 21:54:33 crc kubenswrapper[4919]: I0310 21:54:33.022861 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c75jq\" (UniqueName: \"kubernetes.io/projected/3d63dfa4-4174-484d-a783-cb65de9f56e0-kube-api-access-c75jq\") pod \"controller-manager-84779b596d-djnnm\" (UID: \"3d63dfa4-4174-484d-a783-cb65de9f56e0\") " pod="openshift-controller-manager/controller-manager-84779b596d-djnnm"
Mar 10 21:54:33 crc kubenswrapper[4919]: I0310 21:54:33.120593 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-84779b596d-djnnm"
Mar 10 21:54:33 crc kubenswrapper[4919]: I0310 21:54:33.226686 4919 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-11-23 01:39:32.601202296 +0000 UTC
Mar 10 21:54:33 crc kubenswrapper[4919]: I0310 21:54:33.226722 4919 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6171h44m59.37448205s for next certificate rotation
Mar 10 21:54:33 crc kubenswrapper[4919]: I0310 21:54:33.975062 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552992-kp6wz"
Mar 10 21:54:33 crc kubenswrapper[4919]: I0310 21:54:33.976718 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Mar 10 21:54:34 crc kubenswrapper[4919]: I0310 21:54:34.116923 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/37fff766-35dd-471f-a73b-12d85f5845b0-kube-api-access\") pod \"37fff766-35dd-471f-a73b-12d85f5845b0\" (UID: \"37fff766-35dd-471f-a73b-12d85f5845b0\") "
Mar 10 21:54:34 crc kubenswrapper[4919]: I0310 21:54:34.116985 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kc24k\" (UniqueName: \"kubernetes.io/projected/99f42fb7-eaa5-46d2-9443-81ad7a563cec-kube-api-access-kc24k\") pod \"99f42fb7-eaa5-46d2-9443-81ad7a563cec\" (UID: \"99f42fb7-eaa5-46d2-9443-81ad7a563cec\") "
Mar 10 21:54:34 crc kubenswrapper[4919]: I0310 21:54:34.117048 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/37fff766-35dd-471f-a73b-12d85f5845b0-kubelet-dir\") pod \"37fff766-35dd-471f-a73b-12d85f5845b0\" (UID: \"37fff766-35dd-471f-a73b-12d85f5845b0\") "
Mar 10 21:54:34 crc kubenswrapper[4919]: I0310 21:54:34.117324 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/37fff766-35dd-471f-a73b-12d85f5845b0-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "37fff766-35dd-471f-a73b-12d85f5845b0" (UID: "37fff766-35dd-471f-a73b-12d85f5845b0"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 10 21:54:34 crc kubenswrapper[4919]: I0310 21:54:34.122415 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99f42fb7-eaa5-46d2-9443-81ad7a563cec-kube-api-access-kc24k" (OuterVolumeSpecName: "kube-api-access-kc24k") pod "99f42fb7-eaa5-46d2-9443-81ad7a563cec" (UID: "99f42fb7-eaa5-46d2-9443-81ad7a563cec"). InnerVolumeSpecName "kube-api-access-kc24k". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 21:54:34 crc kubenswrapper[4919]: I0310 21:54:34.122873 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37fff766-35dd-471f-a73b-12d85f5845b0-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "37fff766-35dd-471f-a73b-12d85f5845b0" (UID: "37fff766-35dd-471f-a73b-12d85f5845b0"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 21:54:34 crc kubenswrapper[4919]: I0310 21:54:34.157372 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-84779b596d-djnnm"]
Mar 10 21:54:34 crc kubenswrapper[4919]: W0310 21:54:34.169510 4919 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3d63dfa4_4174_484d_a783_cb65de9f56e0.slice/crio-9e7b593e116cc749243cccae96203413fb341311ea6da73526031c11a45e6da2 WatchSource:0}: Error finding container 9e7b593e116cc749243cccae96203413fb341311ea6da73526031c11a45e6da2: Status 404 returned error can't find the container with id 9e7b593e116cc749243cccae96203413fb341311ea6da73526031c11a45e6da2
Mar 10 21:54:34 crc kubenswrapper[4919]: I0310 21:54:34.219604 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/37fff766-35dd-471f-a73b-12d85f5845b0-kube-api-access\") on node \"crc\" DevicePath \"\""
Mar 10 21:54:34 crc kubenswrapper[4919]: I0310 21:54:34.219629 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kc24k\" (UniqueName: \"kubernetes.io/projected/99f42fb7-eaa5-46d2-9443-81ad7a563cec-kube-api-access-kc24k\") on node \"crc\" DevicePath \"\""
Mar 10 21:54:34 crc kubenswrapper[4919]: I0310 21:54:34.219993 4919 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/37fff766-35dd-471f-a73b-12d85f5845b0-kubelet-dir\") on node \"crc\" DevicePath \"\""
Mar 10 21:54:34 crc kubenswrapper[4919]: I0310 21:54:34.227280 4919 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-11-20 03:02:18.36067773 +0000 UTC
Mar 10 21:54:34 crc kubenswrapper[4919]: I0310 21:54:34.227302 4919 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6101h7m44.133377754s
for next certificate rotation Mar 10 21:54:34 crc kubenswrapper[4919]: I0310 21:54:34.734354 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-84779b596d-djnnm" event={"ID":"3d63dfa4-4174-484d-a783-cb65de9f56e0","Type":"ContainerStarted","Data":"60d2e2f892aedbbd63d6924d23b6291f36b156f14ceec5d6640b2066688f258b"} Mar 10 21:54:34 crc kubenswrapper[4919]: I0310 21:54:34.734405 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-84779b596d-djnnm" event={"ID":"3d63dfa4-4174-484d-a783-cb65de9f56e0","Type":"ContainerStarted","Data":"9e7b593e116cc749243cccae96203413fb341311ea6da73526031c11a45e6da2"} Mar 10 21:54:34 crc kubenswrapper[4919]: I0310 21:54:34.735360 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-84779b596d-djnnm" Mar 10 21:54:34 crc kubenswrapper[4919]: I0310 21:54:34.739231 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552992-kp6wz" event={"ID":"99f42fb7-eaa5-46d2-9443-81ad7a563cec","Type":"ContainerDied","Data":"25c432ccaf55afa18c9c52acdd64b3e00f55f480c278c8fe53ec4ac7febf5842"} Mar 10 21:54:34 crc kubenswrapper[4919]: I0310 21:54:34.739260 4919 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="25c432ccaf55afa18c9c52acdd64b3e00f55f480c278c8fe53ec4ac7febf5842" Mar 10 21:54:34 crc kubenswrapper[4919]: I0310 21:54:34.739299 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552992-kp6wz" Mar 10 21:54:34 crc kubenswrapper[4919]: I0310 21:54:34.739970 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-84779b596d-djnnm" Mar 10 21:54:34 crc kubenswrapper[4919]: I0310 21:54:34.750327 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-84779b596d-djnnm" podStartSLOduration=10.750310057 podStartE2EDuration="10.750310057s" podCreationTimestamp="2026-03-10 21:54:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 21:54:34.748596211 +0000 UTC m=+261.990476819" watchObservedRunningTime="2026-03-10 21:54:34.750310057 +0000 UTC m=+261.992190665" Mar 10 21:54:34 crc kubenswrapper[4919]: I0310 21:54:34.763005 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lnq5q" event={"ID":"df08dbe0-09b0-4d23-b99b-95b65818f84e","Type":"ContainerStarted","Data":"106158adf1276549cbb2b41b4d8fd567346d8c5242635f4b25ce5ebec21b8df8"} Mar 10 21:54:34 crc kubenswrapper[4919]: I0310 21:54:34.810037 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gd58j" event={"ID":"0c5f7639-6abe-4578-81f0-17691f1ad5ef","Type":"ContainerStarted","Data":"2eb5d932d4f718b787b32fea2f80042b5b41716a1c318bbfc0bb891a5200df04"} Mar 10 21:54:34 crc kubenswrapper[4919]: I0310 21:54:34.812172 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-lnq5q" podStartSLOduration=3.442442216 podStartE2EDuration="29.812155418s" podCreationTimestamp="2026-03-10 21:54:05 +0000 UTC" firstStartedPulling="2026-03-10 21:54:07.409287902 +0000 UTC m=+234.651168510" lastFinishedPulling="2026-03-10 21:54:33.779001104 +0000 UTC m=+261.020881712" 
observedRunningTime="2026-03-10 21:54:34.809934118 +0000 UTC m=+262.051814726" watchObservedRunningTime="2026-03-10 21:54:34.812155418 +0000 UTC m=+262.054036027" Mar 10 21:54:34 crc kubenswrapper[4919]: I0310 21:54:34.814369 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 10 21:54:34 crc kubenswrapper[4919]: I0310 21:54:34.814513 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"37fff766-35dd-471f-a73b-12d85f5845b0","Type":"ContainerDied","Data":"7d7b70e9d254587ba6394813ab5db336d019673e5db6ced154be8e374da1d405"} Mar 10 21:54:34 crc kubenswrapper[4919]: I0310 21:54:34.817135 4919 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7d7b70e9d254587ba6394813ab5db336d019673e5db6ced154be8e374da1d405" Mar 10 21:54:35 crc kubenswrapper[4919]: I0310 21:54:35.003778 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-gd58j" podStartSLOduration=2.674089131 podStartE2EDuration="29.003760034s" podCreationTimestamp="2026-03-10 21:54:06 +0000 UTC" firstStartedPulling="2026-03-10 21:54:07.429385168 +0000 UTC m=+234.671265776" lastFinishedPulling="2026-03-10 21:54:33.759056071 +0000 UTC m=+261.000936679" observedRunningTime="2026-03-10 21:54:34.840464768 +0000 UTC m=+262.082345396" watchObservedRunningTime="2026-03-10 21:54:35.003760034 +0000 UTC m=+262.245640642" Mar 10 21:54:35 crc kubenswrapper[4919]: I0310 21:54:35.828008 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552994-rvxmh" event={"ID":"cac0bc08-6186-43fb-bebe-036c98331599","Type":"ContainerStarted","Data":"a088399da8fc4e94e5d64d7469a3761c93b686bea11cc5017bc3f52bd0538e1b"} Mar 10 21:54:35 crc kubenswrapper[4919]: I0310 21:54:35.846765 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-infra/auto-csr-approver-29552994-rvxmh" podStartSLOduration=1.9837949849999998 podStartE2EDuration="35.84674973s" podCreationTimestamp="2026-03-10 21:54:00 +0000 UTC" firstStartedPulling="2026-03-10 21:54:01.51633294 +0000 UTC m=+228.758213548" lastFinishedPulling="2026-03-10 21:54:35.379287685 +0000 UTC m=+262.621168293" observedRunningTime="2026-03-10 21:54:35.84437441 +0000 UTC m=+263.086255038" watchObservedRunningTime="2026-03-10 21:54:35.84674973 +0000 UTC m=+263.088630338" Mar 10 21:54:36 crc kubenswrapper[4919]: I0310 21:54:36.105855 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-lnq5q" Mar 10 21:54:36 crc kubenswrapper[4919]: I0310 21:54:36.105905 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-lnq5q" Mar 10 21:54:36 crc kubenswrapper[4919]: I0310 21:54:36.511791 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-gd58j" Mar 10 21:54:36 crc kubenswrapper[4919]: I0310 21:54:36.512099 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-gd58j" Mar 10 21:54:36 crc kubenswrapper[4919]: I0310 21:54:36.561065 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-gd58j" Mar 10 21:54:36 crc kubenswrapper[4919]: I0310 21:54:36.832337 4919 generic.go:334] "Generic (PLEG): container finished" podID="cac0bc08-6186-43fb-bebe-036c98331599" containerID="a088399da8fc4e94e5d64d7469a3761c93b686bea11cc5017bc3f52bd0538e1b" exitCode=0 Mar 10 21:54:36 crc kubenswrapper[4919]: I0310 21:54:36.832470 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552994-rvxmh" 
event={"ID":"cac0bc08-6186-43fb-bebe-036c98331599","Type":"ContainerDied","Data":"a088399da8fc4e94e5d64d7469a3761c93b686bea11cc5017bc3f52bd0538e1b"} Mar 10 21:54:37 crc kubenswrapper[4919]: I0310 21:54:37.256952 4919 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-lnq5q" podUID="df08dbe0-09b0-4d23-b99b-95b65818f84e" containerName="registry-server" probeResult="failure" output=< Mar 10 21:54:37 crc kubenswrapper[4919]: timeout: failed to connect service ":50051" within 1s Mar 10 21:54:37 crc kubenswrapper[4919]: > Mar 10 21:54:38 crc kubenswrapper[4919]: I0310 21:54:38.116887 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552994-rvxmh" Mar 10 21:54:38 crc kubenswrapper[4919]: I0310 21:54:38.275448 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rp685\" (UniqueName: \"kubernetes.io/projected/cac0bc08-6186-43fb-bebe-036c98331599-kube-api-access-rp685\") pod \"cac0bc08-6186-43fb-bebe-036c98331599\" (UID: \"cac0bc08-6186-43fb-bebe-036c98331599\") " Mar 10 21:54:38 crc kubenswrapper[4919]: I0310 21:54:38.282959 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cac0bc08-6186-43fb-bebe-036c98331599-kube-api-access-rp685" (OuterVolumeSpecName: "kube-api-access-rp685") pod "cac0bc08-6186-43fb-bebe-036c98331599" (UID: "cac0bc08-6186-43fb-bebe-036c98331599"). InnerVolumeSpecName "kube-api-access-rp685". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 21:54:38 crc kubenswrapper[4919]: I0310 21:54:38.377318 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rp685\" (UniqueName: \"kubernetes.io/projected/cac0bc08-6186-43fb-bebe-036c98331599-kube-api-access-rp685\") on node \"crc\" DevicePath \"\"" Mar 10 21:54:38 crc kubenswrapper[4919]: I0310 21:54:38.430227 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-n8bb5" Mar 10 21:54:38 crc kubenswrapper[4919]: I0310 21:54:38.848616 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552994-rvxmh" event={"ID":"cac0bc08-6186-43fb-bebe-036c98331599","Type":"ContainerDied","Data":"616e74b4ce6da29f29710110acc0f985b9f975a960bd4183894373f41cc06a5a"} Mar 10 21:54:38 crc kubenswrapper[4919]: I0310 21:54:38.848882 4919 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="616e74b4ce6da29f29710110acc0f985b9f975a960bd4183894373f41cc06a5a" Mar 10 21:54:38 crc kubenswrapper[4919]: I0310 21:54:38.848751 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552994-rvxmh" Mar 10 21:54:41 crc kubenswrapper[4919]: I0310 21:54:41.658029 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 10 21:54:41 crc kubenswrapper[4919]: E0310 21:54:41.658322 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37fff766-35dd-471f-a73b-12d85f5845b0" containerName="pruner" Mar 10 21:54:41 crc kubenswrapper[4919]: I0310 21:54:41.658340 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="37fff766-35dd-471f-a73b-12d85f5845b0" containerName="pruner" Mar 10 21:54:41 crc kubenswrapper[4919]: E0310 21:54:41.658356 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cac0bc08-6186-43fb-bebe-036c98331599" containerName="oc" Mar 10 21:54:41 crc kubenswrapper[4919]: I0310 21:54:41.658364 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="cac0bc08-6186-43fb-bebe-036c98331599" containerName="oc" Mar 10 21:54:41 crc kubenswrapper[4919]: E0310 21:54:41.658389 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99f42fb7-eaa5-46d2-9443-81ad7a563cec" containerName="oc" Mar 10 21:54:41 crc kubenswrapper[4919]: I0310 21:54:41.658412 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="99f42fb7-eaa5-46d2-9443-81ad7a563cec" containerName="oc" Mar 10 21:54:41 crc kubenswrapper[4919]: I0310 21:54:41.658525 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="cac0bc08-6186-43fb-bebe-036c98331599" containerName="oc" Mar 10 21:54:41 crc kubenswrapper[4919]: I0310 21:54:41.658571 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="37fff766-35dd-471f-a73b-12d85f5845b0" containerName="pruner" Mar 10 21:54:41 crc kubenswrapper[4919]: I0310 21:54:41.658586 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="99f42fb7-eaa5-46d2-9443-81ad7a563cec" containerName="oc" Mar 10 21:54:41 crc kubenswrapper[4919]: I0310 
21:54:41.659127 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 10 21:54:41 crc kubenswrapper[4919]: I0310 21:54:41.664273 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 10 21:54:41 crc kubenswrapper[4919]: I0310 21:54:41.664507 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 10 21:54:41 crc kubenswrapper[4919]: I0310 21:54:41.667167 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 10 21:54:41 crc kubenswrapper[4919]: I0310 21:54:41.820883 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f47edf2a-c391-4194-9934-ae7a03dc1193-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"f47edf2a-c391-4194-9934-ae7a03dc1193\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 10 21:54:41 crc kubenswrapper[4919]: I0310 21:54:41.820932 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f47edf2a-c391-4194-9934-ae7a03dc1193-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"f47edf2a-c391-4194-9934-ae7a03dc1193\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 10 21:54:41 crc kubenswrapper[4919]: I0310 21:54:41.921631 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f47edf2a-c391-4194-9934-ae7a03dc1193-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"f47edf2a-c391-4194-9934-ae7a03dc1193\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 10 21:54:41 crc kubenswrapper[4919]: I0310 21:54:41.921688 4919 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f47edf2a-c391-4194-9934-ae7a03dc1193-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"f47edf2a-c391-4194-9934-ae7a03dc1193\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 10 21:54:41 crc kubenswrapper[4919]: I0310 21:54:41.921758 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f47edf2a-c391-4194-9934-ae7a03dc1193-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"f47edf2a-c391-4194-9934-ae7a03dc1193\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 10 21:54:41 crc kubenswrapper[4919]: I0310 21:54:41.956484 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f47edf2a-c391-4194-9934-ae7a03dc1193-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"f47edf2a-c391-4194-9934-ae7a03dc1193\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 10 21:54:41 crc kubenswrapper[4919]: I0310 21:54:41.990767 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 10 21:54:43 crc kubenswrapper[4919]: I0310 21:54:43.500047 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 10 21:54:43 crc kubenswrapper[4919]: I0310 21:54:43.874174 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"f47edf2a-c391-4194-9934-ae7a03dc1193","Type":"ContainerStarted","Data":"05b818d1199ea0718bc82bb2d4dd603218238fbeee5649cfc9542f62f1d10690"} Mar 10 21:54:44 crc kubenswrapper[4919]: I0310 21:54:44.737370 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-84779b596d-djnnm"] Mar 10 21:54:44 crc kubenswrapper[4919]: I0310 21:54:44.737899 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-84779b596d-djnnm" podUID="3d63dfa4-4174-484d-a783-cb65de9f56e0" containerName="controller-manager" containerID="cri-o://60d2e2f892aedbbd63d6924d23b6291f36b156f14ceec5d6640b2066688f258b" gracePeriod=30 Mar 10 21:54:44 crc kubenswrapper[4919]: I0310 21:54:44.831349 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6d9c7cf599-w5hkh"] Mar 10 21:54:44 crc kubenswrapper[4919]: I0310 21:54:44.831604 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6d9c7cf599-w5hkh" podUID="ec35e4f4-8013-49e0-ad6e-efaede5a41ca" containerName="route-controller-manager" containerID="cri-o://9dc2a9d491c74c8f020a2b5e5b86ca4947f66441f6260e315a7d800e1ab5fc4b" gracePeriod=30 Mar 10 21:54:44 crc kubenswrapper[4919]: I0310 21:54:44.881195 4919 generic.go:334] "Generic (PLEG): container finished" podID="3d63dfa4-4174-484d-a783-cb65de9f56e0" 
containerID="60d2e2f892aedbbd63d6924d23b6291f36b156f14ceec5d6640b2066688f258b" exitCode=0 Mar 10 21:54:44 crc kubenswrapper[4919]: I0310 21:54:44.881262 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-84779b596d-djnnm" event={"ID":"3d63dfa4-4174-484d-a783-cb65de9f56e0","Type":"ContainerDied","Data":"60d2e2f892aedbbd63d6924d23b6291f36b156f14ceec5d6640b2066688f258b"} Mar 10 21:54:44 crc kubenswrapper[4919]: I0310 21:54:44.884246 4919 generic.go:334] "Generic (PLEG): container finished" podID="f47edf2a-c391-4194-9934-ae7a03dc1193" containerID="7833516d81585f7c3b9b6d6321e754c760991b984341ea90fcf7a9b10dc162f9" exitCode=0 Mar 10 21:54:44 crc kubenswrapper[4919]: I0310 21:54:44.884315 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"f47edf2a-c391-4194-9934-ae7a03dc1193","Type":"ContainerDied","Data":"7833516d81585f7c3b9b6d6321e754c760991b984341ea90fcf7a9b10dc162f9"} Mar 10 21:54:44 crc kubenswrapper[4919]: I0310 21:54:44.888731 4919 generic.go:334] "Generic (PLEG): container finished" podID="28b0abdd-217d-42f6-80fb-b270be44700e" containerID="b0da1bc568b246d585ead08e419bfd3f4b860bfd0e962d85d7df6d91963f3af1" exitCode=0 Mar 10 21:54:44 crc kubenswrapper[4919]: I0310 21:54:44.888773 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f4nt4" event={"ID":"28b0abdd-217d-42f6-80fb-b270be44700e","Type":"ContainerDied","Data":"b0da1bc568b246d585ead08e419bfd3f4b860bfd0e962d85d7df6d91963f3af1"} Mar 10 21:54:44 crc kubenswrapper[4919]: I0310 21:54:44.891612 4919 generic.go:334] "Generic (PLEG): container finished" podID="e2db42cd-0c43-41be-a881-199c82f703bd" containerID="5a8182ce24c07867f4629d23a7934b9e5bdc6e31ecb798c4cdbd2bb7f394aaa7" exitCode=0 Mar 10 21:54:44 crc kubenswrapper[4919]: I0310 21:54:44.891649 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-4l8gq" event={"ID":"e2db42cd-0c43-41be-a881-199c82f703bd","Type":"ContainerDied","Data":"5a8182ce24c07867f4629d23a7934b9e5bdc6e31ecb798c4cdbd2bb7f394aaa7"} Mar 10 21:54:45 crc kubenswrapper[4919]: I0310 21:54:45.292129 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-84779b596d-djnnm" Mar 10 21:54:45 crc kubenswrapper[4919]: I0310 21:54:45.297024 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6d9c7cf599-w5hkh" Mar 10 21:54:45 crc kubenswrapper[4919]: I0310 21:54:45.473832 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ec35e4f4-8013-49e0-ad6e-efaede5a41ca-client-ca\") pod \"ec35e4f4-8013-49e0-ad6e-efaede5a41ca\" (UID: \"ec35e4f4-8013-49e0-ad6e-efaede5a41ca\") " Mar 10 21:54:45 crc kubenswrapper[4919]: I0310 21:54:45.473889 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3d63dfa4-4174-484d-a783-cb65de9f56e0-serving-cert\") pod \"3d63dfa4-4174-484d-a783-cb65de9f56e0\" (UID: \"3d63dfa4-4174-484d-a783-cb65de9f56e0\") " Mar 10 21:54:45 crc kubenswrapper[4919]: I0310 21:54:45.473933 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ec35e4f4-8013-49e0-ad6e-efaede5a41ca-serving-cert\") pod \"ec35e4f4-8013-49e0-ad6e-efaede5a41ca\" (UID: \"ec35e4f4-8013-49e0-ad6e-efaede5a41ca\") " Mar 10 21:54:45 crc kubenswrapper[4919]: I0310 21:54:45.473953 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec35e4f4-8013-49e0-ad6e-efaede5a41ca-config\") pod \"ec35e4f4-8013-49e0-ad6e-efaede5a41ca\" (UID: 
\"ec35e4f4-8013-49e0-ad6e-efaede5a41ca\") " Mar 10 21:54:45 crc kubenswrapper[4919]: I0310 21:54:45.473974 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-94shs\" (UniqueName: \"kubernetes.io/projected/ec35e4f4-8013-49e0-ad6e-efaede5a41ca-kube-api-access-94shs\") pod \"ec35e4f4-8013-49e0-ad6e-efaede5a41ca\" (UID: \"ec35e4f4-8013-49e0-ad6e-efaede5a41ca\") " Mar 10 21:54:45 crc kubenswrapper[4919]: I0310 21:54:45.474031 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c75jq\" (UniqueName: \"kubernetes.io/projected/3d63dfa4-4174-484d-a783-cb65de9f56e0-kube-api-access-c75jq\") pod \"3d63dfa4-4174-484d-a783-cb65de9f56e0\" (UID: \"3d63dfa4-4174-484d-a783-cb65de9f56e0\") " Mar 10 21:54:45 crc kubenswrapper[4919]: I0310 21:54:45.474056 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d63dfa4-4174-484d-a783-cb65de9f56e0-config\") pod \"3d63dfa4-4174-484d-a783-cb65de9f56e0\" (UID: \"3d63dfa4-4174-484d-a783-cb65de9f56e0\") " Mar 10 21:54:45 crc kubenswrapper[4919]: I0310 21:54:45.474097 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3d63dfa4-4174-484d-a783-cb65de9f56e0-client-ca\") pod \"3d63dfa4-4174-484d-a783-cb65de9f56e0\" (UID: \"3d63dfa4-4174-484d-a783-cb65de9f56e0\") " Mar 10 21:54:45 crc kubenswrapper[4919]: I0310 21:54:45.474132 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3d63dfa4-4174-484d-a783-cb65de9f56e0-proxy-ca-bundles\") pod \"3d63dfa4-4174-484d-a783-cb65de9f56e0\" (UID: \"3d63dfa4-4174-484d-a783-cb65de9f56e0\") " Mar 10 21:54:45 crc kubenswrapper[4919]: I0310 21:54:45.475098 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/ec35e4f4-8013-49e0-ad6e-efaede5a41ca-config" (OuterVolumeSpecName: "config") pod "ec35e4f4-8013-49e0-ad6e-efaede5a41ca" (UID: "ec35e4f4-8013-49e0-ad6e-efaede5a41ca"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 21:54:45 crc kubenswrapper[4919]: I0310 21:54:45.475423 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d63dfa4-4174-484d-a783-cb65de9f56e0-client-ca" (OuterVolumeSpecName: "client-ca") pod "3d63dfa4-4174-484d-a783-cb65de9f56e0" (UID: "3d63dfa4-4174-484d-a783-cb65de9f56e0"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 21:54:45 crc kubenswrapper[4919]: I0310 21:54:45.475597 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec35e4f4-8013-49e0-ad6e-efaede5a41ca-client-ca" (OuterVolumeSpecName: "client-ca") pod "ec35e4f4-8013-49e0-ad6e-efaede5a41ca" (UID: "ec35e4f4-8013-49e0-ad6e-efaede5a41ca"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 21:54:45 crc kubenswrapper[4919]: I0310 21:54:45.475770 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d63dfa4-4174-484d-a783-cb65de9f56e0-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "3d63dfa4-4174-484d-a783-cb65de9f56e0" (UID: "3d63dfa4-4174-484d-a783-cb65de9f56e0"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 21:54:45 crc kubenswrapper[4919]: I0310 21:54:45.476309 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d63dfa4-4174-484d-a783-cb65de9f56e0-config" (OuterVolumeSpecName: "config") pod "3d63dfa4-4174-484d-a783-cb65de9f56e0" (UID: "3d63dfa4-4174-484d-a783-cb65de9f56e0"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 21:54:45 crc kubenswrapper[4919]: I0310 21:54:45.481530 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d63dfa4-4174-484d-a783-cb65de9f56e0-kube-api-access-c75jq" (OuterVolumeSpecName: "kube-api-access-c75jq") pod "3d63dfa4-4174-484d-a783-cb65de9f56e0" (UID: "3d63dfa4-4174-484d-a783-cb65de9f56e0"). InnerVolumeSpecName "kube-api-access-c75jq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 21:54:45 crc kubenswrapper[4919]: I0310 21:54:45.484056 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec35e4f4-8013-49e0-ad6e-efaede5a41ca-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "ec35e4f4-8013-49e0-ad6e-efaede5a41ca" (UID: "ec35e4f4-8013-49e0-ad6e-efaede5a41ca"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 21:54:45 crc kubenswrapper[4919]: I0310 21:54:45.484717 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d63dfa4-4174-484d-a783-cb65de9f56e0-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "3d63dfa4-4174-484d-a783-cb65de9f56e0" (UID: "3d63dfa4-4174-484d-a783-cb65de9f56e0"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 21:54:45 crc kubenswrapper[4919]: I0310 21:54:45.485556 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec35e4f4-8013-49e0-ad6e-efaede5a41ca-kube-api-access-94shs" (OuterVolumeSpecName: "kube-api-access-94shs") pod "ec35e4f4-8013-49e0-ad6e-efaede5a41ca" (UID: "ec35e4f4-8013-49e0-ad6e-efaede5a41ca"). InnerVolumeSpecName "kube-api-access-94shs". 
PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 21:54:45 crc kubenswrapper[4919]: I0310 21:54:45.575433 4919 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3d63dfa4-4174-484d-a783-cb65de9f56e0-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 10 21:54:45 crc kubenswrapper[4919]: I0310 21:54:45.575476 4919 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ec35e4f4-8013-49e0-ad6e-efaede5a41ca-client-ca\") on node \"crc\" DevicePath \"\""
Mar 10 21:54:45 crc kubenswrapper[4919]: I0310 21:54:45.575489 4919 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3d63dfa4-4174-484d-a783-cb65de9f56e0-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 10 21:54:45 crc kubenswrapper[4919]: I0310 21:54:45.575500 4919 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ec35e4f4-8013-49e0-ad6e-efaede5a41ca-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 10 21:54:45 crc kubenswrapper[4919]: I0310 21:54:45.575514 4919 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec35e4f4-8013-49e0-ad6e-efaede5a41ca-config\") on node \"crc\" DevicePath \"\""
Mar 10 21:54:45 crc kubenswrapper[4919]: I0310 21:54:45.575526 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-94shs\" (UniqueName: \"kubernetes.io/projected/ec35e4f4-8013-49e0-ad6e-efaede5a41ca-kube-api-access-94shs\") on node \"crc\" DevicePath \"\""
Mar 10 21:54:45 crc kubenswrapper[4919]: I0310 21:54:45.575539 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c75jq\" (UniqueName: \"kubernetes.io/projected/3d63dfa4-4174-484d-a783-cb65de9f56e0-kube-api-access-c75jq\") on node \"crc\" DevicePath \"\""
Mar 10 21:54:45 crc kubenswrapper[4919]: I0310 21:54:45.575551 4919 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d63dfa4-4174-484d-a783-cb65de9f56e0-config\") on node \"crc\" DevicePath \"\""
Mar 10 21:54:45 crc kubenswrapper[4919]: I0310 21:54:45.575561 4919 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3d63dfa4-4174-484d-a783-cb65de9f56e0-client-ca\") on node \"crc\" DevicePath \"\""
Mar 10 21:54:45 crc kubenswrapper[4919]: I0310 21:54:45.822365 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7789f7fd8d-w9gvz"]
Mar 10 21:54:45 crc kubenswrapper[4919]: E0310 21:54:45.822626 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec35e4f4-8013-49e0-ad6e-efaede5a41ca" containerName="route-controller-manager"
Mar 10 21:54:45 crc kubenswrapper[4919]: I0310 21:54:45.822641 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec35e4f4-8013-49e0-ad6e-efaede5a41ca" containerName="route-controller-manager"
Mar 10 21:54:45 crc kubenswrapper[4919]: E0310 21:54:45.822661 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d63dfa4-4174-484d-a783-cb65de9f56e0" containerName="controller-manager"
Mar 10 21:54:45 crc kubenswrapper[4919]: I0310 21:54:45.822669 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d63dfa4-4174-484d-a783-cb65de9f56e0" containerName="controller-manager"
Mar 10 21:54:45 crc kubenswrapper[4919]: I0310 21:54:45.822801 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec35e4f4-8013-49e0-ad6e-efaede5a41ca" containerName="route-controller-manager"
Mar 10 21:54:45 crc kubenswrapper[4919]: I0310 21:54:45.822815 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d63dfa4-4174-484d-a783-cb65de9f56e0" containerName="controller-manager"
Mar 10 21:54:45 crc kubenswrapper[4919]: I0310 21:54:45.823260 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7789f7fd8d-w9gvz"
Mar 10 21:54:45 crc kubenswrapper[4919]: I0310 21:54:45.839655 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7789f7fd8d-w9gvz"]
Mar 10 21:54:45 crc kubenswrapper[4919]: I0310 21:54:45.879679 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/58d4ac79-fba2-4226-94fd-2a255f79907c-proxy-ca-bundles\") pod \"controller-manager-7789f7fd8d-w9gvz\" (UID: \"58d4ac79-fba2-4226-94fd-2a255f79907c\") " pod="openshift-controller-manager/controller-manager-7789f7fd8d-w9gvz"
Mar 10 21:54:45 crc kubenswrapper[4919]: I0310 21:54:45.879724 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ktjt4\" (UniqueName: \"kubernetes.io/projected/58d4ac79-fba2-4226-94fd-2a255f79907c-kube-api-access-ktjt4\") pod \"controller-manager-7789f7fd8d-w9gvz\" (UID: \"58d4ac79-fba2-4226-94fd-2a255f79907c\") " pod="openshift-controller-manager/controller-manager-7789f7fd8d-w9gvz"
Mar 10 21:54:45 crc kubenswrapper[4919]: I0310 21:54:45.879755 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/58d4ac79-fba2-4226-94fd-2a255f79907c-config\") pod \"controller-manager-7789f7fd8d-w9gvz\" (UID: \"58d4ac79-fba2-4226-94fd-2a255f79907c\") " pod="openshift-controller-manager/controller-manager-7789f7fd8d-w9gvz"
Mar 10 21:54:45 crc kubenswrapper[4919]: I0310 21:54:45.879800 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/58d4ac79-fba2-4226-94fd-2a255f79907c-client-ca\") pod \"controller-manager-7789f7fd8d-w9gvz\" (UID: \"58d4ac79-fba2-4226-94fd-2a255f79907c\") " pod="openshift-controller-manager/controller-manager-7789f7fd8d-w9gvz"
Mar 10 21:54:45 crc kubenswrapper[4919]: I0310 21:54:45.879876 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/58d4ac79-fba2-4226-94fd-2a255f79907c-serving-cert\") pod \"controller-manager-7789f7fd8d-w9gvz\" (UID: \"58d4ac79-fba2-4226-94fd-2a255f79907c\") " pod="openshift-controller-manager/controller-manager-7789f7fd8d-w9gvz"
Mar 10 21:54:45 crc kubenswrapper[4919]: I0310 21:54:45.898222 4919 generic.go:334] "Generic (PLEG): container finished" podID="ec35e4f4-8013-49e0-ad6e-efaede5a41ca" containerID="9dc2a9d491c74c8f020a2b5e5b86ca4947f66441f6260e315a7d800e1ab5fc4b" exitCode=0
Mar 10 21:54:45 crc kubenswrapper[4919]: I0310 21:54:45.898292 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6d9c7cf599-w5hkh" event={"ID":"ec35e4f4-8013-49e0-ad6e-efaede5a41ca","Type":"ContainerDied","Data":"9dc2a9d491c74c8f020a2b5e5b86ca4947f66441f6260e315a7d800e1ab5fc4b"}
Mar 10 21:54:45 crc kubenswrapper[4919]: I0310 21:54:45.898324 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6d9c7cf599-w5hkh" event={"ID":"ec35e4f4-8013-49e0-ad6e-efaede5a41ca","Type":"ContainerDied","Data":"60f6d73f868fa840720239c52f519c36bd0147b7eb48d6c420e27b51b29e5d60"}
Mar 10 21:54:45 crc kubenswrapper[4919]: I0310 21:54:45.898346 4919 scope.go:117] "RemoveContainer" containerID="9dc2a9d491c74c8f020a2b5e5b86ca4947f66441f6260e315a7d800e1ab5fc4b"
Mar 10 21:54:45 crc kubenswrapper[4919]: I0310 21:54:45.898506 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6d9c7cf599-w5hkh"
Mar 10 21:54:45 crc kubenswrapper[4919]: I0310 21:54:45.904285 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f4nt4" event={"ID":"28b0abdd-217d-42f6-80fb-b270be44700e","Type":"ContainerStarted","Data":"675d75fb6cc1513db61486bd6b52cb07921e449d2d6906982647a01e512574d2"}
Mar 10 21:54:45 crc kubenswrapper[4919]: I0310 21:54:45.907376 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4l8gq" event={"ID":"e2db42cd-0c43-41be-a881-199c82f703bd","Type":"ContainerStarted","Data":"2b8fda52ce376caf909f426ee3617db20d9b419d6dabc7b1b492e41f452da589"}
Mar 10 21:54:45 crc kubenswrapper[4919]: I0310 21:54:45.913822 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-84779b596d-djnnm" event={"ID":"3d63dfa4-4174-484d-a783-cb65de9f56e0","Type":"ContainerDied","Data":"9e7b593e116cc749243cccae96203413fb341311ea6da73526031c11a45e6da2"}
Mar 10 21:54:45 crc kubenswrapper[4919]: I0310 21:54:45.913864 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-84779b596d-djnnm"
Mar 10 21:54:45 crc kubenswrapper[4919]: I0310 21:54:45.921188 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6d9c7cf599-w5hkh"]
Mar 10 21:54:45 crc kubenswrapper[4919]: I0310 21:54:45.921250 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6d9c7cf599-w5hkh"]
Mar 10 21:54:45 crc kubenswrapper[4919]: I0310 21:54:45.936409 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-f4nt4" podStartSLOduration=25.149248898 podStartE2EDuration="38.936358243s" podCreationTimestamp="2026-03-10 21:54:07 +0000 UTC" firstStartedPulling="2026-03-10 21:54:31.490021684 +0000 UTC m=+258.731902292" lastFinishedPulling="2026-03-10 21:54:45.277131029 +0000 UTC m=+272.519011637" observedRunningTime="2026-03-10 21:54:45.928875914 +0000 UTC m=+273.170756522" watchObservedRunningTime="2026-03-10 21:54:45.936358243 +0000 UTC m=+273.178238861"
Mar 10 21:54:45 crc kubenswrapper[4919]: I0310 21:54:45.941527 4919 scope.go:117] "RemoveContainer" containerID="9dc2a9d491c74c8f020a2b5e5b86ca4947f66441f6260e315a7d800e1ab5fc4b"
Mar 10 21:54:45 crc kubenswrapper[4919]: E0310 21:54:45.943635 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9dc2a9d491c74c8f020a2b5e5b86ca4947f66441f6260e315a7d800e1ab5fc4b\": container with ID starting with 9dc2a9d491c74c8f020a2b5e5b86ca4947f66441f6260e315a7d800e1ab5fc4b not found: ID does not exist" containerID="9dc2a9d491c74c8f020a2b5e5b86ca4947f66441f6260e315a7d800e1ab5fc4b"
Mar 10 21:54:45 crc kubenswrapper[4919]: I0310 21:54:45.943689 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9dc2a9d491c74c8f020a2b5e5b86ca4947f66441f6260e315a7d800e1ab5fc4b"} err="failed to get container status \"9dc2a9d491c74c8f020a2b5e5b86ca4947f66441f6260e315a7d800e1ab5fc4b\": rpc error: code = NotFound desc = could not find container \"9dc2a9d491c74c8f020a2b5e5b86ca4947f66441f6260e315a7d800e1ab5fc4b\": container with ID starting with 9dc2a9d491c74c8f020a2b5e5b86ca4947f66441f6260e315a7d800e1ab5fc4b not found: ID does not exist"
Mar 10 21:54:45 crc kubenswrapper[4919]: I0310 21:54:45.943718 4919 scope.go:117] "RemoveContainer" containerID="60d2e2f892aedbbd63d6924d23b6291f36b156f14ceec5d6640b2066688f258b"
Mar 10 21:54:45 crc kubenswrapper[4919]: I0310 21:54:45.953197 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-4l8gq" podStartSLOduration=26.249584144 podStartE2EDuration="38.953178494s" podCreationTimestamp="2026-03-10 21:54:07 +0000 UTC" firstStartedPulling="2026-03-10 21:54:32.684459191 +0000 UTC m=+259.926339799" lastFinishedPulling="2026-03-10 21:54:45.388053541 +0000 UTC m=+272.629934149" observedRunningTime="2026-03-10 21:54:45.949863161 +0000 UTC m=+273.191743779" watchObservedRunningTime="2026-03-10 21:54:45.953178494 +0000 UTC m=+273.195059112"
Mar 10 21:54:45 crc kubenswrapper[4919]: I0310 21:54:45.964457 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-84779b596d-djnnm"]
Mar 10 21:54:45 crc kubenswrapper[4919]: I0310 21:54:45.969031 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-84779b596d-djnnm"]
Mar 10 21:54:45 crc kubenswrapper[4919]: I0310 21:54:45.981551 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/58d4ac79-fba2-4226-94fd-2a255f79907c-config\") pod \"controller-manager-7789f7fd8d-w9gvz\" (UID: \"58d4ac79-fba2-4226-94fd-2a255f79907c\") " pod="openshift-controller-manager/controller-manager-7789f7fd8d-w9gvz"
Mar 10 21:54:45 crc kubenswrapper[4919]: I0310 21:54:45.981602 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/58d4ac79-fba2-4226-94fd-2a255f79907c-client-ca\") pod \"controller-manager-7789f7fd8d-w9gvz\" (UID: \"58d4ac79-fba2-4226-94fd-2a255f79907c\") " pod="openshift-controller-manager/controller-manager-7789f7fd8d-w9gvz"
Mar 10 21:54:45 crc kubenswrapper[4919]: I0310 21:54:45.981660 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/58d4ac79-fba2-4226-94fd-2a255f79907c-serving-cert\") pod \"controller-manager-7789f7fd8d-w9gvz\" (UID: \"58d4ac79-fba2-4226-94fd-2a255f79907c\") " pod="openshift-controller-manager/controller-manager-7789f7fd8d-w9gvz"
Mar 10 21:54:45 crc kubenswrapper[4919]: I0310 21:54:45.981721 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/58d4ac79-fba2-4226-94fd-2a255f79907c-proxy-ca-bundles\") pod \"controller-manager-7789f7fd8d-w9gvz\" (UID: \"58d4ac79-fba2-4226-94fd-2a255f79907c\") " pod="openshift-controller-manager/controller-manager-7789f7fd8d-w9gvz"
Mar 10 21:54:45 crc kubenswrapper[4919]: I0310 21:54:45.981741 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ktjt4\" (UniqueName: \"kubernetes.io/projected/58d4ac79-fba2-4226-94fd-2a255f79907c-kube-api-access-ktjt4\") pod \"controller-manager-7789f7fd8d-w9gvz\" (UID: \"58d4ac79-fba2-4226-94fd-2a255f79907c\") " pod="openshift-controller-manager/controller-manager-7789f7fd8d-w9gvz"
Mar 10 21:54:45 crc kubenswrapper[4919]: I0310 21:54:45.983322 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/58d4ac79-fba2-4226-94fd-2a255f79907c-config\") pod \"controller-manager-7789f7fd8d-w9gvz\" (UID: \"58d4ac79-fba2-4226-94fd-2a255f79907c\") " pod="openshift-controller-manager/controller-manager-7789f7fd8d-w9gvz"
Mar 10 21:54:45 crc kubenswrapper[4919]: I0310 21:54:45.984001 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/58d4ac79-fba2-4226-94fd-2a255f79907c-client-ca\") pod \"controller-manager-7789f7fd8d-w9gvz\" (UID: \"58d4ac79-fba2-4226-94fd-2a255f79907c\") " pod="openshift-controller-manager/controller-manager-7789f7fd8d-w9gvz"
Mar 10 21:54:45 crc kubenswrapper[4919]: I0310 21:54:45.985539 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/58d4ac79-fba2-4226-94fd-2a255f79907c-proxy-ca-bundles\") pod \"controller-manager-7789f7fd8d-w9gvz\" (UID: \"58d4ac79-fba2-4226-94fd-2a255f79907c\") " pod="openshift-controller-manager/controller-manager-7789f7fd8d-w9gvz"
Mar 10 21:54:45 crc kubenswrapper[4919]: I0310 21:54:45.988047 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/58d4ac79-fba2-4226-94fd-2a255f79907c-serving-cert\") pod \"controller-manager-7789f7fd8d-w9gvz\" (UID: \"58d4ac79-fba2-4226-94fd-2a255f79907c\") " pod="openshift-controller-manager/controller-manager-7789f7fd8d-w9gvz"
Mar 10 21:54:46 crc kubenswrapper[4919]: I0310 21:54:46.005044 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ktjt4\" (UniqueName: \"kubernetes.io/projected/58d4ac79-fba2-4226-94fd-2a255f79907c-kube-api-access-ktjt4\") pod \"controller-manager-7789f7fd8d-w9gvz\" (UID: \"58d4ac79-fba2-4226-94fd-2a255f79907c\") " pod="openshift-controller-manager/controller-manager-7789f7fd8d-w9gvz"
Mar 10 21:54:46 crc kubenswrapper[4919]: I0310 21:54:46.140411 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Mar 10 21:54:46 crc kubenswrapper[4919]: I0310 21:54:46.146563 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7789f7fd8d-w9gvz"
Mar 10 21:54:46 crc kubenswrapper[4919]: I0310 21:54:46.161214 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-lnq5q"
Mar 10 21:54:46 crc kubenswrapper[4919]: I0310 21:54:46.206716 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-lnq5q"
Mar 10 21:54:46 crc kubenswrapper[4919]: I0310 21:54:46.285904 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f47edf2a-c391-4194-9934-ae7a03dc1193-kubelet-dir\") pod \"f47edf2a-c391-4194-9934-ae7a03dc1193\" (UID: \"f47edf2a-c391-4194-9934-ae7a03dc1193\") "
Mar 10 21:54:46 crc kubenswrapper[4919]: I0310 21:54:46.285988 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f47edf2a-c391-4194-9934-ae7a03dc1193-kube-api-access\") pod \"f47edf2a-c391-4194-9934-ae7a03dc1193\" (UID: \"f47edf2a-c391-4194-9934-ae7a03dc1193\") "
Mar 10 21:54:46 crc kubenswrapper[4919]: I0310 21:54:46.286459 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f47edf2a-c391-4194-9934-ae7a03dc1193-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "f47edf2a-c391-4194-9934-ae7a03dc1193" (UID: "f47edf2a-c391-4194-9934-ae7a03dc1193"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 10 21:54:46 crc kubenswrapper[4919]: I0310 21:54:46.291505 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f47edf2a-c391-4194-9934-ae7a03dc1193-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "f47edf2a-c391-4194-9934-ae7a03dc1193" (UID: "f47edf2a-c391-4194-9934-ae7a03dc1193"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 21:54:46 crc kubenswrapper[4919]: I0310 21:54:46.387141 4919 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f47edf2a-c391-4194-9934-ae7a03dc1193-kubelet-dir\") on node \"crc\" DevicePath \"\""
Mar 10 21:54:46 crc kubenswrapper[4919]: I0310 21:54:46.387176 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f47edf2a-c391-4194-9934-ae7a03dc1193-kube-api-access\") on node \"crc\" DevicePath \"\""
Mar 10 21:54:46 crc kubenswrapper[4919]: I0310 21:54:46.551498 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-gd58j"
Mar 10 21:54:46 crc kubenswrapper[4919]: I0310 21:54:46.622457 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7789f7fd8d-w9gvz"]
Mar 10 21:54:46 crc kubenswrapper[4919]: W0310 21:54:46.629038 4919 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod58d4ac79_fba2_4226_94fd_2a255f79907c.slice/crio-d080163e986b34cd835b37dcbb7126850b5b2cd75f9bdb117722f7c1719baaee WatchSource:0}: Error finding container d080163e986b34cd835b37dcbb7126850b5b2cd75f9bdb117722f7c1719baaee: Status 404 returned error can't find the container with id d080163e986b34cd835b37dcbb7126850b5b2cd75f9bdb117722f7c1719baaee
Mar 10 21:54:46 crc kubenswrapper[4919]: I0310 21:54:46.821865 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-654c59c45d-m2rmr"]
Mar 10 21:54:46 crc kubenswrapper[4919]: E0310 21:54:46.822432 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f47edf2a-c391-4194-9934-ae7a03dc1193" containerName="pruner"
Mar 10 21:54:46 crc kubenswrapper[4919]: I0310 21:54:46.822447 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="f47edf2a-c391-4194-9934-ae7a03dc1193" containerName="pruner"
Mar 10 21:54:46 crc kubenswrapper[4919]: I0310 21:54:46.822543 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="f47edf2a-c391-4194-9934-ae7a03dc1193" containerName="pruner"
Mar 10 21:54:46 crc kubenswrapper[4919]: I0310 21:54:46.822921 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-654c59c45d-m2rmr"
Mar 10 21:54:46 crc kubenswrapper[4919]: I0310 21:54:46.825018 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Mar 10 21:54:46 crc kubenswrapper[4919]: I0310 21:54:46.826420 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Mar 10 21:54:46 crc kubenswrapper[4919]: I0310 21:54:46.826587 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Mar 10 21:54:46 crc kubenswrapper[4919]: I0310 21:54:46.826748 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Mar 10 21:54:46 crc kubenswrapper[4919]: I0310 21:54:46.826778 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Mar 10 21:54:46 crc kubenswrapper[4919]: I0310 21:54:46.827060 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 10 21:54:46 crc kubenswrapper[4919]: I0310 21:54:46.830609 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-654c59c45d-m2rmr"]
Mar 10 21:54:46 crc kubenswrapper[4919]: I0310 21:54:46.854972 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"]
Mar 10 21:54:46 crc kubenswrapper[4919]: I0310 21:54:46.855603 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Mar 10 21:54:46 crc kubenswrapper[4919]: I0310 21:54:46.868148 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"]
Mar 10 21:54:46 crc kubenswrapper[4919]: I0310 21:54:46.930367 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7789f7fd8d-w9gvz" event={"ID":"58d4ac79-fba2-4226-94fd-2a255f79907c","Type":"ContainerStarted","Data":"cba0b9d9a22f94fd14c49643fea816d363de842d4153edea17a6045188b2aa81"}
Mar 10 21:54:46 crc kubenswrapper[4919]: I0310 21:54:46.930758 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7789f7fd8d-w9gvz" event={"ID":"58d4ac79-fba2-4226-94fd-2a255f79907c","Type":"ContainerStarted","Data":"d080163e986b34cd835b37dcbb7126850b5b2cd75f9bdb117722f7c1719baaee"}
Mar 10 21:54:46 crc kubenswrapper[4919]: I0310 21:54:46.932725 4919 generic.go:334] "Generic (PLEG): container finished" podID="b8a6c263-cf9a-41f9-8ea0-fb07b0596a35" containerID="5597d9887052ad3bb5f7a47a09eb3d40624bd526912ef8b0e30e5d988b40f6e7" exitCode=0
Mar 10 21:54:46 crc kubenswrapper[4919]: I0310 21:54:46.932779 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s8qvz" event={"ID":"b8a6c263-cf9a-41f9-8ea0-fb07b0596a35","Type":"ContainerDied","Data":"5597d9887052ad3bb5f7a47a09eb3d40624bd526912ef8b0e30e5d988b40f6e7"}
Mar 10 21:54:46 crc kubenswrapper[4919]: I0310 21:54:46.934023 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7789f7fd8d-w9gvz"
Mar 10 21:54:46 crc kubenswrapper[4919]: I0310 21:54:46.938820 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"f47edf2a-c391-4194-9934-ae7a03dc1193","Type":"ContainerDied","Data":"05b818d1199ea0718bc82bb2d4dd603218238fbeee5649cfc9542f62f1d10690"}
Mar 10 21:54:46 crc kubenswrapper[4919]: I0310 21:54:46.939174 4919 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="05b818d1199ea0718bc82bb2d4dd603218238fbeee5649cfc9542f62f1d10690"
Mar 10 21:54:46 crc kubenswrapper[4919]: I0310 21:54:46.938936 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Mar 10 21:54:46 crc kubenswrapper[4919]: I0310 21:54:46.940689 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7789f7fd8d-w9gvz"
Mar 10 21:54:46 crc kubenswrapper[4919]: I0310 21:54:46.942125 4919 generic.go:334] "Generic (PLEG): container finished" podID="ccd7b53d-726b-444f-be0f-4eb2655eb35d" containerID="55a70b9856c0e8fe8419127b5aebdd393cc009ccdd1977e81c88f3675445fc5d" exitCode=0
Mar 10 21:54:46 crc kubenswrapper[4919]: I0310 21:54:46.942499 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pw22n" event={"ID":"ccd7b53d-726b-444f-be0f-4eb2655eb35d","Type":"ContainerDied","Data":"55a70b9856c0e8fe8419127b5aebdd393cc009ccdd1977e81c88f3675445fc5d"}
Mar 10 21:54:46 crc kubenswrapper[4919]: I0310 21:54:46.948316 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7789f7fd8d-w9gvz" podStartSLOduration=2.948297462 podStartE2EDuration="2.948297462s" podCreationTimestamp="2026-03-10 21:54:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 21:54:46.945066712 +0000 UTC m=+274.186947320" watchObservedRunningTime="2026-03-10 21:54:46.948297462 +0000 UTC m=+274.190178070"
Mar 10 21:54:46 crc kubenswrapper[4919]: I0310 21:54:46.994430 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b9rdz\" (UniqueName: \"kubernetes.io/projected/22f3b36f-54ca-4eaf-bde1-c6f104b0a500-kube-api-access-b9rdz\") pod \"route-controller-manager-654c59c45d-m2rmr\" (UID: \"22f3b36f-54ca-4eaf-bde1-c6f104b0a500\") " pod="openshift-route-controller-manager/route-controller-manager-654c59c45d-m2rmr"
Mar 10 21:54:46 crc kubenswrapper[4919]: I0310 21:54:46.994475 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/22f3b36f-54ca-4eaf-bde1-c6f104b0a500-serving-cert\") pod \"route-controller-manager-654c59c45d-m2rmr\" (UID: \"22f3b36f-54ca-4eaf-bde1-c6f104b0a500\") " pod="openshift-route-controller-manager/route-controller-manager-654c59c45d-m2rmr"
Mar 10 21:54:46 crc kubenswrapper[4919]: I0310 21:54:46.994519 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22f3b36f-54ca-4eaf-bde1-c6f104b0a500-config\") pod \"route-controller-manager-654c59c45d-m2rmr\" (UID: \"22f3b36f-54ca-4eaf-bde1-c6f104b0a500\") " pod="openshift-route-controller-manager/route-controller-manager-654c59c45d-m2rmr"
Mar 10 21:54:46 crc kubenswrapper[4919]: I0310 21:54:46.994539 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/22f3b36f-54ca-4eaf-bde1-c6f104b0a500-client-ca\") pod \"route-controller-manager-654c59c45d-m2rmr\" (UID: \"22f3b36f-54ca-4eaf-bde1-c6f104b0a500\") " pod="openshift-route-controller-manager/route-controller-manager-654c59c45d-m2rmr"
Mar 10 21:54:46 crc kubenswrapper[4919]: I0310 21:54:46.994604 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/df1d5909-9408-4ac0-a762-5298b301ae59-kube-api-access\") pod \"installer-9-crc\" (UID: \"df1d5909-9408-4ac0-a762-5298b301ae59\") " pod="openshift-kube-apiserver/installer-9-crc"
Mar 10 21:54:46 crc kubenswrapper[4919]: I0310 21:54:46.994663 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/df1d5909-9408-4ac0-a762-5298b301ae59-var-lock\") pod \"installer-9-crc\" (UID: \"df1d5909-9408-4ac0-a762-5298b301ae59\") " pod="openshift-kube-apiserver/installer-9-crc"
Mar 10 21:54:46 crc kubenswrapper[4919]: I0310 21:54:46.994693 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/df1d5909-9408-4ac0-a762-5298b301ae59-kubelet-dir\") pod \"installer-9-crc\" (UID: \"df1d5909-9408-4ac0-a762-5298b301ae59\") " pod="openshift-kube-apiserver/installer-9-crc"
Mar 10 21:54:47 crc kubenswrapper[4919]: I0310 21:54:47.096056 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22f3b36f-54ca-4eaf-bde1-c6f104b0a500-config\") pod \"route-controller-manager-654c59c45d-m2rmr\" (UID: \"22f3b36f-54ca-4eaf-bde1-c6f104b0a500\") " pod="openshift-route-controller-manager/route-controller-manager-654c59c45d-m2rmr"
Mar 10 21:54:47 crc kubenswrapper[4919]: I0310 21:54:47.096127 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/22f3b36f-54ca-4eaf-bde1-c6f104b0a500-client-ca\") pod \"route-controller-manager-654c59c45d-m2rmr\" (UID: \"22f3b36f-54ca-4eaf-bde1-c6f104b0a500\") " pod="openshift-route-controller-manager/route-controller-manager-654c59c45d-m2rmr"
Mar 10 21:54:47 crc kubenswrapper[4919]: I0310 21:54:47.096182 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/df1d5909-9408-4ac0-a762-5298b301ae59-kube-api-access\") pod \"installer-9-crc\" (UID: \"df1d5909-9408-4ac0-a762-5298b301ae59\") " pod="openshift-kube-apiserver/installer-9-crc"
Mar 10 21:54:47 crc kubenswrapper[4919]: I0310 21:54:47.096273 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/df1d5909-9408-4ac0-a762-5298b301ae59-var-lock\") pod \"installer-9-crc\" (UID: \"df1d5909-9408-4ac0-a762-5298b301ae59\") " pod="openshift-kube-apiserver/installer-9-crc"
Mar 10 21:54:47 crc kubenswrapper[4919]: I0310 21:54:47.096299 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/df1d5909-9408-4ac0-a762-5298b301ae59-kubelet-dir\") pod \"installer-9-crc\" (UID: \"df1d5909-9408-4ac0-a762-5298b301ae59\") " pod="openshift-kube-apiserver/installer-9-crc"
Mar 10 21:54:47 crc kubenswrapper[4919]: I0310 21:54:47.096325 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b9rdz\" (UniqueName: \"kubernetes.io/projected/22f3b36f-54ca-4eaf-bde1-c6f104b0a500-kube-api-access-b9rdz\") pod \"route-controller-manager-654c59c45d-m2rmr\" (UID: \"22f3b36f-54ca-4eaf-bde1-c6f104b0a500\") " pod="openshift-route-controller-manager/route-controller-manager-654c59c45d-m2rmr"
Mar 10 21:54:47 crc kubenswrapper[4919]: I0310 21:54:47.096353 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/22f3b36f-54ca-4eaf-bde1-c6f104b0a500-serving-cert\") pod \"route-controller-manager-654c59c45d-m2rmr\" (UID: \"22f3b36f-54ca-4eaf-bde1-c6f104b0a500\") " pod="openshift-route-controller-manager/route-controller-manager-654c59c45d-m2rmr"
Mar 10 21:54:47 crc kubenswrapper[4919]: I0310 21:54:47.098199 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/22f3b36f-54ca-4eaf-bde1-c6f104b0a500-client-ca\") pod \"route-controller-manager-654c59c45d-m2rmr\" (UID: \"22f3b36f-54ca-4eaf-bde1-c6f104b0a500\") " pod="openshift-route-controller-manager/route-controller-manager-654c59c45d-m2rmr"
Mar 10 21:54:47 crc kubenswrapper[4919]: I0310 21:54:47.098443 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/df1d5909-9408-4ac0-a762-5298b301ae59-var-lock\") pod \"installer-9-crc\" (UID: \"df1d5909-9408-4ac0-a762-5298b301ae59\") " pod="openshift-kube-apiserver/installer-9-crc"
Mar 10 21:54:47 crc kubenswrapper[4919]: I0310 21:54:47.098468 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/df1d5909-9408-4ac0-a762-5298b301ae59-kubelet-dir\") pod \"installer-9-crc\" (UID: \"df1d5909-9408-4ac0-a762-5298b301ae59\") " pod="openshift-kube-apiserver/installer-9-crc"
Mar 10 21:54:47 crc kubenswrapper[4919]: I0310 21:54:47.098520 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22f3b36f-54ca-4eaf-bde1-c6f104b0a500-config\") pod \"route-controller-manager-654c59c45d-m2rmr\" (UID: \"22f3b36f-54ca-4eaf-bde1-c6f104b0a500\") " pod="openshift-route-controller-manager/route-controller-manager-654c59c45d-m2rmr"
Mar 10 21:54:47 crc kubenswrapper[4919]: I0310 21:54:47.104416 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/22f3b36f-54ca-4eaf-bde1-c6f104b0a500-serving-cert\") pod \"route-controller-manager-654c59c45d-m2rmr\" (UID: \"22f3b36f-54ca-4eaf-bde1-c6f104b0a500\") " pod="openshift-route-controller-manager/route-controller-manager-654c59c45d-m2rmr"
Mar 10 21:54:47 crc kubenswrapper[4919]: I0310 21:54:47.113976 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/df1d5909-9408-4ac0-a762-5298b301ae59-kube-api-access\") pod \"installer-9-crc\" (UID: \"df1d5909-9408-4ac0-a762-5298b301ae59\") " pod="openshift-kube-apiserver/installer-9-crc"
Mar 10 21:54:47 crc kubenswrapper[4919]: I0310 21:54:47.143984 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b9rdz\" (UniqueName: \"kubernetes.io/projected/22f3b36f-54ca-4eaf-bde1-c6f104b0a500-kube-api-access-b9rdz\") pod \"route-controller-manager-654c59c45d-m2rmr\" (UID: \"22f3b36f-54ca-4eaf-bde1-c6f104b0a500\") " pod="openshift-route-controller-manager/route-controller-manager-654c59c45d-m2rmr"
Mar 10 21:54:47 crc kubenswrapper[4919]: I0310 21:54:47.193571 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-654c59c45d-m2rmr"
Mar 10 21:54:47 crc kubenswrapper[4919]: I0310 21:54:47.200890 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Mar 10 21:54:47 crc kubenswrapper[4919]: I0310 21:54:47.531582 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d63dfa4-4174-484d-a783-cb65de9f56e0" path="/var/lib/kubelet/pods/3d63dfa4-4174-484d-a783-cb65de9f56e0/volumes"
Mar 10 21:54:47 crc kubenswrapper[4919]: I0310 21:54:47.551718 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec35e4f4-8013-49e0-ad6e-efaede5a41ca" path="/var/lib/kubelet/pods/ec35e4f4-8013-49e0-ad6e-efaede5a41ca/volumes"
Mar 10 21:54:47 crc kubenswrapper[4919]: I0310 21:54:47.552370 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-f4nt4"
Mar 10 21:54:47 crc kubenswrapper[4919]: I0310 21:54:47.552421 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-654c59c45d-m2rmr"]
Mar 10 21:54:47 crc kubenswrapper[4919]: I0310 21:54:47.552441 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-f4nt4"
Mar 10 21:54:47 crc kubenswrapper[4919]: I0310 21:54:47.606020 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"]
Mar 10 21:54:47 crc kubenswrapper[4919]: I0310 21:54:47.933720 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-4l8gq"
Mar 10 21:54:47 crc kubenswrapper[4919]: I0310 21:54:47.933933 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-4l8gq"
Mar 10 21:54:47 crc kubenswrapper[4919]: I0310 21:54:47.971124 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s8qvz" event={"ID":"b8a6c263-cf9a-41f9-8ea0-fb07b0596a35","Type":"ContainerStarted","Data":"7e52e72bd6cec0cf50811ec53ec531224f23725c50a9a5e4de595dfd7bc9d6a1"}
Mar 10 21:54:47 crc kubenswrapper[4919]: I0310 21:54:47.972903 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pw22n" event={"ID":"ccd7b53d-726b-444f-be0f-4eb2655eb35d","Type":"ContainerStarted","Data":"f9f25006ddf0e6f5079d6de5f9c62723dfd9c3d04a9dd09f0e189bf16aee56e2"}
Mar 10 21:54:47 crc kubenswrapper[4919]: I0310 21:54:47.974940 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-74hb6" event={"ID":"a979e53c-0904-4fc0-9ef4-16706a351785","Type":"ContainerStarted","Data":"da69317c0da665623e9554783c692e0c6676b0d907d7cb7db923464e12dc4fa0"}
Mar 10 21:54:47 crc kubenswrapper[4919]: I0310 21:54:47.976872 4919 generic.go:334] "Generic (PLEG): container finished" podID="fb39d6ee-8d9d-4cd6-811a-fbbb8d7e018c" containerID="26bec9d31bb9f2294e6bda808e55542a397e059992c2e0abbcef1a37c96fd652" exitCode=0
Mar 10 21:54:47 crc kubenswrapper[4919]: I0310 21:54:47.976926 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bx4lk" event={"ID":"fb39d6ee-8d9d-4cd6-811a-fbbb8d7e018c","Type":"ContainerDied","Data":"26bec9d31bb9f2294e6bda808e55542a397e059992c2e0abbcef1a37c96fd652"}
Mar 10 21:54:47 crc kubenswrapper[4919]: I0310 21:54:47.979434 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"df1d5909-9408-4ac0-a762-5298b301ae59","Type":"ContainerStarted","Data":"d61e923efa60cc26c36f6f8896ffa2d064505f57043c37b69061f529c5a0bc7b"}
Mar 10 21:54:47 crc kubenswrapper[4919]: I0310 21:54:47.981928 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-654c59c45d-m2rmr" event={"ID":"22f3b36f-54ca-4eaf-bde1-c6f104b0a500","Type":"ContainerStarted","Data":"964e1562e78301ab50d0185926b887f46260604c13cdfff52663268f8e43c298"}
Mar 10 21:54:47 crc kubenswrapper[4919]: I0310 21:54:47.981957 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-654c59c45d-m2rmr" event={"ID":"22f3b36f-54ca-4eaf-bde1-c6f104b0a500","Type":"ContainerStarted","Data":"02cf89a9896b087d6a45ff51be59c4426d3ff5506c4fd826fa80ef5e396bce75"}
Mar 10 21:54:48 crc kubenswrapper[4919]: I0310 21:54:48.029432 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-s8qvz" podStartSLOduration=3.943191443 podStartE2EDuration="45.029417849s" podCreationTimestamp="2026-03-10 21:54:03 +0000 UTC" firstStartedPulling="2026-03-10 21:54:06.319680514 +0000 UTC m=+233.561561122" lastFinishedPulling="2026-03-10 21:54:47.40590692 +0000 UTC m=+274.647787528" observedRunningTime="2026-03-10 21:54:48.000557755 +0000 UTC m=+275.242438383" watchObservedRunningTime="2026-03-10 21:54:48.029417849 +0000 UTC m=+275.271298457"
Mar 10 21:54:48 crc kubenswrapper[4919]: I0310 21:54:48.050286 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-654c59c45d-m2rmr" podStartSLOduration=4.050268921 podStartE2EDuration="4.050268921s" podCreationTimestamp="2026-03-10 21:54:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-10 21:54:48.047854341 +0000 UTC m=+275.289734949" watchObservedRunningTime="2026-03-10 21:54:48.050268921 +0000 UTC m=+275.292149529" Mar 10 21:54:48 crc kubenswrapper[4919]: I0310 21:54:48.085644 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-pw22n" podStartSLOduration=3.90145978 podStartE2EDuration="45.085622749s" podCreationTimestamp="2026-03-10 21:54:03 +0000 UTC" firstStartedPulling="2026-03-10 21:54:06.331138915 +0000 UTC m=+233.573019523" lastFinishedPulling="2026-03-10 21:54:47.515301884 +0000 UTC m=+274.757182492" observedRunningTime="2026-03-10 21:54:48.085528396 +0000 UTC m=+275.327408994" watchObservedRunningTime="2026-03-10 21:54:48.085622749 +0000 UTC m=+275.327503357" Mar 10 21:54:48 crc kubenswrapper[4919]: I0310 21:54:48.586309 4919 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-f4nt4" podUID="28b0abdd-217d-42f6-80fb-b270be44700e" containerName="registry-server" probeResult="failure" output=< Mar 10 21:54:48 crc kubenswrapper[4919]: timeout: failed to connect service ":50051" within 1s Mar 10 21:54:48 crc kubenswrapper[4919]: > Mar 10 21:54:48 crc kubenswrapper[4919]: I0310 21:54:48.976425 4919 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-4l8gq" podUID="e2db42cd-0c43-41be-a881-199c82f703bd" containerName="registry-server" probeResult="failure" output=< Mar 10 21:54:48 crc kubenswrapper[4919]: timeout: failed to connect service ":50051" within 1s Mar 10 21:54:48 crc kubenswrapper[4919]: > Mar 10 21:54:48 crc kubenswrapper[4919]: I0310 21:54:48.988317 4919 generic.go:334] "Generic (PLEG): container finished" podID="a979e53c-0904-4fc0-9ef4-16706a351785" containerID="da69317c0da665623e9554783c692e0c6676b0d907d7cb7db923464e12dc4fa0" exitCode=0 Mar 10 21:54:48 crc kubenswrapper[4919]: I0310 21:54:48.988427 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-74hb6" event={"ID":"a979e53c-0904-4fc0-9ef4-16706a351785","Type":"ContainerDied","Data":"da69317c0da665623e9554783c692e0c6676b0d907d7cb7db923464e12dc4fa0"} Mar 10 21:54:48 crc kubenswrapper[4919]: I0310 21:54:48.990332 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bx4lk" event={"ID":"fb39d6ee-8d9d-4cd6-811a-fbbb8d7e018c","Type":"ContainerStarted","Data":"07b6018309a2b6064229c613542dbbbf15c56d5c0393f34e085acf92affff390"} Mar 10 21:54:48 crc kubenswrapper[4919]: I0310 21:54:48.991956 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"df1d5909-9408-4ac0-a762-5298b301ae59","Type":"ContainerStarted","Data":"bd1de06a2a06203a5013c9e513925456cdfb0378e2f5fd7e5515cc0ababe9a26"} Mar 10 21:54:48 crc kubenswrapper[4919]: I0310 21:54:48.992178 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-654c59c45d-m2rmr" Mar 10 21:54:48 crc kubenswrapper[4919]: I0310 21:54:48.998438 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-654c59c45d-m2rmr" Mar 10 21:54:49 crc kubenswrapper[4919]: I0310 21:54:49.020742 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=3.020725301 podStartE2EDuration="3.020725301s" podCreationTimestamp="2026-03-10 21:54:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 21:54:49.018244589 +0000 UTC m=+276.260125197" watchObservedRunningTime="2026-03-10 21:54:49.020725301 +0000 UTC m=+276.262605899" Mar 10 21:54:49 crc kubenswrapper[4919]: I0310 21:54:49.053902 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/community-operators-bx4lk" podStartSLOduration=2.907290969 podStartE2EDuration="45.053885922s" podCreationTimestamp="2026-03-10 21:54:04 +0000 UTC" firstStartedPulling="2026-03-10 21:54:06.347683155 +0000 UTC m=+233.589563763" lastFinishedPulling="2026-03-10 21:54:48.494278108 +0000 UTC m=+275.736158716" observedRunningTime="2026-03-10 21:54:49.049746248 +0000 UTC m=+276.291626866" watchObservedRunningTime="2026-03-10 21:54:49.053885922 +0000 UTC m=+276.295766550" Mar 10 21:54:49 crc kubenswrapper[4919]: I0310 21:54:49.922378 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-gd58j"] Mar 10 21:54:49 crc kubenswrapper[4919]: I0310 21:54:49.922887 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-gd58j" podUID="0c5f7639-6abe-4578-81f0-17691f1ad5ef" containerName="registry-server" containerID="cri-o://2eb5d932d4f718b787b32fea2f80042b5b41716a1c318bbfc0bb891a5200df04" gracePeriod=2 Mar 10 21:54:50 crc kubenswrapper[4919]: I0310 21:54:50.368278 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gd58j" Mar 10 21:54:50 crc kubenswrapper[4919]: I0310 21:54:50.455411 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4gzzm\" (UniqueName: \"kubernetes.io/projected/0c5f7639-6abe-4578-81f0-17691f1ad5ef-kube-api-access-4gzzm\") pod \"0c5f7639-6abe-4578-81f0-17691f1ad5ef\" (UID: \"0c5f7639-6abe-4578-81f0-17691f1ad5ef\") " Mar 10 21:54:50 crc kubenswrapper[4919]: I0310 21:54:50.455500 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c5f7639-6abe-4578-81f0-17691f1ad5ef-catalog-content\") pod \"0c5f7639-6abe-4578-81f0-17691f1ad5ef\" (UID: \"0c5f7639-6abe-4578-81f0-17691f1ad5ef\") " Mar 10 21:54:50 crc kubenswrapper[4919]: I0310 21:54:50.455537 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c5f7639-6abe-4578-81f0-17691f1ad5ef-utilities\") pod \"0c5f7639-6abe-4578-81f0-17691f1ad5ef\" (UID: \"0c5f7639-6abe-4578-81f0-17691f1ad5ef\") " Mar 10 21:54:50 crc kubenswrapper[4919]: I0310 21:54:50.456292 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0c5f7639-6abe-4578-81f0-17691f1ad5ef-utilities" (OuterVolumeSpecName: "utilities") pod "0c5f7639-6abe-4578-81f0-17691f1ad5ef" (UID: "0c5f7639-6abe-4578-81f0-17691f1ad5ef"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 21:54:50 crc kubenswrapper[4919]: I0310 21:54:50.460267 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c5f7639-6abe-4578-81f0-17691f1ad5ef-kube-api-access-4gzzm" (OuterVolumeSpecName: "kube-api-access-4gzzm") pod "0c5f7639-6abe-4578-81f0-17691f1ad5ef" (UID: "0c5f7639-6abe-4578-81f0-17691f1ad5ef"). InnerVolumeSpecName "kube-api-access-4gzzm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 21:54:50 crc kubenswrapper[4919]: I0310 21:54:50.486470 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0c5f7639-6abe-4578-81f0-17691f1ad5ef-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0c5f7639-6abe-4578-81f0-17691f1ad5ef" (UID: "0c5f7639-6abe-4578-81f0-17691f1ad5ef"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 21:54:50 crc kubenswrapper[4919]: I0310 21:54:50.556524 4919 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c5f7639-6abe-4578-81f0-17691f1ad5ef-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 21:54:50 crc kubenswrapper[4919]: I0310 21:54:50.556551 4919 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c5f7639-6abe-4578-81f0-17691f1ad5ef-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 21:54:50 crc kubenswrapper[4919]: I0310 21:54:50.556560 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4gzzm\" (UniqueName: \"kubernetes.io/projected/0c5f7639-6abe-4578-81f0-17691f1ad5ef-kube-api-access-4gzzm\") on node \"crc\" DevicePath \"\"" Mar 10 21:54:51 crc kubenswrapper[4919]: I0310 21:54:51.007127 4919 generic.go:334] "Generic (PLEG): container finished" podID="0c5f7639-6abe-4578-81f0-17691f1ad5ef" containerID="2eb5d932d4f718b787b32fea2f80042b5b41716a1c318bbfc0bb891a5200df04" exitCode=0 Mar 10 21:54:51 crc kubenswrapper[4919]: I0310 21:54:51.007452 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gd58j" event={"ID":"0c5f7639-6abe-4578-81f0-17691f1ad5ef","Type":"ContainerDied","Data":"2eb5d932d4f718b787b32fea2f80042b5b41716a1c318bbfc0bb891a5200df04"} Mar 10 21:54:51 crc kubenswrapper[4919]: I0310 21:54:51.007479 4919 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-gd58j" event={"ID":"0c5f7639-6abe-4578-81f0-17691f1ad5ef","Type":"ContainerDied","Data":"477a30eb110821e742b920f7a6eca55e0ffcb85c3897be76e5867cea56215a7c"} Mar 10 21:54:51 crc kubenswrapper[4919]: I0310 21:54:51.007496 4919 scope.go:117] "RemoveContainer" containerID="2eb5d932d4f718b787b32fea2f80042b5b41716a1c318bbfc0bb891a5200df04" Mar 10 21:54:51 crc kubenswrapper[4919]: I0310 21:54:51.007656 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gd58j" Mar 10 21:54:51 crc kubenswrapper[4919]: I0310 21:54:51.018973 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-74hb6" event={"ID":"a979e53c-0904-4fc0-9ef4-16706a351785","Type":"ContainerStarted","Data":"cecc1c915d7f4c05bbb55b52a7b0867b7bac85ca5fb085c748cb16256afcc645"} Mar 10 21:54:51 crc kubenswrapper[4919]: I0310 21:54:51.035710 4919 scope.go:117] "RemoveContainer" containerID="b18e2171dfd5e3b855b2022831d88f1d7da29e7b880185af0e35dd72216c8d31" Mar 10 21:54:51 crc kubenswrapper[4919]: I0310 21:54:51.043124 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-74hb6" podStartSLOduration=3.180545023 podStartE2EDuration="47.043107253s" podCreationTimestamp="2026-03-10 21:54:04 +0000 UTC" firstStartedPulling="2026-03-10 21:54:06.347332476 +0000 UTC m=+233.589213084" lastFinishedPulling="2026-03-10 21:54:50.209894716 +0000 UTC m=+277.451775314" observedRunningTime="2026-03-10 21:54:51.041561944 +0000 UTC m=+278.283442572" watchObservedRunningTime="2026-03-10 21:54:51.043107253 +0000 UTC m=+278.284987861" Mar 10 21:54:51 crc kubenswrapper[4919]: I0310 21:54:51.057518 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-gd58j"] Mar 10 21:54:51 crc kubenswrapper[4919]: I0310 21:54:51.062949 4919 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openshift-marketplace/redhat-marketplace-gd58j"] Mar 10 21:54:51 crc kubenswrapper[4919]: I0310 21:54:51.066851 4919 scope.go:117] "RemoveContainer" containerID="9d2e9cf8fbf79ba030395b11dd87382f6914cabf8a0d93f64da8aa8899bdb84e" Mar 10 21:54:51 crc kubenswrapper[4919]: I0310 21:54:51.080517 4919 scope.go:117] "RemoveContainer" containerID="2eb5d932d4f718b787b32fea2f80042b5b41716a1c318bbfc0bb891a5200df04" Mar 10 21:54:51 crc kubenswrapper[4919]: E0310 21:54:51.080913 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2eb5d932d4f718b787b32fea2f80042b5b41716a1c318bbfc0bb891a5200df04\": container with ID starting with 2eb5d932d4f718b787b32fea2f80042b5b41716a1c318bbfc0bb891a5200df04 not found: ID does not exist" containerID="2eb5d932d4f718b787b32fea2f80042b5b41716a1c318bbfc0bb891a5200df04" Mar 10 21:54:51 crc kubenswrapper[4919]: I0310 21:54:51.080956 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2eb5d932d4f718b787b32fea2f80042b5b41716a1c318bbfc0bb891a5200df04"} err="failed to get container status \"2eb5d932d4f718b787b32fea2f80042b5b41716a1c318bbfc0bb891a5200df04\": rpc error: code = NotFound desc = could not find container \"2eb5d932d4f718b787b32fea2f80042b5b41716a1c318bbfc0bb891a5200df04\": container with ID starting with 2eb5d932d4f718b787b32fea2f80042b5b41716a1c318bbfc0bb891a5200df04 not found: ID does not exist" Mar 10 21:54:51 crc kubenswrapper[4919]: I0310 21:54:51.080982 4919 scope.go:117] "RemoveContainer" containerID="b18e2171dfd5e3b855b2022831d88f1d7da29e7b880185af0e35dd72216c8d31" Mar 10 21:54:51 crc kubenswrapper[4919]: E0310 21:54:51.081197 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b18e2171dfd5e3b855b2022831d88f1d7da29e7b880185af0e35dd72216c8d31\": container with ID starting with 
b18e2171dfd5e3b855b2022831d88f1d7da29e7b880185af0e35dd72216c8d31 not found: ID does not exist" containerID="b18e2171dfd5e3b855b2022831d88f1d7da29e7b880185af0e35dd72216c8d31" Mar 10 21:54:51 crc kubenswrapper[4919]: I0310 21:54:51.081226 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b18e2171dfd5e3b855b2022831d88f1d7da29e7b880185af0e35dd72216c8d31"} err="failed to get container status \"b18e2171dfd5e3b855b2022831d88f1d7da29e7b880185af0e35dd72216c8d31\": rpc error: code = NotFound desc = could not find container \"b18e2171dfd5e3b855b2022831d88f1d7da29e7b880185af0e35dd72216c8d31\": container with ID starting with b18e2171dfd5e3b855b2022831d88f1d7da29e7b880185af0e35dd72216c8d31 not found: ID does not exist" Mar 10 21:54:51 crc kubenswrapper[4919]: I0310 21:54:51.081243 4919 scope.go:117] "RemoveContainer" containerID="9d2e9cf8fbf79ba030395b11dd87382f6914cabf8a0d93f64da8aa8899bdb84e" Mar 10 21:54:51 crc kubenswrapper[4919]: E0310 21:54:51.081443 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d2e9cf8fbf79ba030395b11dd87382f6914cabf8a0d93f64da8aa8899bdb84e\": container with ID starting with 9d2e9cf8fbf79ba030395b11dd87382f6914cabf8a0d93f64da8aa8899bdb84e not found: ID does not exist" containerID="9d2e9cf8fbf79ba030395b11dd87382f6914cabf8a0d93f64da8aa8899bdb84e" Mar 10 21:54:51 crc kubenswrapper[4919]: I0310 21:54:51.081479 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d2e9cf8fbf79ba030395b11dd87382f6914cabf8a0d93f64da8aa8899bdb84e"} err="failed to get container status \"9d2e9cf8fbf79ba030395b11dd87382f6914cabf8a0d93f64da8aa8899bdb84e\": rpc error: code = NotFound desc = could not find container \"9d2e9cf8fbf79ba030395b11dd87382f6914cabf8a0d93f64da8aa8899bdb84e\": container with ID starting with 9d2e9cf8fbf79ba030395b11dd87382f6914cabf8a0d93f64da8aa8899bdb84e not found: ID does not 
exist" Mar 10 21:54:51 crc kubenswrapper[4919]: I0310 21:54:51.488090 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c5f7639-6abe-4578-81f0-17691f1ad5ef" path="/var/lib/kubelet/pods/0c5f7639-6abe-4578-81f0-17691f1ad5ef/volumes" Mar 10 21:54:54 crc kubenswrapper[4919]: I0310 21:54:54.141172 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-pw22n" Mar 10 21:54:54 crc kubenswrapper[4919]: I0310 21:54:54.141600 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-pw22n" Mar 10 21:54:54 crc kubenswrapper[4919]: I0310 21:54:54.184626 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-pw22n" Mar 10 21:54:54 crc kubenswrapper[4919]: I0310 21:54:54.286572 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-s8qvz" Mar 10 21:54:54 crc kubenswrapper[4919]: I0310 21:54:54.286636 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-s8qvz" Mar 10 21:54:54 crc kubenswrapper[4919]: I0310 21:54:54.320581 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-s8qvz" Mar 10 21:54:54 crc kubenswrapper[4919]: I0310 21:54:54.569493 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-bx4lk" Mar 10 21:54:54 crc kubenswrapper[4919]: I0310 21:54:54.569570 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-bx4lk" Mar 10 21:54:54 crc kubenswrapper[4919]: I0310 21:54:54.621376 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-bx4lk" Mar 10 21:54:54 crc 
kubenswrapper[4919]: I0310 21:54:54.895194 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-74hb6" Mar 10 21:54:54 crc kubenswrapper[4919]: I0310 21:54:54.895284 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-74hb6" Mar 10 21:54:54 crc kubenswrapper[4919]: I0310 21:54:54.953969 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-74hb6" Mar 10 21:54:55 crc kubenswrapper[4919]: I0310 21:54:55.100362 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-pw22n" Mar 10 21:54:55 crc kubenswrapper[4919]: I0310 21:54:55.101577 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-s8qvz" Mar 10 21:54:55 crc kubenswrapper[4919]: I0310 21:54:55.111891 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-bx4lk" Mar 10 21:54:55 crc kubenswrapper[4919]: I0310 21:54:55.115253 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-74hb6" Mar 10 21:54:56 crc kubenswrapper[4919]: I0310 21:54:56.537707 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-74hb6"] Mar 10 21:54:57 crc kubenswrapper[4919]: I0310 21:54:57.060559 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-74hb6" podUID="a979e53c-0904-4fc0-9ef4-16706a351785" containerName="registry-server" containerID="cri-o://cecc1c915d7f4c05bbb55b52a7b0867b7bac85ca5fb085c748cb16256afcc645" gracePeriod=2 Mar 10 21:54:57 crc kubenswrapper[4919]: I0310 21:54:57.607092 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/redhat-operators-f4nt4" Mar 10 21:54:57 crc kubenswrapper[4919]: I0310 21:54:57.674705 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-f4nt4" Mar 10 21:54:57 crc kubenswrapper[4919]: I0310 21:54:57.968838 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-4l8gq" Mar 10 21:54:58 crc kubenswrapper[4919]: I0310 21:54:58.014222 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-4l8gq" Mar 10 21:54:58 crc kubenswrapper[4919]: I0310 21:54:58.070051 4919 generic.go:334] "Generic (PLEG): container finished" podID="a979e53c-0904-4fc0-9ef4-16706a351785" containerID="cecc1c915d7f4c05bbb55b52a7b0867b7bac85ca5fb085c748cb16256afcc645" exitCode=0 Mar 10 21:54:58 crc kubenswrapper[4919]: I0310 21:54:58.070120 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-74hb6" event={"ID":"a979e53c-0904-4fc0-9ef4-16706a351785","Type":"ContainerDied","Data":"cecc1c915d7f4c05bbb55b52a7b0867b7bac85ca5fb085c748cb16256afcc645"} Mar 10 21:54:58 crc kubenswrapper[4919]: I0310 21:54:58.256866 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-74hb6" Mar 10 21:54:58 crc kubenswrapper[4919]: I0310 21:54:58.323746 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-bx4lk"] Mar 10 21:54:58 crc kubenswrapper[4919]: I0310 21:54:58.324003 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-bx4lk" podUID="fb39d6ee-8d9d-4cd6-811a-fbbb8d7e018c" containerName="registry-server" containerID="cri-o://07b6018309a2b6064229c613542dbbbf15c56d5c0393f34e085acf92affff390" gracePeriod=2 Mar 10 21:54:58 crc kubenswrapper[4919]: I0310 21:54:58.360801 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a979e53c-0904-4fc0-9ef4-16706a351785-catalog-content\") pod \"a979e53c-0904-4fc0-9ef4-16706a351785\" (UID: \"a979e53c-0904-4fc0-9ef4-16706a351785\") " Mar 10 21:54:58 crc kubenswrapper[4919]: I0310 21:54:58.360935 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a979e53c-0904-4fc0-9ef4-16706a351785-utilities\") pod \"a979e53c-0904-4fc0-9ef4-16706a351785\" (UID: \"a979e53c-0904-4fc0-9ef4-16706a351785\") " Mar 10 21:54:58 crc kubenswrapper[4919]: I0310 21:54:58.361176 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sz2bm\" (UniqueName: \"kubernetes.io/projected/a979e53c-0904-4fc0-9ef4-16706a351785-kube-api-access-sz2bm\") pod \"a979e53c-0904-4fc0-9ef4-16706a351785\" (UID: \"a979e53c-0904-4fc0-9ef4-16706a351785\") " Mar 10 21:54:58 crc kubenswrapper[4919]: I0310 21:54:58.361675 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a979e53c-0904-4fc0-9ef4-16706a351785-utilities" (OuterVolumeSpecName: "utilities") pod "a979e53c-0904-4fc0-9ef4-16706a351785" (UID: 
"a979e53c-0904-4fc0-9ef4-16706a351785"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 21:54:58 crc kubenswrapper[4919]: I0310 21:54:58.361764 4919 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a979e53c-0904-4fc0-9ef4-16706a351785-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 21:54:58 crc kubenswrapper[4919]: I0310 21:54:58.377620 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a979e53c-0904-4fc0-9ef4-16706a351785-kube-api-access-sz2bm" (OuterVolumeSpecName: "kube-api-access-sz2bm") pod "a979e53c-0904-4fc0-9ef4-16706a351785" (UID: "a979e53c-0904-4fc0-9ef4-16706a351785"). InnerVolumeSpecName "kube-api-access-sz2bm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 21:54:58 crc kubenswrapper[4919]: I0310 21:54:58.410016 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a979e53c-0904-4fc0-9ef4-16706a351785-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a979e53c-0904-4fc0-9ef4-16706a351785" (UID: "a979e53c-0904-4fc0-9ef4-16706a351785"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 21:54:58 crc kubenswrapper[4919]: I0310 21:54:58.463145 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sz2bm\" (UniqueName: \"kubernetes.io/projected/a979e53c-0904-4fc0-9ef4-16706a351785-kube-api-access-sz2bm\") on node \"crc\" DevicePath \"\"" Mar 10 21:54:58 crc kubenswrapper[4919]: I0310 21:54:58.463198 4919 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a979e53c-0904-4fc0-9ef4-16706a351785-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 21:54:58 crc kubenswrapper[4919]: I0310 21:54:58.736978 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-bx4lk" Mar 10 21:54:58 crc kubenswrapper[4919]: I0310 21:54:58.867432 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4g7mn\" (UniqueName: \"kubernetes.io/projected/fb39d6ee-8d9d-4cd6-811a-fbbb8d7e018c-kube-api-access-4g7mn\") pod \"fb39d6ee-8d9d-4cd6-811a-fbbb8d7e018c\" (UID: \"fb39d6ee-8d9d-4cd6-811a-fbbb8d7e018c\") " Mar 10 21:54:58 crc kubenswrapper[4919]: I0310 21:54:58.867486 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb39d6ee-8d9d-4cd6-811a-fbbb8d7e018c-utilities\") pod \"fb39d6ee-8d9d-4cd6-811a-fbbb8d7e018c\" (UID: \"fb39d6ee-8d9d-4cd6-811a-fbbb8d7e018c\") " Mar 10 21:54:58 crc kubenswrapper[4919]: I0310 21:54:58.867512 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb39d6ee-8d9d-4cd6-811a-fbbb8d7e018c-catalog-content\") pod \"fb39d6ee-8d9d-4cd6-811a-fbbb8d7e018c\" (UID: \"fb39d6ee-8d9d-4cd6-811a-fbbb8d7e018c\") " Mar 10 21:54:58 crc kubenswrapper[4919]: I0310 21:54:58.868541 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb39d6ee-8d9d-4cd6-811a-fbbb8d7e018c-utilities" (OuterVolumeSpecName: "utilities") pod "fb39d6ee-8d9d-4cd6-811a-fbbb8d7e018c" (UID: "fb39d6ee-8d9d-4cd6-811a-fbbb8d7e018c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 21:54:58 crc kubenswrapper[4919]: I0310 21:54:58.871992 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb39d6ee-8d9d-4cd6-811a-fbbb8d7e018c-kube-api-access-4g7mn" (OuterVolumeSpecName: "kube-api-access-4g7mn") pod "fb39d6ee-8d9d-4cd6-811a-fbbb8d7e018c" (UID: "fb39d6ee-8d9d-4cd6-811a-fbbb8d7e018c"). InnerVolumeSpecName "kube-api-access-4g7mn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 21:54:58 crc kubenswrapper[4919]: I0310 21:54:58.947130 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb39d6ee-8d9d-4cd6-811a-fbbb8d7e018c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fb39d6ee-8d9d-4cd6-811a-fbbb8d7e018c" (UID: "fb39d6ee-8d9d-4cd6-811a-fbbb8d7e018c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 21:54:58 crc kubenswrapper[4919]: I0310 21:54:58.969701 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4g7mn\" (UniqueName: \"kubernetes.io/projected/fb39d6ee-8d9d-4cd6-811a-fbbb8d7e018c-kube-api-access-4g7mn\") on node \"crc\" DevicePath \"\"" Mar 10 21:54:58 crc kubenswrapper[4919]: I0310 21:54:58.969740 4919 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb39d6ee-8d9d-4cd6-811a-fbbb8d7e018c-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 21:54:58 crc kubenswrapper[4919]: I0310 21:54:58.969758 4919 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb39d6ee-8d9d-4cd6-811a-fbbb8d7e018c-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 21:54:59 crc kubenswrapper[4919]: I0310 21:54:59.078005 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-74hb6" event={"ID":"a979e53c-0904-4fc0-9ef4-16706a351785","Type":"ContainerDied","Data":"b33a730df7ddb558fae74044065c6398a2adf5d959ddeb4b1104ee7d1c126c8f"} Mar 10 21:54:59 crc kubenswrapper[4919]: I0310 21:54:59.078016 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-74hb6" Mar 10 21:54:59 crc kubenswrapper[4919]: I0310 21:54:59.078076 4919 scope.go:117] "RemoveContainer" containerID="cecc1c915d7f4c05bbb55b52a7b0867b7bac85ca5fb085c748cb16256afcc645" Mar 10 21:54:59 crc kubenswrapper[4919]: I0310 21:54:59.083757 4919 generic.go:334] "Generic (PLEG): container finished" podID="fb39d6ee-8d9d-4cd6-811a-fbbb8d7e018c" containerID="07b6018309a2b6064229c613542dbbbf15c56d5c0393f34e085acf92affff390" exitCode=0 Mar 10 21:54:59 crc kubenswrapper[4919]: I0310 21:54:59.083796 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bx4lk" event={"ID":"fb39d6ee-8d9d-4cd6-811a-fbbb8d7e018c","Type":"ContainerDied","Data":"07b6018309a2b6064229c613542dbbbf15c56d5c0393f34e085acf92affff390"} Mar 10 21:54:59 crc kubenswrapper[4919]: I0310 21:54:59.083825 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bx4lk" event={"ID":"fb39d6ee-8d9d-4cd6-811a-fbbb8d7e018c","Type":"ContainerDied","Data":"2431ebd0434b16faef3c5e6f004040d9545a75d04c022bbfb7669d4e1432d044"} Mar 10 21:54:59 crc kubenswrapper[4919]: I0310 21:54:59.083884 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-bx4lk" Mar 10 21:54:59 crc kubenswrapper[4919]: I0310 21:54:59.098182 4919 scope.go:117] "RemoveContainer" containerID="da69317c0da665623e9554783c692e0c6676b0d907d7cb7db923464e12dc4fa0" Mar 10 21:54:59 crc kubenswrapper[4919]: I0310 21:54:59.113363 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-74hb6"] Mar 10 21:54:59 crc kubenswrapper[4919]: I0310 21:54:59.116789 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-74hb6"] Mar 10 21:54:59 crc kubenswrapper[4919]: I0310 21:54:59.134519 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-bx4lk"] Mar 10 21:54:59 crc kubenswrapper[4919]: I0310 21:54:59.141039 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-bx4lk"] Mar 10 21:54:59 crc kubenswrapper[4919]: I0310 21:54:59.141654 4919 scope.go:117] "RemoveContainer" containerID="a660cc17c504e1444f4dd457f548fffa6b0b5534922fc54ba3f482c9a3bca649" Mar 10 21:54:59 crc kubenswrapper[4919]: I0310 21:54:59.159214 4919 scope.go:117] "RemoveContainer" containerID="07b6018309a2b6064229c613542dbbbf15c56d5c0393f34e085acf92affff390" Mar 10 21:54:59 crc kubenswrapper[4919]: I0310 21:54:59.174200 4919 scope.go:117] "RemoveContainer" containerID="26bec9d31bb9f2294e6bda808e55542a397e059992c2e0abbcef1a37c96fd652" Mar 10 21:54:59 crc kubenswrapper[4919]: I0310 21:54:59.175326 4919 patch_prober.go:28] interesting pod/machine-config-daemon-z7v4t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 21:54:59 crc kubenswrapper[4919]: I0310 21:54:59.175366 4919 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 21:54:59 crc kubenswrapper[4919]: I0310 21:54:59.175423 4919 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" Mar 10 21:54:59 crc kubenswrapper[4919]: I0310 21:54:59.175920 4919 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9b645dc541f9bef5d9710345252c2ff48e91412f10d1c0c1bfaa06cf9e82210f"} pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 10 21:54:59 crc kubenswrapper[4919]: I0310 21:54:59.175975 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" containerName="machine-config-daemon" containerID="cri-o://9b645dc541f9bef5d9710345252c2ff48e91412f10d1c0c1bfaa06cf9e82210f" gracePeriod=600 Mar 10 21:54:59 crc kubenswrapper[4919]: I0310 21:54:59.230069 4919 scope.go:117] "RemoveContainer" containerID="bd75b3e108084779c92a9028faaa7cb89e4c5173c5911c2ae9f686335fc6e377" Mar 10 21:54:59 crc kubenswrapper[4919]: I0310 21:54:59.241488 4919 scope.go:117] "RemoveContainer" containerID="07b6018309a2b6064229c613542dbbbf15c56d5c0393f34e085acf92affff390" Mar 10 21:54:59 crc kubenswrapper[4919]: E0310 21:54:59.241826 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"07b6018309a2b6064229c613542dbbbf15c56d5c0393f34e085acf92affff390\": container with ID starting with 
07b6018309a2b6064229c613542dbbbf15c56d5c0393f34e085acf92affff390 not found: ID does not exist" containerID="07b6018309a2b6064229c613542dbbbf15c56d5c0393f34e085acf92affff390" Mar 10 21:54:59 crc kubenswrapper[4919]: I0310 21:54:59.241871 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07b6018309a2b6064229c613542dbbbf15c56d5c0393f34e085acf92affff390"} err="failed to get container status \"07b6018309a2b6064229c613542dbbbf15c56d5c0393f34e085acf92affff390\": rpc error: code = NotFound desc = could not find container \"07b6018309a2b6064229c613542dbbbf15c56d5c0393f34e085acf92affff390\": container with ID starting with 07b6018309a2b6064229c613542dbbbf15c56d5c0393f34e085acf92affff390 not found: ID does not exist" Mar 10 21:54:59 crc kubenswrapper[4919]: I0310 21:54:59.241899 4919 scope.go:117] "RemoveContainer" containerID="26bec9d31bb9f2294e6bda808e55542a397e059992c2e0abbcef1a37c96fd652" Mar 10 21:54:59 crc kubenswrapper[4919]: E0310 21:54:59.242136 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"26bec9d31bb9f2294e6bda808e55542a397e059992c2e0abbcef1a37c96fd652\": container with ID starting with 26bec9d31bb9f2294e6bda808e55542a397e059992c2e0abbcef1a37c96fd652 not found: ID does not exist" containerID="26bec9d31bb9f2294e6bda808e55542a397e059992c2e0abbcef1a37c96fd652" Mar 10 21:54:59 crc kubenswrapper[4919]: I0310 21:54:59.242163 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26bec9d31bb9f2294e6bda808e55542a397e059992c2e0abbcef1a37c96fd652"} err="failed to get container status \"26bec9d31bb9f2294e6bda808e55542a397e059992c2e0abbcef1a37c96fd652\": rpc error: code = NotFound desc = could not find container \"26bec9d31bb9f2294e6bda808e55542a397e059992c2e0abbcef1a37c96fd652\": container with ID starting with 26bec9d31bb9f2294e6bda808e55542a397e059992c2e0abbcef1a37c96fd652 not found: ID does not 
exist" Mar 10 21:54:59 crc kubenswrapper[4919]: I0310 21:54:59.242184 4919 scope.go:117] "RemoveContainer" containerID="bd75b3e108084779c92a9028faaa7cb89e4c5173c5911c2ae9f686335fc6e377" Mar 10 21:54:59 crc kubenswrapper[4919]: E0310 21:54:59.242430 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd75b3e108084779c92a9028faaa7cb89e4c5173c5911c2ae9f686335fc6e377\": container with ID starting with bd75b3e108084779c92a9028faaa7cb89e4c5173c5911c2ae9f686335fc6e377 not found: ID does not exist" containerID="bd75b3e108084779c92a9028faaa7cb89e4c5173c5911c2ae9f686335fc6e377" Mar 10 21:54:59 crc kubenswrapper[4919]: I0310 21:54:59.242465 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd75b3e108084779c92a9028faaa7cb89e4c5173c5911c2ae9f686335fc6e377"} err="failed to get container status \"bd75b3e108084779c92a9028faaa7cb89e4c5173c5911c2ae9f686335fc6e377\": rpc error: code = NotFound desc = could not find container \"bd75b3e108084779c92a9028faaa7cb89e4c5173c5911c2ae9f686335fc6e377\": container with ID starting with bd75b3e108084779c92a9028faaa7cb89e4c5173c5911c2ae9f686335fc6e377 not found: ID does not exist" Mar 10 21:54:59 crc kubenswrapper[4919]: I0310 21:54:59.488757 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a979e53c-0904-4fc0-9ef4-16706a351785" path="/var/lib/kubelet/pods/a979e53c-0904-4fc0-9ef4-16706a351785/volumes" Mar 10 21:54:59 crc kubenswrapper[4919]: I0310 21:54:59.490237 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb39d6ee-8d9d-4cd6-811a-fbbb8d7e018c" path="/var/lib/kubelet/pods/fb39d6ee-8d9d-4cd6-811a-fbbb8d7e018c/volumes" Mar 10 21:55:00 crc kubenswrapper[4919]: I0310 21:55:00.093769 4919 generic.go:334] "Generic (PLEG): container finished" podID="566678d1-f416-4116-ab20-b30dceb86cdc" containerID="9b645dc541f9bef5d9710345252c2ff48e91412f10d1c0c1bfaa06cf9e82210f" exitCode=0 
Mar 10 21:55:00 crc kubenswrapper[4919]: I0310 21:55:00.093837 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" event={"ID":"566678d1-f416-4116-ab20-b30dceb86cdc","Type":"ContainerDied","Data":"9b645dc541f9bef5d9710345252c2ff48e91412f10d1c0c1bfaa06cf9e82210f"} Mar 10 21:55:00 crc kubenswrapper[4919]: I0310 21:55:00.094457 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" event={"ID":"566678d1-f416-4116-ab20-b30dceb86cdc","Type":"ContainerStarted","Data":"67ebb482e04a382aaf058b1f3caaeb6cdcd6b9d8d58f43f74fc1f837f6010a5f"} Mar 10 21:55:00 crc kubenswrapper[4919]: I0310 21:55:00.726914 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-4l8gq"] Mar 10 21:55:00 crc kubenswrapper[4919]: I0310 21:55:00.727277 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-4l8gq" podUID="e2db42cd-0c43-41be-a881-199c82f703bd" containerName="registry-server" containerID="cri-o://2b8fda52ce376caf909f426ee3617db20d9b419d6dabc7b1b492e41f452da589" gracePeriod=2 Mar 10 21:55:01 crc kubenswrapper[4919]: I0310 21:55:01.107979 4919 generic.go:334] "Generic (PLEG): container finished" podID="e2db42cd-0c43-41be-a881-199c82f703bd" containerID="2b8fda52ce376caf909f426ee3617db20d9b419d6dabc7b1b492e41f452da589" exitCode=0 Mar 10 21:55:01 crc kubenswrapper[4919]: I0310 21:55:01.108572 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4l8gq" event={"ID":"e2db42cd-0c43-41be-a881-199c82f703bd","Type":"ContainerDied","Data":"2b8fda52ce376caf909f426ee3617db20d9b419d6dabc7b1b492e41f452da589"} Mar 10 21:55:01 crc kubenswrapper[4919]: I0310 21:55:01.783327 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-4l8gq" Mar 10 21:55:01 crc kubenswrapper[4919]: I0310 21:55:01.905797 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wn87z\" (UniqueName: \"kubernetes.io/projected/e2db42cd-0c43-41be-a881-199c82f703bd-kube-api-access-wn87z\") pod \"e2db42cd-0c43-41be-a881-199c82f703bd\" (UID: \"e2db42cd-0c43-41be-a881-199c82f703bd\") " Mar 10 21:55:01 crc kubenswrapper[4919]: I0310 21:55:01.905917 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e2db42cd-0c43-41be-a881-199c82f703bd-catalog-content\") pod \"e2db42cd-0c43-41be-a881-199c82f703bd\" (UID: \"e2db42cd-0c43-41be-a881-199c82f703bd\") " Mar 10 21:55:01 crc kubenswrapper[4919]: I0310 21:55:01.906009 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e2db42cd-0c43-41be-a881-199c82f703bd-utilities\") pod \"e2db42cd-0c43-41be-a881-199c82f703bd\" (UID: \"e2db42cd-0c43-41be-a881-199c82f703bd\") " Mar 10 21:55:01 crc kubenswrapper[4919]: I0310 21:55:01.907642 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e2db42cd-0c43-41be-a881-199c82f703bd-utilities" (OuterVolumeSpecName: "utilities") pod "e2db42cd-0c43-41be-a881-199c82f703bd" (UID: "e2db42cd-0c43-41be-a881-199c82f703bd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 21:55:01 crc kubenswrapper[4919]: I0310 21:55:01.915056 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2db42cd-0c43-41be-a881-199c82f703bd-kube-api-access-wn87z" (OuterVolumeSpecName: "kube-api-access-wn87z") pod "e2db42cd-0c43-41be-a881-199c82f703bd" (UID: "e2db42cd-0c43-41be-a881-199c82f703bd"). InnerVolumeSpecName "kube-api-access-wn87z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 21:55:02 crc kubenswrapper[4919]: I0310 21:55:02.008448 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wn87z\" (UniqueName: \"kubernetes.io/projected/e2db42cd-0c43-41be-a881-199c82f703bd-kube-api-access-wn87z\") on node \"crc\" DevicePath \"\"" Mar 10 21:55:02 crc kubenswrapper[4919]: I0310 21:55:02.008824 4919 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e2db42cd-0c43-41be-a881-199c82f703bd-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 21:55:02 crc kubenswrapper[4919]: I0310 21:55:02.044505 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e2db42cd-0c43-41be-a881-199c82f703bd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e2db42cd-0c43-41be-a881-199c82f703bd" (UID: "e2db42cd-0c43-41be-a881-199c82f703bd"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 21:55:02 crc kubenswrapper[4919]: I0310 21:55:02.109902 4919 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e2db42cd-0c43-41be-a881-199c82f703bd-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 21:55:02 crc kubenswrapper[4919]: I0310 21:55:02.115896 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4l8gq" event={"ID":"e2db42cd-0c43-41be-a881-199c82f703bd","Type":"ContainerDied","Data":"8e130c7ff4578bb6de415aa1ab4f4c9cfac5a99971a54d041724826a529fd399"} Mar 10 21:55:02 crc kubenswrapper[4919]: I0310 21:55:02.115997 4919 scope.go:117] "RemoveContainer" containerID="2b8fda52ce376caf909f426ee3617db20d9b419d6dabc7b1b492e41f452da589" Mar 10 21:55:02 crc kubenswrapper[4919]: I0310 21:55:02.116067 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-4l8gq" Mar 10 21:55:02 crc kubenswrapper[4919]: I0310 21:55:02.141504 4919 scope.go:117] "RemoveContainer" containerID="5a8182ce24c07867f4629d23a7934b9e5bdc6e31ecb798c4cdbd2bb7f394aaa7" Mar 10 21:55:02 crc kubenswrapper[4919]: I0310 21:55:02.166644 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-4l8gq"] Mar 10 21:55:02 crc kubenswrapper[4919]: I0310 21:55:02.171445 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-4l8gq"] Mar 10 21:55:02 crc kubenswrapper[4919]: I0310 21:55:02.190150 4919 scope.go:117] "RemoveContainer" containerID="3959afa3c73c8cd3ff60bc04e866a4d6631d97110554248f6ff8df122b309484" Mar 10 21:55:03 crc kubenswrapper[4919]: I0310 21:55:03.489070 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e2db42cd-0c43-41be-a881-199c82f703bd" path="/var/lib/kubelet/pods/e2db42cd-0c43-41be-a881-199c82f703bd/volumes" Mar 10 21:55:04 crc kubenswrapper[4919]: I0310 21:55:04.727222 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7789f7fd8d-w9gvz"] Mar 10 21:55:04 crc kubenswrapper[4919]: I0310 21:55:04.727475 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-7789f7fd8d-w9gvz" podUID="58d4ac79-fba2-4226-94fd-2a255f79907c" containerName="controller-manager" containerID="cri-o://cba0b9d9a22f94fd14c49643fea816d363de842d4153edea17a6045188b2aa81" gracePeriod=30 Mar 10 21:55:04 crc kubenswrapper[4919]: I0310 21:55:04.735677 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-654c59c45d-m2rmr"] Mar 10 21:55:04 crc kubenswrapper[4919]: I0310 21:55:04.735883 4919 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-route-controller-manager/route-controller-manager-654c59c45d-m2rmr" podUID="22f3b36f-54ca-4eaf-bde1-c6f104b0a500" containerName="route-controller-manager" containerID="cri-o://964e1562e78301ab50d0185926b887f46260604c13cdfff52663268f8e43c298" gracePeriod=30 Mar 10 21:55:05 crc kubenswrapper[4919]: I0310 21:55:05.140725 4919 generic.go:334] "Generic (PLEG): container finished" podID="22f3b36f-54ca-4eaf-bde1-c6f104b0a500" containerID="964e1562e78301ab50d0185926b887f46260604c13cdfff52663268f8e43c298" exitCode=0 Mar 10 21:55:05 crc kubenswrapper[4919]: I0310 21:55:05.140811 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-654c59c45d-m2rmr" event={"ID":"22f3b36f-54ca-4eaf-bde1-c6f104b0a500","Type":"ContainerDied","Data":"964e1562e78301ab50d0185926b887f46260604c13cdfff52663268f8e43c298"} Mar 10 21:55:05 crc kubenswrapper[4919]: I0310 21:55:05.142179 4919 generic.go:334] "Generic (PLEG): container finished" podID="58d4ac79-fba2-4226-94fd-2a255f79907c" containerID="cba0b9d9a22f94fd14c49643fea816d363de842d4153edea17a6045188b2aa81" exitCode=0 Mar 10 21:55:05 crc kubenswrapper[4919]: I0310 21:55:05.142203 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7789f7fd8d-w9gvz" event={"ID":"58d4ac79-fba2-4226-94fd-2a255f79907c","Type":"ContainerDied","Data":"cba0b9d9a22f94fd14c49643fea816d363de842d4153edea17a6045188b2aa81"} Mar 10 21:55:05 crc kubenswrapper[4919]: I0310 21:55:05.243788 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-654c59c45d-m2rmr" Mar 10 21:55:05 crc kubenswrapper[4919]: I0310 21:55:05.252372 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7789f7fd8d-w9gvz" Mar 10 21:55:05 crc kubenswrapper[4919]: I0310 21:55:05.351816 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/22f3b36f-54ca-4eaf-bde1-c6f104b0a500-client-ca\") pod \"22f3b36f-54ca-4eaf-bde1-c6f104b0a500\" (UID: \"22f3b36f-54ca-4eaf-bde1-c6f104b0a500\") " Mar 10 21:55:05 crc kubenswrapper[4919]: I0310 21:55:05.351894 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/22f3b36f-54ca-4eaf-bde1-c6f104b0a500-serving-cert\") pod \"22f3b36f-54ca-4eaf-bde1-c6f104b0a500\" (UID: \"22f3b36f-54ca-4eaf-bde1-c6f104b0a500\") " Mar 10 21:55:05 crc kubenswrapper[4919]: I0310 21:55:05.351941 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22f3b36f-54ca-4eaf-bde1-c6f104b0a500-config\") pod \"22f3b36f-54ca-4eaf-bde1-c6f104b0a500\" (UID: \"22f3b36f-54ca-4eaf-bde1-c6f104b0a500\") " Mar 10 21:55:05 crc kubenswrapper[4919]: I0310 21:55:05.351991 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b9rdz\" (UniqueName: \"kubernetes.io/projected/22f3b36f-54ca-4eaf-bde1-c6f104b0a500-kube-api-access-b9rdz\") pod \"22f3b36f-54ca-4eaf-bde1-c6f104b0a500\" (UID: \"22f3b36f-54ca-4eaf-bde1-c6f104b0a500\") " Mar 10 21:55:05 crc kubenswrapper[4919]: I0310 21:55:05.352941 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22f3b36f-54ca-4eaf-bde1-c6f104b0a500-config" (OuterVolumeSpecName: "config") pod "22f3b36f-54ca-4eaf-bde1-c6f104b0a500" (UID: "22f3b36f-54ca-4eaf-bde1-c6f104b0a500"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 21:55:05 crc kubenswrapper[4919]: I0310 21:55:05.353004 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22f3b36f-54ca-4eaf-bde1-c6f104b0a500-client-ca" (OuterVolumeSpecName: "client-ca") pod "22f3b36f-54ca-4eaf-bde1-c6f104b0a500" (UID: "22f3b36f-54ca-4eaf-bde1-c6f104b0a500"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 21:55:05 crc kubenswrapper[4919]: I0310 21:55:05.356839 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22f3b36f-54ca-4eaf-bde1-c6f104b0a500-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "22f3b36f-54ca-4eaf-bde1-c6f104b0a500" (UID: "22f3b36f-54ca-4eaf-bde1-c6f104b0a500"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 21:55:05 crc kubenswrapper[4919]: I0310 21:55:05.357039 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22f3b36f-54ca-4eaf-bde1-c6f104b0a500-kube-api-access-b9rdz" (OuterVolumeSpecName: "kube-api-access-b9rdz") pod "22f3b36f-54ca-4eaf-bde1-c6f104b0a500" (UID: "22f3b36f-54ca-4eaf-bde1-c6f104b0a500"). InnerVolumeSpecName "kube-api-access-b9rdz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 21:55:05 crc kubenswrapper[4919]: I0310 21:55:05.454169 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/58d4ac79-fba2-4226-94fd-2a255f79907c-config\") pod \"58d4ac79-fba2-4226-94fd-2a255f79907c\" (UID: \"58d4ac79-fba2-4226-94fd-2a255f79907c\") " Mar 10 21:55:05 crc kubenswrapper[4919]: I0310 21:55:05.454322 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/58d4ac79-fba2-4226-94fd-2a255f79907c-proxy-ca-bundles\") pod \"58d4ac79-fba2-4226-94fd-2a255f79907c\" (UID: \"58d4ac79-fba2-4226-94fd-2a255f79907c\") " Mar 10 21:55:05 crc kubenswrapper[4919]: I0310 21:55:05.454463 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/58d4ac79-fba2-4226-94fd-2a255f79907c-client-ca\") pod \"58d4ac79-fba2-4226-94fd-2a255f79907c\" (UID: \"58d4ac79-fba2-4226-94fd-2a255f79907c\") " Mar 10 21:55:05 crc kubenswrapper[4919]: I0310 21:55:05.454500 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/58d4ac79-fba2-4226-94fd-2a255f79907c-serving-cert\") pod \"58d4ac79-fba2-4226-94fd-2a255f79907c\" (UID: \"58d4ac79-fba2-4226-94fd-2a255f79907c\") " Mar 10 21:55:05 crc kubenswrapper[4919]: I0310 21:55:05.454563 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ktjt4\" (UniqueName: \"kubernetes.io/projected/58d4ac79-fba2-4226-94fd-2a255f79907c-kube-api-access-ktjt4\") pod \"58d4ac79-fba2-4226-94fd-2a255f79907c\" (UID: \"58d4ac79-fba2-4226-94fd-2a255f79907c\") " Mar 10 21:55:05 crc kubenswrapper[4919]: I0310 21:55:05.454995 4919 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/22f3b36f-54ca-4eaf-bde1-c6f104b0a500-client-ca\") on node \"crc\" DevicePath \"\"" Mar 10 21:55:05 crc kubenswrapper[4919]: I0310 21:55:05.455019 4919 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/22f3b36f-54ca-4eaf-bde1-c6f104b0a500-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 21:55:05 crc kubenswrapper[4919]: I0310 21:55:05.455036 4919 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22f3b36f-54ca-4eaf-bde1-c6f104b0a500-config\") on node \"crc\" DevicePath \"\"" Mar 10 21:55:05 crc kubenswrapper[4919]: I0310 21:55:05.455054 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b9rdz\" (UniqueName: \"kubernetes.io/projected/22f3b36f-54ca-4eaf-bde1-c6f104b0a500-kube-api-access-b9rdz\") on node \"crc\" DevicePath \"\"" Mar 10 21:55:05 crc kubenswrapper[4919]: I0310 21:55:05.455303 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/58d4ac79-fba2-4226-94fd-2a255f79907c-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "58d4ac79-fba2-4226-94fd-2a255f79907c" (UID: "58d4ac79-fba2-4226-94fd-2a255f79907c"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 21:55:05 crc kubenswrapper[4919]: I0310 21:55:05.455456 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/58d4ac79-fba2-4226-94fd-2a255f79907c-client-ca" (OuterVolumeSpecName: "client-ca") pod "58d4ac79-fba2-4226-94fd-2a255f79907c" (UID: "58d4ac79-fba2-4226-94fd-2a255f79907c"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 21:55:05 crc kubenswrapper[4919]: I0310 21:55:05.455569 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/58d4ac79-fba2-4226-94fd-2a255f79907c-config" (OuterVolumeSpecName: "config") pod "58d4ac79-fba2-4226-94fd-2a255f79907c" (UID: "58d4ac79-fba2-4226-94fd-2a255f79907c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 21:55:05 crc kubenswrapper[4919]: I0310 21:55:05.458363 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58d4ac79-fba2-4226-94fd-2a255f79907c-kube-api-access-ktjt4" (OuterVolumeSpecName: "kube-api-access-ktjt4") pod "58d4ac79-fba2-4226-94fd-2a255f79907c" (UID: "58d4ac79-fba2-4226-94fd-2a255f79907c"). InnerVolumeSpecName "kube-api-access-ktjt4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 21:55:05 crc kubenswrapper[4919]: I0310 21:55:05.460027 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58d4ac79-fba2-4226-94fd-2a255f79907c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "58d4ac79-fba2-4226-94fd-2a255f79907c" (UID: "58d4ac79-fba2-4226-94fd-2a255f79907c"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 21:55:05 crc kubenswrapper[4919]: I0310 21:55:05.556559 4919 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/58d4ac79-fba2-4226-94fd-2a255f79907c-client-ca\") on node \"crc\" DevicePath \"\"" Mar 10 21:55:05 crc kubenswrapper[4919]: I0310 21:55:05.556596 4919 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/58d4ac79-fba2-4226-94fd-2a255f79907c-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 21:55:05 crc kubenswrapper[4919]: I0310 21:55:05.556606 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ktjt4\" (UniqueName: \"kubernetes.io/projected/58d4ac79-fba2-4226-94fd-2a255f79907c-kube-api-access-ktjt4\") on node \"crc\" DevicePath \"\"" Mar 10 21:55:05 crc kubenswrapper[4919]: I0310 21:55:05.556616 4919 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/58d4ac79-fba2-4226-94fd-2a255f79907c-config\") on node \"crc\" DevicePath \"\"" Mar 10 21:55:05 crc kubenswrapper[4919]: I0310 21:55:05.556625 4919 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/58d4ac79-fba2-4226-94fd-2a255f79907c-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 10 21:55:05 crc kubenswrapper[4919]: I0310 21:55:05.849573 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-688cc6dd75-6whhm"] Mar 10 21:55:05 crc kubenswrapper[4919]: E0310 21:55:05.849885 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22f3b36f-54ca-4eaf-bde1-c6f104b0a500" containerName="route-controller-manager" Mar 10 21:55:05 crc kubenswrapper[4919]: I0310 21:55:05.849897 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="22f3b36f-54ca-4eaf-bde1-c6f104b0a500" containerName="route-controller-manager" Mar 10 21:55:05 crc 
kubenswrapper[4919]: E0310 21:55:05.849905 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a979e53c-0904-4fc0-9ef4-16706a351785" containerName="registry-server" Mar 10 21:55:05 crc kubenswrapper[4919]: I0310 21:55:05.849911 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="a979e53c-0904-4fc0-9ef4-16706a351785" containerName="registry-server" Mar 10 21:55:05 crc kubenswrapper[4919]: E0310 21:55:05.849925 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a979e53c-0904-4fc0-9ef4-16706a351785" containerName="extract-content" Mar 10 21:55:05 crc kubenswrapper[4919]: I0310 21:55:05.849931 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="a979e53c-0904-4fc0-9ef4-16706a351785" containerName="extract-content" Mar 10 21:55:05 crc kubenswrapper[4919]: E0310 21:55:05.849944 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2db42cd-0c43-41be-a881-199c82f703bd" containerName="extract-utilities" Mar 10 21:55:05 crc kubenswrapper[4919]: I0310 21:55:05.849950 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2db42cd-0c43-41be-a881-199c82f703bd" containerName="extract-utilities" Mar 10 21:55:05 crc kubenswrapper[4919]: E0310 21:55:05.849958 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a979e53c-0904-4fc0-9ef4-16706a351785" containerName="extract-utilities" Mar 10 21:55:05 crc kubenswrapper[4919]: I0310 21:55:05.849964 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="a979e53c-0904-4fc0-9ef4-16706a351785" containerName="extract-utilities" Mar 10 21:55:05 crc kubenswrapper[4919]: E0310 21:55:05.849971 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c5f7639-6abe-4578-81f0-17691f1ad5ef" containerName="extract-content" Mar 10 21:55:05 crc kubenswrapper[4919]: I0310 21:55:05.849977 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c5f7639-6abe-4578-81f0-17691f1ad5ef" containerName="extract-content" Mar 10 21:55:05 crc 
kubenswrapper[4919]: E0310 21:55:05.849988 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2db42cd-0c43-41be-a881-199c82f703bd" containerName="registry-server" Mar 10 21:55:05 crc kubenswrapper[4919]: I0310 21:55:05.849994 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2db42cd-0c43-41be-a881-199c82f703bd" containerName="registry-server" Mar 10 21:55:05 crc kubenswrapper[4919]: E0310 21:55:05.850001 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb39d6ee-8d9d-4cd6-811a-fbbb8d7e018c" containerName="registry-server" Mar 10 21:55:05 crc kubenswrapper[4919]: I0310 21:55:05.850006 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb39d6ee-8d9d-4cd6-811a-fbbb8d7e018c" containerName="registry-server" Mar 10 21:55:05 crc kubenswrapper[4919]: E0310 21:55:05.850014 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c5f7639-6abe-4578-81f0-17691f1ad5ef" containerName="registry-server" Mar 10 21:55:05 crc kubenswrapper[4919]: I0310 21:55:05.850020 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c5f7639-6abe-4578-81f0-17691f1ad5ef" containerName="registry-server" Mar 10 21:55:05 crc kubenswrapper[4919]: E0310 21:55:05.850027 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58d4ac79-fba2-4226-94fd-2a255f79907c" containerName="controller-manager" Mar 10 21:55:05 crc kubenswrapper[4919]: I0310 21:55:05.850032 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="58d4ac79-fba2-4226-94fd-2a255f79907c" containerName="controller-manager" Mar 10 21:55:05 crc kubenswrapper[4919]: E0310 21:55:05.850042 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb39d6ee-8d9d-4cd6-811a-fbbb8d7e018c" containerName="extract-content" Mar 10 21:55:05 crc kubenswrapper[4919]: I0310 21:55:05.850047 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb39d6ee-8d9d-4cd6-811a-fbbb8d7e018c" containerName="extract-content" Mar 10 21:55:05 crc 
kubenswrapper[4919]: E0310 21:55:05.850057 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb39d6ee-8d9d-4cd6-811a-fbbb8d7e018c" containerName="extract-utilities" Mar 10 21:55:05 crc kubenswrapper[4919]: I0310 21:55:05.850062 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb39d6ee-8d9d-4cd6-811a-fbbb8d7e018c" containerName="extract-utilities" Mar 10 21:55:05 crc kubenswrapper[4919]: E0310 21:55:05.850070 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c5f7639-6abe-4578-81f0-17691f1ad5ef" containerName="extract-utilities" Mar 10 21:55:05 crc kubenswrapper[4919]: I0310 21:55:05.850075 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c5f7639-6abe-4578-81f0-17691f1ad5ef" containerName="extract-utilities" Mar 10 21:55:05 crc kubenswrapper[4919]: E0310 21:55:05.850085 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2db42cd-0c43-41be-a881-199c82f703bd" containerName="extract-content" Mar 10 21:55:05 crc kubenswrapper[4919]: I0310 21:55:05.850090 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2db42cd-0c43-41be-a881-199c82f703bd" containerName="extract-content" Mar 10 21:55:05 crc kubenswrapper[4919]: I0310 21:55:05.850170 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c5f7639-6abe-4578-81f0-17691f1ad5ef" containerName="registry-server" Mar 10 21:55:05 crc kubenswrapper[4919]: I0310 21:55:05.850182 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="58d4ac79-fba2-4226-94fd-2a255f79907c" containerName="controller-manager" Mar 10 21:55:05 crc kubenswrapper[4919]: I0310 21:55:05.850190 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="22f3b36f-54ca-4eaf-bde1-c6f104b0a500" containerName="route-controller-manager" Mar 10 21:55:05 crc kubenswrapper[4919]: I0310 21:55:05.850201 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb39d6ee-8d9d-4cd6-811a-fbbb8d7e018c" 
containerName="registry-server" Mar 10 21:55:05 crc kubenswrapper[4919]: I0310 21:55:05.850211 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2db42cd-0c43-41be-a881-199c82f703bd" containerName="registry-server" Mar 10 21:55:05 crc kubenswrapper[4919]: I0310 21:55:05.850222 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="a979e53c-0904-4fc0-9ef4-16706a351785" containerName="registry-server" Mar 10 21:55:05 crc kubenswrapper[4919]: I0310 21:55:05.850628 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-688cc6dd75-6whhm" Mar 10 21:55:05 crc kubenswrapper[4919]: I0310 21:55:05.851579 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7686d69b8c-dqlzg"] Mar 10 21:55:05 crc kubenswrapper[4919]: I0310 21:55:05.852523 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7686d69b8c-dqlzg" Mar 10 21:55:05 crc kubenswrapper[4919]: I0310 21:55:05.861027 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-688cc6dd75-6whhm"] Mar 10 21:55:05 crc kubenswrapper[4919]: I0310 21:55:05.905852 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7686d69b8c-dqlzg"] Mar 10 21:55:05 crc kubenswrapper[4919]: I0310 21:55:05.962853 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dbbd09f7-cbd2-4f40-8f39-0c5c8478d5a7-serving-cert\") pod \"controller-manager-688cc6dd75-6whhm\" (UID: \"dbbd09f7-cbd2-4f40-8f39-0c5c8478d5a7\") " pod="openshift-controller-manager/controller-manager-688cc6dd75-6whhm" Mar 10 21:55:05 crc kubenswrapper[4919]: I0310 21:55:05.962902 4919 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2b4f0884-7217-47f3-9b14-8a26e4bd53f5-client-ca\") pod \"route-controller-manager-7686d69b8c-dqlzg\" (UID: \"2b4f0884-7217-47f3-9b14-8a26e4bd53f5\") " pod="openshift-route-controller-manager/route-controller-manager-7686d69b8c-dqlzg" Mar 10 21:55:05 crc kubenswrapper[4919]: I0310 21:55:05.963092 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/dbbd09f7-cbd2-4f40-8f39-0c5c8478d5a7-client-ca\") pod \"controller-manager-688cc6dd75-6whhm\" (UID: \"dbbd09f7-cbd2-4f40-8f39-0c5c8478d5a7\") " pod="openshift-controller-manager/controller-manager-688cc6dd75-6whhm" Mar 10 21:55:05 crc kubenswrapper[4919]: I0310 21:55:05.963189 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hf7qd\" (UniqueName: \"kubernetes.io/projected/2b4f0884-7217-47f3-9b14-8a26e4bd53f5-kube-api-access-hf7qd\") pod \"route-controller-manager-7686d69b8c-dqlzg\" (UID: \"2b4f0884-7217-47f3-9b14-8a26e4bd53f5\") " pod="openshift-route-controller-manager/route-controller-manager-7686d69b8c-dqlzg" Mar 10 21:55:05 crc kubenswrapper[4919]: I0310 21:55:05.963309 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/dbbd09f7-cbd2-4f40-8f39-0c5c8478d5a7-proxy-ca-bundles\") pod \"controller-manager-688cc6dd75-6whhm\" (UID: \"dbbd09f7-cbd2-4f40-8f39-0c5c8478d5a7\") " pod="openshift-controller-manager/controller-manager-688cc6dd75-6whhm" Mar 10 21:55:05 crc kubenswrapper[4919]: I0310 21:55:05.963364 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2b4f0884-7217-47f3-9b14-8a26e4bd53f5-serving-cert\") pod 
\"route-controller-manager-7686d69b8c-dqlzg\" (UID: \"2b4f0884-7217-47f3-9b14-8a26e4bd53f5\") " pod="openshift-route-controller-manager/route-controller-manager-7686d69b8c-dqlzg" Mar 10 21:55:05 crc kubenswrapper[4919]: I0310 21:55:05.963482 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g7f4k\" (UniqueName: \"kubernetes.io/projected/dbbd09f7-cbd2-4f40-8f39-0c5c8478d5a7-kube-api-access-g7f4k\") pod \"controller-manager-688cc6dd75-6whhm\" (UID: \"dbbd09f7-cbd2-4f40-8f39-0c5c8478d5a7\") " pod="openshift-controller-manager/controller-manager-688cc6dd75-6whhm" Mar 10 21:55:05 crc kubenswrapper[4919]: I0310 21:55:05.963533 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dbbd09f7-cbd2-4f40-8f39-0c5c8478d5a7-config\") pod \"controller-manager-688cc6dd75-6whhm\" (UID: \"dbbd09f7-cbd2-4f40-8f39-0c5c8478d5a7\") " pod="openshift-controller-manager/controller-manager-688cc6dd75-6whhm" Mar 10 21:55:05 crc kubenswrapper[4919]: I0310 21:55:05.963564 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b4f0884-7217-47f3-9b14-8a26e4bd53f5-config\") pod \"route-controller-manager-7686d69b8c-dqlzg\" (UID: \"2b4f0884-7217-47f3-9b14-8a26e4bd53f5\") " pod="openshift-route-controller-manager/route-controller-manager-7686d69b8c-dqlzg" Mar 10 21:55:06 crc kubenswrapper[4919]: I0310 21:55:06.065281 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/dbbd09f7-cbd2-4f40-8f39-0c5c8478d5a7-client-ca\") pod \"controller-manager-688cc6dd75-6whhm\" (UID: \"dbbd09f7-cbd2-4f40-8f39-0c5c8478d5a7\") " pod="openshift-controller-manager/controller-manager-688cc6dd75-6whhm" Mar 10 21:55:06 crc kubenswrapper[4919]: I0310 21:55:06.065367 4919 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hf7qd\" (UniqueName: \"kubernetes.io/projected/2b4f0884-7217-47f3-9b14-8a26e4bd53f5-kube-api-access-hf7qd\") pod \"route-controller-manager-7686d69b8c-dqlzg\" (UID: \"2b4f0884-7217-47f3-9b14-8a26e4bd53f5\") " pod="openshift-route-controller-manager/route-controller-manager-7686d69b8c-dqlzg" Mar 10 21:55:06 crc kubenswrapper[4919]: I0310 21:55:06.065489 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/dbbd09f7-cbd2-4f40-8f39-0c5c8478d5a7-proxy-ca-bundles\") pod \"controller-manager-688cc6dd75-6whhm\" (UID: \"dbbd09f7-cbd2-4f40-8f39-0c5c8478d5a7\") " pod="openshift-controller-manager/controller-manager-688cc6dd75-6whhm" Mar 10 21:55:06 crc kubenswrapper[4919]: I0310 21:55:06.065531 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2b4f0884-7217-47f3-9b14-8a26e4bd53f5-serving-cert\") pod \"route-controller-manager-7686d69b8c-dqlzg\" (UID: \"2b4f0884-7217-47f3-9b14-8a26e4bd53f5\") " pod="openshift-route-controller-manager/route-controller-manager-7686d69b8c-dqlzg" Mar 10 21:55:06 crc kubenswrapper[4919]: I0310 21:55:06.065586 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g7f4k\" (UniqueName: \"kubernetes.io/projected/dbbd09f7-cbd2-4f40-8f39-0c5c8478d5a7-kube-api-access-g7f4k\") pod \"controller-manager-688cc6dd75-6whhm\" (UID: \"dbbd09f7-cbd2-4f40-8f39-0c5c8478d5a7\") " pod="openshift-controller-manager/controller-manager-688cc6dd75-6whhm" Mar 10 21:55:06 crc kubenswrapper[4919]: I0310 21:55:06.065625 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dbbd09f7-cbd2-4f40-8f39-0c5c8478d5a7-config\") pod \"controller-manager-688cc6dd75-6whhm\" (UID: 
\"dbbd09f7-cbd2-4f40-8f39-0c5c8478d5a7\") " pod="openshift-controller-manager/controller-manager-688cc6dd75-6whhm" Mar 10 21:55:06 crc kubenswrapper[4919]: I0310 21:55:06.066220 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b4f0884-7217-47f3-9b14-8a26e4bd53f5-config\") pod \"route-controller-manager-7686d69b8c-dqlzg\" (UID: \"2b4f0884-7217-47f3-9b14-8a26e4bd53f5\") " pod="openshift-route-controller-manager/route-controller-manager-7686d69b8c-dqlzg" Mar 10 21:55:06 crc kubenswrapper[4919]: I0310 21:55:06.066510 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dbbd09f7-cbd2-4f40-8f39-0c5c8478d5a7-serving-cert\") pod \"controller-manager-688cc6dd75-6whhm\" (UID: \"dbbd09f7-cbd2-4f40-8f39-0c5c8478d5a7\") " pod="openshift-controller-manager/controller-manager-688cc6dd75-6whhm" Mar 10 21:55:06 crc kubenswrapper[4919]: I0310 21:55:06.066581 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2b4f0884-7217-47f3-9b14-8a26e4bd53f5-client-ca\") pod \"route-controller-manager-7686d69b8c-dqlzg\" (UID: \"2b4f0884-7217-47f3-9b14-8a26e4bd53f5\") " pod="openshift-route-controller-manager/route-controller-manager-7686d69b8c-dqlzg" Mar 10 21:55:06 crc kubenswrapper[4919]: I0310 21:55:06.066693 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/dbbd09f7-cbd2-4f40-8f39-0c5c8478d5a7-client-ca\") pod \"controller-manager-688cc6dd75-6whhm\" (UID: \"dbbd09f7-cbd2-4f40-8f39-0c5c8478d5a7\") " pod="openshift-controller-manager/controller-manager-688cc6dd75-6whhm" Mar 10 21:55:06 crc kubenswrapper[4919]: I0310 21:55:06.067088 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/dbbd09f7-cbd2-4f40-8f39-0c5c8478d5a7-config\") pod \"controller-manager-688cc6dd75-6whhm\" (UID: \"dbbd09f7-cbd2-4f40-8f39-0c5c8478d5a7\") " pod="openshift-controller-manager/controller-manager-688cc6dd75-6whhm" Mar 10 21:55:06 crc kubenswrapper[4919]: I0310 21:55:06.068038 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b4f0884-7217-47f3-9b14-8a26e4bd53f5-config\") pod \"route-controller-manager-7686d69b8c-dqlzg\" (UID: \"2b4f0884-7217-47f3-9b14-8a26e4bd53f5\") " pod="openshift-route-controller-manager/route-controller-manager-7686d69b8c-dqlzg" Mar 10 21:55:06 crc kubenswrapper[4919]: I0310 21:55:06.068074 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/dbbd09f7-cbd2-4f40-8f39-0c5c8478d5a7-proxy-ca-bundles\") pod \"controller-manager-688cc6dd75-6whhm\" (UID: \"dbbd09f7-cbd2-4f40-8f39-0c5c8478d5a7\") " pod="openshift-controller-manager/controller-manager-688cc6dd75-6whhm" Mar 10 21:55:06 crc kubenswrapper[4919]: I0310 21:55:06.068441 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2b4f0884-7217-47f3-9b14-8a26e4bd53f5-client-ca\") pod \"route-controller-manager-7686d69b8c-dqlzg\" (UID: \"2b4f0884-7217-47f3-9b14-8a26e4bd53f5\") " pod="openshift-route-controller-manager/route-controller-manager-7686d69b8c-dqlzg" Mar 10 21:55:06 crc kubenswrapper[4919]: I0310 21:55:06.070331 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dbbd09f7-cbd2-4f40-8f39-0c5c8478d5a7-serving-cert\") pod \"controller-manager-688cc6dd75-6whhm\" (UID: \"dbbd09f7-cbd2-4f40-8f39-0c5c8478d5a7\") " pod="openshift-controller-manager/controller-manager-688cc6dd75-6whhm" Mar 10 21:55:06 crc kubenswrapper[4919]: I0310 21:55:06.073548 4919 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2b4f0884-7217-47f3-9b14-8a26e4bd53f5-serving-cert\") pod \"route-controller-manager-7686d69b8c-dqlzg\" (UID: \"2b4f0884-7217-47f3-9b14-8a26e4bd53f5\") " pod="openshift-route-controller-manager/route-controller-manager-7686d69b8c-dqlzg" Mar 10 21:55:06 crc kubenswrapper[4919]: I0310 21:55:06.083122 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hf7qd\" (UniqueName: \"kubernetes.io/projected/2b4f0884-7217-47f3-9b14-8a26e4bd53f5-kube-api-access-hf7qd\") pod \"route-controller-manager-7686d69b8c-dqlzg\" (UID: \"2b4f0884-7217-47f3-9b14-8a26e4bd53f5\") " pod="openshift-route-controller-manager/route-controller-manager-7686d69b8c-dqlzg" Mar 10 21:55:06 crc kubenswrapper[4919]: I0310 21:55:06.083458 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g7f4k\" (UniqueName: \"kubernetes.io/projected/dbbd09f7-cbd2-4f40-8f39-0c5c8478d5a7-kube-api-access-g7f4k\") pod \"controller-manager-688cc6dd75-6whhm\" (UID: \"dbbd09f7-cbd2-4f40-8f39-0c5c8478d5a7\") " pod="openshift-controller-manager/controller-manager-688cc6dd75-6whhm" Mar 10 21:55:06 crc kubenswrapper[4919]: I0310 21:55:06.151617 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7789f7fd8d-w9gvz" event={"ID":"58d4ac79-fba2-4226-94fd-2a255f79907c","Type":"ContainerDied","Data":"d080163e986b34cd835b37dcbb7126850b5b2cd75f9bdb117722f7c1719baaee"} Mar 10 21:55:06 crc kubenswrapper[4919]: I0310 21:55:06.151671 4919 scope.go:117] "RemoveContainer" containerID="cba0b9d9a22f94fd14c49643fea816d363de842d4153edea17a6045188b2aa81" Mar 10 21:55:06 crc kubenswrapper[4919]: I0310 21:55:06.151789 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7789f7fd8d-w9gvz" Mar 10 21:55:06 crc kubenswrapper[4919]: I0310 21:55:06.157052 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-654c59c45d-m2rmr" event={"ID":"22f3b36f-54ca-4eaf-bde1-c6f104b0a500","Type":"ContainerDied","Data":"02cf89a9896b087d6a45ff51be59c4426d3ff5506c4fd826fa80ef5e396bce75"} Mar 10 21:55:06 crc kubenswrapper[4919]: I0310 21:55:06.157155 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-654c59c45d-m2rmr" Mar 10 21:55:06 crc kubenswrapper[4919]: I0310 21:55:06.176408 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7789f7fd8d-w9gvz"] Mar 10 21:55:06 crc kubenswrapper[4919]: I0310 21:55:06.180554 4919 scope.go:117] "RemoveContainer" containerID="964e1562e78301ab50d0185926b887f46260604c13cdfff52663268f8e43c298" Mar 10 21:55:06 crc kubenswrapper[4919]: I0310 21:55:06.186948 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-7789f7fd8d-w9gvz"] Mar 10 21:55:06 crc kubenswrapper[4919]: I0310 21:55:06.190725 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-654c59c45d-m2rmr"] Mar 10 21:55:06 crc kubenswrapper[4919]: I0310 21:55:06.193278 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-654c59c45d-m2rmr"] Mar 10 21:55:06 crc kubenswrapper[4919]: I0310 21:55:06.216021 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-688cc6dd75-6whhm" Mar 10 21:55:06 crc kubenswrapper[4919]: I0310 21:55:06.222550 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7686d69b8c-dqlzg" Mar 10 21:55:06 crc kubenswrapper[4919]: I0310 21:55:06.510277 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-m9qd4"] Mar 10 21:55:06 crc kubenswrapper[4919]: I0310 21:55:06.617115 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7686d69b8c-dqlzg"] Mar 10 21:55:06 crc kubenswrapper[4919]: W0310 21:55:06.625696 4919 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2b4f0884_7217_47f3_9b14_8a26e4bd53f5.slice/crio-2f97b4d5912715e8fb5b92cb05994ca8a9099862f9dc3feba130ee5f9ce56b72 WatchSource:0}: Error finding container 2f97b4d5912715e8fb5b92cb05994ca8a9099862f9dc3feba130ee5f9ce56b72: Status 404 returned error can't find the container with id 2f97b4d5912715e8fb5b92cb05994ca8a9099862f9dc3feba130ee5f9ce56b72 Mar 10 21:55:06 crc kubenswrapper[4919]: I0310 21:55:06.666201 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-688cc6dd75-6whhm"] Mar 10 21:55:06 crc kubenswrapper[4919]: W0310 21:55:06.669530 4919 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddbbd09f7_cbd2_4f40_8f39_0c5c8478d5a7.slice/crio-ba33755af4c4d2688908387d678a2fdc6c32a7dfff9f150c02858521e0a87c89 WatchSource:0}: Error finding container ba33755af4c4d2688908387d678a2fdc6c32a7dfff9f150c02858521e0a87c89: Status 404 returned error can't find the container with id ba33755af4c4d2688908387d678a2fdc6c32a7dfff9f150c02858521e0a87c89 Mar 10 21:55:07 crc kubenswrapper[4919]: I0310 21:55:07.163354 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-688cc6dd75-6whhm" 
event={"ID":"dbbd09f7-cbd2-4f40-8f39-0c5c8478d5a7","Type":"ContainerStarted","Data":"edc57210af5b3c0490d15f232a74495979fd1ce7fcd483c89f588ee7480aa745"} Mar 10 21:55:07 crc kubenswrapper[4919]: I0310 21:55:07.163641 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-688cc6dd75-6whhm" event={"ID":"dbbd09f7-cbd2-4f40-8f39-0c5c8478d5a7","Type":"ContainerStarted","Data":"ba33755af4c4d2688908387d678a2fdc6c32a7dfff9f150c02858521e0a87c89"} Mar 10 21:55:07 crc kubenswrapper[4919]: I0310 21:55:07.163662 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-688cc6dd75-6whhm" Mar 10 21:55:07 crc kubenswrapper[4919]: I0310 21:55:07.166536 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7686d69b8c-dqlzg" event={"ID":"2b4f0884-7217-47f3-9b14-8a26e4bd53f5","Type":"ContainerStarted","Data":"ac4240c2f725ae17de6ebf437d9db69694876fb23d7aff4b52c1c0190f576add"} Mar 10 21:55:07 crc kubenswrapper[4919]: I0310 21:55:07.166565 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7686d69b8c-dqlzg" event={"ID":"2b4f0884-7217-47f3-9b14-8a26e4bd53f5","Type":"ContainerStarted","Data":"2f97b4d5912715e8fb5b92cb05994ca8a9099862f9dc3feba130ee5f9ce56b72"} Mar 10 21:55:07 crc kubenswrapper[4919]: I0310 21:55:07.166723 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-7686d69b8c-dqlzg" Mar 10 21:55:07 crc kubenswrapper[4919]: I0310 21:55:07.168373 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-688cc6dd75-6whhm" Mar 10 21:55:07 crc kubenswrapper[4919]: I0310 21:55:07.174131 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-route-controller-manager/route-controller-manager-7686d69b8c-dqlzg" Mar 10 21:55:07 crc kubenswrapper[4919]: I0310 21:55:07.186866 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-688cc6dd75-6whhm" podStartSLOduration=3.186848916 podStartE2EDuration="3.186848916s" podCreationTimestamp="2026-03-10 21:55:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 21:55:07.186804965 +0000 UTC m=+294.428685583" watchObservedRunningTime="2026-03-10 21:55:07.186848916 +0000 UTC m=+294.428729544" Mar 10 21:55:07 crc kubenswrapper[4919]: I0310 21:55:07.202963 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-7686d69b8c-dqlzg" podStartSLOduration=3.20294347 podStartE2EDuration="3.20294347s" podCreationTimestamp="2026-03-10 21:55:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 21:55:07.200647492 +0000 UTC m=+294.442528100" watchObservedRunningTime="2026-03-10 21:55:07.20294347 +0000 UTC m=+294.444824078" Mar 10 21:55:07 crc kubenswrapper[4919]: I0310 21:55:07.491822 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22f3b36f-54ca-4eaf-bde1-c6f104b0a500" path="/var/lib/kubelet/pods/22f3b36f-54ca-4eaf-bde1-c6f104b0a500/volumes" Mar 10 21:55:07 crc kubenswrapper[4919]: I0310 21:55:07.492930 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="58d4ac79-fba2-4226-94fd-2a255f79907c" path="/var/lib/kubelet/pods/58d4ac79-fba2-4226-94fd-2a255f79907c/volumes" Mar 10 21:55:24 crc kubenswrapper[4919]: I0310 21:55:24.747268 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-688cc6dd75-6whhm"] Mar 10 21:55:24 crc 
kubenswrapper[4919]: I0310 21:55:24.748194 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-688cc6dd75-6whhm" podUID="dbbd09f7-cbd2-4f40-8f39-0c5c8478d5a7" containerName="controller-manager" containerID="cri-o://edc57210af5b3c0490d15f232a74495979fd1ce7fcd483c89f588ee7480aa745" gracePeriod=30 Mar 10 21:55:24 crc kubenswrapper[4919]: I0310 21:55:24.839875 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7686d69b8c-dqlzg"] Mar 10 21:55:24 crc kubenswrapper[4919]: I0310 21:55:24.840124 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-7686d69b8c-dqlzg" podUID="2b4f0884-7217-47f3-9b14-8a26e4bd53f5" containerName="route-controller-manager" containerID="cri-o://ac4240c2f725ae17de6ebf437d9db69694876fb23d7aff4b52c1c0190f576add" gracePeriod=30 Mar 10 21:55:25 crc kubenswrapper[4919]: I0310 21:55:25.288061 4919 generic.go:334] "Generic (PLEG): container finished" podID="2b4f0884-7217-47f3-9b14-8a26e4bd53f5" containerID="ac4240c2f725ae17de6ebf437d9db69694876fb23d7aff4b52c1c0190f576add" exitCode=0 Mar 10 21:55:25 crc kubenswrapper[4919]: I0310 21:55:25.288185 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7686d69b8c-dqlzg" event={"ID":"2b4f0884-7217-47f3-9b14-8a26e4bd53f5","Type":"ContainerDied","Data":"ac4240c2f725ae17de6ebf437d9db69694876fb23d7aff4b52c1c0190f576add"} Mar 10 21:55:25 crc kubenswrapper[4919]: I0310 21:55:25.290126 4919 generic.go:334] "Generic (PLEG): container finished" podID="dbbd09f7-cbd2-4f40-8f39-0c5c8478d5a7" containerID="edc57210af5b3c0490d15f232a74495979fd1ce7fcd483c89f588ee7480aa745" exitCode=0 Mar 10 21:55:25 crc kubenswrapper[4919]: I0310 21:55:25.290170 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-controller-manager/controller-manager-688cc6dd75-6whhm" event={"ID":"dbbd09f7-cbd2-4f40-8f39-0c5c8478d5a7","Type":"ContainerDied","Data":"edc57210af5b3c0490d15f232a74495979fd1ce7fcd483c89f588ee7480aa745"} Mar 10 21:55:25 crc kubenswrapper[4919]: I0310 21:55:25.346873 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7686d69b8c-dqlzg" Mar 10 21:55:25 crc kubenswrapper[4919]: I0310 21:55:25.352664 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-688cc6dd75-6whhm" Mar 10 21:55:25 crc kubenswrapper[4919]: I0310 21:55:25.512444 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/dbbd09f7-cbd2-4f40-8f39-0c5c8478d5a7-client-ca\") pod \"dbbd09f7-cbd2-4f40-8f39-0c5c8478d5a7\" (UID: \"dbbd09f7-cbd2-4f40-8f39-0c5c8478d5a7\") " Mar 10 21:55:25 crc kubenswrapper[4919]: I0310 21:55:25.512484 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hf7qd\" (UniqueName: \"kubernetes.io/projected/2b4f0884-7217-47f3-9b14-8a26e4bd53f5-kube-api-access-hf7qd\") pod \"2b4f0884-7217-47f3-9b14-8a26e4bd53f5\" (UID: \"2b4f0884-7217-47f3-9b14-8a26e4bd53f5\") " Mar 10 21:55:25 crc kubenswrapper[4919]: I0310 21:55:25.512542 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dbbd09f7-cbd2-4f40-8f39-0c5c8478d5a7-config\") pod \"dbbd09f7-cbd2-4f40-8f39-0c5c8478d5a7\" (UID: \"dbbd09f7-cbd2-4f40-8f39-0c5c8478d5a7\") " Mar 10 21:55:25 crc kubenswrapper[4919]: I0310 21:55:25.512581 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dbbd09f7-cbd2-4f40-8f39-0c5c8478d5a7-serving-cert\") pod 
\"dbbd09f7-cbd2-4f40-8f39-0c5c8478d5a7\" (UID: \"dbbd09f7-cbd2-4f40-8f39-0c5c8478d5a7\") " Mar 10 21:55:25 crc kubenswrapper[4919]: I0310 21:55:25.512686 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2b4f0884-7217-47f3-9b14-8a26e4bd53f5-serving-cert\") pod \"2b4f0884-7217-47f3-9b14-8a26e4bd53f5\" (UID: \"2b4f0884-7217-47f3-9b14-8a26e4bd53f5\") " Mar 10 21:55:25 crc kubenswrapper[4919]: I0310 21:55:25.513081 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b4f0884-7217-47f3-9b14-8a26e4bd53f5-config\") pod \"2b4f0884-7217-47f3-9b14-8a26e4bd53f5\" (UID: \"2b4f0884-7217-47f3-9b14-8a26e4bd53f5\") " Mar 10 21:55:25 crc kubenswrapper[4919]: I0310 21:55:25.513175 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2b4f0884-7217-47f3-9b14-8a26e4bd53f5-client-ca\") pod \"2b4f0884-7217-47f3-9b14-8a26e4bd53f5\" (UID: \"2b4f0884-7217-47f3-9b14-8a26e4bd53f5\") " Mar 10 21:55:25 crc kubenswrapper[4919]: I0310 21:55:25.513238 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g7f4k\" (UniqueName: \"kubernetes.io/projected/dbbd09f7-cbd2-4f40-8f39-0c5c8478d5a7-kube-api-access-g7f4k\") pod \"dbbd09f7-cbd2-4f40-8f39-0c5c8478d5a7\" (UID: \"dbbd09f7-cbd2-4f40-8f39-0c5c8478d5a7\") " Mar 10 21:55:25 crc kubenswrapper[4919]: I0310 21:55:25.513276 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/dbbd09f7-cbd2-4f40-8f39-0c5c8478d5a7-proxy-ca-bundles\") pod \"dbbd09f7-cbd2-4f40-8f39-0c5c8478d5a7\" (UID: \"dbbd09f7-cbd2-4f40-8f39-0c5c8478d5a7\") " Mar 10 21:55:25 crc kubenswrapper[4919]: I0310 21:55:25.513450 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/dbbd09f7-cbd2-4f40-8f39-0c5c8478d5a7-client-ca" (OuterVolumeSpecName: "client-ca") pod "dbbd09f7-cbd2-4f40-8f39-0c5c8478d5a7" (UID: "dbbd09f7-cbd2-4f40-8f39-0c5c8478d5a7"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 21:55:25 crc kubenswrapper[4919]: I0310 21:55:25.513880 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b4f0884-7217-47f3-9b14-8a26e4bd53f5-client-ca" (OuterVolumeSpecName: "client-ca") pod "2b4f0884-7217-47f3-9b14-8a26e4bd53f5" (UID: "2b4f0884-7217-47f3-9b14-8a26e4bd53f5"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 21:55:25 crc kubenswrapper[4919]: I0310 21:55:25.514152 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dbbd09f7-cbd2-4f40-8f39-0c5c8478d5a7-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "dbbd09f7-cbd2-4f40-8f39-0c5c8478d5a7" (UID: "dbbd09f7-cbd2-4f40-8f39-0c5c8478d5a7"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 21:55:25 crc kubenswrapper[4919]: I0310 21:55:25.514172 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b4f0884-7217-47f3-9b14-8a26e4bd53f5-config" (OuterVolumeSpecName: "config") pod "2b4f0884-7217-47f3-9b14-8a26e4bd53f5" (UID: "2b4f0884-7217-47f3-9b14-8a26e4bd53f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 21:55:25 crc kubenswrapper[4919]: I0310 21:55:25.514649 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dbbd09f7-cbd2-4f40-8f39-0c5c8478d5a7-config" (OuterVolumeSpecName: "config") pod "dbbd09f7-cbd2-4f40-8f39-0c5c8478d5a7" (UID: "dbbd09f7-cbd2-4f40-8f39-0c5c8478d5a7"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 21:55:25 crc kubenswrapper[4919]: I0310 21:55:25.517628 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dbbd09f7-cbd2-4f40-8f39-0c5c8478d5a7-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "dbbd09f7-cbd2-4f40-8f39-0c5c8478d5a7" (UID: "dbbd09f7-cbd2-4f40-8f39-0c5c8478d5a7"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 21:55:25 crc kubenswrapper[4919]: I0310 21:55:25.517763 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b4f0884-7217-47f3-9b14-8a26e4bd53f5-kube-api-access-hf7qd" (OuterVolumeSpecName: "kube-api-access-hf7qd") pod "2b4f0884-7217-47f3-9b14-8a26e4bd53f5" (UID: "2b4f0884-7217-47f3-9b14-8a26e4bd53f5"). InnerVolumeSpecName "kube-api-access-hf7qd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 21:55:25 crc kubenswrapper[4919]: I0310 21:55:25.517864 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dbbd09f7-cbd2-4f40-8f39-0c5c8478d5a7-kube-api-access-g7f4k" (OuterVolumeSpecName: "kube-api-access-g7f4k") pod "dbbd09f7-cbd2-4f40-8f39-0c5c8478d5a7" (UID: "dbbd09f7-cbd2-4f40-8f39-0c5c8478d5a7"). InnerVolumeSpecName "kube-api-access-g7f4k". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 21:55:25 crc kubenswrapper[4919]: I0310 21:55:25.518654 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b4f0884-7217-47f3-9b14-8a26e4bd53f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "2b4f0884-7217-47f3-9b14-8a26e4bd53f5" (UID: "2b4f0884-7217-47f3-9b14-8a26e4bd53f5"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 21:55:25 crc kubenswrapper[4919]: I0310 21:55:25.584973 4919 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 10 21:55:25 crc kubenswrapper[4919]: E0310 21:55:25.585242 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbbd09f7-cbd2-4f40-8f39-0c5c8478d5a7" containerName="controller-manager" Mar 10 21:55:25 crc kubenswrapper[4919]: I0310 21:55:25.585263 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbbd09f7-cbd2-4f40-8f39-0c5c8478d5a7" containerName="controller-manager" Mar 10 21:55:25 crc kubenswrapper[4919]: E0310 21:55:25.585277 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b4f0884-7217-47f3-9b14-8a26e4bd53f5" containerName="route-controller-manager" Mar 10 21:55:25 crc kubenswrapper[4919]: I0310 21:55:25.585286 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b4f0884-7217-47f3-9b14-8a26e4bd53f5" containerName="route-controller-manager" Mar 10 21:55:25 crc kubenswrapper[4919]: I0310 21:55:25.585429 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="dbbd09f7-cbd2-4f40-8f39-0c5c8478d5a7" containerName="controller-manager" Mar 10 21:55:25 crc kubenswrapper[4919]: I0310 21:55:25.585455 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b4f0884-7217-47f3-9b14-8a26e4bd53f5" containerName="route-controller-manager" Mar 10 21:55:25 crc kubenswrapper[4919]: I0310 21:55:25.586585 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 21:55:25 crc kubenswrapper[4919]: I0310 21:55:25.587516 4919 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 10 21:55:25 crc kubenswrapper[4919]: I0310 21:55:25.587853 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://d9e6a8efa1e2d16b45fe6362b326e3f89333864dc74f3b298d2e500a90d303b3" gracePeriod=15 Mar 10 21:55:25 crc kubenswrapper[4919]: I0310 21:55:25.588022 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://ce192a4f3e94d00998fbfe0948a32765574a9261d22004480dfb54b9bbf9407a" gracePeriod=15 Mar 10 21:55:25 crc kubenswrapper[4919]: I0310 21:55:25.588066 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://dfb03c5f450790952fc7173bc2a6d723c777921f5f74963bfdbc3573ec1d21cd" gracePeriod=15 Mar 10 21:55:25 crc kubenswrapper[4919]: I0310 21:55:25.588056 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://37d8507fd02b92972ed41aa2c4d53fceb1c9d58864e46ddc7991f94fb4d9b3e0" gracePeriod=15 Mar 10 21:55:25 crc kubenswrapper[4919]: I0310 21:55:25.588137 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver-insecure-readyz" containerID="cri-o://47a772db349df6c0c6fe27be93d19e02d66cfaf9739ee12e89730ece1da11473" gracePeriod=15 Mar 10 21:55:25 crc kubenswrapper[4919]: I0310 21:55:25.588471 4919 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 10 21:55:25 crc kubenswrapper[4919]: E0310 21:55:25.588676 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 10 21:55:25 crc kubenswrapper[4919]: I0310 21:55:25.588690 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 10 21:55:25 crc kubenswrapper[4919]: E0310 21:55:25.588698 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 10 21:55:25 crc kubenswrapper[4919]: I0310 21:55:25.588705 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 10 21:55:25 crc kubenswrapper[4919]: E0310 21:55:25.588719 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 10 21:55:25 crc kubenswrapper[4919]: I0310 21:55:25.588724 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 10 21:55:25 crc kubenswrapper[4919]: E0310 21:55:25.588732 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 10 21:55:25 crc kubenswrapper[4919]: I0310 21:55:25.588739 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 10 21:55:25 crc kubenswrapper[4919]: E0310 21:55:25.588746 4919 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 10 21:55:25 crc kubenswrapper[4919]: I0310 21:55:25.588752 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 10 21:55:25 crc kubenswrapper[4919]: E0310 21:55:25.588760 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 10 21:55:25 crc kubenswrapper[4919]: I0310 21:55:25.588765 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 10 21:55:25 crc kubenswrapper[4919]: E0310 21:55:25.588774 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 10 21:55:25 crc kubenswrapper[4919]: I0310 21:55:25.588780 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 10 21:55:25 crc kubenswrapper[4919]: E0310 21:55:25.588788 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 10 21:55:25 crc kubenswrapper[4919]: I0310 21:55:25.588793 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 10 21:55:25 crc kubenswrapper[4919]: I0310 21:55:25.588875 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 10 21:55:25 crc kubenswrapper[4919]: I0310 21:55:25.588885 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver-check-endpoints" Mar 10 21:55:25 crc kubenswrapper[4919]: I0310 21:55:25.588897 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 10 21:55:25 crc kubenswrapper[4919]: I0310 21:55:25.588904 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 10 21:55:25 crc kubenswrapper[4919]: I0310 21:55:25.588913 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 10 21:55:25 crc kubenswrapper[4919]: I0310 21:55:25.588920 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 10 21:55:25 crc kubenswrapper[4919]: I0310 21:55:25.588926 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 10 21:55:25 crc kubenswrapper[4919]: E0310 21:55:25.589016 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 10 21:55:25 crc kubenswrapper[4919]: I0310 21:55:25.589025 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 10 21:55:25 crc kubenswrapper[4919]: E0310 21:55:25.589032 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 10 21:55:25 crc kubenswrapper[4919]: I0310 21:55:25.589038 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 10 21:55:25 crc kubenswrapper[4919]: I0310 
21:55:25.589124 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 10 21:55:25 crc kubenswrapper[4919]: I0310 21:55:25.589135 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 10 21:55:25 crc kubenswrapper[4919]: I0310 21:55:25.618900 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 21:55:25 crc kubenswrapper[4919]: I0310 21:55:25.618929 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 21:55:25 crc kubenswrapper[4919]: I0310 21:55:25.618955 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 21:55:25 crc kubenswrapper[4919]: I0310 21:55:25.618978 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 
21:55:25 crc kubenswrapper[4919]: I0310 21:55:25.619032 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 21:55:25 crc kubenswrapper[4919]: I0310 21:55:25.619054 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 21:55:25 crc kubenswrapper[4919]: I0310 21:55:25.619078 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 21:55:25 crc kubenswrapper[4919]: I0310 21:55:25.619107 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 21:55:25 crc kubenswrapper[4919]: I0310 21:55:25.619140 4919 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2b4f0884-7217-47f3-9b14-8a26e4bd53f5-client-ca\") on node \"crc\" DevicePath \"\"" Mar 10 21:55:25 crc kubenswrapper[4919]: I0310 21:55:25.619151 4919 reconciler_common.go:293] 
"Volume detached for volume \"kube-api-access-g7f4k\" (UniqueName: \"kubernetes.io/projected/dbbd09f7-cbd2-4f40-8f39-0c5c8478d5a7-kube-api-access-g7f4k\") on node \"crc\" DevicePath \"\"" Mar 10 21:55:25 crc kubenswrapper[4919]: I0310 21:55:25.619162 4919 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/dbbd09f7-cbd2-4f40-8f39-0c5c8478d5a7-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 10 21:55:25 crc kubenswrapper[4919]: I0310 21:55:25.619171 4919 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/dbbd09f7-cbd2-4f40-8f39-0c5c8478d5a7-client-ca\") on node \"crc\" DevicePath \"\"" Mar 10 21:55:25 crc kubenswrapper[4919]: I0310 21:55:25.619180 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hf7qd\" (UniqueName: \"kubernetes.io/projected/2b4f0884-7217-47f3-9b14-8a26e4bd53f5-kube-api-access-hf7qd\") on node \"crc\" DevicePath \"\"" Mar 10 21:55:25 crc kubenswrapper[4919]: I0310 21:55:25.619189 4919 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dbbd09f7-cbd2-4f40-8f39-0c5c8478d5a7-config\") on node \"crc\" DevicePath \"\"" Mar 10 21:55:25 crc kubenswrapper[4919]: I0310 21:55:25.619197 4919 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dbbd09f7-cbd2-4f40-8f39-0c5c8478d5a7-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 21:55:25 crc kubenswrapper[4919]: I0310 21:55:25.619206 4919 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2b4f0884-7217-47f3-9b14-8a26e4bd53f5-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 21:55:25 crc kubenswrapper[4919]: I0310 21:55:25.619214 4919 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b4f0884-7217-47f3-9b14-8a26e4bd53f5-config\") on 
node \"crc\" DevicePath \"\"" Mar 10 21:55:25 crc kubenswrapper[4919]: E0310 21:55:25.643353 4919 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.80:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 21:55:25 crc kubenswrapper[4919]: I0310 21:55:25.728587 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 21:55:25 crc kubenswrapper[4919]: I0310 21:55:25.728646 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 21:55:25 crc kubenswrapper[4919]: I0310 21:55:25.728685 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 21:55:25 crc kubenswrapper[4919]: I0310 21:55:25.728705 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 21:55:25 crc kubenswrapper[4919]: I0310 21:55:25.728733 4919 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 21:55:25 crc kubenswrapper[4919]: I0310 21:55:25.728759 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 21:55:25 crc kubenswrapper[4919]: I0310 21:55:25.728806 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 21:55:25 crc kubenswrapper[4919]: I0310 21:55:25.728827 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 21:55:25 crc kubenswrapper[4919]: I0310 21:55:25.728877 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 21:55:25 crc kubenswrapper[4919]: I0310 21:55:25.729075 4919 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 21:55:25 crc kubenswrapper[4919]: I0310 21:55:25.729105 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 21:55:25 crc kubenswrapper[4919]: I0310 21:55:25.729134 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 21:55:25 crc kubenswrapper[4919]: I0310 21:55:25.729158 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 21:55:25 crc kubenswrapper[4919]: I0310 21:55:25.729180 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 21:55:25 crc kubenswrapper[4919]: I0310 21:55:25.729199 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 21:55:25 crc kubenswrapper[4919]: I0310 21:55:25.729219 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 21:55:25 crc kubenswrapper[4919]: E0310 21:55:25.753567 4919 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-conmon-ce192a4f3e94d00998fbfe0948a32765574a9261d22004480dfb54b9bbf9407a.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-conmon-47a772db349df6c0c6fe27be93d19e02d66cfaf9739ee12e89730ece1da11473.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-47a772db349df6c0c6fe27be93d19e02d66cfaf9739ee12e89730ece1da11473.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-37d8507fd02b92972ed41aa2c4d53fceb1c9d58864e46ddc7991f94fb4d9b3e0.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-poddf1d5909_9408_4ac0_a762_5298b301ae59.slice/crio-bd1de06a2a06203a5013c9e513925456cdfb0378e2f5fd7e5515cc0ababe9a26.scope\": RecentStats: unable to find data in memory cache]" Mar 10 21:55:25 crc kubenswrapper[4919]: I0310 21:55:25.944105 
4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 21:55:25 crc kubenswrapper[4919]: W0310 21:55:25.962073 4919 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-28450a27abf191278d5d2f5f63862a4d6625a3bd57f59e659c70794ad6836886 WatchSource:0}: Error finding container 28450a27abf191278d5d2f5f63862a4d6625a3bd57f59e659c70794ad6836886: Status 404 returned error can't find the container with id 28450a27abf191278d5d2f5f63862a4d6625a3bd57f59e659c70794ad6836886 Mar 10 21:55:25 crc kubenswrapper[4919]: E0310 21:55:25.964637 4919 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.80:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189b9990e9005645 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 21:55:25.963875909 +0000 UTC m=+313.205756517,LastTimestamp:2026-03-10 21:55:25.963875909 +0000 UTC m=+313.205756517,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 21:55:26 crc kubenswrapper[4919]: I0310 21:55:26.296197 4919 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-controller-manager/controller-manager-688cc6dd75-6whhm" Mar 10 21:55:26 crc kubenswrapper[4919]: I0310 21:55:26.296380 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-688cc6dd75-6whhm" event={"ID":"dbbd09f7-cbd2-4f40-8f39-0c5c8478d5a7","Type":"ContainerDied","Data":"ba33755af4c4d2688908387d678a2fdc6c32a7dfff9f150c02858521e0a87c89"} Mar 10 21:55:26 crc kubenswrapper[4919]: I0310 21:55:26.296692 4919 scope.go:117] "RemoveContainer" containerID="edc57210af5b3c0490d15f232a74495979fd1ce7fcd483c89f588ee7480aa745" Mar 10 21:55:26 crc kubenswrapper[4919]: I0310 21:55:26.297443 4919 status_manager.go:851] "Failed to get status for pod" podUID="dbbd09f7-cbd2-4f40-8f39-0c5c8478d5a7" pod="openshift-controller-manager/controller-manager-688cc6dd75-6whhm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-688cc6dd75-6whhm\": dial tcp 38.102.83.80:6443: connect: connection refused" Mar 10 21:55:26 crc kubenswrapper[4919]: I0310 21:55:26.297633 4919 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.80:6443: connect: connection refused" Mar 10 21:55:26 crc kubenswrapper[4919]: I0310 21:55:26.298527 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"24b584a28a2186d65e043c0ab4a204fc2032084ba8d89ab938057f2d0219986c"} Mar 10 21:55:26 crc kubenswrapper[4919]: I0310 21:55:26.298553 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" 
event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"28450a27abf191278d5d2f5f63862a4d6625a3bd57f59e659c70794ad6836886"} Mar 10 21:55:26 crc kubenswrapper[4919]: I0310 21:55:26.299218 4919 status_manager.go:851] "Failed to get status for pod" podUID="dbbd09f7-cbd2-4f40-8f39-0c5c8478d5a7" pod="openshift-controller-manager/controller-manager-688cc6dd75-6whhm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-688cc6dd75-6whhm\": dial tcp 38.102.83.80:6443: connect: connection refused" Mar 10 21:55:26 crc kubenswrapper[4919]: E0310 21:55:26.299307 4919 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.80:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 21:55:26 crc kubenswrapper[4919]: I0310 21:55:26.299430 4919 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.80:6443: connect: connection refused" Mar 10 21:55:26 crc kubenswrapper[4919]: I0310 21:55:26.301522 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7686d69b8c-dqlzg" event={"ID":"2b4f0884-7217-47f3-9b14-8a26e4bd53f5","Type":"ContainerDied","Data":"2f97b4d5912715e8fb5b92cb05994ca8a9099862f9dc3feba130ee5f9ce56b72"} Mar 10 21:55:26 crc kubenswrapper[4919]: I0310 21:55:26.301592 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7686d69b8c-dqlzg"
Mar 10 21:55:26 crc kubenswrapper[4919]: I0310 21:55:26.302451 4919 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.80:6443: connect: connection refused"
Mar 10 21:55:26 crc kubenswrapper[4919]: I0310 21:55:26.302804 4919 status_manager.go:851] "Failed to get status for pod" podUID="2b4f0884-7217-47f3-9b14-8a26e4bd53f5" pod="openshift-route-controller-manager/route-controller-manager-7686d69b8c-dqlzg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-7686d69b8c-dqlzg\": dial tcp 38.102.83.80:6443: connect: connection refused"
Mar 10 21:55:26 crc kubenswrapper[4919]: I0310 21:55:26.303061 4919 status_manager.go:851] "Failed to get status for pod" podUID="dbbd09f7-cbd2-4f40-8f39-0c5c8478d5a7" pod="openshift-controller-manager/controller-manager-688cc6dd75-6whhm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-688cc6dd75-6whhm\": dial tcp 38.102.83.80:6443: connect: connection refused"
Mar 10 21:55:26 crc kubenswrapper[4919]: I0310 21:55:26.305421 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log"
Mar 10 21:55:26 crc kubenswrapper[4919]: I0310 21:55:26.306682 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log"
Mar 10 21:55:26 crc kubenswrapper[4919]: I0310 21:55:26.307648 4919 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="ce192a4f3e94d00998fbfe0948a32765574a9261d22004480dfb54b9bbf9407a" exitCode=0
Mar 10 21:55:26 crc kubenswrapper[4919]: I0310 21:55:26.307666 4919 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="47a772db349df6c0c6fe27be93d19e02d66cfaf9739ee12e89730ece1da11473" exitCode=0
Mar 10 21:55:26 crc kubenswrapper[4919]: I0310 21:55:26.307674 4919 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="37d8507fd02b92972ed41aa2c4d53fceb1c9d58864e46ddc7991f94fb4d9b3e0" exitCode=0
Mar 10 21:55:26 crc kubenswrapper[4919]: I0310 21:55:26.307681 4919 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="dfb03c5f450790952fc7173bc2a6d723c777921f5f74963bfdbc3573ec1d21cd" exitCode=2
Mar 10 21:55:26 crc kubenswrapper[4919]: I0310 21:55:26.308941 4919 generic.go:334] "Generic (PLEG): container finished" podID="df1d5909-9408-4ac0-a762-5298b301ae59" containerID="bd1de06a2a06203a5013c9e513925456cdfb0378e2f5fd7e5515cc0ababe9a26" exitCode=0
Mar 10 21:55:26 crc kubenswrapper[4919]: I0310 21:55:26.308965 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"df1d5909-9408-4ac0-a762-5298b301ae59","Type":"ContainerDied","Data":"bd1de06a2a06203a5013c9e513925456cdfb0378e2f5fd7e5515cc0ababe9a26"}
Mar 10 21:55:26 crc kubenswrapper[4919]: I0310 21:55:26.309263 4919 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.80:6443: connect: connection refused"
Mar 10 21:55:26 crc kubenswrapper[4919]: I0310 21:55:26.309478 4919 status_manager.go:851] "Failed to get status for pod" podUID="df1d5909-9408-4ac0-a762-5298b301ae59" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.80:6443: connect: connection refused"
Mar 10 21:55:26 crc kubenswrapper[4919]: I0310 21:55:26.309758 4919 status_manager.go:851] "Failed to get status for pod" podUID="2b4f0884-7217-47f3-9b14-8a26e4bd53f5" pod="openshift-route-controller-manager/route-controller-manager-7686d69b8c-dqlzg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-7686d69b8c-dqlzg\": dial tcp 38.102.83.80:6443: connect: connection refused"
Mar 10 21:55:26 crc kubenswrapper[4919]: I0310 21:55:26.309941 4919 status_manager.go:851] "Failed to get status for pod" podUID="dbbd09f7-cbd2-4f40-8f39-0c5c8478d5a7" pod="openshift-controller-manager/controller-manager-688cc6dd75-6whhm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-688cc6dd75-6whhm\": dial tcp 38.102.83.80:6443: connect: connection refused"
Mar 10 21:55:26 crc kubenswrapper[4919]: I0310 21:55:26.312790 4919 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.80:6443: connect: connection refused"
Mar 10 21:55:26 crc kubenswrapper[4919]: I0310 21:55:26.313163 4919 status_manager.go:851] "Failed to get status for pod" podUID="df1d5909-9408-4ac0-a762-5298b301ae59" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.80:6443: connect: connection refused"
Mar 10 21:55:26 crc kubenswrapper[4919]: I0310 21:55:26.313448 4919 status_manager.go:851] "Failed to get status for pod" podUID="2b4f0884-7217-47f3-9b14-8a26e4bd53f5" pod="openshift-route-controller-manager/route-controller-manager-7686d69b8c-dqlzg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-7686d69b8c-dqlzg\": dial tcp 38.102.83.80:6443: connect: connection refused"
Mar 10 21:55:26 crc kubenswrapper[4919]: I0310 21:55:26.313641 4919 scope.go:117] "RemoveContainer" containerID="ac4240c2f725ae17de6ebf437d9db69694876fb23d7aff4b52c1c0190f576add"
Mar 10 21:55:26 crc kubenswrapper[4919]: I0310 21:55:26.313720 4919 status_manager.go:851] "Failed to get status for pod" podUID="dbbd09f7-cbd2-4f40-8f39-0c5c8478d5a7" pod="openshift-controller-manager/controller-manager-688cc6dd75-6whhm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-688cc6dd75-6whhm\": dial tcp 38.102.83.80:6443: connect: connection refused"
Mar 10 21:55:26 crc kubenswrapper[4919]: I0310 21:55:26.315779 4919 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.80:6443: connect: connection refused"
Mar 10 21:55:26 crc kubenswrapper[4919]: I0310 21:55:26.316043 4919 status_manager.go:851] "Failed to get status for pod" podUID="df1d5909-9408-4ac0-a762-5298b301ae59" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.80:6443: connect: connection refused"
Mar 10 21:55:26 crc kubenswrapper[4919]: I0310 21:55:26.316296 4919 status_manager.go:851] "Failed to get status for pod" podUID="2b4f0884-7217-47f3-9b14-8a26e4bd53f5" pod="openshift-route-controller-manager/route-controller-manager-7686d69b8c-dqlzg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-7686d69b8c-dqlzg\": dial tcp 38.102.83.80:6443: connect: connection refused"
Mar 10 21:55:26 crc kubenswrapper[4919]: I0310 21:55:26.316568 4919 status_manager.go:851] "Failed to get status for pod" podUID="dbbd09f7-cbd2-4f40-8f39-0c5c8478d5a7" pod="openshift-controller-manager/controller-manager-688cc6dd75-6whhm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-688cc6dd75-6whhm\": dial tcp 38.102.83.80:6443: connect: connection refused"
Mar 10 21:55:26 crc kubenswrapper[4919]: I0310 21:55:26.327292 4919 scope.go:117] "RemoveContainer" containerID="5b2adcae0bf01d646b97b915a28921ad4151ca62e8bdd174b42b5e3dff4b27db"
Mar 10 21:55:26 crc kubenswrapper[4919]: E0310 21:55:26.729979 4919 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:55:26Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:55:26Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:55:26Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T21:55:26Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Patch \"https://api-int.crc.testing:6443/api/v1/nodes/crc/status?timeout=10s\": dial tcp 38.102.83.80:6443: connect: connection refused"
Mar 10 21:55:26 crc kubenswrapper[4919]: E0310 21:55:26.730460 4919 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.80:6443: connect: connection refused"
Mar 10 21:55:26 crc kubenswrapper[4919]: E0310 21:55:26.730741 4919 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.80:6443: connect: connection refused"
Mar 10 21:55:26 crc kubenswrapper[4919]: E0310 21:55:26.730974 4919 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.80:6443: connect: connection refused"
Mar 10 21:55:26 crc kubenswrapper[4919]: E0310 21:55:26.731217 4919 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.80:6443: connect: connection refused"
Mar 10 21:55:26 crc kubenswrapper[4919]: E0310 21:55:26.731243 4919 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count"
Mar 10 21:55:27 crc kubenswrapper[4919]: I0310 21:55:27.318530 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log"
Mar 10 21:55:27 crc kubenswrapper[4919]: I0310 21:55:27.592017 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Mar 10 21:55:27 crc kubenswrapper[4919]: I0310 21:55:27.592706 4919 status_manager.go:851] "Failed to get status for pod" podUID="df1d5909-9408-4ac0-a762-5298b301ae59" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.80:6443: connect: connection refused"
Mar 10 21:55:27 crc kubenswrapper[4919]: I0310 21:55:27.593244 4919 status_manager.go:851] "Failed to get status for pod" podUID="2b4f0884-7217-47f3-9b14-8a26e4bd53f5" pod="openshift-route-controller-manager/route-controller-manager-7686d69b8c-dqlzg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-7686d69b8c-dqlzg\": dial tcp 38.102.83.80:6443: connect: connection refused"
Mar 10 21:55:27 crc kubenswrapper[4919]: I0310 21:55:27.593643 4919 status_manager.go:851] "Failed to get status for pod" podUID="dbbd09f7-cbd2-4f40-8f39-0c5c8478d5a7" pod="openshift-controller-manager/controller-manager-688cc6dd75-6whhm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-688cc6dd75-6whhm\": dial tcp 38.102.83.80:6443: connect: connection refused"
Mar 10 21:55:27 crc kubenswrapper[4919]: I0310 21:55:27.761641 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/df1d5909-9408-4ac0-a762-5298b301ae59-kube-api-access\") pod \"df1d5909-9408-4ac0-a762-5298b301ae59\" (UID: \"df1d5909-9408-4ac0-a762-5298b301ae59\") "
Mar 10 21:55:27 crc kubenswrapper[4919]: I0310 21:55:27.761974 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/df1d5909-9408-4ac0-a762-5298b301ae59-kubelet-dir\") pod \"df1d5909-9408-4ac0-a762-5298b301ae59\" (UID: \"df1d5909-9408-4ac0-a762-5298b301ae59\") "
Mar 10 21:55:27 crc kubenswrapper[4919]: I0310 21:55:27.762024 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/df1d5909-9408-4ac0-a762-5298b301ae59-var-lock\") pod \"df1d5909-9408-4ac0-a762-5298b301ae59\" (UID: \"df1d5909-9408-4ac0-a762-5298b301ae59\") "
Mar 10 21:55:27 crc kubenswrapper[4919]: I0310 21:55:27.762234 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/df1d5909-9408-4ac0-a762-5298b301ae59-var-lock" (OuterVolumeSpecName: "var-lock") pod "df1d5909-9408-4ac0-a762-5298b301ae59" (UID: "df1d5909-9408-4ac0-a762-5298b301ae59"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 10 21:55:27 crc kubenswrapper[4919]: I0310 21:55:27.762261 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/df1d5909-9408-4ac0-a762-5298b301ae59-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "df1d5909-9408-4ac0-a762-5298b301ae59" (UID: "df1d5909-9408-4ac0-a762-5298b301ae59"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 10 21:55:27 crc kubenswrapper[4919]: I0310 21:55:27.766581 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df1d5909-9408-4ac0-a762-5298b301ae59-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "df1d5909-9408-4ac0-a762-5298b301ae59" (UID: "df1d5909-9408-4ac0-a762-5298b301ae59"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 21:55:27 crc kubenswrapper[4919]: I0310 21:55:27.862983 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/df1d5909-9408-4ac0-a762-5298b301ae59-kube-api-access\") on node \"crc\" DevicePath \"\""
Mar 10 21:55:27 crc kubenswrapper[4919]: I0310 21:55:27.863176 4919 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/df1d5909-9408-4ac0-a762-5298b301ae59-kubelet-dir\") on node \"crc\" DevicePath \"\""
Mar 10 21:55:27 crc kubenswrapper[4919]: I0310 21:55:27.863236 4919 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/df1d5909-9408-4ac0-a762-5298b301ae59-var-lock\") on node \"crc\" DevicePath \"\""
Mar 10 21:55:27 crc kubenswrapper[4919]: I0310 21:55:27.886386 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log"
Mar 10 21:55:27 crc kubenswrapper[4919]: I0310 21:55:27.887269 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 10 21:55:27 crc kubenswrapper[4919]: I0310 21:55:27.887857 4919 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.80:6443: connect: connection refused"
Mar 10 21:55:27 crc kubenswrapper[4919]: I0310 21:55:27.888324 4919 status_manager.go:851] "Failed to get status for pod" podUID="df1d5909-9408-4ac0-a762-5298b301ae59" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.80:6443: connect: connection refused"
Mar 10 21:55:27 crc kubenswrapper[4919]: I0310 21:55:27.888604 4919 status_manager.go:851] "Failed to get status for pod" podUID="2b4f0884-7217-47f3-9b14-8a26e4bd53f5" pod="openshift-route-controller-manager/route-controller-manager-7686d69b8c-dqlzg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-7686d69b8c-dqlzg\": dial tcp 38.102.83.80:6443: connect: connection refused"
Mar 10 21:55:27 crc kubenswrapper[4919]: I0310 21:55:27.888874 4919 status_manager.go:851] "Failed to get status for pod" podUID="dbbd09f7-cbd2-4f40-8f39-0c5c8478d5a7" pod="openshift-controller-manager/controller-manager-688cc6dd75-6whhm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-688cc6dd75-6whhm\": dial tcp 38.102.83.80:6443: connect: connection refused"
Mar 10 21:55:27 crc kubenswrapper[4919]: I0310 21:55:27.964052 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") "
Mar 10 21:55:27 crc kubenswrapper[4919]: I0310 21:55:27.964125 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") "
Mar 10 21:55:27 crc kubenswrapper[4919]: I0310 21:55:27.964202 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") "
Mar 10 21:55:27 crc kubenswrapper[4919]: I0310 21:55:27.964446 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 10 21:55:27 crc kubenswrapper[4919]: I0310 21:55:27.964517 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 10 21:55:27 crc kubenswrapper[4919]: I0310 21:55:27.964544 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 10 21:55:27 crc kubenswrapper[4919]: I0310 21:55:27.964806 4919 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\""
Mar 10 21:55:27 crc kubenswrapper[4919]: I0310 21:55:27.964829 4919 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\""
Mar 10 21:55:27 crc kubenswrapper[4919]: I0310 21:55:27.964843 4919 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\""
Mar 10 21:55:28 crc kubenswrapper[4919]: I0310 21:55:28.332126 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log"
Mar 10 21:55:28 crc kubenswrapper[4919]: I0310 21:55:28.333470 4919 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="d9e6a8efa1e2d16b45fe6362b326e3f89333864dc74f3b298d2e500a90d303b3" exitCode=0
Mar 10 21:55:28 crc kubenswrapper[4919]: I0310 21:55:28.333609 4919 scope.go:117] "RemoveContainer" containerID="ce192a4f3e94d00998fbfe0948a32765574a9261d22004480dfb54b9bbf9407a"
Mar 10 21:55:28 crc kubenswrapper[4919]: I0310 21:55:28.333632 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 10 21:55:28 crc kubenswrapper[4919]: I0310 21:55:28.337009 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"df1d5909-9408-4ac0-a762-5298b301ae59","Type":"ContainerDied","Data":"d61e923efa60cc26c36f6f8896ffa2d064505f57043c37b69061f529c5a0bc7b"}
Mar 10 21:55:28 crc kubenswrapper[4919]: I0310 21:55:28.337066 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Mar 10 21:55:28 crc kubenswrapper[4919]: I0310 21:55:28.337081 4919 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d61e923efa60cc26c36f6f8896ffa2d064505f57043c37b69061f529c5a0bc7b"
Mar 10 21:55:28 crc kubenswrapper[4919]: I0310 21:55:28.352543 4919 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.80:6443: connect: connection refused"
Mar 10 21:55:28 crc kubenswrapper[4919]: I0310 21:55:28.353116 4919 status_manager.go:851] "Failed to get status for pod" podUID="df1d5909-9408-4ac0-a762-5298b301ae59" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.80:6443: connect: connection refused"
Mar 10 21:55:28 crc kubenswrapper[4919]: I0310 21:55:28.353552 4919 status_manager.go:851] "Failed to get status for pod" podUID="2b4f0884-7217-47f3-9b14-8a26e4bd53f5" pod="openshift-route-controller-manager/route-controller-manager-7686d69b8c-dqlzg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-7686d69b8c-dqlzg\": dial tcp 38.102.83.80:6443: connect: connection refused"
Mar 10 21:55:28 crc kubenswrapper[4919]: I0310 21:55:28.354048 4919 status_manager.go:851] "Failed to get status for pod" podUID="dbbd09f7-cbd2-4f40-8f39-0c5c8478d5a7" pod="openshift-controller-manager/controller-manager-688cc6dd75-6whhm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-688cc6dd75-6whhm\": dial tcp 38.102.83.80:6443: connect: connection refused"
Mar 10 21:55:28 crc kubenswrapper[4919]: I0310 21:55:28.361987 4919 status_manager.go:851] "Failed to get status for pod" podUID="2b4f0884-7217-47f3-9b14-8a26e4bd53f5" pod="openshift-route-controller-manager/route-controller-manager-7686d69b8c-dqlzg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-7686d69b8c-dqlzg\": dial tcp 38.102.83.80:6443: connect: connection refused"
Mar 10 21:55:28 crc kubenswrapper[4919]: I0310 21:55:28.363640 4919 status_manager.go:851] "Failed to get status for pod" podUID="dbbd09f7-cbd2-4f40-8f39-0c5c8478d5a7" pod="openshift-controller-manager/controller-manager-688cc6dd75-6whhm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-688cc6dd75-6whhm\": dial tcp 38.102.83.80:6443: connect: connection refused"
Mar 10 21:55:28 crc kubenswrapper[4919]: I0310 21:55:28.364226 4919 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.80:6443: connect: connection refused"
Mar 10 21:55:28 crc kubenswrapper[4919]: I0310 21:55:28.364942 4919 status_manager.go:851] "Failed to get status for pod" podUID="df1d5909-9408-4ac0-a762-5298b301ae59" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.80:6443: connect: connection refused"
Mar 10 21:55:28 crc kubenswrapper[4919]: I0310 21:55:28.365036 4919 scope.go:117] "RemoveContainer" containerID="47a772db349df6c0c6fe27be93d19e02d66cfaf9739ee12e89730ece1da11473"
Mar 10 21:55:28 crc kubenswrapper[4919]: I0310 21:55:28.386020 4919 scope.go:117] "RemoveContainer" containerID="37d8507fd02b92972ed41aa2c4d53fceb1c9d58864e46ddc7991f94fb4d9b3e0"
Mar 10 21:55:28 crc kubenswrapper[4919]: I0310 21:55:28.407070 4919 scope.go:117] "RemoveContainer" containerID="dfb03c5f450790952fc7173bc2a6d723c777921f5f74963bfdbc3573ec1d21cd"
Mar 10 21:55:28 crc kubenswrapper[4919]: I0310 21:55:28.424140 4919 scope.go:117] "RemoveContainer" containerID="d9e6a8efa1e2d16b45fe6362b326e3f89333864dc74f3b298d2e500a90d303b3"
Mar 10 21:55:28 crc kubenswrapper[4919]: I0310 21:55:28.455222 4919 scope.go:117] "RemoveContainer" containerID="15b81f8be7635a10cf516cd17c9ab3d48b74d3421098124fcc47dbab6691e3c0"
Mar 10 21:55:28 crc kubenswrapper[4919]: I0310 21:55:28.481264 4919 scope.go:117] "RemoveContainer" containerID="ce192a4f3e94d00998fbfe0948a32765574a9261d22004480dfb54b9bbf9407a"
Mar 10 21:55:28 crc kubenswrapper[4919]: E0310 21:55:28.481828 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce192a4f3e94d00998fbfe0948a32765574a9261d22004480dfb54b9bbf9407a\": container with ID starting with ce192a4f3e94d00998fbfe0948a32765574a9261d22004480dfb54b9bbf9407a not found: ID does not exist" containerID="ce192a4f3e94d00998fbfe0948a32765574a9261d22004480dfb54b9bbf9407a"
Mar 10 21:55:28 crc kubenswrapper[4919]: I0310 21:55:28.481878 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce192a4f3e94d00998fbfe0948a32765574a9261d22004480dfb54b9bbf9407a"} err="failed to get container status \"ce192a4f3e94d00998fbfe0948a32765574a9261d22004480dfb54b9bbf9407a\": rpc error: code = NotFound desc = could not find container \"ce192a4f3e94d00998fbfe0948a32765574a9261d22004480dfb54b9bbf9407a\": container with ID starting with ce192a4f3e94d00998fbfe0948a32765574a9261d22004480dfb54b9bbf9407a not found: ID does not exist"
Mar 10 21:55:28 crc kubenswrapper[4919]: I0310 21:55:28.481909 4919 scope.go:117] "RemoveContainer" containerID="47a772db349df6c0c6fe27be93d19e02d66cfaf9739ee12e89730ece1da11473"
Mar 10 21:55:28 crc kubenswrapper[4919]: E0310 21:55:28.482716 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"47a772db349df6c0c6fe27be93d19e02d66cfaf9739ee12e89730ece1da11473\": container with ID starting with 47a772db349df6c0c6fe27be93d19e02d66cfaf9739ee12e89730ece1da11473 not found: ID does not exist" containerID="47a772db349df6c0c6fe27be93d19e02d66cfaf9739ee12e89730ece1da11473"
Mar 10 21:55:28 crc kubenswrapper[4919]: I0310 21:55:28.482796 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47a772db349df6c0c6fe27be93d19e02d66cfaf9739ee12e89730ece1da11473"} err="failed to get container status \"47a772db349df6c0c6fe27be93d19e02d66cfaf9739ee12e89730ece1da11473\": rpc error: code = NotFound desc = could not find container \"47a772db349df6c0c6fe27be93d19e02d66cfaf9739ee12e89730ece1da11473\": container with ID starting with 47a772db349df6c0c6fe27be93d19e02d66cfaf9739ee12e89730ece1da11473 not found: ID does not exist"
Mar 10 21:55:28 crc kubenswrapper[4919]: I0310 21:55:28.482835 4919 scope.go:117] "RemoveContainer" containerID="37d8507fd02b92972ed41aa2c4d53fceb1c9d58864e46ddc7991f94fb4d9b3e0"
Mar 10 21:55:28 crc kubenswrapper[4919]: E0310 21:55:28.483342 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"37d8507fd02b92972ed41aa2c4d53fceb1c9d58864e46ddc7991f94fb4d9b3e0\": container with ID starting with 37d8507fd02b92972ed41aa2c4d53fceb1c9d58864e46ddc7991f94fb4d9b3e0 not found: ID does not exist" containerID="37d8507fd02b92972ed41aa2c4d53fceb1c9d58864e46ddc7991f94fb4d9b3e0"
Mar 10 21:55:28 crc kubenswrapper[4919]: I0310 21:55:28.483421 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37d8507fd02b92972ed41aa2c4d53fceb1c9d58864e46ddc7991f94fb4d9b3e0"} err="failed to get container status \"37d8507fd02b92972ed41aa2c4d53fceb1c9d58864e46ddc7991f94fb4d9b3e0\": rpc error: code = NotFound desc = could not find container \"37d8507fd02b92972ed41aa2c4d53fceb1c9d58864e46ddc7991f94fb4d9b3e0\": container with ID starting with 37d8507fd02b92972ed41aa2c4d53fceb1c9d58864e46ddc7991f94fb4d9b3e0 not found: ID does not exist"
Mar 10 21:55:28 crc kubenswrapper[4919]: I0310 21:55:28.483453 4919 scope.go:117] "RemoveContainer" containerID="dfb03c5f450790952fc7173bc2a6d723c777921f5f74963bfdbc3573ec1d21cd"
Mar 10 21:55:28 crc kubenswrapper[4919]: E0310 21:55:28.483908 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dfb03c5f450790952fc7173bc2a6d723c777921f5f74963bfdbc3573ec1d21cd\": container with ID starting with dfb03c5f450790952fc7173bc2a6d723c777921f5f74963bfdbc3573ec1d21cd not found: ID does not exist" containerID="dfb03c5f450790952fc7173bc2a6d723c777921f5f74963bfdbc3573ec1d21cd"
Mar 10 21:55:28 crc kubenswrapper[4919]: I0310 21:55:28.483992 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dfb03c5f450790952fc7173bc2a6d723c777921f5f74963bfdbc3573ec1d21cd"} err="failed to get container status \"dfb03c5f450790952fc7173bc2a6d723c777921f5f74963bfdbc3573ec1d21cd\": rpc error: code = NotFound desc = could not find container \"dfb03c5f450790952fc7173bc2a6d723c777921f5f74963bfdbc3573ec1d21cd\": container with ID starting with dfb03c5f450790952fc7173bc2a6d723c777921f5f74963bfdbc3573ec1d21cd not found: ID does not exist"
Mar 10 21:55:28 crc kubenswrapper[4919]: I0310 21:55:28.484049 4919 scope.go:117] "RemoveContainer" containerID="d9e6a8efa1e2d16b45fe6362b326e3f89333864dc74f3b298d2e500a90d303b3"
Mar 10 21:55:28 crc kubenswrapper[4919]: E0310 21:55:28.484691 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d9e6a8efa1e2d16b45fe6362b326e3f89333864dc74f3b298d2e500a90d303b3\": container with ID starting with d9e6a8efa1e2d16b45fe6362b326e3f89333864dc74f3b298d2e500a90d303b3 not found: ID does not exist" containerID="d9e6a8efa1e2d16b45fe6362b326e3f89333864dc74f3b298d2e500a90d303b3"
Mar 10 21:55:28 crc kubenswrapper[4919]: I0310 21:55:28.484741 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9e6a8efa1e2d16b45fe6362b326e3f89333864dc74f3b298d2e500a90d303b3"} err="failed to get container status \"d9e6a8efa1e2d16b45fe6362b326e3f89333864dc74f3b298d2e500a90d303b3\": rpc error: code = NotFound desc = could not find container \"d9e6a8efa1e2d16b45fe6362b326e3f89333864dc74f3b298d2e500a90d303b3\": container with ID starting with d9e6a8efa1e2d16b45fe6362b326e3f89333864dc74f3b298d2e500a90d303b3 not found: ID does not exist"
Mar 10 21:55:28 crc kubenswrapper[4919]: I0310 21:55:28.484770 4919 scope.go:117] "RemoveContainer" containerID="15b81f8be7635a10cf516cd17c9ab3d48b74d3421098124fcc47dbab6691e3c0"
Mar 10 21:55:28 crc kubenswrapper[4919]: E0310 21:55:28.485158 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"15b81f8be7635a10cf516cd17c9ab3d48b74d3421098124fcc47dbab6691e3c0\": container with ID starting with 15b81f8be7635a10cf516cd17c9ab3d48b74d3421098124fcc47dbab6691e3c0 not found: ID does not exist" containerID="15b81f8be7635a10cf516cd17c9ab3d48b74d3421098124fcc47dbab6691e3c0"
Mar 10 21:55:28 crc kubenswrapper[4919]: I0310 21:55:28.485200 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15b81f8be7635a10cf516cd17c9ab3d48b74d3421098124fcc47dbab6691e3c0"} err="failed to get container status \"15b81f8be7635a10cf516cd17c9ab3d48b74d3421098124fcc47dbab6691e3c0\": rpc error: code = NotFound desc = could not find container \"15b81f8be7635a10cf516cd17c9ab3d48b74d3421098124fcc47dbab6691e3c0\": container with ID starting with 15b81f8be7635a10cf516cd17c9ab3d48b74d3421098124fcc47dbab6691e3c0 not found: ID does not exist"
Mar 10 21:55:29 crc kubenswrapper[4919]: I0310 21:55:29.491729 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes"
Mar 10 21:55:31 crc kubenswrapper[4919]: I0310 21:55:31.535018 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-m9qd4" podUID="b00c04d4-1287-409a-8e67-2edb888bf832" containerName="oauth-openshift" containerID="cri-o://3182fbba7215923d1c2dd9566bc190ce9c0ce691eebc62e48da0e1b854745d0e" gracePeriod=15
Mar 10 21:55:31 crc kubenswrapper[4919]: I0310 21:55:31.976120 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-m9qd4"
Mar 10 21:55:31 crc kubenswrapper[4919]: I0310 21:55:31.979326 4919 status_manager.go:851] "Failed to get status for pod" podUID="2b4f0884-7217-47f3-9b14-8a26e4bd53f5" pod="openshift-route-controller-manager/route-controller-manager-7686d69b8c-dqlzg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-7686d69b8c-dqlzg\": dial tcp 38.102.83.80:6443: connect: connection refused"
Mar 10 21:55:31 crc kubenswrapper[4919]: I0310 21:55:31.979769 4919 status_manager.go:851] "Failed to get status for pod" podUID="dbbd09f7-cbd2-4f40-8f39-0c5c8478d5a7" pod="openshift-controller-manager/controller-manager-688cc6dd75-6whhm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-688cc6dd75-6whhm\": dial tcp 38.102.83.80:6443: connect: connection refused"
Mar 10 21:55:31 crc kubenswrapper[4919]: I0310 21:55:31.980218 4919 status_manager.go:851] "Failed to get status for pod" podUID="b00c04d4-1287-409a-8e67-2edb888bf832" pod="openshift-authentication/oauth-openshift-558db77b4-m9qd4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-m9qd4\": dial tcp 38.102.83.80:6443: connect: connection refused"
Mar 10 21:55:31 crc kubenswrapper[4919]: I0310 21:55:31.980515 4919 status_manager.go:851] "Failed to get status for pod" podUID="df1d5909-9408-4ac0-a762-5298b301ae59" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.80:6443: connect: connection refused"
Mar 10 21:55:32 crc kubenswrapper[4919]: I0310 21:55:32.067362 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/b00c04d4-1287-409a-8e67-2edb888bf832-v4-0-config-user-template-provider-selection\") pod \"b00c04d4-1287-409a-8e67-2edb888bf832\" (UID: \"b00c04d4-1287-409a-8e67-2edb888bf832\") "
Mar 10 21:55:32 crc kubenswrapper[4919]: I0310 21:55:32.067439 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/b00c04d4-1287-409a-8e67-2edb888bf832-v4-0-config-system-cliconfig\") pod \"b00c04d4-1287-409a-8e67-2edb888bf832\" (UID: \"b00c04d4-1287-409a-8e67-2edb888bf832\") "
Mar 10 21:55:32 crc kubenswrapper[4919]: I0310 21:55:32.067471 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b00c04d4-1287-409a-8e67-2edb888bf832-v4-0-config-system-trusted-ca-bundle\") pod \"b00c04d4-1287-409a-8e67-2edb888bf832\" (UID: \"b00c04d4-1287-409a-8e67-2edb888bf832\") "
Mar 10 21:55:32 crc kubenswrapper[4919]: I0310 21:55:32.067504 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9mbw4\" (UniqueName: \"kubernetes.io/projected/b00c04d4-1287-409a-8e67-2edb888bf832-kube-api-access-9mbw4\") pod \"b00c04d4-1287-409a-8e67-2edb888bf832\" (UID: \"b00c04d4-1287-409a-8e67-2edb888bf832\") "
Mar 10 21:55:32 crc kubenswrapper[4919]: I0310 21:55:32.067527 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/b00c04d4-1287-409a-8e67-2edb888bf832-v4-0-config-system-serving-cert\") pod \"b00c04d4-1287-409a-8e67-2edb888bf832\" (UID: \"b00c04d4-1287-409a-8e67-2edb888bf832\") "
Mar 10 21:55:32 crc kubenswrapper[4919]: I0310 21:55:32.067555 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b00c04d4-1287-409a-8e67-2edb888bf832-audit-dir\") pod \"b00c04d4-1287-409a-8e67-2edb888bf832\" (UID: \"b00c04d4-1287-409a-8e67-2edb888bf832\") "
Mar 10 21:55:32 crc kubenswrapper[4919]: I0310 21:55:32.067590 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/b00c04d4-1287-409a-8e67-2edb888bf832-v4-0-config-system-service-ca\") pod \"b00c04d4-1287-409a-8e67-2edb888bf832\" (UID: \"b00c04d4-1287-409a-8e67-2edb888bf832\") "
Mar 10 21:55:32 crc kubenswrapper[4919]: I0310 21:55:32.067624 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/b00c04d4-1287-409a-8e67-2edb888bf832-v4-0-config-system-ocp-branding-template\") pod \"b00c04d4-1287-409a-8e67-2edb888bf832\" (UID: \"b00c04d4-1287-409a-8e67-2edb888bf832\") "
Mar 10 21:55:32 crc kubenswrapper[4919]: I0310 21:55:32.067670 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/b00c04d4-1287-409a-8e67-2edb888bf832-v4-0-config-user-template-error\") pod \"b00c04d4-1287-409a-8e67-2edb888bf832\" (UID: \"b00c04d4-1287-409a-8e67-2edb888bf832\") "
Mar 10 21:55:32 crc kubenswrapper[4919]: I0310 21:55:32.067710 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/b00c04d4-1287-409a-8e67-2edb888bf832-v4-0-config-system-session\") pod \"b00c04d4-1287-409a-8e67-2edb888bf832\" (UID: \"b00c04d4-1287-409a-8e67-2edb888bf832\") "
Mar 10 21:55:32 crc kubenswrapper[4919]: I0310 21:55:32.068873 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/b00c04d4-1287-409a-8e67-2edb888bf832-v4-0-config-system-router-certs\") pod \"b00c04d4-1287-409a-8e67-2edb888bf832\" (UID:
\"b00c04d4-1287-409a-8e67-2edb888bf832\") " Mar 10 21:55:32 crc kubenswrapper[4919]: I0310 21:55:32.068890 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b00c04d4-1287-409a-8e67-2edb888bf832-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "b00c04d4-1287-409a-8e67-2edb888bf832" (UID: "b00c04d4-1287-409a-8e67-2edb888bf832"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 21:55:32 crc kubenswrapper[4919]: I0310 21:55:32.068453 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b00c04d4-1287-409a-8e67-2edb888bf832-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "b00c04d4-1287-409a-8e67-2edb888bf832" (UID: "b00c04d4-1287-409a-8e67-2edb888bf832"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 21:55:32 crc kubenswrapper[4919]: I0310 21:55:32.068936 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b00c04d4-1287-409a-8e67-2edb888bf832-audit-policies\") pod \"b00c04d4-1287-409a-8e67-2edb888bf832\" (UID: \"b00c04d4-1287-409a-8e67-2edb888bf832\") " Mar 10 21:55:32 crc kubenswrapper[4919]: I0310 21:55:32.068970 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/b00c04d4-1287-409a-8e67-2edb888bf832-v4-0-config-user-template-login\") pod \"b00c04d4-1287-409a-8e67-2edb888bf832\" (UID: \"b00c04d4-1287-409a-8e67-2edb888bf832\") " Mar 10 21:55:32 crc kubenswrapper[4919]: I0310 21:55:32.069016 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/b00c04d4-1287-409a-8e67-2edb888bf832-v4-0-config-user-idp-0-file-data\") pod 
\"b00c04d4-1287-409a-8e67-2edb888bf832\" (UID: \"b00c04d4-1287-409a-8e67-2edb888bf832\") " Mar 10 21:55:32 crc kubenswrapper[4919]: I0310 21:55:32.069618 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b00c04d4-1287-409a-8e67-2edb888bf832-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "b00c04d4-1287-409a-8e67-2edb888bf832" (UID: "b00c04d4-1287-409a-8e67-2edb888bf832"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 21:55:32 crc kubenswrapper[4919]: I0310 21:55:32.069626 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b00c04d4-1287-409a-8e67-2edb888bf832-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "b00c04d4-1287-409a-8e67-2edb888bf832" (UID: "b00c04d4-1287-409a-8e67-2edb888bf832"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 21:55:32 crc kubenswrapper[4919]: I0310 21:55:32.069992 4919 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b00c04d4-1287-409a-8e67-2edb888bf832-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 10 21:55:32 crc kubenswrapper[4919]: I0310 21:55:32.070022 4919 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b00c04d4-1287-409a-8e67-2edb888bf832-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 21:55:32 crc kubenswrapper[4919]: I0310 21:55:32.070035 4919 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b00c04d4-1287-409a-8e67-2edb888bf832-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 10 21:55:32 crc kubenswrapper[4919]: I0310 21:55:32.070048 4919 reconciler_common.go:293] "Volume detached for 
volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/b00c04d4-1287-409a-8e67-2edb888bf832-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 10 21:55:32 crc kubenswrapper[4919]: I0310 21:55:32.070941 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b00c04d4-1287-409a-8e67-2edb888bf832-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "b00c04d4-1287-409a-8e67-2edb888bf832" (UID: "b00c04d4-1287-409a-8e67-2edb888bf832"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 21:55:32 crc kubenswrapper[4919]: I0310 21:55:32.073775 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b00c04d4-1287-409a-8e67-2edb888bf832-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "b00c04d4-1287-409a-8e67-2edb888bf832" (UID: "b00c04d4-1287-409a-8e67-2edb888bf832"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 21:55:32 crc kubenswrapper[4919]: I0310 21:55:32.075549 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b00c04d4-1287-409a-8e67-2edb888bf832-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "b00c04d4-1287-409a-8e67-2edb888bf832" (UID: "b00c04d4-1287-409a-8e67-2edb888bf832"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 21:55:32 crc kubenswrapper[4919]: I0310 21:55:32.075777 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b00c04d4-1287-409a-8e67-2edb888bf832-kube-api-access-9mbw4" (OuterVolumeSpecName: "kube-api-access-9mbw4") pod "b00c04d4-1287-409a-8e67-2edb888bf832" (UID: "b00c04d4-1287-409a-8e67-2edb888bf832"). InnerVolumeSpecName "kube-api-access-9mbw4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 21:55:32 crc kubenswrapper[4919]: I0310 21:55:32.076931 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b00c04d4-1287-409a-8e67-2edb888bf832-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "b00c04d4-1287-409a-8e67-2edb888bf832" (UID: "b00c04d4-1287-409a-8e67-2edb888bf832"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 21:55:32 crc kubenswrapper[4919]: I0310 21:55:32.077369 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b00c04d4-1287-409a-8e67-2edb888bf832-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "b00c04d4-1287-409a-8e67-2edb888bf832" (UID: "b00c04d4-1287-409a-8e67-2edb888bf832"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 21:55:32 crc kubenswrapper[4919]: I0310 21:55:32.078436 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b00c04d4-1287-409a-8e67-2edb888bf832-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "b00c04d4-1287-409a-8e67-2edb888bf832" (UID: "b00c04d4-1287-409a-8e67-2edb888bf832"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 21:55:32 crc kubenswrapper[4919]: I0310 21:55:32.080029 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b00c04d4-1287-409a-8e67-2edb888bf832-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "b00c04d4-1287-409a-8e67-2edb888bf832" (UID: "b00c04d4-1287-409a-8e67-2edb888bf832"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 21:55:32 crc kubenswrapper[4919]: I0310 21:55:32.080305 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b00c04d4-1287-409a-8e67-2edb888bf832-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "b00c04d4-1287-409a-8e67-2edb888bf832" (UID: "b00c04d4-1287-409a-8e67-2edb888bf832"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 21:55:32 crc kubenswrapper[4919]: I0310 21:55:32.080872 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b00c04d4-1287-409a-8e67-2edb888bf832-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "b00c04d4-1287-409a-8e67-2edb888bf832" (UID: "b00c04d4-1287-409a-8e67-2edb888bf832"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 21:55:32 crc kubenswrapper[4919]: I0310 21:55:32.171725 4919 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/b00c04d4-1287-409a-8e67-2edb888bf832-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 10 21:55:32 crc kubenswrapper[4919]: I0310 21:55:32.171776 4919 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/b00c04d4-1287-409a-8e67-2edb888bf832-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 10 21:55:32 crc kubenswrapper[4919]: I0310 21:55:32.171795 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9mbw4\" (UniqueName: \"kubernetes.io/projected/b00c04d4-1287-409a-8e67-2edb888bf832-kube-api-access-9mbw4\") on node \"crc\" DevicePath \"\"" Mar 10 21:55:32 crc kubenswrapper[4919]: I0310 21:55:32.171811 4919 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/b00c04d4-1287-409a-8e67-2edb888bf832-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 21:55:32 crc kubenswrapper[4919]: I0310 21:55:32.171830 4919 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/b00c04d4-1287-409a-8e67-2edb888bf832-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 10 21:55:32 crc kubenswrapper[4919]: I0310 21:55:32.171846 4919 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/b00c04d4-1287-409a-8e67-2edb888bf832-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 10 21:55:32 crc kubenswrapper[4919]: I0310 21:55:32.171863 4919 reconciler_common.go:293] "Volume detached for volume 
\"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/b00c04d4-1287-409a-8e67-2edb888bf832-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 10 21:55:32 crc kubenswrapper[4919]: I0310 21:55:32.171879 4919 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/b00c04d4-1287-409a-8e67-2edb888bf832-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 10 21:55:32 crc kubenswrapper[4919]: I0310 21:55:32.171928 4919 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/b00c04d4-1287-409a-8e67-2edb888bf832-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 10 21:55:32 crc kubenswrapper[4919]: I0310 21:55:32.171946 4919 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/b00c04d4-1287-409a-8e67-2edb888bf832-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 10 21:55:32 crc kubenswrapper[4919]: I0310 21:55:32.370875 4919 generic.go:334] "Generic (PLEG): container finished" podID="b00c04d4-1287-409a-8e67-2edb888bf832" containerID="3182fbba7215923d1c2dd9566bc190ce9c0ce691eebc62e48da0e1b854745d0e" exitCode=0 Mar 10 21:55:32 crc kubenswrapper[4919]: I0310 21:55:32.370936 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-m9qd4" event={"ID":"b00c04d4-1287-409a-8e67-2edb888bf832","Type":"ContainerDied","Data":"3182fbba7215923d1c2dd9566bc190ce9c0ce691eebc62e48da0e1b854745d0e"} Mar 10 21:55:32 crc kubenswrapper[4919]: I0310 21:55:32.370973 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-m9qd4" event={"ID":"b00c04d4-1287-409a-8e67-2edb888bf832","Type":"ContainerDied","Data":"ce5d5f8dcf9a26afe7efcd91551d2cc675f135275680a3d9f398dbd0932c21ac"} Mar 10 21:55:32 
crc kubenswrapper[4919]: I0310 21:55:32.370992 4919 scope.go:117] "RemoveContainer" containerID="3182fbba7215923d1c2dd9566bc190ce9c0ce691eebc62e48da0e1b854745d0e" Mar 10 21:55:32 crc kubenswrapper[4919]: I0310 21:55:32.370999 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-m9qd4" Mar 10 21:55:32 crc kubenswrapper[4919]: I0310 21:55:32.371756 4919 status_manager.go:851] "Failed to get status for pod" podUID="b00c04d4-1287-409a-8e67-2edb888bf832" pod="openshift-authentication/oauth-openshift-558db77b4-m9qd4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-m9qd4\": dial tcp 38.102.83.80:6443: connect: connection refused" Mar 10 21:55:32 crc kubenswrapper[4919]: I0310 21:55:32.372256 4919 status_manager.go:851] "Failed to get status for pod" podUID="df1d5909-9408-4ac0-a762-5298b301ae59" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.80:6443: connect: connection refused" Mar 10 21:55:32 crc kubenswrapper[4919]: I0310 21:55:32.372820 4919 status_manager.go:851] "Failed to get status for pod" podUID="2b4f0884-7217-47f3-9b14-8a26e4bd53f5" pod="openshift-route-controller-manager/route-controller-manager-7686d69b8c-dqlzg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-7686d69b8c-dqlzg\": dial tcp 38.102.83.80:6443: connect: connection refused" Mar 10 21:55:32 crc kubenswrapper[4919]: I0310 21:55:32.373316 4919 status_manager.go:851] "Failed to get status for pod" podUID="dbbd09f7-cbd2-4f40-8f39-0c5c8478d5a7" pod="openshift-controller-manager/controller-manager-688cc6dd75-6whhm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-688cc6dd75-6whhm\": 
dial tcp 38.102.83.80:6443: connect: connection refused" Mar 10 21:55:32 crc kubenswrapper[4919]: I0310 21:55:32.393171 4919 status_manager.go:851] "Failed to get status for pod" podUID="b00c04d4-1287-409a-8e67-2edb888bf832" pod="openshift-authentication/oauth-openshift-558db77b4-m9qd4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-m9qd4\": dial tcp 38.102.83.80:6443: connect: connection refused" Mar 10 21:55:32 crc kubenswrapper[4919]: I0310 21:55:32.393499 4919 status_manager.go:851] "Failed to get status for pod" podUID="df1d5909-9408-4ac0-a762-5298b301ae59" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.80:6443: connect: connection refused" Mar 10 21:55:32 crc kubenswrapper[4919]: I0310 21:55:32.393755 4919 status_manager.go:851] "Failed to get status for pod" podUID="2b4f0884-7217-47f3-9b14-8a26e4bd53f5" pod="openshift-route-controller-manager/route-controller-manager-7686d69b8c-dqlzg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-7686d69b8c-dqlzg\": dial tcp 38.102.83.80:6443: connect: connection refused" Mar 10 21:55:32 crc kubenswrapper[4919]: I0310 21:55:32.394043 4919 status_manager.go:851] "Failed to get status for pod" podUID="dbbd09f7-cbd2-4f40-8f39-0c5c8478d5a7" pod="openshift-controller-manager/controller-manager-688cc6dd75-6whhm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-688cc6dd75-6whhm\": dial tcp 38.102.83.80:6443: connect: connection refused" Mar 10 21:55:32 crc kubenswrapper[4919]: I0310 21:55:32.397777 4919 scope.go:117] "RemoveContainer" containerID="3182fbba7215923d1c2dd9566bc190ce9c0ce691eebc62e48da0e1b854745d0e" Mar 10 21:55:32 crc kubenswrapper[4919]: E0310 21:55:32.398320 4919 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3182fbba7215923d1c2dd9566bc190ce9c0ce691eebc62e48da0e1b854745d0e\": container with ID starting with 3182fbba7215923d1c2dd9566bc190ce9c0ce691eebc62e48da0e1b854745d0e not found: ID does not exist" containerID="3182fbba7215923d1c2dd9566bc190ce9c0ce691eebc62e48da0e1b854745d0e" Mar 10 21:55:32 crc kubenswrapper[4919]: I0310 21:55:32.398356 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3182fbba7215923d1c2dd9566bc190ce9c0ce691eebc62e48da0e1b854745d0e"} err="failed to get container status \"3182fbba7215923d1c2dd9566bc190ce9c0ce691eebc62e48da0e1b854745d0e\": rpc error: code = NotFound desc = could not find container \"3182fbba7215923d1c2dd9566bc190ce9c0ce691eebc62e48da0e1b854745d0e\": container with ID starting with 3182fbba7215923d1c2dd9566bc190ce9c0ce691eebc62e48da0e1b854745d0e not found: ID does not exist" Mar 10 21:55:33 crc kubenswrapper[4919]: I0310 21:55:33.485091 4919 status_manager.go:851] "Failed to get status for pod" podUID="b00c04d4-1287-409a-8e67-2edb888bf832" pod="openshift-authentication/oauth-openshift-558db77b4-m9qd4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-m9qd4\": dial tcp 38.102.83.80:6443: connect: connection refused" Mar 10 21:55:33 crc kubenswrapper[4919]: I0310 21:55:33.485826 4919 status_manager.go:851] "Failed to get status for pod" podUID="df1d5909-9408-4ac0-a762-5298b301ae59" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.80:6443: connect: connection refused" Mar 10 21:55:33 crc kubenswrapper[4919]: I0310 21:55:33.486515 4919 status_manager.go:851] "Failed to get status for pod" podUID="2b4f0884-7217-47f3-9b14-8a26e4bd53f5" 
pod="openshift-route-controller-manager/route-controller-manager-7686d69b8c-dqlzg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-7686d69b8c-dqlzg\": dial tcp 38.102.83.80:6443: connect: connection refused" Mar 10 21:55:33 crc kubenswrapper[4919]: I0310 21:55:33.487126 4919 status_manager.go:851] "Failed to get status for pod" podUID="dbbd09f7-cbd2-4f40-8f39-0c5c8478d5a7" pod="openshift-controller-manager/controller-manager-688cc6dd75-6whhm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-688cc6dd75-6whhm\": dial tcp 38.102.83.80:6443: connect: connection refused" Mar 10 21:55:33 crc kubenswrapper[4919]: E0310 21:55:33.864209 4919 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.80:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189b9990e9005645 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 21:55:25.963875909 +0000 UTC m=+313.205756517,LastTimestamp:2026-03-10 21:55:25.963875909 +0000 UTC m=+313.205756517,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 21:55:35 crc kubenswrapper[4919]: E0310 21:55:35.294476 4919 
controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.80:6443: connect: connection refused" Mar 10 21:55:35 crc kubenswrapper[4919]: E0310 21:55:35.295319 4919 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.80:6443: connect: connection refused" Mar 10 21:55:35 crc kubenswrapper[4919]: E0310 21:55:35.295971 4919 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.80:6443: connect: connection refused" Mar 10 21:55:35 crc kubenswrapper[4919]: E0310 21:55:35.296490 4919 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.80:6443: connect: connection refused" Mar 10 21:55:35 crc kubenswrapper[4919]: E0310 21:55:35.296949 4919 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.80:6443: connect: connection refused" Mar 10 21:55:35 crc kubenswrapper[4919]: I0310 21:55:35.296992 4919 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Mar 10 21:55:35 crc kubenswrapper[4919]: E0310 21:55:35.297421 4919 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.80:6443: connect: connection refused" interval="200ms" Mar 10 21:55:35 crc kubenswrapper[4919]: E0310 21:55:35.498246 4919 
controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.80:6443: connect: connection refused" interval="400ms" Mar 10 21:55:35 crc kubenswrapper[4919]: E0310 21:55:35.899316 4919 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.80:6443: connect: connection refused" interval="800ms" Mar 10 21:55:36 crc kubenswrapper[4919]: E0310 21:55:36.700849 4919 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.80:6443: connect: connection refused" interval="1.6s" Mar 10 21:55:38 crc kubenswrapper[4919]: E0310 21:55:38.302579 4919 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.80:6443: connect: connection refused" interval="3.2s" Mar 10 21:55:38 crc kubenswrapper[4919]: I0310 21:55:38.479621 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 21:55:38 crc kubenswrapper[4919]: I0310 21:55:38.480913 4919 status_manager.go:851] "Failed to get status for pod" podUID="b00c04d4-1287-409a-8e67-2edb888bf832" pod="openshift-authentication/oauth-openshift-558db77b4-m9qd4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-m9qd4\": dial tcp 38.102.83.80:6443: connect: connection refused" Mar 10 21:55:38 crc kubenswrapper[4919]: I0310 21:55:38.481813 4919 status_manager.go:851] "Failed to get status for pod" podUID="df1d5909-9408-4ac0-a762-5298b301ae59" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.80:6443: connect: connection refused" Mar 10 21:55:38 crc kubenswrapper[4919]: I0310 21:55:38.482293 4919 status_manager.go:851] "Failed to get status for pod" podUID="2b4f0884-7217-47f3-9b14-8a26e4bd53f5" pod="openshift-route-controller-manager/route-controller-manager-7686d69b8c-dqlzg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-7686d69b8c-dqlzg\": dial tcp 38.102.83.80:6443: connect: connection refused" Mar 10 21:55:38 crc kubenswrapper[4919]: I0310 21:55:38.482839 4919 status_manager.go:851] "Failed to get status for pod" podUID="dbbd09f7-cbd2-4f40-8f39-0c5c8478d5a7" pod="openshift-controller-manager/controller-manager-688cc6dd75-6whhm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-688cc6dd75-6whhm\": dial tcp 38.102.83.80:6443: connect: connection refused" Mar 10 21:55:38 crc kubenswrapper[4919]: I0310 21:55:38.504827 4919 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f9ed1501-15da-4419-aa12-171e610438d6" Mar 10 21:55:38 crc 
kubenswrapper[4919]: I0310 21:55:38.505293 4919 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f9ed1501-15da-4419-aa12-171e610438d6" Mar 10 21:55:38 crc kubenswrapper[4919]: E0310 21:55:38.506022 4919 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.80:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 21:55:38 crc kubenswrapper[4919]: I0310 21:55:38.506947 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 21:55:38 crc kubenswrapper[4919]: E0310 21:55:38.512511 4919 desired_state_of_world_populator.go:312] "Error processing volume" err="error processing PVC openshift-image-registry/crc-image-registry-storage: failed to fetch PVC from API server: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/persistentvolumeclaims/crc-image-registry-storage\": dial tcp 38.102.83.80:6443: connect: connection refused" pod="openshift-image-registry/image-registry-697d97f7c8-lxrsj" volumeName="registry-storage" Mar 10 21:55:39 crc kubenswrapper[4919]: I0310 21:55:39.431774 4919 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="30ea75cdde742e361fe46bc365b4eade41dc02fba44878c54b2244e8ae37d1c7" exitCode=0 Mar 10 21:55:39 crc kubenswrapper[4919]: I0310 21:55:39.431879 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"30ea75cdde742e361fe46bc365b4eade41dc02fba44878c54b2244e8ae37d1c7"} Mar 10 21:55:39 crc kubenswrapper[4919]: I0310 21:55:39.432146 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"a8a99b1dbdb477ad5f59d332497e2c5903b4bae32474c70b357475ebd28a071d"} Mar 10 21:55:39 crc kubenswrapper[4919]: I0310 21:55:39.432610 4919 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f9ed1501-15da-4419-aa12-171e610438d6" Mar 10 21:55:39 crc kubenswrapper[4919]: I0310 21:55:39.432639 4919 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f9ed1501-15da-4419-aa12-171e610438d6" Mar 10 21:55:39 crc kubenswrapper[4919]: I0310 21:55:39.433213 4919 status_manager.go:851] "Failed to get status for pod" podUID="2b4f0884-7217-47f3-9b14-8a26e4bd53f5" pod="openshift-route-controller-manager/route-controller-manager-7686d69b8c-dqlzg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-7686d69b8c-dqlzg\": dial tcp 38.102.83.80:6443: connect: connection refused" Mar 10 21:55:39 crc kubenswrapper[4919]: E0310 21:55:39.433535 4919 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.80:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 21:55:39 crc kubenswrapper[4919]: I0310 21:55:39.433784 4919 status_manager.go:851] "Failed to get status for pod" podUID="dbbd09f7-cbd2-4f40-8f39-0c5c8478d5a7" pod="openshift-controller-manager/controller-manager-688cc6dd75-6whhm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-688cc6dd75-6whhm\": dial tcp 38.102.83.80:6443: connect: connection refused" Mar 10 21:55:39 crc kubenswrapper[4919]: I0310 21:55:39.434455 4919 status_manager.go:851] "Failed to get status for pod" podUID="b00c04d4-1287-409a-8e67-2edb888bf832" 
pod="openshift-authentication/oauth-openshift-558db77b4-m9qd4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-m9qd4\": dial tcp 38.102.83.80:6443: connect: connection refused" Mar 10 21:55:39 crc kubenswrapper[4919]: I0310 21:55:39.434991 4919 status_manager.go:851] "Failed to get status for pod" podUID="df1d5909-9408-4ac0-a762-5298b301ae59" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.80:6443: connect: connection refused" Mar 10 21:55:40 crc kubenswrapper[4919]: I0310 21:55:40.460930 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"c0f4e7729d88e7485289c6b87695f486d7c8483cfd04f5ced48b71a373a073bc"} Mar 10 21:55:40 crc kubenswrapper[4919]: I0310 21:55:40.460982 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"8aeebe413c1f10a8c1ab62eb449615da091e806fc1dacf4b5fe9cfe3a30573bf"} Mar 10 21:55:40 crc kubenswrapper[4919]: I0310 21:55:40.460995 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"42d864666a1093f1cb46579b295b0e2958c887f1fd305154b99d636883b47e71"} Mar 10 21:55:40 crc kubenswrapper[4919]: I0310 21:55:40.467743 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Mar 10 21:55:40 crc kubenswrapper[4919]: I0310 21:55:40.469796 4919 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 10 21:55:40 crc kubenswrapper[4919]: I0310 21:55:40.469856 4919 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="e3630a7f175a3275eff39088c20eafd059b205f0ccb36cbba2f09b77468963cd" exitCode=1 Mar 10 21:55:40 crc kubenswrapper[4919]: I0310 21:55:40.469893 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"e3630a7f175a3275eff39088c20eafd059b205f0ccb36cbba2f09b77468963cd"} Mar 10 21:55:40 crc kubenswrapper[4919]: I0310 21:55:40.470449 4919 scope.go:117] "RemoveContainer" containerID="e3630a7f175a3275eff39088c20eafd059b205f0ccb36cbba2f09b77468963cd" Mar 10 21:55:41 crc kubenswrapper[4919]: I0310 21:55:41.479382 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Mar 10 21:55:41 crc kubenswrapper[4919]: I0310 21:55:41.483203 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 10 21:55:41 crc kubenswrapper[4919]: I0310 21:55:41.489546 4919 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f9ed1501-15da-4419-aa12-171e610438d6" Mar 10 21:55:41 crc kubenswrapper[4919]: I0310 21:55:41.489604 4919 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f9ed1501-15da-4419-aa12-171e610438d6" Mar 10 21:55:41 crc kubenswrapper[4919]: I0310 21:55:41.491979 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"d752bf1ae4e938f6d37f73e35ab60e87a7704e6ab3b9baca9810db9206467d2e"} Mar 10 21:55:41 crc kubenswrapper[4919]: I0310 21:55:41.492076 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 21:55:41 crc kubenswrapper[4919]: I0310 21:55:41.492109 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"839c66092751298ac6cd89f17fb3b188ef3cfb7f22222a92aa2f9923ed0a3a8d"} Mar 10 21:55:41 crc kubenswrapper[4919]: I0310 21:55:41.492136 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"bfaf771f9d73813981c6d8f007367992dbcc78dc9332e3cb86a0c92b6156398d"} Mar 10 21:55:43 crc kubenswrapper[4919]: I0310 21:55:43.508033 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 21:55:43 crc kubenswrapper[4919]: I0310 21:55:43.508762 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 21:55:43 crc kubenswrapper[4919]: I0310 21:55:43.516843 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 21:55:43 crc kubenswrapper[4919]: I0310 21:55:43.678367 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 21:55:43 crc kubenswrapper[4919]: I0310 21:55:43.682651 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
Mar 10 21:55:44 crc kubenswrapper[4919]: I0310 21:55:44.508647 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 21:55:46 crc kubenswrapper[4919]: I0310 21:55:46.628033 4919 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 21:55:46 crc kubenswrapper[4919]: I0310 21:55:46.801791 4919 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="fe152b3d-fa54-46eb-a57f-292c2340eadf" Mar 10 21:55:47 crc kubenswrapper[4919]: I0310 21:55:47.532882 4919 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f9ed1501-15da-4419-aa12-171e610438d6" Mar 10 21:55:47 crc kubenswrapper[4919]: I0310 21:55:47.532920 4919 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f9ed1501-15da-4419-aa12-171e610438d6" Mar 10 21:55:47 crc kubenswrapper[4919]: I0310 21:55:47.537791 4919 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="fe152b3d-fa54-46eb-a57f-292c2340eadf" Mar 10 21:55:47 crc kubenswrapper[4919]: I0310 21:55:47.545529 4919 status_manager.go:308] "Container readiness changed before pod has synced" pod="openshift-kube-apiserver/kube-apiserver-crc" containerID="cri-o://42d864666a1093f1cb46579b295b0e2958c887f1fd305154b99d636883b47e71" Mar 10 21:55:47 crc kubenswrapper[4919]: I0310 21:55:47.545578 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 21:55:48 crc kubenswrapper[4919]: I0310 21:55:48.510304 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 21:55:48 crc kubenswrapper[4919]: I0310 21:55:48.510360 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 21:55:48 crc kubenswrapper[4919]: I0310 21:55:48.510384 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 21:55:48 crc kubenswrapper[4919]: I0310 21:55:48.510487 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 21:55:48 crc kubenswrapper[4919]: I0310 21:55:48.512580 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 10 21:55:48 crc kubenswrapper[4919]: I0310 21:55:48.513864 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 10 21:55:48 
crc kubenswrapper[4919]: I0310 21:55:48.514331 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 10 21:55:48 crc kubenswrapper[4919]: I0310 21:55:48.522231 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 10 21:55:48 crc kubenswrapper[4919]: I0310 21:55:48.522971 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 21:55:48 crc kubenswrapper[4919]: I0310 21:55:48.533722 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 21:55:48 crc kubenswrapper[4919]: I0310 21:55:48.534916 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 21:55:48 crc kubenswrapper[4919]: I0310 21:55:48.540792 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") 
" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 21:55:48 crc kubenswrapper[4919]: I0310 21:55:48.542758 4919 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f9ed1501-15da-4419-aa12-171e610438d6" Mar 10 21:55:48 crc kubenswrapper[4919]: I0310 21:55:48.542821 4919 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f9ed1501-15da-4419-aa12-171e610438d6" Mar 10 21:55:48 crc kubenswrapper[4919]: I0310 21:55:48.550144 4919 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="fe152b3d-fa54-46eb-a57f-292c2340eadf" Mar 10 21:55:48 crc kubenswrapper[4919]: I0310 21:55:48.696804 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 21:55:48 crc kubenswrapper[4919]: I0310 21:55:48.707302 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 21:55:48 crc kubenswrapper[4919]: I0310 21:55:48.717610 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 21:55:49 crc kubenswrapper[4919]: W0310 21:55:49.304810 4919 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe485a1_e14f_4c09_b5b9_f252bc42b7e8.slice/crio-126d45b99c216331d1070802f2b4d2cd9619fb9f90eb21d74eef9495b574b282 WatchSource:0}: Error finding container 126d45b99c216331d1070802f2b4d2cd9619fb9f90eb21d74eef9495b574b282: Status 404 returned error can't find the container with id 126d45b99c216331d1070802f2b4d2cd9619fb9f90eb21d74eef9495b574b282 Mar 10 21:55:49 crc kubenswrapper[4919]: I0310 21:55:49.553040 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"5477db4f8d882dc3f338b7e9b265760fa68bd96898a79c5d34c9f8f713295818"} Mar 10 21:55:49 crc kubenswrapper[4919]: I0310 21:55:49.553126 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"126d45b99c216331d1070802f2b4d2cd9619fb9f90eb21d74eef9495b574b282"} Mar 10 21:55:49 crc kubenswrapper[4919]: I0310 21:55:49.558856 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"7a06c419f57bdbc717d9f708aa45c42c0fd9386060d310546a341c673f5b3867"} Mar 10 21:55:49 crc kubenswrapper[4919]: I0310 21:55:49.558945 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"cf77358d0c71ad6665b0dbaa8f40cf8554b5892615e4999aa8a57bb919faf2f2"} Mar 
10 21:55:49 crc kubenswrapper[4919]: I0310 21:55:49.559208 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 21:55:49 crc kubenswrapper[4919]: I0310 21:55:49.560937 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"6cf58d4f01c839ea973c3c70ec404309e328d1fb6ef11b5cf3907d4dc7dc7d8f"} Mar 10 21:55:49 crc kubenswrapper[4919]: I0310 21:55:49.561060 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"e2d4ff06832a2d020b22193e4514d34d6f95ff926433257795b938c2652cb5ac"} Mar 10 21:55:50 crc kubenswrapper[4919]: I0310 21:55:50.571627 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/0.log" Mar 10 21:55:50 crc kubenswrapper[4919]: I0310 21:55:50.571998 4919 generic.go:334] "Generic (PLEG): container finished" podID="9d751cbb-f2e2-430d-9754-c882a5e924a5" containerID="6cf58d4f01c839ea973c3c70ec404309e328d1fb6ef11b5cf3907d4dc7dc7d8f" exitCode=255 Mar 10 21:55:50 crc kubenswrapper[4919]: I0310 21:55:50.572112 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerDied","Data":"6cf58d4f01c839ea973c3c70ec404309e328d1fb6ef11b5cf3907d4dc7dc7d8f"} Mar 10 21:55:50 crc kubenswrapper[4919]: I0310 21:55:50.572825 4919 scope.go:117] "RemoveContainer" containerID="6cf58d4f01c839ea973c3c70ec404309e328d1fb6ef11b5cf3907d4dc7dc7d8f" Mar 10 21:55:51 crc kubenswrapper[4919]: I0310 21:55:51.583174 4919 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/0.log" Mar 10 21:55:51 crc kubenswrapper[4919]: I0310 21:55:51.583284 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"cc5b3d1057a414afa5a04ecb72df782026e24d398a00e3d8d1249c042fceeea7"} Mar 10 21:55:52 crc kubenswrapper[4919]: I0310 21:55:52.594054 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/1.log" Mar 10 21:55:52 crc kubenswrapper[4919]: I0310 21:55:52.595460 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/0.log" Mar 10 21:55:52 crc kubenswrapper[4919]: I0310 21:55:52.595626 4919 generic.go:334] "Generic (PLEG): container finished" podID="9d751cbb-f2e2-430d-9754-c882a5e924a5" containerID="cc5b3d1057a414afa5a04ecb72df782026e24d398a00e3d8d1249c042fceeea7" exitCode=255 Mar 10 21:55:52 crc kubenswrapper[4919]: I0310 21:55:52.595727 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerDied","Data":"cc5b3d1057a414afa5a04ecb72df782026e24d398a00e3d8d1249c042fceeea7"} Mar 10 21:55:52 crc kubenswrapper[4919]: I0310 21:55:52.595863 4919 scope.go:117] "RemoveContainer" containerID="6cf58d4f01c839ea973c3c70ec404309e328d1fb6ef11b5cf3907d4dc7dc7d8f" Mar 10 21:55:52 crc kubenswrapper[4919]: I0310 21:55:52.596332 4919 scope.go:117] "RemoveContainer" containerID="cc5b3d1057a414afa5a04ecb72df782026e24d398a00e3d8d1249c042fceeea7" Mar 10 
21:55:52 crc kubenswrapper[4919]: E0310 21:55:52.596667 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=check-endpoints pod=network-check-source-55646444c4-trplf_openshift-network-diagnostics(9d751cbb-f2e2-430d-9754-c882a5e924a5)\"" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 21:55:53 crc kubenswrapper[4919]: I0310 21:55:53.604703 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/1.log" Mar 10 21:55:53 crc kubenswrapper[4919]: I0310 21:55:53.605234 4919 scope.go:117] "RemoveContainer" containerID="cc5b3d1057a414afa5a04ecb72df782026e24d398a00e3d8d1249c042fceeea7" Mar 10 21:55:53 crc kubenswrapper[4919]: E0310 21:55:53.605575 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=check-endpoints pod=network-check-source-55646444c4-trplf_openshift-network-diagnostics(9d751cbb-f2e2-430d-9754-c882a5e924a5)\"" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 21:55:55 crc kubenswrapper[4919]: I0310 21:55:55.319034 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 21:55:56 crc kubenswrapper[4919]: I0310 21:55:56.744881 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 10 21:55:56 crc kubenswrapper[4919]: I0310 21:55:56.790766 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 10 21:55:56 crc kubenswrapper[4919]: 
I0310 21:55:56.797545 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 10 21:55:57 crc kubenswrapper[4919]: I0310 21:55:57.210199 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 10 21:55:57 crc kubenswrapper[4919]: I0310 21:55:57.678131 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 10 21:55:57 crc kubenswrapper[4919]: I0310 21:55:57.794494 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 10 21:55:57 crc kubenswrapper[4919]: I0310 21:55:57.865192 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 10 21:55:58 crc kubenswrapper[4919]: I0310 21:55:58.056007 4919 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 10 21:55:58 crc kubenswrapper[4919]: I0310 21:55:58.063994 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-route-controller-manager/route-controller-manager-7686d69b8c-dqlzg","openshift-controller-manager/controller-manager-688cc6dd75-6whhm","openshift-authentication/oauth-openshift-558db77b4-m9qd4"] Mar 10 21:55:58 crc kubenswrapper[4919]: I0310 21:55:58.064095 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-route-controller-manager/route-controller-manager-74785c5695-nzs2l","openshift-controller-manager/controller-manager-76b97b95d9-fmpzq"] Mar 10 21:55:58 crc kubenswrapper[4919]: E0310 21:55:58.064384 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b00c04d4-1287-409a-8e67-2edb888bf832" containerName="oauth-openshift" Mar 10 21:55:58 crc kubenswrapper[4919]: I0310 21:55:58.064469 
4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="b00c04d4-1287-409a-8e67-2edb888bf832" containerName="oauth-openshift" Mar 10 21:55:58 crc kubenswrapper[4919]: E0310 21:55:58.064500 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df1d5909-9408-4ac0-a762-5298b301ae59" containerName="installer" Mar 10 21:55:58 crc kubenswrapper[4919]: I0310 21:55:58.064515 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="df1d5909-9408-4ac0-a762-5298b301ae59" containerName="installer" Mar 10 21:55:58 crc kubenswrapper[4919]: I0310 21:55:58.064962 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="b00c04d4-1287-409a-8e67-2edb888bf832" containerName="oauth-openshift" Mar 10 21:55:58 crc kubenswrapper[4919]: I0310 21:55:58.065053 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="df1d5909-9408-4ac0-a762-5298b301ae59" containerName="installer" Mar 10 21:55:58 crc kubenswrapper[4919]: I0310 21:55:58.065713 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-74785c5695-nzs2l" Mar 10 21:55:58 crc kubenswrapper[4919]: I0310 21:55:58.065779 4919 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f9ed1501-15da-4419-aa12-171e610438d6" Mar 10 21:55:58 crc kubenswrapper[4919]: I0310 21:55:58.065848 4919 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f9ed1501-15da-4419-aa12-171e610438d6" Mar 10 21:55:58 crc kubenswrapper[4919]: I0310 21:55:58.066131 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-76b97b95d9-fmpzq" Mar 10 21:55:58 crc kubenswrapper[4919]: I0310 21:55:58.072234 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 21:55:58 crc kubenswrapper[4919]: I0310 21:55:58.072623 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 10 21:55:58 crc kubenswrapper[4919]: I0310 21:55:58.076451 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 10 21:55:58 crc kubenswrapper[4919]: I0310 21:55:58.076528 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 10 21:55:58 crc kubenswrapper[4919]: I0310 21:55:58.076568 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 10 21:55:58 crc kubenswrapper[4919]: I0310 21:55:58.076581 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 10 21:55:58 crc kubenswrapper[4919]: I0310 21:55:58.076697 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 10 21:55:58 crc kubenswrapper[4919]: I0310 21:55:58.076993 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 10 21:55:58 crc kubenswrapper[4919]: I0310 21:55:58.079183 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 10 21:55:58 crc kubenswrapper[4919]: I0310 21:55:58.079539 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 10 21:55:58 crc kubenswrapper[4919]: 
I0310 21:55:58.080160 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 10 21:55:58 crc kubenswrapper[4919]: I0310 21:55:58.082619 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Mar 10 21:55:58 crc kubenswrapper[4919]: I0310 21:55:58.082718 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Mar 10 21:55:58 crc kubenswrapper[4919]: I0310 21:55:58.086465 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 10 21:55:58 crc kubenswrapper[4919]: I0310 21:55:58.119259 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=12.119242525 podStartE2EDuration="12.119242525s" podCreationTimestamp="2026-03-10 21:55:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 21:55:58.116602003 +0000 UTC m=+345.358482631" watchObservedRunningTime="2026-03-10 21:55:58.119242525 +0000 UTC m=+345.361123133"
Mar 10 21:55:58 crc kubenswrapper[4919]: I0310 21:55:58.152774 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e98e5bbe-c1d3-415d-85f6-daa1867d6a95-serving-cert\") pod \"route-controller-manager-74785c5695-nzs2l\" (UID: \"e98e5bbe-c1d3-415d-85f6-daa1867d6a95\") " pod="openshift-route-controller-manager/route-controller-manager-74785c5695-nzs2l"
Mar 10 21:55:58 crc kubenswrapper[4919]: I0310 21:55:58.152863 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c5bfde0a-3ba2-44b8-a69d-ea5aa5551bf1-client-ca\") pod \"controller-manager-76b97b95d9-fmpzq\" (UID: \"c5bfde0a-3ba2-44b8-a69d-ea5aa5551bf1\") " pod="openshift-controller-manager/controller-manager-76b97b95d9-fmpzq"
Mar 10 21:55:58 crc kubenswrapper[4919]: I0310 21:55:58.152906 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e98e5bbe-c1d3-415d-85f6-daa1867d6a95-config\") pod \"route-controller-manager-74785c5695-nzs2l\" (UID: \"e98e5bbe-c1d3-415d-85f6-daa1867d6a95\") " pod="openshift-route-controller-manager/route-controller-manager-74785c5695-nzs2l"
Mar 10 21:55:58 crc kubenswrapper[4919]: I0310 21:55:58.152933 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5bfde0a-3ba2-44b8-a69d-ea5aa5551bf1-config\") pod \"controller-manager-76b97b95d9-fmpzq\" (UID: \"c5bfde0a-3ba2-44b8-a69d-ea5aa5551bf1\") " pod="openshift-controller-manager/controller-manager-76b97b95d9-fmpzq"
Mar 10 21:55:58 crc kubenswrapper[4919]: I0310 21:55:58.152963 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c5bfde0a-3ba2-44b8-a69d-ea5aa5551bf1-serving-cert\") pod \"controller-manager-76b97b95d9-fmpzq\" (UID: \"c5bfde0a-3ba2-44b8-a69d-ea5aa5551bf1\") " pod="openshift-controller-manager/controller-manager-76b97b95d9-fmpzq"
Mar 10 21:55:58 crc kubenswrapper[4919]: I0310 21:55:58.153015 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jltxk\" (UniqueName: \"kubernetes.io/projected/c5bfde0a-3ba2-44b8-a69d-ea5aa5551bf1-kube-api-access-jltxk\") pod \"controller-manager-76b97b95d9-fmpzq\" (UID: \"c5bfde0a-3ba2-44b8-a69d-ea5aa5551bf1\") " pod="openshift-controller-manager/controller-manager-76b97b95d9-fmpzq"
Mar 10 21:55:58 crc kubenswrapper[4919]: I0310 21:55:58.153164 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c5bfde0a-3ba2-44b8-a69d-ea5aa5551bf1-proxy-ca-bundles\") pod \"controller-manager-76b97b95d9-fmpzq\" (UID: \"c5bfde0a-3ba2-44b8-a69d-ea5aa5551bf1\") " pod="openshift-controller-manager/controller-manager-76b97b95d9-fmpzq"
Mar 10 21:55:58 crc kubenswrapper[4919]: I0310 21:55:58.153243 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rlvt5\" (UniqueName: \"kubernetes.io/projected/e98e5bbe-c1d3-415d-85f6-daa1867d6a95-kube-api-access-rlvt5\") pod \"route-controller-manager-74785c5695-nzs2l\" (UID: \"e98e5bbe-c1d3-415d-85f6-daa1867d6a95\") " pod="openshift-route-controller-manager/route-controller-manager-74785c5695-nzs2l"
Mar 10 21:55:58 crc kubenswrapper[4919]: I0310 21:55:58.153269 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e98e5bbe-c1d3-415d-85f6-daa1867d6a95-client-ca\") pod \"route-controller-manager-74785c5695-nzs2l\" (UID: \"e98e5bbe-c1d3-415d-85f6-daa1867d6a95\") " pod="openshift-route-controller-manager/route-controller-manager-74785c5695-nzs2l"
Mar 10 21:55:58 crc kubenswrapper[4919]: I0310 21:55:58.254276 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c5bfde0a-3ba2-44b8-a69d-ea5aa5551bf1-client-ca\") pod \"controller-manager-76b97b95d9-fmpzq\" (UID: \"c5bfde0a-3ba2-44b8-a69d-ea5aa5551bf1\") " pod="openshift-controller-manager/controller-manager-76b97b95d9-fmpzq"
Mar 10 21:55:58 crc kubenswrapper[4919]: I0310 21:55:58.254339 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e98e5bbe-c1d3-415d-85f6-daa1867d6a95-config\") pod \"route-controller-manager-74785c5695-nzs2l\" (UID: \"e98e5bbe-c1d3-415d-85f6-daa1867d6a95\") " pod="openshift-route-controller-manager/route-controller-manager-74785c5695-nzs2l"
Mar 10 21:55:58 crc kubenswrapper[4919]: I0310 21:55:58.254362 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5bfde0a-3ba2-44b8-a69d-ea5aa5551bf1-config\") pod \"controller-manager-76b97b95d9-fmpzq\" (UID: \"c5bfde0a-3ba2-44b8-a69d-ea5aa5551bf1\") " pod="openshift-controller-manager/controller-manager-76b97b95d9-fmpzq"
Mar 10 21:55:58 crc kubenswrapper[4919]: I0310 21:55:58.254384 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c5bfde0a-3ba2-44b8-a69d-ea5aa5551bf1-serving-cert\") pod \"controller-manager-76b97b95d9-fmpzq\" (UID: \"c5bfde0a-3ba2-44b8-a69d-ea5aa5551bf1\") " pod="openshift-controller-manager/controller-manager-76b97b95d9-fmpzq"
Mar 10 21:55:58 crc kubenswrapper[4919]: I0310 21:55:58.254445 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jltxk\" (UniqueName: \"kubernetes.io/projected/c5bfde0a-3ba2-44b8-a69d-ea5aa5551bf1-kube-api-access-jltxk\") pod \"controller-manager-76b97b95d9-fmpzq\" (UID: \"c5bfde0a-3ba2-44b8-a69d-ea5aa5551bf1\") " pod="openshift-controller-manager/controller-manager-76b97b95d9-fmpzq"
Mar 10 21:55:58 crc kubenswrapper[4919]: I0310 21:55:58.255246 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c5bfde0a-3ba2-44b8-a69d-ea5aa5551bf1-proxy-ca-bundles\") pod \"controller-manager-76b97b95d9-fmpzq\" (UID: \"c5bfde0a-3ba2-44b8-a69d-ea5aa5551bf1\") " pod="openshift-controller-manager/controller-manager-76b97b95d9-fmpzq"
Mar 10 21:55:58 crc kubenswrapper[4919]: I0310 21:55:58.255414 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c5bfde0a-3ba2-44b8-a69d-ea5aa5551bf1-client-ca\") pod \"controller-manager-76b97b95d9-fmpzq\" (UID: \"c5bfde0a-3ba2-44b8-a69d-ea5aa5551bf1\") " pod="openshift-controller-manager/controller-manager-76b97b95d9-fmpzq"
Mar 10 21:55:58 crc kubenswrapper[4919]: I0310 21:55:58.255386 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rlvt5\" (UniqueName: \"kubernetes.io/projected/e98e5bbe-c1d3-415d-85f6-daa1867d6a95-kube-api-access-rlvt5\") pod \"route-controller-manager-74785c5695-nzs2l\" (UID: \"e98e5bbe-c1d3-415d-85f6-daa1867d6a95\") " pod="openshift-route-controller-manager/route-controller-manager-74785c5695-nzs2l"
Mar 10 21:55:58 crc kubenswrapper[4919]: I0310 21:55:58.255478 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e98e5bbe-c1d3-415d-85f6-daa1867d6a95-client-ca\") pod \"route-controller-manager-74785c5695-nzs2l\" (UID: \"e98e5bbe-c1d3-415d-85f6-daa1867d6a95\") " pod="openshift-route-controller-manager/route-controller-manager-74785c5695-nzs2l"
Mar 10 21:55:58 crc kubenswrapper[4919]: I0310 21:55:58.255519 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e98e5bbe-c1d3-415d-85f6-daa1867d6a95-serving-cert\") pod \"route-controller-manager-74785c5695-nzs2l\" (UID: \"e98e5bbe-c1d3-415d-85f6-daa1867d6a95\") " pod="openshift-route-controller-manager/route-controller-manager-74785c5695-nzs2l"
Mar 10 21:55:58 crc kubenswrapper[4919]: I0310 21:55:58.256373 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5bfde0a-3ba2-44b8-a69d-ea5aa5551bf1-config\") pod \"controller-manager-76b97b95d9-fmpzq\" (UID: \"c5bfde0a-3ba2-44b8-a69d-ea5aa5551bf1\") " pod="openshift-controller-manager/controller-manager-76b97b95d9-fmpzq"
Mar 10 21:55:58 crc kubenswrapper[4919]: I0310 21:55:58.257962 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c5bfde0a-3ba2-44b8-a69d-ea5aa5551bf1-proxy-ca-bundles\") pod \"controller-manager-76b97b95d9-fmpzq\" (UID: \"c5bfde0a-3ba2-44b8-a69d-ea5aa5551bf1\") " pod="openshift-controller-manager/controller-manager-76b97b95d9-fmpzq"
Mar 10 21:55:58 crc kubenswrapper[4919]: I0310 21:55:58.258201 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e98e5bbe-c1d3-415d-85f6-daa1867d6a95-client-ca\") pod \"route-controller-manager-74785c5695-nzs2l\" (UID: \"e98e5bbe-c1d3-415d-85f6-daa1867d6a95\") " pod="openshift-route-controller-manager/route-controller-manager-74785c5695-nzs2l"
Mar 10 21:55:58 crc kubenswrapper[4919]: I0310 21:55:58.259253 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e98e5bbe-c1d3-415d-85f6-daa1867d6a95-config\") pod \"route-controller-manager-74785c5695-nzs2l\" (UID: \"e98e5bbe-c1d3-415d-85f6-daa1867d6a95\") " pod="openshift-route-controller-manager/route-controller-manager-74785c5695-nzs2l"
Mar 10 21:55:58 crc kubenswrapper[4919]: I0310 21:55:58.269069 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e98e5bbe-c1d3-415d-85f6-daa1867d6a95-serving-cert\") pod \"route-controller-manager-74785c5695-nzs2l\" (UID: \"e98e5bbe-c1d3-415d-85f6-daa1867d6a95\") " pod="openshift-route-controller-manager/route-controller-manager-74785c5695-nzs2l"
Mar 10 21:55:58 crc kubenswrapper[4919]: I0310 21:55:58.269244 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c5bfde0a-3ba2-44b8-a69d-ea5aa5551bf1-serving-cert\") pod \"controller-manager-76b97b95d9-fmpzq\" (UID: \"c5bfde0a-3ba2-44b8-a69d-ea5aa5551bf1\") " pod="openshift-controller-manager/controller-manager-76b97b95d9-fmpzq"
Mar 10 21:55:58 crc kubenswrapper[4919]: I0310 21:55:58.274352 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jltxk\" (UniqueName: \"kubernetes.io/projected/c5bfde0a-3ba2-44b8-a69d-ea5aa5551bf1-kube-api-access-jltxk\") pod \"controller-manager-76b97b95d9-fmpzq\" (UID: \"c5bfde0a-3ba2-44b8-a69d-ea5aa5551bf1\") " pod="openshift-controller-manager/controller-manager-76b97b95d9-fmpzq"
Mar 10 21:55:58 crc kubenswrapper[4919]: I0310 21:55:58.281659 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rlvt5\" (UniqueName: \"kubernetes.io/projected/e98e5bbe-c1d3-415d-85f6-daa1867d6a95-kube-api-access-rlvt5\") pod \"route-controller-manager-74785c5695-nzs2l\" (UID: \"e98e5bbe-c1d3-415d-85f6-daa1867d6a95\") " pod="openshift-route-controller-manager/route-controller-manager-74785c5695-nzs2l"
Mar 10 21:55:58 crc kubenswrapper[4919]: I0310 21:55:58.370548 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt"
Mar 10 21:55:58 crc kubenswrapper[4919]: I0310 21:55:58.393518 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-74785c5695-nzs2l"
Mar 10 21:55:58 crc kubenswrapper[4919]: I0310 21:55:58.403191 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-76b97b95d9-fmpzq"
Mar 10 21:55:58 crc kubenswrapper[4919]: I0310 21:55:58.734814 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Mar 10 21:55:58 crc kubenswrapper[4919]: I0310 21:55:58.822480 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca"
Mar 10 21:55:58 crc kubenswrapper[4919]: I0310 21:55:58.958635 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj"
Mar 10 21:55:59 crc kubenswrapper[4919]: I0310 21:55:59.300926 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Mar 10 21:55:59 crc kubenswrapper[4919]: I0310 21:55:59.338427 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Mar 10 21:55:59 crc kubenswrapper[4919]: I0310 21:55:59.431596 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Mar 10 21:55:59 crc kubenswrapper[4919]: I0310 21:55:59.464503 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-76b97b95d9-fmpzq"]
Mar 10 21:55:59 crc kubenswrapper[4919]: I0310 21:55:59.471273 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-74785c5695-nzs2l"]
Mar 10 21:55:59 crc kubenswrapper[4919]: I0310 21:55:59.493142 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b4f0884-7217-47f3-9b14-8a26e4bd53f5" path="/var/lib/kubelet/pods/2b4f0884-7217-47f3-9b14-8a26e4bd53f5/volumes"
Mar 10 21:55:59 crc kubenswrapper[4919]: I0310 21:55:59.494519 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b00c04d4-1287-409a-8e67-2edb888bf832" path="/var/lib/kubelet/pods/b00c04d4-1287-409a-8e67-2edb888bf832/volumes"
Mar 10 21:55:59 crc kubenswrapper[4919]: I0310 21:55:59.499228 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dbbd09f7-cbd2-4f40-8f39-0c5c8478d5a7" path="/var/lib/kubelet/pods/dbbd09f7-cbd2-4f40-8f39-0c5c8478d5a7/volumes"
Mar 10 21:55:59 crc kubenswrapper[4919]: I0310 21:55:59.640679 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt"
Mar 10 21:55:59 crc kubenswrapper[4919]: I0310 21:55:59.719737 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-74785c5695-nzs2l"]
Mar 10 21:55:59 crc kubenswrapper[4919]: W0310 21:55:59.727372 4919 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode98e5bbe_c1d3_415d_85f6_daa1867d6a95.slice/crio-fc0109b40c70e3377301bc899a21e20e15116221825cd0d472950d5a2ac4f5ec WatchSource:0}: Error finding container fc0109b40c70e3377301bc899a21e20e15116221825cd0d472950d5a2ac4f5ec: Status 404 returned error can't find the container with id fc0109b40c70e3377301bc899a21e20e15116221825cd0d472950d5a2ac4f5ec
Mar 10 21:55:59 crc kubenswrapper[4919]: I0310 21:55:59.769021 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Mar 10 21:55:59 crc kubenswrapper[4919]: I0310 21:55:59.857902 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist"
Mar 10 21:55:59 crc kubenswrapper[4919]: I0310 21:55:59.870921 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-76b97b95d9-fmpzq"]
Mar 10 21:55:59 crc kubenswrapper[4919]: W0310 21:55:59.874647 4919 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc5bfde0a_3ba2_44b8_a69d_ea5aa5551bf1.slice/crio-eb1902a3bcec9e62b9af2afcf9cdcabbd3770823e2af930ff5cfda97d44b128e WatchSource:0}: Error finding container eb1902a3bcec9e62b9af2afcf9cdcabbd3770823e2af930ff5cfda97d44b128e: Status 404 returned error can't find the container with id eb1902a3bcec9e62b9af2afcf9cdcabbd3770823e2af930ff5cfda97d44b128e
Mar 10 21:55:59 crc kubenswrapper[4919]: I0310 21:55:59.924129 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5"
Mar 10 21:55:59 crc kubenswrapper[4919]: I0310 21:55:59.965304 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls"
Mar 10 21:56:00 crc kubenswrapper[4919]: I0310 21:56:00.051914 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Mar 10 21:56:00 crc kubenswrapper[4919]: I0310 21:56:00.075667 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d"
Mar 10 21:56:00 crc kubenswrapper[4919]: I0310 21:56:00.124955 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt"
Mar 10 21:56:00 crc kubenswrapper[4919]: I0310 21:56:00.126166 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw"
Mar 10 21:56:00 crc kubenswrapper[4919]: I0310 21:56:00.259484 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Mar 10 21:56:00 crc kubenswrapper[4919]: I0310 21:56:00.302173 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Mar 10 21:56:00 crc kubenswrapper[4919]: I0310 21:56:00.422969 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt"
Mar 10 21:56:00 crc kubenswrapper[4919]: I0310 21:56:00.581773 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt"
Mar 10 21:56:00 crc kubenswrapper[4919]: I0310 21:56:00.658650 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-76b97b95d9-fmpzq" event={"ID":"c5bfde0a-3ba2-44b8-a69d-ea5aa5551bf1","Type":"ContainerStarted","Data":"2f760161472e81514642672e56519014db9653974e8acdc9e79b373df046f317"}
Mar 10 21:56:00 crc kubenswrapper[4919]: I0310 21:56:00.658724 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-76b97b95d9-fmpzq" event={"ID":"c5bfde0a-3ba2-44b8-a69d-ea5aa5551bf1","Type":"ContainerStarted","Data":"eb1902a3bcec9e62b9af2afcf9cdcabbd3770823e2af930ff5cfda97d44b128e"}
Mar 10 21:56:00 crc kubenswrapper[4919]: I0310 21:56:00.659138 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-76b97b95d9-fmpzq"
Mar 10 21:56:00 crc kubenswrapper[4919]: I0310 21:56:00.660520 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-74785c5695-nzs2l" event={"ID":"e98e5bbe-c1d3-415d-85f6-daa1867d6a95","Type":"ContainerStarted","Data":"52de8c4e9f81282ef3454cb80cc0086751e8013510aab8cb2cf5877cf85ea8fa"}
Mar 10 21:56:00 crc kubenswrapper[4919]: I0310 21:56:00.660599 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-74785c5695-nzs2l" event={"ID":"e98e5bbe-c1d3-415d-85f6-daa1867d6a95","Type":"ContainerStarted","Data":"fc0109b40c70e3377301bc899a21e20e15116221825cd0d472950d5a2ac4f5ec"}
Mar 10 21:56:00 crc kubenswrapper[4919]: I0310 21:56:00.661104 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-74785c5695-nzs2l"
Mar 10 21:56:00 crc kubenswrapper[4919]: I0310 21:56:00.671585 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-76b97b95d9-fmpzq"
Mar 10 21:56:00 crc kubenswrapper[4919]: I0310 21:56:00.688591 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-76b97b95d9-fmpzq" podStartSLOduration=36.688563771 podStartE2EDuration="36.688563771s" podCreationTimestamp="2026-03-10 21:55:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 21:56:00.684917532 +0000 UTC m=+347.926798150" watchObservedRunningTime="2026-03-10 21:56:00.688563771 +0000 UTC m=+347.930444449"
Mar 10 21:56:00 crc kubenswrapper[4919]: I0310 21:56:00.692483 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Mar 10 21:56:00 crc kubenswrapper[4919]: I0310 21:56:00.705571 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-74785c5695-nzs2l" podStartSLOduration=36.705545489 podStartE2EDuration="36.705545489s" podCreationTimestamp="2026-03-10 21:55:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 21:56:00.699356951 +0000 UTC m=+347.941237579" watchObservedRunningTime="2026-03-10 21:56:00.705545489 +0000 UTC m=+347.947426107"
Mar 10 21:56:00 crc kubenswrapper[4919]: I0310 21:56:00.810239 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Mar 10 21:56:00 crc kubenswrapper[4919]: I0310 21:56:00.900098 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Mar 10 21:56:00 crc kubenswrapper[4919]: I0310 21:56:00.932316 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Mar 10 21:56:01 crc kubenswrapper[4919]: I0310 21:56:01.119781 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Mar 10 21:56:01 crc kubenswrapper[4919]: I0310 21:56:01.167270 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
Mar 10 21:56:01 crc kubenswrapper[4919]: I0310 21:56:01.167434 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt"
Mar 10 21:56:01 crc kubenswrapper[4919]: I0310 21:56:01.208347 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Mar 10 21:56:01 crc kubenswrapper[4919]: I0310 21:56:01.269540 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq"
Mar 10 21:56:01 crc kubenswrapper[4919]: I0310 21:56:01.280036 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Mar 10 21:56:01 crc kubenswrapper[4919]: I0310 21:56:01.374242 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4"
Mar 10 21:56:01 crc kubenswrapper[4919]: I0310 21:56:01.453922 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key"
Mar 10 21:56:01 crc kubenswrapper[4919]: I0310 21:56:01.518890 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw"
Mar 10 21:56:01 crc kubenswrapper[4919]: I0310 21:56:01.550174 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Mar 10 21:56:01 crc kubenswrapper[4919]: I0310 21:56:01.574332 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Mar 10 21:56:01 crc kubenswrapper[4919]: I0310 21:56:01.605114 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Mar 10 21:56:01 crc kubenswrapper[4919]: I0310 21:56:01.648065 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt"
Mar 10 21:56:01 crc kubenswrapper[4919]: I0310 21:56:01.661083 4919 patch_prober.go:28] interesting pod/route-controller-manager-74785c5695-nzs2l container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.68:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 10 21:56:01 crc kubenswrapper[4919]: I0310 21:56:01.661189 4919 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-74785c5695-nzs2l" podUID="e98e5bbe-c1d3-415d-85f6-daa1867d6a95" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.68:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 10 21:56:01 crc kubenswrapper[4919]: I0310 21:56:01.684380 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86"
Mar 10 21:56:01 crc kubenswrapper[4919]: I0310 21:56:01.774023 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Mar 10 21:56:01 crc kubenswrapper[4919]: I0310 21:56:01.821122 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c"
Mar 10 21:56:01 crc kubenswrapper[4919]: I0310 21:56:01.823116 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Mar 10 21:56:01 crc kubenswrapper[4919]: I0310 21:56:01.881470 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw"
Mar 10 21:56:01 crc kubenswrapper[4919]: I0310 21:56:01.890487 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Mar 10 21:56:01 crc kubenswrapper[4919]: I0310 21:56:01.909352 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Mar 10 21:56:01 crc kubenswrapper[4919]: I0310 21:56:01.979358 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx"
Mar 10 21:56:02 crc kubenswrapper[4919]: I0310 21:56:02.101690 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Mar 10 21:56:02 crc kubenswrapper[4919]: I0310 21:56:02.179497 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Mar 10 21:56:02 crc kubenswrapper[4919]: I0310 21:56:02.254453 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls"
Mar 10 21:56:02 crc kubenswrapper[4919]: I0310 21:56:02.293091 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert"
Mar 10 21:56:02 crc kubenswrapper[4919]: I0310 21:56:02.324833 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt"
Mar 10 21:56:02 crc kubenswrapper[4919]: I0310 21:56:02.343865 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Mar 10 21:56:02 crc kubenswrapper[4919]: I0310 21:56:02.366849 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Mar 10 21:56:02 crc kubenswrapper[4919]: I0310 21:56:02.393163 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Mar 10 21:56:02 crc kubenswrapper[4919]: I0310 21:56:02.420529 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Mar 10 21:56:02 crc kubenswrapper[4919]: I0310 21:56:02.475371 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Mar 10 21:56:02 crc kubenswrapper[4919]: I0310 21:56:02.482363 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Mar 10 21:56:02 crc kubenswrapper[4919]: I0310 21:56:02.518160 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config"
Mar 10 21:56:02 crc kubenswrapper[4919]: I0310 21:56:02.606221 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Mar 10 21:56:02 crc kubenswrapper[4919]: I0310 21:56:02.674469 4919 patch_prober.go:28] interesting pod/route-controller-manager-74785c5695-nzs2l container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.68:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 10 21:56:02 crc kubenswrapper[4919]: I0310 21:56:02.674560 4919 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-74785c5695-nzs2l" podUID="e98e5bbe-c1d3-415d-85f6-daa1867d6a95" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.68:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 10 21:56:02 crc kubenswrapper[4919]: I0310 21:56:02.678897 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg"
Mar 10 21:56:02 crc kubenswrapper[4919]: I0310 21:56:02.816259 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Mar 10 21:56:02 crc kubenswrapper[4919]: I0310 21:56:02.888793 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Mar 10 21:56:02 crc kubenswrapper[4919]: I0310 21:56:02.931959 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Mar 10 21:56:02 crc kubenswrapper[4919]: I0310 21:56:02.979300 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Mar 10 21:56:03 crc kubenswrapper[4919]: I0310 21:56:03.104242 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt"
Mar 10 21:56:03 crc kubenswrapper[4919]: I0310 21:56:03.235127 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Mar 10 21:56:03 crc kubenswrapper[4919]: I0310 21:56:03.295661 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Mar 10 21:56:03 crc kubenswrapper[4919]: I0310 21:56:03.392496 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87"
Mar 10 21:56:03 crc kubenswrapper[4919]: I0310 21:56:03.473124 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx"
Mar 10 21:56:03 crc kubenswrapper[4919]: I0310 21:56:03.511678 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls"
Mar 10 21:56:03 crc kubenswrapper[4919]: I0310 21:56:03.527496 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7"
Mar 10 21:56:03 crc kubenswrapper[4919]: I0310 21:56:03.536241 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Mar 10 21:56:03 crc kubenswrapper[4919]: I0310 21:56:03.560335 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config"
Mar 10 21:56:03 crc kubenswrapper[4919]: I0310 21:56:03.632671 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt"
Mar 10 21:56:03 crc kubenswrapper[4919]: I0310 21:56:03.718893 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd"
Mar 10 21:56:03 crc kubenswrapper[4919]: I0310 21:56:03.791247 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token"
Mar 10 21:56:03 crc kubenswrapper[4919]: I0310 21:56:03.861589 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Mar 10 21:56:04 crc kubenswrapper[4919]: I0310 21:56:04.073961 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Mar 10 21:56:04 crc kubenswrapper[4919]: I0310 21:56:04.125949 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt"
Mar 10 21:56:04 crc kubenswrapper[4919]: I0310 21:56:04.194837 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt"
Mar 10 21:56:04 crc kubenswrapper[4919]: I0310 21:56:04.239994 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default"
Mar 10 21:56:04 crc kubenswrapper[4919]: I0310 21:56:04.291731 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt"
Mar 10 21:56:04 crc kubenswrapper[4919]: I0310 21:56:04.321783 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Mar 10 21:56:04 crc kubenswrapper[4919]: I0310 21:56:04.342343 4919 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Mar 10 21:56:04 crc kubenswrapper[4919]: I0310 21:56:04.343360 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Mar 10 21:56:04 crc kubenswrapper[4919]: I0310 21:56:04.345885 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Mar 10 21:56:04 crc kubenswrapper[4919]: I0310 21:56:04.377921 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Mar 10 21:56:04 crc kubenswrapper[4919]: I0310 21:56:04.385119 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Mar 10 21:56:04 crc kubenswrapper[4919]: I0310 21:56:04.413961 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Mar 10 21:56:04 crc kubenswrapper[4919]: I0310 21:56:04.423445 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt"
Mar 10 21:56:04 crc kubenswrapper[4919]: I0310 21:56:04.435055 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Mar 10 21:56:04 crc kubenswrapper[4919]: I0310 21:56:04.471706 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls"
Mar 10 21:56:04 crc kubenswrapper[4919]: I0310 21:56:04.479856 4919 scope.go:117] "RemoveContainer" containerID="cc5b3d1057a414afa5a04ecb72df782026e24d398a00e3d8d1249c042fceeea7"
Mar 10 21:56:04 crc kubenswrapper[4919]: I0310 21:56:04.572774 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Mar 10 21:56:04 crc kubenswrapper[4919]: I0310 21:56:04.610519 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt"
Mar 10 21:56:04 crc kubenswrapper[4919]: I0310 21:56:04.615519 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt"
Mar 10 21:56:04 crc kubenswrapper[4919]: I0310 21:56:04.637341 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret"
Mar 10 21:56:04 crc kubenswrapper[4919]: I0310 21:56:04.688636 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Mar 10 21:56:04 crc kubenswrapper[4919]: I0310 21:56:04.691419 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/1.log"
Mar
10 21:56:04 crc kubenswrapper[4919]: I0310 21:56:04.691474 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"3493bf564c69520e790e834264be4f162d4489001981df26b2ac79b29fb8d1ff"} Mar 10 21:56:04 crc kubenswrapper[4919]: I0310 21:56:04.696336 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 10 21:56:04 crc kubenswrapper[4919]: I0310 21:56:04.711708 4919 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 10 21:56:04 crc kubenswrapper[4919]: I0310 21:56:04.745762 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 10 21:56:04 crc kubenswrapper[4919]: I0310 21:56:04.832048 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 10 21:56:05 crc kubenswrapper[4919]: I0310 21:56:05.003071 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Mar 10 21:56:05 crc kubenswrapper[4919]: I0310 21:56:05.194133 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 10 21:56:05 crc kubenswrapper[4919]: I0310 21:56:05.318697 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 10 21:56:05 crc kubenswrapper[4919]: I0310 21:56:05.336202 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 10 21:56:05 crc kubenswrapper[4919]: I0310 21:56:05.397270 4919 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 10 21:56:05 crc kubenswrapper[4919]: I0310 21:56:05.519918 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 10 21:56:05 crc kubenswrapper[4919]: I0310 21:56:05.565343 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 10 21:56:05 crc kubenswrapper[4919]: I0310 21:56:05.593412 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 10 21:56:05 crc kubenswrapper[4919]: I0310 21:56:05.620522 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 10 21:56:05 crc kubenswrapper[4919]: I0310 21:56:05.674362 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 10 21:56:05 crc kubenswrapper[4919]: I0310 21:56:05.738674 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 10 21:56:05 crc kubenswrapper[4919]: I0310 21:56:05.794343 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 10 21:56:05 crc kubenswrapper[4919]: I0310 21:56:05.855760 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 10 21:56:05 crc kubenswrapper[4919]: I0310 21:56:05.876904 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 10 21:56:05 crc kubenswrapper[4919]: I0310 21:56:05.895563 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 10 21:56:05 crc 
kubenswrapper[4919]: I0310 21:56:05.900617 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 10 21:56:05 crc kubenswrapper[4919]: I0310 21:56:05.904063 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 10 21:56:05 crc kubenswrapper[4919]: I0310 21:56:05.920922 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 10 21:56:06 crc kubenswrapper[4919]: I0310 21:56:06.246514 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 10 21:56:06 crc kubenswrapper[4919]: I0310 21:56:06.281372 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 10 21:56:06 crc kubenswrapper[4919]: I0310 21:56:06.361954 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 10 21:56:06 crc kubenswrapper[4919]: I0310 21:56:06.433591 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 10 21:56:06 crc kubenswrapper[4919]: I0310 21:56:06.436248 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 10 21:56:06 crc kubenswrapper[4919]: I0310 21:56:06.458070 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 10 21:56:06 crc kubenswrapper[4919]: I0310 21:56:06.534343 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Mar 10 21:56:06 crc kubenswrapper[4919]: I0310 21:56:06.577996 4919 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-apiserver"/"etcd-client" Mar 10 21:56:06 crc kubenswrapper[4919]: I0310 21:56:06.689851 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 10 21:56:06 crc kubenswrapper[4919]: I0310 21:56:06.764223 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 10 21:56:06 crc kubenswrapper[4919]: I0310 21:56:06.816143 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 10 21:56:06 crc kubenswrapper[4919]: I0310 21:56:06.874751 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 10 21:56:06 crc kubenswrapper[4919]: I0310 21:56:06.893764 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 10 21:56:06 crc kubenswrapper[4919]: I0310 21:56:06.973642 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 10 21:56:06 crc kubenswrapper[4919]: I0310 21:56:06.975806 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 10 21:56:07 crc kubenswrapper[4919]: I0310 21:56:07.072680 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 10 21:56:07 crc kubenswrapper[4919]: I0310 21:56:07.139660 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 10 21:56:07 crc kubenswrapper[4919]: I0310 21:56:07.154013 4919 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 10 21:56:07 crc kubenswrapper[4919]: I0310 21:56:07.157700 4919 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 10 21:56:07 crc kubenswrapper[4919]: I0310 21:56:07.212095 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 10 21:56:07 crc kubenswrapper[4919]: I0310 21:56:07.270252 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 10 21:56:07 crc kubenswrapper[4919]: I0310 21:56:07.290724 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 10 21:56:07 crc kubenswrapper[4919]: I0310 21:56:07.291133 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 10 21:56:07 crc kubenswrapper[4919]: I0310 21:56:07.326806 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 10 21:56:07 crc kubenswrapper[4919]: I0310 21:56:07.328947 4919 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 10 21:56:07 crc kubenswrapper[4919]: I0310 21:56:07.377254 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 10 21:56:07 crc kubenswrapper[4919]: I0310 21:56:07.445572 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 10 21:56:07 crc kubenswrapper[4919]: I0310 21:56:07.508410 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 10 21:56:07 crc kubenswrapper[4919]: I0310 21:56:07.583589 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 10 21:56:07 crc kubenswrapper[4919]: I0310 
21:56:07.584048 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 10 21:56:07 crc kubenswrapper[4919]: I0310 21:56:07.603526 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 10 21:56:07 crc kubenswrapper[4919]: I0310 21:56:07.641814 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 10 21:56:07 crc kubenswrapper[4919]: I0310 21:56:07.676011 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 10 21:56:07 crc kubenswrapper[4919]: I0310 21:56:07.719151 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 10 21:56:07 crc kubenswrapper[4919]: I0310 21:56:07.855602 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 10 21:56:07 crc kubenswrapper[4919]: I0310 21:56:07.919843 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 10 21:56:07 crc kubenswrapper[4919]: I0310 21:56:07.985589 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 10 21:56:08 crc kubenswrapper[4919]: I0310 21:56:08.009018 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 10 21:56:08 crc kubenswrapper[4919]: I0310 21:56:08.063836 4919 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 10 21:56:08 crc kubenswrapper[4919]: I0310 21:56:08.064101 4919 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://24b584a28a2186d65e043c0ab4a204fc2032084ba8d89ab938057f2d0219986c" gracePeriod=5 Mar 10 21:56:08 crc kubenswrapper[4919]: I0310 21:56:08.098021 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 10 21:56:08 crc kubenswrapper[4919]: I0310 21:56:08.151764 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 10 21:56:08 crc kubenswrapper[4919]: I0310 21:56:08.179876 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 10 21:56:08 crc kubenswrapper[4919]: I0310 21:56:08.235220 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 10 21:56:08 crc kubenswrapper[4919]: I0310 21:56:08.240802 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 10 21:56:08 crc kubenswrapper[4919]: I0310 21:56:08.325233 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 10 21:56:08 crc kubenswrapper[4919]: I0310 21:56:08.335852 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 10 21:56:08 crc kubenswrapper[4919]: I0310 21:56:08.398453 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-74785c5695-nzs2l" Mar 10 21:56:08 crc kubenswrapper[4919]: I0310 21:56:08.432083 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 10 21:56:08 crc 
kubenswrapper[4919]: I0310 21:56:08.522333 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 10 21:56:08 crc kubenswrapper[4919]: I0310 21:56:08.524147 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 10 21:56:08 crc kubenswrapper[4919]: I0310 21:56:08.525402 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 10 21:56:08 crc kubenswrapper[4919]: I0310 21:56:08.564637 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 10 21:56:08 crc kubenswrapper[4919]: I0310 21:56:08.565609 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 10 21:56:08 crc kubenswrapper[4919]: I0310 21:56:08.669828 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 10 21:56:08 crc kubenswrapper[4919]: I0310 21:56:08.674000 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 10 21:56:08 crc kubenswrapper[4919]: I0310 21:56:08.863252 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 10 21:56:08 crc kubenswrapper[4919]: I0310 21:56:08.871160 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 10 21:56:08 crc kubenswrapper[4919]: I0310 21:56:08.877902 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Mar 10 21:56:08 crc kubenswrapper[4919]: I0310 21:56:08.928342 4919 reflector.go:368] Caches populated for *v1.Secret 
from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 10 21:56:08 crc kubenswrapper[4919]: I0310 21:56:08.938697 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 10 21:56:09 crc kubenswrapper[4919]: I0310 21:56:09.120854 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 10 21:56:09 crc kubenswrapper[4919]: I0310 21:56:09.207249 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 10 21:56:09 crc kubenswrapper[4919]: I0310 21:56:09.261362 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 10 21:56:09 crc kubenswrapper[4919]: I0310 21:56:09.300467 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 10 21:56:09 crc kubenswrapper[4919]: I0310 21:56:09.468852 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 10 21:56:09 crc kubenswrapper[4919]: I0310 21:56:09.538442 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 10 21:56:09 crc kubenswrapper[4919]: I0310 21:56:09.598023 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 10 21:56:09 crc kubenswrapper[4919]: I0310 21:56:09.616363 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 10 21:56:09 crc kubenswrapper[4919]: I0310 21:56:09.660356 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 10 21:56:09 
crc kubenswrapper[4919]: I0310 21:56:09.752670 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 10 21:56:09 crc kubenswrapper[4919]: I0310 21:56:09.870512 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 10 21:56:09 crc kubenswrapper[4919]: I0310 21:56:09.930841 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 10 21:56:10 crc kubenswrapper[4919]: I0310 21:56:10.008029 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 10 21:56:10 crc kubenswrapper[4919]: I0310 21:56:10.018319 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552996-lcsjs"] Mar 10 21:56:10 crc kubenswrapper[4919]: E0310 21:56:10.018597 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 10 21:56:10 crc kubenswrapper[4919]: I0310 21:56:10.018631 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 10 21:56:10 crc kubenswrapper[4919]: I0310 21:56:10.018765 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 10 21:56:10 crc kubenswrapper[4919]: I0310 21:56:10.019220 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552996-lcsjs" Mar 10 21:56:10 crc kubenswrapper[4919]: I0310 21:56:10.022147 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 21:56:10 crc kubenswrapper[4919]: I0310 21:56:10.022375 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wbvtv" Mar 10 21:56:10 crc kubenswrapper[4919]: I0310 21:56:10.024065 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 21:56:10 crc kubenswrapper[4919]: I0310 21:56:10.031557 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552996-lcsjs"] Mar 10 21:56:10 crc kubenswrapper[4919]: I0310 21:56:10.078679 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 10 21:56:10 crc kubenswrapper[4919]: I0310 21:56:10.105980 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mh96\" (UniqueName: \"kubernetes.io/projected/1b9af79e-aa31-499c-b948-7e05f1bf1f7a-kube-api-access-2mh96\") pod \"auto-csr-approver-29552996-lcsjs\" (UID: \"1b9af79e-aa31-499c-b948-7e05f1bf1f7a\") " pod="openshift-infra/auto-csr-approver-29552996-lcsjs" Mar 10 21:56:10 crc kubenswrapper[4919]: I0310 21:56:10.144619 4919 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 10 21:56:10 crc kubenswrapper[4919]: I0310 21:56:10.207494 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2mh96\" (UniqueName: \"kubernetes.io/projected/1b9af79e-aa31-499c-b948-7e05f1bf1f7a-kube-api-access-2mh96\") pod \"auto-csr-approver-29552996-lcsjs\" (UID: \"1b9af79e-aa31-499c-b948-7e05f1bf1f7a\") " 
pod="openshift-infra/auto-csr-approver-29552996-lcsjs" Mar 10 21:56:10 crc kubenswrapper[4919]: I0310 21:56:10.230011 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 10 21:56:10 crc kubenswrapper[4919]: I0310 21:56:10.241178 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2mh96\" (UniqueName: \"kubernetes.io/projected/1b9af79e-aa31-499c-b948-7e05f1bf1f7a-kube-api-access-2mh96\") pod \"auto-csr-approver-29552996-lcsjs\" (UID: \"1b9af79e-aa31-499c-b948-7e05f1bf1f7a\") " pod="openshift-infra/auto-csr-approver-29552996-lcsjs" Mar 10 21:56:10 crc kubenswrapper[4919]: I0310 21:56:10.347058 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552996-lcsjs" Mar 10 21:56:10 crc kubenswrapper[4919]: I0310 21:56:10.412987 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 10 21:56:10 crc kubenswrapper[4919]: I0310 21:56:10.750094 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 10 21:56:10 crc kubenswrapper[4919]: I0310 21:56:10.800950 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552996-lcsjs"] Mar 10 21:56:10 crc kubenswrapper[4919]: W0310 21:56:10.804083 4919 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1b9af79e_aa31_499c_b948_7e05f1bf1f7a.slice/crio-61c23270295b80a340eeea96f189e949102316a89d7763e6cecde668daab089b WatchSource:0}: Error finding container 61c23270295b80a340eeea96f189e949102316a89d7763e6cecde668daab089b: Status 404 returned error can't find the container with id 61c23270295b80a340eeea96f189e949102316a89d7763e6cecde668daab089b Mar 10 21:56:10 crc kubenswrapper[4919]: I0310 
21:56:10.879466 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 10 21:56:10 crc kubenswrapper[4919]: I0310 21:56:10.961238 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 10 21:56:10 crc kubenswrapper[4919]: I0310 21:56:10.998780 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 10 21:56:11 crc kubenswrapper[4919]: I0310 21:56:11.021600 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 10 21:56:11 crc kubenswrapper[4919]: I0310 21:56:11.062308 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 10 21:56:11 crc kubenswrapper[4919]: I0310 21:56:11.121309 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 10 21:56:11 crc kubenswrapper[4919]: I0310 21:56:11.129933 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 10 21:56:11 crc kubenswrapper[4919]: I0310 21:56:11.220808 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 10 21:56:11 crc kubenswrapper[4919]: I0310 21:56:11.439041 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 10 21:56:11 crc kubenswrapper[4919]: I0310 21:56:11.640052 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 10 21:56:11 crc kubenswrapper[4919]: I0310 21:56:11.729786 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29552996-lcsjs" event={"ID":"1b9af79e-aa31-499c-b948-7e05f1bf1f7a","Type":"ContainerStarted","Data":"61c23270295b80a340eeea96f189e949102316a89d7763e6cecde668daab089b"} Mar 10 21:56:11 crc kubenswrapper[4919]: I0310 21:56:11.826114 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 10 21:56:11 crc kubenswrapper[4919]: I0310 21:56:11.995204 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 10 21:56:12 crc kubenswrapper[4919]: I0310 21:56:12.176881 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 10 21:56:12 crc kubenswrapper[4919]: I0310 21:56:12.194107 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 10 21:56:12 crc kubenswrapper[4919]: I0310 21:56:12.614372 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 10 21:56:12 crc kubenswrapper[4919]: I0310 21:56:12.633376 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 10 21:56:13 crc kubenswrapper[4919]: I0310 21:56:13.301272 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 10 21:56:13 crc kubenswrapper[4919]: I0310 21:56:13.661089 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 10 21:56:13 crc kubenswrapper[4919]: I0310 21:56:13.661167 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 10 21:56:13 crc kubenswrapper[4919]: I0310 21:56:13.743284 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log"
Mar 10 21:56:13 crc kubenswrapper[4919]: I0310 21:56:13.743422 4919 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="24b584a28a2186d65e043c0ab4a204fc2032084ba8d89ab938057f2d0219986c" exitCode=137
Mar 10 21:56:13 crc kubenswrapper[4919]: I0310 21:56:13.743475 4919 scope.go:117] "RemoveContainer" containerID="24b584a28a2186d65e043c0ab4a204fc2032084ba8d89ab938057f2d0219986c"
Mar 10 21:56:13 crc kubenswrapper[4919]: I0310 21:56:13.743516 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 10 21:56:13 crc kubenswrapper[4919]: I0310 21:56:13.764173 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Mar 10 21:56:13 crc kubenswrapper[4919]: I0310 21:56:13.764229 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Mar 10 21:56:13 crc kubenswrapper[4919]: I0310 21:56:13.764254 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Mar 10 21:56:13 crc kubenswrapper[4919]: I0310 21:56:13.764309 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Mar 10 21:56:13 crc kubenswrapper[4919]: I0310 21:56:13.764344 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Mar 10 21:56:13 crc kubenswrapper[4919]: I0310 21:56:13.764376 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 10 21:56:13 crc kubenswrapper[4919]: I0310 21:56:13.764412 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 10 21:56:13 crc kubenswrapper[4919]: I0310 21:56:13.764412 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 10 21:56:13 crc kubenswrapper[4919]: I0310 21:56:13.764493 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 10 21:56:13 crc kubenswrapper[4919]: I0310 21:56:13.764663 4919 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\""
Mar 10 21:56:13 crc kubenswrapper[4919]: I0310 21:56:13.764688 4919 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\""
Mar 10 21:56:13 crc kubenswrapper[4919]: I0310 21:56:13.764706 4919 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\""
Mar 10 21:56:13 crc kubenswrapper[4919]: I0310 21:56:13.764725 4919 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\""
Mar 10 21:56:13 crc kubenswrapper[4919]: I0310 21:56:13.765615 4919 scope.go:117] "RemoveContainer" containerID="24b584a28a2186d65e043c0ab4a204fc2032084ba8d89ab938057f2d0219986c"
Mar 10 21:56:13 crc kubenswrapper[4919]: E0310 21:56:13.766010 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"24b584a28a2186d65e043c0ab4a204fc2032084ba8d89ab938057f2d0219986c\": container with ID starting with 24b584a28a2186d65e043c0ab4a204fc2032084ba8d89ab938057f2d0219986c not found: ID does not exist" containerID="24b584a28a2186d65e043c0ab4a204fc2032084ba8d89ab938057f2d0219986c"
Mar 10 21:56:13 crc kubenswrapper[4919]: I0310 21:56:13.766230 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24b584a28a2186d65e043c0ab4a204fc2032084ba8d89ab938057f2d0219986c"} err="failed to get container status \"24b584a28a2186d65e043c0ab4a204fc2032084ba8d89ab938057f2d0219986c\": rpc error: code = NotFound desc = could not find container \"24b584a28a2186d65e043c0ab4a204fc2032084ba8d89ab938057f2d0219986c\": container with ID starting with 24b584a28a2186d65e043c0ab4a204fc2032084ba8d89ab938057f2d0219986c not found: ID does not exist"
Mar 10 21:56:13 crc kubenswrapper[4919]: I0310 21:56:13.772473 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 10 21:56:13 crc kubenswrapper[4919]: I0310 21:56:13.865551 4919 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\""
Mar 10 21:56:14 crc kubenswrapper[4919]: I0310 21:56:14.146839 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Mar 10 21:56:15 crc kubenswrapper[4919]: I0310 21:56:15.491801 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes"
Mar 10 21:56:17 crc kubenswrapper[4919]: I0310 21:56:17.779959 4919 generic.go:334] "Generic (PLEG): container finished" podID="1b9af79e-aa31-499c-b948-7e05f1bf1f7a" containerID="f0dd855d39a25655496478ab54dc29196521a8f74a3a5d4726502eb56102cc15" exitCode=0
Mar 10 21:56:17 crc kubenswrapper[4919]: I0310 21:56:17.780514 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552996-lcsjs" event={"ID":"1b9af79e-aa31-499c-b948-7e05f1bf1f7a","Type":"ContainerDied","Data":"f0dd855d39a25655496478ab54dc29196521a8f74a3a5d4726502eb56102cc15"}
Mar 10 21:56:19 crc kubenswrapper[4919]: I0310 21:56:19.138208 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552996-lcsjs"
Mar 10 21:56:19 crc kubenswrapper[4919]: I0310 21:56:19.238991 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2mh96\" (UniqueName: \"kubernetes.io/projected/1b9af79e-aa31-499c-b948-7e05f1bf1f7a-kube-api-access-2mh96\") pod \"1b9af79e-aa31-499c-b948-7e05f1bf1f7a\" (UID: \"1b9af79e-aa31-499c-b948-7e05f1bf1f7a\") "
Mar 10 21:56:19 crc kubenswrapper[4919]: I0310 21:56:19.244573 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b9af79e-aa31-499c-b948-7e05f1bf1f7a-kube-api-access-2mh96" (OuterVolumeSpecName: "kube-api-access-2mh96") pod "1b9af79e-aa31-499c-b948-7e05f1bf1f7a" (UID: "1b9af79e-aa31-499c-b948-7e05f1bf1f7a"). InnerVolumeSpecName "kube-api-access-2mh96". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 21:56:19 crc kubenswrapper[4919]: I0310 21:56:19.340689 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2mh96\" (UniqueName: \"kubernetes.io/projected/1b9af79e-aa31-499c-b948-7e05f1bf1f7a-kube-api-access-2mh96\") on node \"crc\" DevicePath \"\""
Mar 10 21:56:19 crc kubenswrapper[4919]: I0310 21:56:19.793290 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552996-lcsjs" event={"ID":"1b9af79e-aa31-499c-b948-7e05f1bf1f7a","Type":"ContainerDied","Data":"61c23270295b80a340eeea96f189e949102316a89d7763e6cecde668daab089b"}
Mar 10 21:56:19 crc kubenswrapper[4919]: I0310 21:56:19.793746 4919 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="61c23270295b80a340eeea96f189e949102316a89d7763e6cecde668daab089b"
Mar 10 21:56:19 crc kubenswrapper[4919]: I0310 21:56:19.793846 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552996-lcsjs"
Mar 10 21:56:20 crc kubenswrapper[4919]: I0310 21:56:20.903107 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-79fc7cbfc-8hk8l"]
Mar 10 21:56:20 crc kubenswrapper[4919]: E0310 21:56:20.903460 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b9af79e-aa31-499c-b948-7e05f1bf1f7a" containerName="oc"
Mar 10 21:56:20 crc kubenswrapper[4919]: I0310 21:56:20.903480 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b9af79e-aa31-499c-b948-7e05f1bf1f7a" containerName="oc"
Mar 10 21:56:20 crc kubenswrapper[4919]: I0310 21:56:20.903639 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b9af79e-aa31-499c-b948-7e05f1bf1f7a" containerName="oc"
Mar 10 21:56:20 crc kubenswrapper[4919]: I0310 21:56:20.904288 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-79fc7cbfc-8hk8l"
Mar 10 21:56:20 crc kubenswrapper[4919]: I0310 21:56:20.908310 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Mar 10 21:56:20 crc kubenswrapper[4919]: I0310 21:56:20.917778 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Mar 10 21:56:20 crc kubenswrapper[4919]: I0310 21:56:20.917938 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Mar 10 21:56:20 crc kubenswrapper[4919]: I0310 21:56:20.918973 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Mar 10 21:56:20 crc kubenswrapper[4919]: I0310 21:56:20.920225 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Mar 10 21:56:20 crc kubenswrapper[4919]: I0310 21:56:20.920417 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Mar 10 21:56:20 crc kubenswrapper[4919]: I0310 21:56:20.920609 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Mar 10 21:56:20 crc kubenswrapper[4919]: I0310 21:56:20.920817 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Mar 10 21:56:20 crc kubenswrapper[4919]: I0310 21:56:20.921020 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-79fc7cbfc-8hk8l"]
Mar 10 21:56:20 crc kubenswrapper[4919]: I0310 21:56:20.921470 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Mar 10 21:56:20 crc kubenswrapper[4919]: I0310 21:56:20.921593 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Mar 10 21:56:20 crc kubenswrapper[4919]: I0310 21:56:20.921965 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Mar 10 21:56:20 crc kubenswrapper[4919]: I0310 21:56:20.922727 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Mar 10 21:56:20 crc kubenswrapper[4919]: I0310 21:56:20.930183 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Mar 10 21:56:20 crc kubenswrapper[4919]: I0310 21:56:20.935611 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Mar 10 21:56:20 crc kubenswrapper[4919]: I0310 21:56:20.938895 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template"
Mar 10 21:56:21 crc kubenswrapper[4919]: I0310 21:56:21.070614 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/eccf8123-9a31-40d0-afa5-6207adb51c2d-v4-0-config-system-session\") pod \"oauth-openshift-79fc7cbfc-8hk8l\" (UID: \"eccf8123-9a31-40d0-afa5-6207adb51c2d\") " pod="openshift-authentication/oauth-openshift-79fc7cbfc-8hk8l"
Mar 10 21:56:21 crc kubenswrapper[4919]: I0310 21:56:21.070693 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/eccf8123-9a31-40d0-afa5-6207adb51c2d-v4-0-config-user-template-error\") pod \"oauth-openshift-79fc7cbfc-8hk8l\" (UID: \"eccf8123-9a31-40d0-afa5-6207adb51c2d\") " pod="openshift-authentication/oauth-openshift-79fc7cbfc-8hk8l"
Mar 10 21:56:21 crc kubenswrapper[4919]: I0310 21:56:21.070740 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/eccf8123-9a31-40d0-afa5-6207adb51c2d-v4-0-config-system-cliconfig\") pod \"oauth-openshift-79fc7cbfc-8hk8l\" (UID: \"eccf8123-9a31-40d0-afa5-6207adb51c2d\") " pod="openshift-authentication/oauth-openshift-79fc7cbfc-8hk8l"
Mar 10 21:56:21 crc kubenswrapper[4919]: I0310 21:56:21.070782 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/eccf8123-9a31-40d0-afa5-6207adb51c2d-v4-0-config-system-service-ca\") pod \"oauth-openshift-79fc7cbfc-8hk8l\" (UID: \"eccf8123-9a31-40d0-afa5-6207adb51c2d\") " pod="openshift-authentication/oauth-openshift-79fc7cbfc-8hk8l"
Mar 10 21:56:21 crc kubenswrapper[4919]: I0310 21:56:21.070824 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7wvq\" (UniqueName: \"kubernetes.io/projected/eccf8123-9a31-40d0-afa5-6207adb51c2d-kube-api-access-n7wvq\") pod \"oauth-openshift-79fc7cbfc-8hk8l\" (UID: \"eccf8123-9a31-40d0-afa5-6207adb51c2d\") " pod="openshift-authentication/oauth-openshift-79fc7cbfc-8hk8l"
Mar 10 21:56:21 crc kubenswrapper[4919]: I0310 21:56:21.071023 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eccf8123-9a31-40d0-afa5-6207adb51c2d-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-79fc7cbfc-8hk8l\" (UID: \"eccf8123-9a31-40d0-afa5-6207adb51c2d\") " pod="openshift-authentication/oauth-openshift-79fc7cbfc-8hk8l"
Mar 10 21:56:21 crc kubenswrapper[4919]: I0310 21:56:21.071152 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/eccf8123-9a31-40d0-afa5-6207adb51c2d-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-79fc7cbfc-8hk8l\" (UID: \"eccf8123-9a31-40d0-afa5-6207adb51c2d\") " pod="openshift-authentication/oauth-openshift-79fc7cbfc-8hk8l"
Mar 10 21:56:21 crc kubenswrapper[4919]: I0310 21:56:21.071217 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/eccf8123-9a31-40d0-afa5-6207adb51c2d-v4-0-config-user-template-login\") pod \"oauth-openshift-79fc7cbfc-8hk8l\" (UID: \"eccf8123-9a31-40d0-afa5-6207adb51c2d\") " pod="openshift-authentication/oauth-openshift-79fc7cbfc-8hk8l"
Mar 10 21:56:21 crc kubenswrapper[4919]: I0310 21:56:21.071267 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/eccf8123-9a31-40d0-afa5-6207adb51c2d-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-79fc7cbfc-8hk8l\" (UID: \"eccf8123-9a31-40d0-afa5-6207adb51c2d\") " pod="openshift-authentication/oauth-openshift-79fc7cbfc-8hk8l"
Mar 10 21:56:21 crc kubenswrapper[4919]: I0310 21:56:21.071320 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/eccf8123-9a31-40d0-afa5-6207adb51c2d-audit-policies\") pod \"oauth-openshift-79fc7cbfc-8hk8l\" (UID: \"eccf8123-9a31-40d0-afa5-6207adb51c2d\") " pod="openshift-authentication/oauth-openshift-79fc7cbfc-8hk8l"
Mar 10 21:56:21 crc kubenswrapper[4919]: I0310 21:56:21.071465 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/eccf8123-9a31-40d0-afa5-6207adb51c2d-v4-0-config-system-serving-cert\") pod \"oauth-openshift-79fc7cbfc-8hk8l\" (UID: \"eccf8123-9a31-40d0-afa5-6207adb51c2d\") " pod="openshift-authentication/oauth-openshift-79fc7cbfc-8hk8l"
Mar 10 21:56:21 crc kubenswrapper[4919]: I0310 21:56:21.071528 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/eccf8123-9a31-40d0-afa5-6207adb51c2d-audit-dir\") pod \"oauth-openshift-79fc7cbfc-8hk8l\" (UID: \"eccf8123-9a31-40d0-afa5-6207adb51c2d\") " pod="openshift-authentication/oauth-openshift-79fc7cbfc-8hk8l"
Mar 10 21:56:21 crc kubenswrapper[4919]: I0310 21:56:21.071580 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/eccf8123-9a31-40d0-afa5-6207adb51c2d-v4-0-config-system-router-certs\") pod \"oauth-openshift-79fc7cbfc-8hk8l\" (UID: \"eccf8123-9a31-40d0-afa5-6207adb51c2d\") " pod="openshift-authentication/oauth-openshift-79fc7cbfc-8hk8l"
Mar 10 21:56:21 crc kubenswrapper[4919]: I0310 21:56:21.071652 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/eccf8123-9a31-40d0-afa5-6207adb51c2d-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-79fc7cbfc-8hk8l\" (UID: \"eccf8123-9a31-40d0-afa5-6207adb51c2d\") " pod="openshift-authentication/oauth-openshift-79fc7cbfc-8hk8l"
Mar 10 21:56:21 crc kubenswrapper[4919]: I0310 21:56:21.173321 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/eccf8123-9a31-40d0-afa5-6207adb51c2d-v4-0-config-system-serving-cert\") pod \"oauth-openshift-79fc7cbfc-8hk8l\" (UID: \"eccf8123-9a31-40d0-afa5-6207adb51c2d\") " pod="openshift-authentication/oauth-openshift-79fc7cbfc-8hk8l"
Mar 10 21:56:21 crc kubenswrapper[4919]: I0310 21:56:21.173474 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/eccf8123-9a31-40d0-afa5-6207adb51c2d-audit-dir\") pod \"oauth-openshift-79fc7cbfc-8hk8l\" (UID: \"eccf8123-9a31-40d0-afa5-6207adb51c2d\") " pod="openshift-authentication/oauth-openshift-79fc7cbfc-8hk8l"
Mar 10 21:56:21 crc kubenswrapper[4919]: I0310 21:56:21.173538 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/eccf8123-9a31-40d0-afa5-6207adb51c2d-v4-0-config-system-router-certs\") pod \"oauth-openshift-79fc7cbfc-8hk8l\" (UID: \"eccf8123-9a31-40d0-afa5-6207adb51c2d\") " pod="openshift-authentication/oauth-openshift-79fc7cbfc-8hk8l"
Mar 10 21:56:21 crc kubenswrapper[4919]: I0310 21:56:21.173563 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/eccf8123-9a31-40d0-afa5-6207adb51c2d-audit-dir\") pod \"oauth-openshift-79fc7cbfc-8hk8l\" (UID: \"eccf8123-9a31-40d0-afa5-6207adb51c2d\") " pod="openshift-authentication/oauth-openshift-79fc7cbfc-8hk8l"
Mar 10 21:56:21 crc kubenswrapper[4919]: I0310 21:56:21.173599 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/eccf8123-9a31-40d0-afa5-6207adb51c2d-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-79fc7cbfc-8hk8l\" (UID: \"eccf8123-9a31-40d0-afa5-6207adb51c2d\") " pod="openshift-authentication/oauth-openshift-79fc7cbfc-8hk8l"
Mar 10 21:56:21 crc kubenswrapper[4919]: I0310 21:56:21.173660 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/eccf8123-9a31-40d0-afa5-6207adb51c2d-v4-0-config-system-session\") pod \"oauth-openshift-79fc7cbfc-8hk8l\" (UID: \"eccf8123-9a31-40d0-afa5-6207adb51c2d\") " pod="openshift-authentication/oauth-openshift-79fc7cbfc-8hk8l"
Mar 10 21:56:21 crc kubenswrapper[4919]: I0310 21:56:21.173723 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/eccf8123-9a31-40d0-afa5-6207adb51c2d-v4-0-config-user-template-error\") pod \"oauth-openshift-79fc7cbfc-8hk8l\" (UID: \"eccf8123-9a31-40d0-afa5-6207adb51c2d\") " pod="openshift-authentication/oauth-openshift-79fc7cbfc-8hk8l"
Mar 10 21:56:21 crc kubenswrapper[4919]: I0310 21:56:21.173775 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/eccf8123-9a31-40d0-afa5-6207adb51c2d-v4-0-config-system-cliconfig\") pod \"oauth-openshift-79fc7cbfc-8hk8l\" (UID: \"eccf8123-9a31-40d0-afa5-6207adb51c2d\") " pod="openshift-authentication/oauth-openshift-79fc7cbfc-8hk8l"
Mar 10 21:56:21 crc kubenswrapper[4919]: I0310 21:56:21.173829 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/eccf8123-9a31-40d0-afa5-6207adb51c2d-v4-0-config-system-service-ca\") pod \"oauth-openshift-79fc7cbfc-8hk8l\" (UID: \"eccf8123-9a31-40d0-afa5-6207adb51c2d\") " pod="openshift-authentication/oauth-openshift-79fc7cbfc-8hk8l"
Mar 10 21:56:21 crc kubenswrapper[4919]: I0310 21:56:21.173879 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n7wvq\" (UniqueName: \"kubernetes.io/projected/eccf8123-9a31-40d0-afa5-6207adb51c2d-kube-api-access-n7wvq\") pod \"oauth-openshift-79fc7cbfc-8hk8l\" (UID: \"eccf8123-9a31-40d0-afa5-6207adb51c2d\") " pod="openshift-authentication/oauth-openshift-79fc7cbfc-8hk8l"
Mar 10 21:56:21 crc kubenswrapper[4919]: I0310 21:56:21.173959 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eccf8123-9a31-40d0-afa5-6207adb51c2d-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-79fc7cbfc-8hk8l\" (UID: \"eccf8123-9a31-40d0-afa5-6207adb51c2d\") " pod="openshift-authentication/oauth-openshift-79fc7cbfc-8hk8l"
Mar 10 21:56:21 crc kubenswrapper[4919]: I0310 21:56:21.174051 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/eccf8123-9a31-40d0-afa5-6207adb51c2d-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-79fc7cbfc-8hk8l\" (UID: \"eccf8123-9a31-40d0-afa5-6207adb51c2d\") " pod="openshift-authentication/oauth-openshift-79fc7cbfc-8hk8l"
Mar 10 21:56:21 crc kubenswrapper[4919]: I0310 21:56:21.174113 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/eccf8123-9a31-40d0-afa5-6207adb51c2d-v4-0-config-user-template-login\") pod \"oauth-openshift-79fc7cbfc-8hk8l\" (UID: \"eccf8123-9a31-40d0-afa5-6207adb51c2d\") " pod="openshift-authentication/oauth-openshift-79fc7cbfc-8hk8l"
Mar 10 21:56:21 crc kubenswrapper[4919]: I0310 21:56:21.174171 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/eccf8123-9a31-40d0-afa5-6207adb51c2d-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-79fc7cbfc-8hk8l\" (UID: \"eccf8123-9a31-40d0-afa5-6207adb51c2d\") " pod="openshift-authentication/oauth-openshift-79fc7cbfc-8hk8l"
Mar 10 21:56:21 crc kubenswrapper[4919]: I0310 21:56:21.174228 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/eccf8123-9a31-40d0-afa5-6207adb51c2d-audit-policies\") pod \"oauth-openshift-79fc7cbfc-8hk8l\" (UID: \"eccf8123-9a31-40d0-afa5-6207adb51c2d\") " pod="openshift-authentication/oauth-openshift-79fc7cbfc-8hk8l"
Mar 10 21:56:21 crc kubenswrapper[4919]: I0310 21:56:21.175721 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/eccf8123-9a31-40d0-afa5-6207adb51c2d-audit-policies\") pod \"oauth-openshift-79fc7cbfc-8hk8l\" (UID: \"eccf8123-9a31-40d0-afa5-6207adb51c2d\") " pod="openshift-authentication/oauth-openshift-79fc7cbfc-8hk8l"
Mar 10 21:56:21 crc kubenswrapper[4919]: I0310 21:56:21.176990 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/eccf8123-9a31-40d0-afa5-6207adb51c2d-v4-0-config-system-cliconfig\") pod \"oauth-openshift-79fc7cbfc-8hk8l\" (UID: \"eccf8123-9a31-40d0-afa5-6207adb51c2d\") " pod="openshift-authentication/oauth-openshift-79fc7cbfc-8hk8l"
Mar 10 21:56:21 crc kubenswrapper[4919]: I0310 21:56:21.180278 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eccf8123-9a31-40d0-afa5-6207adb51c2d-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-79fc7cbfc-8hk8l\" (UID: \"eccf8123-9a31-40d0-afa5-6207adb51c2d\") " pod="openshift-authentication/oauth-openshift-79fc7cbfc-8hk8l"
Mar 10 21:56:21 crc kubenswrapper[4919]: I0310 21:56:21.180373 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/eccf8123-9a31-40d0-afa5-6207adb51c2d-v4-0-config-system-service-ca\") pod \"oauth-openshift-79fc7cbfc-8hk8l\" (UID: \"eccf8123-9a31-40d0-afa5-6207adb51c2d\") " pod="openshift-authentication/oauth-openshift-79fc7cbfc-8hk8l"
Mar 10 21:56:21 crc kubenswrapper[4919]: I0310 21:56:21.181150 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/eccf8123-9a31-40d0-afa5-6207adb51c2d-v4-0-config-system-router-certs\") pod \"oauth-openshift-79fc7cbfc-8hk8l\" (UID: \"eccf8123-9a31-40d0-afa5-6207adb51c2d\") " pod="openshift-authentication/oauth-openshift-79fc7cbfc-8hk8l"
Mar 10 21:56:21 crc kubenswrapper[4919]: I0310 21:56:21.182087 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/eccf8123-9a31-40d0-afa5-6207adb51c2d-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-79fc7cbfc-8hk8l\" (UID: \"eccf8123-9a31-40d0-afa5-6207adb51c2d\") " pod="openshift-authentication/oauth-openshift-79fc7cbfc-8hk8l"
Mar 10 21:56:21 crc kubenswrapper[4919]: I0310 21:56:21.182747 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/eccf8123-9a31-40d0-afa5-6207adb51c2d-v4-0-config-system-session\") pod \"oauth-openshift-79fc7cbfc-8hk8l\" (UID: \"eccf8123-9a31-40d0-afa5-6207adb51c2d\") " pod="openshift-authentication/oauth-openshift-79fc7cbfc-8hk8l"
Mar 10 21:56:21 crc kubenswrapper[4919]: I0310 21:56:21.182957 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/eccf8123-9a31-40d0-afa5-6207adb51c2d-v4-0-config-user-template-error\") pod \"oauth-openshift-79fc7cbfc-8hk8l\" (UID: \"eccf8123-9a31-40d0-afa5-6207adb51c2d\") " pod="openshift-authentication/oauth-openshift-79fc7cbfc-8hk8l"
Mar 10 21:56:21 crc kubenswrapper[4919]: I0310 21:56:21.183212 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/eccf8123-9a31-40d0-afa5-6207adb51c2d-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-79fc7cbfc-8hk8l\" (UID: \"eccf8123-9a31-40d0-afa5-6207adb51c2d\") " pod="openshift-authentication/oauth-openshift-79fc7cbfc-8hk8l"
Mar 10 21:56:21 crc kubenswrapper[4919]: I0310 21:56:21.183798 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/eccf8123-9a31-40d0-afa5-6207adb51c2d-v4-0-config-system-serving-cert\") pod \"oauth-openshift-79fc7cbfc-8hk8l\" (UID: \"eccf8123-9a31-40d0-afa5-6207adb51c2d\") " pod="openshift-authentication/oauth-openshift-79fc7cbfc-8hk8l"
Mar 10 21:56:21 crc kubenswrapper[4919]: I0310 21:56:21.185305 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/eccf8123-9a31-40d0-afa5-6207adb51c2d-v4-0-config-user-template-login\") pod \"oauth-openshift-79fc7cbfc-8hk8l\" (UID: \"eccf8123-9a31-40d0-afa5-6207adb51c2d\") " pod="openshift-authentication/oauth-openshift-79fc7cbfc-8hk8l"
Mar 10 21:56:21 crc kubenswrapper[4919]: I0310 21:56:21.189572 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/eccf8123-9a31-40d0-afa5-6207adb51c2d-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-79fc7cbfc-8hk8l\" (UID: \"eccf8123-9a31-40d0-afa5-6207adb51c2d\") " pod="openshift-authentication/oauth-openshift-79fc7cbfc-8hk8l"
Mar 10 21:56:21 crc kubenswrapper[4919]: I0310 21:56:21.219199 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7wvq\" (UniqueName: \"kubernetes.io/projected/eccf8123-9a31-40d0-afa5-6207adb51c2d-kube-api-access-n7wvq\") pod \"oauth-openshift-79fc7cbfc-8hk8l\" (UID: \"eccf8123-9a31-40d0-afa5-6207adb51c2d\") " pod="openshift-authentication/oauth-openshift-79fc7cbfc-8hk8l"
Mar 10 21:56:21 crc kubenswrapper[4919]: I0310 21:56:21.241360 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-79fc7cbfc-8hk8l"
Mar 10 21:56:21 crc kubenswrapper[4919]: I0310 21:56:21.654860 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-79fc7cbfc-8hk8l"]
Mar 10 21:56:21 crc kubenswrapper[4919]: W0310 21:56:21.658858 4919 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeccf8123_9a31_40d0_afa5_6207adb51c2d.slice/crio-a41ac77796104c274c7ff099d6777fa9f4afb57f8da3331b06fe1f34d9b8bcb0 WatchSource:0}: Error finding container a41ac77796104c274c7ff099d6777fa9f4afb57f8da3331b06fe1f34d9b8bcb0: Status 404 returned error can't find the container with id a41ac77796104c274c7ff099d6777fa9f4afb57f8da3331b06fe1f34d9b8bcb0
Mar 10 21:56:21 crc kubenswrapper[4919]: I0310 21:56:21.808038 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-79fc7cbfc-8hk8l" event={"ID":"eccf8123-9a31-40d0-afa5-6207adb51c2d","Type":"ContainerStarted","Data":"a41ac77796104c274c7ff099d6777fa9f4afb57f8da3331b06fe1f34d9b8bcb0"}
Mar 10 21:56:22 crc kubenswrapper[4919]: I0310 21:56:22.818013 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-79fc7cbfc-8hk8l" event={"ID":"eccf8123-9a31-40d0-afa5-6207adb51c2d","Type":"ContainerStarted","Data":"c9840e8e7104a4fa40f72a44758310ddcdf0f9125cce3211b6b5384faf2a9437"}
Mar 10 21:56:22 crc kubenswrapper[4919]: I0310 21:56:22.818535 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-79fc7cbfc-8hk8l"
Mar 10 21:56:22 crc kubenswrapper[4919]: I0310 21:56:22.823829 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-79fc7cbfc-8hk8l"
Mar 10 21:56:22 crc kubenswrapper[4919]: I0310 21:56:22.836356 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-79fc7cbfc-8hk8l" podStartSLOduration=76.836331252 podStartE2EDuration="1m16.836331252s" podCreationTimestamp="2026-03-10 21:55:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 21:56:22.834984626 +0000 UTC m=+370.076865264" watchObservedRunningTime="2026-03-10 21:56:22.836331252 +0000 UTC m=+370.078211860"
Mar 10 21:56:28 crc kubenswrapper[4919]: I0310 21:56:28.713224 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 10 21:56:59 crc kubenswrapper[4919]: I0310 21:56:59.176310 4919 patch_prober.go:28] interesting pod/machine-config-daemon-z7v4t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 10 21:56:59 crc kubenswrapper[4919]: I0310 21:56:59.177117 4919 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 10 21:57:29 crc kubenswrapper[4919]: I0310 21:57:29.176349 4919 patch_prober.go:28] interesting pod/machine-config-daemon-z7v4t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 10 21:57:29 crc kubenswrapper[4919]: I0310 21:57:29.178816 4919 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 10 21:57:36 crc kubenswrapper[4919]: I0310 21:57:36.848338 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-lxjws"]
Mar 10 21:57:36 crc kubenswrapper[4919]: I0310 21:57:36.849507 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-lxjws"
Mar 10 21:57:36 crc kubenswrapper[4919]: I0310 21:57:36.866726 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-lxjws"]
Mar 10 21:57:36 crc kubenswrapper[4919]: I0310 21:57:36.919305 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c01c91ec-c73c-49df-bdfe-c1f8c65b594a-ca-trust-extracted\") pod \"image-registry-66df7c8f76-lxjws\" (UID: \"c01c91ec-c73c-49df-bdfe-c1f8c65b594a\") " pod="openshift-image-registry/image-registry-66df7c8f76-lxjws"
Mar 10 21:57:36 crc kubenswrapper[4919]: I0310 21:57:36.919356 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vxzf\" (UniqueName: \"kubernetes.io/projected/c01c91ec-c73c-49df-bdfe-c1f8c65b594a-kube-api-access-8vxzf\") pod \"image-registry-66df7c8f76-lxjws\" (UID: \"c01c91ec-c73c-49df-bdfe-c1f8c65b594a\") " pod="openshift-image-registry/image-registry-66df7c8f76-lxjws"
Mar 10 21:57:36 crc kubenswrapper[4919]: I0310 21:57:36.919409 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-lxjws\" (UID: \"c01c91ec-c73c-49df-bdfe-c1f8c65b594a\") " pod="openshift-image-registry/image-registry-66df7c8f76-lxjws"
Mar 10 21:57:36 crc kubenswrapper[4919]: I0310 21:57:36.919444 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c01c91ec-c73c-49df-bdfe-c1f8c65b594a-bound-sa-token\") pod \"image-registry-66df7c8f76-lxjws\" (UID: \"c01c91ec-c73c-49df-bdfe-c1f8c65b594a\") "
pod="openshift-image-registry/image-registry-66df7c8f76-lxjws" Mar 10 21:57:36 crc kubenswrapper[4919]: I0310 21:57:36.919483 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c01c91ec-c73c-49df-bdfe-c1f8c65b594a-registry-certificates\") pod \"image-registry-66df7c8f76-lxjws\" (UID: \"c01c91ec-c73c-49df-bdfe-c1f8c65b594a\") " pod="openshift-image-registry/image-registry-66df7c8f76-lxjws" Mar 10 21:57:36 crc kubenswrapper[4919]: I0310 21:57:36.919512 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c01c91ec-c73c-49df-bdfe-c1f8c65b594a-registry-tls\") pod \"image-registry-66df7c8f76-lxjws\" (UID: \"c01c91ec-c73c-49df-bdfe-c1f8c65b594a\") " pod="openshift-image-registry/image-registry-66df7c8f76-lxjws" Mar 10 21:57:36 crc kubenswrapper[4919]: I0310 21:57:36.919753 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c01c91ec-c73c-49df-bdfe-c1f8c65b594a-installation-pull-secrets\") pod \"image-registry-66df7c8f76-lxjws\" (UID: \"c01c91ec-c73c-49df-bdfe-c1f8c65b594a\") " pod="openshift-image-registry/image-registry-66df7c8f76-lxjws" Mar 10 21:57:36 crc kubenswrapper[4919]: I0310 21:57:36.919827 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c01c91ec-c73c-49df-bdfe-c1f8c65b594a-trusted-ca\") pod \"image-registry-66df7c8f76-lxjws\" (UID: \"c01c91ec-c73c-49df-bdfe-c1f8c65b594a\") " pod="openshift-image-registry/image-registry-66df7c8f76-lxjws" Mar 10 21:57:36 crc kubenswrapper[4919]: I0310 21:57:36.944374 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-lxjws\" (UID: \"c01c91ec-c73c-49df-bdfe-c1f8c65b594a\") " pod="openshift-image-registry/image-registry-66df7c8f76-lxjws" Mar 10 21:57:37 crc kubenswrapper[4919]: I0310 21:57:37.020939 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c01c91ec-c73c-49df-bdfe-c1f8c65b594a-bound-sa-token\") pod \"image-registry-66df7c8f76-lxjws\" (UID: \"c01c91ec-c73c-49df-bdfe-c1f8c65b594a\") " pod="openshift-image-registry/image-registry-66df7c8f76-lxjws" Mar 10 21:57:37 crc kubenswrapper[4919]: I0310 21:57:37.021002 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c01c91ec-c73c-49df-bdfe-c1f8c65b594a-registry-certificates\") pod \"image-registry-66df7c8f76-lxjws\" (UID: \"c01c91ec-c73c-49df-bdfe-c1f8c65b594a\") " pod="openshift-image-registry/image-registry-66df7c8f76-lxjws" Mar 10 21:57:37 crc kubenswrapper[4919]: I0310 21:57:37.021027 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c01c91ec-c73c-49df-bdfe-c1f8c65b594a-registry-tls\") pod \"image-registry-66df7c8f76-lxjws\" (UID: \"c01c91ec-c73c-49df-bdfe-c1f8c65b594a\") " pod="openshift-image-registry/image-registry-66df7c8f76-lxjws" Mar 10 21:57:37 crc kubenswrapper[4919]: I0310 21:57:37.021054 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c01c91ec-c73c-49df-bdfe-c1f8c65b594a-installation-pull-secrets\") pod \"image-registry-66df7c8f76-lxjws\" (UID: \"c01c91ec-c73c-49df-bdfe-c1f8c65b594a\") " pod="openshift-image-registry/image-registry-66df7c8f76-lxjws" Mar 10 21:57:37 crc kubenswrapper[4919]: I0310 21:57:37.021073 4919 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c01c91ec-c73c-49df-bdfe-c1f8c65b594a-trusted-ca\") pod \"image-registry-66df7c8f76-lxjws\" (UID: \"c01c91ec-c73c-49df-bdfe-c1f8c65b594a\") " pod="openshift-image-registry/image-registry-66df7c8f76-lxjws" Mar 10 21:57:37 crc kubenswrapper[4919]: I0310 21:57:37.021111 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c01c91ec-c73c-49df-bdfe-c1f8c65b594a-ca-trust-extracted\") pod \"image-registry-66df7c8f76-lxjws\" (UID: \"c01c91ec-c73c-49df-bdfe-c1f8c65b594a\") " pod="openshift-image-registry/image-registry-66df7c8f76-lxjws" Mar 10 21:57:37 crc kubenswrapper[4919]: I0310 21:57:37.021129 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8vxzf\" (UniqueName: \"kubernetes.io/projected/c01c91ec-c73c-49df-bdfe-c1f8c65b594a-kube-api-access-8vxzf\") pod \"image-registry-66df7c8f76-lxjws\" (UID: \"c01c91ec-c73c-49df-bdfe-c1f8c65b594a\") " pod="openshift-image-registry/image-registry-66df7c8f76-lxjws" Mar 10 21:57:37 crc kubenswrapper[4919]: I0310 21:57:37.022266 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c01c91ec-c73c-49df-bdfe-c1f8c65b594a-ca-trust-extracted\") pod \"image-registry-66df7c8f76-lxjws\" (UID: \"c01c91ec-c73c-49df-bdfe-c1f8c65b594a\") " pod="openshift-image-registry/image-registry-66df7c8f76-lxjws" Mar 10 21:57:37 crc kubenswrapper[4919]: I0310 21:57:37.022874 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c01c91ec-c73c-49df-bdfe-c1f8c65b594a-trusted-ca\") pod \"image-registry-66df7c8f76-lxjws\" (UID: \"c01c91ec-c73c-49df-bdfe-c1f8c65b594a\") " pod="openshift-image-registry/image-registry-66df7c8f76-lxjws" Mar 10 21:57:37 crc 
kubenswrapper[4919]: I0310 21:57:37.023110 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c01c91ec-c73c-49df-bdfe-c1f8c65b594a-registry-certificates\") pod \"image-registry-66df7c8f76-lxjws\" (UID: \"c01c91ec-c73c-49df-bdfe-c1f8c65b594a\") " pod="openshift-image-registry/image-registry-66df7c8f76-lxjws" Mar 10 21:57:37 crc kubenswrapper[4919]: I0310 21:57:37.031259 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c01c91ec-c73c-49df-bdfe-c1f8c65b594a-installation-pull-secrets\") pod \"image-registry-66df7c8f76-lxjws\" (UID: \"c01c91ec-c73c-49df-bdfe-c1f8c65b594a\") " pod="openshift-image-registry/image-registry-66df7c8f76-lxjws" Mar 10 21:57:37 crc kubenswrapper[4919]: I0310 21:57:37.032026 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c01c91ec-c73c-49df-bdfe-c1f8c65b594a-registry-tls\") pod \"image-registry-66df7c8f76-lxjws\" (UID: \"c01c91ec-c73c-49df-bdfe-c1f8c65b594a\") " pod="openshift-image-registry/image-registry-66df7c8f76-lxjws" Mar 10 21:57:37 crc kubenswrapper[4919]: I0310 21:57:37.038894 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c01c91ec-c73c-49df-bdfe-c1f8c65b594a-bound-sa-token\") pod \"image-registry-66df7c8f76-lxjws\" (UID: \"c01c91ec-c73c-49df-bdfe-c1f8c65b594a\") " pod="openshift-image-registry/image-registry-66df7c8f76-lxjws" Mar 10 21:57:37 crc kubenswrapper[4919]: I0310 21:57:37.039720 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8vxzf\" (UniqueName: \"kubernetes.io/projected/c01c91ec-c73c-49df-bdfe-c1f8c65b594a-kube-api-access-8vxzf\") pod \"image-registry-66df7c8f76-lxjws\" (UID: \"c01c91ec-c73c-49df-bdfe-c1f8c65b594a\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-lxjws" Mar 10 21:57:37 crc kubenswrapper[4919]: I0310 21:57:37.166432 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-lxjws" Mar 10 21:57:37 crc kubenswrapper[4919]: I0310 21:57:37.586256 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-lxjws"] Mar 10 21:57:37 crc kubenswrapper[4919]: I0310 21:57:37.752674 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-lxjws" event={"ID":"c01c91ec-c73c-49df-bdfe-c1f8c65b594a","Type":"ContainerStarted","Data":"cf67e49dfd0749886587f7c249ae5f86bc0369b028767b319650a1422c27607b"} Mar 10 21:57:37 crc kubenswrapper[4919]: I0310 21:57:37.753174 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-lxjws" Mar 10 21:57:37 crc kubenswrapper[4919]: I0310 21:57:37.753362 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-lxjws" event={"ID":"c01c91ec-c73c-49df-bdfe-c1f8c65b594a","Type":"ContainerStarted","Data":"4e03d257f36dbef0d7fd8b6417510239736716404f2fa9b56adcc754a98dd323"} Mar 10 21:57:37 crc kubenswrapper[4919]: I0310 21:57:37.774921 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-lxjws" podStartSLOduration=1.7748954590000001 podStartE2EDuration="1.774895459s" podCreationTimestamp="2026-03-10 21:57:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 21:57:37.773309676 +0000 UTC m=+445.015190304" watchObservedRunningTime="2026-03-10 21:57:37.774895459 +0000 UTC m=+445.016776077" Mar 10 21:57:46 crc kubenswrapper[4919]: I0310 21:57:46.128879 4919 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openshift-marketplace/certified-operators-s8qvz"] Mar 10 21:57:46 crc kubenswrapper[4919]: I0310 21:57:46.131785 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-s8qvz" podUID="b8a6c263-cf9a-41f9-8ea0-fb07b0596a35" containerName="registry-server" containerID="cri-o://7e52e72bd6cec0cf50811ec53ec531224f23725c50a9a5e4de595dfd7bc9d6a1" gracePeriod=30 Mar 10 21:57:46 crc kubenswrapper[4919]: I0310 21:57:46.136862 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-pw22n"] Mar 10 21:57:46 crc kubenswrapper[4919]: I0310 21:57:46.137207 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-pw22n" podUID="ccd7b53d-726b-444f-be0f-4eb2655eb35d" containerName="registry-server" containerID="cri-o://f9f25006ddf0e6f5079d6de5f9c62723dfd9c3d04a9dd09f0e189bf16aee56e2" gracePeriod=30 Mar 10 21:57:46 crc kubenswrapper[4919]: I0310 21:57:46.155701 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-tk7xs"] Mar 10 21:57:46 crc kubenswrapper[4919]: I0310 21:57:46.156283 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-tk7xs" podUID="be41b09e-a8ff-4367-a68d-865f047e2549" containerName="marketplace-operator" containerID="cri-o://79e6104fadf71887acd4c73fd5b0783822885bc0c62332193a11ef5b30efabf3" gracePeriod=30 Mar 10 21:57:46 crc kubenswrapper[4919]: I0310 21:57:46.199039 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-lnq5q"] Mar 10 21:57:46 crc kubenswrapper[4919]: I0310 21:57:46.199420 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-lnq5q" podUID="df08dbe0-09b0-4d23-b99b-95b65818f84e" 
containerName="registry-server" containerID="cri-o://106158adf1276549cbb2b41b4d8fd567346d8c5242635f4b25ce5ebec21b8df8" gracePeriod=30 Mar 10 21:57:46 crc kubenswrapper[4919]: I0310 21:57:46.206447 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-f4nt4"] Mar 10 21:57:46 crc kubenswrapper[4919]: I0310 21:57:46.206628 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-f4nt4" podUID="28b0abdd-217d-42f6-80fb-b270be44700e" containerName="registry-server" containerID="cri-o://675d75fb6cc1513db61486bd6b52cb07921e449d2d6906982647a01e512574d2" gracePeriod=30 Mar 10 21:57:47 crc kubenswrapper[4919]: I0310 21:57:47.163265 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-w82bl"] Mar 10 21:57:47 crc kubenswrapper[4919]: I0310 21:57:47.164987 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-w82bl" Mar 10 21:57:47 crc kubenswrapper[4919]: I0310 21:57:47.178510 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-w82bl"] Mar 10 21:57:47 crc kubenswrapper[4919]: I0310 21:57:47.304851 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qmtlx\" (UniqueName: \"kubernetes.io/projected/eacecaf1-f17c-4c5e-8a68-8b1cb1e01006-kube-api-access-qmtlx\") pod \"marketplace-operator-79b997595-w82bl\" (UID: \"eacecaf1-f17c-4c5e-8a68-8b1cb1e01006\") " pod="openshift-marketplace/marketplace-operator-79b997595-w82bl" Mar 10 21:57:47 crc kubenswrapper[4919]: I0310 21:57:47.304957 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/eacecaf1-f17c-4c5e-8a68-8b1cb1e01006-marketplace-operator-metrics\") pod 
\"marketplace-operator-79b997595-w82bl\" (UID: \"eacecaf1-f17c-4c5e-8a68-8b1cb1e01006\") " pod="openshift-marketplace/marketplace-operator-79b997595-w82bl" Mar 10 21:57:47 crc kubenswrapper[4919]: I0310 21:57:47.305068 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/eacecaf1-f17c-4c5e-8a68-8b1cb1e01006-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-w82bl\" (UID: \"eacecaf1-f17c-4c5e-8a68-8b1cb1e01006\") " pod="openshift-marketplace/marketplace-operator-79b997595-w82bl" Mar 10 21:57:47 crc kubenswrapper[4919]: I0310 21:57:47.406729 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/eacecaf1-f17c-4c5e-8a68-8b1cb1e01006-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-w82bl\" (UID: \"eacecaf1-f17c-4c5e-8a68-8b1cb1e01006\") " pod="openshift-marketplace/marketplace-operator-79b997595-w82bl" Mar 10 21:57:47 crc kubenswrapper[4919]: I0310 21:57:47.406867 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/eacecaf1-f17c-4c5e-8a68-8b1cb1e01006-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-w82bl\" (UID: \"eacecaf1-f17c-4c5e-8a68-8b1cb1e01006\") " pod="openshift-marketplace/marketplace-operator-79b997595-w82bl" Mar 10 21:57:47 crc kubenswrapper[4919]: I0310 21:57:47.406902 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qmtlx\" (UniqueName: \"kubernetes.io/projected/eacecaf1-f17c-4c5e-8a68-8b1cb1e01006-kube-api-access-qmtlx\") pod \"marketplace-operator-79b997595-w82bl\" (UID: \"eacecaf1-f17c-4c5e-8a68-8b1cb1e01006\") " pod="openshift-marketplace/marketplace-operator-79b997595-w82bl" Mar 10 21:57:47 crc kubenswrapper[4919]: I0310 21:57:47.409019 4919 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/eacecaf1-f17c-4c5e-8a68-8b1cb1e01006-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-w82bl\" (UID: \"eacecaf1-f17c-4c5e-8a68-8b1cb1e01006\") " pod="openshift-marketplace/marketplace-operator-79b997595-w82bl" Mar 10 21:57:47 crc kubenswrapper[4919]: I0310 21:57:47.413380 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/eacecaf1-f17c-4c5e-8a68-8b1cb1e01006-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-w82bl\" (UID: \"eacecaf1-f17c-4c5e-8a68-8b1cb1e01006\") " pod="openshift-marketplace/marketplace-operator-79b997595-w82bl" Mar 10 21:57:47 crc kubenswrapper[4919]: I0310 21:57:47.422257 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qmtlx\" (UniqueName: \"kubernetes.io/projected/eacecaf1-f17c-4c5e-8a68-8b1cb1e01006-kube-api-access-qmtlx\") pod \"marketplace-operator-79b997595-w82bl\" (UID: \"eacecaf1-f17c-4c5e-8a68-8b1cb1e01006\") " pod="openshift-marketplace/marketplace-operator-79b997595-w82bl" Mar 10 21:57:47 crc kubenswrapper[4919]: I0310 21:57:47.506714 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-w82bl" Mar 10 21:57:47 crc kubenswrapper[4919]: E0310 21:57:47.538858 4919 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 675d75fb6cc1513db61486bd6b52cb07921e449d2d6906982647a01e512574d2 is running failed: container process not found" containerID="675d75fb6cc1513db61486bd6b52cb07921e449d2d6906982647a01e512574d2" cmd=["grpc_health_probe","-addr=:50051"] Mar 10 21:57:47 crc kubenswrapper[4919]: E0310 21:57:47.540604 4919 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 675d75fb6cc1513db61486bd6b52cb07921e449d2d6906982647a01e512574d2 is running failed: container process not found" containerID="675d75fb6cc1513db61486bd6b52cb07921e449d2d6906982647a01e512574d2" cmd=["grpc_health_probe","-addr=:50051"] Mar 10 21:57:47 crc kubenswrapper[4919]: E0310 21:57:47.540928 4919 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 675d75fb6cc1513db61486bd6b52cb07921e449d2d6906982647a01e512574d2 is running failed: container process not found" containerID="675d75fb6cc1513db61486bd6b52cb07921e449d2d6906982647a01e512574d2" cmd=["grpc_health_probe","-addr=:50051"] Mar 10 21:57:47 crc kubenswrapper[4919]: E0310 21:57:47.540981 4919 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 675d75fb6cc1513db61486bd6b52cb07921e449d2d6906982647a01e512574d2 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/redhat-operators-f4nt4" podUID="28b0abdd-217d-42f6-80fb-b270be44700e" containerName="registry-server" Mar 10 21:57:47 crc kubenswrapper[4919]: I0310 21:57:47.751496 4919 patch_prober.go:28] interesting 
pod/marketplace-operator-79b997595-tk7xs container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.25:8080/healthz\": dial tcp 10.217.0.25:8080: connect: connection refused" start-of-body= Mar 10 21:57:47 crc kubenswrapper[4919]: I0310 21:57:47.751850 4919 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-tk7xs" podUID="be41b09e-a8ff-4367-a68d-865f047e2549" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.25:8080/healthz\": dial tcp 10.217.0.25:8080: connect: connection refused" Mar 10 21:57:47 crc kubenswrapper[4919]: I0310 21:57:47.901467 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-w82bl"] Mar 10 21:57:47 crc kubenswrapper[4919]: I0310 21:57:47.958341 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-pw22n" Mar 10 21:57:48 crc kubenswrapper[4919]: I0310 21:57:48.016241 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-tk7xs" Mar 10 21:57:48 crc kubenswrapper[4919]: I0310 21:57:48.045684 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lnq5q" Mar 10 21:57:48 crc kubenswrapper[4919]: I0310 21:57:48.051326 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-s8qvz" Mar 10 21:57:48 crc kubenswrapper[4919]: I0310 21:57:48.100258 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-f4nt4" Mar 10 21:57:48 crc kubenswrapper[4919]: I0310 21:57:48.113713 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ccd7b53d-726b-444f-be0f-4eb2655eb35d-catalog-content\") pod \"ccd7b53d-726b-444f-be0f-4eb2655eb35d\" (UID: \"ccd7b53d-726b-444f-be0f-4eb2655eb35d\") " Mar 10 21:57:48 crc kubenswrapper[4919]: I0310 21:57:48.113821 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ccd7b53d-726b-444f-be0f-4eb2655eb35d-utilities\") pod \"ccd7b53d-726b-444f-be0f-4eb2655eb35d\" (UID: \"ccd7b53d-726b-444f-be0f-4eb2655eb35d\") " Mar 10 21:57:48 crc kubenswrapper[4919]: I0310 21:57:48.113872 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nrrpd\" (UniqueName: \"kubernetes.io/projected/be41b09e-a8ff-4367-a68d-865f047e2549-kube-api-access-nrrpd\") pod \"be41b09e-a8ff-4367-a68d-865f047e2549\" (UID: \"be41b09e-a8ff-4367-a68d-865f047e2549\") " Mar 10 21:57:48 crc kubenswrapper[4919]: I0310 21:57:48.113897 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/be41b09e-a8ff-4367-a68d-865f047e2549-marketplace-operator-metrics\") pod \"be41b09e-a8ff-4367-a68d-865f047e2549\" (UID: \"be41b09e-a8ff-4367-a68d-865f047e2549\") " Mar 10 21:57:48 crc kubenswrapper[4919]: I0310 21:57:48.113926 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/be41b09e-a8ff-4367-a68d-865f047e2549-marketplace-trusted-ca\") pod \"be41b09e-a8ff-4367-a68d-865f047e2549\" (UID: \"be41b09e-a8ff-4367-a68d-865f047e2549\") " Mar 10 21:57:48 crc kubenswrapper[4919]: I0310 21:57:48.113986 4919 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-grznq\" (UniqueName: \"kubernetes.io/projected/ccd7b53d-726b-444f-be0f-4eb2655eb35d-kube-api-access-grznq\") pod \"ccd7b53d-726b-444f-be0f-4eb2655eb35d\" (UID: \"ccd7b53d-726b-444f-be0f-4eb2655eb35d\") " Mar 10 21:57:48 crc kubenswrapper[4919]: I0310 21:57:48.116678 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/be41b09e-a8ff-4367-a68d-865f047e2549-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "be41b09e-a8ff-4367-a68d-865f047e2549" (UID: "be41b09e-a8ff-4367-a68d-865f047e2549"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 21:57:48 crc kubenswrapper[4919]: I0310 21:57:48.117733 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ccd7b53d-726b-444f-be0f-4eb2655eb35d-utilities" (OuterVolumeSpecName: "utilities") pod "ccd7b53d-726b-444f-be0f-4eb2655eb35d" (UID: "ccd7b53d-726b-444f-be0f-4eb2655eb35d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 21:57:48 crc kubenswrapper[4919]: I0310 21:57:48.123943 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ccd7b53d-726b-444f-be0f-4eb2655eb35d-kube-api-access-grznq" (OuterVolumeSpecName: "kube-api-access-grznq") pod "ccd7b53d-726b-444f-be0f-4eb2655eb35d" (UID: "ccd7b53d-726b-444f-be0f-4eb2655eb35d"). InnerVolumeSpecName "kube-api-access-grznq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 21:57:48 crc kubenswrapper[4919]: I0310 21:57:48.125133 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be41b09e-a8ff-4367-a68d-865f047e2549-kube-api-access-nrrpd" (OuterVolumeSpecName: "kube-api-access-nrrpd") pod "be41b09e-a8ff-4367-a68d-865f047e2549" (UID: "be41b09e-a8ff-4367-a68d-865f047e2549"). 
InnerVolumeSpecName "kube-api-access-nrrpd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 21:57:48 crc kubenswrapper[4919]: I0310 21:57:48.126202 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be41b09e-a8ff-4367-a68d-865f047e2549-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "be41b09e-a8ff-4367-a68d-865f047e2549" (UID: "be41b09e-a8ff-4367-a68d-865f047e2549"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 21:57:48 crc kubenswrapper[4919]: I0310 21:57:48.126594 4919 generic.go:334] "Generic (PLEG): container finished" podID="28b0abdd-217d-42f6-80fb-b270be44700e" containerID="675d75fb6cc1513db61486bd6b52cb07921e449d2d6906982647a01e512574d2" exitCode=0 Mar 10 21:57:48 crc kubenswrapper[4919]: I0310 21:57:48.126648 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-f4nt4" Mar 10 21:57:48 crc kubenswrapper[4919]: I0310 21:57:48.126651 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f4nt4" event={"ID":"28b0abdd-217d-42f6-80fb-b270be44700e","Type":"ContainerDied","Data":"675d75fb6cc1513db61486bd6b52cb07921e449d2d6906982647a01e512574d2"} Mar 10 21:57:48 crc kubenswrapper[4919]: I0310 21:57:48.126759 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f4nt4" event={"ID":"28b0abdd-217d-42f6-80fb-b270be44700e","Type":"ContainerDied","Data":"f510b9e8c3ce17c9809ee1c110751a266af6add07e24854fbeb29359742a2325"} Mar 10 21:57:48 crc kubenswrapper[4919]: I0310 21:57:48.126776 4919 scope.go:117] "RemoveContainer" containerID="675d75fb6cc1513db61486bd6b52cb07921e449d2d6906982647a01e512574d2" Mar 10 21:57:48 crc kubenswrapper[4919]: I0310 21:57:48.128008 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/marketplace-operator-79b997595-w82bl" event={"ID":"eacecaf1-f17c-4c5e-8a68-8b1cb1e01006","Type":"ContainerStarted","Data":"3bfd66b45e350fa7f8b1c27ad9547cdf12c1a2ac68ba159c2003ffff119056a0"} Mar 10 21:57:48 crc kubenswrapper[4919]: I0310 21:57:48.128033 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-w82bl" event={"ID":"eacecaf1-f17c-4c5e-8a68-8b1cb1e01006","Type":"ContainerStarted","Data":"d185f10979799b8cab15be728d79cc86fbbcf591050cf574e2ad50a0d5d5eb9b"} Mar 10 21:57:48 crc kubenswrapper[4919]: I0310 21:57:48.128129 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-w82bl" Mar 10 21:57:48 crc kubenswrapper[4919]: I0310 21:57:48.130161 4919 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-w82bl container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.72:8080/healthz\": dial tcp 10.217.0.72:8080: connect: connection refused" start-of-body= Mar 10 21:57:48 crc kubenswrapper[4919]: I0310 21:57:48.130239 4919 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-w82bl" podUID="eacecaf1-f17c-4c5e-8a68-8b1cb1e01006" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.72:8080/healthz\": dial tcp 10.217.0.72:8080: connect: connection refused" Mar 10 21:57:48 crc kubenswrapper[4919]: I0310 21:57:48.133270 4919 generic.go:334] "Generic (PLEG): container finished" podID="b8a6c263-cf9a-41f9-8ea0-fb07b0596a35" containerID="7e52e72bd6cec0cf50811ec53ec531224f23725c50a9a5e4de595dfd7bc9d6a1" exitCode=0 Mar 10 21:57:48 crc kubenswrapper[4919]: I0310 21:57:48.133685 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-s8qvz" Mar 10 21:57:48 crc kubenswrapper[4919]: I0310 21:57:48.133699 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s8qvz" event={"ID":"b8a6c263-cf9a-41f9-8ea0-fb07b0596a35","Type":"ContainerDied","Data":"7e52e72bd6cec0cf50811ec53ec531224f23725c50a9a5e4de595dfd7bc9d6a1"} Mar 10 21:57:48 crc kubenswrapper[4919]: I0310 21:57:48.134052 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s8qvz" event={"ID":"b8a6c263-cf9a-41f9-8ea0-fb07b0596a35","Type":"ContainerDied","Data":"4fa9169080c49963d92bfa35ee923d40cf27f01251463c2e09e01ccb6c22d580"} Mar 10 21:57:48 crc kubenswrapper[4919]: I0310 21:57:48.143686 4919 generic.go:334] "Generic (PLEG): container finished" podID="ccd7b53d-726b-444f-be0f-4eb2655eb35d" containerID="f9f25006ddf0e6f5079d6de5f9c62723dfd9c3d04a9dd09f0e189bf16aee56e2" exitCode=0 Mar 10 21:57:48 crc kubenswrapper[4919]: I0310 21:57:48.143753 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pw22n" event={"ID":"ccd7b53d-726b-444f-be0f-4eb2655eb35d","Type":"ContainerDied","Data":"f9f25006ddf0e6f5079d6de5f9c62723dfd9c3d04a9dd09f0e189bf16aee56e2"} Mar 10 21:57:48 crc kubenswrapper[4919]: I0310 21:57:48.143781 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pw22n" event={"ID":"ccd7b53d-726b-444f-be0f-4eb2655eb35d","Type":"ContainerDied","Data":"f4a975618ed9f405ef7cc99d497ec3e1aaee50b9ea6a3215811b178c30828a92"} Mar 10 21:57:48 crc kubenswrapper[4919]: I0310 21:57:48.143850 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-pw22n" Mar 10 21:57:48 crc kubenswrapper[4919]: I0310 21:57:48.147829 4919 generic.go:334] "Generic (PLEG): container finished" podID="be41b09e-a8ff-4367-a68d-865f047e2549" containerID="79e6104fadf71887acd4c73fd5b0783822885bc0c62332193a11ef5b30efabf3" exitCode=0 Mar 10 21:57:48 crc kubenswrapper[4919]: I0310 21:57:48.147876 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-tk7xs" event={"ID":"be41b09e-a8ff-4367-a68d-865f047e2549","Type":"ContainerDied","Data":"79e6104fadf71887acd4c73fd5b0783822885bc0c62332193a11ef5b30efabf3"} Mar 10 21:57:48 crc kubenswrapper[4919]: I0310 21:57:48.147898 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-tk7xs" event={"ID":"be41b09e-a8ff-4367-a68d-865f047e2549","Type":"ContainerDied","Data":"1a9115de1b9b20728ad7a9875a64a22506575a7e4340c4e5055cee337bbf568d"} Mar 10 21:57:48 crc kubenswrapper[4919]: I0310 21:57:48.147944 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-tk7xs" Mar 10 21:57:48 crc kubenswrapper[4919]: I0310 21:57:48.155219 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-w82bl" podStartSLOduration=1.155200142 podStartE2EDuration="1.155200142s" podCreationTimestamp="2026-03-10 21:57:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 21:57:48.147039831 +0000 UTC m=+455.388920439" watchObservedRunningTime="2026-03-10 21:57:48.155200142 +0000 UTC m=+455.397080750" Mar 10 21:57:48 crc kubenswrapper[4919]: I0310 21:57:48.156134 4919 generic.go:334] "Generic (PLEG): container finished" podID="df08dbe0-09b0-4d23-b99b-95b65818f84e" containerID="106158adf1276549cbb2b41b4d8fd567346d8c5242635f4b25ce5ebec21b8df8" exitCode=0 Mar 10 21:57:48 crc kubenswrapper[4919]: I0310 21:57:48.156179 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lnq5q" event={"ID":"df08dbe0-09b0-4d23-b99b-95b65818f84e","Type":"ContainerDied","Data":"106158adf1276549cbb2b41b4d8fd567346d8c5242635f4b25ce5ebec21b8df8"} Mar 10 21:57:48 crc kubenswrapper[4919]: I0310 21:57:48.156210 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lnq5q" event={"ID":"df08dbe0-09b0-4d23-b99b-95b65818f84e","Type":"ContainerDied","Data":"392945f53aec564b04401d2f253b4e68fe7aa032fac26801a08555b4f158ad50"} Mar 10 21:57:48 crc kubenswrapper[4919]: I0310 21:57:48.156420 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lnq5q" Mar 10 21:57:48 crc kubenswrapper[4919]: I0310 21:57:48.163759 4919 scope.go:117] "RemoveContainer" containerID="b0da1bc568b246d585ead08e419bfd3f4b860bfd0e962d85d7df6d91963f3af1" Mar 10 21:57:48 crc kubenswrapper[4919]: I0310 21:57:48.178181 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-tk7xs"] Mar 10 21:57:48 crc kubenswrapper[4919]: I0310 21:57:48.182842 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-tk7xs"] Mar 10 21:57:48 crc kubenswrapper[4919]: I0310 21:57:48.188836 4919 scope.go:117] "RemoveContainer" containerID="3f4a81ec14abfa74518b7051132da3a4ad2c7106815e295f014892ed1e40fccb" Mar 10 21:57:48 crc kubenswrapper[4919]: I0310 21:57:48.206911 4919 scope.go:117] "RemoveContainer" containerID="675d75fb6cc1513db61486bd6b52cb07921e449d2d6906982647a01e512574d2" Mar 10 21:57:48 crc kubenswrapper[4919]: E0310 21:57:48.207641 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"675d75fb6cc1513db61486bd6b52cb07921e449d2d6906982647a01e512574d2\": container with ID starting with 675d75fb6cc1513db61486bd6b52cb07921e449d2d6906982647a01e512574d2 not found: ID does not exist" containerID="675d75fb6cc1513db61486bd6b52cb07921e449d2d6906982647a01e512574d2" Mar 10 21:57:48 crc kubenswrapper[4919]: I0310 21:57:48.207701 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"675d75fb6cc1513db61486bd6b52cb07921e449d2d6906982647a01e512574d2"} err="failed to get container status \"675d75fb6cc1513db61486bd6b52cb07921e449d2d6906982647a01e512574d2\": rpc error: code = NotFound desc = could not find container \"675d75fb6cc1513db61486bd6b52cb07921e449d2d6906982647a01e512574d2\": container with ID starting with 
675d75fb6cc1513db61486bd6b52cb07921e449d2d6906982647a01e512574d2 not found: ID does not exist" Mar 10 21:57:48 crc kubenswrapper[4919]: I0310 21:57:48.207727 4919 scope.go:117] "RemoveContainer" containerID="b0da1bc568b246d585ead08e419bfd3f4b860bfd0e962d85d7df6d91963f3af1" Mar 10 21:57:48 crc kubenswrapper[4919]: E0310 21:57:48.208225 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b0da1bc568b246d585ead08e419bfd3f4b860bfd0e962d85d7df6d91963f3af1\": container with ID starting with b0da1bc568b246d585ead08e419bfd3f4b860bfd0e962d85d7df6d91963f3af1 not found: ID does not exist" containerID="b0da1bc568b246d585ead08e419bfd3f4b860bfd0e962d85d7df6d91963f3af1" Mar 10 21:57:48 crc kubenswrapper[4919]: I0310 21:57:48.208285 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0da1bc568b246d585ead08e419bfd3f4b860bfd0e962d85d7df6d91963f3af1"} err="failed to get container status \"b0da1bc568b246d585ead08e419bfd3f4b860bfd0e962d85d7df6d91963f3af1\": rpc error: code = NotFound desc = could not find container \"b0da1bc568b246d585ead08e419bfd3f4b860bfd0e962d85d7df6d91963f3af1\": container with ID starting with b0da1bc568b246d585ead08e419bfd3f4b860bfd0e962d85d7df6d91963f3af1 not found: ID does not exist" Mar 10 21:57:48 crc kubenswrapper[4919]: I0310 21:57:48.208313 4919 scope.go:117] "RemoveContainer" containerID="3f4a81ec14abfa74518b7051132da3a4ad2c7106815e295f014892ed1e40fccb" Mar 10 21:57:48 crc kubenswrapper[4919]: I0310 21:57:48.209077 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ccd7b53d-726b-444f-be0f-4eb2655eb35d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ccd7b53d-726b-444f-be0f-4eb2655eb35d" (UID: "ccd7b53d-726b-444f-be0f-4eb2655eb35d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 21:57:48 crc kubenswrapper[4919]: E0310 21:57:48.210047 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f4a81ec14abfa74518b7051132da3a4ad2c7106815e295f014892ed1e40fccb\": container with ID starting with 3f4a81ec14abfa74518b7051132da3a4ad2c7106815e295f014892ed1e40fccb not found: ID does not exist" containerID="3f4a81ec14abfa74518b7051132da3a4ad2c7106815e295f014892ed1e40fccb" Mar 10 21:57:48 crc kubenswrapper[4919]: I0310 21:57:48.210075 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f4a81ec14abfa74518b7051132da3a4ad2c7106815e295f014892ed1e40fccb"} err="failed to get container status \"3f4a81ec14abfa74518b7051132da3a4ad2c7106815e295f014892ed1e40fccb\": rpc error: code = NotFound desc = could not find container \"3f4a81ec14abfa74518b7051132da3a4ad2c7106815e295f014892ed1e40fccb\": container with ID starting with 3f4a81ec14abfa74518b7051132da3a4ad2c7106815e295f014892ed1e40fccb not found: ID does not exist" Mar 10 21:57:48 crc kubenswrapper[4919]: I0310 21:57:48.210088 4919 scope.go:117] "RemoveContainer" containerID="7e52e72bd6cec0cf50811ec53ec531224f23725c50a9a5e4de595dfd7bc9d6a1" Mar 10 21:57:48 crc kubenswrapper[4919]: I0310 21:57:48.215162 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8a6c263-cf9a-41f9-8ea0-fb07b0596a35-utilities\") pod \"b8a6c263-cf9a-41f9-8ea0-fb07b0596a35\" (UID: \"b8a6c263-cf9a-41f9-8ea0-fb07b0596a35\") " Mar 10 21:57:48 crc kubenswrapper[4919]: I0310 21:57:48.215212 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df08dbe0-09b0-4d23-b99b-95b65818f84e-catalog-content\") pod \"df08dbe0-09b0-4d23-b99b-95b65818f84e\" (UID: \"df08dbe0-09b0-4d23-b99b-95b65818f84e\") " Mar 10 
21:57:48 crc kubenswrapper[4919]: I0310 21:57:48.215242 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cbtv9\" (UniqueName: \"kubernetes.io/projected/b8a6c263-cf9a-41f9-8ea0-fb07b0596a35-kube-api-access-cbtv9\") pod \"b8a6c263-cf9a-41f9-8ea0-fb07b0596a35\" (UID: \"b8a6c263-cf9a-41f9-8ea0-fb07b0596a35\") " Mar 10 21:57:48 crc kubenswrapper[4919]: I0310 21:57:48.215273 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28b0abdd-217d-42f6-80fb-b270be44700e-catalog-content\") pod \"28b0abdd-217d-42f6-80fb-b270be44700e\" (UID: \"28b0abdd-217d-42f6-80fb-b270be44700e\") " Mar 10 21:57:48 crc kubenswrapper[4919]: I0310 21:57:48.215301 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bgvkg\" (UniqueName: \"kubernetes.io/projected/28b0abdd-217d-42f6-80fb-b270be44700e-kube-api-access-bgvkg\") pod \"28b0abdd-217d-42f6-80fb-b270be44700e\" (UID: \"28b0abdd-217d-42f6-80fb-b270be44700e\") " Mar 10 21:57:48 crc kubenswrapper[4919]: I0310 21:57:48.215340 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df08dbe0-09b0-4d23-b99b-95b65818f84e-utilities\") pod \"df08dbe0-09b0-4d23-b99b-95b65818f84e\" (UID: \"df08dbe0-09b0-4d23-b99b-95b65818f84e\") " Mar 10 21:57:48 crc kubenswrapper[4919]: I0310 21:57:48.215366 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28b0abdd-217d-42f6-80fb-b270be44700e-utilities\") pod \"28b0abdd-217d-42f6-80fb-b270be44700e\" (UID: \"28b0abdd-217d-42f6-80fb-b270be44700e\") " Mar 10 21:57:48 crc kubenswrapper[4919]: I0310 21:57:48.215902 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b8a6c263-cf9a-41f9-8ea0-fb07b0596a35-utilities" 
(OuterVolumeSpecName: "utilities") pod "b8a6c263-cf9a-41f9-8ea0-fb07b0596a35" (UID: "b8a6c263-cf9a-41f9-8ea0-fb07b0596a35"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 21:57:48 crc kubenswrapper[4919]: I0310 21:57:48.216503 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-88pcz\" (UniqueName: \"kubernetes.io/projected/df08dbe0-09b0-4d23-b99b-95b65818f84e-kube-api-access-88pcz\") pod \"df08dbe0-09b0-4d23-b99b-95b65818f84e\" (UID: \"df08dbe0-09b0-4d23-b99b-95b65818f84e\") " Mar 10 21:57:48 crc kubenswrapper[4919]: I0310 21:57:48.216898 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8a6c263-cf9a-41f9-8ea0-fb07b0596a35-catalog-content\") pod \"b8a6c263-cf9a-41f9-8ea0-fb07b0596a35\" (UID: \"b8a6c263-cf9a-41f9-8ea0-fb07b0596a35\") " Mar 10 21:57:48 crc kubenswrapper[4919]: I0310 21:57:48.217479 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nrrpd\" (UniqueName: \"kubernetes.io/projected/be41b09e-a8ff-4367-a68d-865f047e2549-kube-api-access-nrrpd\") on node \"crc\" DevicePath \"\"" Mar 10 21:57:48 crc kubenswrapper[4919]: I0310 21:57:48.217568 4919 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/be41b09e-a8ff-4367-a68d-865f047e2549-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 10 21:57:48 crc kubenswrapper[4919]: I0310 21:57:48.218621 4919 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/be41b09e-a8ff-4367-a68d-865f047e2549-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 10 21:57:48 crc kubenswrapper[4919]: I0310 21:57:48.218645 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-grznq\" (UniqueName: 
\"kubernetes.io/projected/ccd7b53d-726b-444f-be0f-4eb2655eb35d-kube-api-access-grznq\") on node \"crc\" DevicePath \"\"" Mar 10 21:57:48 crc kubenswrapper[4919]: I0310 21:57:48.218656 4919 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8a6c263-cf9a-41f9-8ea0-fb07b0596a35-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 21:57:48 crc kubenswrapper[4919]: I0310 21:57:48.218681 4919 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ccd7b53d-726b-444f-be0f-4eb2655eb35d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 21:57:48 crc kubenswrapper[4919]: I0310 21:57:48.218692 4919 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ccd7b53d-726b-444f-be0f-4eb2655eb35d-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 21:57:48 crc kubenswrapper[4919]: I0310 21:57:48.216598 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/df08dbe0-09b0-4d23-b99b-95b65818f84e-utilities" (OuterVolumeSpecName: "utilities") pod "df08dbe0-09b0-4d23-b99b-95b65818f84e" (UID: "df08dbe0-09b0-4d23-b99b-95b65818f84e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 21:57:48 crc kubenswrapper[4919]: I0310 21:57:48.217462 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/28b0abdd-217d-42f6-80fb-b270be44700e-utilities" (OuterVolumeSpecName: "utilities") pod "28b0abdd-217d-42f6-80fb-b270be44700e" (UID: "28b0abdd-217d-42f6-80fb-b270be44700e"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 21:57:48 crc kubenswrapper[4919]: I0310 21:57:48.219509 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8a6c263-cf9a-41f9-8ea0-fb07b0596a35-kube-api-access-cbtv9" (OuterVolumeSpecName: "kube-api-access-cbtv9") pod "b8a6c263-cf9a-41f9-8ea0-fb07b0596a35" (UID: "b8a6c263-cf9a-41f9-8ea0-fb07b0596a35"). InnerVolumeSpecName "kube-api-access-cbtv9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 21:57:48 crc kubenswrapper[4919]: I0310 21:57:48.227858 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28b0abdd-217d-42f6-80fb-b270be44700e-kube-api-access-bgvkg" (OuterVolumeSpecName: "kube-api-access-bgvkg") pod "28b0abdd-217d-42f6-80fb-b270be44700e" (UID: "28b0abdd-217d-42f6-80fb-b270be44700e"). InnerVolumeSpecName "kube-api-access-bgvkg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 21:57:48 crc kubenswrapper[4919]: I0310 21:57:48.235355 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df08dbe0-09b0-4d23-b99b-95b65818f84e-kube-api-access-88pcz" (OuterVolumeSpecName: "kube-api-access-88pcz") pod "df08dbe0-09b0-4d23-b99b-95b65818f84e" (UID: "df08dbe0-09b0-4d23-b99b-95b65818f84e"). InnerVolumeSpecName "kube-api-access-88pcz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 21:57:48 crc kubenswrapper[4919]: I0310 21:57:48.244226 4919 scope.go:117] "RemoveContainer" containerID="5597d9887052ad3bb5f7a47a09eb3d40624bd526912ef8b0e30e5d988b40f6e7" Mar 10 21:57:48 crc kubenswrapper[4919]: I0310 21:57:48.252515 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/df08dbe0-09b0-4d23-b99b-95b65818f84e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "df08dbe0-09b0-4d23-b99b-95b65818f84e" (UID: "df08dbe0-09b0-4d23-b99b-95b65818f84e"). 
InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 21:57:48 crc kubenswrapper[4919]: I0310 21:57:48.269091 4919 scope.go:117] "RemoveContainer" containerID="06a16c299397b98c6a44c20dce3634f6cff08dcaaba9370c80d0da732dbff08e" Mar 10 21:57:48 crc kubenswrapper[4919]: I0310 21:57:48.283897 4919 scope.go:117] "RemoveContainer" containerID="7e52e72bd6cec0cf50811ec53ec531224f23725c50a9a5e4de595dfd7bc9d6a1" Mar 10 21:57:48 crc kubenswrapper[4919]: E0310 21:57:48.284331 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e52e72bd6cec0cf50811ec53ec531224f23725c50a9a5e4de595dfd7bc9d6a1\": container with ID starting with 7e52e72bd6cec0cf50811ec53ec531224f23725c50a9a5e4de595dfd7bc9d6a1 not found: ID does not exist" containerID="7e52e72bd6cec0cf50811ec53ec531224f23725c50a9a5e4de595dfd7bc9d6a1" Mar 10 21:57:48 crc kubenswrapper[4919]: I0310 21:57:48.284371 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e52e72bd6cec0cf50811ec53ec531224f23725c50a9a5e4de595dfd7bc9d6a1"} err="failed to get container status \"7e52e72bd6cec0cf50811ec53ec531224f23725c50a9a5e4de595dfd7bc9d6a1\": rpc error: code = NotFound desc = could not find container \"7e52e72bd6cec0cf50811ec53ec531224f23725c50a9a5e4de595dfd7bc9d6a1\": container with ID starting with 7e52e72bd6cec0cf50811ec53ec531224f23725c50a9a5e4de595dfd7bc9d6a1 not found: ID does not exist" Mar 10 21:57:48 crc kubenswrapper[4919]: I0310 21:57:48.284536 4919 scope.go:117] "RemoveContainer" containerID="5597d9887052ad3bb5f7a47a09eb3d40624bd526912ef8b0e30e5d988b40f6e7" Mar 10 21:57:48 crc kubenswrapper[4919]: E0310 21:57:48.284875 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5597d9887052ad3bb5f7a47a09eb3d40624bd526912ef8b0e30e5d988b40f6e7\": container with ID starting with 
5597d9887052ad3bb5f7a47a09eb3d40624bd526912ef8b0e30e5d988b40f6e7 not found: ID does not exist" containerID="5597d9887052ad3bb5f7a47a09eb3d40624bd526912ef8b0e30e5d988b40f6e7" Mar 10 21:57:48 crc kubenswrapper[4919]: I0310 21:57:48.284912 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5597d9887052ad3bb5f7a47a09eb3d40624bd526912ef8b0e30e5d988b40f6e7"} err="failed to get container status \"5597d9887052ad3bb5f7a47a09eb3d40624bd526912ef8b0e30e5d988b40f6e7\": rpc error: code = NotFound desc = could not find container \"5597d9887052ad3bb5f7a47a09eb3d40624bd526912ef8b0e30e5d988b40f6e7\": container with ID starting with 5597d9887052ad3bb5f7a47a09eb3d40624bd526912ef8b0e30e5d988b40f6e7 not found: ID does not exist" Mar 10 21:57:48 crc kubenswrapper[4919]: I0310 21:57:48.284934 4919 scope.go:117] "RemoveContainer" containerID="06a16c299397b98c6a44c20dce3634f6cff08dcaaba9370c80d0da732dbff08e" Mar 10 21:57:48 crc kubenswrapper[4919]: E0310 21:57:48.285234 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"06a16c299397b98c6a44c20dce3634f6cff08dcaaba9370c80d0da732dbff08e\": container with ID starting with 06a16c299397b98c6a44c20dce3634f6cff08dcaaba9370c80d0da732dbff08e not found: ID does not exist" containerID="06a16c299397b98c6a44c20dce3634f6cff08dcaaba9370c80d0da732dbff08e" Mar 10 21:57:48 crc kubenswrapper[4919]: I0310 21:57:48.285262 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06a16c299397b98c6a44c20dce3634f6cff08dcaaba9370c80d0da732dbff08e"} err="failed to get container status \"06a16c299397b98c6a44c20dce3634f6cff08dcaaba9370c80d0da732dbff08e\": rpc error: code = NotFound desc = could not find container \"06a16c299397b98c6a44c20dce3634f6cff08dcaaba9370c80d0da732dbff08e\": container with ID starting with 06a16c299397b98c6a44c20dce3634f6cff08dcaaba9370c80d0da732dbff08e not found: ID does not 
exist" Mar 10 21:57:48 crc kubenswrapper[4919]: I0310 21:57:48.285281 4919 scope.go:117] "RemoveContainer" containerID="f9f25006ddf0e6f5079d6de5f9c62723dfd9c3d04a9dd09f0e189bf16aee56e2" Mar 10 21:57:48 crc kubenswrapper[4919]: I0310 21:57:48.296224 4919 scope.go:117] "RemoveContainer" containerID="55a70b9856c0e8fe8419127b5aebdd393cc009ccdd1977e81c88f3675445fc5d" Mar 10 21:57:48 crc kubenswrapper[4919]: I0310 21:57:48.310194 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b8a6c263-cf9a-41f9-8ea0-fb07b0596a35-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b8a6c263-cf9a-41f9-8ea0-fb07b0596a35" (UID: "b8a6c263-cf9a-41f9-8ea0-fb07b0596a35"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 21:57:48 crc kubenswrapper[4919]: I0310 21:57:48.311279 4919 scope.go:117] "RemoveContainer" containerID="b531133482d76a52b6832a37a5b7850b4438892b3a655d3ef66716ff4e48bb55" Mar 10 21:57:48 crc kubenswrapper[4919]: I0310 21:57:48.320083 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bgvkg\" (UniqueName: \"kubernetes.io/projected/28b0abdd-217d-42f6-80fb-b270be44700e-kube-api-access-bgvkg\") on node \"crc\" DevicePath \"\"" Mar 10 21:57:48 crc kubenswrapper[4919]: I0310 21:57:48.320118 4919 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df08dbe0-09b0-4d23-b99b-95b65818f84e-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 21:57:48 crc kubenswrapper[4919]: I0310 21:57:48.320131 4919 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28b0abdd-217d-42f6-80fb-b270be44700e-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 21:57:48 crc kubenswrapper[4919]: I0310 21:57:48.320143 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-88pcz\" (UniqueName: 
\"kubernetes.io/projected/df08dbe0-09b0-4d23-b99b-95b65818f84e-kube-api-access-88pcz\") on node \"crc\" DevicePath \"\"" Mar 10 21:57:48 crc kubenswrapper[4919]: I0310 21:57:48.320154 4919 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8a6c263-cf9a-41f9-8ea0-fb07b0596a35-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 21:57:48 crc kubenswrapper[4919]: I0310 21:57:48.320173 4919 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df08dbe0-09b0-4d23-b99b-95b65818f84e-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 21:57:48 crc kubenswrapper[4919]: I0310 21:57:48.320187 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cbtv9\" (UniqueName: \"kubernetes.io/projected/b8a6c263-cf9a-41f9-8ea0-fb07b0596a35-kube-api-access-cbtv9\") on node \"crc\" DevicePath \"\"" Mar 10 21:57:48 crc kubenswrapper[4919]: I0310 21:57:48.325064 4919 scope.go:117] "RemoveContainer" containerID="f9f25006ddf0e6f5079d6de5f9c62723dfd9c3d04a9dd09f0e189bf16aee56e2" Mar 10 21:57:48 crc kubenswrapper[4919]: E0310 21:57:48.325727 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f9f25006ddf0e6f5079d6de5f9c62723dfd9c3d04a9dd09f0e189bf16aee56e2\": container with ID starting with f9f25006ddf0e6f5079d6de5f9c62723dfd9c3d04a9dd09f0e189bf16aee56e2 not found: ID does not exist" containerID="f9f25006ddf0e6f5079d6de5f9c62723dfd9c3d04a9dd09f0e189bf16aee56e2" Mar 10 21:57:48 crc kubenswrapper[4919]: I0310 21:57:48.325766 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9f25006ddf0e6f5079d6de5f9c62723dfd9c3d04a9dd09f0e189bf16aee56e2"} err="failed to get container status \"f9f25006ddf0e6f5079d6de5f9c62723dfd9c3d04a9dd09f0e189bf16aee56e2\": rpc error: code = NotFound desc = could not find container 
\"f9f25006ddf0e6f5079d6de5f9c62723dfd9c3d04a9dd09f0e189bf16aee56e2\": container with ID starting with f9f25006ddf0e6f5079d6de5f9c62723dfd9c3d04a9dd09f0e189bf16aee56e2 not found: ID does not exist" Mar 10 21:57:48 crc kubenswrapper[4919]: I0310 21:57:48.325797 4919 scope.go:117] "RemoveContainer" containerID="55a70b9856c0e8fe8419127b5aebdd393cc009ccdd1977e81c88f3675445fc5d" Mar 10 21:57:48 crc kubenswrapper[4919]: E0310 21:57:48.326279 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"55a70b9856c0e8fe8419127b5aebdd393cc009ccdd1977e81c88f3675445fc5d\": container with ID starting with 55a70b9856c0e8fe8419127b5aebdd393cc009ccdd1977e81c88f3675445fc5d not found: ID does not exist" containerID="55a70b9856c0e8fe8419127b5aebdd393cc009ccdd1977e81c88f3675445fc5d" Mar 10 21:57:48 crc kubenswrapper[4919]: I0310 21:57:48.326323 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55a70b9856c0e8fe8419127b5aebdd393cc009ccdd1977e81c88f3675445fc5d"} err="failed to get container status \"55a70b9856c0e8fe8419127b5aebdd393cc009ccdd1977e81c88f3675445fc5d\": rpc error: code = NotFound desc = could not find container \"55a70b9856c0e8fe8419127b5aebdd393cc009ccdd1977e81c88f3675445fc5d\": container with ID starting with 55a70b9856c0e8fe8419127b5aebdd393cc009ccdd1977e81c88f3675445fc5d not found: ID does not exist" Mar 10 21:57:48 crc kubenswrapper[4919]: I0310 21:57:48.326358 4919 scope.go:117] "RemoveContainer" containerID="b531133482d76a52b6832a37a5b7850b4438892b3a655d3ef66716ff4e48bb55" Mar 10 21:57:48 crc kubenswrapper[4919]: E0310 21:57:48.326684 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b531133482d76a52b6832a37a5b7850b4438892b3a655d3ef66716ff4e48bb55\": container with ID starting with b531133482d76a52b6832a37a5b7850b4438892b3a655d3ef66716ff4e48bb55 not found: ID does not exist" 
containerID="b531133482d76a52b6832a37a5b7850b4438892b3a655d3ef66716ff4e48bb55" Mar 10 21:57:48 crc kubenswrapper[4919]: I0310 21:57:48.326724 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b531133482d76a52b6832a37a5b7850b4438892b3a655d3ef66716ff4e48bb55"} err="failed to get container status \"b531133482d76a52b6832a37a5b7850b4438892b3a655d3ef66716ff4e48bb55\": rpc error: code = NotFound desc = could not find container \"b531133482d76a52b6832a37a5b7850b4438892b3a655d3ef66716ff4e48bb55\": container with ID starting with b531133482d76a52b6832a37a5b7850b4438892b3a655d3ef66716ff4e48bb55 not found: ID does not exist" Mar 10 21:57:48 crc kubenswrapper[4919]: I0310 21:57:48.326751 4919 scope.go:117] "RemoveContainer" containerID="79e6104fadf71887acd4c73fd5b0783822885bc0c62332193a11ef5b30efabf3" Mar 10 21:57:48 crc kubenswrapper[4919]: I0310 21:57:48.339810 4919 scope.go:117] "RemoveContainer" containerID="79e6104fadf71887acd4c73fd5b0783822885bc0c62332193a11ef5b30efabf3" Mar 10 21:57:48 crc kubenswrapper[4919]: E0310 21:57:48.340232 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"79e6104fadf71887acd4c73fd5b0783822885bc0c62332193a11ef5b30efabf3\": container with ID starting with 79e6104fadf71887acd4c73fd5b0783822885bc0c62332193a11ef5b30efabf3 not found: ID does not exist" containerID="79e6104fadf71887acd4c73fd5b0783822885bc0c62332193a11ef5b30efabf3" Mar 10 21:57:48 crc kubenswrapper[4919]: I0310 21:57:48.340263 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79e6104fadf71887acd4c73fd5b0783822885bc0c62332193a11ef5b30efabf3"} err="failed to get container status \"79e6104fadf71887acd4c73fd5b0783822885bc0c62332193a11ef5b30efabf3\": rpc error: code = NotFound desc = could not find container \"79e6104fadf71887acd4c73fd5b0783822885bc0c62332193a11ef5b30efabf3\": container with ID starting with 
79e6104fadf71887acd4c73fd5b0783822885bc0c62332193a11ef5b30efabf3 not found: ID does not exist" Mar 10 21:57:48 crc kubenswrapper[4919]: I0310 21:57:48.340283 4919 scope.go:117] "RemoveContainer" containerID="106158adf1276549cbb2b41b4d8fd567346d8c5242635f4b25ce5ebec21b8df8" Mar 10 21:57:48 crc kubenswrapper[4919]: I0310 21:57:48.351272 4919 scope.go:117] "RemoveContainer" containerID="b084e7f86183c0b0f5efc539c2ea8a877ad9f32b20da9de9012d85755165062e" Mar 10 21:57:48 crc kubenswrapper[4919]: I0310 21:57:48.364378 4919 scope.go:117] "RemoveContainer" containerID="8e48b76e7b85e1c154707155fdb33a2e17370b61feababa3d099aad57c1b2fb1" Mar 10 21:57:48 crc kubenswrapper[4919]: I0310 21:57:48.374550 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/28b0abdd-217d-42f6-80fb-b270be44700e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "28b0abdd-217d-42f6-80fb-b270be44700e" (UID: "28b0abdd-217d-42f6-80fb-b270be44700e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 21:57:48 crc kubenswrapper[4919]: I0310 21:57:48.375237 4919 scope.go:117] "RemoveContainer" containerID="106158adf1276549cbb2b41b4d8fd567346d8c5242635f4b25ce5ebec21b8df8"
Mar 10 21:57:48 crc kubenswrapper[4919]: E0310 21:57:48.375802 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"106158adf1276549cbb2b41b4d8fd567346d8c5242635f4b25ce5ebec21b8df8\": container with ID starting with 106158adf1276549cbb2b41b4d8fd567346d8c5242635f4b25ce5ebec21b8df8 not found: ID does not exist" containerID="106158adf1276549cbb2b41b4d8fd567346d8c5242635f4b25ce5ebec21b8df8"
Mar 10 21:57:48 crc kubenswrapper[4919]: I0310 21:57:48.375843 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"106158adf1276549cbb2b41b4d8fd567346d8c5242635f4b25ce5ebec21b8df8"} err="failed to get container status \"106158adf1276549cbb2b41b4d8fd567346d8c5242635f4b25ce5ebec21b8df8\": rpc error: code = NotFound desc = could not find container \"106158adf1276549cbb2b41b4d8fd567346d8c5242635f4b25ce5ebec21b8df8\": container with ID starting with 106158adf1276549cbb2b41b4d8fd567346d8c5242635f4b25ce5ebec21b8df8 not found: ID does not exist"
Mar 10 21:57:48 crc kubenswrapper[4919]: I0310 21:57:48.375874 4919 scope.go:117] "RemoveContainer" containerID="b084e7f86183c0b0f5efc539c2ea8a877ad9f32b20da9de9012d85755165062e"
Mar 10 21:57:48 crc kubenswrapper[4919]: E0310 21:57:48.376721 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b084e7f86183c0b0f5efc539c2ea8a877ad9f32b20da9de9012d85755165062e\": container with ID starting with b084e7f86183c0b0f5efc539c2ea8a877ad9f32b20da9de9012d85755165062e not found: ID does not exist" containerID="b084e7f86183c0b0f5efc539c2ea8a877ad9f32b20da9de9012d85755165062e"
Mar 10 21:57:48 crc kubenswrapper[4919]: I0310 21:57:48.376763 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b084e7f86183c0b0f5efc539c2ea8a877ad9f32b20da9de9012d85755165062e"} err="failed to get container status \"b084e7f86183c0b0f5efc539c2ea8a877ad9f32b20da9de9012d85755165062e\": rpc error: code = NotFound desc = could not find container \"b084e7f86183c0b0f5efc539c2ea8a877ad9f32b20da9de9012d85755165062e\": container with ID starting with b084e7f86183c0b0f5efc539c2ea8a877ad9f32b20da9de9012d85755165062e not found: ID does not exist"
Mar 10 21:57:48 crc kubenswrapper[4919]: I0310 21:57:48.376786 4919 scope.go:117] "RemoveContainer" containerID="8e48b76e7b85e1c154707155fdb33a2e17370b61feababa3d099aad57c1b2fb1"
Mar 10 21:57:48 crc kubenswrapper[4919]: E0310 21:57:48.377080 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e48b76e7b85e1c154707155fdb33a2e17370b61feababa3d099aad57c1b2fb1\": container with ID starting with 8e48b76e7b85e1c154707155fdb33a2e17370b61feababa3d099aad57c1b2fb1 not found: ID does not exist" containerID="8e48b76e7b85e1c154707155fdb33a2e17370b61feababa3d099aad57c1b2fb1"
Mar 10 21:57:48 crc kubenswrapper[4919]: I0310 21:57:48.377133 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e48b76e7b85e1c154707155fdb33a2e17370b61feababa3d099aad57c1b2fb1"} err="failed to get container status \"8e48b76e7b85e1c154707155fdb33a2e17370b61feababa3d099aad57c1b2fb1\": rpc error: code = NotFound desc = could not find container \"8e48b76e7b85e1c154707155fdb33a2e17370b61feababa3d099aad57c1b2fb1\": container with ID starting with 8e48b76e7b85e1c154707155fdb33a2e17370b61feababa3d099aad57c1b2fb1 not found: ID does not exist"
Mar 10 21:57:48 crc kubenswrapper[4919]: I0310 21:57:48.421155 4919 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28b0abdd-217d-42f6-80fb-b270be44700e-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 10 21:57:48 crc kubenswrapper[4919]: I0310 21:57:48.479306 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-f4nt4"]
Mar 10 21:57:48 crc kubenswrapper[4919]: I0310 21:57:48.482032 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-f4nt4"]
Mar 10 21:57:48 crc kubenswrapper[4919]: I0310 21:57:48.491448 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-pw22n"]
Mar 10 21:57:48 crc kubenswrapper[4919]: I0310 21:57:48.496775 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-pw22n"]
Mar 10 21:57:48 crc kubenswrapper[4919]: I0310 21:57:48.502474 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-s8qvz"]
Mar 10 21:57:48 crc kubenswrapper[4919]: I0310 21:57:48.514462 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-s8qvz"]
Mar 10 21:57:48 crc kubenswrapper[4919]: I0310 21:57:48.517880 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-lnq5q"]
Mar 10 21:57:48 crc kubenswrapper[4919]: I0310 21:57:48.520453 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-lnq5q"]
Mar 10 21:57:48 crc kubenswrapper[4919]: I0310 21:57:48.938763 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-xxm8b"]
Mar 10 21:57:48 crc kubenswrapper[4919]: E0310 21:57:48.939007 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ccd7b53d-726b-444f-be0f-4eb2655eb35d" containerName="extract-utilities"
Mar 10 21:57:48 crc kubenswrapper[4919]: I0310 21:57:48.939022 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccd7b53d-726b-444f-be0f-4eb2655eb35d" containerName="extract-utilities"
Mar 10 21:57:48 crc kubenswrapper[4919]: E0310 21:57:48.939034 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be41b09e-a8ff-4367-a68d-865f047e2549" containerName="marketplace-operator"
Mar 10 21:57:48 crc kubenswrapper[4919]: I0310 21:57:48.939042 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="be41b09e-a8ff-4367-a68d-865f047e2549" containerName="marketplace-operator"
Mar 10 21:57:48 crc kubenswrapper[4919]: E0310 21:57:48.939053 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df08dbe0-09b0-4d23-b99b-95b65818f84e" containerName="extract-utilities"
Mar 10 21:57:48 crc kubenswrapper[4919]: I0310 21:57:48.939061 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="df08dbe0-09b0-4d23-b99b-95b65818f84e" containerName="extract-utilities"
Mar 10 21:57:48 crc kubenswrapper[4919]: E0310 21:57:48.939068 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28b0abdd-217d-42f6-80fb-b270be44700e" containerName="registry-server"
Mar 10 21:57:48 crc kubenswrapper[4919]: I0310 21:57:48.939075 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="28b0abdd-217d-42f6-80fb-b270be44700e" containerName="registry-server"
Mar 10 21:57:48 crc kubenswrapper[4919]: E0310 21:57:48.939084 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ccd7b53d-726b-444f-be0f-4eb2655eb35d" containerName="extract-content"
Mar 10 21:57:48 crc kubenswrapper[4919]: I0310 21:57:48.939093 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccd7b53d-726b-444f-be0f-4eb2655eb35d" containerName="extract-content"
Mar 10 21:57:48 crc kubenswrapper[4919]: E0310 21:57:48.939102 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df08dbe0-09b0-4d23-b99b-95b65818f84e" containerName="extract-content"
Mar 10 21:57:48 crc kubenswrapper[4919]: I0310 21:57:48.939109 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="df08dbe0-09b0-4d23-b99b-95b65818f84e" containerName="extract-content"
Mar 10 21:57:48 crc kubenswrapper[4919]: E0310 21:57:48.939117 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28b0abdd-217d-42f6-80fb-b270be44700e" containerName="extract-content"
Mar 10 21:57:48 crc kubenswrapper[4919]: I0310 21:57:48.939124 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="28b0abdd-217d-42f6-80fb-b270be44700e" containerName="extract-content"
Mar 10 21:57:48 crc kubenswrapper[4919]: E0310 21:57:48.939136 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ccd7b53d-726b-444f-be0f-4eb2655eb35d" containerName="registry-server"
Mar 10 21:57:48 crc kubenswrapper[4919]: I0310 21:57:48.939144 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccd7b53d-726b-444f-be0f-4eb2655eb35d" containerName="registry-server"
Mar 10 21:57:48 crc kubenswrapper[4919]: E0310 21:57:48.939156 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8a6c263-cf9a-41f9-8ea0-fb07b0596a35" containerName="extract-utilities"
Mar 10 21:57:48 crc kubenswrapper[4919]: I0310 21:57:48.939164 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8a6c263-cf9a-41f9-8ea0-fb07b0596a35" containerName="extract-utilities"
Mar 10 21:57:48 crc kubenswrapper[4919]: E0310 21:57:48.939176 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df08dbe0-09b0-4d23-b99b-95b65818f84e" containerName="registry-server"
Mar 10 21:57:48 crc kubenswrapper[4919]: I0310 21:57:48.939183 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="df08dbe0-09b0-4d23-b99b-95b65818f84e" containerName="registry-server"
Mar 10 21:57:48 crc kubenswrapper[4919]: E0310 21:57:48.939191 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8a6c263-cf9a-41f9-8ea0-fb07b0596a35" containerName="registry-server"
Mar 10 21:57:48 crc kubenswrapper[4919]: I0310 21:57:48.939199 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8a6c263-cf9a-41f9-8ea0-fb07b0596a35" containerName="registry-server"
Mar 10 21:57:48 crc kubenswrapper[4919]: E0310 21:57:48.939212 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28b0abdd-217d-42f6-80fb-b270be44700e" containerName="extract-utilities"
Mar 10 21:57:48 crc kubenswrapper[4919]: I0310 21:57:48.939221 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="28b0abdd-217d-42f6-80fb-b270be44700e" containerName="extract-utilities"
Mar 10 21:57:48 crc kubenswrapper[4919]: E0310 21:57:48.939233 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8a6c263-cf9a-41f9-8ea0-fb07b0596a35" containerName="extract-content"
Mar 10 21:57:48 crc kubenswrapper[4919]: I0310 21:57:48.939242 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8a6c263-cf9a-41f9-8ea0-fb07b0596a35" containerName="extract-content"
Mar 10 21:57:48 crc kubenswrapper[4919]: I0310 21:57:48.939363 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="be41b09e-a8ff-4367-a68d-865f047e2549" containerName="marketplace-operator"
Mar 10 21:57:48 crc kubenswrapper[4919]: I0310 21:57:48.939374 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="28b0abdd-217d-42f6-80fb-b270be44700e" containerName="registry-server"
Mar 10 21:57:48 crc kubenswrapper[4919]: I0310 21:57:48.939383 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8a6c263-cf9a-41f9-8ea0-fb07b0596a35" containerName="registry-server"
Mar 10 21:57:48 crc kubenswrapper[4919]: I0310 21:57:48.939409 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="ccd7b53d-726b-444f-be0f-4eb2655eb35d" containerName="registry-server"
Mar 10 21:57:48 crc kubenswrapper[4919]: I0310 21:57:48.939419 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="df08dbe0-09b0-4d23-b99b-95b65818f84e" containerName="registry-server"
Mar 10 21:57:48 crc kubenswrapper[4919]: I0310 21:57:48.940289 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xxm8b"
Mar 10 21:57:48 crc kubenswrapper[4919]: I0310 21:57:48.942206 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Mar 10 21:57:48 crc kubenswrapper[4919]: I0310 21:57:48.963488 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xxm8b"]
Mar 10 21:57:49 crc kubenswrapper[4919]: I0310 21:57:49.132029 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xsvm6\" (UniqueName: \"kubernetes.io/projected/ff7f55ab-64a4-44d6-8f11-a28f589bcfd7-kube-api-access-xsvm6\") pod \"certified-operators-xxm8b\" (UID: \"ff7f55ab-64a4-44d6-8f11-a28f589bcfd7\") " pod="openshift-marketplace/certified-operators-xxm8b"
Mar 10 21:57:49 crc kubenswrapper[4919]: I0310 21:57:49.132337 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff7f55ab-64a4-44d6-8f11-a28f589bcfd7-utilities\") pod \"certified-operators-xxm8b\" (UID: \"ff7f55ab-64a4-44d6-8f11-a28f589bcfd7\") " pod="openshift-marketplace/certified-operators-xxm8b"
Mar 10 21:57:49 crc kubenswrapper[4919]: I0310 21:57:49.132520 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff7f55ab-64a4-44d6-8f11-a28f589bcfd7-catalog-content\") pod \"certified-operators-xxm8b\" (UID: \"ff7f55ab-64a4-44d6-8f11-a28f589bcfd7\") " pod="openshift-marketplace/certified-operators-xxm8b"
Mar 10 21:57:49 crc kubenswrapper[4919]: I0310 21:57:49.172855 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-w82bl"
Mar 10 21:57:49 crc kubenswrapper[4919]: I0310 21:57:49.233775 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xsvm6\" (UniqueName: \"kubernetes.io/projected/ff7f55ab-64a4-44d6-8f11-a28f589bcfd7-kube-api-access-xsvm6\") pod \"certified-operators-xxm8b\" (UID: \"ff7f55ab-64a4-44d6-8f11-a28f589bcfd7\") " pod="openshift-marketplace/certified-operators-xxm8b"
Mar 10 21:57:49 crc kubenswrapper[4919]: I0310 21:57:49.233844 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff7f55ab-64a4-44d6-8f11-a28f589bcfd7-utilities\") pod \"certified-operators-xxm8b\" (UID: \"ff7f55ab-64a4-44d6-8f11-a28f589bcfd7\") " pod="openshift-marketplace/certified-operators-xxm8b"
Mar 10 21:57:49 crc kubenswrapper[4919]: I0310 21:57:49.233893 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff7f55ab-64a4-44d6-8f11-a28f589bcfd7-catalog-content\") pod \"certified-operators-xxm8b\" (UID: \"ff7f55ab-64a4-44d6-8f11-a28f589bcfd7\") " pod="openshift-marketplace/certified-operators-xxm8b"
Mar 10 21:57:49 crc kubenswrapper[4919]: I0310 21:57:49.234446 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff7f55ab-64a4-44d6-8f11-a28f589bcfd7-catalog-content\") pod \"certified-operators-xxm8b\" (UID: \"ff7f55ab-64a4-44d6-8f11-a28f589bcfd7\") " pod="openshift-marketplace/certified-operators-xxm8b"
Mar 10 21:57:49 crc kubenswrapper[4919]: I0310 21:57:49.234540 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff7f55ab-64a4-44d6-8f11-a28f589bcfd7-utilities\") pod \"certified-operators-xxm8b\" (UID: \"ff7f55ab-64a4-44d6-8f11-a28f589bcfd7\") " pod="openshift-marketplace/certified-operators-xxm8b"
Mar 10 21:57:49 crc kubenswrapper[4919]: I0310 21:57:49.259410 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xsvm6\" (UniqueName: \"kubernetes.io/projected/ff7f55ab-64a4-44d6-8f11-a28f589bcfd7-kube-api-access-xsvm6\") pod \"certified-operators-xxm8b\" (UID: \"ff7f55ab-64a4-44d6-8f11-a28f589bcfd7\") " pod="openshift-marketplace/certified-operators-xxm8b"
Mar 10 21:57:49 crc kubenswrapper[4919]: I0310 21:57:49.490065 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28b0abdd-217d-42f6-80fb-b270be44700e" path="/var/lib/kubelet/pods/28b0abdd-217d-42f6-80fb-b270be44700e/volumes"
Mar 10 21:57:49 crc kubenswrapper[4919]: I0310 21:57:49.490833 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8a6c263-cf9a-41f9-8ea0-fb07b0596a35" path="/var/lib/kubelet/pods/b8a6c263-cf9a-41f9-8ea0-fb07b0596a35/volumes"
Mar 10 21:57:49 crc kubenswrapper[4919]: I0310 21:57:49.491596 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be41b09e-a8ff-4367-a68d-865f047e2549" path="/var/lib/kubelet/pods/be41b09e-a8ff-4367-a68d-865f047e2549/volumes"
Mar 10 21:57:49 crc kubenswrapper[4919]: I0310 21:57:49.492642 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ccd7b53d-726b-444f-be0f-4eb2655eb35d" path="/var/lib/kubelet/pods/ccd7b53d-726b-444f-be0f-4eb2655eb35d/volumes"
Mar 10 21:57:49 crc kubenswrapper[4919]: I0310 21:57:49.493342 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df08dbe0-09b0-4d23-b99b-95b65818f84e" path="/var/lib/kubelet/pods/df08dbe0-09b0-4d23-b99b-95b65818f84e/volumes"
Mar 10 21:57:49 crc kubenswrapper[4919]: I0310 21:57:49.558642 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xxm8b"
Mar 10 21:57:49 crc kubenswrapper[4919]: I0310 21:57:49.941622 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-b68ml"]
Mar 10 21:57:49 crc kubenswrapper[4919]: I0310 21:57:49.943682 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-b68ml"
Mar 10 21:57:49 crc kubenswrapper[4919]: I0310 21:57:49.946114 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Mar 10 21:57:49 crc kubenswrapper[4919]: I0310 21:57:49.948082 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-b68ml"]
Mar 10 21:57:49 crc kubenswrapper[4919]: I0310 21:57:49.955383 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xxm8b"]
Mar 10 21:57:50 crc kubenswrapper[4919]: I0310 21:57:50.045008 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7scp\" (UniqueName: \"kubernetes.io/projected/d9defb14-2db3-40d8-8081-495961bfedf1-kube-api-access-v7scp\") pod \"redhat-operators-b68ml\" (UID: \"d9defb14-2db3-40d8-8081-495961bfedf1\") " pod="openshift-marketplace/redhat-operators-b68ml"
Mar 10 21:57:50 crc kubenswrapper[4919]: I0310 21:57:50.045093 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9defb14-2db3-40d8-8081-495961bfedf1-utilities\") pod \"redhat-operators-b68ml\" (UID: \"d9defb14-2db3-40d8-8081-495961bfedf1\") " pod="openshift-marketplace/redhat-operators-b68ml"
Mar 10 21:57:50 crc kubenswrapper[4919]: I0310 21:57:50.045147 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9defb14-2db3-40d8-8081-495961bfedf1-catalog-content\") pod \"redhat-operators-b68ml\" (UID: \"d9defb14-2db3-40d8-8081-495961bfedf1\") " pod="openshift-marketplace/redhat-operators-b68ml"
Mar 10 21:57:50 crc kubenswrapper[4919]: I0310 21:57:50.145972 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v7scp\" (UniqueName: \"kubernetes.io/projected/d9defb14-2db3-40d8-8081-495961bfedf1-kube-api-access-v7scp\") pod \"redhat-operators-b68ml\" (UID: \"d9defb14-2db3-40d8-8081-495961bfedf1\") " pod="openshift-marketplace/redhat-operators-b68ml"
Mar 10 21:57:50 crc kubenswrapper[4919]: I0310 21:57:50.146348 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9defb14-2db3-40d8-8081-495961bfedf1-utilities\") pod \"redhat-operators-b68ml\" (UID: \"d9defb14-2db3-40d8-8081-495961bfedf1\") " pod="openshift-marketplace/redhat-operators-b68ml"
Mar 10 21:57:50 crc kubenswrapper[4919]: I0310 21:57:50.146375 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9defb14-2db3-40d8-8081-495961bfedf1-catalog-content\") pod \"redhat-operators-b68ml\" (UID: \"d9defb14-2db3-40d8-8081-495961bfedf1\") " pod="openshift-marketplace/redhat-operators-b68ml"
Mar 10 21:57:50 crc kubenswrapper[4919]: I0310 21:57:50.146827 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9defb14-2db3-40d8-8081-495961bfedf1-catalog-content\") pod \"redhat-operators-b68ml\" (UID: \"d9defb14-2db3-40d8-8081-495961bfedf1\") " pod="openshift-marketplace/redhat-operators-b68ml"
Mar 10 21:57:50 crc kubenswrapper[4919]: I0310 21:57:50.146982 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9defb14-2db3-40d8-8081-495961bfedf1-utilities\") pod \"redhat-operators-b68ml\" (UID: \"d9defb14-2db3-40d8-8081-495961bfedf1\") " pod="openshift-marketplace/redhat-operators-b68ml"
Mar 10 21:57:50 crc kubenswrapper[4919]: I0310 21:57:50.176199 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7scp\" (UniqueName: \"kubernetes.io/projected/d9defb14-2db3-40d8-8081-495961bfedf1-kube-api-access-v7scp\") pod \"redhat-operators-b68ml\" (UID: \"d9defb14-2db3-40d8-8081-495961bfedf1\") " pod="openshift-marketplace/redhat-operators-b68ml"
Mar 10 21:57:50 crc kubenswrapper[4919]: I0310 21:57:50.178193 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xxm8b" event={"ID":"ff7f55ab-64a4-44d6-8f11-a28f589bcfd7","Type":"ContainerStarted","Data":"a23955d6b9537b7ea53621c7eaec30979ef2a78541d13857d5565331ed34d329"}
Mar 10 21:57:50 crc kubenswrapper[4919]: I0310 21:57:50.331266 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-b68ml"
Mar 10 21:57:50 crc kubenswrapper[4919]: W0310 21:57:50.537922 4919 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd9defb14_2db3_40d8_8081_495961bfedf1.slice/crio-2aaa98ac37889704c3ade6e5d1e4c8437eaa1ef52ea4401b9525268307f43b32 WatchSource:0}: Error finding container 2aaa98ac37889704c3ade6e5d1e4c8437eaa1ef52ea4401b9525268307f43b32: Status 404 returned error can't find the container with id 2aaa98ac37889704c3ade6e5d1e4c8437eaa1ef52ea4401b9525268307f43b32
Mar 10 21:57:50 crc kubenswrapper[4919]: I0310 21:57:50.540714 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-b68ml"]
Mar 10 21:57:51 crc kubenswrapper[4919]: I0310 21:57:51.189810 4919 generic.go:334] "Generic (PLEG): container finished" podID="d9defb14-2db3-40d8-8081-495961bfedf1" containerID="6bb04e9e8d5f73d5a99b17deef98b937c1efc61790e1ff25d036d5ec5e4193ea" exitCode=0
Mar 10 21:57:51 crc kubenswrapper[4919]: I0310 21:57:51.190149 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b68ml" event={"ID":"d9defb14-2db3-40d8-8081-495961bfedf1","Type":"ContainerDied","Data":"6bb04e9e8d5f73d5a99b17deef98b937c1efc61790e1ff25d036d5ec5e4193ea"}
Mar 10 21:57:51 crc kubenswrapper[4919]: I0310 21:57:51.190177 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b68ml" event={"ID":"d9defb14-2db3-40d8-8081-495961bfedf1","Type":"ContainerStarted","Data":"2aaa98ac37889704c3ade6e5d1e4c8437eaa1ef52ea4401b9525268307f43b32"}
Mar 10 21:57:51 crc kubenswrapper[4919]: I0310 21:57:51.193911 4919 generic.go:334] "Generic (PLEG): container finished" podID="ff7f55ab-64a4-44d6-8f11-a28f589bcfd7" containerID="b8a921d0eee6a88332744ea0594fdd10a0f2ec934cdb3cead809eae0572affa6" exitCode=0
Mar 10 21:57:51 crc kubenswrapper[4919]: I0310 21:57:51.193930 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xxm8b" event={"ID":"ff7f55ab-64a4-44d6-8f11-a28f589bcfd7","Type":"ContainerDied","Data":"b8a921d0eee6a88332744ea0594fdd10a0f2ec934cdb3cead809eae0572affa6"}
Mar 10 21:57:51 crc kubenswrapper[4919]: I0310 21:57:51.344678 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-tbwp6"]
Mar 10 21:57:51 crc kubenswrapper[4919]: I0310 21:57:51.345711 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tbwp6"
Mar 10 21:57:51 crc kubenswrapper[4919]: I0310 21:57:51.348560 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Mar 10 21:57:51 crc kubenswrapper[4919]: I0310 21:57:51.359067 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-tbwp6"]
Mar 10 21:57:51 crc kubenswrapper[4919]: I0310 21:57:51.463132 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb2ad0d8-c5f7-43c3-b37c-2ec6c330703f-catalog-content\") pod \"community-operators-tbwp6\" (UID: \"eb2ad0d8-c5f7-43c3-b37c-2ec6c330703f\") " pod="openshift-marketplace/community-operators-tbwp6"
Mar 10 21:57:51 crc kubenswrapper[4919]: I0310 21:57:51.463233 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8nsv\" (UniqueName: \"kubernetes.io/projected/eb2ad0d8-c5f7-43c3-b37c-2ec6c330703f-kube-api-access-w8nsv\") pod \"community-operators-tbwp6\" (UID: \"eb2ad0d8-c5f7-43c3-b37c-2ec6c330703f\") " pod="openshift-marketplace/community-operators-tbwp6"
Mar 10 21:57:51 crc kubenswrapper[4919]: I0310 21:57:51.463292 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb2ad0d8-c5f7-43c3-b37c-2ec6c330703f-utilities\") pod \"community-operators-tbwp6\" (UID: \"eb2ad0d8-c5f7-43c3-b37c-2ec6c330703f\") " pod="openshift-marketplace/community-operators-tbwp6"
Mar 10 21:57:51 crc kubenswrapper[4919]: I0310 21:57:51.564938 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb2ad0d8-c5f7-43c3-b37c-2ec6c330703f-catalog-content\") pod \"community-operators-tbwp6\" (UID: \"eb2ad0d8-c5f7-43c3-b37c-2ec6c330703f\") " pod="openshift-marketplace/community-operators-tbwp6"
Mar 10 21:57:51 crc kubenswrapper[4919]: I0310 21:57:51.565001 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w8nsv\" (UniqueName: \"kubernetes.io/projected/eb2ad0d8-c5f7-43c3-b37c-2ec6c330703f-kube-api-access-w8nsv\") pod \"community-operators-tbwp6\" (UID: \"eb2ad0d8-c5f7-43c3-b37c-2ec6c330703f\") " pod="openshift-marketplace/community-operators-tbwp6"
Mar 10 21:57:51 crc kubenswrapper[4919]: I0310 21:57:51.565089 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb2ad0d8-c5f7-43c3-b37c-2ec6c330703f-utilities\") pod \"community-operators-tbwp6\" (UID: \"eb2ad0d8-c5f7-43c3-b37c-2ec6c330703f\") " pod="openshift-marketplace/community-operators-tbwp6"
Mar 10 21:57:51 crc kubenswrapper[4919]: I0310 21:57:51.565777 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb2ad0d8-c5f7-43c3-b37c-2ec6c330703f-utilities\") pod \"community-operators-tbwp6\" (UID: \"eb2ad0d8-c5f7-43c3-b37c-2ec6c330703f\") " pod="openshift-marketplace/community-operators-tbwp6"
Mar 10 21:57:51 crc kubenswrapper[4919]: I0310 21:57:51.565817 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb2ad0d8-c5f7-43c3-b37c-2ec6c330703f-catalog-content\") pod \"community-operators-tbwp6\" (UID: \"eb2ad0d8-c5f7-43c3-b37c-2ec6c330703f\") " pod="openshift-marketplace/community-operators-tbwp6"
Mar 10 21:57:51 crc kubenswrapper[4919]: I0310 21:57:51.589210 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8nsv\" (UniqueName: \"kubernetes.io/projected/eb2ad0d8-c5f7-43c3-b37c-2ec6c330703f-kube-api-access-w8nsv\") pod \"community-operators-tbwp6\" (UID: \"eb2ad0d8-c5f7-43c3-b37c-2ec6c330703f\") " pod="openshift-marketplace/community-operators-tbwp6"
Mar 10 21:57:51 crc kubenswrapper[4919]: I0310 21:57:51.667444 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tbwp6"
Mar 10 21:57:52 crc kubenswrapper[4919]: I0310 21:57:52.057229 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-tbwp6"]
Mar 10 21:57:52 crc kubenswrapper[4919]: W0310 21:57:52.064517 4919 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeb2ad0d8_c5f7_43c3_b37c_2ec6c330703f.slice/crio-e4f7f038fb1fb58c63418440bbf62a1b8866dd979443c4ce73e1098f9be2a798 WatchSource:0}: Error finding container e4f7f038fb1fb58c63418440bbf62a1b8866dd979443c4ce73e1098f9be2a798: Status 404 returned error can't find the container with id e4f7f038fb1fb58c63418440bbf62a1b8866dd979443c4ce73e1098f9be2a798
Mar 10 21:57:52 crc kubenswrapper[4919]: I0310 21:57:52.199258 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tbwp6" event={"ID":"eb2ad0d8-c5f7-43c3-b37c-2ec6c330703f","Type":"ContainerStarted","Data":"41d71ef6301ace56a43936309ddd485d1b59f9d2edea706578fad17d9a601a3c"}
Mar 10 21:57:52 crc kubenswrapper[4919]: I0310 21:57:52.199297 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tbwp6" event={"ID":"eb2ad0d8-c5f7-43c3-b37c-2ec6c330703f","Type":"ContainerStarted","Data":"e4f7f038fb1fb58c63418440bbf62a1b8866dd979443c4ce73e1098f9be2a798"}
Mar 10 21:57:52 crc kubenswrapper[4919]: I0310 21:57:52.349516 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-qjwrf"]
Mar 10 21:57:52 crc kubenswrapper[4919]: I0310 21:57:52.350817 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qjwrf"
Mar 10 21:57:52 crc kubenswrapper[4919]: I0310 21:57:52.352932 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Mar 10 21:57:52 crc kubenswrapper[4919]: I0310 21:57:52.357269 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qjwrf"]
Mar 10 21:57:52 crc kubenswrapper[4919]: I0310 21:57:52.475980 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3fa4689a-13af-4ed2-b9be-699f7bd519c1-catalog-content\") pod \"redhat-marketplace-qjwrf\" (UID: \"3fa4689a-13af-4ed2-b9be-699f7bd519c1\") " pod="openshift-marketplace/redhat-marketplace-qjwrf"
Mar 10 21:57:52 crc kubenswrapper[4919]: I0310 21:57:52.476029 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3fa4689a-13af-4ed2-b9be-699f7bd519c1-utilities\") pod \"redhat-marketplace-qjwrf\" (UID: \"3fa4689a-13af-4ed2-b9be-699f7bd519c1\") " pod="openshift-marketplace/redhat-marketplace-qjwrf"
Mar 10 21:57:52 crc kubenswrapper[4919]: I0310 21:57:52.476076 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sw449\" (UniqueName: \"kubernetes.io/projected/3fa4689a-13af-4ed2-b9be-699f7bd519c1-kube-api-access-sw449\") pod \"redhat-marketplace-qjwrf\" (UID: \"3fa4689a-13af-4ed2-b9be-699f7bd519c1\") " pod="openshift-marketplace/redhat-marketplace-qjwrf"
Mar 10 21:57:52 crc kubenswrapper[4919]: I0310 21:57:52.577185 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3fa4689a-13af-4ed2-b9be-699f7bd519c1-catalog-content\") pod \"redhat-marketplace-qjwrf\" (UID: \"3fa4689a-13af-4ed2-b9be-699f7bd519c1\") " pod="openshift-marketplace/redhat-marketplace-qjwrf"
Mar 10 21:57:52 crc kubenswrapper[4919]: I0310 21:57:52.577467 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3fa4689a-13af-4ed2-b9be-699f7bd519c1-utilities\") pod \"redhat-marketplace-qjwrf\" (UID: \"3fa4689a-13af-4ed2-b9be-699f7bd519c1\") " pod="openshift-marketplace/redhat-marketplace-qjwrf"
Mar 10 21:57:52 crc kubenswrapper[4919]: I0310 21:57:52.577514 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sw449\" (UniqueName: \"kubernetes.io/projected/3fa4689a-13af-4ed2-b9be-699f7bd519c1-kube-api-access-sw449\") pod \"redhat-marketplace-qjwrf\" (UID: \"3fa4689a-13af-4ed2-b9be-699f7bd519c1\") " pod="openshift-marketplace/redhat-marketplace-qjwrf"
Mar 10 21:57:52 crc kubenswrapper[4919]: I0310 21:57:52.578205 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3fa4689a-13af-4ed2-b9be-699f7bd519c1-catalog-content\") pod \"redhat-marketplace-qjwrf\" (UID: \"3fa4689a-13af-4ed2-b9be-699f7bd519c1\") " pod="openshift-marketplace/redhat-marketplace-qjwrf"
Mar 10 21:57:52 crc kubenswrapper[4919]: I0310 21:57:52.578450 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3fa4689a-13af-4ed2-b9be-699f7bd519c1-utilities\") pod \"redhat-marketplace-qjwrf\" (UID: \"3fa4689a-13af-4ed2-b9be-699f7bd519c1\") " pod="openshift-marketplace/redhat-marketplace-qjwrf"
Mar 10 21:57:52 crc kubenswrapper[4919]: I0310 21:57:52.601371 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sw449\" (UniqueName: \"kubernetes.io/projected/3fa4689a-13af-4ed2-b9be-699f7bd519c1-kube-api-access-sw449\") pod \"redhat-marketplace-qjwrf\" (UID: \"3fa4689a-13af-4ed2-b9be-699f7bd519c1\") " pod="openshift-marketplace/redhat-marketplace-qjwrf"
Mar 10 21:57:52 crc kubenswrapper[4919]: I0310 21:57:52.756913 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qjwrf"
Mar 10 21:57:53 crc kubenswrapper[4919]: I0310 21:57:53.205989 4919 generic.go:334] "Generic (PLEG): container finished" podID="eb2ad0d8-c5f7-43c3-b37c-2ec6c330703f" containerID="41d71ef6301ace56a43936309ddd485d1b59f9d2edea706578fad17d9a601a3c" exitCode=0
Mar 10 21:57:53 crc kubenswrapper[4919]: I0310 21:57:53.206047 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tbwp6" event={"ID":"eb2ad0d8-c5f7-43c3-b37c-2ec6c330703f","Type":"ContainerDied","Data":"41d71ef6301ace56a43936309ddd485d1b59f9d2edea706578fad17d9a601a3c"}
Mar 10 21:57:53 crc kubenswrapper[4919]: I0310 21:57:53.256990 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b68ml" event={"ID":"d9defb14-2db3-40d8-8081-495961bfedf1","Type":"ContainerStarted","Data":"96387a4bff33e40a30091d7e7b86b9d95cab4c4225b836af85d0cbd08c43e370"}
Mar 10 21:57:53 crc kubenswrapper[4919]: I0310 21:57:53.271911 4919 generic.go:334] "Generic (PLEG): container finished" podID="ff7f55ab-64a4-44d6-8f11-a28f589bcfd7" containerID="ca089684278834a339b5a6889d4cf85122def52b92d6ca8d6229825ec05e7f62" exitCode=0
Mar 10 21:57:53 crc kubenswrapper[4919]: I0310 21:57:53.271957 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xxm8b" event={"ID":"ff7f55ab-64a4-44d6-8f11-a28f589bcfd7","Type":"ContainerDied","Data":"ca089684278834a339b5a6889d4cf85122def52b92d6ca8d6229825ec05e7f62"}
Mar 10 21:57:54 crc kubenswrapper[4919]: I0310 21:57:54.170570 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qjwrf"]
Mar 10 21:57:54 crc kubenswrapper[4919]: I0310 21:57:54.278660 4919 generic.go:334] "Generic (PLEG): container finished" podID="eb2ad0d8-c5f7-43c3-b37c-2ec6c330703f" containerID="b88fd52e4a0cf8f94b1f1b6f7d3446ebd6db0eea43b1c61f26724b45091df042" exitCode=0
Mar 10 21:57:54 crc kubenswrapper[4919]: I0310 21:57:54.278726 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tbwp6" event={"ID":"eb2ad0d8-c5f7-43c3-b37c-2ec6c330703f","Type":"ContainerDied","Data":"b88fd52e4a0cf8f94b1f1b6f7d3446ebd6db0eea43b1c61f26724b45091df042"}
Mar 10 21:57:54 crc kubenswrapper[4919]: I0310 21:57:54.282550 4919 generic.go:334] "Generic (PLEG): container finished" podID="d9defb14-2db3-40d8-8081-495961bfedf1" containerID="96387a4bff33e40a30091d7e7b86b9d95cab4c4225b836af85d0cbd08c43e370" exitCode=0
Mar 10 21:57:54 crc kubenswrapper[4919]: I0310 21:57:54.282587 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b68ml" event={"ID":"d9defb14-2db3-40d8-8081-495961bfedf1","Type":"ContainerDied","Data":"96387a4bff33e40a30091d7e7b86b9d95cab4c4225b836af85d0cbd08c43e370"}
Mar 10 21:57:54 crc kubenswrapper[4919]: I0310 21:57:54.293003 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qjwrf" event={"ID":"3fa4689a-13af-4ed2-b9be-699f7bd519c1","Type":"ContainerStarted","Data":"0e563ae8f48cd7de442878558adea4341728295879e357d37abf39665fcefcbc"}
Mar 10 21:57:54 crc kubenswrapper[4919]: I0310 21:57:54.299336 4919 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xxm8b" event={"ID":"ff7f55ab-64a4-44d6-8f11-a28f589bcfd7","Type":"ContainerStarted","Data":"5aee0092c1d493f23deb1789ca4f715fc37b89f6a7cbd18ecec584d2dfecb870"} Mar 10 21:57:54 crc kubenswrapper[4919]: I0310 21:57:54.342123 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-xxm8b" podStartSLOduration=3.844212068 podStartE2EDuration="6.342108592s" podCreationTimestamp="2026-03-10 21:57:48 +0000 UTC" firstStartedPulling="2026-03-10 21:57:51.19836999 +0000 UTC m=+458.440250598" lastFinishedPulling="2026-03-10 21:57:53.696266514 +0000 UTC m=+460.938147122" observedRunningTime="2026-03-10 21:57:54.3387375 +0000 UTC m=+461.580618108" watchObservedRunningTime="2026-03-10 21:57:54.342108592 +0000 UTC m=+461.583989200" Mar 10 21:57:55 crc kubenswrapper[4919]: I0310 21:57:55.306831 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tbwp6" event={"ID":"eb2ad0d8-c5f7-43c3-b37c-2ec6c330703f","Type":"ContainerStarted","Data":"e16c5765d7c8a49698cb435b1810cf2135cff0e5f4d092e1cce622c43c731cbf"} Mar 10 21:57:55 crc kubenswrapper[4919]: I0310 21:57:55.309775 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b68ml" event={"ID":"d9defb14-2db3-40d8-8081-495961bfedf1","Type":"ContainerStarted","Data":"b30e9151f5abdf9382dbbd92ae917b170d630e92200e9a940eac39f45077f0e6"} Mar 10 21:57:55 crc kubenswrapper[4919]: I0310 21:57:55.311518 4919 generic.go:334] "Generic (PLEG): container finished" podID="3fa4689a-13af-4ed2-b9be-699f7bd519c1" containerID="d55bf695c1465326c236af4a50057dbe5ad7552319cb38a3b13703572a503e81" exitCode=0 Mar 10 21:57:55 crc kubenswrapper[4919]: I0310 21:57:55.311541 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qjwrf" 
event={"ID":"3fa4689a-13af-4ed2-b9be-699f7bd519c1","Type":"ContainerDied","Data":"d55bf695c1465326c236af4a50057dbe5ad7552319cb38a3b13703572a503e81"} Mar 10 21:57:55 crc kubenswrapper[4919]: I0310 21:57:55.344221 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-tbwp6" podStartSLOduration=2.881613673 podStartE2EDuration="4.34420693s" podCreationTimestamp="2026-03-10 21:57:51 +0000 UTC" firstStartedPulling="2026-03-10 21:57:53.208267713 +0000 UTC m=+460.450148331" lastFinishedPulling="2026-03-10 21:57:54.67086097 +0000 UTC m=+461.912741588" observedRunningTime="2026-03-10 21:57:55.343455179 +0000 UTC m=+462.585335797" watchObservedRunningTime="2026-03-10 21:57:55.34420693 +0000 UTC m=+462.586087538" Mar 10 21:57:55 crc kubenswrapper[4919]: I0310 21:57:55.369247 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-b68ml" podStartSLOduration=2.8646624469999997 podStartE2EDuration="6.369229574s" podCreationTimestamp="2026-03-10 21:57:49 +0000 UTC" firstStartedPulling="2026-03-10 21:57:51.192207792 +0000 UTC m=+458.434088400" lastFinishedPulling="2026-03-10 21:57:54.696774919 +0000 UTC m=+461.938655527" observedRunningTime="2026-03-10 21:57:55.363346383 +0000 UTC m=+462.605226991" watchObservedRunningTime="2026-03-10 21:57:55.369229574 +0000 UTC m=+462.611110182" Mar 10 21:57:56 crc kubenswrapper[4919]: I0310 21:57:56.319719 4919 generic.go:334] "Generic (PLEG): container finished" podID="3fa4689a-13af-4ed2-b9be-699f7bd519c1" containerID="1c646347a36e94c7651b629d077d8b085a2e6d9ebe989ed595e26af24b893637" exitCode=0 Mar 10 21:57:56 crc kubenswrapper[4919]: I0310 21:57:56.320639 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qjwrf" event={"ID":"3fa4689a-13af-4ed2-b9be-699f7bd519c1","Type":"ContainerDied","Data":"1c646347a36e94c7651b629d077d8b085a2e6d9ebe989ed595e26af24b893637"} Mar 10 
21:57:57 crc kubenswrapper[4919]: I0310 21:57:57.172267 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-lxjws" Mar 10 21:57:57 crc kubenswrapper[4919]: I0310 21:57:57.228998 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-lxrsj"] Mar 10 21:57:57 crc kubenswrapper[4919]: I0310 21:57:57.327532 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qjwrf" event={"ID":"3fa4689a-13af-4ed2-b9be-699f7bd519c1","Type":"ContainerStarted","Data":"46115ac4be05e3ecbcb6f8fc246c93f4894bba6844a905c37f7b4eae5489ddb6"} Mar 10 21:57:57 crc kubenswrapper[4919]: I0310 21:57:57.344650 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-qjwrf" podStartSLOduration=3.710831224 podStartE2EDuration="5.344629423s" podCreationTimestamp="2026-03-10 21:57:52 +0000 UTC" firstStartedPulling="2026-03-10 21:57:55.313000897 +0000 UTC m=+462.554881505" lastFinishedPulling="2026-03-10 21:57:56.946799096 +0000 UTC m=+464.188679704" observedRunningTime="2026-03-10 21:57:57.343239815 +0000 UTC m=+464.585120423" watchObservedRunningTime="2026-03-10 21:57:57.344629423 +0000 UTC m=+464.586510021" Mar 10 21:57:59 crc kubenswrapper[4919]: I0310 21:57:59.175657 4919 patch_prober.go:28] interesting pod/machine-config-daemon-z7v4t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 21:57:59 crc kubenswrapper[4919]: I0310 21:57:59.176905 4919 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 21:57:59 crc kubenswrapper[4919]: I0310 21:57:59.177023 4919 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" Mar 10 21:57:59 crc kubenswrapper[4919]: I0310 21:57:59.177649 4919 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"67ebb482e04a382aaf058b1f3caaeb6cdcd6b9d8d58f43f74fc1f837f6010a5f"} pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 10 21:57:59 crc kubenswrapper[4919]: I0310 21:57:59.177781 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" containerName="machine-config-daemon" containerID="cri-o://67ebb482e04a382aaf058b1f3caaeb6cdcd6b9d8d58f43f74fc1f837f6010a5f" gracePeriod=600 Mar 10 21:57:59 crc kubenswrapper[4919]: I0310 21:57:59.339523 4919 generic.go:334] "Generic (PLEG): container finished" podID="566678d1-f416-4116-ab20-b30dceb86cdc" containerID="67ebb482e04a382aaf058b1f3caaeb6cdcd6b9d8d58f43f74fc1f837f6010a5f" exitCode=0 Mar 10 21:57:59 crc kubenswrapper[4919]: I0310 21:57:59.339702 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" event={"ID":"566678d1-f416-4116-ab20-b30dceb86cdc","Type":"ContainerDied","Data":"67ebb482e04a382aaf058b1f3caaeb6cdcd6b9d8d58f43f74fc1f837f6010a5f"} Mar 10 21:57:59 crc kubenswrapper[4919]: I0310 21:57:59.339776 4919 scope.go:117] "RemoveContainer" containerID="9b645dc541f9bef5d9710345252c2ff48e91412f10d1c0c1bfaa06cf9e82210f" Mar 10 21:57:59 crc kubenswrapper[4919]: I0310 21:57:59.558898 4919 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-xxm8b" Mar 10 21:57:59 crc kubenswrapper[4919]: I0310 21:57:59.559240 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-xxm8b" Mar 10 21:57:59 crc kubenswrapper[4919]: I0310 21:57:59.605728 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-xxm8b" Mar 10 21:58:00 crc kubenswrapper[4919]: I0310 21:58:00.132098 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552998-p8v6c"] Mar 10 21:58:00 crc kubenswrapper[4919]: I0310 21:58:00.133251 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552998-p8v6c" Mar 10 21:58:00 crc kubenswrapper[4919]: I0310 21:58:00.135579 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wbvtv" Mar 10 21:58:00 crc kubenswrapper[4919]: I0310 21:58:00.136196 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 21:58:00 crc kubenswrapper[4919]: I0310 21:58:00.136492 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 21:58:00 crc kubenswrapper[4919]: I0310 21:58:00.147227 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552998-p8v6c"] Mar 10 21:58:00 crc kubenswrapper[4919]: I0310 21:58:00.271868 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b8bvx\" (UniqueName: \"kubernetes.io/projected/25f645dc-37c8-4e19-bd00-561f70cf5bb3-kube-api-access-b8bvx\") pod \"auto-csr-approver-29552998-p8v6c\" (UID: \"25f645dc-37c8-4e19-bd00-561f70cf5bb3\") " pod="openshift-infra/auto-csr-approver-29552998-p8v6c" Mar 10 21:58:00 crc 
kubenswrapper[4919]: I0310 21:58:00.331868 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-b68ml" Mar 10 21:58:00 crc kubenswrapper[4919]: I0310 21:58:00.331917 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-b68ml" Mar 10 21:58:00 crc kubenswrapper[4919]: I0310 21:58:00.347024 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" event={"ID":"566678d1-f416-4116-ab20-b30dceb86cdc","Type":"ContainerStarted","Data":"92c733661fdcb9ed153563354608c2665c26bb0c70d3e55366b993a60576c067"} Mar 10 21:58:00 crc kubenswrapper[4919]: I0310 21:58:00.373041 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b8bvx\" (UniqueName: \"kubernetes.io/projected/25f645dc-37c8-4e19-bd00-561f70cf5bb3-kube-api-access-b8bvx\") pod \"auto-csr-approver-29552998-p8v6c\" (UID: \"25f645dc-37c8-4e19-bd00-561f70cf5bb3\") " pod="openshift-infra/auto-csr-approver-29552998-p8v6c" Mar 10 21:58:00 crc kubenswrapper[4919]: I0310 21:58:00.386138 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-xxm8b" Mar 10 21:58:00 crc kubenswrapper[4919]: I0310 21:58:00.401231 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b8bvx\" (UniqueName: \"kubernetes.io/projected/25f645dc-37c8-4e19-bd00-561f70cf5bb3-kube-api-access-b8bvx\") pod \"auto-csr-approver-29552998-p8v6c\" (UID: \"25f645dc-37c8-4e19-bd00-561f70cf5bb3\") " pod="openshift-infra/auto-csr-approver-29552998-p8v6c" Mar 10 21:58:00 crc kubenswrapper[4919]: I0310 21:58:00.450724 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552998-p8v6c" Mar 10 21:58:00 crc kubenswrapper[4919]: I0310 21:58:00.856492 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552998-p8v6c"] Mar 10 21:58:00 crc kubenswrapper[4919]: W0310 21:58:00.867559 4919 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod25f645dc_37c8_4e19_bd00_561f70cf5bb3.slice/crio-c8cd03a974c76d027a4c7eeedfacbcde443f95bd3ebce46a82ba0d56416efcb3 WatchSource:0}: Error finding container c8cd03a974c76d027a4c7eeedfacbcde443f95bd3ebce46a82ba0d56416efcb3: Status 404 returned error can't find the container with id c8cd03a974c76d027a4c7eeedfacbcde443f95bd3ebce46a82ba0d56416efcb3 Mar 10 21:58:01 crc kubenswrapper[4919]: I0310 21:58:01.355702 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552998-p8v6c" event={"ID":"25f645dc-37c8-4e19-bd00-561f70cf5bb3","Type":"ContainerStarted","Data":"c8cd03a974c76d027a4c7eeedfacbcde443f95bd3ebce46a82ba0d56416efcb3"} Mar 10 21:58:01 crc kubenswrapper[4919]: I0310 21:58:01.378384 4919 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-b68ml" podUID="d9defb14-2db3-40d8-8081-495961bfedf1" containerName="registry-server" probeResult="failure" output=< Mar 10 21:58:01 crc kubenswrapper[4919]: timeout: failed to connect service ":50051" within 1s Mar 10 21:58:01 crc kubenswrapper[4919]: > Mar 10 21:58:01 crc kubenswrapper[4919]: I0310 21:58:01.668594 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-tbwp6" Mar 10 21:58:01 crc kubenswrapper[4919]: I0310 21:58:01.668670 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-tbwp6" Mar 10 21:58:01 crc kubenswrapper[4919]: I0310 21:58:01.745621 4919 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-tbwp6" Mar 10 21:58:02 crc kubenswrapper[4919]: I0310 21:58:02.362045 4919 generic.go:334] "Generic (PLEG): container finished" podID="25f645dc-37c8-4e19-bd00-561f70cf5bb3" containerID="3f3fe6bea093a748bcdb8a949bd57db235ee77ede67dc38a660a1d39bd9455b6" exitCode=0 Mar 10 21:58:02 crc kubenswrapper[4919]: I0310 21:58:02.362140 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552998-p8v6c" event={"ID":"25f645dc-37c8-4e19-bd00-561f70cf5bb3","Type":"ContainerDied","Data":"3f3fe6bea093a748bcdb8a949bd57db235ee77ede67dc38a660a1d39bd9455b6"} Mar 10 21:58:02 crc kubenswrapper[4919]: I0310 21:58:02.410224 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-tbwp6" Mar 10 21:58:02 crc kubenswrapper[4919]: I0310 21:58:02.757031 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-qjwrf" Mar 10 21:58:02 crc kubenswrapper[4919]: I0310 21:58:02.757096 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-qjwrf" Mar 10 21:58:02 crc kubenswrapper[4919]: I0310 21:58:02.796309 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-qjwrf" Mar 10 21:58:03 crc kubenswrapper[4919]: I0310 21:58:03.410152 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-qjwrf" Mar 10 21:58:03 crc kubenswrapper[4919]: I0310 21:58:03.634763 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552998-p8v6c" Mar 10 21:58:03 crc kubenswrapper[4919]: I0310 21:58:03.718597 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b8bvx\" (UniqueName: \"kubernetes.io/projected/25f645dc-37c8-4e19-bd00-561f70cf5bb3-kube-api-access-b8bvx\") pod \"25f645dc-37c8-4e19-bd00-561f70cf5bb3\" (UID: \"25f645dc-37c8-4e19-bd00-561f70cf5bb3\") " Mar 10 21:58:03 crc kubenswrapper[4919]: I0310 21:58:03.724814 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25f645dc-37c8-4e19-bd00-561f70cf5bb3-kube-api-access-b8bvx" (OuterVolumeSpecName: "kube-api-access-b8bvx") pod "25f645dc-37c8-4e19-bd00-561f70cf5bb3" (UID: "25f645dc-37c8-4e19-bd00-561f70cf5bb3"). InnerVolumeSpecName "kube-api-access-b8bvx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 21:58:03 crc kubenswrapper[4919]: I0310 21:58:03.819867 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b8bvx\" (UniqueName: \"kubernetes.io/projected/25f645dc-37c8-4e19-bd00-561f70cf5bb3-kube-api-access-b8bvx\") on node \"crc\" DevicePath \"\"" Mar 10 21:58:04 crc kubenswrapper[4919]: I0310 21:58:04.373032 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552998-p8v6c" Mar 10 21:58:04 crc kubenswrapper[4919]: I0310 21:58:04.373029 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552998-p8v6c" event={"ID":"25f645dc-37c8-4e19-bd00-561f70cf5bb3","Type":"ContainerDied","Data":"c8cd03a974c76d027a4c7eeedfacbcde443f95bd3ebce46a82ba0d56416efcb3"} Mar 10 21:58:04 crc kubenswrapper[4919]: I0310 21:58:04.373354 4919 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c8cd03a974c76d027a4c7eeedfacbcde443f95bd3ebce46a82ba0d56416efcb3" Mar 10 21:58:04 crc kubenswrapper[4919]: I0310 21:58:04.701915 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552992-kp6wz"] Mar 10 21:58:04 crc kubenswrapper[4919]: I0310 21:58:04.705011 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552992-kp6wz"] Mar 10 21:58:05 crc kubenswrapper[4919]: I0310 21:58:05.485509 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99f42fb7-eaa5-46d2-9443-81ad7a563cec" path="/var/lib/kubelet/pods/99f42fb7-eaa5-46d2-9443-81ad7a563cec/volumes" Mar 10 21:58:10 crc kubenswrapper[4919]: I0310 21:58:10.401677 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-b68ml" Mar 10 21:58:10 crc kubenswrapper[4919]: I0310 21:58:10.471171 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-b68ml" Mar 10 21:58:22 crc kubenswrapper[4919]: I0310 21:58:22.268593 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-lxrsj" podUID="8383a8d8-69ec-4706-8ea3-99ce91e5200c" containerName="registry" containerID="cri-o://6e025e0257d442e99fc3a627cf5828eae99cb82fa63b9f3eb91087181f10f013" gracePeriod=30 Mar 10 21:58:22 crc 
kubenswrapper[4919]: I0310 21:58:22.495987 4919 generic.go:334] "Generic (PLEG): container finished" podID="8383a8d8-69ec-4706-8ea3-99ce91e5200c" containerID="6e025e0257d442e99fc3a627cf5828eae99cb82fa63b9f3eb91087181f10f013" exitCode=0 Mar 10 21:58:22 crc kubenswrapper[4919]: I0310 21:58:22.496310 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-lxrsj" event={"ID":"8383a8d8-69ec-4706-8ea3-99ce91e5200c","Type":"ContainerDied","Data":"6e025e0257d442e99fc3a627cf5828eae99cb82fa63b9f3eb91087181f10f013"} Mar 10 21:58:22 crc kubenswrapper[4919]: I0310 21:58:22.623618 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-lxrsj" Mar 10 21:58:22 crc kubenswrapper[4919]: I0310 21:58:22.653207 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8383a8d8-69ec-4706-8ea3-99ce91e5200c-registry-certificates\") pod \"8383a8d8-69ec-4706-8ea3-99ce91e5200c\" (UID: \"8383a8d8-69ec-4706-8ea3-99ce91e5200c\") " Mar 10 21:58:22 crc kubenswrapper[4919]: I0310 21:58:22.653294 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8383a8d8-69ec-4706-8ea3-99ce91e5200c-trusted-ca\") pod \"8383a8d8-69ec-4706-8ea3-99ce91e5200c\" (UID: \"8383a8d8-69ec-4706-8ea3-99ce91e5200c\") " Mar 10 21:58:22 crc kubenswrapper[4919]: I0310 21:58:22.653331 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8383a8d8-69ec-4706-8ea3-99ce91e5200c-bound-sa-token\") pod \"8383a8d8-69ec-4706-8ea3-99ce91e5200c\" (UID: \"8383a8d8-69ec-4706-8ea3-99ce91e5200c\") " Mar 10 21:58:22 crc kubenswrapper[4919]: I0310 21:58:22.653363 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8383a8d8-69ec-4706-8ea3-99ce91e5200c-ca-trust-extracted\") pod \"8383a8d8-69ec-4706-8ea3-99ce91e5200c\" (UID: \"8383a8d8-69ec-4706-8ea3-99ce91e5200c\") " Mar 10 21:58:22 crc kubenswrapper[4919]: I0310 21:58:22.653427 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rqpbs\" (UniqueName: \"kubernetes.io/projected/8383a8d8-69ec-4706-8ea3-99ce91e5200c-kube-api-access-rqpbs\") pod \"8383a8d8-69ec-4706-8ea3-99ce91e5200c\" (UID: \"8383a8d8-69ec-4706-8ea3-99ce91e5200c\") " Mar 10 21:58:22 crc kubenswrapper[4919]: I0310 21:58:22.653465 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8383a8d8-69ec-4706-8ea3-99ce91e5200c-registry-tls\") pod \"8383a8d8-69ec-4706-8ea3-99ce91e5200c\" (UID: \"8383a8d8-69ec-4706-8ea3-99ce91e5200c\") " Mar 10 21:58:22 crc kubenswrapper[4919]: I0310 21:58:22.653497 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8383a8d8-69ec-4706-8ea3-99ce91e5200c-installation-pull-secrets\") pod \"8383a8d8-69ec-4706-8ea3-99ce91e5200c\" (UID: \"8383a8d8-69ec-4706-8ea3-99ce91e5200c\") " Mar 10 21:58:22 crc kubenswrapper[4919]: I0310 21:58:22.653799 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8383a8d8-69ec-4706-8ea3-99ce91e5200c\" (UID: \"8383a8d8-69ec-4706-8ea3-99ce91e5200c\") " Mar 10 21:58:22 crc kubenswrapper[4919]: I0310 21:58:22.658829 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8383a8d8-69ec-4706-8ea3-99ce91e5200c-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8383a8d8-69ec-4706-8ea3-99ce91e5200c" (UID: 
"8383a8d8-69ec-4706-8ea3-99ce91e5200c"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 21:58:22 crc kubenswrapper[4919]: I0310 21:58:22.660167 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8383a8d8-69ec-4706-8ea3-99ce91e5200c-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8383a8d8-69ec-4706-8ea3-99ce91e5200c" (UID: "8383a8d8-69ec-4706-8ea3-99ce91e5200c"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 21:58:22 crc kubenswrapper[4919]: I0310 21:58:22.661876 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8383a8d8-69ec-4706-8ea3-99ce91e5200c-kube-api-access-rqpbs" (OuterVolumeSpecName: "kube-api-access-rqpbs") pod "8383a8d8-69ec-4706-8ea3-99ce91e5200c" (UID: "8383a8d8-69ec-4706-8ea3-99ce91e5200c"). InnerVolumeSpecName "kube-api-access-rqpbs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 21:58:22 crc kubenswrapper[4919]: I0310 21:58:22.661887 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8383a8d8-69ec-4706-8ea3-99ce91e5200c-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8383a8d8-69ec-4706-8ea3-99ce91e5200c" (UID: "8383a8d8-69ec-4706-8ea3-99ce91e5200c"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 21:58:22 crc kubenswrapper[4919]: I0310 21:58:22.661955 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8383a8d8-69ec-4706-8ea3-99ce91e5200c-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8383a8d8-69ec-4706-8ea3-99ce91e5200c" (UID: "8383a8d8-69ec-4706-8ea3-99ce91e5200c"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 21:58:22 crc kubenswrapper[4919]: I0310 21:58:22.664849 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8383a8d8-69ec-4706-8ea3-99ce91e5200c-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8383a8d8-69ec-4706-8ea3-99ce91e5200c" (UID: "8383a8d8-69ec-4706-8ea3-99ce91e5200c"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 21:58:22 crc kubenswrapper[4919]: I0310 21:58:22.665770 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "8383a8d8-69ec-4706-8ea3-99ce91e5200c" (UID: "8383a8d8-69ec-4706-8ea3-99ce91e5200c"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 10 21:58:22 crc kubenswrapper[4919]: I0310 21:58:22.679586 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8383a8d8-69ec-4706-8ea3-99ce91e5200c-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8383a8d8-69ec-4706-8ea3-99ce91e5200c" (UID: "8383a8d8-69ec-4706-8ea3-99ce91e5200c"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 21:58:22 crc kubenswrapper[4919]: I0310 21:58:22.756107 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rqpbs\" (UniqueName: \"kubernetes.io/projected/8383a8d8-69ec-4706-8ea3-99ce91e5200c-kube-api-access-rqpbs\") on node \"crc\" DevicePath \"\"" Mar 10 21:58:22 crc kubenswrapper[4919]: I0310 21:58:22.756429 4919 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8383a8d8-69ec-4706-8ea3-99ce91e5200c-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 10 21:58:22 crc kubenswrapper[4919]: I0310 21:58:22.756445 4919 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8383a8d8-69ec-4706-8ea3-99ce91e5200c-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 10 21:58:22 crc kubenswrapper[4919]: I0310 21:58:22.756453 4919 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8383a8d8-69ec-4706-8ea3-99ce91e5200c-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 10 21:58:22 crc kubenswrapper[4919]: I0310 21:58:22.756462 4919 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8383a8d8-69ec-4706-8ea3-99ce91e5200c-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 10 21:58:22 crc kubenswrapper[4919]: I0310 21:58:22.756497 4919 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8383a8d8-69ec-4706-8ea3-99ce91e5200c-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 10 21:58:22 crc kubenswrapper[4919]: I0310 21:58:22.756506 4919 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8383a8d8-69ec-4706-8ea3-99ce91e5200c-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 10 21:58:23 crc 
kubenswrapper[4919]: I0310 21:58:23.503071 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-lxrsj" event={"ID":"8383a8d8-69ec-4706-8ea3-99ce91e5200c","Type":"ContainerDied","Data":"34f11be4efe908c491f9f4d1c271e148443346071203a63fc067bb252478c2d8"} Mar 10 21:58:23 crc kubenswrapper[4919]: I0310 21:58:23.503124 4919 scope.go:117] "RemoveContainer" containerID="6e025e0257d442e99fc3a627cf5828eae99cb82fa63b9f3eb91087181f10f013" Mar 10 21:58:23 crc kubenswrapper[4919]: I0310 21:58:23.503248 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-lxrsj" Mar 10 21:58:23 crc kubenswrapper[4919]: I0310 21:58:23.520751 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-lxrsj"] Mar 10 21:58:23 crc kubenswrapper[4919]: I0310 21:58:23.530631 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-lxrsj"] Mar 10 21:58:25 crc kubenswrapper[4919]: I0310 21:58:25.490526 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8383a8d8-69ec-4706-8ea3-99ce91e5200c" path="/var/lib/kubelet/pods/8383a8d8-69ec-4706-8ea3-99ce91e5200c/volumes" Mar 10 21:59:59 crc kubenswrapper[4919]: I0310 21:59:59.175619 4919 patch_prober.go:28] interesting pod/machine-config-daemon-z7v4t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 21:59:59 crc kubenswrapper[4919]: I0310 21:59:59.176376 4919 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 22:00:00 crc kubenswrapper[4919]: I0310 22:00:00.138437 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553000-l9xjh"] Mar 10 22:00:00 crc kubenswrapper[4919]: E0310 22:00:00.139114 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25f645dc-37c8-4e19-bd00-561f70cf5bb3" containerName="oc" Mar 10 22:00:00 crc kubenswrapper[4919]: I0310 22:00:00.139147 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="25f645dc-37c8-4e19-bd00-561f70cf5bb3" containerName="oc" Mar 10 22:00:00 crc kubenswrapper[4919]: E0310 22:00:00.139166 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8383a8d8-69ec-4706-8ea3-99ce91e5200c" containerName="registry" Mar 10 22:00:00 crc kubenswrapper[4919]: I0310 22:00:00.139178 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="8383a8d8-69ec-4706-8ea3-99ce91e5200c" containerName="registry" Mar 10 22:00:00 crc kubenswrapper[4919]: I0310 22:00:00.139338 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="25f645dc-37c8-4e19-bd00-561f70cf5bb3" containerName="oc" Mar 10 22:00:00 crc kubenswrapper[4919]: I0310 22:00:00.139384 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="8383a8d8-69ec-4706-8ea3-99ce91e5200c" containerName="registry" Mar 10 22:00:00 crc kubenswrapper[4919]: I0310 22:00:00.140039 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553000-l9xjh" Mar 10 22:00:00 crc kubenswrapper[4919]: I0310 22:00:00.142283 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 22:00:00 crc kubenswrapper[4919]: I0310 22:00:00.142304 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 22:00:00 crc kubenswrapper[4919]: I0310 22:00:00.142713 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wbvtv" Mar 10 22:00:00 crc kubenswrapper[4919]: I0310 22:00:00.144594 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29553000-wwwdk"] Mar 10 22:00:00 crc kubenswrapper[4919]: I0310 22:00:00.145676 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29553000-wwwdk" Mar 10 22:00:00 crc kubenswrapper[4919]: I0310 22:00:00.147378 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 10 22:00:00 crc kubenswrapper[4919]: I0310 22:00:00.147962 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 10 22:00:00 crc kubenswrapper[4919]: I0310 22:00:00.149160 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553000-l9xjh"] Mar 10 22:00:00 crc kubenswrapper[4919]: I0310 22:00:00.163087 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29553000-wwwdk"] Mar 10 22:00:00 crc kubenswrapper[4919]: I0310 22:00:00.340161 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v6sth\" (UniqueName: 
\"kubernetes.io/projected/d4335ec6-e46a-4783-8d01-a1e84a33d2a7-kube-api-access-v6sth\") pod \"auto-csr-approver-29553000-l9xjh\" (UID: \"d4335ec6-e46a-4783-8d01-a1e84a33d2a7\") " pod="openshift-infra/auto-csr-approver-29553000-l9xjh" Mar 10 22:00:00 crc kubenswrapper[4919]: I0310 22:00:00.340220 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kshdx\" (UniqueName: \"kubernetes.io/projected/5f69f107-0134-45bd-b8f5-c272fd4b8fdc-kube-api-access-kshdx\") pod \"collect-profiles-29553000-wwwdk\" (UID: \"5f69f107-0134-45bd-b8f5-c272fd4b8fdc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553000-wwwdk" Mar 10 22:00:00 crc kubenswrapper[4919]: I0310 22:00:00.340268 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5f69f107-0134-45bd-b8f5-c272fd4b8fdc-config-volume\") pod \"collect-profiles-29553000-wwwdk\" (UID: \"5f69f107-0134-45bd-b8f5-c272fd4b8fdc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553000-wwwdk" Mar 10 22:00:00 crc kubenswrapper[4919]: I0310 22:00:00.340324 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5f69f107-0134-45bd-b8f5-c272fd4b8fdc-secret-volume\") pod \"collect-profiles-29553000-wwwdk\" (UID: \"5f69f107-0134-45bd-b8f5-c272fd4b8fdc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553000-wwwdk" Mar 10 22:00:00 crc kubenswrapper[4919]: I0310 22:00:00.441773 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5f69f107-0134-45bd-b8f5-c272fd4b8fdc-config-volume\") pod \"collect-profiles-29553000-wwwdk\" (UID: \"5f69f107-0134-45bd-b8f5-c272fd4b8fdc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553000-wwwdk" Mar 10 22:00:00 
crc kubenswrapper[4919]: I0310 22:00:00.441825 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5f69f107-0134-45bd-b8f5-c272fd4b8fdc-secret-volume\") pod \"collect-profiles-29553000-wwwdk\" (UID: \"5f69f107-0134-45bd-b8f5-c272fd4b8fdc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553000-wwwdk" Mar 10 22:00:00 crc kubenswrapper[4919]: I0310 22:00:00.441930 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v6sth\" (UniqueName: \"kubernetes.io/projected/d4335ec6-e46a-4783-8d01-a1e84a33d2a7-kube-api-access-v6sth\") pod \"auto-csr-approver-29553000-l9xjh\" (UID: \"d4335ec6-e46a-4783-8d01-a1e84a33d2a7\") " pod="openshift-infra/auto-csr-approver-29553000-l9xjh" Mar 10 22:00:00 crc kubenswrapper[4919]: I0310 22:00:00.441962 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kshdx\" (UniqueName: \"kubernetes.io/projected/5f69f107-0134-45bd-b8f5-c272fd4b8fdc-kube-api-access-kshdx\") pod \"collect-profiles-29553000-wwwdk\" (UID: \"5f69f107-0134-45bd-b8f5-c272fd4b8fdc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553000-wwwdk" Mar 10 22:00:00 crc kubenswrapper[4919]: I0310 22:00:00.443066 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5f69f107-0134-45bd-b8f5-c272fd4b8fdc-config-volume\") pod \"collect-profiles-29553000-wwwdk\" (UID: \"5f69f107-0134-45bd-b8f5-c272fd4b8fdc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553000-wwwdk" Mar 10 22:00:00 crc kubenswrapper[4919]: I0310 22:00:00.454317 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5f69f107-0134-45bd-b8f5-c272fd4b8fdc-secret-volume\") pod \"collect-profiles-29553000-wwwdk\" (UID: \"5f69f107-0134-45bd-b8f5-c272fd4b8fdc\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29553000-wwwdk" Mar 10 22:00:00 crc kubenswrapper[4919]: I0310 22:00:00.459891 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v6sth\" (UniqueName: \"kubernetes.io/projected/d4335ec6-e46a-4783-8d01-a1e84a33d2a7-kube-api-access-v6sth\") pod \"auto-csr-approver-29553000-l9xjh\" (UID: \"d4335ec6-e46a-4783-8d01-a1e84a33d2a7\") " pod="openshift-infra/auto-csr-approver-29553000-l9xjh" Mar 10 22:00:00 crc kubenswrapper[4919]: I0310 22:00:00.462202 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553000-l9xjh" Mar 10 22:00:00 crc kubenswrapper[4919]: I0310 22:00:00.466314 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kshdx\" (UniqueName: \"kubernetes.io/projected/5f69f107-0134-45bd-b8f5-c272fd4b8fdc-kube-api-access-kshdx\") pod \"collect-profiles-29553000-wwwdk\" (UID: \"5f69f107-0134-45bd-b8f5-c272fd4b8fdc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553000-wwwdk" Mar 10 22:00:00 crc kubenswrapper[4919]: I0310 22:00:00.473598 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29553000-wwwdk" Mar 10 22:00:00 crc kubenswrapper[4919]: I0310 22:00:00.701282 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29553000-wwwdk"] Mar 10 22:00:00 crc kubenswrapper[4919]: I0310 22:00:00.883123 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553000-l9xjh"] Mar 10 22:00:00 crc kubenswrapper[4919]: I0310 22:00:00.886980 4919 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 10 22:00:01 crc kubenswrapper[4919]: I0310 22:00:01.296648 4919 generic.go:334] "Generic (PLEG): container finished" podID="5f69f107-0134-45bd-b8f5-c272fd4b8fdc" containerID="67082a6347561a14fbe1f05df26dd9199717d40bdfd24a90a1441c085344f072" exitCode=0 Mar 10 22:00:01 crc kubenswrapper[4919]: I0310 22:00:01.296738 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29553000-wwwdk" event={"ID":"5f69f107-0134-45bd-b8f5-c272fd4b8fdc","Type":"ContainerDied","Data":"67082a6347561a14fbe1f05df26dd9199717d40bdfd24a90a1441c085344f072"} Mar 10 22:00:01 crc kubenswrapper[4919]: I0310 22:00:01.296771 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29553000-wwwdk" event={"ID":"5f69f107-0134-45bd-b8f5-c272fd4b8fdc","Type":"ContainerStarted","Data":"1157f62eea0437cdc045dc5164892b46e3b04f0f5a6b637e1cbde1a6bfd7f2a9"} Mar 10 22:00:01 crc kubenswrapper[4919]: I0310 22:00:01.298067 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553000-l9xjh" event={"ID":"d4335ec6-e46a-4783-8d01-a1e84a33d2a7","Type":"ContainerStarted","Data":"f4dcb01850996892c5760c323fbc2abe267378c65fff57800ad368c812eac50d"} Mar 10 22:00:02 crc kubenswrapper[4919]: I0310 22:00:02.566990 4919 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29553000-wwwdk" Mar 10 22:00:02 crc kubenswrapper[4919]: I0310 22:00:02.669036 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kshdx\" (UniqueName: \"kubernetes.io/projected/5f69f107-0134-45bd-b8f5-c272fd4b8fdc-kube-api-access-kshdx\") pod \"5f69f107-0134-45bd-b8f5-c272fd4b8fdc\" (UID: \"5f69f107-0134-45bd-b8f5-c272fd4b8fdc\") " Mar 10 22:00:02 crc kubenswrapper[4919]: I0310 22:00:02.669153 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5f69f107-0134-45bd-b8f5-c272fd4b8fdc-secret-volume\") pod \"5f69f107-0134-45bd-b8f5-c272fd4b8fdc\" (UID: \"5f69f107-0134-45bd-b8f5-c272fd4b8fdc\") " Mar 10 22:00:02 crc kubenswrapper[4919]: I0310 22:00:02.669210 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5f69f107-0134-45bd-b8f5-c272fd4b8fdc-config-volume\") pod \"5f69f107-0134-45bd-b8f5-c272fd4b8fdc\" (UID: \"5f69f107-0134-45bd-b8f5-c272fd4b8fdc\") " Mar 10 22:00:02 crc kubenswrapper[4919]: I0310 22:00:02.669894 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5f69f107-0134-45bd-b8f5-c272fd4b8fdc-config-volume" (OuterVolumeSpecName: "config-volume") pod "5f69f107-0134-45bd-b8f5-c272fd4b8fdc" (UID: "5f69f107-0134-45bd-b8f5-c272fd4b8fdc"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 22:00:02 crc kubenswrapper[4919]: I0310 22:00:02.673893 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f69f107-0134-45bd-b8f5-c272fd4b8fdc-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "5f69f107-0134-45bd-b8f5-c272fd4b8fdc" (UID: "5f69f107-0134-45bd-b8f5-c272fd4b8fdc"). 
InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:00:02 crc kubenswrapper[4919]: I0310 22:00:02.674916 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f69f107-0134-45bd-b8f5-c272fd4b8fdc-kube-api-access-kshdx" (OuterVolumeSpecName: "kube-api-access-kshdx") pod "5f69f107-0134-45bd-b8f5-c272fd4b8fdc" (UID: "5f69f107-0134-45bd-b8f5-c272fd4b8fdc"). InnerVolumeSpecName "kube-api-access-kshdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 22:00:02 crc kubenswrapper[4919]: I0310 22:00:02.770539 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kshdx\" (UniqueName: \"kubernetes.io/projected/5f69f107-0134-45bd-b8f5-c272fd4b8fdc-kube-api-access-kshdx\") on node \"crc\" DevicePath \"\"" Mar 10 22:00:02 crc kubenswrapper[4919]: I0310 22:00:02.770612 4919 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5f69f107-0134-45bd-b8f5-c272fd4b8fdc-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 10 22:00:02 crc kubenswrapper[4919]: I0310 22:00:02.770625 4919 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5f69f107-0134-45bd-b8f5-c272fd4b8fdc-config-volume\") on node \"crc\" DevicePath \"\"" Mar 10 22:00:03 crc kubenswrapper[4919]: I0310 22:00:03.318527 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29553000-wwwdk" event={"ID":"5f69f107-0134-45bd-b8f5-c272fd4b8fdc","Type":"ContainerDied","Data":"1157f62eea0437cdc045dc5164892b46e3b04f0f5a6b637e1cbde1a6bfd7f2a9"} Mar 10 22:00:03 crc kubenswrapper[4919]: I0310 22:00:03.318882 4919 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1157f62eea0437cdc045dc5164892b46e3b04f0f5a6b637e1cbde1a6bfd7f2a9" Mar 10 22:00:03 crc kubenswrapper[4919]: I0310 22:00:03.318668 4919 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29553000-wwwdk" Mar 10 22:00:09 crc kubenswrapper[4919]: I0310 22:00:09.357048 4919 generic.go:334] "Generic (PLEG): container finished" podID="d4335ec6-e46a-4783-8d01-a1e84a33d2a7" containerID="589255cf712a6dec416e8cbd1ec5d52123e1297c21955116e955d4fb5773ff0c" exitCode=0 Mar 10 22:00:09 crc kubenswrapper[4919]: I0310 22:00:09.357168 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553000-l9xjh" event={"ID":"d4335ec6-e46a-4783-8d01-a1e84a33d2a7","Type":"ContainerDied","Data":"589255cf712a6dec416e8cbd1ec5d52123e1297c21955116e955d4fb5773ff0c"} Mar 10 22:00:10 crc kubenswrapper[4919]: I0310 22:00:10.738330 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553000-l9xjh" Mar 10 22:00:10 crc kubenswrapper[4919]: I0310 22:00:10.874742 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v6sth\" (UniqueName: \"kubernetes.io/projected/d4335ec6-e46a-4783-8d01-a1e84a33d2a7-kube-api-access-v6sth\") pod \"d4335ec6-e46a-4783-8d01-a1e84a33d2a7\" (UID: \"d4335ec6-e46a-4783-8d01-a1e84a33d2a7\") " Mar 10 22:00:10 crc kubenswrapper[4919]: I0310 22:00:10.883067 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4335ec6-e46a-4783-8d01-a1e84a33d2a7-kube-api-access-v6sth" (OuterVolumeSpecName: "kube-api-access-v6sth") pod "d4335ec6-e46a-4783-8d01-a1e84a33d2a7" (UID: "d4335ec6-e46a-4783-8d01-a1e84a33d2a7"). InnerVolumeSpecName "kube-api-access-v6sth". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 22:00:10 crc kubenswrapper[4919]: I0310 22:00:10.976250 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v6sth\" (UniqueName: \"kubernetes.io/projected/d4335ec6-e46a-4783-8d01-a1e84a33d2a7-kube-api-access-v6sth\") on node \"crc\" DevicePath \"\"" Mar 10 22:00:11 crc kubenswrapper[4919]: I0310 22:00:11.373772 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553000-l9xjh" event={"ID":"d4335ec6-e46a-4783-8d01-a1e84a33d2a7","Type":"ContainerDied","Data":"f4dcb01850996892c5760c323fbc2abe267378c65fff57800ad368c812eac50d"} Mar 10 22:00:11 crc kubenswrapper[4919]: I0310 22:00:11.373813 4919 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f4dcb01850996892c5760c323fbc2abe267378c65fff57800ad368c812eac50d" Mar 10 22:00:11 crc kubenswrapper[4919]: I0310 22:00:11.373881 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553000-l9xjh" Mar 10 22:00:11 crc kubenswrapper[4919]: I0310 22:00:11.807217 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552994-rvxmh"] Mar 10 22:00:11 crc kubenswrapper[4919]: I0310 22:00:11.810613 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552994-rvxmh"] Mar 10 22:00:13 crc kubenswrapper[4919]: I0310 22:00:13.488047 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cac0bc08-6186-43fb-bebe-036c98331599" path="/var/lib/kubelet/pods/cac0bc08-6186-43fb-bebe-036c98331599/volumes" Mar 10 22:00:29 crc kubenswrapper[4919]: I0310 22:00:29.176122 4919 patch_prober.go:28] interesting pod/machine-config-daemon-z7v4t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Mar 10 22:00:29 crc kubenswrapper[4919]: I0310 22:00:29.177050 4919 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 22:00:59 crc kubenswrapper[4919]: I0310 22:00:59.176745 4919 patch_prober.go:28] interesting pod/machine-config-daemon-z7v4t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 22:00:59 crc kubenswrapper[4919]: I0310 22:00:59.177289 4919 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 22:00:59 crc kubenswrapper[4919]: I0310 22:00:59.177346 4919 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" Mar 10 22:00:59 crc kubenswrapper[4919]: I0310 22:00:59.177970 4919 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"92c733661fdcb9ed153563354608c2665c26bb0c70d3e55366b993a60576c067"} pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 10 22:00:59 crc kubenswrapper[4919]: I0310 22:00:59.178036 4919 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" containerName="machine-config-daemon" containerID="cri-o://92c733661fdcb9ed153563354608c2665c26bb0c70d3e55366b993a60576c067" gracePeriod=600 Mar 10 22:00:59 crc kubenswrapper[4919]: I0310 22:00:59.703818 4919 generic.go:334] "Generic (PLEG): container finished" podID="566678d1-f416-4116-ab20-b30dceb86cdc" containerID="92c733661fdcb9ed153563354608c2665c26bb0c70d3e55366b993a60576c067" exitCode=0 Mar 10 22:00:59 crc kubenswrapper[4919]: I0310 22:00:59.704555 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" event={"ID":"566678d1-f416-4116-ab20-b30dceb86cdc","Type":"ContainerDied","Data":"92c733661fdcb9ed153563354608c2665c26bb0c70d3e55366b993a60576c067"} Mar 10 22:00:59 crc kubenswrapper[4919]: I0310 22:00:59.704746 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" event={"ID":"566678d1-f416-4116-ab20-b30dceb86cdc","Type":"ContainerStarted","Data":"1b4aa5b33b0728a2f664ae32a561328aa084e55b7c24f15b646a35b9a4014c13"} Mar 10 22:00:59 crc kubenswrapper[4919]: I0310 22:00:59.704789 4919 scope.go:117] "RemoveContainer" containerID="67ebb482e04a382aaf058b1f3caaeb6cdcd6b9d8d58f43f74fc1f837f6010a5f" Mar 10 22:01:23 crc kubenswrapper[4919]: I0310 22:01:23.971152 4919 scope.go:117] "RemoveContainer" containerID="a088399da8fc4e94e5d64d7469a3761c93b686bea11cc5017bc3f52bd0538e1b" Mar 10 22:01:24 crc kubenswrapper[4919]: I0310 22:01:24.014031 4919 scope.go:117] "RemoveContainer" containerID="55595d6a43e0adabea5d56c99bc7518e004a79b1114f60522b987df8fb6a712c" Mar 10 22:02:00 crc kubenswrapper[4919]: I0310 22:02:00.139753 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553002-sskqr"] Mar 10 22:02:00 crc kubenswrapper[4919]: E0310 22:02:00.140558 4919 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="5f69f107-0134-45bd-b8f5-c272fd4b8fdc" containerName="collect-profiles" Mar 10 22:02:00 crc kubenswrapper[4919]: I0310 22:02:00.140575 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f69f107-0134-45bd-b8f5-c272fd4b8fdc" containerName="collect-profiles" Mar 10 22:02:00 crc kubenswrapper[4919]: E0310 22:02:00.140600 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4335ec6-e46a-4783-8d01-a1e84a33d2a7" containerName="oc" Mar 10 22:02:00 crc kubenswrapper[4919]: I0310 22:02:00.140607 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4335ec6-e46a-4783-8d01-a1e84a33d2a7" containerName="oc" Mar 10 22:02:00 crc kubenswrapper[4919]: I0310 22:02:00.140713 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4335ec6-e46a-4783-8d01-a1e84a33d2a7" containerName="oc" Mar 10 22:02:00 crc kubenswrapper[4919]: I0310 22:02:00.140736 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f69f107-0134-45bd-b8f5-c272fd4b8fdc" containerName="collect-profiles" Mar 10 22:02:00 crc kubenswrapper[4919]: I0310 22:02:00.141223 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553002-sskqr" Mar 10 22:02:00 crc kubenswrapper[4919]: I0310 22:02:00.142866 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 22:02:00 crc kubenswrapper[4919]: I0310 22:02:00.143841 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wbvtv" Mar 10 22:02:00 crc kubenswrapper[4919]: I0310 22:02:00.144107 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 22:02:00 crc kubenswrapper[4919]: I0310 22:02:00.149800 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553002-sskqr"] Mar 10 22:02:00 crc kubenswrapper[4919]: I0310 22:02:00.268314 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kpcd2\" (UniqueName: \"kubernetes.io/projected/572158b3-be80-481d-84ff-0f1d9759aea8-kube-api-access-kpcd2\") pod \"auto-csr-approver-29553002-sskqr\" (UID: \"572158b3-be80-481d-84ff-0f1d9759aea8\") " pod="openshift-infra/auto-csr-approver-29553002-sskqr" Mar 10 22:02:00 crc kubenswrapper[4919]: I0310 22:02:00.369024 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kpcd2\" (UniqueName: \"kubernetes.io/projected/572158b3-be80-481d-84ff-0f1d9759aea8-kube-api-access-kpcd2\") pod \"auto-csr-approver-29553002-sskqr\" (UID: \"572158b3-be80-481d-84ff-0f1d9759aea8\") " pod="openshift-infra/auto-csr-approver-29553002-sskqr" Mar 10 22:02:00 crc kubenswrapper[4919]: I0310 22:02:00.396067 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kpcd2\" (UniqueName: \"kubernetes.io/projected/572158b3-be80-481d-84ff-0f1d9759aea8-kube-api-access-kpcd2\") pod \"auto-csr-approver-29553002-sskqr\" (UID: \"572158b3-be80-481d-84ff-0f1d9759aea8\") " 
pod="openshift-infra/auto-csr-approver-29553002-sskqr" Mar 10 22:02:00 crc kubenswrapper[4919]: I0310 22:02:00.466517 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553002-sskqr" Mar 10 22:02:00 crc kubenswrapper[4919]: I0310 22:02:00.681006 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553002-sskqr"] Mar 10 22:02:01 crc kubenswrapper[4919]: I0310 22:02:01.216083 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553002-sskqr" event={"ID":"572158b3-be80-481d-84ff-0f1d9759aea8","Type":"ContainerStarted","Data":"5ddde6e38149f29e7901580d5cf5a33412dd8b5da6c6843489bbe16f5b568270"} Mar 10 22:02:02 crc kubenswrapper[4919]: I0310 22:02:02.221805 4919 generic.go:334] "Generic (PLEG): container finished" podID="572158b3-be80-481d-84ff-0f1d9759aea8" containerID="0c10e23c9b2882b4edd4df3ad3cf23e240305d6863d1e5596de8f4e2791db100" exitCode=0 Mar 10 22:02:02 crc kubenswrapper[4919]: I0310 22:02:02.221864 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553002-sskqr" event={"ID":"572158b3-be80-481d-84ff-0f1d9759aea8","Type":"ContainerDied","Data":"0c10e23c9b2882b4edd4df3ad3cf23e240305d6863d1e5596de8f4e2791db100"} Mar 10 22:02:03 crc kubenswrapper[4919]: I0310 22:02:03.444350 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553002-sskqr" Mar 10 22:02:03 crc kubenswrapper[4919]: I0310 22:02:03.609066 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kpcd2\" (UniqueName: \"kubernetes.io/projected/572158b3-be80-481d-84ff-0f1d9759aea8-kube-api-access-kpcd2\") pod \"572158b3-be80-481d-84ff-0f1d9759aea8\" (UID: \"572158b3-be80-481d-84ff-0f1d9759aea8\") " Mar 10 22:02:03 crc kubenswrapper[4919]: I0310 22:02:03.615013 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/572158b3-be80-481d-84ff-0f1d9759aea8-kube-api-access-kpcd2" (OuterVolumeSpecName: "kube-api-access-kpcd2") pod "572158b3-be80-481d-84ff-0f1d9759aea8" (UID: "572158b3-be80-481d-84ff-0f1d9759aea8"). InnerVolumeSpecName "kube-api-access-kpcd2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 22:02:03 crc kubenswrapper[4919]: I0310 22:02:03.711136 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kpcd2\" (UniqueName: \"kubernetes.io/projected/572158b3-be80-481d-84ff-0f1d9759aea8-kube-api-access-kpcd2\") on node \"crc\" DevicePath \"\"" Mar 10 22:02:04 crc kubenswrapper[4919]: I0310 22:02:04.242791 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553002-sskqr" event={"ID":"572158b3-be80-481d-84ff-0f1d9759aea8","Type":"ContainerDied","Data":"5ddde6e38149f29e7901580d5cf5a33412dd8b5da6c6843489bbe16f5b568270"} Mar 10 22:02:04 crc kubenswrapper[4919]: I0310 22:02:04.242876 4919 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5ddde6e38149f29e7901580d5cf5a33412dd8b5da6c6843489bbe16f5b568270" Mar 10 22:02:04 crc kubenswrapper[4919]: I0310 22:02:04.242890 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553002-sskqr"
Mar 10 22:02:04 crc kubenswrapper[4919]: I0310 22:02:04.517499 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552996-lcsjs"]
Mar 10 22:02:04 crc kubenswrapper[4919]: I0310 22:02:04.521488 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552996-lcsjs"]
Mar 10 22:02:05 crc kubenswrapper[4919]: I0310 22:02:05.487933 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b9af79e-aa31-499c-b948-7e05f1bf1f7a" path="/var/lib/kubelet/pods/1b9af79e-aa31-499c-b948-7e05f1bf1f7a/volumes"
Mar 10 22:02:24 crc kubenswrapper[4919]: I0310 22:02:24.077610 4919 scope.go:117] "RemoveContainer" containerID="f0dd855d39a25655496478ab54dc29196521a8f74a3a5d4726502eb56102cc15"
Mar 10 22:02:59 crc kubenswrapper[4919]: I0310 22:02:59.175984 4919 patch_prober.go:28] interesting pod/machine-config-daemon-z7v4t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 10 22:02:59 crc kubenswrapper[4919]: I0310 22:02:59.176745 4919 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 10 22:03:29 crc kubenswrapper[4919]: I0310 22:03:29.176285 4919 patch_prober.go:28] interesting pod/machine-config-daemon-z7v4t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 10 22:03:29 crc kubenswrapper[4919]: I0310 22:03:29.177088 4919 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 10 22:03:59 crc kubenswrapper[4919]: I0310 22:03:59.175922 4919 patch_prober.go:28] interesting pod/machine-config-daemon-z7v4t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 10 22:03:59 crc kubenswrapper[4919]: I0310 22:03:59.176745 4919 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 10 22:03:59 crc kubenswrapper[4919]: I0310 22:03:59.176810 4919 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t"
Mar 10 22:03:59 crc kubenswrapper[4919]: I0310 22:03:59.177631 4919 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1b4aa5b33b0728a2f664ae32a561328aa084e55b7c24f15b646a35b9a4014c13"} pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 10 22:03:59 crc kubenswrapper[4919]: I0310 22:03:59.177725 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" containerName="machine-config-daemon" containerID="cri-o://1b4aa5b33b0728a2f664ae32a561328aa084e55b7c24f15b646a35b9a4014c13" gracePeriod=600
Mar 10 22:04:00 crc kubenswrapper[4919]: I0310 22:03:59.998998 4919 generic.go:334] "Generic (PLEG): container finished" podID="566678d1-f416-4116-ab20-b30dceb86cdc" containerID="1b4aa5b33b0728a2f664ae32a561328aa084e55b7c24f15b646a35b9a4014c13" exitCode=0
Mar 10 22:04:00 crc kubenswrapper[4919]: I0310 22:03:59.999094 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" event={"ID":"566678d1-f416-4116-ab20-b30dceb86cdc","Type":"ContainerDied","Data":"1b4aa5b33b0728a2f664ae32a561328aa084e55b7c24f15b646a35b9a4014c13"}
Mar 10 22:04:00 crc kubenswrapper[4919]: I0310 22:04:00.000018 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" event={"ID":"566678d1-f416-4116-ab20-b30dceb86cdc","Type":"ContainerStarted","Data":"5c0f64b8b2ef3b8561ca8ab7ca9e89321df88a87f472fe3592188e0b92020ed2"}
Mar 10 22:04:00 crc kubenswrapper[4919]: I0310 22:04:00.000065 4919 scope.go:117] "RemoveContainer" containerID="92c733661fdcb9ed153563354608c2665c26bb0c70d3e55366b993a60576c067"
Mar 10 22:04:00 crc kubenswrapper[4919]: I0310 22:04:00.146238 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553004-2n2t4"]
Mar 10 22:04:00 crc kubenswrapper[4919]: E0310 22:04:00.146579 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="572158b3-be80-481d-84ff-0f1d9759aea8" containerName="oc"
Mar 10 22:04:00 crc kubenswrapper[4919]: I0310 22:04:00.146601 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="572158b3-be80-481d-84ff-0f1d9759aea8" containerName="oc"
Mar 10 22:04:00 crc kubenswrapper[4919]: I0310 22:04:00.146788 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="572158b3-be80-481d-84ff-0f1d9759aea8" containerName="oc"
Mar 10 22:04:00 crc kubenswrapper[4919]: I0310 22:04:00.147350 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553004-2n2t4"
Mar 10 22:04:00 crc kubenswrapper[4919]: I0310 22:04:00.149349 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 10 22:04:00 crc kubenswrapper[4919]: I0310 22:04:00.150371 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wbvtv"
Mar 10 22:04:00 crc kubenswrapper[4919]: I0310 22:04:00.150747 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 10 22:04:00 crc kubenswrapper[4919]: I0310 22:04:00.161882 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553004-2n2t4"]
Mar 10 22:04:00 crc kubenswrapper[4919]: I0310 22:04:00.223900 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vn69s\" (UniqueName: \"kubernetes.io/projected/572d0cfe-f016-44c7-baa9-83166b19a691-kube-api-access-vn69s\") pod \"auto-csr-approver-29553004-2n2t4\" (UID: \"572d0cfe-f016-44c7-baa9-83166b19a691\") " pod="openshift-infra/auto-csr-approver-29553004-2n2t4"
Mar 10 22:04:00 crc kubenswrapper[4919]: I0310 22:04:00.324965 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vn69s\" (UniqueName: \"kubernetes.io/projected/572d0cfe-f016-44c7-baa9-83166b19a691-kube-api-access-vn69s\") pod \"auto-csr-approver-29553004-2n2t4\" (UID: \"572d0cfe-f016-44c7-baa9-83166b19a691\") " pod="openshift-infra/auto-csr-approver-29553004-2n2t4"
Mar 10 22:04:00 crc kubenswrapper[4919]: I0310 22:04:00.348636 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vn69s\" (UniqueName: \"kubernetes.io/projected/572d0cfe-f016-44c7-baa9-83166b19a691-kube-api-access-vn69s\") pod \"auto-csr-approver-29553004-2n2t4\" (UID: \"572d0cfe-f016-44c7-baa9-83166b19a691\") " pod="openshift-infra/auto-csr-approver-29553004-2n2t4"
Mar 10 22:04:00 crc kubenswrapper[4919]: I0310 22:04:00.479962 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553004-2n2t4"
Mar 10 22:04:00 crc kubenswrapper[4919]: I0310 22:04:00.781213 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553004-2n2t4"]
Mar 10 22:04:01 crc kubenswrapper[4919]: I0310 22:04:01.009443 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553004-2n2t4" event={"ID":"572d0cfe-f016-44c7-baa9-83166b19a691","Type":"ContainerStarted","Data":"bd95f5300fe19014cee80a359b58275f609426836f66f89ed6ddb3bf3e917945"}
Mar 10 22:04:02 crc kubenswrapper[4919]: I0310 22:04:02.018045 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553004-2n2t4" event={"ID":"572d0cfe-f016-44c7-baa9-83166b19a691","Type":"ContainerStarted","Data":"1f9fd9c850ec222a6a45fed07ceeb8945bc2d1922898ba7879522d01facf1fa3"}
Mar 10 22:04:02 crc kubenswrapper[4919]: I0310 22:04:02.036705 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29553004-2n2t4" podStartSLOduration=1.193066702 podStartE2EDuration="2.036690132s" podCreationTimestamp="2026-03-10 22:04:00 +0000 UTC" firstStartedPulling="2026-03-10 22:04:00.789039954 +0000 UTC m=+828.030920562" lastFinishedPulling="2026-03-10 22:04:01.632663374 +0000 UTC m=+828.874543992" observedRunningTime="2026-03-10 22:04:02.032741115 +0000 UTC m=+829.274621723" watchObservedRunningTime="2026-03-10 22:04:02.036690132 +0000 UTC m=+829.278570740"
Mar 10 22:04:03 crc kubenswrapper[4919]: I0310 22:04:03.026178 4919 generic.go:334] "Generic (PLEG): container finished" podID="572d0cfe-f016-44c7-baa9-83166b19a691" containerID="1f9fd9c850ec222a6a45fed07ceeb8945bc2d1922898ba7879522d01facf1fa3" exitCode=0
Mar 10 22:04:03 crc kubenswrapper[4919]: I0310 22:04:03.026829 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553004-2n2t4" event={"ID":"572d0cfe-f016-44c7-baa9-83166b19a691","Type":"ContainerDied","Data":"1f9fd9c850ec222a6a45fed07ceeb8945bc2d1922898ba7879522d01facf1fa3"}
Mar 10 22:04:04 crc kubenswrapper[4919]: I0310 22:04:04.296993 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553004-2n2t4"
Mar 10 22:04:04 crc kubenswrapper[4919]: I0310 22:04:04.409962 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vn69s\" (UniqueName: \"kubernetes.io/projected/572d0cfe-f016-44c7-baa9-83166b19a691-kube-api-access-vn69s\") pod \"572d0cfe-f016-44c7-baa9-83166b19a691\" (UID: \"572d0cfe-f016-44c7-baa9-83166b19a691\") "
Mar 10 22:04:04 crc kubenswrapper[4919]: I0310 22:04:04.417218 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/572d0cfe-f016-44c7-baa9-83166b19a691-kube-api-access-vn69s" (OuterVolumeSpecName: "kube-api-access-vn69s") pod "572d0cfe-f016-44c7-baa9-83166b19a691" (UID: "572d0cfe-f016-44c7-baa9-83166b19a691"). InnerVolumeSpecName "kube-api-access-vn69s". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 22:04:04 crc kubenswrapper[4919]: I0310 22:04:04.511784 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vn69s\" (UniqueName: \"kubernetes.io/projected/572d0cfe-f016-44c7-baa9-83166b19a691-kube-api-access-vn69s\") on node \"crc\" DevicePath \"\""
Mar 10 22:04:05 crc kubenswrapper[4919]: I0310 22:04:05.042044 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553004-2n2t4" event={"ID":"572d0cfe-f016-44c7-baa9-83166b19a691","Type":"ContainerDied","Data":"bd95f5300fe19014cee80a359b58275f609426836f66f89ed6ddb3bf3e917945"}
Mar 10 22:04:05 crc kubenswrapper[4919]: I0310 22:04:05.042213 4919 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bd95f5300fe19014cee80a359b58275f609426836f66f89ed6ddb3bf3e917945"
Mar 10 22:04:05 crc kubenswrapper[4919]: I0310 22:04:05.042161 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553004-2n2t4"
Mar 10 22:04:05 crc kubenswrapper[4919]: I0310 22:04:05.098314 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552998-p8v6c"]
Mar 10 22:04:05 crc kubenswrapper[4919]: I0310 22:04:05.104728 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552998-p8v6c"]
Mar 10 22:04:05 crc kubenswrapper[4919]: I0310 22:04:05.491723 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25f645dc-37c8-4e19-bd00-561f70cf5bb3" path="/var/lib/kubelet/pods/25f645dc-37c8-4e19-bd00-561f70cf5bb3/volumes"
Mar 10 22:04:08 crc kubenswrapper[4919]: I0310 22:04:08.201141 4919 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 10 22:04:24 crc kubenswrapper[4919]: I0310 22:04:24.161011 4919 scope.go:117] "RemoveContainer" containerID="3f3fe6bea093a748bcdb8a949bd57db235ee77ede67dc38a660a1d39bd9455b6"
Mar 10 22:04:26 crc kubenswrapper[4919]: I0310 22:04:26.515686 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-4dp67"]
Mar 10 22:04:26 crc kubenswrapper[4919]: I0310 22:04:26.516956 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-4dp67" podUID="a2e7c6fb-9e33-441d-9197-719929eb9e21" containerName="ovn-controller" containerID="cri-o://a9fa1c50eff4c6bb37ea60631a4278c6a15d6398d6fad6fb1a914bd6cccf15b2" gracePeriod=30
Mar 10 22:04:26 crc kubenswrapper[4919]: I0310 22:04:26.517132 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-4dp67" podUID="a2e7c6fb-9e33-441d-9197-719929eb9e21" containerName="sbdb" containerID="cri-o://81583105305b81fde3bde968fbbd5463ad1100d8f444b22f62764c4676b95325" gracePeriod=30
Mar 10 22:04:26 crc kubenswrapper[4919]: I0310 22:04:26.517209 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-4dp67" podUID="a2e7c6fb-9e33-441d-9197-719929eb9e21" containerName="nbdb" containerID="cri-o://4dd430eed1c68d77dcea63fe5885eec7aae3fee6a0008bd9e145b93aef04aa26" gracePeriod=30
Mar 10 22:04:26 crc kubenswrapper[4919]: I0310 22:04:26.517273 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-4dp67" podUID="a2e7c6fb-9e33-441d-9197-719929eb9e21" containerName="northd" containerID="cri-o://648a881496440d2732bd49969e863fcde223e95156c280e7699782a5521f1580" gracePeriod=30
Mar 10 22:04:26 crc kubenswrapper[4919]: I0310 22:04:26.517333 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-4dp67" podUID="a2e7c6fb-9e33-441d-9197-719929eb9e21" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://c1a8f975d3caae031d32429c7fdf67356ff18bae1f0eefa3b99758d6c461a736" gracePeriod=30
Mar 10 22:04:26 crc kubenswrapper[4919]: I0310 22:04:26.517422 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-4dp67" podUID="a2e7c6fb-9e33-441d-9197-719929eb9e21" containerName="kube-rbac-proxy-node" containerID="cri-o://ad5abb4f4c3ead1b05c1f3efbb6bb59fc71bcf5796e16a558ef7e16375571fb0" gracePeriod=30
Mar 10 22:04:26 crc kubenswrapper[4919]: I0310 22:04:26.517490 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-4dp67" podUID="a2e7c6fb-9e33-441d-9197-719929eb9e21" containerName="ovn-acl-logging" containerID="cri-o://06e9ce48fc10caa34e8b774ce3039117386ffe276cf268d197ac46518a4baf91" gracePeriod=30
Mar 10 22:04:26 crc kubenswrapper[4919]: I0310 22:04:26.595426 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-4dp67" podUID="a2e7c6fb-9e33-441d-9197-719929eb9e21" containerName="ovnkube-controller" containerID="cri-o://55060f2f273513318e33f5f284462012a35e322decd9cdaeeea0602acc036b23" gracePeriod=30
Mar 10 22:04:26 crc kubenswrapper[4919]: I0310 22:04:26.862968 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4dp67_a2e7c6fb-9e33-441d-9197-719929eb9e21/ovnkube-controller/3.log"
Mar 10 22:04:26 crc kubenswrapper[4919]: I0310 22:04:26.866083 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4dp67_a2e7c6fb-9e33-441d-9197-719929eb9e21/ovn-acl-logging/0.log"
Mar 10 22:04:26 crc kubenswrapper[4919]: I0310 22:04:26.866565 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4dp67_a2e7c6fb-9e33-441d-9197-719929eb9e21/ovn-controller/0.log"
Mar 10 22:04:26 crc kubenswrapper[4919]: I0310 22:04:26.867006 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-4dp67"
Mar 10 22:04:26 crc kubenswrapper[4919]: I0310 22:04:26.920696 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-bf975"]
Mar 10 22:04:26 crc kubenswrapper[4919]: E0310 22:04:26.920889 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2e7c6fb-9e33-441d-9197-719929eb9e21" containerName="kube-rbac-proxy-node"
Mar 10 22:04:26 crc kubenswrapper[4919]: I0310 22:04:26.920902 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2e7c6fb-9e33-441d-9197-719929eb9e21" containerName="kube-rbac-proxy-node"
Mar 10 22:04:26 crc kubenswrapper[4919]: E0310 22:04:26.920914 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2e7c6fb-9e33-441d-9197-719929eb9e21" containerName="sbdb"
Mar 10 22:04:26 crc kubenswrapper[4919]: I0310 22:04:26.920921 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2e7c6fb-9e33-441d-9197-719929eb9e21" containerName="sbdb"
Mar 10 22:04:26 crc kubenswrapper[4919]: E0310 22:04:26.920937 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2e7c6fb-9e33-441d-9197-719929eb9e21" containerName="ovnkube-controller"
Mar 10 22:04:26 crc kubenswrapper[4919]: I0310 22:04:26.920946 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2e7c6fb-9e33-441d-9197-719929eb9e21" containerName="ovnkube-controller"
Mar 10 22:04:26 crc kubenswrapper[4919]: E0310 22:04:26.920959 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2e7c6fb-9e33-441d-9197-719929eb9e21" containerName="kube-rbac-proxy-ovn-metrics"
Mar 10 22:04:26 crc kubenswrapper[4919]: I0310 22:04:26.920966 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2e7c6fb-9e33-441d-9197-719929eb9e21" containerName="kube-rbac-proxy-ovn-metrics"
Mar 10 22:04:26 crc kubenswrapper[4919]: E0310 22:04:26.920974 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2e7c6fb-9e33-441d-9197-719929eb9e21" containerName="ovn-acl-logging"
Mar 10 22:04:26 crc kubenswrapper[4919]: I0310 22:04:26.920982 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2e7c6fb-9e33-441d-9197-719929eb9e21" containerName="ovn-acl-logging"
Mar 10 22:04:26 crc kubenswrapper[4919]: E0310 22:04:26.920992 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="572d0cfe-f016-44c7-baa9-83166b19a691" containerName="oc"
Mar 10 22:04:26 crc kubenswrapper[4919]: I0310 22:04:26.921000 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="572d0cfe-f016-44c7-baa9-83166b19a691" containerName="oc"
Mar 10 22:04:26 crc kubenswrapper[4919]: E0310 22:04:26.921010 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2e7c6fb-9e33-441d-9197-719929eb9e21" containerName="northd"
Mar 10 22:04:26 crc kubenswrapper[4919]: I0310 22:04:26.921016 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2e7c6fb-9e33-441d-9197-719929eb9e21" containerName="northd"
Mar 10 22:04:26 crc kubenswrapper[4919]: E0310 22:04:26.921026 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2e7c6fb-9e33-441d-9197-719929eb9e21" containerName="ovnkube-controller"
Mar 10 22:04:26 crc kubenswrapper[4919]: I0310 22:04:26.921033 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2e7c6fb-9e33-441d-9197-719929eb9e21" containerName="ovnkube-controller"
Mar 10 22:04:26 crc kubenswrapper[4919]: E0310 22:04:26.921040 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2e7c6fb-9e33-441d-9197-719929eb9e21" containerName="ovn-controller"
Mar 10 22:04:26 crc kubenswrapper[4919]: I0310 22:04:26.921046 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2e7c6fb-9e33-441d-9197-719929eb9e21" containerName="ovn-controller"
Mar 10 22:04:26 crc kubenswrapper[4919]: E0310 22:04:26.921053 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2e7c6fb-9e33-441d-9197-719929eb9e21" containerName="nbdb"
Mar 10 22:04:26 crc kubenswrapper[4919]: I0310 22:04:26.921059 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2e7c6fb-9e33-441d-9197-719929eb9e21" containerName="nbdb"
Mar 10 22:04:26 crc kubenswrapper[4919]: E0310 22:04:26.921068 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2e7c6fb-9e33-441d-9197-719929eb9e21" containerName="ovnkube-controller"
Mar 10 22:04:26 crc kubenswrapper[4919]: I0310 22:04:26.921074 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2e7c6fb-9e33-441d-9197-719929eb9e21" containerName="ovnkube-controller"
Mar 10 22:04:26 crc kubenswrapper[4919]: E0310 22:04:26.921082 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2e7c6fb-9e33-441d-9197-719929eb9e21" containerName="ovnkube-controller"
Mar 10 22:04:26 crc kubenswrapper[4919]: I0310 22:04:26.921087 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2e7c6fb-9e33-441d-9197-719929eb9e21" containerName="ovnkube-controller"
Mar 10 22:04:26 crc kubenswrapper[4919]: E0310 22:04:26.921094 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2e7c6fb-9e33-441d-9197-719929eb9e21" containerName="kubecfg-setup"
Mar 10 22:04:26 crc kubenswrapper[4919]: I0310 22:04:26.921100 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2e7c6fb-9e33-441d-9197-719929eb9e21" containerName="kubecfg-setup"
Mar 10 22:04:26 crc kubenswrapper[4919]: I0310 22:04:26.921206 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2e7c6fb-9e33-441d-9197-719929eb9e21" containerName="ovnkube-controller"
Mar 10 22:04:26 crc kubenswrapper[4919]: I0310 22:04:26.921217 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2e7c6fb-9e33-441d-9197-719929eb9e21" containerName="ovnkube-controller"
Mar 10 22:04:26 crc kubenswrapper[4919]: I0310 22:04:26.921225 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2e7c6fb-9e33-441d-9197-719929eb9e21" containerName="ovn-controller"
Mar 10 22:04:26 crc kubenswrapper[4919]: I0310 22:04:26.921234 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2e7c6fb-9e33-441d-9197-719929eb9e21" containerName="ovnkube-controller"
Mar 10 22:04:26 crc kubenswrapper[4919]: I0310 22:04:26.921240 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2e7c6fb-9e33-441d-9197-719929eb9e21" containerName="nbdb"
Mar 10 22:04:26 crc kubenswrapper[4919]: I0310 22:04:26.921248 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2e7c6fb-9e33-441d-9197-719929eb9e21" containerName="ovn-acl-logging"
Mar 10 22:04:26 crc kubenswrapper[4919]: I0310 22:04:26.921255 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2e7c6fb-9e33-441d-9197-719929eb9e21" containerName="sbdb"
Mar 10 22:04:26 crc kubenswrapper[4919]: I0310 22:04:26.921261 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2e7c6fb-9e33-441d-9197-719929eb9e21" containerName="northd"
Mar 10 22:04:26 crc kubenswrapper[4919]: I0310 22:04:26.921269 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="572d0cfe-f016-44c7-baa9-83166b19a691" containerName="oc"
Mar 10 22:04:26 crc kubenswrapper[4919]: I0310 22:04:26.921277 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2e7c6fb-9e33-441d-9197-719929eb9e21" containerName="ovnkube-controller"
Mar 10 22:04:26 crc kubenswrapper[4919]: I0310 22:04:26.921283 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2e7c6fb-9e33-441d-9197-719929eb9e21" containerName="kube-rbac-proxy-node"
Mar 10 22:04:26 crc kubenswrapper[4919]: I0310 22:04:26.921291 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2e7c6fb-9e33-441d-9197-719929eb9e21" containerName="kube-rbac-proxy-ovn-metrics"
Mar 10 22:04:26 crc kubenswrapper[4919]: E0310 22:04:26.921373 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2e7c6fb-9e33-441d-9197-719929eb9e21" containerName="ovnkube-controller"
Mar 10 22:04:26 crc kubenswrapper[4919]: I0310 22:04:26.921379 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2e7c6fb-9e33-441d-9197-719929eb9e21" containerName="ovnkube-controller"
Mar 10 22:04:26 crc kubenswrapper[4919]: I0310 22:04:26.921508 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2e7c6fb-9e33-441d-9197-719929eb9e21" containerName="ovnkube-controller"
Mar 10 22:04:26 crc kubenswrapper[4919]: I0310 22:04:26.923056 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-bf975"
Mar 10 22:04:26 crc kubenswrapper[4919]: I0310 22:04:26.994414 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a2e7c6fb-9e33-441d-9197-719929eb9e21-host-cni-netd\") pod \"a2e7c6fb-9e33-441d-9197-719929eb9e21\" (UID: \"a2e7c6fb-9e33-441d-9197-719929eb9e21\") "
Mar 10 22:04:26 crc kubenswrapper[4919]: I0310 22:04:26.994501 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a2e7c6fb-9e33-441d-9197-719929eb9e21-host-cni-bin\") pod \"a2e7c6fb-9e33-441d-9197-719929eb9e21\" (UID: \"a2e7c6fb-9e33-441d-9197-719929eb9e21\") "
Mar 10 22:04:26 crc kubenswrapper[4919]: I0310 22:04:26.994530 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a2e7c6fb-9e33-441d-9197-719929eb9e21-log-socket\") pod \"a2e7c6fb-9e33-441d-9197-719929eb9e21\" (UID: \"a2e7c6fb-9e33-441d-9197-719929eb9e21\") "
Mar 10 22:04:26 crc kubenswrapper[4919]: I0310 22:04:26.994548 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a2e7c6fb-9e33-441d-9197-719929eb9e21-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "a2e7c6fb-9e33-441d-9197-719929eb9e21" (UID: "a2e7c6fb-9e33-441d-9197-719929eb9e21"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 10 22:04:26 crc kubenswrapper[4919]: I0310 22:04:26.994599 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a2e7c6fb-9e33-441d-9197-719929eb9e21-host-var-lib-cni-networks-ovn-kubernetes\") pod \"a2e7c6fb-9e33-441d-9197-719929eb9e21\" (UID: \"a2e7c6fb-9e33-441d-9197-719929eb9e21\") "
Mar 10 22:04:26 crc kubenswrapper[4919]: I0310 22:04:26.994595 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a2e7c6fb-9e33-441d-9197-719929eb9e21-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "a2e7c6fb-9e33-441d-9197-719929eb9e21" (UID: "a2e7c6fb-9e33-441d-9197-719929eb9e21"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 10 22:04:26 crc kubenswrapper[4919]: I0310 22:04:26.994625 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a2e7c6fb-9e33-441d-9197-719929eb9e21-log-socket" (OuterVolumeSpecName: "log-socket") pod "a2e7c6fb-9e33-441d-9197-719929eb9e21" (UID: "a2e7c6fb-9e33-441d-9197-719929eb9e21"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 10 22:04:26 crc kubenswrapper[4919]: I0310 22:04:26.994645 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a2e7c6fb-9e33-441d-9197-719929eb9e21-host-run-netns\") pod \"a2e7c6fb-9e33-441d-9197-719929eb9e21\" (UID: \"a2e7c6fb-9e33-441d-9197-719929eb9e21\") "
Mar 10 22:04:26 crc kubenswrapper[4919]: I0310 22:04:26.994727 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a2e7c6fb-9e33-441d-9197-719929eb9e21-run-openvswitch\") pod \"a2e7c6fb-9e33-441d-9197-719929eb9e21\" (UID: \"a2e7c6fb-9e33-441d-9197-719929eb9e21\") "
Mar 10 22:04:26 crc kubenswrapper[4919]: I0310 22:04:26.994654 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a2e7c6fb-9e33-441d-9197-719929eb9e21-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "a2e7c6fb-9e33-441d-9197-719929eb9e21" (UID: "a2e7c6fb-9e33-441d-9197-719929eb9e21"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 10 22:04:26 crc kubenswrapper[4919]: I0310 22:04:26.994689 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a2e7c6fb-9e33-441d-9197-719929eb9e21-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "a2e7c6fb-9e33-441d-9197-719929eb9e21" (UID: "a2e7c6fb-9e33-441d-9197-719929eb9e21"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 10 22:04:26 crc kubenswrapper[4919]: I0310 22:04:26.994746 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a2e7c6fb-9e33-441d-9197-719929eb9e21-host-slash\") pod \"a2e7c6fb-9e33-441d-9197-719929eb9e21\" (UID: \"a2e7c6fb-9e33-441d-9197-719929eb9e21\") "
Mar 10 22:04:26 crc kubenswrapper[4919]: I0310 22:04:26.994766 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a2e7c6fb-9e33-441d-9197-719929eb9e21-host-slash" (OuterVolumeSpecName: "host-slash") pod "a2e7c6fb-9e33-441d-9197-719929eb9e21" (UID: "a2e7c6fb-9e33-441d-9197-719929eb9e21"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 10 22:04:26 crc kubenswrapper[4919]: I0310 22:04:26.994804 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a2e7c6fb-9e33-441d-9197-719929eb9e21-ovnkube-config\") pod \"a2e7c6fb-9e33-441d-9197-719929eb9e21\" (UID: \"a2e7c6fb-9e33-441d-9197-719929eb9e21\") "
Mar 10 22:04:26 crc kubenswrapper[4919]: I0310 22:04:26.994849 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a2e7c6fb-9e33-441d-9197-719929eb9e21-env-overrides\") pod \"a2e7c6fb-9e33-441d-9197-719929eb9e21\" (UID: \"a2e7c6fb-9e33-441d-9197-719929eb9e21\") "
Mar 10 22:04:26 crc kubenswrapper[4919]: I0310 22:04:26.994837 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a2e7c6fb-9e33-441d-9197-719929eb9e21-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "a2e7c6fb-9e33-441d-9197-719929eb9e21" (UID: "a2e7c6fb-9e33-441d-9197-719929eb9e21"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 10 22:04:26 crc kubenswrapper[4919]: I0310 22:04:26.994885 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a2e7c6fb-9e33-441d-9197-719929eb9e21-run-systemd\") pod \"a2e7c6fb-9e33-441d-9197-719929eb9e21\" (UID: \"a2e7c6fb-9e33-441d-9197-719929eb9e21\") "
Mar 10 22:04:26 crc kubenswrapper[4919]: I0310 22:04:26.994917 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a2e7c6fb-9e33-441d-9197-719929eb9e21-host-kubelet\") pod \"a2e7c6fb-9e33-441d-9197-719929eb9e21\" (UID: \"a2e7c6fb-9e33-441d-9197-719929eb9e21\") "
Mar 10 22:04:26 crc kubenswrapper[4919]: I0310 22:04:26.994956 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a2e7c6fb-9e33-441d-9197-719929eb9e21-ovn-node-metrics-cert\") pod \"a2e7c6fb-9e33-441d-9197-719929eb9e21\" (UID: \"a2e7c6fb-9e33-441d-9197-719929eb9e21\") "
Mar 10 22:04:26 crc kubenswrapper[4919]: I0310 22:04:26.994994 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a2e7c6fb-9e33-441d-9197-719929eb9e21-host-run-ovn-kubernetes\") pod \"a2e7c6fb-9e33-441d-9197-719929eb9e21\" (UID: \"a2e7c6fb-9e33-441d-9197-719929eb9e21\") "
Mar 10 22:04:26 crc kubenswrapper[4919]: I0310 22:04:26.995033 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c5rvw\" (UniqueName: \"kubernetes.io/projected/a2e7c6fb-9e33-441d-9197-719929eb9e21-kube-api-access-c5rvw\") pod \"a2e7c6fb-9e33-441d-9197-719929eb9e21\" (UID: \"a2e7c6fb-9e33-441d-9197-719929eb9e21\") "
Mar 10 22:04:26 crc kubenswrapper[4919]: I0310 22:04:26.995066 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a2e7c6fb-9e33-441d-9197-719929eb9e21-systemd-units\") pod \"a2e7c6fb-9e33-441d-9197-719929eb9e21\" (UID: \"a2e7c6fb-9e33-441d-9197-719929eb9e21\") "
Mar 10 22:04:26 crc kubenswrapper[4919]: I0310 22:04:26.995107 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a2e7c6fb-9e33-441d-9197-719929eb9e21-var-lib-openvswitch\") pod \"a2e7c6fb-9e33-441d-9197-719929eb9e21\" (UID: \"a2e7c6fb-9e33-441d-9197-719929eb9e21\") "
Mar 10 22:04:26 crc kubenswrapper[4919]: I0310 22:04:26.995146 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a2e7c6fb-9e33-441d-9197-719929eb9e21-etc-openvswitch\") pod \"a2e7c6fb-9e33-441d-9197-719929eb9e21\" (UID: \"a2e7c6fb-9e33-441d-9197-719929eb9e21\") "
Mar 10 22:04:26 crc kubenswrapper[4919]: I0310 22:04:26.995177 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a2e7c6fb-9e33-441d-9197-719929eb9e21-run-ovn\") pod \"a2e7c6fb-9e33-441d-9197-719929eb9e21\" (UID: \"a2e7c6fb-9e33-441d-9197-719929eb9e21\") "
Mar 10 22:04:26 crc kubenswrapper[4919]: I0310 22:04:26.995211 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a2e7c6fb-9e33-441d-9197-719929eb9e21-node-log\") pod \"a2e7c6fb-9e33-441d-9197-719929eb9e21\" (UID: \"a2e7c6fb-9e33-441d-9197-719929eb9e21\") "
Mar 10 22:04:26 crc kubenswrapper[4919]: I0310 22:04:26.995241 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a2e7c6fb-9e33-441d-9197-719929eb9e21-ovnkube-script-lib\") pod \"a2e7c6fb-9e33-441d-9197-719929eb9e21\" (UID: \"a2e7c6fb-9e33-441d-9197-719929eb9e21\") "
Mar 10 22:04:26 crc kubenswrapper[4919]: I0310 22:04:26.995326 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a2e7c6fb-9e33-441d-9197-719929eb9e21-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "a2e7c6fb-9e33-441d-9197-719929eb9e21" (UID: "a2e7c6fb-9e33-441d-9197-719929eb9e21"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 22:04:26 crc kubenswrapper[4919]: I0310 22:04:26.995358 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a2e7c6fb-9e33-441d-9197-719929eb9e21-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "a2e7c6fb-9e33-441d-9197-719929eb9e21" (UID: "a2e7c6fb-9e33-441d-9197-719929eb9e21"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 10 22:04:26 crc kubenswrapper[4919]: I0310 22:04:26.995380 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a2e7c6fb-9e33-441d-9197-719929eb9e21-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "a2e7c6fb-9e33-441d-9197-719929eb9e21" (UID: "a2e7c6fb-9e33-441d-9197-719929eb9e21"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 10 22:04:26 crc kubenswrapper[4919]: I0310 22:04:26.995473 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a2e7c6fb-9e33-441d-9197-719929eb9e21-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "a2e7c6fb-9e33-441d-9197-719929eb9e21" (UID: "a2e7c6fb-9e33-441d-9197-719929eb9e21"). InnerVolumeSpecName "ovnkube-config".
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 22:04:26 crc kubenswrapper[4919]: I0310 22:04:26.995541 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a2e7c6fb-9e33-441d-9197-719929eb9e21-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "a2e7c6fb-9e33-441d-9197-719929eb9e21" (UID: "a2e7c6fb-9e33-441d-9197-719929eb9e21"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 22:04:26 crc kubenswrapper[4919]: I0310 22:04:26.995584 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a2e7c6fb-9e33-441d-9197-719929eb9e21-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "a2e7c6fb-9e33-441d-9197-719929eb9e21" (UID: "a2e7c6fb-9e33-441d-9197-719929eb9e21"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 22:04:26 crc kubenswrapper[4919]: I0310 22:04:26.995579 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a2e7c6fb-9e33-441d-9197-719929eb9e21-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "a2e7c6fb-9e33-441d-9197-719929eb9e21" (UID: "a2e7c6fb-9e33-441d-9197-719929eb9e21"). InnerVolumeSpecName "host-run-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 22:04:26 crc kubenswrapper[4919]: I0310 22:04:26.995599 4919 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a2e7c6fb-9e33-441d-9197-719929eb9e21-run-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 10 22:04:26 crc kubenswrapper[4919]: I0310 22:04:26.995624 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a2e7c6fb-9e33-441d-9197-719929eb9e21-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "a2e7c6fb-9e33-441d-9197-719929eb9e21" (UID: "a2e7c6fb-9e33-441d-9197-719929eb9e21"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 22:04:26 crc kubenswrapper[4919]: I0310 22:04:26.995632 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a2e7c6fb-9e33-441d-9197-719929eb9e21-node-log" (OuterVolumeSpecName: "node-log") pod "a2e7c6fb-9e33-441d-9197-719929eb9e21" (UID: "a2e7c6fb-9e33-441d-9197-719929eb9e21"). InnerVolumeSpecName "node-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 22:04:26 crc kubenswrapper[4919]: I0310 22:04:26.995645 4919 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a2e7c6fb-9e33-441d-9197-719929eb9e21-host-slash\") on node \"crc\" DevicePath \"\"" Mar 10 22:04:26 crc kubenswrapper[4919]: I0310 22:04:26.995668 4919 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a2e7c6fb-9e33-441d-9197-719929eb9e21-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 10 22:04:26 crc kubenswrapper[4919]: I0310 22:04:26.995685 4919 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a2e7c6fb-9e33-441d-9197-719929eb9e21-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 10 22:04:26 crc kubenswrapper[4919]: I0310 22:04:26.995701 4919 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a2e7c6fb-9e33-441d-9197-719929eb9e21-host-kubelet\") on node \"crc\" DevicePath \"\"" Mar 10 22:04:26 crc kubenswrapper[4919]: I0310 22:04:26.995719 4919 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a2e7c6fb-9e33-441d-9197-719929eb9e21-systemd-units\") on node \"crc\" DevicePath \"\"" Mar 10 22:04:26 crc kubenswrapper[4919]: I0310 22:04:26.995734 4919 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a2e7c6fb-9e33-441d-9197-719929eb9e21-host-cni-netd\") on node \"crc\" DevicePath \"\"" Mar 10 22:04:26 crc kubenswrapper[4919]: I0310 22:04:26.995749 4919 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a2e7c6fb-9e33-441d-9197-719929eb9e21-host-cni-bin\") on node \"crc\" DevicePath \"\"" Mar 10 22:04:26 crc kubenswrapper[4919]: I0310 22:04:26.995782 4919 
reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a2e7c6fb-9e33-441d-9197-719929eb9e21-log-socket\") on node \"crc\" DevicePath \"\"" Mar 10 22:04:26 crc kubenswrapper[4919]: I0310 22:04:26.995798 4919 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a2e7c6fb-9e33-441d-9197-719929eb9e21-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 10 22:04:26 crc kubenswrapper[4919]: I0310 22:04:26.995818 4919 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a2e7c6fb-9e33-441d-9197-719929eb9e21-host-run-netns\") on node \"crc\" DevicePath \"\"" Mar 10 22:04:26 crc kubenswrapper[4919]: I0310 22:04:26.996058 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a2e7c6fb-9e33-441d-9197-719929eb9e21-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "a2e7c6fb-9e33-441d-9197-719929eb9e21" (UID: "a2e7c6fb-9e33-441d-9197-719929eb9e21"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.000707 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2e7c6fb-9e33-441d-9197-719929eb9e21-kube-api-access-c5rvw" (OuterVolumeSpecName: "kube-api-access-c5rvw") pod "a2e7c6fb-9e33-441d-9197-719929eb9e21" (UID: "a2e7c6fb-9e33-441d-9197-719929eb9e21"). InnerVolumeSpecName "kube-api-access-c5rvw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.001014 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2e7c6fb-9e33-441d-9197-719929eb9e21-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "a2e7c6fb-9e33-441d-9197-719929eb9e21" (UID: "a2e7c6fb-9e33-441d-9197-719929eb9e21"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.007650 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a2e7c6fb-9e33-441d-9197-719929eb9e21-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "a2e7c6fb-9e33-441d-9197-719929eb9e21" (UID: "a2e7c6fb-9e33-441d-9197-719929eb9e21"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.097507 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/5e58dc1e-ca00-426e-ba2c-42b5f8184915-host-kubelet\") pod \"ovnkube-node-bf975\" (UID: \"5e58dc1e-ca00-426e-ba2c-42b5f8184915\") " pod="openshift-ovn-kubernetes/ovnkube-node-bf975" Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.097591 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5e58dc1e-ca00-426e-ba2c-42b5f8184915-env-overrides\") pod \"ovnkube-node-bf975\" (UID: \"5e58dc1e-ca00-426e-ba2c-42b5f8184915\") " pod="openshift-ovn-kubernetes/ovnkube-node-bf975" Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.097627 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/5e58dc1e-ca00-426e-ba2c-42b5f8184915-var-lib-openvswitch\") pod \"ovnkube-node-bf975\" (UID: \"5e58dc1e-ca00-426e-ba2c-42b5f8184915\") " pod="openshift-ovn-kubernetes/ovnkube-node-bf975" Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.097661 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5e58dc1e-ca00-426e-ba2c-42b5f8184915-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-bf975\" (UID: \"5e58dc1e-ca00-426e-ba2c-42b5f8184915\") " pod="openshift-ovn-kubernetes/ovnkube-node-bf975" Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.097719 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5e58dc1e-ca00-426e-ba2c-42b5f8184915-host-run-netns\") pod \"ovnkube-node-bf975\" (UID: \"5e58dc1e-ca00-426e-ba2c-42b5f8184915\") " pod="openshift-ovn-kubernetes/ovnkube-node-bf975" Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.097759 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5e58dc1e-ca00-426e-ba2c-42b5f8184915-host-run-ovn-kubernetes\") pod \"ovnkube-node-bf975\" (UID: \"5e58dc1e-ca00-426e-ba2c-42b5f8184915\") " pod="openshift-ovn-kubernetes/ovnkube-node-bf975" Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.097793 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5e58dc1e-ca00-426e-ba2c-42b5f8184915-etc-openvswitch\") pod \"ovnkube-node-bf975\" (UID: \"5e58dc1e-ca00-426e-ba2c-42b5f8184915\") " pod="openshift-ovn-kubernetes/ovnkube-node-bf975" Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.097820 4919 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/5e58dc1e-ca00-426e-ba2c-42b5f8184915-log-socket\") pod \"ovnkube-node-bf975\" (UID: \"5e58dc1e-ca00-426e-ba2c-42b5f8184915\") " pod="openshift-ovn-kubernetes/ovnkube-node-bf975" Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.098030 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/5e58dc1e-ca00-426e-ba2c-42b5f8184915-systemd-units\") pod \"ovnkube-node-bf975\" (UID: \"5e58dc1e-ca00-426e-ba2c-42b5f8184915\") " pod="openshift-ovn-kubernetes/ovnkube-node-bf975" Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.098115 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5e58dc1e-ca00-426e-ba2c-42b5f8184915-ovnkube-config\") pod \"ovnkube-node-bf975\" (UID: \"5e58dc1e-ca00-426e-ba2c-42b5f8184915\") " pod="openshift-ovn-kubernetes/ovnkube-node-bf975" Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.098188 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5e58dc1e-ca00-426e-ba2c-42b5f8184915-host-cni-bin\") pod \"ovnkube-node-bf975\" (UID: \"5e58dc1e-ca00-426e-ba2c-42b5f8184915\") " pod="openshift-ovn-kubernetes/ovnkube-node-bf975" Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.098289 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5e58dc1e-ca00-426e-ba2c-42b5f8184915-ovn-node-metrics-cert\") pod \"ovnkube-node-bf975\" (UID: \"5e58dc1e-ca00-426e-ba2c-42b5f8184915\") " pod="openshift-ovn-kubernetes/ovnkube-node-bf975" Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.098359 4919 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5e58dc1e-ca00-426e-ba2c-42b5f8184915-run-openvswitch\") pod \"ovnkube-node-bf975\" (UID: \"5e58dc1e-ca00-426e-ba2c-42b5f8184915\") " pod="openshift-ovn-kubernetes/ovnkube-node-bf975" Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.098478 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/5e58dc1e-ca00-426e-ba2c-42b5f8184915-run-systemd\") pod \"ovnkube-node-bf975\" (UID: \"5e58dc1e-ca00-426e-ba2c-42b5f8184915\") " pod="openshift-ovn-kubernetes/ovnkube-node-bf975" Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.098561 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/5e58dc1e-ca00-426e-ba2c-42b5f8184915-node-log\") pod \"ovnkube-node-bf975\" (UID: \"5e58dc1e-ca00-426e-ba2c-42b5f8184915\") " pod="openshift-ovn-kubernetes/ovnkube-node-bf975" Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.098611 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s54l7\" (UniqueName: \"kubernetes.io/projected/5e58dc1e-ca00-426e-ba2c-42b5f8184915-kube-api-access-s54l7\") pod \"ovnkube-node-bf975\" (UID: \"5e58dc1e-ca00-426e-ba2c-42b5f8184915\") " pod="openshift-ovn-kubernetes/ovnkube-node-bf975" Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.098677 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/5e58dc1e-ca00-426e-ba2c-42b5f8184915-host-cni-netd\") pod \"ovnkube-node-bf975\" (UID: \"5e58dc1e-ca00-426e-ba2c-42b5f8184915\") " pod="openshift-ovn-kubernetes/ovnkube-node-bf975" Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 
22:04:27.098717 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/5e58dc1e-ca00-426e-ba2c-42b5f8184915-run-ovn\") pod \"ovnkube-node-bf975\" (UID: \"5e58dc1e-ca00-426e-ba2c-42b5f8184915\") " pod="openshift-ovn-kubernetes/ovnkube-node-bf975" Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.098767 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5e58dc1e-ca00-426e-ba2c-42b5f8184915-host-slash\") pod \"ovnkube-node-bf975\" (UID: \"5e58dc1e-ca00-426e-ba2c-42b5f8184915\") " pod="openshift-ovn-kubernetes/ovnkube-node-bf975" Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.098814 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/5e58dc1e-ca00-426e-ba2c-42b5f8184915-ovnkube-script-lib\") pod \"ovnkube-node-bf975\" (UID: \"5e58dc1e-ca00-426e-ba2c-42b5f8184915\") " pod="openshift-ovn-kubernetes/ovnkube-node-bf975" Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.098908 4919 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a2e7c6fb-9e33-441d-9197-719929eb9e21-run-systemd\") on node \"crc\" DevicePath \"\"" Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.098936 4919 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a2e7c6fb-9e33-441d-9197-719929eb9e21-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.098963 4919 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a2e7c6fb-9e33-441d-9197-719929eb9e21-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 10 22:04:27 crc 
kubenswrapper[4919]: I0310 22:04:27.098986 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c5rvw\" (UniqueName: \"kubernetes.io/projected/a2e7c6fb-9e33-441d-9197-719929eb9e21-kube-api-access-c5rvw\") on node \"crc\" DevicePath \"\"" Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.099010 4919 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a2e7c6fb-9e33-441d-9197-719929eb9e21-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.099035 4919 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a2e7c6fb-9e33-441d-9197-719929eb9e21-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.099057 4919 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a2e7c6fb-9e33-441d-9197-719929eb9e21-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.099080 4919 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a2e7c6fb-9e33-441d-9197-719929eb9e21-node-log\") on node \"crc\" DevicePath \"\"" Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.099102 4919 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a2e7c6fb-9e33-441d-9197-719929eb9e21-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.199380 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/5e58dc1e-ca00-426e-ba2c-42b5f8184915-host-cni-netd\") pod \"ovnkube-node-bf975\" (UID: \"5e58dc1e-ca00-426e-ba2c-42b5f8184915\") " pod="openshift-ovn-kubernetes/ovnkube-node-bf975" Mar 10 22:04:27 crc 
kubenswrapper[4919]: I0310 22:04:27.199449 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/5e58dc1e-ca00-426e-ba2c-42b5f8184915-run-ovn\") pod \"ovnkube-node-bf975\" (UID: \"5e58dc1e-ca00-426e-ba2c-42b5f8184915\") " pod="openshift-ovn-kubernetes/ovnkube-node-bf975" Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.199481 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5e58dc1e-ca00-426e-ba2c-42b5f8184915-host-slash\") pod \"ovnkube-node-bf975\" (UID: \"5e58dc1e-ca00-426e-ba2c-42b5f8184915\") " pod="openshift-ovn-kubernetes/ovnkube-node-bf975" Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.199503 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/5e58dc1e-ca00-426e-ba2c-42b5f8184915-ovnkube-script-lib\") pod \"ovnkube-node-bf975\" (UID: \"5e58dc1e-ca00-426e-ba2c-42b5f8184915\") " pod="openshift-ovn-kubernetes/ovnkube-node-bf975" Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.199531 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/5e58dc1e-ca00-426e-ba2c-42b5f8184915-host-kubelet\") pod \"ovnkube-node-bf975\" (UID: \"5e58dc1e-ca00-426e-ba2c-42b5f8184915\") " pod="openshift-ovn-kubernetes/ovnkube-node-bf975" Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.199553 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5e58dc1e-ca00-426e-ba2c-42b5f8184915-var-lib-openvswitch\") pod \"ovnkube-node-bf975\" (UID: \"5e58dc1e-ca00-426e-ba2c-42b5f8184915\") " pod="openshift-ovn-kubernetes/ovnkube-node-bf975" Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.199576 4919 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5e58dc1e-ca00-426e-ba2c-42b5f8184915-env-overrides\") pod \"ovnkube-node-bf975\" (UID: \"5e58dc1e-ca00-426e-ba2c-42b5f8184915\") " pod="openshift-ovn-kubernetes/ovnkube-node-bf975" Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.199598 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5e58dc1e-ca00-426e-ba2c-42b5f8184915-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-bf975\" (UID: \"5e58dc1e-ca00-426e-ba2c-42b5f8184915\") " pod="openshift-ovn-kubernetes/ovnkube-node-bf975" Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.199630 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5e58dc1e-ca00-426e-ba2c-42b5f8184915-host-run-netns\") pod \"ovnkube-node-bf975\" (UID: \"5e58dc1e-ca00-426e-ba2c-42b5f8184915\") " pod="openshift-ovn-kubernetes/ovnkube-node-bf975" Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.199835 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5e58dc1e-ca00-426e-ba2c-42b5f8184915-host-run-ovn-kubernetes\") pod \"ovnkube-node-bf975\" (UID: \"5e58dc1e-ca00-426e-ba2c-42b5f8184915\") " pod="openshift-ovn-kubernetes/ovnkube-node-bf975" Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.199859 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5e58dc1e-ca00-426e-ba2c-42b5f8184915-etc-openvswitch\") pod \"ovnkube-node-bf975\" (UID: \"5e58dc1e-ca00-426e-ba2c-42b5f8184915\") " pod="openshift-ovn-kubernetes/ovnkube-node-bf975" Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.199881 4919 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/5e58dc1e-ca00-426e-ba2c-42b5f8184915-log-socket\") pod \"ovnkube-node-bf975\" (UID: \"5e58dc1e-ca00-426e-ba2c-42b5f8184915\") " pod="openshift-ovn-kubernetes/ovnkube-node-bf975" Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.199905 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/5e58dc1e-ca00-426e-ba2c-42b5f8184915-systemd-units\") pod \"ovnkube-node-bf975\" (UID: \"5e58dc1e-ca00-426e-ba2c-42b5f8184915\") " pod="openshift-ovn-kubernetes/ovnkube-node-bf975" Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.199930 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5e58dc1e-ca00-426e-ba2c-42b5f8184915-ovnkube-config\") pod \"ovnkube-node-bf975\" (UID: \"5e58dc1e-ca00-426e-ba2c-42b5f8184915\") " pod="openshift-ovn-kubernetes/ovnkube-node-bf975" Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.199960 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5e58dc1e-ca00-426e-ba2c-42b5f8184915-host-cni-bin\") pod \"ovnkube-node-bf975\" (UID: \"5e58dc1e-ca00-426e-ba2c-42b5f8184915\") " pod="openshift-ovn-kubernetes/ovnkube-node-bf975" Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.199982 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5e58dc1e-ca00-426e-ba2c-42b5f8184915-ovn-node-metrics-cert\") pod \"ovnkube-node-bf975\" (UID: \"5e58dc1e-ca00-426e-ba2c-42b5f8184915\") " pod="openshift-ovn-kubernetes/ovnkube-node-bf975" Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.200009 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/5e58dc1e-ca00-426e-ba2c-42b5f8184915-run-openvswitch\") pod \"ovnkube-node-bf975\" (UID: \"5e58dc1e-ca00-426e-ba2c-42b5f8184915\") " pod="openshift-ovn-kubernetes/ovnkube-node-bf975" Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.200037 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/5e58dc1e-ca00-426e-ba2c-42b5f8184915-run-systemd\") pod \"ovnkube-node-bf975\" (UID: \"5e58dc1e-ca00-426e-ba2c-42b5f8184915\") " pod="openshift-ovn-kubernetes/ovnkube-node-bf975" Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.200058 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/5e58dc1e-ca00-426e-ba2c-42b5f8184915-node-log\") pod \"ovnkube-node-bf975\" (UID: \"5e58dc1e-ca00-426e-ba2c-42b5f8184915\") " pod="openshift-ovn-kubernetes/ovnkube-node-bf975" Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.200081 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s54l7\" (UniqueName: \"kubernetes.io/projected/5e58dc1e-ca00-426e-ba2c-42b5f8184915-kube-api-access-s54l7\") pod \"ovnkube-node-bf975\" (UID: \"5e58dc1e-ca00-426e-ba2c-42b5f8184915\") " pod="openshift-ovn-kubernetes/ovnkube-node-bf975" Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.200515 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/5e58dc1e-ca00-426e-ba2c-42b5f8184915-host-cni-netd\") pod \"ovnkube-node-bf975\" (UID: \"5e58dc1e-ca00-426e-ba2c-42b5f8184915\") " pod="openshift-ovn-kubernetes/ovnkube-node-bf975" Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.200560 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/5e58dc1e-ca00-426e-ba2c-42b5f8184915-run-ovn\") pod \"ovnkube-node-bf975\" (UID: 
\"5e58dc1e-ca00-426e-ba2c-42b5f8184915\") " pod="openshift-ovn-kubernetes/ovnkube-node-bf975" Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.200589 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5e58dc1e-ca00-426e-ba2c-42b5f8184915-host-slash\") pod \"ovnkube-node-bf975\" (UID: \"5e58dc1e-ca00-426e-ba2c-42b5f8184915\") " pod="openshift-ovn-kubernetes/ovnkube-node-bf975" Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.201334 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/5e58dc1e-ca00-426e-ba2c-42b5f8184915-ovnkube-script-lib\") pod \"ovnkube-node-bf975\" (UID: \"5e58dc1e-ca00-426e-ba2c-42b5f8184915\") " pod="openshift-ovn-kubernetes/ovnkube-node-bf975" Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.201406 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/5e58dc1e-ca00-426e-ba2c-42b5f8184915-host-kubelet\") pod \"ovnkube-node-bf975\" (UID: \"5e58dc1e-ca00-426e-ba2c-42b5f8184915\") " pod="openshift-ovn-kubernetes/ovnkube-node-bf975" Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.201438 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5e58dc1e-ca00-426e-ba2c-42b5f8184915-var-lib-openvswitch\") pod \"ovnkube-node-bf975\" (UID: \"5e58dc1e-ca00-426e-ba2c-42b5f8184915\") " pod="openshift-ovn-kubernetes/ovnkube-node-bf975" Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.202218 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5e58dc1e-ca00-426e-ba2c-42b5f8184915-env-overrides\") pod \"ovnkube-node-bf975\" (UID: \"5e58dc1e-ca00-426e-ba2c-42b5f8184915\") " pod="openshift-ovn-kubernetes/ovnkube-node-bf975" Mar 10 22:04:27 crc 
kubenswrapper[4919]: I0310 22:04:27.206502 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5e58dc1e-ca00-426e-ba2c-42b5f8184915-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-bf975\" (UID: \"5e58dc1e-ca00-426e-ba2c-42b5f8184915\") " pod="openshift-ovn-kubernetes/ovnkube-node-bf975" Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.206543 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5e58dc1e-ca00-426e-ba2c-42b5f8184915-host-run-ovn-kubernetes\") pod \"ovnkube-node-bf975\" (UID: \"5e58dc1e-ca00-426e-ba2c-42b5f8184915\") " pod="openshift-ovn-kubernetes/ovnkube-node-bf975" Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.206698 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/5e58dc1e-ca00-426e-ba2c-42b5f8184915-systemd-units\") pod \"ovnkube-node-bf975\" (UID: \"5e58dc1e-ca00-426e-ba2c-42b5f8184915\") " pod="openshift-ovn-kubernetes/ovnkube-node-bf975" Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.206630 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5e58dc1e-ca00-426e-ba2c-42b5f8184915-host-run-netns\") pod \"ovnkube-node-bf975\" (UID: \"5e58dc1e-ca00-426e-ba2c-42b5f8184915\") " pod="openshift-ovn-kubernetes/ovnkube-node-bf975" Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.206753 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5e58dc1e-ca00-426e-ba2c-42b5f8184915-run-openvswitch\") pod \"ovnkube-node-bf975\" (UID: \"5e58dc1e-ca00-426e-ba2c-42b5f8184915\") " pod="openshift-ovn-kubernetes/ovnkube-node-bf975" Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.206785 4919 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/5e58dc1e-ca00-426e-ba2c-42b5f8184915-run-systemd\") pod \"ovnkube-node-bf975\" (UID: \"5e58dc1e-ca00-426e-ba2c-42b5f8184915\") " pod="openshift-ovn-kubernetes/ovnkube-node-bf975" Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.206783 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5e58dc1e-ca00-426e-ba2c-42b5f8184915-host-cni-bin\") pod \"ovnkube-node-bf975\" (UID: \"5e58dc1e-ca00-426e-ba2c-42b5f8184915\") " pod="openshift-ovn-kubernetes/ovnkube-node-bf975" Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.206866 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/5e58dc1e-ca00-426e-ba2c-42b5f8184915-log-socket\") pod \"ovnkube-node-bf975\" (UID: \"5e58dc1e-ca00-426e-ba2c-42b5f8184915\") " pod="openshift-ovn-kubernetes/ovnkube-node-bf975" Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.206913 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5e58dc1e-ca00-426e-ba2c-42b5f8184915-etc-openvswitch\") pod \"ovnkube-node-bf975\" (UID: \"5e58dc1e-ca00-426e-ba2c-42b5f8184915\") " pod="openshift-ovn-kubernetes/ovnkube-node-bf975" Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.206934 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/5e58dc1e-ca00-426e-ba2c-42b5f8184915-node-log\") pod \"ovnkube-node-bf975\" (UID: \"5e58dc1e-ca00-426e-ba2c-42b5f8184915\") " pod="openshift-ovn-kubernetes/ovnkube-node-bf975" Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.207744 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/5e58dc1e-ca00-426e-ba2c-42b5f8184915-ovnkube-config\") pod \"ovnkube-node-bf975\" (UID: \"5e58dc1e-ca00-426e-ba2c-42b5f8184915\") " pod="openshift-ovn-kubernetes/ovnkube-node-bf975" Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.211981 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5e58dc1e-ca00-426e-ba2c-42b5f8184915-ovn-node-metrics-cert\") pod \"ovnkube-node-bf975\" (UID: \"5e58dc1e-ca00-426e-ba2c-42b5f8184915\") " pod="openshift-ovn-kubernetes/ovnkube-node-bf975" Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.223273 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-hbw8v_6a5db7c3-2a96-4030-8c88-5d82d325b62d/kube-multus/2.log" Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.223651 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-hbw8v_6a5db7c3-2a96-4030-8c88-5d82d325b62d/kube-multus/1.log" Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.223683 4919 generic.go:334] "Generic (PLEG): container finished" podID="6a5db7c3-2a96-4030-8c88-5d82d325b62d" containerID="45570c9a8a9f7a51b2de68cee5cd8f8ae4fc089c9db6203a5d0b78f77094f15a" exitCode=2 Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.223727 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-hbw8v" event={"ID":"6a5db7c3-2a96-4030-8c88-5d82d325b62d","Type":"ContainerDied","Data":"45570c9a8a9f7a51b2de68cee5cd8f8ae4fc089c9db6203a5d0b78f77094f15a"} Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.223758 4919 scope.go:117] "RemoveContainer" containerID="0ea0659cf18bee888c2408100c1de192eb8da3991c3158d708c3083d31a61bdc" Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.224149 4919 scope.go:117] "RemoveContainer" containerID="45570c9a8a9f7a51b2de68cee5cd8f8ae4fc089c9db6203a5d0b78f77094f15a" Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 
22:04:27.229610 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4dp67_a2e7c6fb-9e33-441d-9197-719929eb9e21/ovnkube-controller/3.log" Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.234423 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s54l7\" (UniqueName: \"kubernetes.io/projected/5e58dc1e-ca00-426e-ba2c-42b5f8184915-kube-api-access-s54l7\") pod \"ovnkube-node-bf975\" (UID: \"5e58dc1e-ca00-426e-ba2c-42b5f8184915\") " pod="openshift-ovn-kubernetes/ovnkube-node-bf975" Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.234511 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4dp67_a2e7c6fb-9e33-441d-9197-719929eb9e21/ovn-acl-logging/0.log" Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.235342 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4dp67_a2e7c6fb-9e33-441d-9197-719929eb9e21/ovn-controller/0.log" Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.235928 4919 generic.go:334] "Generic (PLEG): container finished" podID="a2e7c6fb-9e33-441d-9197-719929eb9e21" containerID="55060f2f273513318e33f5f284462012a35e322decd9cdaeeea0602acc036b23" exitCode=0 Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.235964 4919 generic.go:334] "Generic (PLEG): container finished" podID="a2e7c6fb-9e33-441d-9197-719929eb9e21" containerID="81583105305b81fde3bde968fbbd5463ad1100d8f444b22f62764c4676b95325" exitCode=0 Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.235975 4919 generic.go:334] "Generic (PLEG): container finished" podID="a2e7c6fb-9e33-441d-9197-719929eb9e21" containerID="4dd430eed1c68d77dcea63fe5885eec7aae3fee6a0008bd9e145b93aef04aa26" exitCode=0 Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.235988 4919 generic.go:334] "Generic (PLEG): container finished" podID="a2e7c6fb-9e33-441d-9197-719929eb9e21" 
containerID="648a881496440d2732bd49969e863fcde223e95156c280e7699782a5521f1580" exitCode=0 Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.235998 4919 generic.go:334] "Generic (PLEG): container finished" podID="a2e7c6fb-9e33-441d-9197-719929eb9e21" containerID="c1a8f975d3caae031d32429c7fdf67356ff18bae1f0eefa3b99758d6c461a736" exitCode=0 Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.236009 4919 generic.go:334] "Generic (PLEG): container finished" podID="a2e7c6fb-9e33-441d-9197-719929eb9e21" containerID="ad5abb4f4c3ead1b05c1f3efbb6bb59fc71bcf5796e16a558ef7e16375571fb0" exitCode=0 Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.236019 4919 generic.go:334] "Generic (PLEG): container finished" podID="a2e7c6fb-9e33-441d-9197-719929eb9e21" containerID="06e9ce48fc10caa34e8b774ce3039117386ffe276cf268d197ac46518a4baf91" exitCode=143 Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.236032 4919 generic.go:334] "Generic (PLEG): container finished" podID="a2e7c6fb-9e33-441d-9197-719929eb9e21" containerID="a9fa1c50eff4c6bb37ea60631a4278c6a15d6398d6fad6fb1a914bd6cccf15b2" exitCode=143 Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.236064 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4dp67" event={"ID":"a2e7c6fb-9e33-441d-9197-719929eb9e21","Type":"ContainerDied","Data":"55060f2f273513318e33f5f284462012a35e322decd9cdaeeea0602acc036b23"} Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.236093 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-4dp67" Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.236105 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4dp67" event={"ID":"a2e7c6fb-9e33-441d-9197-719929eb9e21","Type":"ContainerDied","Data":"81583105305b81fde3bde968fbbd5463ad1100d8f444b22f62764c4676b95325"} Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.236122 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4dp67" event={"ID":"a2e7c6fb-9e33-441d-9197-719929eb9e21","Type":"ContainerDied","Data":"4dd430eed1c68d77dcea63fe5885eec7aae3fee6a0008bd9e145b93aef04aa26"} Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.236135 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4dp67" event={"ID":"a2e7c6fb-9e33-441d-9197-719929eb9e21","Type":"ContainerDied","Data":"648a881496440d2732bd49969e863fcde223e95156c280e7699782a5521f1580"} Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.236148 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4dp67" event={"ID":"a2e7c6fb-9e33-441d-9197-719929eb9e21","Type":"ContainerDied","Data":"c1a8f975d3caae031d32429c7fdf67356ff18bae1f0eefa3b99758d6c461a736"} Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.236163 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4dp67" event={"ID":"a2e7c6fb-9e33-441d-9197-719929eb9e21","Type":"ContainerDied","Data":"ad5abb4f4c3ead1b05c1f3efbb6bb59fc71bcf5796e16a558ef7e16375571fb0"} Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.236180 4919 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"55060f2f273513318e33f5f284462012a35e322decd9cdaeeea0602acc036b23"} Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.236195 4919 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8424a944ffb95aa4e069024df52cf69f2381dc0498735572e7cf94519fe6d880"} Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.236203 4919 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"81583105305b81fde3bde968fbbd5463ad1100d8f444b22f62764c4676b95325"} Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.236211 4919 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4dd430eed1c68d77dcea63fe5885eec7aae3fee6a0008bd9e145b93aef04aa26"} Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.236219 4919 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"648a881496440d2732bd49969e863fcde223e95156c280e7699782a5521f1580"} Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.236226 4919 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c1a8f975d3caae031d32429c7fdf67356ff18bae1f0eefa3b99758d6c461a736"} Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.236233 4919 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ad5abb4f4c3ead1b05c1f3efbb6bb59fc71bcf5796e16a558ef7e16375571fb0"} Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.236240 4919 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"06e9ce48fc10caa34e8b774ce3039117386ffe276cf268d197ac46518a4baf91"} Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.236247 4919 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a9fa1c50eff4c6bb37ea60631a4278c6a15d6398d6fad6fb1a914bd6cccf15b2"} Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.236254 4919 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9ca4b1de4425e9aa53f9cf24a56759708297b223771278a445dcf171f4dd6493"} Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.236265 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4dp67" event={"ID":"a2e7c6fb-9e33-441d-9197-719929eb9e21","Type":"ContainerDied","Data":"06e9ce48fc10caa34e8b774ce3039117386ffe276cf268d197ac46518a4baf91"} Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.236276 4919 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"55060f2f273513318e33f5f284462012a35e322decd9cdaeeea0602acc036b23"} Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.236285 4919 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8424a944ffb95aa4e069024df52cf69f2381dc0498735572e7cf94519fe6d880"} Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.236291 4919 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"81583105305b81fde3bde968fbbd5463ad1100d8f444b22f62764c4676b95325"} Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.236298 4919 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4dd430eed1c68d77dcea63fe5885eec7aae3fee6a0008bd9e145b93aef04aa26"} Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.236305 4919 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"648a881496440d2732bd49969e863fcde223e95156c280e7699782a5521f1580"} Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.236313 4919 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c1a8f975d3caae031d32429c7fdf67356ff18bae1f0eefa3b99758d6c461a736"} Mar 10 
22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.236323 4919 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ad5abb4f4c3ead1b05c1f3efbb6bb59fc71bcf5796e16a558ef7e16375571fb0"} Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.236332 4919 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"06e9ce48fc10caa34e8b774ce3039117386ffe276cf268d197ac46518a4baf91"} Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.236340 4919 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a9fa1c50eff4c6bb37ea60631a4278c6a15d6398d6fad6fb1a914bd6cccf15b2"} Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.236347 4919 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9ca4b1de4425e9aa53f9cf24a56759708297b223771278a445dcf171f4dd6493"} Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.236357 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4dp67" event={"ID":"a2e7c6fb-9e33-441d-9197-719929eb9e21","Type":"ContainerDied","Data":"a9fa1c50eff4c6bb37ea60631a4278c6a15d6398d6fad6fb1a914bd6cccf15b2"} Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.236368 4919 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"55060f2f273513318e33f5f284462012a35e322decd9cdaeeea0602acc036b23"} Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.236376 4919 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8424a944ffb95aa4e069024df52cf69f2381dc0498735572e7cf94519fe6d880"} Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.236385 4919 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"81583105305b81fde3bde968fbbd5463ad1100d8f444b22f62764c4676b95325"} Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.236414 4919 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4dd430eed1c68d77dcea63fe5885eec7aae3fee6a0008bd9e145b93aef04aa26"} Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.236422 4919 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"648a881496440d2732bd49969e863fcde223e95156c280e7699782a5521f1580"} Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.236430 4919 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c1a8f975d3caae031d32429c7fdf67356ff18bae1f0eefa3b99758d6c461a736"} Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.236438 4919 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ad5abb4f4c3ead1b05c1f3efbb6bb59fc71bcf5796e16a558ef7e16375571fb0"} Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.236445 4919 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"06e9ce48fc10caa34e8b774ce3039117386ffe276cf268d197ac46518a4baf91"} Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.236454 4919 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a9fa1c50eff4c6bb37ea60631a4278c6a15d6398d6fad6fb1a914bd6cccf15b2"} Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.236462 4919 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9ca4b1de4425e9aa53f9cf24a56759708297b223771278a445dcf171f4dd6493"} Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.236473 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-4dp67" event={"ID":"a2e7c6fb-9e33-441d-9197-719929eb9e21","Type":"ContainerDied","Data":"d527ec7ec526f114e00e9b88c707cae3e833fb2d8c9853761e82310aa9ca2239"} Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.236486 4919 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"55060f2f273513318e33f5f284462012a35e322decd9cdaeeea0602acc036b23"} Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.236495 4919 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8424a944ffb95aa4e069024df52cf69f2381dc0498735572e7cf94519fe6d880"} Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.236503 4919 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"81583105305b81fde3bde968fbbd5463ad1100d8f444b22f62764c4676b95325"} Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.236511 4919 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4dd430eed1c68d77dcea63fe5885eec7aae3fee6a0008bd9e145b93aef04aa26"} Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.236519 4919 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"648a881496440d2732bd49969e863fcde223e95156c280e7699782a5521f1580"} Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.236528 4919 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c1a8f975d3caae031d32429c7fdf67356ff18bae1f0eefa3b99758d6c461a736"} Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.236538 4919 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ad5abb4f4c3ead1b05c1f3efbb6bb59fc71bcf5796e16a558ef7e16375571fb0"} Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 
22:04:27.236548 4919 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"06e9ce48fc10caa34e8b774ce3039117386ffe276cf268d197ac46518a4baf91"} Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.236559 4919 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a9fa1c50eff4c6bb37ea60631a4278c6a15d6398d6fad6fb1a914bd6cccf15b2"} Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.236572 4919 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9ca4b1de4425e9aa53f9cf24a56759708297b223771278a445dcf171f4dd6493"} Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.240139 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-bf975" Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.262522 4919 scope.go:117] "RemoveContainer" containerID="55060f2f273513318e33f5f284462012a35e322decd9cdaeeea0602acc036b23" Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.284756 4919 scope.go:117] "RemoveContainer" containerID="8424a944ffb95aa4e069024df52cf69f2381dc0498735572e7cf94519fe6d880" Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.292513 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-4dp67"] Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.298792 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-4dp67"] Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.304808 4919 scope.go:117] "RemoveContainer" containerID="81583105305b81fde3bde968fbbd5463ad1100d8f444b22f62764c4676b95325" Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.338699 4919 scope.go:117] "RemoveContainer" containerID="4dd430eed1c68d77dcea63fe5885eec7aae3fee6a0008bd9e145b93aef04aa26" Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 
22:04:27.361730 4919 scope.go:117] "RemoveContainer" containerID="648a881496440d2732bd49969e863fcde223e95156c280e7699782a5521f1580" Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.378221 4919 scope.go:117] "RemoveContainer" containerID="c1a8f975d3caae031d32429c7fdf67356ff18bae1f0eefa3b99758d6c461a736" Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.400405 4919 scope.go:117] "RemoveContainer" containerID="ad5abb4f4c3ead1b05c1f3efbb6bb59fc71bcf5796e16a558ef7e16375571fb0" Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.420195 4919 scope.go:117] "RemoveContainer" containerID="06e9ce48fc10caa34e8b774ce3039117386ffe276cf268d197ac46518a4baf91" Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.444268 4919 scope.go:117] "RemoveContainer" containerID="a9fa1c50eff4c6bb37ea60631a4278c6a15d6398d6fad6fb1a914bd6cccf15b2" Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.509536 4919 scope.go:117] "RemoveContainer" containerID="9ca4b1de4425e9aa53f9cf24a56759708297b223771278a445dcf171f4dd6493" Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.521471 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a2e7c6fb-9e33-441d-9197-719929eb9e21" path="/var/lib/kubelet/pods/a2e7c6fb-9e33-441d-9197-719929eb9e21/volumes" Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.533440 4919 scope.go:117] "RemoveContainer" containerID="55060f2f273513318e33f5f284462012a35e322decd9cdaeeea0602acc036b23" Mar 10 22:04:27 crc kubenswrapper[4919]: E0310 22:04:27.534419 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"55060f2f273513318e33f5f284462012a35e322decd9cdaeeea0602acc036b23\": container with ID starting with 55060f2f273513318e33f5f284462012a35e322decd9cdaeeea0602acc036b23 not found: ID does not exist" containerID="55060f2f273513318e33f5f284462012a35e322decd9cdaeeea0602acc036b23" Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.534460 4919 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55060f2f273513318e33f5f284462012a35e322decd9cdaeeea0602acc036b23"} err="failed to get container status \"55060f2f273513318e33f5f284462012a35e322decd9cdaeeea0602acc036b23\": rpc error: code = NotFound desc = could not find container \"55060f2f273513318e33f5f284462012a35e322decd9cdaeeea0602acc036b23\": container with ID starting with 55060f2f273513318e33f5f284462012a35e322decd9cdaeeea0602acc036b23 not found: ID does not exist" Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.534487 4919 scope.go:117] "RemoveContainer" containerID="8424a944ffb95aa4e069024df52cf69f2381dc0498735572e7cf94519fe6d880" Mar 10 22:04:27 crc kubenswrapper[4919]: E0310 22:04:27.535417 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8424a944ffb95aa4e069024df52cf69f2381dc0498735572e7cf94519fe6d880\": container with ID starting with 8424a944ffb95aa4e069024df52cf69f2381dc0498735572e7cf94519fe6d880 not found: ID does not exist" containerID="8424a944ffb95aa4e069024df52cf69f2381dc0498735572e7cf94519fe6d880" Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.535449 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8424a944ffb95aa4e069024df52cf69f2381dc0498735572e7cf94519fe6d880"} err="failed to get container status \"8424a944ffb95aa4e069024df52cf69f2381dc0498735572e7cf94519fe6d880\": rpc error: code = NotFound desc = could not find container \"8424a944ffb95aa4e069024df52cf69f2381dc0498735572e7cf94519fe6d880\": container with ID starting with 8424a944ffb95aa4e069024df52cf69f2381dc0498735572e7cf94519fe6d880 not found: ID does not exist" Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.535471 4919 scope.go:117] "RemoveContainer" containerID="81583105305b81fde3bde968fbbd5463ad1100d8f444b22f62764c4676b95325" Mar 10 22:04:27 crc kubenswrapper[4919]: E0310 
22:04:27.535760 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"81583105305b81fde3bde968fbbd5463ad1100d8f444b22f62764c4676b95325\": container with ID starting with 81583105305b81fde3bde968fbbd5463ad1100d8f444b22f62764c4676b95325 not found: ID does not exist" containerID="81583105305b81fde3bde968fbbd5463ad1100d8f444b22f62764c4676b95325" Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.535789 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81583105305b81fde3bde968fbbd5463ad1100d8f444b22f62764c4676b95325"} err="failed to get container status \"81583105305b81fde3bde968fbbd5463ad1100d8f444b22f62764c4676b95325\": rpc error: code = NotFound desc = could not find container \"81583105305b81fde3bde968fbbd5463ad1100d8f444b22f62764c4676b95325\": container with ID starting with 81583105305b81fde3bde968fbbd5463ad1100d8f444b22f62764c4676b95325 not found: ID does not exist" Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.535808 4919 scope.go:117] "RemoveContainer" containerID="4dd430eed1c68d77dcea63fe5885eec7aae3fee6a0008bd9e145b93aef04aa26" Mar 10 22:04:27 crc kubenswrapper[4919]: E0310 22:04:27.536222 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4dd430eed1c68d77dcea63fe5885eec7aae3fee6a0008bd9e145b93aef04aa26\": container with ID starting with 4dd430eed1c68d77dcea63fe5885eec7aae3fee6a0008bd9e145b93aef04aa26 not found: ID does not exist" containerID="4dd430eed1c68d77dcea63fe5885eec7aae3fee6a0008bd9e145b93aef04aa26" Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.536246 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4dd430eed1c68d77dcea63fe5885eec7aae3fee6a0008bd9e145b93aef04aa26"} err="failed to get container status \"4dd430eed1c68d77dcea63fe5885eec7aae3fee6a0008bd9e145b93aef04aa26\": rpc 
error: code = NotFound desc = could not find container \"4dd430eed1c68d77dcea63fe5885eec7aae3fee6a0008bd9e145b93aef04aa26\": container with ID starting with 4dd430eed1c68d77dcea63fe5885eec7aae3fee6a0008bd9e145b93aef04aa26 not found: ID does not exist" Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.536261 4919 scope.go:117] "RemoveContainer" containerID="648a881496440d2732bd49969e863fcde223e95156c280e7699782a5521f1580" Mar 10 22:04:27 crc kubenswrapper[4919]: E0310 22:04:27.536578 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"648a881496440d2732bd49969e863fcde223e95156c280e7699782a5521f1580\": container with ID starting with 648a881496440d2732bd49969e863fcde223e95156c280e7699782a5521f1580 not found: ID does not exist" containerID="648a881496440d2732bd49969e863fcde223e95156c280e7699782a5521f1580" Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.536619 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"648a881496440d2732bd49969e863fcde223e95156c280e7699782a5521f1580"} err="failed to get container status \"648a881496440d2732bd49969e863fcde223e95156c280e7699782a5521f1580\": rpc error: code = NotFound desc = could not find container \"648a881496440d2732bd49969e863fcde223e95156c280e7699782a5521f1580\": container with ID starting with 648a881496440d2732bd49969e863fcde223e95156c280e7699782a5521f1580 not found: ID does not exist" Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.536646 4919 scope.go:117] "RemoveContainer" containerID="c1a8f975d3caae031d32429c7fdf67356ff18bae1f0eefa3b99758d6c461a736" Mar 10 22:04:27 crc kubenswrapper[4919]: E0310 22:04:27.536966 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c1a8f975d3caae031d32429c7fdf67356ff18bae1f0eefa3b99758d6c461a736\": container with ID starting with 
c1a8f975d3caae031d32429c7fdf67356ff18bae1f0eefa3b99758d6c461a736 not found: ID does not exist" containerID="c1a8f975d3caae031d32429c7fdf67356ff18bae1f0eefa3b99758d6c461a736" Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.536991 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1a8f975d3caae031d32429c7fdf67356ff18bae1f0eefa3b99758d6c461a736"} err="failed to get container status \"c1a8f975d3caae031d32429c7fdf67356ff18bae1f0eefa3b99758d6c461a736\": rpc error: code = NotFound desc = could not find container \"c1a8f975d3caae031d32429c7fdf67356ff18bae1f0eefa3b99758d6c461a736\": container with ID starting with c1a8f975d3caae031d32429c7fdf67356ff18bae1f0eefa3b99758d6c461a736 not found: ID does not exist" Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.537009 4919 scope.go:117] "RemoveContainer" containerID="ad5abb4f4c3ead1b05c1f3efbb6bb59fc71bcf5796e16a558ef7e16375571fb0" Mar 10 22:04:27 crc kubenswrapper[4919]: E0310 22:04:27.537320 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad5abb4f4c3ead1b05c1f3efbb6bb59fc71bcf5796e16a558ef7e16375571fb0\": container with ID starting with ad5abb4f4c3ead1b05c1f3efbb6bb59fc71bcf5796e16a558ef7e16375571fb0 not found: ID does not exist" containerID="ad5abb4f4c3ead1b05c1f3efbb6bb59fc71bcf5796e16a558ef7e16375571fb0" Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.537343 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad5abb4f4c3ead1b05c1f3efbb6bb59fc71bcf5796e16a558ef7e16375571fb0"} err="failed to get container status \"ad5abb4f4c3ead1b05c1f3efbb6bb59fc71bcf5796e16a558ef7e16375571fb0\": rpc error: code = NotFound desc = could not find container \"ad5abb4f4c3ead1b05c1f3efbb6bb59fc71bcf5796e16a558ef7e16375571fb0\": container with ID starting with ad5abb4f4c3ead1b05c1f3efbb6bb59fc71bcf5796e16a558ef7e16375571fb0 not found: ID does not 
exist" Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.537356 4919 scope.go:117] "RemoveContainer" containerID="06e9ce48fc10caa34e8b774ce3039117386ffe276cf268d197ac46518a4baf91" Mar 10 22:04:27 crc kubenswrapper[4919]: E0310 22:04:27.537748 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"06e9ce48fc10caa34e8b774ce3039117386ffe276cf268d197ac46518a4baf91\": container with ID starting with 06e9ce48fc10caa34e8b774ce3039117386ffe276cf268d197ac46518a4baf91 not found: ID does not exist" containerID="06e9ce48fc10caa34e8b774ce3039117386ffe276cf268d197ac46518a4baf91" Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.537769 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06e9ce48fc10caa34e8b774ce3039117386ffe276cf268d197ac46518a4baf91"} err="failed to get container status \"06e9ce48fc10caa34e8b774ce3039117386ffe276cf268d197ac46518a4baf91\": rpc error: code = NotFound desc = could not find container \"06e9ce48fc10caa34e8b774ce3039117386ffe276cf268d197ac46518a4baf91\": container with ID starting with 06e9ce48fc10caa34e8b774ce3039117386ffe276cf268d197ac46518a4baf91 not found: ID does not exist" Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.537782 4919 scope.go:117] "RemoveContainer" containerID="a9fa1c50eff4c6bb37ea60631a4278c6a15d6398d6fad6fb1a914bd6cccf15b2" Mar 10 22:04:27 crc kubenswrapper[4919]: E0310 22:04:27.538021 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a9fa1c50eff4c6bb37ea60631a4278c6a15d6398d6fad6fb1a914bd6cccf15b2\": container with ID starting with a9fa1c50eff4c6bb37ea60631a4278c6a15d6398d6fad6fb1a914bd6cccf15b2 not found: ID does not exist" containerID="a9fa1c50eff4c6bb37ea60631a4278c6a15d6398d6fad6fb1a914bd6cccf15b2" Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.538045 4919 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9fa1c50eff4c6bb37ea60631a4278c6a15d6398d6fad6fb1a914bd6cccf15b2"} err="failed to get container status \"a9fa1c50eff4c6bb37ea60631a4278c6a15d6398d6fad6fb1a914bd6cccf15b2\": rpc error: code = NotFound desc = could not find container \"a9fa1c50eff4c6bb37ea60631a4278c6a15d6398d6fad6fb1a914bd6cccf15b2\": container with ID starting with a9fa1c50eff4c6bb37ea60631a4278c6a15d6398d6fad6fb1a914bd6cccf15b2 not found: ID does not exist" Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.538060 4919 scope.go:117] "RemoveContainer" containerID="9ca4b1de4425e9aa53f9cf24a56759708297b223771278a445dcf171f4dd6493" Mar 10 22:04:27 crc kubenswrapper[4919]: E0310 22:04:27.538346 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ca4b1de4425e9aa53f9cf24a56759708297b223771278a445dcf171f4dd6493\": container with ID starting with 9ca4b1de4425e9aa53f9cf24a56759708297b223771278a445dcf171f4dd6493 not found: ID does not exist" containerID="9ca4b1de4425e9aa53f9cf24a56759708297b223771278a445dcf171f4dd6493" Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.538371 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ca4b1de4425e9aa53f9cf24a56759708297b223771278a445dcf171f4dd6493"} err="failed to get container status \"9ca4b1de4425e9aa53f9cf24a56759708297b223771278a445dcf171f4dd6493\": rpc error: code = NotFound desc = could not find container \"9ca4b1de4425e9aa53f9cf24a56759708297b223771278a445dcf171f4dd6493\": container with ID starting with 9ca4b1de4425e9aa53f9cf24a56759708297b223771278a445dcf171f4dd6493 not found: ID does not exist" Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.538385 4919 scope.go:117] "RemoveContainer" containerID="55060f2f273513318e33f5f284462012a35e322decd9cdaeeea0602acc036b23" Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.538637 4919 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55060f2f273513318e33f5f284462012a35e322decd9cdaeeea0602acc036b23"} err="failed to get container status \"55060f2f273513318e33f5f284462012a35e322decd9cdaeeea0602acc036b23\": rpc error: code = NotFound desc = could not find container \"55060f2f273513318e33f5f284462012a35e322decd9cdaeeea0602acc036b23\": container with ID starting with 55060f2f273513318e33f5f284462012a35e322decd9cdaeeea0602acc036b23 not found: ID does not exist" Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.538658 4919 scope.go:117] "RemoveContainer" containerID="8424a944ffb95aa4e069024df52cf69f2381dc0498735572e7cf94519fe6d880" Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.539934 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8424a944ffb95aa4e069024df52cf69f2381dc0498735572e7cf94519fe6d880"} err="failed to get container status \"8424a944ffb95aa4e069024df52cf69f2381dc0498735572e7cf94519fe6d880\": rpc error: code = NotFound desc = could not find container \"8424a944ffb95aa4e069024df52cf69f2381dc0498735572e7cf94519fe6d880\": container with ID starting with 8424a944ffb95aa4e069024df52cf69f2381dc0498735572e7cf94519fe6d880 not found: ID does not exist" Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.539953 4919 scope.go:117] "RemoveContainer" containerID="81583105305b81fde3bde968fbbd5463ad1100d8f444b22f62764c4676b95325" Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.540210 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81583105305b81fde3bde968fbbd5463ad1100d8f444b22f62764c4676b95325"} err="failed to get container status \"81583105305b81fde3bde968fbbd5463ad1100d8f444b22f62764c4676b95325\": rpc error: code = NotFound desc = could not find container \"81583105305b81fde3bde968fbbd5463ad1100d8f444b22f62764c4676b95325\": container with ID starting with 
81583105305b81fde3bde968fbbd5463ad1100d8f444b22f62764c4676b95325 not found: ID does not exist" Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.540240 4919 scope.go:117] "RemoveContainer" containerID="4dd430eed1c68d77dcea63fe5885eec7aae3fee6a0008bd9e145b93aef04aa26" Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.540545 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4dd430eed1c68d77dcea63fe5885eec7aae3fee6a0008bd9e145b93aef04aa26"} err="failed to get container status \"4dd430eed1c68d77dcea63fe5885eec7aae3fee6a0008bd9e145b93aef04aa26\": rpc error: code = NotFound desc = could not find container \"4dd430eed1c68d77dcea63fe5885eec7aae3fee6a0008bd9e145b93aef04aa26\": container with ID starting with 4dd430eed1c68d77dcea63fe5885eec7aae3fee6a0008bd9e145b93aef04aa26 not found: ID does not exist" Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.540572 4919 scope.go:117] "RemoveContainer" containerID="648a881496440d2732bd49969e863fcde223e95156c280e7699782a5521f1580" Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.540828 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"648a881496440d2732bd49969e863fcde223e95156c280e7699782a5521f1580"} err="failed to get container status \"648a881496440d2732bd49969e863fcde223e95156c280e7699782a5521f1580\": rpc error: code = NotFound desc = could not find container \"648a881496440d2732bd49969e863fcde223e95156c280e7699782a5521f1580\": container with ID starting with 648a881496440d2732bd49969e863fcde223e95156c280e7699782a5521f1580 not found: ID does not exist" Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.540847 4919 scope.go:117] "RemoveContainer" containerID="c1a8f975d3caae031d32429c7fdf67356ff18bae1f0eefa3b99758d6c461a736" Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.541087 4919 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"c1a8f975d3caae031d32429c7fdf67356ff18bae1f0eefa3b99758d6c461a736"} err="failed to get container status \"c1a8f975d3caae031d32429c7fdf67356ff18bae1f0eefa3b99758d6c461a736\": rpc error: code = NotFound desc = could not find container \"c1a8f975d3caae031d32429c7fdf67356ff18bae1f0eefa3b99758d6c461a736\": container with ID starting with c1a8f975d3caae031d32429c7fdf67356ff18bae1f0eefa3b99758d6c461a736 not found: ID does not exist" Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.541109 4919 scope.go:117] "RemoveContainer" containerID="ad5abb4f4c3ead1b05c1f3efbb6bb59fc71bcf5796e16a558ef7e16375571fb0" Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.541442 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad5abb4f4c3ead1b05c1f3efbb6bb59fc71bcf5796e16a558ef7e16375571fb0"} err="failed to get container status \"ad5abb4f4c3ead1b05c1f3efbb6bb59fc71bcf5796e16a558ef7e16375571fb0\": rpc error: code = NotFound desc = could not find container \"ad5abb4f4c3ead1b05c1f3efbb6bb59fc71bcf5796e16a558ef7e16375571fb0\": container with ID starting with ad5abb4f4c3ead1b05c1f3efbb6bb59fc71bcf5796e16a558ef7e16375571fb0 not found: ID does not exist" Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.541464 4919 scope.go:117] "RemoveContainer" containerID="06e9ce48fc10caa34e8b774ce3039117386ffe276cf268d197ac46518a4baf91" Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.541914 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06e9ce48fc10caa34e8b774ce3039117386ffe276cf268d197ac46518a4baf91"} err="failed to get container status \"06e9ce48fc10caa34e8b774ce3039117386ffe276cf268d197ac46518a4baf91\": rpc error: code = NotFound desc = could not find container \"06e9ce48fc10caa34e8b774ce3039117386ffe276cf268d197ac46518a4baf91\": container with ID starting with 06e9ce48fc10caa34e8b774ce3039117386ffe276cf268d197ac46518a4baf91 not found: ID does not 
exist" Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.541945 4919 scope.go:117] "RemoveContainer" containerID="a9fa1c50eff4c6bb37ea60631a4278c6a15d6398d6fad6fb1a914bd6cccf15b2" Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.542291 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9fa1c50eff4c6bb37ea60631a4278c6a15d6398d6fad6fb1a914bd6cccf15b2"} err="failed to get container status \"a9fa1c50eff4c6bb37ea60631a4278c6a15d6398d6fad6fb1a914bd6cccf15b2\": rpc error: code = NotFound desc = could not find container \"a9fa1c50eff4c6bb37ea60631a4278c6a15d6398d6fad6fb1a914bd6cccf15b2\": container with ID starting with a9fa1c50eff4c6bb37ea60631a4278c6a15d6398d6fad6fb1a914bd6cccf15b2 not found: ID does not exist" Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.542309 4919 scope.go:117] "RemoveContainer" containerID="9ca4b1de4425e9aa53f9cf24a56759708297b223771278a445dcf171f4dd6493" Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.542707 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ca4b1de4425e9aa53f9cf24a56759708297b223771278a445dcf171f4dd6493"} err="failed to get container status \"9ca4b1de4425e9aa53f9cf24a56759708297b223771278a445dcf171f4dd6493\": rpc error: code = NotFound desc = could not find container \"9ca4b1de4425e9aa53f9cf24a56759708297b223771278a445dcf171f4dd6493\": container with ID starting with 9ca4b1de4425e9aa53f9cf24a56759708297b223771278a445dcf171f4dd6493 not found: ID does not exist" Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.542729 4919 scope.go:117] "RemoveContainer" containerID="55060f2f273513318e33f5f284462012a35e322decd9cdaeeea0602acc036b23" Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.543139 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55060f2f273513318e33f5f284462012a35e322decd9cdaeeea0602acc036b23"} err="failed to get container status 
\"55060f2f273513318e33f5f284462012a35e322decd9cdaeeea0602acc036b23\": rpc error: code = NotFound desc = could not find container \"55060f2f273513318e33f5f284462012a35e322decd9cdaeeea0602acc036b23\": container with ID starting with 55060f2f273513318e33f5f284462012a35e322decd9cdaeeea0602acc036b23 not found: ID does not exist" Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.543192 4919 scope.go:117] "RemoveContainer" containerID="8424a944ffb95aa4e069024df52cf69f2381dc0498735572e7cf94519fe6d880" Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.543596 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8424a944ffb95aa4e069024df52cf69f2381dc0498735572e7cf94519fe6d880"} err="failed to get container status \"8424a944ffb95aa4e069024df52cf69f2381dc0498735572e7cf94519fe6d880\": rpc error: code = NotFound desc = could not find container \"8424a944ffb95aa4e069024df52cf69f2381dc0498735572e7cf94519fe6d880\": container with ID starting with 8424a944ffb95aa4e069024df52cf69f2381dc0498735572e7cf94519fe6d880 not found: ID does not exist" Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.543628 4919 scope.go:117] "RemoveContainer" containerID="81583105305b81fde3bde968fbbd5463ad1100d8f444b22f62764c4676b95325" Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.543963 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81583105305b81fde3bde968fbbd5463ad1100d8f444b22f62764c4676b95325"} err="failed to get container status \"81583105305b81fde3bde968fbbd5463ad1100d8f444b22f62764c4676b95325\": rpc error: code = NotFound desc = could not find container \"81583105305b81fde3bde968fbbd5463ad1100d8f444b22f62764c4676b95325\": container with ID starting with 81583105305b81fde3bde968fbbd5463ad1100d8f444b22f62764c4676b95325 not found: ID does not exist" Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.543984 4919 scope.go:117] "RemoveContainer" 
containerID="4dd430eed1c68d77dcea63fe5885eec7aae3fee6a0008bd9e145b93aef04aa26" Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.544337 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4dd430eed1c68d77dcea63fe5885eec7aae3fee6a0008bd9e145b93aef04aa26"} err="failed to get container status \"4dd430eed1c68d77dcea63fe5885eec7aae3fee6a0008bd9e145b93aef04aa26\": rpc error: code = NotFound desc = could not find container \"4dd430eed1c68d77dcea63fe5885eec7aae3fee6a0008bd9e145b93aef04aa26\": container with ID starting with 4dd430eed1c68d77dcea63fe5885eec7aae3fee6a0008bd9e145b93aef04aa26 not found: ID does not exist" Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.544366 4919 scope.go:117] "RemoveContainer" containerID="648a881496440d2732bd49969e863fcde223e95156c280e7699782a5521f1580" Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.544718 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"648a881496440d2732bd49969e863fcde223e95156c280e7699782a5521f1580"} err="failed to get container status \"648a881496440d2732bd49969e863fcde223e95156c280e7699782a5521f1580\": rpc error: code = NotFound desc = could not find container \"648a881496440d2732bd49969e863fcde223e95156c280e7699782a5521f1580\": container with ID starting with 648a881496440d2732bd49969e863fcde223e95156c280e7699782a5521f1580 not found: ID does not exist" Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.544739 4919 scope.go:117] "RemoveContainer" containerID="c1a8f975d3caae031d32429c7fdf67356ff18bae1f0eefa3b99758d6c461a736" Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.544996 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1a8f975d3caae031d32429c7fdf67356ff18bae1f0eefa3b99758d6c461a736"} err="failed to get container status \"c1a8f975d3caae031d32429c7fdf67356ff18bae1f0eefa3b99758d6c461a736\": rpc error: code = NotFound desc = could 
not find container \"c1a8f975d3caae031d32429c7fdf67356ff18bae1f0eefa3b99758d6c461a736\": container with ID starting with c1a8f975d3caae031d32429c7fdf67356ff18bae1f0eefa3b99758d6c461a736 not found: ID does not exist" Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.545024 4919 scope.go:117] "RemoveContainer" containerID="ad5abb4f4c3ead1b05c1f3efbb6bb59fc71bcf5796e16a558ef7e16375571fb0" Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.546515 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad5abb4f4c3ead1b05c1f3efbb6bb59fc71bcf5796e16a558ef7e16375571fb0"} err="failed to get container status \"ad5abb4f4c3ead1b05c1f3efbb6bb59fc71bcf5796e16a558ef7e16375571fb0\": rpc error: code = NotFound desc = could not find container \"ad5abb4f4c3ead1b05c1f3efbb6bb59fc71bcf5796e16a558ef7e16375571fb0\": container with ID starting with ad5abb4f4c3ead1b05c1f3efbb6bb59fc71bcf5796e16a558ef7e16375571fb0 not found: ID does not exist" Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.546546 4919 scope.go:117] "RemoveContainer" containerID="06e9ce48fc10caa34e8b774ce3039117386ffe276cf268d197ac46518a4baf91" Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.546908 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06e9ce48fc10caa34e8b774ce3039117386ffe276cf268d197ac46518a4baf91"} err="failed to get container status \"06e9ce48fc10caa34e8b774ce3039117386ffe276cf268d197ac46518a4baf91\": rpc error: code = NotFound desc = could not find container \"06e9ce48fc10caa34e8b774ce3039117386ffe276cf268d197ac46518a4baf91\": container with ID starting with 06e9ce48fc10caa34e8b774ce3039117386ffe276cf268d197ac46518a4baf91 not found: ID does not exist" Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.546933 4919 scope.go:117] "RemoveContainer" containerID="a9fa1c50eff4c6bb37ea60631a4278c6a15d6398d6fad6fb1a914bd6cccf15b2" Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 
22:04:27.547201 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9fa1c50eff4c6bb37ea60631a4278c6a15d6398d6fad6fb1a914bd6cccf15b2"} err="failed to get container status \"a9fa1c50eff4c6bb37ea60631a4278c6a15d6398d6fad6fb1a914bd6cccf15b2\": rpc error: code = NotFound desc = could not find container \"a9fa1c50eff4c6bb37ea60631a4278c6a15d6398d6fad6fb1a914bd6cccf15b2\": container with ID starting with a9fa1c50eff4c6bb37ea60631a4278c6a15d6398d6fad6fb1a914bd6cccf15b2 not found: ID does not exist" Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.547234 4919 scope.go:117] "RemoveContainer" containerID="9ca4b1de4425e9aa53f9cf24a56759708297b223771278a445dcf171f4dd6493" Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.547609 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ca4b1de4425e9aa53f9cf24a56759708297b223771278a445dcf171f4dd6493"} err="failed to get container status \"9ca4b1de4425e9aa53f9cf24a56759708297b223771278a445dcf171f4dd6493\": rpc error: code = NotFound desc = could not find container \"9ca4b1de4425e9aa53f9cf24a56759708297b223771278a445dcf171f4dd6493\": container with ID starting with 9ca4b1de4425e9aa53f9cf24a56759708297b223771278a445dcf171f4dd6493 not found: ID does not exist" Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.547627 4919 scope.go:117] "RemoveContainer" containerID="55060f2f273513318e33f5f284462012a35e322decd9cdaeeea0602acc036b23" Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.547878 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55060f2f273513318e33f5f284462012a35e322decd9cdaeeea0602acc036b23"} err="failed to get container status \"55060f2f273513318e33f5f284462012a35e322decd9cdaeeea0602acc036b23\": rpc error: code = NotFound desc = could not find container \"55060f2f273513318e33f5f284462012a35e322decd9cdaeeea0602acc036b23\": container with ID starting with 
55060f2f273513318e33f5f284462012a35e322decd9cdaeeea0602acc036b23 not found: ID does not exist" Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.547902 4919 scope.go:117] "RemoveContainer" containerID="8424a944ffb95aa4e069024df52cf69f2381dc0498735572e7cf94519fe6d880" Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.548132 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8424a944ffb95aa4e069024df52cf69f2381dc0498735572e7cf94519fe6d880"} err="failed to get container status \"8424a944ffb95aa4e069024df52cf69f2381dc0498735572e7cf94519fe6d880\": rpc error: code = NotFound desc = could not find container \"8424a944ffb95aa4e069024df52cf69f2381dc0498735572e7cf94519fe6d880\": container with ID starting with 8424a944ffb95aa4e069024df52cf69f2381dc0498735572e7cf94519fe6d880 not found: ID does not exist" Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.548152 4919 scope.go:117] "RemoveContainer" containerID="81583105305b81fde3bde968fbbd5463ad1100d8f444b22f62764c4676b95325" Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.548480 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81583105305b81fde3bde968fbbd5463ad1100d8f444b22f62764c4676b95325"} err="failed to get container status \"81583105305b81fde3bde968fbbd5463ad1100d8f444b22f62764c4676b95325\": rpc error: code = NotFound desc = could not find container \"81583105305b81fde3bde968fbbd5463ad1100d8f444b22f62764c4676b95325\": container with ID starting with 81583105305b81fde3bde968fbbd5463ad1100d8f444b22f62764c4676b95325 not found: ID does not exist" Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.548497 4919 scope.go:117] "RemoveContainer" containerID="4dd430eed1c68d77dcea63fe5885eec7aae3fee6a0008bd9e145b93aef04aa26" Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.548826 4919 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"4dd430eed1c68d77dcea63fe5885eec7aae3fee6a0008bd9e145b93aef04aa26"} err="failed to get container status \"4dd430eed1c68d77dcea63fe5885eec7aae3fee6a0008bd9e145b93aef04aa26\": rpc error: code = NotFound desc = could not find container \"4dd430eed1c68d77dcea63fe5885eec7aae3fee6a0008bd9e145b93aef04aa26\": container with ID starting with 4dd430eed1c68d77dcea63fe5885eec7aae3fee6a0008bd9e145b93aef04aa26 not found: ID does not exist" Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.548844 4919 scope.go:117] "RemoveContainer" containerID="648a881496440d2732bd49969e863fcde223e95156c280e7699782a5521f1580" Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.549080 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"648a881496440d2732bd49969e863fcde223e95156c280e7699782a5521f1580"} err="failed to get container status \"648a881496440d2732bd49969e863fcde223e95156c280e7699782a5521f1580\": rpc error: code = NotFound desc = could not find container \"648a881496440d2732bd49969e863fcde223e95156c280e7699782a5521f1580\": container with ID starting with 648a881496440d2732bd49969e863fcde223e95156c280e7699782a5521f1580 not found: ID does not exist" Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.549111 4919 scope.go:117] "RemoveContainer" containerID="c1a8f975d3caae031d32429c7fdf67356ff18bae1f0eefa3b99758d6c461a736" Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.549334 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1a8f975d3caae031d32429c7fdf67356ff18bae1f0eefa3b99758d6c461a736"} err="failed to get container status \"c1a8f975d3caae031d32429c7fdf67356ff18bae1f0eefa3b99758d6c461a736\": rpc error: code = NotFound desc = could not find container \"c1a8f975d3caae031d32429c7fdf67356ff18bae1f0eefa3b99758d6c461a736\": container with ID starting with c1a8f975d3caae031d32429c7fdf67356ff18bae1f0eefa3b99758d6c461a736 not found: ID does not 
exist" Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.549353 4919 scope.go:117] "RemoveContainer" containerID="ad5abb4f4c3ead1b05c1f3efbb6bb59fc71bcf5796e16a558ef7e16375571fb0" Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.549604 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad5abb4f4c3ead1b05c1f3efbb6bb59fc71bcf5796e16a558ef7e16375571fb0"} err="failed to get container status \"ad5abb4f4c3ead1b05c1f3efbb6bb59fc71bcf5796e16a558ef7e16375571fb0\": rpc error: code = NotFound desc = could not find container \"ad5abb4f4c3ead1b05c1f3efbb6bb59fc71bcf5796e16a558ef7e16375571fb0\": container with ID starting with ad5abb4f4c3ead1b05c1f3efbb6bb59fc71bcf5796e16a558ef7e16375571fb0 not found: ID does not exist" Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.549628 4919 scope.go:117] "RemoveContainer" containerID="06e9ce48fc10caa34e8b774ce3039117386ffe276cf268d197ac46518a4baf91" Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.549835 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06e9ce48fc10caa34e8b774ce3039117386ffe276cf268d197ac46518a4baf91"} err="failed to get container status \"06e9ce48fc10caa34e8b774ce3039117386ffe276cf268d197ac46518a4baf91\": rpc error: code = NotFound desc = could not find container \"06e9ce48fc10caa34e8b774ce3039117386ffe276cf268d197ac46518a4baf91\": container with ID starting with 06e9ce48fc10caa34e8b774ce3039117386ffe276cf268d197ac46518a4baf91 not found: ID does not exist" Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.549859 4919 scope.go:117] "RemoveContainer" containerID="a9fa1c50eff4c6bb37ea60631a4278c6a15d6398d6fad6fb1a914bd6cccf15b2" Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.550057 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9fa1c50eff4c6bb37ea60631a4278c6a15d6398d6fad6fb1a914bd6cccf15b2"} err="failed to get container status 
\"a9fa1c50eff4c6bb37ea60631a4278c6a15d6398d6fad6fb1a914bd6cccf15b2\": rpc error: code = NotFound desc = could not find container \"a9fa1c50eff4c6bb37ea60631a4278c6a15d6398d6fad6fb1a914bd6cccf15b2\": container with ID starting with a9fa1c50eff4c6bb37ea60631a4278c6a15d6398d6fad6fb1a914bd6cccf15b2 not found: ID does not exist" Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.550082 4919 scope.go:117] "RemoveContainer" containerID="9ca4b1de4425e9aa53f9cf24a56759708297b223771278a445dcf171f4dd6493" Mar 10 22:04:27 crc kubenswrapper[4919]: I0310 22:04:27.550276 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ca4b1de4425e9aa53f9cf24a56759708297b223771278a445dcf171f4dd6493"} err="failed to get container status \"9ca4b1de4425e9aa53f9cf24a56759708297b223771278a445dcf171f4dd6493\": rpc error: code = NotFound desc = could not find container \"9ca4b1de4425e9aa53f9cf24a56759708297b223771278a445dcf171f4dd6493\": container with ID starting with 9ca4b1de4425e9aa53f9cf24a56759708297b223771278a445dcf171f4dd6493 not found: ID does not exist" Mar 10 22:04:28 crc kubenswrapper[4919]: I0310 22:04:28.248174 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-hbw8v_6a5db7c3-2a96-4030-8c88-5d82d325b62d/kube-multus/2.log" Mar 10 22:04:28 crc kubenswrapper[4919]: I0310 22:04:28.248653 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-hbw8v" event={"ID":"6a5db7c3-2a96-4030-8c88-5d82d325b62d","Type":"ContainerStarted","Data":"8fd64a04e76f7c5b7533250a6d82f9b738338bc1c58a06281073acf14a3f775b"} Mar 10 22:04:28 crc kubenswrapper[4919]: I0310 22:04:28.256925 4919 generic.go:334] "Generic (PLEG): container finished" podID="5e58dc1e-ca00-426e-ba2c-42b5f8184915" containerID="003568cec3cf1c8d66c6dbd27b1b300a11ba289c72e8d26880674b04ea44ca9f" exitCode=0 Mar 10 22:04:28 crc kubenswrapper[4919]: I0310 22:04:28.257006 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-bf975" event={"ID":"5e58dc1e-ca00-426e-ba2c-42b5f8184915","Type":"ContainerDied","Data":"003568cec3cf1c8d66c6dbd27b1b300a11ba289c72e8d26880674b04ea44ca9f"} Mar 10 22:04:28 crc kubenswrapper[4919]: I0310 22:04:28.257054 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bf975" event={"ID":"5e58dc1e-ca00-426e-ba2c-42b5f8184915","Type":"ContainerStarted","Data":"ee729cb2d3ebb57e46df0f9bb8a6cb5a49b93d7ac62ccbd556bb8bc78de703fe"} Mar 10 22:04:29 crc kubenswrapper[4919]: I0310 22:04:29.266511 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bf975" event={"ID":"5e58dc1e-ca00-426e-ba2c-42b5f8184915","Type":"ContainerStarted","Data":"bbc70f7ef1771ad3f2e95217ec30bf66a62b85183328fd41f0c4dc1d7f3b602a"} Mar 10 22:04:29 crc kubenswrapper[4919]: I0310 22:04:29.267071 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bf975" event={"ID":"5e58dc1e-ca00-426e-ba2c-42b5f8184915","Type":"ContainerStarted","Data":"82d9308f16c06577cd10999541b812be60581f6c97a5b691f3f07fceb161c0e9"} Mar 10 22:04:29 crc kubenswrapper[4919]: I0310 22:04:29.267084 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bf975" event={"ID":"5e58dc1e-ca00-426e-ba2c-42b5f8184915","Type":"ContainerStarted","Data":"f136166289a42ea376027f895bca1a8d47b2147657a465a1a261571f6e07e1ee"} Mar 10 22:04:29 crc kubenswrapper[4919]: I0310 22:04:29.267095 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bf975" event={"ID":"5e58dc1e-ca00-426e-ba2c-42b5f8184915","Type":"ContainerStarted","Data":"8e8f30112d04c5523f667f613bc0310f191dae6e21d4ad4eaae2a7fceb6b0ea1"} Mar 10 22:04:29 crc kubenswrapper[4919]: I0310 22:04:29.267105 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bf975" 
event={"ID":"5e58dc1e-ca00-426e-ba2c-42b5f8184915","Type":"ContainerStarted","Data":"382423cb5cb383661cf5ce13826b82571b29d645a2d38953fe70e5d11404598f"} Mar 10 22:04:29 crc kubenswrapper[4919]: I0310 22:04:29.267116 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bf975" event={"ID":"5e58dc1e-ca00-426e-ba2c-42b5f8184915","Type":"ContainerStarted","Data":"fc9de2f394a83b742fa011d67e3aee65cd3b216ed3c45c029c8d5883e98b31c4"} Mar 10 22:04:31 crc kubenswrapper[4919]: I0310 22:04:31.293860 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bf975" event={"ID":"5e58dc1e-ca00-426e-ba2c-42b5f8184915","Type":"ContainerStarted","Data":"6643f3b2d1666a81e91844eece95ec3232ed909486c96a693a421061a005d235"} Mar 10 22:04:33 crc kubenswrapper[4919]: I0310 22:04:33.446892 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-k4lxh"] Mar 10 22:04:33 crc kubenswrapper[4919]: I0310 22:04:33.448814 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-k4lxh"
Mar 10 22:04:33 crc kubenswrapper[4919]: I0310 22:04:33.451351 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt"
Mar 10 22:04:33 crc kubenswrapper[4919]: I0310 22:04:33.451601 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt"
Mar 10 22:04:33 crc kubenswrapper[4919]: I0310 22:04:33.451862 4919 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-st8sh"
Mar 10 22:04:33 crc kubenswrapper[4919]: I0310 22:04:33.452192 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage"
Mar 10 22:04:33 crc kubenswrapper[4919]: I0310 22:04:33.576366 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l7qxn\" (UniqueName: \"kubernetes.io/projected/31e35b2a-3e5e-44a4-a9e4-fbe83610e813-kube-api-access-l7qxn\") pod \"crc-storage-crc-k4lxh\" (UID: \"31e35b2a-3e5e-44a4-a9e4-fbe83610e813\") " pod="crc-storage/crc-storage-crc-k4lxh"
Mar 10 22:04:33 crc kubenswrapper[4919]: I0310 22:04:33.576447 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/31e35b2a-3e5e-44a4-a9e4-fbe83610e813-node-mnt\") pod \"crc-storage-crc-k4lxh\" (UID: \"31e35b2a-3e5e-44a4-a9e4-fbe83610e813\") " pod="crc-storage/crc-storage-crc-k4lxh"
Mar 10 22:04:33 crc kubenswrapper[4919]: I0310 22:04:33.576655 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/31e35b2a-3e5e-44a4-a9e4-fbe83610e813-crc-storage\") pod \"crc-storage-crc-k4lxh\" (UID: \"31e35b2a-3e5e-44a4-a9e4-fbe83610e813\") " pod="crc-storage/crc-storage-crc-k4lxh"
Mar 10 22:04:33 crc kubenswrapper[4919]: I0310 22:04:33.678047 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l7qxn\" (UniqueName: \"kubernetes.io/projected/31e35b2a-3e5e-44a4-a9e4-fbe83610e813-kube-api-access-l7qxn\") pod \"crc-storage-crc-k4lxh\" (UID: \"31e35b2a-3e5e-44a4-a9e4-fbe83610e813\") " pod="crc-storage/crc-storage-crc-k4lxh"
Mar 10 22:04:33 crc kubenswrapper[4919]: I0310 22:04:33.678339 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/31e35b2a-3e5e-44a4-a9e4-fbe83610e813-node-mnt\") pod \"crc-storage-crc-k4lxh\" (UID: \"31e35b2a-3e5e-44a4-a9e4-fbe83610e813\") " pod="crc-storage/crc-storage-crc-k4lxh"
Mar 10 22:04:33 crc kubenswrapper[4919]: I0310 22:04:33.678470 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/31e35b2a-3e5e-44a4-a9e4-fbe83610e813-crc-storage\") pod \"crc-storage-crc-k4lxh\" (UID: \"31e35b2a-3e5e-44a4-a9e4-fbe83610e813\") " pod="crc-storage/crc-storage-crc-k4lxh"
Mar 10 22:04:33 crc kubenswrapper[4919]: I0310 22:04:33.678804 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/31e35b2a-3e5e-44a4-a9e4-fbe83610e813-node-mnt\") pod \"crc-storage-crc-k4lxh\" (UID: \"31e35b2a-3e5e-44a4-a9e4-fbe83610e813\") " pod="crc-storage/crc-storage-crc-k4lxh"
Mar 10 22:04:33 crc kubenswrapper[4919]: I0310 22:04:33.679369 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/31e35b2a-3e5e-44a4-a9e4-fbe83610e813-crc-storage\") pod \"crc-storage-crc-k4lxh\" (UID: \"31e35b2a-3e5e-44a4-a9e4-fbe83610e813\") " pod="crc-storage/crc-storage-crc-k4lxh"
Mar 10 22:04:33 crc kubenswrapper[4919]: I0310 22:04:33.702206 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l7qxn\" (UniqueName: \"kubernetes.io/projected/31e35b2a-3e5e-44a4-a9e4-fbe83610e813-kube-api-access-l7qxn\") pod \"crc-storage-crc-k4lxh\" (UID: \"31e35b2a-3e5e-44a4-a9e4-fbe83610e813\") " pod="crc-storage/crc-storage-crc-k4lxh"
Mar 10 22:04:33 crc kubenswrapper[4919]: I0310 22:04:33.775666 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-k4lxh"
Mar 10 22:04:33 crc kubenswrapper[4919]: E0310 22:04:33.808675 4919 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-k4lxh_crc-storage_31e35b2a-3e5e-44a4-a9e4-fbe83610e813_0(b432c6d23b5a01a18a2e486873c32f4e1d57d01613f30031bd7806b8812bfd2f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 10 22:04:33 crc kubenswrapper[4919]: E0310 22:04:33.808745 4919 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-k4lxh_crc-storage_31e35b2a-3e5e-44a4-a9e4-fbe83610e813_0(b432c6d23b5a01a18a2e486873c32f4e1d57d01613f30031bd7806b8812bfd2f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-k4lxh"
Mar 10 22:04:33 crc kubenswrapper[4919]: E0310 22:04:33.808770 4919 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-k4lxh_crc-storage_31e35b2a-3e5e-44a4-a9e4-fbe83610e813_0(b432c6d23b5a01a18a2e486873c32f4e1d57d01613f30031bd7806b8812bfd2f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-k4lxh"
Mar 10 22:04:33 crc kubenswrapper[4919]: E0310 22:04:33.808828 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"crc-storage-crc-k4lxh_crc-storage(31e35b2a-3e5e-44a4-a9e4-fbe83610e813)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"crc-storage-crc-k4lxh_crc-storage(31e35b2a-3e5e-44a4-a9e4-fbe83610e813)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-k4lxh_crc-storage_31e35b2a-3e5e-44a4-a9e4-fbe83610e813_0(b432c6d23b5a01a18a2e486873c32f4e1d57d01613f30031bd7806b8812bfd2f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="crc-storage/crc-storage-crc-k4lxh" podUID="31e35b2a-3e5e-44a4-a9e4-fbe83610e813"
Mar 10 22:04:34 crc kubenswrapper[4919]: I0310 22:04:34.318182 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bf975" event={"ID":"5e58dc1e-ca00-426e-ba2c-42b5f8184915","Type":"ContainerStarted","Data":"64b19c3cfc7137f5fa908a0246efb3a6fed79822f231998462be790d0c14f68c"}
Mar 10 22:04:34 crc kubenswrapper[4919]: I0310 22:04:34.318707 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-bf975"
Mar 10 22:04:34 crc kubenswrapper[4919]: I0310 22:04:34.318744 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-bf975"
Mar 10 22:04:34 crc kubenswrapper[4919]: I0310 22:04:34.346546 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-bf975" podStartSLOduration=8.346525246 podStartE2EDuration="8.346525246s" podCreationTimestamp="2026-03-10 22:04:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 22:04:34.342823907 +0000 UTC m=+861.584704545" watchObservedRunningTime="2026-03-10 22:04:34.346525246 +0000 UTC m=+861.588405864"
Mar 10 22:04:34 crc kubenswrapper[4919]: I0310 22:04:34.351420 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-bf975"
Mar 10 22:04:34 crc kubenswrapper[4919]: I0310 22:04:34.667804 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-k4lxh"]
Mar 10 22:04:34 crc kubenswrapper[4919]: I0310 22:04:34.667936 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-k4lxh"
Mar 10 22:04:34 crc kubenswrapper[4919]: I0310 22:04:34.668313 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-k4lxh"
Mar 10 22:04:34 crc kubenswrapper[4919]: E0310 22:04:34.694888 4919 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-k4lxh_crc-storage_31e35b2a-3e5e-44a4-a9e4-fbe83610e813_0(1c3cbcec17c8e05a72ad5290b7201dd9efbb5233a7e277cfb21bd160517039e1): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 10 22:04:34 crc kubenswrapper[4919]: E0310 22:04:34.694959 4919 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-k4lxh_crc-storage_31e35b2a-3e5e-44a4-a9e4-fbe83610e813_0(1c3cbcec17c8e05a72ad5290b7201dd9efbb5233a7e277cfb21bd160517039e1): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-k4lxh"
Mar 10 22:04:34 crc kubenswrapper[4919]: E0310 22:04:34.694985 4919 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-k4lxh_crc-storage_31e35b2a-3e5e-44a4-a9e4-fbe83610e813_0(1c3cbcec17c8e05a72ad5290b7201dd9efbb5233a7e277cfb21bd160517039e1): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-k4lxh"
Mar 10 22:04:34 crc kubenswrapper[4919]: E0310 22:04:34.695042 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"crc-storage-crc-k4lxh_crc-storage(31e35b2a-3e5e-44a4-a9e4-fbe83610e813)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"crc-storage-crc-k4lxh_crc-storage(31e35b2a-3e5e-44a4-a9e4-fbe83610e813)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-k4lxh_crc-storage_31e35b2a-3e5e-44a4-a9e4-fbe83610e813_0(1c3cbcec17c8e05a72ad5290b7201dd9efbb5233a7e277cfb21bd160517039e1): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="crc-storage/crc-storage-crc-k4lxh" podUID="31e35b2a-3e5e-44a4-a9e4-fbe83610e813"
Mar 10 22:04:35 crc kubenswrapper[4919]: I0310 22:04:35.324543 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-bf975"
Mar 10 22:04:35 crc kubenswrapper[4919]: I0310 22:04:35.349470 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-bf975"
Mar 10 22:04:49 crc kubenswrapper[4919]: I0310 22:04:49.479061 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-k4lxh"
Mar 10 22:04:49 crc kubenswrapper[4919]: I0310 22:04:49.480161 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-k4lxh"
Mar 10 22:04:49 crc kubenswrapper[4919]: W0310 22:04:49.910017 4919 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod31e35b2a_3e5e_44a4_a9e4_fbe83610e813.slice/crio-8d8e6e42af5dbf3db15d36b4709fdc780d22d2299cedd803badfd8f8e52985ca WatchSource:0}: Error finding container 8d8e6e42af5dbf3db15d36b4709fdc780d22d2299cedd803badfd8f8e52985ca: Status 404 returned error can't find the container with id 8d8e6e42af5dbf3db15d36b4709fdc780d22d2299cedd803badfd8f8e52985ca
Mar 10 22:04:49 crc kubenswrapper[4919]: I0310 22:04:49.914740 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-k4lxh"]
Mar 10 22:04:50 crc kubenswrapper[4919]: I0310 22:04:50.428652 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-k4lxh" event={"ID":"31e35b2a-3e5e-44a4-a9e4-fbe83610e813","Type":"ContainerStarted","Data":"8d8e6e42af5dbf3db15d36b4709fdc780d22d2299cedd803badfd8f8e52985ca"}
Mar 10 22:04:52 crc kubenswrapper[4919]: I0310 22:04:52.441767 4919 generic.go:334] "Generic (PLEG): container finished" podID="31e35b2a-3e5e-44a4-a9e4-fbe83610e813" containerID="de91a2f2adf04a60bdb3422734b5a3bd8eb19086cb6b976c1920bf52ac6f84c4" exitCode=0
Mar 10 22:04:52 crc kubenswrapper[4919]: I0310 22:04:52.441862 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-k4lxh" event={"ID":"31e35b2a-3e5e-44a4-a9e4-fbe83610e813","Type":"ContainerDied","Data":"de91a2f2adf04a60bdb3422734b5a3bd8eb19086cb6b976c1920bf52ac6f84c4"}
Mar 10 22:04:53 crc kubenswrapper[4919]: I0310 22:04:53.693298 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-k4lxh"
Mar 10 22:04:53 crc kubenswrapper[4919]: I0310 22:04:53.765134 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/31e35b2a-3e5e-44a4-a9e4-fbe83610e813-node-mnt\") pod \"31e35b2a-3e5e-44a4-a9e4-fbe83610e813\" (UID: \"31e35b2a-3e5e-44a4-a9e4-fbe83610e813\") "
Mar 10 22:04:53 crc kubenswrapper[4919]: I0310 22:04:53.765238 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/31e35b2a-3e5e-44a4-a9e4-fbe83610e813-crc-storage\") pod \"31e35b2a-3e5e-44a4-a9e4-fbe83610e813\" (UID: \"31e35b2a-3e5e-44a4-a9e4-fbe83610e813\") "
Mar 10 22:04:53 crc kubenswrapper[4919]: I0310 22:04:53.765270 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/31e35b2a-3e5e-44a4-a9e4-fbe83610e813-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "31e35b2a-3e5e-44a4-a9e4-fbe83610e813" (UID: "31e35b2a-3e5e-44a4-a9e4-fbe83610e813"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 10 22:04:53 crc kubenswrapper[4919]: I0310 22:04:53.765293 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l7qxn\" (UniqueName: \"kubernetes.io/projected/31e35b2a-3e5e-44a4-a9e4-fbe83610e813-kube-api-access-l7qxn\") pod \"31e35b2a-3e5e-44a4-a9e4-fbe83610e813\" (UID: \"31e35b2a-3e5e-44a4-a9e4-fbe83610e813\") "
Mar 10 22:04:53 crc kubenswrapper[4919]: I0310 22:04:53.765710 4919 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/31e35b2a-3e5e-44a4-a9e4-fbe83610e813-node-mnt\") on node \"crc\" DevicePath \"\""
Mar 10 22:04:53 crc kubenswrapper[4919]: I0310 22:04:53.770519 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31e35b2a-3e5e-44a4-a9e4-fbe83610e813-kube-api-access-l7qxn" (OuterVolumeSpecName: "kube-api-access-l7qxn") pod "31e35b2a-3e5e-44a4-a9e4-fbe83610e813" (UID: "31e35b2a-3e5e-44a4-a9e4-fbe83610e813"). InnerVolumeSpecName "kube-api-access-l7qxn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 22:04:53 crc kubenswrapper[4919]: I0310 22:04:53.779062 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31e35b2a-3e5e-44a4-a9e4-fbe83610e813-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "31e35b2a-3e5e-44a4-a9e4-fbe83610e813" (UID: "31e35b2a-3e5e-44a4-a9e4-fbe83610e813"). InnerVolumeSpecName "crc-storage". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 22:04:53 crc kubenswrapper[4919]: I0310 22:04:53.866214 4919 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/31e35b2a-3e5e-44a4-a9e4-fbe83610e813-crc-storage\") on node \"crc\" DevicePath \"\""
Mar 10 22:04:53 crc kubenswrapper[4919]: I0310 22:04:53.866245 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l7qxn\" (UniqueName: \"kubernetes.io/projected/31e35b2a-3e5e-44a4-a9e4-fbe83610e813-kube-api-access-l7qxn\") on node \"crc\" DevicePath \"\""
Mar 10 22:04:54 crc kubenswrapper[4919]: I0310 22:04:54.458477 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-k4lxh" event={"ID":"31e35b2a-3e5e-44a4-a9e4-fbe83610e813","Type":"ContainerDied","Data":"8d8e6e42af5dbf3db15d36b4709fdc780d22d2299cedd803badfd8f8e52985ca"}
Mar 10 22:04:54 crc kubenswrapper[4919]: I0310 22:04:54.458524 4919 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8d8e6e42af5dbf3db15d36b4709fdc780d22d2299cedd803badfd8f8e52985ca"
Mar 10 22:04:54 crc kubenswrapper[4919]: I0310 22:04:54.458586 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-k4lxh"
Mar 10 22:04:57 crc kubenswrapper[4919]: I0310 22:04:57.276603 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-bf975"
Mar 10 22:05:02 crc kubenswrapper[4919]: I0310 22:05:02.670461 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a828gtn5"]
Mar 10 22:05:02 crc kubenswrapper[4919]: E0310 22:05:02.671294 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31e35b2a-3e5e-44a4-a9e4-fbe83610e813" containerName="storage"
Mar 10 22:05:02 crc kubenswrapper[4919]: I0310 22:05:02.671314 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="31e35b2a-3e5e-44a4-a9e4-fbe83610e813" containerName="storage"
Mar 10 22:05:02 crc kubenswrapper[4919]: I0310 22:05:02.671501 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="31e35b2a-3e5e-44a4-a9e4-fbe83610e813" containerName="storage"
Mar 10 22:05:02 crc kubenswrapper[4919]: I0310 22:05:02.672614 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a828gtn5"
Mar 10 22:05:02 crc kubenswrapper[4919]: I0310 22:05:02.674573 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc"
Mar 10 22:05:02 crc kubenswrapper[4919]: I0310 22:05:02.685768 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a828gtn5"]
Mar 10 22:05:02 crc kubenswrapper[4919]: I0310 22:05:02.786715 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0bc29c7c-1201-487e-8c9a-5d802dde51a5-util\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a828gtn5\" (UID: \"0bc29c7c-1201-487e-8c9a-5d802dde51a5\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a828gtn5"
Mar 10 22:05:02 crc kubenswrapper[4919]: I0310 22:05:02.786815 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jcpbz\" (UniqueName: \"kubernetes.io/projected/0bc29c7c-1201-487e-8c9a-5d802dde51a5-kube-api-access-jcpbz\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a828gtn5\" (UID: \"0bc29c7c-1201-487e-8c9a-5d802dde51a5\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a828gtn5"
Mar 10 22:05:02 crc kubenswrapper[4919]: I0310 22:05:02.786859 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0bc29c7c-1201-487e-8c9a-5d802dde51a5-bundle\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a828gtn5\" (UID: \"0bc29c7c-1201-487e-8c9a-5d802dde51a5\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a828gtn5"
Mar 10 22:05:02 crc kubenswrapper[4919]: I0310 22:05:02.887697 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jcpbz\" (UniqueName: \"kubernetes.io/projected/0bc29c7c-1201-487e-8c9a-5d802dde51a5-kube-api-access-jcpbz\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a828gtn5\" (UID: \"0bc29c7c-1201-487e-8c9a-5d802dde51a5\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a828gtn5"
Mar 10 22:05:02 crc kubenswrapper[4919]: I0310 22:05:02.887759 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0bc29c7c-1201-487e-8c9a-5d802dde51a5-bundle\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a828gtn5\" (UID: \"0bc29c7c-1201-487e-8c9a-5d802dde51a5\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a828gtn5"
Mar 10 22:05:02 crc kubenswrapper[4919]: I0310 22:05:02.887795 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0bc29c7c-1201-487e-8c9a-5d802dde51a5-util\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a828gtn5\" (UID: \"0bc29c7c-1201-487e-8c9a-5d802dde51a5\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a828gtn5"
Mar 10 22:05:02 crc kubenswrapper[4919]: I0310 22:05:02.888214 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0bc29c7c-1201-487e-8c9a-5d802dde51a5-util\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a828gtn5\" (UID: \"0bc29c7c-1201-487e-8c9a-5d802dde51a5\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a828gtn5"
Mar 10 22:05:02 crc kubenswrapper[4919]: I0310 22:05:02.888425 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0bc29c7c-1201-487e-8c9a-5d802dde51a5-bundle\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a828gtn5\" (UID: \"0bc29c7c-1201-487e-8c9a-5d802dde51a5\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a828gtn5"
Mar 10 22:05:02 crc kubenswrapper[4919]: I0310 22:05:02.923731 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jcpbz\" (UniqueName: \"kubernetes.io/projected/0bc29c7c-1201-487e-8c9a-5d802dde51a5-kube-api-access-jcpbz\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a828gtn5\" (UID: \"0bc29c7c-1201-487e-8c9a-5d802dde51a5\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a828gtn5"
Mar 10 22:05:02 crc kubenswrapper[4919]: I0310 22:05:02.999072 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a828gtn5"
Mar 10 22:05:03 crc kubenswrapper[4919]: I0310 22:05:03.407939 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a828gtn5"]
Mar 10 22:05:03 crc kubenswrapper[4919]: I0310 22:05:03.513196 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a828gtn5" event={"ID":"0bc29c7c-1201-487e-8c9a-5d802dde51a5","Type":"ContainerStarted","Data":"cc4526bd63fc0600e850d3767611a66d6d115b5ff03d9c01a63fe492eb8684bd"}
Mar 10 22:05:04 crc kubenswrapper[4919]: I0310 22:05:04.530462 4919 generic.go:334] "Generic (PLEG): container finished" podID="0bc29c7c-1201-487e-8c9a-5d802dde51a5" containerID="6f26516ba29f953ce0d85dbfab628ae6d9630b32f6be2fc8c4f55ef3f0237bf3" exitCode=0
Mar 10 22:05:04 crc kubenswrapper[4919]: I0310 22:05:04.530604 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a828gtn5" event={"ID":"0bc29c7c-1201-487e-8c9a-5d802dde51a5","Type":"ContainerDied","Data":"6f26516ba29f953ce0d85dbfab628ae6d9630b32f6be2fc8c4f55ef3f0237bf3"}
Mar 10 22:05:04 crc kubenswrapper[4919]: I0310 22:05:04.532595 4919 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 10 22:05:05 crc kubenswrapper[4919]: I0310 22:05:05.027829 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-6z4j5"]
Mar 10 22:05:05 crc kubenswrapper[4919]: I0310 22:05:05.028787 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6z4j5"
Mar 10 22:05:05 crc kubenswrapper[4919]: I0310 22:05:05.043143 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6z4j5"]
Mar 10 22:05:05 crc kubenswrapper[4919]: I0310 22:05:05.129600 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5067da25-e1cc-425b-b62c-9f03c4cc37fb-catalog-content\") pod \"redhat-operators-6z4j5\" (UID: \"5067da25-e1cc-425b-b62c-9f03c4cc37fb\") " pod="openshift-marketplace/redhat-operators-6z4j5"
Mar 10 22:05:05 crc kubenswrapper[4919]: I0310 22:05:05.129675 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5067da25-e1cc-425b-b62c-9f03c4cc37fb-utilities\") pod \"redhat-operators-6z4j5\" (UID: \"5067da25-e1cc-425b-b62c-9f03c4cc37fb\") " pod="openshift-marketplace/redhat-operators-6z4j5"
Mar 10 22:05:05 crc kubenswrapper[4919]: I0310 22:05:05.129697 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7g7rx\" (UniqueName: \"kubernetes.io/projected/5067da25-e1cc-425b-b62c-9f03c4cc37fb-kube-api-access-7g7rx\") pod \"redhat-operators-6z4j5\" (UID: \"5067da25-e1cc-425b-b62c-9f03c4cc37fb\") " pod="openshift-marketplace/redhat-operators-6z4j5"
Mar 10 22:05:05 crc kubenswrapper[4919]: I0310 22:05:05.230380 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5067da25-e1cc-425b-b62c-9f03c4cc37fb-catalog-content\") pod \"redhat-operators-6z4j5\" (UID: \"5067da25-e1cc-425b-b62c-9f03c4cc37fb\") " pod="openshift-marketplace/redhat-operators-6z4j5"
Mar 10 22:05:05 crc kubenswrapper[4919]: I0310 22:05:05.230463 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5067da25-e1cc-425b-b62c-9f03c4cc37fb-utilities\") pod \"redhat-operators-6z4j5\" (UID: \"5067da25-e1cc-425b-b62c-9f03c4cc37fb\") " pod="openshift-marketplace/redhat-operators-6z4j5"
Mar 10 22:05:05 crc kubenswrapper[4919]: I0310 22:05:05.230484 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7g7rx\" (UniqueName: \"kubernetes.io/projected/5067da25-e1cc-425b-b62c-9f03c4cc37fb-kube-api-access-7g7rx\") pod \"redhat-operators-6z4j5\" (UID: \"5067da25-e1cc-425b-b62c-9f03c4cc37fb\") " pod="openshift-marketplace/redhat-operators-6z4j5"
Mar 10 22:05:05 crc kubenswrapper[4919]: I0310 22:05:05.230862 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5067da25-e1cc-425b-b62c-9f03c4cc37fb-catalog-content\") pod \"redhat-operators-6z4j5\" (UID: \"5067da25-e1cc-425b-b62c-9f03c4cc37fb\") " pod="openshift-marketplace/redhat-operators-6z4j5"
Mar 10 22:05:05 crc kubenswrapper[4919]: I0310 22:05:05.230932 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5067da25-e1cc-425b-b62c-9f03c4cc37fb-utilities\") pod \"redhat-operators-6z4j5\" (UID: \"5067da25-e1cc-425b-b62c-9f03c4cc37fb\") " pod="openshift-marketplace/redhat-operators-6z4j5"
Mar 10 22:05:05 crc kubenswrapper[4919]: I0310 22:05:05.249240 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7g7rx\" (UniqueName: \"kubernetes.io/projected/5067da25-e1cc-425b-b62c-9f03c4cc37fb-kube-api-access-7g7rx\") pod \"redhat-operators-6z4j5\" (UID: \"5067da25-e1cc-425b-b62c-9f03c4cc37fb\") " pod="openshift-marketplace/redhat-operators-6z4j5"
Mar 10 22:05:05 crc kubenswrapper[4919]: I0310 22:05:05.346734 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6z4j5"
Mar 10 22:05:05 crc kubenswrapper[4919]: I0310 22:05:05.754314 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6z4j5"]
Mar 10 22:05:05 crc kubenswrapper[4919]: W0310 22:05:05.762833 4919 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5067da25_e1cc_425b_b62c_9f03c4cc37fb.slice/crio-ca069e38619a3a150f3b3cc88fd9739a58cef4e25fa37d584aeed5a3c69ccebc WatchSource:0}: Error finding container ca069e38619a3a150f3b3cc88fd9739a58cef4e25fa37d584aeed5a3c69ccebc: Status 404 returned error can't find the container with id ca069e38619a3a150f3b3cc88fd9739a58cef4e25fa37d584aeed5a3c69ccebc
Mar 10 22:05:06 crc kubenswrapper[4919]: I0310 22:05:06.546562 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a828gtn5" event={"ID":"0bc29c7c-1201-487e-8c9a-5d802dde51a5","Type":"ContainerStarted","Data":"2a21813f48ea2ad3717f8e4a3c845c7fb6227bb1a549f3fd3ca6e0223df2db9e"}
Mar 10 22:05:06 crc kubenswrapper[4919]: I0310 22:05:06.551068 4919 generic.go:334] "Generic (PLEG): container finished" podID="5067da25-e1cc-425b-b62c-9f03c4cc37fb" containerID="adea41f563b2129c7164006541146dd42623fcea0671b9317611df88f5185d22" exitCode=0
Mar 10 22:05:06 crc kubenswrapper[4919]: I0310 22:05:06.551115 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6z4j5" event={"ID":"5067da25-e1cc-425b-b62c-9f03c4cc37fb","Type":"ContainerDied","Data":"adea41f563b2129c7164006541146dd42623fcea0671b9317611df88f5185d22"}
Mar 10 22:05:06 crc kubenswrapper[4919]: I0310 22:05:06.551144 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6z4j5" event={"ID":"5067da25-e1cc-425b-b62c-9f03c4cc37fb","Type":"ContainerStarted","Data":"ca069e38619a3a150f3b3cc88fd9739a58cef4e25fa37d584aeed5a3c69ccebc"}
Mar 10 22:05:07 crc kubenswrapper[4919]: I0310 22:05:07.559518 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6z4j5" event={"ID":"5067da25-e1cc-425b-b62c-9f03c4cc37fb","Type":"ContainerStarted","Data":"304e4361432a873607b5af737d31477ec0acae5a46748ea5a1a81f102e989522"}
Mar 10 22:05:07 crc kubenswrapper[4919]: I0310 22:05:07.563164 4919 generic.go:334] "Generic (PLEG): container finished" podID="0bc29c7c-1201-487e-8c9a-5d802dde51a5" containerID="2a21813f48ea2ad3717f8e4a3c845c7fb6227bb1a549f3fd3ca6e0223df2db9e" exitCode=0
Mar 10 22:05:07 crc kubenswrapper[4919]: I0310 22:05:07.563218 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a828gtn5" event={"ID":"0bc29c7c-1201-487e-8c9a-5d802dde51a5","Type":"ContainerDied","Data":"2a21813f48ea2ad3717f8e4a3c845c7fb6227bb1a549f3fd3ca6e0223df2db9e"}
Mar 10 22:05:08 crc kubenswrapper[4919]: I0310 22:05:08.571273 4919 generic.go:334] "Generic (PLEG): container finished" podID="0bc29c7c-1201-487e-8c9a-5d802dde51a5" containerID="020c418dbc60cb0c8e22f4fe12ecdc68fba2aa8af8867f8f9f5896701a224eb5" exitCode=0
Mar 10 22:05:08 crc kubenswrapper[4919]: I0310 22:05:08.571377 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a828gtn5" event={"ID":"0bc29c7c-1201-487e-8c9a-5d802dde51a5","Type":"ContainerDied","Data":"020c418dbc60cb0c8e22f4fe12ecdc68fba2aa8af8867f8f9f5896701a224eb5"}
Mar 10 22:05:08 crc kubenswrapper[4919]: I0310 22:05:08.572962 4919 generic.go:334] "Generic (PLEG): container finished" podID="5067da25-e1cc-425b-b62c-9f03c4cc37fb" containerID="304e4361432a873607b5af737d31477ec0acae5a46748ea5a1a81f102e989522" exitCode=0
Mar 10 22:05:08 crc kubenswrapper[4919]: I0310 22:05:08.572992 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6z4j5" event={"ID":"5067da25-e1cc-425b-b62c-9f03c4cc37fb","Type":"ContainerDied","Data":"304e4361432a873607b5af737d31477ec0acae5a46748ea5a1a81f102e989522"}
Mar 10 22:05:09 crc kubenswrapper[4919]: I0310 22:05:09.580523 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6z4j5" event={"ID":"5067da25-e1cc-425b-b62c-9f03c4cc37fb","Type":"ContainerStarted","Data":"7e3a5541b4966d536e6fb5d8e0f1377a792fd63b01f9bdc58dbba3a0abc22b99"}
Mar 10 22:05:09 crc kubenswrapper[4919]: I0310 22:05:09.606901 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-6z4j5" podStartSLOduration=2.132790547 podStartE2EDuration="4.606885219s" podCreationTimestamp="2026-03-10 22:05:05 +0000 UTC" firstStartedPulling="2026-03-10 22:05:06.552876306 +0000 UTC m=+893.794756934" lastFinishedPulling="2026-03-10 22:05:09.026970998 +0000 UTC m=+896.268851606" observedRunningTime="2026-03-10 22:05:09.604751301 +0000 UTC m=+896.846631919" watchObservedRunningTime="2026-03-10 22:05:09.606885219 +0000 UTC m=+896.848765827"
Mar 10 22:05:09 crc kubenswrapper[4919]: I0310 22:05:09.848376 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a828gtn5"
Mar 10 22:05:09 crc kubenswrapper[4919]: I0310 22:05:09.919176 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0bc29c7c-1201-487e-8c9a-5d802dde51a5-bundle\") pod \"0bc29c7c-1201-487e-8c9a-5d802dde51a5\" (UID: \"0bc29c7c-1201-487e-8c9a-5d802dde51a5\") "
Mar 10 22:05:09 crc kubenswrapper[4919]: I0310 22:05:09.919313 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jcpbz\" (UniqueName: \"kubernetes.io/projected/0bc29c7c-1201-487e-8c9a-5d802dde51a5-kube-api-access-jcpbz\") pod \"0bc29c7c-1201-487e-8c9a-5d802dde51a5\" (UID: \"0bc29c7c-1201-487e-8c9a-5d802dde51a5\") "
Mar 10 22:05:09 crc kubenswrapper[4919]: I0310 22:05:09.919340 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0bc29c7c-1201-487e-8c9a-5d802dde51a5-util\") pod \"0bc29c7c-1201-487e-8c9a-5d802dde51a5\" (UID: \"0bc29c7c-1201-487e-8c9a-5d802dde51a5\") "
Mar 10 22:05:09 crc kubenswrapper[4919]: I0310 22:05:09.920016 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0bc29c7c-1201-487e-8c9a-5d802dde51a5-bundle" (OuterVolumeSpecName: "bundle") pod "0bc29c7c-1201-487e-8c9a-5d802dde51a5" (UID: "0bc29c7c-1201-487e-8c9a-5d802dde51a5"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 22:05:09 crc kubenswrapper[4919]: I0310 22:05:09.927223 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0bc29c7c-1201-487e-8c9a-5d802dde51a5-kube-api-access-jcpbz" (OuterVolumeSpecName: "kube-api-access-jcpbz") pod "0bc29c7c-1201-487e-8c9a-5d802dde51a5" (UID: "0bc29c7c-1201-487e-8c9a-5d802dde51a5"). InnerVolumeSpecName "kube-api-access-jcpbz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 22:05:09 crc kubenswrapper[4919]: I0310 22:05:09.934511 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0bc29c7c-1201-487e-8c9a-5d802dde51a5-util" (OuterVolumeSpecName: "util") pod "0bc29c7c-1201-487e-8c9a-5d802dde51a5" (UID: "0bc29c7c-1201-487e-8c9a-5d802dde51a5"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 22:05:10 crc kubenswrapper[4919]: I0310 22:05:10.020668 4919 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0bc29c7c-1201-487e-8c9a-5d802dde51a5-bundle\") on node \"crc\" DevicePath \"\""
Mar 10 22:05:10 crc kubenswrapper[4919]: I0310 22:05:10.020701 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jcpbz\" (UniqueName: \"kubernetes.io/projected/0bc29c7c-1201-487e-8c9a-5d802dde51a5-kube-api-access-jcpbz\") on node \"crc\" DevicePath \"\""
Mar 10 22:05:10 crc kubenswrapper[4919]: I0310 22:05:10.020710 4919 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0bc29c7c-1201-487e-8c9a-5d802dde51a5-util\") on node \"crc\" DevicePath \"\""
Mar 10 22:05:10 crc kubenswrapper[4919]: I0310 22:05:10.592852 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a828gtn5"
Mar 10 22:05:10 crc kubenswrapper[4919]: I0310 22:05:10.592850 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a828gtn5" event={"ID":"0bc29c7c-1201-487e-8c9a-5d802dde51a5","Type":"ContainerDied","Data":"cc4526bd63fc0600e850d3767611a66d6d115b5ff03d9c01a63fe492eb8684bd"}
Mar 10 22:05:10 crc kubenswrapper[4919]: I0310 22:05:10.593048 4919 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cc4526bd63fc0600e850d3767611a66d6d115b5ff03d9c01a63fe492eb8684bd"
Mar 10 22:05:12 crc kubenswrapper[4919]: I0310 22:05:12.943194 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-75c5dccd6c-pgjbr"]
Mar 10 22:05:12 crc kubenswrapper[4919]: E0310 22:05:12.943770 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0bc29c7c-1201-487e-8c9a-5d802dde51a5" containerName="util"
Mar 10 22:05:12 crc kubenswrapper[4919]: I0310 22:05:12.943788 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="0bc29c7c-1201-487e-8c9a-5d802dde51a5" containerName="util"
Mar 10 22:05:12 crc kubenswrapper[4919]: E0310 22:05:12.943802 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0bc29c7c-1201-487e-8c9a-5d802dde51a5" containerName="pull"
Mar 10 22:05:12 crc kubenswrapper[4919]: I0310 22:05:12.943809 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="0bc29c7c-1201-487e-8c9a-5d802dde51a5" containerName="pull"
Mar 10 22:05:12 crc kubenswrapper[4919]: E0310 22:05:12.943822 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0bc29c7c-1201-487e-8c9a-5d802dde51a5" containerName="extract"
Mar 10 22:05:12 crc kubenswrapper[4919]: I0310 22:05:12.943829 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="0bc29c7c-1201-487e-8c9a-5d802dde51a5" containerName="extract"
Mar 10 22:05:12 crc kubenswrapper[4919]: I0310 22:05:12.943966 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="0bc29c7c-1201-487e-8c9a-5d802dde51a5" containerName="extract" Mar 10 22:05:12 crc kubenswrapper[4919]: I0310 22:05:12.944427 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-pgjbr" Mar 10 22:05:12 crc kubenswrapper[4919]: I0310 22:05:12.946008 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Mar 10 22:05:12 crc kubenswrapper[4919]: I0310 22:05:12.946382 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-zqfpd" Mar 10 22:05:12 crc kubenswrapper[4919]: I0310 22:05:12.946605 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Mar 10 22:05:12 crc kubenswrapper[4919]: I0310 22:05:12.953491 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-75c5dccd6c-pgjbr"] Mar 10 22:05:13 crc kubenswrapper[4919]: I0310 22:05:13.058921 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lvttd\" (UniqueName: \"kubernetes.io/projected/7ff7f3f6-029a-4a7e-818c-658741a6afe9-kube-api-access-lvttd\") pod \"nmstate-operator-75c5dccd6c-pgjbr\" (UID: \"7ff7f3f6-029a-4a7e-818c-658741a6afe9\") " pod="openshift-nmstate/nmstate-operator-75c5dccd6c-pgjbr" Mar 10 22:05:13 crc kubenswrapper[4919]: I0310 22:05:13.160683 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lvttd\" (UniqueName: \"kubernetes.io/projected/7ff7f3f6-029a-4a7e-818c-658741a6afe9-kube-api-access-lvttd\") pod \"nmstate-operator-75c5dccd6c-pgjbr\" (UID: \"7ff7f3f6-029a-4a7e-818c-658741a6afe9\") " pod="openshift-nmstate/nmstate-operator-75c5dccd6c-pgjbr" Mar 10 22:05:13 crc 
kubenswrapper[4919]: I0310 22:05:13.181543 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lvttd\" (UniqueName: \"kubernetes.io/projected/7ff7f3f6-029a-4a7e-818c-658741a6afe9-kube-api-access-lvttd\") pod \"nmstate-operator-75c5dccd6c-pgjbr\" (UID: \"7ff7f3f6-029a-4a7e-818c-658741a6afe9\") " pod="openshift-nmstate/nmstate-operator-75c5dccd6c-pgjbr" Mar 10 22:05:13 crc kubenswrapper[4919]: I0310 22:05:13.261975 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-pgjbr" Mar 10 22:05:13 crc kubenswrapper[4919]: I0310 22:05:13.474036 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-75c5dccd6c-pgjbr"] Mar 10 22:05:13 crc kubenswrapper[4919]: I0310 22:05:13.607957 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-pgjbr" event={"ID":"7ff7f3f6-029a-4a7e-818c-658741a6afe9","Type":"ContainerStarted","Data":"e4f49c8457fad9bde6450eb123e1a703ec476539633b946f92eb2b32208b6322"} Mar 10 22:05:15 crc kubenswrapper[4919]: I0310 22:05:15.347059 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-6z4j5" Mar 10 22:05:15 crc kubenswrapper[4919]: I0310 22:05:15.347378 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-6z4j5" Mar 10 22:05:16 crc kubenswrapper[4919]: I0310 22:05:16.393984 4919 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-6z4j5" podUID="5067da25-e1cc-425b-b62c-9f03c4cc37fb" containerName="registry-server" probeResult="failure" output=< Mar 10 22:05:16 crc kubenswrapper[4919]: timeout: failed to connect service ":50051" within 1s Mar 10 22:05:16 crc kubenswrapper[4919]: > Mar 10 22:05:16 crc kubenswrapper[4919]: I0310 22:05:16.626497 4919 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-pgjbr" event={"ID":"7ff7f3f6-029a-4a7e-818c-658741a6afe9","Type":"ContainerStarted","Data":"06ba8bd4b903cf41de042e1ac1d000540c49a091228cd7402744be1d00b80472"} Mar 10 22:05:16 crc kubenswrapper[4919]: I0310 22:05:16.643823 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-pgjbr" podStartSLOduration=2.373744264 podStartE2EDuration="4.643801962s" podCreationTimestamp="2026-03-10 22:05:12 +0000 UTC" firstStartedPulling="2026-03-10 22:05:13.489762812 +0000 UTC m=+900.731643420" lastFinishedPulling="2026-03-10 22:05:15.75982051 +0000 UTC m=+903.001701118" observedRunningTime="2026-03-10 22:05:16.642455416 +0000 UTC m=+903.884336074" watchObservedRunningTime="2026-03-10 22:05:16.643801962 +0000 UTC m=+903.885682590" Mar 10 22:05:22 crc kubenswrapper[4919]: I0310 22:05:22.825588 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-69594cc75-h5rdm"] Mar 10 22:05:22 crc kubenswrapper[4919]: I0310 22:05:22.827375 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-69594cc75-h5rdm" Mar 10 22:05:22 crc kubenswrapper[4919]: I0310 22:05:22.833090 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-dcqbw" Mar 10 22:05:22 crc kubenswrapper[4919]: I0310 22:05:22.838632 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-786f45cff4-n9hfz"] Mar 10 22:05:22 crc kubenswrapper[4919]: I0310 22:05:22.839677 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-786f45cff4-n9hfz" Mar 10 22:05:22 crc kubenswrapper[4919]: I0310 22:05:22.841792 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Mar 10 22:05:22 crc kubenswrapper[4919]: I0310 22:05:22.847622 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-69594cc75-h5rdm"] Mar 10 22:05:22 crc kubenswrapper[4919]: I0310 22:05:22.855360 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-786f45cff4-n9hfz"] Mar 10 22:05:22 crc kubenswrapper[4919]: I0310 22:05:22.904067 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-whpkz"] Mar 10 22:05:22 crc kubenswrapper[4919]: I0310 22:05:22.904674 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-whpkz" Mar 10 22:05:23 crc kubenswrapper[4919]: I0310 22:05:23.001174 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/d7074097-fdce-4f8e-b343-2c1196f997a2-dbus-socket\") pod \"nmstate-handler-whpkz\" (UID: \"d7074097-fdce-4f8e-b343-2c1196f997a2\") " pod="openshift-nmstate/nmstate-handler-whpkz" Mar 10 22:05:23 crc kubenswrapper[4919]: I0310 22:05:23.001221 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6sdwz\" (UniqueName: \"kubernetes.io/projected/556dc7e4-4c98-4f59-8e60-a7997f766706-kube-api-access-6sdwz\") pod \"nmstate-metrics-69594cc75-h5rdm\" (UID: \"556dc7e4-4c98-4f59-8e60-a7997f766706\") " pod="openshift-nmstate/nmstate-metrics-69594cc75-h5rdm" Mar 10 22:05:23 crc kubenswrapper[4919]: I0310 22:05:23.001240 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: 
\"kubernetes.io/secret/ff569aa2-f933-44e4-bd70-ba0ff19efd02-tls-key-pair\") pod \"nmstate-webhook-786f45cff4-n9hfz\" (UID: \"ff569aa2-f933-44e4-bd70-ba0ff19efd02\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-n9hfz" Mar 10 22:05:23 crc kubenswrapper[4919]: I0310 22:05:23.001423 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/d7074097-fdce-4f8e-b343-2c1196f997a2-ovs-socket\") pod \"nmstate-handler-whpkz\" (UID: \"d7074097-fdce-4f8e-b343-2c1196f997a2\") " pod="openshift-nmstate/nmstate-handler-whpkz" Mar 10 22:05:23 crc kubenswrapper[4919]: I0310 22:05:23.001509 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/d7074097-fdce-4f8e-b343-2c1196f997a2-nmstate-lock\") pod \"nmstate-handler-whpkz\" (UID: \"d7074097-fdce-4f8e-b343-2c1196f997a2\") " pod="openshift-nmstate/nmstate-handler-whpkz" Mar 10 22:05:23 crc kubenswrapper[4919]: I0310 22:05:23.001583 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6zjvs\" (UniqueName: \"kubernetes.io/projected/d7074097-fdce-4f8e-b343-2c1196f997a2-kube-api-access-6zjvs\") pod \"nmstate-handler-whpkz\" (UID: \"d7074097-fdce-4f8e-b343-2c1196f997a2\") " pod="openshift-nmstate/nmstate-handler-whpkz" Mar 10 22:05:23 crc kubenswrapper[4919]: I0310 22:05:23.001699 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-47sdq\" (UniqueName: \"kubernetes.io/projected/ff569aa2-f933-44e4-bd70-ba0ff19efd02-kube-api-access-47sdq\") pod \"nmstate-webhook-786f45cff4-n9hfz\" (UID: \"ff569aa2-f933-44e4-bd70-ba0ff19efd02\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-n9hfz" Mar 10 22:05:23 crc kubenswrapper[4919]: I0310 22:05:23.014083 4919 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-wv4hr"] Mar 10 22:05:23 crc kubenswrapper[4919]: I0310 22:05:23.014683 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-wv4hr" Mar 10 22:05:23 crc kubenswrapper[4919]: I0310 22:05:23.016479 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Mar 10 22:05:23 crc kubenswrapper[4919]: I0310 22:05:23.016746 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-8xhck" Mar 10 22:05:23 crc kubenswrapper[4919]: I0310 22:05:23.017285 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Mar 10 22:05:23 crc kubenswrapper[4919]: I0310 22:05:23.024373 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-wv4hr"] Mar 10 22:05:23 crc kubenswrapper[4919]: I0310 22:05:23.103035 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6sdwz\" (UniqueName: \"kubernetes.io/projected/556dc7e4-4c98-4f59-8e60-a7997f766706-kube-api-access-6sdwz\") pod \"nmstate-metrics-69594cc75-h5rdm\" (UID: \"556dc7e4-4c98-4f59-8e60-a7997f766706\") " pod="openshift-nmstate/nmstate-metrics-69594cc75-h5rdm" Mar 10 22:05:23 crc kubenswrapper[4919]: I0310 22:05:23.103110 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/ff569aa2-f933-44e4-bd70-ba0ff19efd02-tls-key-pair\") pod \"nmstate-webhook-786f45cff4-n9hfz\" (UID: \"ff569aa2-f933-44e4-bd70-ba0ff19efd02\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-n9hfz" Mar 10 22:05:23 crc kubenswrapper[4919]: I0310 22:05:23.103162 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hv74n\" (UniqueName: 
\"kubernetes.io/projected/357040e9-d683-4de3-bf54-c414218b1705-kube-api-access-hv74n\") pod \"nmstate-console-plugin-5dcbbd79cf-wv4hr\" (UID: \"357040e9-d683-4de3-bf54-c414218b1705\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-wv4hr" Mar 10 22:05:23 crc kubenswrapper[4919]: I0310 22:05:23.103206 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/d7074097-fdce-4f8e-b343-2c1196f997a2-ovs-socket\") pod \"nmstate-handler-whpkz\" (UID: \"d7074097-fdce-4f8e-b343-2c1196f997a2\") " pod="openshift-nmstate/nmstate-handler-whpkz" Mar 10 22:05:23 crc kubenswrapper[4919]: I0310 22:05:23.103236 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/d7074097-fdce-4f8e-b343-2c1196f997a2-nmstate-lock\") pod \"nmstate-handler-whpkz\" (UID: \"d7074097-fdce-4f8e-b343-2c1196f997a2\") " pod="openshift-nmstate/nmstate-handler-whpkz" Mar 10 22:05:23 crc kubenswrapper[4919]: I0310 22:05:23.103282 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6zjvs\" (UniqueName: \"kubernetes.io/projected/d7074097-fdce-4f8e-b343-2c1196f997a2-kube-api-access-6zjvs\") pod \"nmstate-handler-whpkz\" (UID: \"d7074097-fdce-4f8e-b343-2c1196f997a2\") " pod="openshift-nmstate/nmstate-handler-whpkz" Mar 10 22:05:23 crc kubenswrapper[4919]: I0310 22:05:23.103336 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/d7074097-fdce-4f8e-b343-2c1196f997a2-ovs-socket\") pod \"nmstate-handler-whpkz\" (UID: \"d7074097-fdce-4f8e-b343-2c1196f997a2\") " pod="openshift-nmstate/nmstate-handler-whpkz" Mar 10 22:05:23 crc kubenswrapper[4919]: I0310 22:05:23.103361 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: 
\"kubernetes.io/host-path/d7074097-fdce-4f8e-b343-2c1196f997a2-nmstate-lock\") pod \"nmstate-handler-whpkz\" (UID: \"d7074097-fdce-4f8e-b343-2c1196f997a2\") " pod="openshift-nmstate/nmstate-handler-whpkz" Mar 10 22:05:23 crc kubenswrapper[4919]: E0310 22:05:23.103372 4919 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found Mar 10 22:05:23 crc kubenswrapper[4919]: I0310 22:05:23.103352 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/357040e9-d683-4de3-bf54-c414218b1705-nginx-conf\") pod \"nmstate-console-plugin-5dcbbd79cf-wv4hr\" (UID: \"357040e9-d683-4de3-bf54-c414218b1705\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-wv4hr" Mar 10 22:05:23 crc kubenswrapper[4919]: E0310 22:05:23.103498 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ff569aa2-f933-44e4-bd70-ba0ff19efd02-tls-key-pair podName:ff569aa2-f933-44e4-bd70-ba0ff19efd02 nodeName:}" failed. No retries permitted until 2026-03-10 22:05:23.603476849 +0000 UTC m=+910.845357457 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/ff569aa2-f933-44e4-bd70-ba0ff19efd02-tls-key-pair") pod "nmstate-webhook-786f45cff4-n9hfz" (UID: "ff569aa2-f933-44e4-bd70-ba0ff19efd02") : secret "openshift-nmstate-webhook" not found Mar 10 22:05:23 crc kubenswrapper[4919]: I0310 22:05:23.103526 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/357040e9-d683-4de3-bf54-c414218b1705-plugin-serving-cert\") pod \"nmstate-console-plugin-5dcbbd79cf-wv4hr\" (UID: \"357040e9-d683-4de3-bf54-c414218b1705\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-wv4hr" Mar 10 22:05:23 crc kubenswrapper[4919]: I0310 22:05:23.103549 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-47sdq\" (UniqueName: \"kubernetes.io/projected/ff569aa2-f933-44e4-bd70-ba0ff19efd02-kube-api-access-47sdq\") pod \"nmstate-webhook-786f45cff4-n9hfz\" (UID: \"ff569aa2-f933-44e4-bd70-ba0ff19efd02\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-n9hfz" Mar 10 22:05:23 crc kubenswrapper[4919]: I0310 22:05:23.103590 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/d7074097-fdce-4f8e-b343-2c1196f997a2-dbus-socket\") pod \"nmstate-handler-whpkz\" (UID: \"d7074097-fdce-4f8e-b343-2c1196f997a2\") " pod="openshift-nmstate/nmstate-handler-whpkz" Mar 10 22:05:23 crc kubenswrapper[4919]: I0310 22:05:23.104057 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/d7074097-fdce-4f8e-b343-2c1196f997a2-dbus-socket\") pod \"nmstate-handler-whpkz\" (UID: \"d7074097-fdce-4f8e-b343-2c1196f997a2\") " pod="openshift-nmstate/nmstate-handler-whpkz" Mar 10 22:05:23 crc kubenswrapper[4919]: I0310 22:05:23.134113 4919 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-6zjvs\" (UniqueName: \"kubernetes.io/projected/d7074097-fdce-4f8e-b343-2c1196f997a2-kube-api-access-6zjvs\") pod \"nmstate-handler-whpkz\" (UID: \"d7074097-fdce-4f8e-b343-2c1196f997a2\") " pod="openshift-nmstate/nmstate-handler-whpkz" Mar 10 22:05:23 crc kubenswrapper[4919]: I0310 22:05:23.136299 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6sdwz\" (UniqueName: \"kubernetes.io/projected/556dc7e4-4c98-4f59-8e60-a7997f766706-kube-api-access-6sdwz\") pod \"nmstate-metrics-69594cc75-h5rdm\" (UID: \"556dc7e4-4c98-4f59-8e60-a7997f766706\") " pod="openshift-nmstate/nmstate-metrics-69594cc75-h5rdm" Mar 10 22:05:23 crc kubenswrapper[4919]: I0310 22:05:23.138168 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-47sdq\" (UniqueName: \"kubernetes.io/projected/ff569aa2-f933-44e4-bd70-ba0ff19efd02-kube-api-access-47sdq\") pod \"nmstate-webhook-786f45cff4-n9hfz\" (UID: \"ff569aa2-f933-44e4-bd70-ba0ff19efd02\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-n9hfz" Mar 10 22:05:23 crc kubenswrapper[4919]: I0310 22:05:23.192905 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-metrics-69594cc75-h5rdm" Mar 10 22:05:23 crc kubenswrapper[4919]: I0310 22:05:23.204945 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hv74n\" (UniqueName: \"kubernetes.io/projected/357040e9-d683-4de3-bf54-c414218b1705-kube-api-access-hv74n\") pod \"nmstate-console-plugin-5dcbbd79cf-wv4hr\" (UID: \"357040e9-d683-4de3-bf54-c414218b1705\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-wv4hr" Mar 10 22:05:23 crc kubenswrapper[4919]: I0310 22:05:23.205305 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/357040e9-d683-4de3-bf54-c414218b1705-nginx-conf\") pod \"nmstate-console-plugin-5dcbbd79cf-wv4hr\" (UID: \"357040e9-d683-4de3-bf54-c414218b1705\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-wv4hr" Mar 10 22:05:23 crc kubenswrapper[4919]: I0310 22:05:23.205347 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/357040e9-d683-4de3-bf54-c414218b1705-plugin-serving-cert\") pod \"nmstate-console-plugin-5dcbbd79cf-wv4hr\" (UID: \"357040e9-d683-4de3-bf54-c414218b1705\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-wv4hr" Mar 10 22:05:23 crc kubenswrapper[4919]: E0310 22:05:23.205518 4919 secret.go:188] Couldn't get secret openshift-nmstate/plugin-serving-cert: secret "plugin-serving-cert" not found Mar 10 22:05:23 crc kubenswrapper[4919]: E0310 22:05:23.205587 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/357040e9-d683-4de3-bf54-c414218b1705-plugin-serving-cert podName:357040e9-d683-4de3-bf54-c414218b1705 nodeName:}" failed. No retries permitted until 2026-03-10 22:05:23.705569133 +0000 UTC m=+910.947449751 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/357040e9-d683-4de3-bf54-c414218b1705-plugin-serving-cert") pod "nmstate-console-plugin-5dcbbd79cf-wv4hr" (UID: "357040e9-d683-4de3-bf54-c414218b1705") : secret "plugin-serving-cert" not found Mar 10 22:05:23 crc kubenswrapper[4919]: I0310 22:05:23.206132 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/357040e9-d683-4de3-bf54-c414218b1705-nginx-conf\") pod \"nmstate-console-plugin-5dcbbd79cf-wv4hr\" (UID: \"357040e9-d683-4de3-bf54-c414218b1705\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-wv4hr" Mar 10 22:05:23 crc kubenswrapper[4919]: I0310 22:05:23.215238 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-5d7868ff7-zzgt9"] Mar 10 22:05:23 crc kubenswrapper[4919]: I0310 22:05:23.216023 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5d7868ff7-zzgt9" Mar 10 22:05:23 crc kubenswrapper[4919]: I0310 22:05:23.232102 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-whpkz" Mar 10 22:05:23 crc kubenswrapper[4919]: I0310 22:05:23.238854 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hv74n\" (UniqueName: \"kubernetes.io/projected/357040e9-d683-4de3-bf54-c414218b1705-kube-api-access-hv74n\") pod \"nmstate-console-plugin-5dcbbd79cf-wv4hr\" (UID: \"357040e9-d683-4de3-bf54-c414218b1705\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-wv4hr" Mar 10 22:05:23 crc kubenswrapper[4919]: I0310 22:05:23.242920 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5d7868ff7-zzgt9"] Mar 10 22:05:23 crc kubenswrapper[4919]: I0310 22:05:23.306198 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4ae25316-c3f6-47a2-b891-372d6c7117a3-trusted-ca-bundle\") pod \"console-5d7868ff7-zzgt9\" (UID: \"4ae25316-c3f6-47a2-b891-372d6c7117a3\") " pod="openshift-console/console-5d7868ff7-zzgt9" Mar 10 22:05:23 crc kubenswrapper[4919]: I0310 22:05:23.306279 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4ae25316-c3f6-47a2-b891-372d6c7117a3-service-ca\") pod \"console-5d7868ff7-zzgt9\" (UID: \"4ae25316-c3f6-47a2-b891-372d6c7117a3\") " pod="openshift-console/console-5d7868ff7-zzgt9" Mar 10 22:05:23 crc kubenswrapper[4919]: I0310 22:05:23.306305 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4ae25316-c3f6-47a2-b891-372d6c7117a3-oauth-serving-cert\") pod \"console-5d7868ff7-zzgt9\" (UID: \"4ae25316-c3f6-47a2-b891-372d6c7117a3\") " pod="openshift-console/console-5d7868ff7-zzgt9" Mar 10 22:05:23 crc kubenswrapper[4919]: I0310 22:05:23.306321 4919 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wfwjc\" (UniqueName: \"kubernetes.io/projected/4ae25316-c3f6-47a2-b891-372d6c7117a3-kube-api-access-wfwjc\") pod \"console-5d7868ff7-zzgt9\" (UID: \"4ae25316-c3f6-47a2-b891-372d6c7117a3\") " pod="openshift-console/console-5d7868ff7-zzgt9" Mar 10 22:05:23 crc kubenswrapper[4919]: I0310 22:05:23.306380 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4ae25316-c3f6-47a2-b891-372d6c7117a3-console-oauth-config\") pod \"console-5d7868ff7-zzgt9\" (UID: \"4ae25316-c3f6-47a2-b891-372d6c7117a3\") " pod="openshift-console/console-5d7868ff7-zzgt9" Mar 10 22:05:23 crc kubenswrapper[4919]: I0310 22:05:23.306443 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4ae25316-c3f6-47a2-b891-372d6c7117a3-console-serving-cert\") pod \"console-5d7868ff7-zzgt9\" (UID: \"4ae25316-c3f6-47a2-b891-372d6c7117a3\") " pod="openshift-console/console-5d7868ff7-zzgt9" Mar 10 22:05:23 crc kubenswrapper[4919]: I0310 22:05:23.306463 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4ae25316-c3f6-47a2-b891-372d6c7117a3-console-config\") pod \"console-5d7868ff7-zzgt9\" (UID: \"4ae25316-c3f6-47a2-b891-372d6c7117a3\") " pod="openshift-console/console-5d7868ff7-zzgt9" Mar 10 22:05:23 crc kubenswrapper[4919]: I0310 22:05:23.407377 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4ae25316-c3f6-47a2-b891-372d6c7117a3-console-serving-cert\") pod \"console-5d7868ff7-zzgt9\" (UID: \"4ae25316-c3f6-47a2-b891-372d6c7117a3\") " pod="openshift-console/console-5d7868ff7-zzgt9" Mar 10 22:05:23 crc kubenswrapper[4919]: 
I0310 22:05:23.407426 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4ae25316-c3f6-47a2-b891-372d6c7117a3-console-config\") pod \"console-5d7868ff7-zzgt9\" (UID: \"4ae25316-c3f6-47a2-b891-372d6c7117a3\") " pod="openshift-console/console-5d7868ff7-zzgt9" Mar 10 22:05:23 crc kubenswrapper[4919]: I0310 22:05:23.407468 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4ae25316-c3f6-47a2-b891-372d6c7117a3-trusted-ca-bundle\") pod \"console-5d7868ff7-zzgt9\" (UID: \"4ae25316-c3f6-47a2-b891-372d6c7117a3\") " pod="openshift-console/console-5d7868ff7-zzgt9" Mar 10 22:05:23 crc kubenswrapper[4919]: I0310 22:05:23.407502 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4ae25316-c3f6-47a2-b891-372d6c7117a3-service-ca\") pod \"console-5d7868ff7-zzgt9\" (UID: \"4ae25316-c3f6-47a2-b891-372d6c7117a3\") " pod="openshift-console/console-5d7868ff7-zzgt9" Mar 10 22:05:23 crc kubenswrapper[4919]: I0310 22:05:23.407531 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4ae25316-c3f6-47a2-b891-372d6c7117a3-oauth-serving-cert\") pod \"console-5d7868ff7-zzgt9\" (UID: \"4ae25316-c3f6-47a2-b891-372d6c7117a3\") " pod="openshift-console/console-5d7868ff7-zzgt9" Mar 10 22:05:23 crc kubenswrapper[4919]: I0310 22:05:23.407545 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wfwjc\" (UniqueName: \"kubernetes.io/projected/4ae25316-c3f6-47a2-b891-372d6c7117a3-kube-api-access-wfwjc\") pod \"console-5d7868ff7-zzgt9\" (UID: \"4ae25316-c3f6-47a2-b891-372d6c7117a3\") " pod="openshift-console/console-5d7868ff7-zzgt9" Mar 10 22:05:23 crc kubenswrapper[4919]: I0310 22:05:23.407585 4919 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4ae25316-c3f6-47a2-b891-372d6c7117a3-console-oauth-config\") pod \"console-5d7868ff7-zzgt9\" (UID: \"4ae25316-c3f6-47a2-b891-372d6c7117a3\") " pod="openshift-console/console-5d7868ff7-zzgt9" Mar 10 22:05:23 crc kubenswrapper[4919]: I0310 22:05:23.408672 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4ae25316-c3f6-47a2-b891-372d6c7117a3-service-ca\") pod \"console-5d7868ff7-zzgt9\" (UID: \"4ae25316-c3f6-47a2-b891-372d6c7117a3\") " pod="openshift-console/console-5d7868ff7-zzgt9" Mar 10 22:05:23 crc kubenswrapper[4919]: I0310 22:05:23.408701 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4ae25316-c3f6-47a2-b891-372d6c7117a3-trusted-ca-bundle\") pod \"console-5d7868ff7-zzgt9\" (UID: \"4ae25316-c3f6-47a2-b891-372d6c7117a3\") " pod="openshift-console/console-5d7868ff7-zzgt9" Mar 10 22:05:23 crc kubenswrapper[4919]: I0310 22:05:23.408716 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4ae25316-c3f6-47a2-b891-372d6c7117a3-oauth-serving-cert\") pod \"console-5d7868ff7-zzgt9\" (UID: \"4ae25316-c3f6-47a2-b891-372d6c7117a3\") " pod="openshift-console/console-5d7868ff7-zzgt9" Mar 10 22:05:23 crc kubenswrapper[4919]: I0310 22:05:23.410122 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4ae25316-c3f6-47a2-b891-372d6c7117a3-console-config\") pod \"console-5d7868ff7-zzgt9\" (UID: \"4ae25316-c3f6-47a2-b891-372d6c7117a3\") " pod="openshift-console/console-5d7868ff7-zzgt9" Mar 10 22:05:23 crc kubenswrapper[4919]: I0310 22:05:23.411872 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4ae25316-c3f6-47a2-b891-372d6c7117a3-console-oauth-config\") pod \"console-5d7868ff7-zzgt9\" (UID: \"4ae25316-c3f6-47a2-b891-372d6c7117a3\") " pod="openshift-console/console-5d7868ff7-zzgt9" Mar 10 22:05:23 crc kubenswrapper[4919]: I0310 22:05:23.412873 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4ae25316-c3f6-47a2-b891-372d6c7117a3-console-serving-cert\") pod \"console-5d7868ff7-zzgt9\" (UID: \"4ae25316-c3f6-47a2-b891-372d6c7117a3\") " pod="openshift-console/console-5d7868ff7-zzgt9" Mar 10 22:05:23 crc kubenswrapper[4919]: I0310 22:05:23.427474 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wfwjc\" (UniqueName: \"kubernetes.io/projected/4ae25316-c3f6-47a2-b891-372d6c7117a3-kube-api-access-wfwjc\") pod \"console-5d7868ff7-zzgt9\" (UID: \"4ae25316-c3f6-47a2-b891-372d6c7117a3\") " pod="openshift-console/console-5d7868ff7-zzgt9" Mar 10 22:05:23 crc kubenswrapper[4919]: I0310 22:05:23.442370 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-69594cc75-h5rdm"] Mar 10 22:05:23 crc kubenswrapper[4919]: I0310 22:05:23.571265 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5d7868ff7-zzgt9" Mar 10 22:05:23 crc kubenswrapper[4919]: I0310 22:05:23.609176 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/ff569aa2-f933-44e4-bd70-ba0ff19efd02-tls-key-pair\") pod \"nmstate-webhook-786f45cff4-n9hfz\" (UID: \"ff569aa2-f933-44e4-bd70-ba0ff19efd02\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-n9hfz" Mar 10 22:05:23 crc kubenswrapper[4919]: I0310 22:05:23.613054 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/ff569aa2-f933-44e4-bd70-ba0ff19efd02-tls-key-pair\") pod \"nmstate-webhook-786f45cff4-n9hfz\" (UID: \"ff569aa2-f933-44e4-bd70-ba0ff19efd02\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-n9hfz" Mar 10 22:05:23 crc kubenswrapper[4919]: I0310 22:05:23.674433 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-69594cc75-h5rdm" event={"ID":"556dc7e4-4c98-4f59-8e60-a7997f766706","Type":"ContainerStarted","Data":"a1ba6fa0174374d72e19d809883bf012ca4bd597c8ac9572708f7fce154731f2"} Mar 10 22:05:23 crc kubenswrapper[4919]: I0310 22:05:23.675907 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-whpkz" event={"ID":"d7074097-fdce-4f8e-b343-2c1196f997a2","Type":"ContainerStarted","Data":"b78bead5fe65bdf3329f745f0af8ec8815a4adfda5c7a6057003614b7d1d5e46"} Mar 10 22:05:23 crc kubenswrapper[4919]: I0310 22:05:23.710533 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/357040e9-d683-4de3-bf54-c414218b1705-plugin-serving-cert\") pod \"nmstate-console-plugin-5dcbbd79cf-wv4hr\" (UID: \"357040e9-d683-4de3-bf54-c414218b1705\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-wv4hr" Mar 10 22:05:23 crc kubenswrapper[4919]: I0310 22:05:23.714503 4919 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/357040e9-d683-4de3-bf54-c414218b1705-plugin-serving-cert\") pod \"nmstate-console-plugin-5dcbbd79cf-wv4hr\" (UID: \"357040e9-d683-4de3-bf54-c414218b1705\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-wv4hr" Mar 10 22:05:23 crc kubenswrapper[4919]: I0310 22:05:23.802382 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-786f45cff4-n9hfz" Mar 10 22:05:23 crc kubenswrapper[4919]: I0310 22:05:23.927197 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-wv4hr" Mar 10 22:05:24 crc kubenswrapper[4919]: I0310 22:05:24.002371 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5d7868ff7-zzgt9"] Mar 10 22:05:24 crc kubenswrapper[4919]: W0310 22:05:24.019095 4919 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4ae25316_c3f6_47a2_b891_372d6c7117a3.slice/crio-2e0d5955804b4ec373a030c8142670c9816413486ec046f3abccb453854d878e WatchSource:0}: Error finding container 2e0d5955804b4ec373a030c8142670c9816413486ec046f3abccb453854d878e: Status 404 returned error can't find the container with id 2e0d5955804b4ec373a030c8142670c9816413486ec046f3abccb453854d878e Mar 10 22:05:24 crc kubenswrapper[4919]: I0310 22:05:24.044853 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-786f45cff4-n9hfz"] Mar 10 22:05:24 crc kubenswrapper[4919]: W0310 22:05:24.049900 4919 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podff569aa2_f933_44e4_bd70_ba0ff19efd02.slice/crio-277e6bd258b8d14a4cd2553865e0700632be6144a8492e88636bba2628ed39e8 WatchSource:0}: Error finding container 
277e6bd258b8d14a4cd2553865e0700632be6144a8492e88636bba2628ed39e8: Status 404 returned error can't find the container with id 277e6bd258b8d14a4cd2553865e0700632be6144a8492e88636bba2628ed39e8 Mar 10 22:05:24 crc kubenswrapper[4919]: I0310 22:05:24.120634 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-wv4hr"] Mar 10 22:05:24 crc kubenswrapper[4919]: I0310 22:05:24.681951 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-wv4hr" event={"ID":"357040e9-d683-4de3-bf54-c414218b1705","Type":"ContainerStarted","Data":"45b008e2fd195eae58a61cc3313198768e27e166c0ffa61c64d816a631b9bb7d"} Mar 10 22:05:24 crc kubenswrapper[4919]: I0310 22:05:24.682793 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-786f45cff4-n9hfz" event={"ID":"ff569aa2-f933-44e4-bd70-ba0ff19efd02","Type":"ContainerStarted","Data":"277e6bd258b8d14a4cd2553865e0700632be6144a8492e88636bba2628ed39e8"} Mar 10 22:05:24 crc kubenswrapper[4919]: I0310 22:05:24.684163 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5d7868ff7-zzgt9" event={"ID":"4ae25316-c3f6-47a2-b891-372d6c7117a3","Type":"ContainerStarted","Data":"2a6bd2b3edbb2d84499ae533a430f13659a1bea0672b561faf9af7892955867b"} Mar 10 22:05:24 crc kubenswrapper[4919]: I0310 22:05:24.684188 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5d7868ff7-zzgt9" event={"ID":"4ae25316-c3f6-47a2-b891-372d6c7117a3","Type":"ContainerStarted","Data":"2e0d5955804b4ec373a030c8142670c9816413486ec046f3abccb453854d878e"} Mar 10 22:05:24 crc kubenswrapper[4919]: I0310 22:05:24.698246 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5d7868ff7-zzgt9" podStartSLOduration=1.698236034 podStartE2EDuration="1.698236034s" podCreationTimestamp="2026-03-10 22:05:23 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 22:05:24.696710803 +0000 UTC m=+911.938591411" watchObservedRunningTime="2026-03-10 22:05:24.698236034 +0000 UTC m=+911.940116642" Mar 10 22:05:25 crc kubenswrapper[4919]: I0310 22:05:25.387027 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-6z4j5" Mar 10 22:05:25 crc kubenswrapper[4919]: I0310 22:05:25.427552 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-6z4j5" Mar 10 22:05:25 crc kubenswrapper[4919]: I0310 22:05:25.613702 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6z4j5"] Mar 10 22:05:25 crc kubenswrapper[4919]: I0310 22:05:25.692756 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-69594cc75-h5rdm" event={"ID":"556dc7e4-4c98-4f59-8e60-a7997f766706","Type":"ContainerStarted","Data":"f7cebef805e396cc33f81d9f08f73782b034b967fd8af756a27866a4d2692347"} Mar 10 22:05:25 crc kubenswrapper[4919]: I0310 22:05:25.694276 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-786f45cff4-n9hfz" event={"ID":"ff569aa2-f933-44e4-bd70-ba0ff19efd02","Type":"ContainerStarted","Data":"3f8f48becf42f07e007cc5362d2dd88efd723efc26056ed965ed74a216b1ea72"} Mar 10 22:05:25 crc kubenswrapper[4919]: I0310 22:05:25.713530 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-786f45cff4-n9hfz" podStartSLOduration=2.283002982 podStartE2EDuration="3.713510201s" podCreationTimestamp="2026-03-10 22:05:22 +0000 UTC" firstStartedPulling="2026-03-10 22:05:24.052532113 +0000 UTC m=+911.294412721" lastFinishedPulling="2026-03-10 22:05:25.483039302 +0000 UTC m=+912.724919940" observedRunningTime="2026-03-10 22:05:25.708938578 +0000 UTC 
m=+912.950819206" watchObservedRunningTime="2026-03-10 22:05:25.713510201 +0000 UTC m=+912.955390819" Mar 10 22:05:26 crc kubenswrapper[4919]: I0310 22:05:26.703455 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-whpkz" event={"ID":"d7074097-fdce-4f8e-b343-2c1196f997a2","Type":"ContainerStarted","Data":"e2ccbd61e9d1dff0ad8db254d9c61d5a5925938c2705b6e86d7fe46c2ac61112"} Mar 10 22:05:26 crc kubenswrapper[4919]: I0310 22:05:26.704247 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-whpkz" Mar 10 22:05:26 crc kubenswrapper[4919]: I0310 22:05:26.707073 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-wv4hr" event={"ID":"357040e9-d683-4de3-bf54-c414218b1705","Type":"ContainerStarted","Data":"7cc7e79a0ed26eadeef372e5c9b7308f9ab193290f3009f73cc6bb376061cbea"} Mar 10 22:05:26 crc kubenswrapper[4919]: I0310 22:05:26.707280 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-6z4j5" podUID="5067da25-e1cc-425b-b62c-9f03c4cc37fb" containerName="registry-server" containerID="cri-o://7e3a5541b4966d536e6fb5d8e0f1377a792fd63b01f9bdc58dbba3a0abc22b99" gracePeriod=2 Mar 10 22:05:26 crc kubenswrapper[4919]: I0310 22:05:26.707691 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-786f45cff4-n9hfz" Mar 10 22:05:26 crc kubenswrapper[4919]: I0310 22:05:26.725534 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-whpkz" podStartSLOduration=2.501503837 podStartE2EDuration="4.72551304s" podCreationTimestamp="2026-03-10 22:05:22 +0000 UTC" firstStartedPulling="2026-03-10 22:05:23.256653505 +0000 UTC m=+910.498534123" lastFinishedPulling="2026-03-10 22:05:25.480662718 +0000 UTC m=+912.722543326" observedRunningTime="2026-03-10 22:05:26.719723363 
+0000 UTC m=+913.961604011" watchObservedRunningTime="2026-03-10 22:05:26.72551304 +0000 UTC m=+913.967393668" Mar 10 22:05:26 crc kubenswrapper[4919]: I0310 22:05:26.738507 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-wv4hr" podStartSLOduration=1.531023057 podStartE2EDuration="3.738487171s" podCreationTimestamp="2026-03-10 22:05:23 +0000 UTC" firstStartedPulling="2026-03-10 22:05:24.131315436 +0000 UTC m=+911.373196044" lastFinishedPulling="2026-03-10 22:05:26.33877955 +0000 UTC m=+913.580660158" observedRunningTime="2026-03-10 22:05:26.736050015 +0000 UTC m=+913.977930623" watchObservedRunningTime="2026-03-10 22:05:26.738487171 +0000 UTC m=+913.980367789" Mar 10 22:05:27 crc kubenswrapper[4919]: I0310 22:05:27.097935 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6z4j5" Mar 10 22:05:27 crc kubenswrapper[4919]: I0310 22:05:27.267252 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5067da25-e1cc-425b-b62c-9f03c4cc37fb-catalog-content\") pod \"5067da25-e1cc-425b-b62c-9f03c4cc37fb\" (UID: \"5067da25-e1cc-425b-b62c-9f03c4cc37fb\") " Mar 10 22:05:27 crc kubenswrapper[4919]: I0310 22:05:27.267359 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7g7rx\" (UniqueName: \"kubernetes.io/projected/5067da25-e1cc-425b-b62c-9f03c4cc37fb-kube-api-access-7g7rx\") pod \"5067da25-e1cc-425b-b62c-9f03c4cc37fb\" (UID: \"5067da25-e1cc-425b-b62c-9f03c4cc37fb\") " Mar 10 22:05:27 crc kubenswrapper[4919]: I0310 22:05:27.267497 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5067da25-e1cc-425b-b62c-9f03c4cc37fb-utilities\") pod \"5067da25-e1cc-425b-b62c-9f03c4cc37fb\" (UID: 
\"5067da25-e1cc-425b-b62c-9f03c4cc37fb\") " Mar 10 22:05:27 crc kubenswrapper[4919]: I0310 22:05:27.268817 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5067da25-e1cc-425b-b62c-9f03c4cc37fb-utilities" (OuterVolumeSpecName: "utilities") pod "5067da25-e1cc-425b-b62c-9f03c4cc37fb" (UID: "5067da25-e1cc-425b-b62c-9f03c4cc37fb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 22:05:27 crc kubenswrapper[4919]: I0310 22:05:27.283483 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5067da25-e1cc-425b-b62c-9f03c4cc37fb-kube-api-access-7g7rx" (OuterVolumeSpecName: "kube-api-access-7g7rx") pod "5067da25-e1cc-425b-b62c-9f03c4cc37fb" (UID: "5067da25-e1cc-425b-b62c-9f03c4cc37fb"). InnerVolumeSpecName "kube-api-access-7g7rx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 22:05:27 crc kubenswrapper[4919]: I0310 22:05:27.369675 4919 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5067da25-e1cc-425b-b62c-9f03c4cc37fb-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 22:05:27 crc kubenswrapper[4919]: I0310 22:05:27.369716 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7g7rx\" (UniqueName: \"kubernetes.io/projected/5067da25-e1cc-425b-b62c-9f03c4cc37fb-kube-api-access-7g7rx\") on node \"crc\" DevicePath \"\"" Mar 10 22:05:27 crc kubenswrapper[4919]: I0310 22:05:27.435030 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5067da25-e1cc-425b-b62c-9f03c4cc37fb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5067da25-e1cc-425b-b62c-9f03c4cc37fb" (UID: "5067da25-e1cc-425b-b62c-9f03c4cc37fb"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 22:05:27 crc kubenswrapper[4919]: I0310 22:05:27.471483 4919 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5067da25-e1cc-425b-b62c-9f03c4cc37fb-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 22:05:27 crc kubenswrapper[4919]: I0310 22:05:27.716350 4919 generic.go:334] "Generic (PLEG): container finished" podID="5067da25-e1cc-425b-b62c-9f03c4cc37fb" containerID="7e3a5541b4966d536e6fb5d8e0f1377a792fd63b01f9bdc58dbba3a0abc22b99" exitCode=0 Mar 10 22:05:27 crc kubenswrapper[4919]: I0310 22:05:27.716500 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6z4j5" event={"ID":"5067da25-e1cc-425b-b62c-9f03c4cc37fb","Type":"ContainerDied","Data":"7e3a5541b4966d536e6fb5d8e0f1377a792fd63b01f9bdc58dbba3a0abc22b99"} Mar 10 22:05:27 crc kubenswrapper[4919]: I0310 22:05:27.716564 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6z4j5" event={"ID":"5067da25-e1cc-425b-b62c-9f03c4cc37fb","Type":"ContainerDied","Data":"ca069e38619a3a150f3b3cc88fd9739a58cef4e25fa37d584aeed5a3c69ccebc"} Mar 10 22:05:27 crc kubenswrapper[4919]: I0310 22:05:27.716519 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6z4j5" Mar 10 22:05:27 crc kubenswrapper[4919]: I0310 22:05:27.716608 4919 scope.go:117] "RemoveContainer" containerID="7e3a5541b4966d536e6fb5d8e0f1377a792fd63b01f9bdc58dbba3a0abc22b99" Mar 10 22:05:27 crc kubenswrapper[4919]: I0310 22:05:27.737616 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6z4j5"] Mar 10 22:05:27 crc kubenswrapper[4919]: I0310 22:05:27.739113 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-6z4j5"] Mar 10 22:05:27 crc kubenswrapper[4919]: I0310 22:05:27.742963 4919 scope.go:117] "RemoveContainer" containerID="304e4361432a873607b5af737d31477ec0acae5a46748ea5a1a81f102e989522" Mar 10 22:05:27 crc kubenswrapper[4919]: I0310 22:05:27.757323 4919 scope.go:117] "RemoveContainer" containerID="adea41f563b2129c7164006541146dd42623fcea0671b9317611df88f5185d22" Mar 10 22:05:27 crc kubenswrapper[4919]: I0310 22:05:27.775328 4919 scope.go:117] "RemoveContainer" containerID="7e3a5541b4966d536e6fb5d8e0f1377a792fd63b01f9bdc58dbba3a0abc22b99" Mar 10 22:05:27 crc kubenswrapper[4919]: E0310 22:05:27.775788 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e3a5541b4966d536e6fb5d8e0f1377a792fd63b01f9bdc58dbba3a0abc22b99\": container with ID starting with 7e3a5541b4966d536e6fb5d8e0f1377a792fd63b01f9bdc58dbba3a0abc22b99 not found: ID does not exist" containerID="7e3a5541b4966d536e6fb5d8e0f1377a792fd63b01f9bdc58dbba3a0abc22b99" Mar 10 22:05:27 crc kubenswrapper[4919]: I0310 22:05:27.775816 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e3a5541b4966d536e6fb5d8e0f1377a792fd63b01f9bdc58dbba3a0abc22b99"} err="failed to get container status \"7e3a5541b4966d536e6fb5d8e0f1377a792fd63b01f9bdc58dbba3a0abc22b99\": rpc error: code = NotFound desc = could not find container 
\"7e3a5541b4966d536e6fb5d8e0f1377a792fd63b01f9bdc58dbba3a0abc22b99\": container with ID starting with 7e3a5541b4966d536e6fb5d8e0f1377a792fd63b01f9bdc58dbba3a0abc22b99 not found: ID does not exist" Mar 10 22:05:27 crc kubenswrapper[4919]: I0310 22:05:27.775834 4919 scope.go:117] "RemoveContainer" containerID="304e4361432a873607b5af737d31477ec0acae5a46748ea5a1a81f102e989522" Mar 10 22:05:27 crc kubenswrapper[4919]: E0310 22:05:27.776115 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"304e4361432a873607b5af737d31477ec0acae5a46748ea5a1a81f102e989522\": container with ID starting with 304e4361432a873607b5af737d31477ec0acae5a46748ea5a1a81f102e989522 not found: ID does not exist" containerID="304e4361432a873607b5af737d31477ec0acae5a46748ea5a1a81f102e989522" Mar 10 22:05:27 crc kubenswrapper[4919]: I0310 22:05:27.776163 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"304e4361432a873607b5af737d31477ec0acae5a46748ea5a1a81f102e989522"} err="failed to get container status \"304e4361432a873607b5af737d31477ec0acae5a46748ea5a1a81f102e989522\": rpc error: code = NotFound desc = could not find container \"304e4361432a873607b5af737d31477ec0acae5a46748ea5a1a81f102e989522\": container with ID starting with 304e4361432a873607b5af737d31477ec0acae5a46748ea5a1a81f102e989522 not found: ID does not exist" Mar 10 22:05:27 crc kubenswrapper[4919]: I0310 22:05:27.776194 4919 scope.go:117] "RemoveContainer" containerID="adea41f563b2129c7164006541146dd42623fcea0671b9317611df88f5185d22" Mar 10 22:05:27 crc kubenswrapper[4919]: E0310 22:05:27.776951 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"adea41f563b2129c7164006541146dd42623fcea0671b9317611df88f5185d22\": container with ID starting with adea41f563b2129c7164006541146dd42623fcea0671b9317611df88f5185d22 not found: ID does not exist" 
containerID="adea41f563b2129c7164006541146dd42623fcea0671b9317611df88f5185d22" Mar 10 22:05:27 crc kubenswrapper[4919]: I0310 22:05:27.776974 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"adea41f563b2129c7164006541146dd42623fcea0671b9317611df88f5185d22"} err="failed to get container status \"adea41f563b2129c7164006541146dd42623fcea0671b9317611df88f5185d22\": rpc error: code = NotFound desc = could not find container \"adea41f563b2129c7164006541146dd42623fcea0671b9317611df88f5185d22\": container with ID starting with adea41f563b2129c7164006541146dd42623fcea0671b9317611df88f5185d22 not found: ID does not exist" Mar 10 22:05:28 crc kubenswrapper[4919]: I0310 22:05:28.746712 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-69594cc75-h5rdm" event={"ID":"556dc7e4-4c98-4f59-8e60-a7997f766706","Type":"ContainerStarted","Data":"9b3dbceb16cc02c88955e18576570a46c4f01fe2a252591bb6b0625f5f2b9293"} Mar 10 22:05:28 crc kubenswrapper[4919]: I0310 22:05:28.781930 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-69594cc75-h5rdm" podStartSLOduration=1.986395372 podStartE2EDuration="6.781795621s" podCreationTimestamp="2026-03-10 22:05:22 +0000 UTC" firstStartedPulling="2026-03-10 22:05:23.45183602 +0000 UTC m=+910.693716628" lastFinishedPulling="2026-03-10 22:05:28.247236279 +0000 UTC m=+915.489116877" observedRunningTime="2026-03-10 22:05:28.778907712 +0000 UTC m=+916.020788400" watchObservedRunningTime="2026-03-10 22:05:28.781795621 +0000 UTC m=+916.023676259" Mar 10 22:05:29 crc kubenswrapper[4919]: I0310 22:05:29.494566 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5067da25-e1cc-425b-b62c-9f03c4cc37fb" path="/var/lib/kubelet/pods/5067da25-e1cc-425b-b62c-9f03c4cc37fb/volumes" Mar 10 22:05:33 crc kubenswrapper[4919]: I0310 22:05:33.269646 4919 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-whpkz" Mar 10 22:05:33 crc kubenswrapper[4919]: I0310 22:05:33.572086 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-5d7868ff7-zzgt9" Mar 10 22:05:33 crc kubenswrapper[4919]: I0310 22:05:33.572167 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5d7868ff7-zzgt9" Mar 10 22:05:33 crc kubenswrapper[4919]: I0310 22:05:33.580613 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5d7868ff7-zzgt9" Mar 10 22:05:33 crc kubenswrapper[4919]: I0310 22:05:33.784819 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5d7868ff7-zzgt9" Mar 10 22:05:33 crc kubenswrapper[4919]: I0310 22:05:33.861296 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-58nxf"] Mar 10 22:05:43 crc kubenswrapper[4919]: I0310 22:05:43.809246 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-786f45cff4-n9hfz" Mar 10 22:05:58 crc kubenswrapper[4919]: I0310 22:05:58.223710 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4d2csv"] Mar 10 22:05:58 crc kubenswrapper[4919]: E0310 22:05:58.224489 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5067da25-e1cc-425b-b62c-9f03c4cc37fb" containerName="extract-utilities" Mar 10 22:05:58 crc kubenswrapper[4919]: I0310 22:05:58.224505 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="5067da25-e1cc-425b-b62c-9f03c4cc37fb" containerName="extract-utilities" Mar 10 22:05:58 crc kubenswrapper[4919]: E0310 22:05:58.224527 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5067da25-e1cc-425b-b62c-9f03c4cc37fb" containerName="registry-server" Mar 10 
22:05:58 crc kubenswrapper[4919]: I0310 22:05:58.224534 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="5067da25-e1cc-425b-b62c-9f03c4cc37fb" containerName="registry-server" Mar 10 22:05:58 crc kubenswrapper[4919]: E0310 22:05:58.224549 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5067da25-e1cc-425b-b62c-9f03c4cc37fb" containerName="extract-content" Mar 10 22:05:58 crc kubenswrapper[4919]: I0310 22:05:58.224556 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="5067da25-e1cc-425b-b62c-9f03c4cc37fb" containerName="extract-content" Mar 10 22:05:58 crc kubenswrapper[4919]: I0310 22:05:58.224681 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="5067da25-e1cc-425b-b62c-9f03c4cc37fb" containerName="registry-server" Mar 10 22:05:58 crc kubenswrapper[4919]: I0310 22:05:58.225360 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4d2csv" Mar 10 22:05:58 crc kubenswrapper[4919]: I0310 22:05:58.227676 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 10 22:05:58 crc kubenswrapper[4919]: I0310 22:05:58.236113 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4d2csv"] Mar 10 22:05:58 crc kubenswrapper[4919]: I0310 22:05:58.418213 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6a1cfdf8-3455-43cb-9462-f4ad3632c7c6-util\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4d2csv\" (UID: \"6a1cfdf8-3455-43cb-9462-f4ad3632c7c6\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4d2csv" Mar 10 22:05:58 crc kubenswrapper[4919]: I0310 22:05:58.418321 4919 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6a1cfdf8-3455-43cb-9462-f4ad3632c7c6-bundle\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4d2csv\" (UID: \"6a1cfdf8-3455-43cb-9462-f4ad3632c7c6\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4d2csv" Mar 10 22:05:58 crc kubenswrapper[4919]: I0310 22:05:58.418361 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fxj9v\" (UniqueName: \"kubernetes.io/projected/6a1cfdf8-3455-43cb-9462-f4ad3632c7c6-kube-api-access-fxj9v\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4d2csv\" (UID: \"6a1cfdf8-3455-43cb-9462-f4ad3632c7c6\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4d2csv" Mar 10 22:05:58 crc kubenswrapper[4919]: I0310 22:05:58.519772 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6a1cfdf8-3455-43cb-9462-f4ad3632c7c6-util\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4d2csv\" (UID: \"6a1cfdf8-3455-43cb-9462-f4ad3632c7c6\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4d2csv" Mar 10 22:05:58 crc kubenswrapper[4919]: I0310 22:05:58.519895 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6a1cfdf8-3455-43cb-9462-f4ad3632c7c6-bundle\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4d2csv\" (UID: \"6a1cfdf8-3455-43cb-9462-f4ad3632c7c6\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4d2csv" Mar 10 22:05:58 crc kubenswrapper[4919]: I0310 22:05:58.519946 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fxj9v\" (UniqueName: 
\"kubernetes.io/projected/6a1cfdf8-3455-43cb-9462-f4ad3632c7c6-kube-api-access-fxj9v\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4d2csv\" (UID: \"6a1cfdf8-3455-43cb-9462-f4ad3632c7c6\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4d2csv" Mar 10 22:05:58 crc kubenswrapper[4919]: I0310 22:05:58.520534 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6a1cfdf8-3455-43cb-9462-f4ad3632c7c6-util\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4d2csv\" (UID: \"6a1cfdf8-3455-43cb-9462-f4ad3632c7c6\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4d2csv" Mar 10 22:05:58 crc kubenswrapper[4919]: I0310 22:05:58.520659 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6a1cfdf8-3455-43cb-9462-f4ad3632c7c6-bundle\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4d2csv\" (UID: \"6a1cfdf8-3455-43cb-9462-f4ad3632c7c6\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4d2csv" Mar 10 22:05:58 crc kubenswrapper[4919]: I0310 22:05:58.549019 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fxj9v\" (UniqueName: \"kubernetes.io/projected/6a1cfdf8-3455-43cb-9462-f4ad3632c7c6-kube-api-access-fxj9v\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4d2csv\" (UID: \"6a1cfdf8-3455-43cb-9462-f4ad3632c7c6\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4d2csv" Mar 10 22:05:58 crc kubenswrapper[4919]: I0310 22:05:58.840558 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4d2csv"
Mar 10 22:05:58 crc kubenswrapper[4919]: I0310 22:05:58.906965 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-58nxf" podUID="9657873d-9275-4945-9e91-0b2c2844ae5d" containerName="console" containerID="cri-o://49dea5373bf4345ccbd7205fbe849d1a9cb7190d51663ab0cba4be26e3915360" gracePeriod=15
Mar 10 22:05:59 crc kubenswrapper[4919]: I0310 22:05:59.131867 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4d2csv"]
Mar 10 22:05:59 crc kubenswrapper[4919]: I0310 22:05:59.175820 4919 patch_prober.go:28] interesting pod/machine-config-daemon-z7v4t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 10 22:05:59 crc kubenswrapper[4919]: I0310 22:05:59.175878 4919 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 10 22:05:59 crc kubenswrapper[4919]: I0310 22:05:59.235734 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-58nxf_9657873d-9275-4945-9e91-0b2c2844ae5d/console/0.log"
Mar 10 22:05:59 crc kubenswrapper[4919]: I0310 22:05:59.235804 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-58nxf"
Mar 10 22:05:59 crc kubenswrapper[4919]: I0310 22:05:59.332323 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8thhg\" (UniqueName: \"kubernetes.io/projected/9657873d-9275-4945-9e91-0b2c2844ae5d-kube-api-access-8thhg\") pod \"9657873d-9275-4945-9e91-0b2c2844ae5d\" (UID: \"9657873d-9275-4945-9e91-0b2c2844ae5d\") "
Mar 10 22:05:59 crc kubenswrapper[4919]: I0310 22:05:59.332407 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9657873d-9275-4945-9e91-0b2c2844ae5d-service-ca\") pod \"9657873d-9275-4945-9e91-0b2c2844ae5d\" (UID: \"9657873d-9275-4945-9e91-0b2c2844ae5d\") "
Mar 10 22:05:59 crc kubenswrapper[4919]: I0310 22:05:59.332457 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9657873d-9275-4945-9e91-0b2c2844ae5d-console-serving-cert\") pod \"9657873d-9275-4945-9e91-0b2c2844ae5d\" (UID: \"9657873d-9275-4945-9e91-0b2c2844ae5d\") "
Mar 10 22:05:59 crc kubenswrapper[4919]: I0310 22:05:59.332513 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9657873d-9275-4945-9e91-0b2c2844ae5d-oauth-serving-cert\") pod \"9657873d-9275-4945-9e91-0b2c2844ae5d\" (UID: \"9657873d-9275-4945-9e91-0b2c2844ae5d\") "
Mar 10 22:05:59 crc kubenswrapper[4919]: I0310 22:05:59.332547 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9657873d-9275-4945-9e91-0b2c2844ae5d-console-oauth-config\") pod \"9657873d-9275-4945-9e91-0b2c2844ae5d\" (UID: \"9657873d-9275-4945-9e91-0b2c2844ae5d\") "
Mar 10 22:05:59 crc kubenswrapper[4919]: I0310 22:05:59.332625 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9657873d-9275-4945-9e91-0b2c2844ae5d-trusted-ca-bundle\") pod \"9657873d-9275-4945-9e91-0b2c2844ae5d\" (UID: \"9657873d-9275-4945-9e91-0b2c2844ae5d\") "
Mar 10 22:05:59 crc kubenswrapper[4919]: I0310 22:05:59.332656 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9657873d-9275-4945-9e91-0b2c2844ae5d-console-config\") pod \"9657873d-9275-4945-9e91-0b2c2844ae5d\" (UID: \"9657873d-9275-4945-9e91-0b2c2844ae5d\") "
Mar 10 22:05:59 crc kubenswrapper[4919]: I0310 22:05:59.332817 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9657873d-9275-4945-9e91-0b2c2844ae5d-service-ca" (OuterVolumeSpecName: "service-ca") pod "9657873d-9275-4945-9e91-0b2c2844ae5d" (UID: "9657873d-9275-4945-9e91-0b2c2844ae5d"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 22:05:59 crc kubenswrapper[4919]: I0310 22:05:59.332899 4919 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9657873d-9275-4945-9e91-0b2c2844ae5d-service-ca\") on node \"crc\" DevicePath \"\""
Mar 10 22:05:59 crc kubenswrapper[4919]: I0310 22:05:59.333513 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9657873d-9275-4945-9e91-0b2c2844ae5d-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "9657873d-9275-4945-9e91-0b2c2844ae5d" (UID: "9657873d-9275-4945-9e91-0b2c2844ae5d"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 22:05:59 crc kubenswrapper[4919]: I0310 22:05:59.333535 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9657873d-9275-4945-9e91-0b2c2844ae5d-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "9657873d-9275-4945-9e91-0b2c2844ae5d" (UID: "9657873d-9275-4945-9e91-0b2c2844ae5d"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 22:05:59 crc kubenswrapper[4919]: I0310 22:05:59.333657 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9657873d-9275-4945-9e91-0b2c2844ae5d-console-config" (OuterVolumeSpecName: "console-config") pod "9657873d-9275-4945-9e91-0b2c2844ae5d" (UID: "9657873d-9275-4945-9e91-0b2c2844ae5d"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 22:05:59 crc kubenswrapper[4919]: I0310 22:05:59.337435 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9657873d-9275-4945-9e91-0b2c2844ae5d-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "9657873d-9275-4945-9e91-0b2c2844ae5d" (UID: "9657873d-9275-4945-9e91-0b2c2844ae5d"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 22:05:59 crc kubenswrapper[4919]: I0310 22:05:59.337783 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9657873d-9275-4945-9e91-0b2c2844ae5d-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "9657873d-9275-4945-9e91-0b2c2844ae5d" (UID: "9657873d-9275-4945-9e91-0b2c2844ae5d"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 22:05:59 crc kubenswrapper[4919]: I0310 22:05:59.337979 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9657873d-9275-4945-9e91-0b2c2844ae5d-kube-api-access-8thhg" (OuterVolumeSpecName: "kube-api-access-8thhg") pod "9657873d-9275-4945-9e91-0b2c2844ae5d" (UID: "9657873d-9275-4945-9e91-0b2c2844ae5d"). InnerVolumeSpecName "kube-api-access-8thhg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 22:05:59 crc kubenswrapper[4919]: I0310 22:05:59.434411 4919 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9657873d-9275-4945-9e91-0b2c2844ae5d-oauth-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 10 22:05:59 crc kubenswrapper[4919]: I0310 22:05:59.434449 4919 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9657873d-9275-4945-9e91-0b2c2844ae5d-console-oauth-config\") on node \"crc\" DevicePath \"\""
Mar 10 22:05:59 crc kubenswrapper[4919]: I0310 22:05:59.434462 4919 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9657873d-9275-4945-9e91-0b2c2844ae5d-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 10 22:05:59 crc kubenswrapper[4919]: I0310 22:05:59.434474 4919 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9657873d-9275-4945-9e91-0b2c2844ae5d-console-config\") on node \"crc\" DevicePath \"\""
Mar 10 22:05:59 crc kubenswrapper[4919]: I0310 22:05:59.434485 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8thhg\" (UniqueName: \"kubernetes.io/projected/9657873d-9275-4945-9e91-0b2c2844ae5d-kube-api-access-8thhg\") on node \"crc\" DevicePath \"\""
Mar 10 22:05:59 crc kubenswrapper[4919]: I0310 22:05:59.434502 4919 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9657873d-9275-4945-9e91-0b2c2844ae5d-console-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 10 22:05:59 crc kubenswrapper[4919]: I0310 22:05:59.947541 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-58nxf_9657873d-9275-4945-9e91-0b2c2844ae5d/console/0.log"
Mar 10 22:05:59 crc kubenswrapper[4919]: I0310 22:05:59.947862 4919 generic.go:334] "Generic (PLEG): container finished" podID="9657873d-9275-4945-9e91-0b2c2844ae5d" containerID="49dea5373bf4345ccbd7205fbe849d1a9cb7190d51663ab0cba4be26e3915360" exitCode=2
Mar 10 22:05:59 crc kubenswrapper[4919]: I0310 22:05:59.947995 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-58nxf"
Mar 10 22:05:59 crc kubenswrapper[4919]: I0310 22:05:59.948011 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-58nxf" event={"ID":"9657873d-9275-4945-9e91-0b2c2844ae5d","Type":"ContainerDied","Data":"49dea5373bf4345ccbd7205fbe849d1a9cb7190d51663ab0cba4be26e3915360"}
Mar 10 22:05:59 crc kubenswrapper[4919]: I0310 22:05:59.948046 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-58nxf" event={"ID":"9657873d-9275-4945-9e91-0b2c2844ae5d","Type":"ContainerDied","Data":"955ca8edae6c79d599f2bf02aecbfedc235309a0fb0e56cd8be38b4fb21c310e"}
Mar 10 22:05:59 crc kubenswrapper[4919]: I0310 22:05:59.948089 4919 scope.go:117] "RemoveContainer" containerID="49dea5373bf4345ccbd7205fbe849d1a9cb7190d51663ab0cba4be26e3915360"
Mar 10 22:05:59 crc kubenswrapper[4919]: I0310 22:05:59.952792 4919 generic.go:334] "Generic (PLEG): container finished" podID="6a1cfdf8-3455-43cb-9462-f4ad3632c7c6" containerID="caecb98439dfb2e3bf7d157735436de12fbe3841a38c58c404150975312f7ae1" exitCode=0
Mar 10 22:05:59 crc kubenswrapper[4919]: I0310 22:05:59.952841 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4d2csv" event={"ID":"6a1cfdf8-3455-43cb-9462-f4ad3632c7c6","Type":"ContainerDied","Data":"caecb98439dfb2e3bf7d157735436de12fbe3841a38c58c404150975312f7ae1"}
Mar 10 22:05:59 crc kubenswrapper[4919]: I0310 22:05:59.952879 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4d2csv" event={"ID":"6a1cfdf8-3455-43cb-9462-f4ad3632c7c6","Type":"ContainerStarted","Data":"ad655a9c0693d0128dbc41014b631d32c9e32b45f9d2528141ec23bed29df2fe"}
Mar 10 22:05:59 crc kubenswrapper[4919]: I0310 22:05:59.992628 4919 scope.go:117] "RemoveContainer" containerID="49dea5373bf4345ccbd7205fbe849d1a9cb7190d51663ab0cba4be26e3915360"
Mar 10 22:05:59 crc kubenswrapper[4919]: E0310 22:05:59.993448 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"49dea5373bf4345ccbd7205fbe849d1a9cb7190d51663ab0cba4be26e3915360\": container with ID starting with 49dea5373bf4345ccbd7205fbe849d1a9cb7190d51663ab0cba4be26e3915360 not found: ID does not exist" containerID="49dea5373bf4345ccbd7205fbe849d1a9cb7190d51663ab0cba4be26e3915360"
Mar 10 22:05:59 crc kubenswrapper[4919]: I0310 22:05:59.993500 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49dea5373bf4345ccbd7205fbe849d1a9cb7190d51663ab0cba4be26e3915360"} err="failed to get container status \"49dea5373bf4345ccbd7205fbe849d1a9cb7190d51663ab0cba4be26e3915360\": rpc error: code = NotFound desc = could not find container \"49dea5373bf4345ccbd7205fbe849d1a9cb7190d51663ab0cba4be26e3915360\": container with ID starting with 49dea5373bf4345ccbd7205fbe849d1a9cb7190d51663ab0cba4be26e3915360 not found: ID does not exist"
Mar 10 22:05:59 crc kubenswrapper[4919]: I0310 22:05:59.995039 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-58nxf"]
Mar 10 22:06:00 crc kubenswrapper[4919]: I0310 22:06:00.000033 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-58nxf"]
Mar 10 22:06:00 crc kubenswrapper[4919]: I0310 22:06:00.136048 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553006-kvjk2"]
Mar 10 22:06:00 crc kubenswrapper[4919]: E0310 22:06:00.136296 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9657873d-9275-4945-9e91-0b2c2844ae5d" containerName="console"
Mar 10 22:06:00 crc kubenswrapper[4919]: I0310 22:06:00.136310 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="9657873d-9275-4945-9e91-0b2c2844ae5d" containerName="console"
Mar 10 22:06:00 crc kubenswrapper[4919]: I0310 22:06:00.136479 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="9657873d-9275-4945-9e91-0b2c2844ae5d" containerName="console"
Mar 10 22:06:00 crc kubenswrapper[4919]: I0310 22:06:00.136949 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553006-kvjk2"
Mar 10 22:06:00 crc kubenswrapper[4919]: I0310 22:06:00.138213 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wbvtv"
Mar 10 22:06:00 crc kubenswrapper[4919]: I0310 22:06:00.139211 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 10 22:06:00 crc kubenswrapper[4919]: I0310 22:06:00.139710 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 10 22:06:00 crc kubenswrapper[4919]: I0310 22:06:00.194133 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553006-kvjk2"]
Mar 10 22:06:00 crc kubenswrapper[4919]: I0310 22:06:00.244647 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9nldj\" (UniqueName: \"kubernetes.io/projected/1705da29-6c23-4b70-881e-7e8268197e07-kube-api-access-9nldj\") pod \"auto-csr-approver-29553006-kvjk2\" (UID: \"1705da29-6c23-4b70-881e-7e8268197e07\") " pod="openshift-infra/auto-csr-approver-29553006-kvjk2"
Mar 10 22:06:00 crc kubenswrapper[4919]: I0310 22:06:00.345453 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9nldj\" (UniqueName: \"kubernetes.io/projected/1705da29-6c23-4b70-881e-7e8268197e07-kube-api-access-9nldj\") pod \"auto-csr-approver-29553006-kvjk2\" (UID: \"1705da29-6c23-4b70-881e-7e8268197e07\") " pod="openshift-infra/auto-csr-approver-29553006-kvjk2"
Mar 10 22:06:00 crc kubenswrapper[4919]: I0310 22:06:00.364765 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9nldj\" (UniqueName: \"kubernetes.io/projected/1705da29-6c23-4b70-881e-7e8268197e07-kube-api-access-9nldj\") pod \"auto-csr-approver-29553006-kvjk2\" (UID: \"1705da29-6c23-4b70-881e-7e8268197e07\") " pod="openshift-infra/auto-csr-approver-29553006-kvjk2"
Mar 10 22:06:00 crc kubenswrapper[4919]: I0310 22:06:00.500041 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553006-kvjk2"
Mar 10 22:06:00 crc kubenswrapper[4919]: I0310 22:06:00.718600 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553006-kvjk2"]
Mar 10 22:06:00 crc kubenswrapper[4919]: W0310 22:06:00.728241 4919 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1705da29_6c23_4b70_881e_7e8268197e07.slice/crio-5c28dd6423de5aa7d9d46897a202839c1c11e80ac3817103ee51f67bc91fc7a4 WatchSource:0}: Error finding container 5c28dd6423de5aa7d9d46897a202839c1c11e80ac3817103ee51f67bc91fc7a4: Status 404 returned error can't find the container with id 5c28dd6423de5aa7d9d46897a202839c1c11e80ac3817103ee51f67bc91fc7a4
Mar 10 22:06:00 crc kubenswrapper[4919]: I0310 22:06:00.961817 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553006-kvjk2" event={"ID":"1705da29-6c23-4b70-881e-7e8268197e07","Type":"ContainerStarted","Data":"5c28dd6423de5aa7d9d46897a202839c1c11e80ac3817103ee51f67bc91fc7a4"}
Mar 10 22:06:01 crc kubenswrapper[4919]: I0310 22:06:01.493506 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9657873d-9275-4945-9e91-0b2c2844ae5d" path="/var/lib/kubelet/pods/9657873d-9275-4945-9e91-0b2c2844ae5d/volumes"
Mar 10 22:06:01 crc kubenswrapper[4919]: I0310 22:06:01.976314 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4d2csv" event={"ID":"6a1cfdf8-3455-43cb-9462-f4ad3632c7c6","Type":"ContainerStarted","Data":"05e8debb18cea85b36055bc3ad81090c8481c8e93dcb7af51dd0ec101be44f8d"}
Mar 10 22:06:02 crc kubenswrapper[4919]: I0310 22:06:02.985546 4919 generic.go:334] "Generic (PLEG): container finished" podID="1705da29-6c23-4b70-881e-7e8268197e07" containerID="e96f1bee3c9c99f93c624b934359fe8d0bc3b475634cab74960b17882e799afb" exitCode=0
Mar 10 22:06:02 crc kubenswrapper[4919]: I0310 22:06:02.985708 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553006-kvjk2" event={"ID":"1705da29-6c23-4b70-881e-7e8268197e07","Type":"ContainerDied","Data":"e96f1bee3c9c99f93c624b934359fe8d0bc3b475634cab74960b17882e799afb"}
Mar 10 22:06:02 crc kubenswrapper[4919]: I0310 22:06:02.987657 4919 generic.go:334] "Generic (PLEG): container finished" podID="6a1cfdf8-3455-43cb-9462-f4ad3632c7c6" containerID="05e8debb18cea85b36055bc3ad81090c8481c8e93dcb7af51dd0ec101be44f8d" exitCode=0
Mar 10 22:06:02 crc kubenswrapper[4919]: I0310 22:06:02.987687 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4d2csv" event={"ID":"6a1cfdf8-3455-43cb-9462-f4ad3632c7c6","Type":"ContainerDied","Data":"05e8debb18cea85b36055bc3ad81090c8481c8e93dcb7af51dd0ec101be44f8d"}
Mar 10 22:06:03 crc kubenswrapper[4919]: I0310 22:06:03.995695 4919 generic.go:334] "Generic (PLEG): container finished" podID="6a1cfdf8-3455-43cb-9462-f4ad3632c7c6" containerID="2cb50b469318ac1f36105081fd0f3e77feaf9bcf3e139bb2d9886d3791c6a6e7" exitCode=0
Mar 10 22:06:03 crc kubenswrapper[4919]: I0310 22:06:03.995802 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4d2csv" event={"ID":"6a1cfdf8-3455-43cb-9462-f4ad3632c7c6","Type":"ContainerDied","Data":"2cb50b469318ac1f36105081fd0f3e77feaf9bcf3e139bb2d9886d3791c6a6e7"}
Mar 10 22:06:04 crc kubenswrapper[4919]: I0310 22:06:04.277408 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553006-kvjk2"
Mar 10 22:06:04 crc kubenswrapper[4919]: I0310 22:06:04.398222 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9nldj\" (UniqueName: \"kubernetes.io/projected/1705da29-6c23-4b70-881e-7e8268197e07-kube-api-access-9nldj\") pod \"1705da29-6c23-4b70-881e-7e8268197e07\" (UID: \"1705da29-6c23-4b70-881e-7e8268197e07\") "
Mar 10 22:06:04 crc kubenswrapper[4919]: I0310 22:06:04.403578 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1705da29-6c23-4b70-881e-7e8268197e07-kube-api-access-9nldj" (OuterVolumeSpecName: "kube-api-access-9nldj") pod "1705da29-6c23-4b70-881e-7e8268197e07" (UID: "1705da29-6c23-4b70-881e-7e8268197e07"). InnerVolumeSpecName "kube-api-access-9nldj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 22:06:04 crc kubenswrapper[4919]: I0310 22:06:04.499235 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9nldj\" (UniqueName: \"kubernetes.io/projected/1705da29-6c23-4b70-881e-7e8268197e07-kube-api-access-9nldj\") on node \"crc\" DevicePath \"\""
Mar 10 22:06:05 crc kubenswrapper[4919]: I0310 22:06:05.002748 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553006-kvjk2" event={"ID":"1705da29-6c23-4b70-881e-7e8268197e07","Type":"ContainerDied","Data":"5c28dd6423de5aa7d9d46897a202839c1c11e80ac3817103ee51f67bc91fc7a4"}
Mar 10 22:06:05 crc kubenswrapper[4919]: I0310 22:06:05.002789 4919 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5c28dd6423de5aa7d9d46897a202839c1c11e80ac3817103ee51f67bc91fc7a4"
Mar 10 22:06:05 crc kubenswrapper[4919]: I0310 22:06:05.002766 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553006-kvjk2"
Mar 10 22:06:05 crc kubenswrapper[4919]: I0310 22:06:05.201856 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4d2csv"
Mar 10 22:06:05 crc kubenswrapper[4919]: I0310 22:06:05.308058 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6a1cfdf8-3455-43cb-9462-f4ad3632c7c6-bundle\") pod \"6a1cfdf8-3455-43cb-9462-f4ad3632c7c6\" (UID: \"6a1cfdf8-3455-43cb-9462-f4ad3632c7c6\") "
Mar 10 22:06:05 crc kubenswrapper[4919]: I0310 22:06:05.308117 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fxj9v\" (UniqueName: \"kubernetes.io/projected/6a1cfdf8-3455-43cb-9462-f4ad3632c7c6-kube-api-access-fxj9v\") pod \"6a1cfdf8-3455-43cb-9462-f4ad3632c7c6\" (UID: \"6a1cfdf8-3455-43cb-9462-f4ad3632c7c6\") "
Mar 10 22:06:05 crc kubenswrapper[4919]: I0310 22:06:05.308200 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6a1cfdf8-3455-43cb-9462-f4ad3632c7c6-util\") pod \"6a1cfdf8-3455-43cb-9462-f4ad3632c7c6\" (UID: \"6a1cfdf8-3455-43cb-9462-f4ad3632c7c6\") "
Mar 10 22:06:05 crc kubenswrapper[4919]: I0310 22:06:05.309791 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a1cfdf8-3455-43cb-9462-f4ad3632c7c6-bundle" (OuterVolumeSpecName: "bundle") pod "6a1cfdf8-3455-43cb-9462-f4ad3632c7c6" (UID: "6a1cfdf8-3455-43cb-9462-f4ad3632c7c6"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 22:06:05 crc kubenswrapper[4919]: I0310 22:06:05.318040 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a1cfdf8-3455-43cb-9462-f4ad3632c7c6-util" (OuterVolumeSpecName: "util") pod "6a1cfdf8-3455-43cb-9462-f4ad3632c7c6" (UID: "6a1cfdf8-3455-43cb-9462-f4ad3632c7c6"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 22:06:05 crc kubenswrapper[4919]: I0310 22:06:05.318591 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a1cfdf8-3455-43cb-9462-f4ad3632c7c6-kube-api-access-fxj9v" (OuterVolumeSpecName: "kube-api-access-fxj9v") pod "6a1cfdf8-3455-43cb-9462-f4ad3632c7c6" (UID: "6a1cfdf8-3455-43cb-9462-f4ad3632c7c6"). InnerVolumeSpecName "kube-api-access-fxj9v". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 22:06:05 crc kubenswrapper[4919]: I0310 22:06:05.330368 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553000-l9xjh"]
Mar 10 22:06:05 crc kubenswrapper[4919]: I0310 22:06:05.334444 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553000-l9xjh"]
Mar 10 22:06:05 crc kubenswrapper[4919]: I0310 22:06:05.409920 4919 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6a1cfdf8-3455-43cb-9462-f4ad3632c7c6-bundle\") on node \"crc\" DevicePath \"\""
Mar 10 22:06:05 crc kubenswrapper[4919]: I0310 22:06:05.409954 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fxj9v\" (UniqueName: \"kubernetes.io/projected/6a1cfdf8-3455-43cb-9462-f4ad3632c7c6-kube-api-access-fxj9v\") on node \"crc\" DevicePath \"\""
Mar 10 22:06:05 crc kubenswrapper[4919]: I0310 22:06:05.409966 4919 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6a1cfdf8-3455-43cb-9462-f4ad3632c7c6-util\") on node \"crc\" DevicePath \"\""
Mar 10 22:06:05 crc kubenswrapper[4919]: I0310 22:06:05.486238 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4335ec6-e46a-4783-8d01-a1e84a33d2a7" path="/var/lib/kubelet/pods/d4335ec6-e46a-4783-8d01-a1e84a33d2a7/volumes"
Mar 10 22:06:06 crc kubenswrapper[4919]: I0310 22:06:06.010632 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4d2csv" event={"ID":"6a1cfdf8-3455-43cb-9462-f4ad3632c7c6","Type":"ContainerDied","Data":"ad655a9c0693d0128dbc41014b631d32c9e32b45f9d2528141ec23bed29df2fe"}
Mar 10 22:06:06 crc kubenswrapper[4919]: I0310 22:06:06.010985 4919 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ad655a9c0693d0128dbc41014b631d32c9e32b45f9d2528141ec23bed29df2fe"
Mar 10 22:06:06 crc kubenswrapper[4919]: I0310 22:06:06.010714 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4d2csv"
Mar 10 22:06:13 crc kubenswrapper[4919]: I0310 22:06:13.143381 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-696968d477-7csvb"]
Mar 10 22:06:13 crc kubenswrapper[4919]: E0310 22:06:13.144245 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1705da29-6c23-4b70-881e-7e8268197e07" containerName="oc"
Mar 10 22:06:13 crc kubenswrapper[4919]: I0310 22:06:13.144262 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="1705da29-6c23-4b70-881e-7e8268197e07" containerName="oc"
Mar 10 22:06:13 crc kubenswrapper[4919]: E0310 22:06:13.144279 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a1cfdf8-3455-43cb-9462-f4ad3632c7c6" containerName="util"
Mar 10 22:06:13 crc kubenswrapper[4919]: I0310 22:06:13.144286 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a1cfdf8-3455-43cb-9462-f4ad3632c7c6" containerName="util"
Mar 10 22:06:13 crc kubenswrapper[4919]: E0310 22:06:13.144296 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a1cfdf8-3455-43cb-9462-f4ad3632c7c6" containerName="pull"
Mar 10 22:06:13 crc kubenswrapper[4919]: I0310 22:06:13.144304 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a1cfdf8-3455-43cb-9462-f4ad3632c7c6" containerName="pull"
Mar 10 22:06:13 crc kubenswrapper[4919]: E0310 22:06:13.144320 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a1cfdf8-3455-43cb-9462-f4ad3632c7c6" containerName="extract"
Mar 10 22:06:13 crc kubenswrapper[4919]: I0310 22:06:13.144328 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a1cfdf8-3455-43cb-9462-f4ad3632c7c6" containerName="extract"
Mar 10 22:06:13 crc kubenswrapper[4919]: I0310 22:06:13.144486 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="1705da29-6c23-4b70-881e-7e8268197e07" containerName="oc"
Mar 10 22:06:13 crc kubenswrapper[4919]: I0310 22:06:13.144499 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a1cfdf8-3455-43cb-9462-f4ad3632c7c6" containerName="extract"
Mar 10 22:06:13 crc kubenswrapper[4919]: I0310 22:06:13.145018 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-696968d477-7csvb"
Mar 10 22:06:13 crc kubenswrapper[4919]: I0310 22:06:13.147098 4919 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-rltxv"
Mar 10 22:06:13 crc kubenswrapper[4919]: I0310 22:06:13.148688 4919 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert"
Mar 10 22:06:13 crc kubenswrapper[4919]: I0310 22:06:13.148818 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt"
Mar 10 22:06:13 crc kubenswrapper[4919]: I0310 22:06:13.149134 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt"
Mar 10 22:06:13 crc kubenswrapper[4919]: I0310 22:06:13.154753 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/180eb62c-34a6-4361-856a-419f01dc12df-webhook-cert\") pod \"metallb-operator-controller-manager-696968d477-7csvb\" (UID: \"180eb62c-34a6-4361-856a-419f01dc12df\") " pod="metallb-system/metallb-operator-controller-manager-696968d477-7csvb"
Mar 10 22:06:13 crc kubenswrapper[4919]: I0310 22:06:13.154784 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/180eb62c-34a6-4361-856a-419f01dc12df-apiservice-cert\") pod \"metallb-operator-controller-manager-696968d477-7csvb\" (UID: \"180eb62c-34a6-4361-856a-419f01dc12df\") " pod="metallb-system/metallb-operator-controller-manager-696968d477-7csvb"
Mar 10 22:06:13 crc kubenswrapper[4919]: I0310 22:06:13.154819 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wdzds\" (UniqueName: \"kubernetes.io/projected/180eb62c-34a6-4361-856a-419f01dc12df-kube-api-access-wdzds\") pod \"metallb-operator-controller-manager-696968d477-7csvb\" (UID: \"180eb62c-34a6-4361-856a-419f01dc12df\") " pod="metallb-system/metallb-operator-controller-manager-696968d477-7csvb"
Mar 10 22:06:13 crc kubenswrapper[4919]: I0310 22:06:13.159561 4919 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert"
Mar 10 22:06:13 crc kubenswrapper[4919]: I0310 22:06:13.165330 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-696968d477-7csvb"]
Mar 10 22:06:13 crc kubenswrapper[4919]: I0310 22:06:13.255964 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wdzds\" (UniqueName: \"kubernetes.io/projected/180eb62c-34a6-4361-856a-419f01dc12df-kube-api-access-wdzds\") pod \"metallb-operator-controller-manager-696968d477-7csvb\" (UID: \"180eb62c-34a6-4361-856a-419f01dc12df\") " pod="metallb-system/metallb-operator-controller-manager-696968d477-7csvb"
Mar 10 22:06:13 crc kubenswrapper[4919]: I0310 22:06:13.256109 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/180eb62c-34a6-4361-856a-419f01dc12df-webhook-cert\") pod \"metallb-operator-controller-manager-696968d477-7csvb\" (UID: \"180eb62c-34a6-4361-856a-419f01dc12df\") " pod="metallb-system/metallb-operator-controller-manager-696968d477-7csvb"
Mar 10 22:06:13 crc kubenswrapper[4919]: I0310 22:06:13.256133 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/180eb62c-34a6-4361-856a-419f01dc12df-apiservice-cert\") pod \"metallb-operator-controller-manager-696968d477-7csvb\" (UID: \"180eb62c-34a6-4361-856a-419f01dc12df\") " pod="metallb-system/metallb-operator-controller-manager-696968d477-7csvb"
Mar 10 22:06:13 crc kubenswrapper[4919]: I0310 22:06:13.262098 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/180eb62c-34a6-4361-856a-419f01dc12df-webhook-cert\") pod \"metallb-operator-controller-manager-696968d477-7csvb\" (UID: \"180eb62c-34a6-4361-856a-419f01dc12df\") " pod="metallb-system/metallb-operator-controller-manager-696968d477-7csvb"
Mar 10 22:06:13 crc kubenswrapper[4919]: I0310 22:06:13.262542 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/180eb62c-34a6-4361-856a-419f01dc12df-apiservice-cert\") pod \"metallb-operator-controller-manager-696968d477-7csvb\" (UID: \"180eb62c-34a6-4361-856a-419f01dc12df\") " pod="metallb-system/metallb-operator-controller-manager-696968d477-7csvb"
Mar 10 22:06:13 crc kubenswrapper[4919]: I0310 22:06:13.291133 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wdzds\" (UniqueName: \"kubernetes.io/projected/180eb62c-34a6-4361-856a-419f01dc12df-kube-api-access-wdzds\") pod \"metallb-operator-controller-manager-696968d477-7csvb\" (UID: \"180eb62c-34a6-4361-856a-419f01dc12df\") " pod="metallb-system/metallb-operator-controller-manager-696968d477-7csvb"
Mar 10 22:06:13 crc kubenswrapper[4919]: I0310 22:06:13.414664 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-867bfdd6b6-88z9n"]
Mar 10 22:06:13 crc kubenswrapper[4919]: I0310 22:06:13.415334 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-867bfdd6b6-88z9n"
Mar 10 22:06:13 crc kubenswrapper[4919]: I0310 22:06:13.417614 4919 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-6vvbm"
Mar 10 22:06:13 crc kubenswrapper[4919]: I0310 22:06:13.417656 4919 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert"
Mar 10 22:06:13 crc kubenswrapper[4919]: I0310 22:06:13.417674 4919 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert"
Mar 10 22:06:13 crc kubenswrapper[4919]: I0310 22:06:13.433475 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-867bfdd6b6-88z9n"]
Mar 10 22:06:13 crc kubenswrapper[4919]: I0310 22:06:13.458153 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvcsb\" (UniqueName: \"kubernetes.io/projected/d80e3fd6-53cc-4b97-83c0-45a7b093a415-kube-api-access-gvcsb\") pod \"metallb-operator-webhook-server-867bfdd6b6-88z9n\" (UID: \"d80e3fd6-53cc-4b97-83c0-45a7b093a415\") " pod="metallb-system/metallb-operator-webhook-server-867bfdd6b6-88z9n"
Mar 10 22:06:13 crc kubenswrapper[4919]: I0310 22:06:13.458218 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d80e3fd6-53cc-4b97-83c0-45a7b093a415-apiservice-cert\") pod \"metallb-operator-webhook-server-867bfdd6b6-88z9n\" (UID: \"d80e3fd6-53cc-4b97-83c0-45a7b093a415\") " pod="metallb-system/metallb-operator-webhook-server-867bfdd6b6-88z9n"
Mar 10 22:06:13 crc kubenswrapper[4919]: I0310 22:06:13.458245 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d80e3fd6-53cc-4b97-83c0-45a7b093a415-webhook-cert\") pod \"metallb-operator-webhook-server-867bfdd6b6-88z9n\" (UID: \"d80e3fd6-53cc-4b97-83c0-45a7b093a415\") " pod="metallb-system/metallb-operator-webhook-server-867bfdd6b6-88z9n"
Mar 10 22:06:13 crc kubenswrapper[4919]: I0310 22:06:13.474419 4919 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-rltxv"
Mar 10 22:06:13 crc kubenswrapper[4919]: I0310 22:06:13.483614 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-696968d477-7csvb"
Mar 10 22:06:13 crc kubenswrapper[4919]: I0310 22:06:13.558916 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gvcsb\" (UniqueName: \"kubernetes.io/projected/d80e3fd6-53cc-4b97-83c0-45a7b093a415-kube-api-access-gvcsb\") pod \"metallb-operator-webhook-server-867bfdd6b6-88z9n\" (UID: \"d80e3fd6-53cc-4b97-83c0-45a7b093a415\") " pod="metallb-system/metallb-operator-webhook-server-867bfdd6b6-88z9n"
Mar 10 22:06:13 crc kubenswrapper[4919]: I0310 22:06:13.559178 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d80e3fd6-53cc-4b97-83c0-45a7b093a415-apiservice-cert\") pod \"metallb-operator-webhook-server-867bfdd6b6-88z9n\" (UID: \"d80e3fd6-53cc-4b97-83c0-45a7b093a415\") " pod="metallb-system/metallb-operator-webhook-server-867bfdd6b6-88z9n"
Mar 10 22:06:13 crc kubenswrapper[4919]: I0310 22:06:13.559213 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d80e3fd6-53cc-4b97-83c0-45a7b093a415-webhook-cert\") pod \"metallb-operator-webhook-server-867bfdd6b6-88z9n\" (UID: \"d80e3fd6-53cc-4b97-83c0-45a7b093a415\") " pod="metallb-system/metallb-operator-webhook-server-867bfdd6b6-88z9n"
Mar 10 22:06:13 crc kubenswrapper[4919]: I0310 22:06:13.564727 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d80e3fd6-53cc-4b97-83c0-45a7b093a415-apiservice-cert\") pod \"metallb-operator-webhook-server-867bfdd6b6-88z9n\" (UID: \"d80e3fd6-53cc-4b97-83c0-45a7b093a415\") " pod="metallb-system/metallb-operator-webhook-server-867bfdd6b6-88z9n"
Mar 10 22:06:13 crc kubenswrapper[4919]: I0310 22:06:13.564884 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d80e3fd6-53cc-4b97-83c0-45a7b093a415-webhook-cert\") pod \"metallb-operator-webhook-server-867bfdd6b6-88z9n\" (UID: \"d80e3fd6-53cc-4b97-83c0-45a7b093a415\") " pod="metallb-system/metallb-operator-webhook-server-867bfdd6b6-88z9n"
Mar 10 22:06:13 crc kubenswrapper[4919]: I0310 22:06:13.590188 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvcsb\" (UniqueName: \"kubernetes.io/projected/d80e3fd6-53cc-4b97-83c0-45a7b093a415-kube-api-access-gvcsb\") pod \"metallb-operator-webhook-server-867bfdd6b6-88z9n\" (UID: \"d80e3fd6-53cc-4b97-83c0-45a7b093a415\") " pod="metallb-system/metallb-operator-webhook-server-867bfdd6b6-88z9n"
Mar 10 22:06:13 crc kubenswrapper[4919]: I0310 22:06:13.733687 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-696968d477-7csvb"]
Mar 10 22:06:13 crc kubenswrapper[4919]: I0310 22:06:13.738549 4919 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-867bfdd6b6-88z9n" Mar 10 22:06:14 crc kubenswrapper[4919]: I0310 22:06:14.061073 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-696968d477-7csvb" event={"ID":"180eb62c-34a6-4361-856a-419f01dc12df","Type":"ContainerStarted","Data":"a0e9bb06a93dd521f4abb9fcbda60e2dd2c610113010eb55019edaaafe4dc945"} Mar 10 22:06:14 crc kubenswrapper[4919]: I0310 22:06:14.163621 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-867bfdd6b6-88z9n"] Mar 10 22:06:14 crc kubenswrapper[4919]: W0310 22:06:14.179123 4919 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd80e3fd6_53cc_4b97_83c0_45a7b093a415.slice/crio-59c7c7691cd9ceed188c1e13784839b39875aa8c72246846410ab3c6b9019e98 WatchSource:0}: Error finding container 59c7c7691cd9ceed188c1e13784839b39875aa8c72246846410ab3c6b9019e98: Status 404 returned error can't find the container with id 59c7c7691cd9ceed188c1e13784839b39875aa8c72246846410ab3c6b9019e98 Mar 10 22:06:15 crc kubenswrapper[4919]: I0310 22:06:15.073342 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-867bfdd6b6-88z9n" event={"ID":"d80e3fd6-53cc-4b97-83c0-45a7b093a415","Type":"ContainerStarted","Data":"59c7c7691cd9ceed188c1e13784839b39875aa8c72246846410ab3c6b9019e98"} Mar 10 22:06:17 crc kubenswrapper[4919]: I0310 22:06:17.087563 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-696968d477-7csvb" event={"ID":"180eb62c-34a6-4361-856a-419f01dc12df","Type":"ContainerStarted","Data":"25f050c41d2313475ce3894d7c045961c7658faea7fa0f6de426a1922df19d56"} Mar 10 22:06:17 crc kubenswrapper[4919]: I0310 22:06:17.088171 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="metallb-system/metallb-operator-controller-manager-696968d477-7csvb" Mar 10 22:06:17 crc kubenswrapper[4919]: I0310 22:06:17.115317 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-696968d477-7csvb" podStartSLOduration=1.308949375 podStartE2EDuration="4.115298803s" podCreationTimestamp="2026-03-10 22:06:13 +0000 UTC" firstStartedPulling="2026-03-10 22:06:13.746298872 +0000 UTC m=+960.988179480" lastFinishedPulling="2026-03-10 22:06:16.55264829 +0000 UTC m=+963.794528908" observedRunningTime="2026-03-10 22:06:17.110463891 +0000 UTC m=+964.352344539" watchObservedRunningTime="2026-03-10 22:06:17.115298803 +0000 UTC m=+964.357179421" Mar 10 22:06:21 crc kubenswrapper[4919]: I0310 22:06:21.135301 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-867bfdd6b6-88z9n" event={"ID":"d80e3fd6-53cc-4b97-83c0-45a7b093a415","Type":"ContainerStarted","Data":"87d1461b9110f15e934798469c8cb02607ebb05dcba7038fb4c4afda961b6200"} Mar 10 22:06:21 crc kubenswrapper[4919]: I0310 22:06:21.160550 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-867bfdd6b6-88z9n" podStartSLOduration=1.395057526 podStartE2EDuration="8.160528971s" podCreationTimestamp="2026-03-10 22:06:13 +0000 UTC" firstStartedPulling="2026-03-10 22:06:14.182263815 +0000 UTC m=+961.424144423" lastFinishedPulling="2026-03-10 22:06:20.94773526 +0000 UTC m=+968.189615868" observedRunningTime="2026-03-10 22:06:21.157294214 +0000 UTC m=+968.399174822" watchObservedRunningTime="2026-03-10 22:06:21.160528971 +0000 UTC m=+968.402409589" Mar 10 22:06:22 crc kubenswrapper[4919]: I0310 22:06:22.139966 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-867bfdd6b6-88z9n" Mar 10 22:06:24 crc kubenswrapper[4919]: I0310 22:06:24.243486 4919 scope.go:117] 
"RemoveContainer" containerID="589255cf712a6dec416e8cbd1ec5d52123e1297c21955116e955d4fb5773ff0c" Mar 10 22:06:28 crc kubenswrapper[4919]: I0310 22:06:28.958095 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-jvl9s"] Mar 10 22:06:28 crc kubenswrapper[4919]: I0310 22:06:28.960800 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jvl9s" Mar 10 22:06:28 crc kubenswrapper[4919]: I0310 22:06:28.979042 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jvl9s"] Mar 10 22:06:29 crc kubenswrapper[4919]: I0310 22:06:29.081775 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7398cf1-a635-46af-930c-b8fdebac62f0-catalog-content\") pod \"certified-operators-jvl9s\" (UID: \"c7398cf1-a635-46af-930c-b8fdebac62f0\") " pod="openshift-marketplace/certified-operators-jvl9s" Mar 10 22:06:29 crc kubenswrapper[4919]: I0310 22:06:29.081841 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7398cf1-a635-46af-930c-b8fdebac62f0-utilities\") pod \"certified-operators-jvl9s\" (UID: \"c7398cf1-a635-46af-930c-b8fdebac62f0\") " pod="openshift-marketplace/certified-operators-jvl9s" Mar 10 22:06:29 crc kubenswrapper[4919]: I0310 22:06:29.082004 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-prrd6\" (UniqueName: \"kubernetes.io/projected/c7398cf1-a635-46af-930c-b8fdebac62f0-kube-api-access-prrd6\") pod \"certified-operators-jvl9s\" (UID: \"c7398cf1-a635-46af-930c-b8fdebac62f0\") " pod="openshift-marketplace/certified-operators-jvl9s" Mar 10 22:06:29 crc kubenswrapper[4919]: I0310 22:06:29.176011 4919 patch_prober.go:28] interesting 
pod/machine-config-daemon-z7v4t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 22:06:29 crc kubenswrapper[4919]: I0310 22:06:29.176067 4919 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 22:06:29 crc kubenswrapper[4919]: I0310 22:06:29.182797 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7398cf1-a635-46af-930c-b8fdebac62f0-catalog-content\") pod \"certified-operators-jvl9s\" (UID: \"c7398cf1-a635-46af-930c-b8fdebac62f0\") " pod="openshift-marketplace/certified-operators-jvl9s" Mar 10 22:06:29 crc kubenswrapper[4919]: I0310 22:06:29.182841 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7398cf1-a635-46af-930c-b8fdebac62f0-utilities\") pod \"certified-operators-jvl9s\" (UID: \"c7398cf1-a635-46af-930c-b8fdebac62f0\") " pod="openshift-marketplace/certified-operators-jvl9s" Mar 10 22:06:29 crc kubenswrapper[4919]: I0310 22:06:29.182892 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-prrd6\" (UniqueName: \"kubernetes.io/projected/c7398cf1-a635-46af-930c-b8fdebac62f0-kube-api-access-prrd6\") pod \"certified-operators-jvl9s\" (UID: \"c7398cf1-a635-46af-930c-b8fdebac62f0\") " pod="openshift-marketplace/certified-operators-jvl9s" Mar 10 22:06:29 crc kubenswrapper[4919]: I0310 22:06:29.183373 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" 
(UniqueName: \"kubernetes.io/empty-dir/c7398cf1-a635-46af-930c-b8fdebac62f0-utilities\") pod \"certified-operators-jvl9s\" (UID: \"c7398cf1-a635-46af-930c-b8fdebac62f0\") " pod="openshift-marketplace/certified-operators-jvl9s" Mar 10 22:06:29 crc kubenswrapper[4919]: I0310 22:06:29.183601 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7398cf1-a635-46af-930c-b8fdebac62f0-catalog-content\") pod \"certified-operators-jvl9s\" (UID: \"c7398cf1-a635-46af-930c-b8fdebac62f0\") " pod="openshift-marketplace/certified-operators-jvl9s" Mar 10 22:06:29 crc kubenswrapper[4919]: I0310 22:06:29.204895 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-prrd6\" (UniqueName: \"kubernetes.io/projected/c7398cf1-a635-46af-930c-b8fdebac62f0-kube-api-access-prrd6\") pod \"certified-operators-jvl9s\" (UID: \"c7398cf1-a635-46af-930c-b8fdebac62f0\") " pod="openshift-marketplace/certified-operators-jvl9s" Mar 10 22:06:29 crc kubenswrapper[4919]: I0310 22:06:29.292733 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jvl9s" Mar 10 22:06:29 crc kubenswrapper[4919]: I0310 22:06:29.606868 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jvl9s"] Mar 10 22:06:30 crc kubenswrapper[4919]: I0310 22:06:30.185290 4919 generic.go:334] "Generic (PLEG): container finished" podID="c7398cf1-a635-46af-930c-b8fdebac62f0" containerID="96e555f21d491bdcc2c8de212f0fbc5658f193c41d380ab652a667c2bcaab745" exitCode=0 Mar 10 22:06:30 crc kubenswrapper[4919]: I0310 22:06:30.185671 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jvl9s" event={"ID":"c7398cf1-a635-46af-930c-b8fdebac62f0","Type":"ContainerDied","Data":"96e555f21d491bdcc2c8de212f0fbc5658f193c41d380ab652a667c2bcaab745"} Mar 10 22:06:30 crc kubenswrapper[4919]: I0310 22:06:30.185704 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jvl9s" event={"ID":"c7398cf1-a635-46af-930c-b8fdebac62f0","Type":"ContainerStarted","Data":"65a57a05f90536e9779a864e7846d4fcfa69ca37d4cdd473f1277e3beeb900e8"} Mar 10 22:06:32 crc kubenswrapper[4919]: I0310 22:06:32.200577 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jvl9s" event={"ID":"c7398cf1-a635-46af-930c-b8fdebac62f0","Type":"ContainerStarted","Data":"7f95e00b5d020bcdb1c5829d53ac7528029001ae27469ac81c13e62c0b487c6c"} Mar 10 22:06:33 crc kubenswrapper[4919]: I0310 22:06:33.208504 4919 generic.go:334] "Generic (PLEG): container finished" podID="c7398cf1-a635-46af-930c-b8fdebac62f0" containerID="7f95e00b5d020bcdb1c5829d53ac7528029001ae27469ac81c13e62c0b487c6c" exitCode=0 Mar 10 22:06:33 crc kubenswrapper[4919]: I0310 22:06:33.208799 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jvl9s" 
event={"ID":"c7398cf1-a635-46af-930c-b8fdebac62f0","Type":"ContainerDied","Data":"7f95e00b5d020bcdb1c5829d53ac7528029001ae27469ac81c13e62c0b487c6c"} Mar 10 22:06:33 crc kubenswrapper[4919]: I0310 22:06:33.745318 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-867bfdd6b6-88z9n" Mar 10 22:06:34 crc kubenswrapper[4919]: I0310 22:06:34.218079 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jvl9s" event={"ID":"c7398cf1-a635-46af-930c-b8fdebac62f0","Type":"ContainerStarted","Data":"688a8354fe7623ec6a1d0421661282cf593db6232e74cda6e33c26a49d6aee1a"} Mar 10 22:06:34 crc kubenswrapper[4919]: I0310 22:06:34.236742 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-jvl9s" podStartSLOduration=2.742758315 podStartE2EDuration="6.236726879s" podCreationTimestamp="2026-03-10 22:06:28 +0000 UTC" firstStartedPulling="2026-03-10 22:06:30.18704964 +0000 UTC m=+977.428930248" lastFinishedPulling="2026-03-10 22:06:33.681018204 +0000 UTC m=+980.922898812" observedRunningTime="2026-03-10 22:06:34.234832628 +0000 UTC m=+981.476713246" watchObservedRunningTime="2026-03-10 22:06:34.236726879 +0000 UTC m=+981.478607487" Mar 10 22:06:39 crc kubenswrapper[4919]: I0310 22:06:39.294450 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-jvl9s" Mar 10 22:06:39 crc kubenswrapper[4919]: I0310 22:06:39.295172 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-jvl9s" Mar 10 22:06:39 crc kubenswrapper[4919]: I0310 22:06:39.353265 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-jvl9s" Mar 10 22:06:40 crc kubenswrapper[4919]: I0310 22:06:40.287206 4919 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-jvl9s" Mar 10 22:06:40 crc kubenswrapper[4919]: I0310 22:06:40.337128 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jvl9s"] Mar 10 22:06:42 crc kubenswrapper[4919]: I0310 22:06:42.261179 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-jvl9s" podUID="c7398cf1-a635-46af-930c-b8fdebac62f0" containerName="registry-server" containerID="cri-o://688a8354fe7623ec6a1d0421661282cf593db6232e74cda6e33c26a49d6aee1a" gracePeriod=2 Mar 10 22:06:42 crc kubenswrapper[4919]: I0310 22:06:42.665571 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jvl9s" Mar 10 22:06:42 crc kubenswrapper[4919]: I0310 22:06:42.762237 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-prrd6\" (UniqueName: \"kubernetes.io/projected/c7398cf1-a635-46af-930c-b8fdebac62f0-kube-api-access-prrd6\") pod \"c7398cf1-a635-46af-930c-b8fdebac62f0\" (UID: \"c7398cf1-a635-46af-930c-b8fdebac62f0\") " Mar 10 22:06:42 crc kubenswrapper[4919]: I0310 22:06:42.762303 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7398cf1-a635-46af-930c-b8fdebac62f0-catalog-content\") pod \"c7398cf1-a635-46af-930c-b8fdebac62f0\" (UID: \"c7398cf1-a635-46af-930c-b8fdebac62f0\") " Mar 10 22:06:42 crc kubenswrapper[4919]: I0310 22:06:42.762362 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7398cf1-a635-46af-930c-b8fdebac62f0-utilities\") pod \"c7398cf1-a635-46af-930c-b8fdebac62f0\" (UID: \"c7398cf1-a635-46af-930c-b8fdebac62f0\") " Mar 10 22:06:42 crc kubenswrapper[4919]: I0310 22:06:42.763341 4919 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c7398cf1-a635-46af-930c-b8fdebac62f0-utilities" (OuterVolumeSpecName: "utilities") pod "c7398cf1-a635-46af-930c-b8fdebac62f0" (UID: "c7398cf1-a635-46af-930c-b8fdebac62f0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 22:06:42 crc kubenswrapper[4919]: I0310 22:06:42.767315 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7398cf1-a635-46af-930c-b8fdebac62f0-kube-api-access-prrd6" (OuterVolumeSpecName: "kube-api-access-prrd6") pod "c7398cf1-a635-46af-930c-b8fdebac62f0" (UID: "c7398cf1-a635-46af-930c-b8fdebac62f0"). InnerVolumeSpecName "kube-api-access-prrd6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 22:06:42 crc kubenswrapper[4919]: I0310 22:06:42.815382 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c7398cf1-a635-46af-930c-b8fdebac62f0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c7398cf1-a635-46af-930c-b8fdebac62f0" (UID: "c7398cf1-a635-46af-930c-b8fdebac62f0"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 22:06:42 crc kubenswrapper[4919]: I0310 22:06:42.864349 4919 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7398cf1-a635-46af-930c-b8fdebac62f0-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 22:06:42 crc kubenswrapper[4919]: I0310 22:06:42.864406 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-prrd6\" (UniqueName: \"kubernetes.io/projected/c7398cf1-a635-46af-930c-b8fdebac62f0-kube-api-access-prrd6\") on node \"crc\" DevicePath \"\"" Mar 10 22:06:42 crc kubenswrapper[4919]: I0310 22:06:42.864423 4919 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7398cf1-a635-46af-930c-b8fdebac62f0-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 22:06:43 crc kubenswrapper[4919]: I0310 22:06:43.270959 4919 generic.go:334] "Generic (PLEG): container finished" podID="c7398cf1-a635-46af-930c-b8fdebac62f0" containerID="688a8354fe7623ec6a1d0421661282cf593db6232e74cda6e33c26a49d6aee1a" exitCode=0 Mar 10 22:06:43 crc kubenswrapper[4919]: I0310 22:06:43.271058 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jvl9s" Mar 10 22:06:43 crc kubenswrapper[4919]: I0310 22:06:43.271056 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jvl9s" event={"ID":"c7398cf1-a635-46af-930c-b8fdebac62f0","Type":"ContainerDied","Data":"688a8354fe7623ec6a1d0421661282cf593db6232e74cda6e33c26a49d6aee1a"} Mar 10 22:06:43 crc kubenswrapper[4919]: I0310 22:06:43.271670 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jvl9s" event={"ID":"c7398cf1-a635-46af-930c-b8fdebac62f0","Type":"ContainerDied","Data":"65a57a05f90536e9779a864e7846d4fcfa69ca37d4cdd473f1277e3beeb900e8"} Mar 10 22:06:43 crc kubenswrapper[4919]: I0310 22:06:43.271720 4919 scope.go:117] "RemoveContainer" containerID="688a8354fe7623ec6a1d0421661282cf593db6232e74cda6e33c26a49d6aee1a" Mar 10 22:06:43 crc kubenswrapper[4919]: I0310 22:06:43.297322 4919 scope.go:117] "RemoveContainer" containerID="7f95e00b5d020bcdb1c5829d53ac7528029001ae27469ac81c13e62c0b487c6c" Mar 10 22:06:43 crc kubenswrapper[4919]: I0310 22:06:43.312367 4919 scope.go:117] "RemoveContainer" containerID="96e555f21d491bdcc2c8de212f0fbc5658f193c41d380ab652a667c2bcaab745" Mar 10 22:06:43 crc kubenswrapper[4919]: I0310 22:06:43.329372 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jvl9s"] Mar 10 22:06:43 crc kubenswrapper[4919]: I0310 22:06:43.344643 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-jvl9s"] Mar 10 22:06:43 crc kubenswrapper[4919]: I0310 22:06:43.362691 4919 scope.go:117] "RemoveContainer" containerID="688a8354fe7623ec6a1d0421661282cf593db6232e74cda6e33c26a49d6aee1a" Mar 10 22:06:43 crc kubenswrapper[4919]: E0310 22:06:43.363156 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"688a8354fe7623ec6a1d0421661282cf593db6232e74cda6e33c26a49d6aee1a\": container with ID starting with 688a8354fe7623ec6a1d0421661282cf593db6232e74cda6e33c26a49d6aee1a not found: ID does not exist" containerID="688a8354fe7623ec6a1d0421661282cf593db6232e74cda6e33c26a49d6aee1a" Mar 10 22:06:43 crc kubenswrapper[4919]: I0310 22:06:43.363184 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"688a8354fe7623ec6a1d0421661282cf593db6232e74cda6e33c26a49d6aee1a"} err="failed to get container status \"688a8354fe7623ec6a1d0421661282cf593db6232e74cda6e33c26a49d6aee1a\": rpc error: code = NotFound desc = could not find container \"688a8354fe7623ec6a1d0421661282cf593db6232e74cda6e33c26a49d6aee1a\": container with ID starting with 688a8354fe7623ec6a1d0421661282cf593db6232e74cda6e33c26a49d6aee1a not found: ID does not exist" Mar 10 22:06:43 crc kubenswrapper[4919]: I0310 22:06:43.363203 4919 scope.go:117] "RemoveContainer" containerID="7f95e00b5d020bcdb1c5829d53ac7528029001ae27469ac81c13e62c0b487c6c" Mar 10 22:06:43 crc kubenswrapper[4919]: E0310 22:06:43.363684 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f95e00b5d020bcdb1c5829d53ac7528029001ae27469ac81c13e62c0b487c6c\": container with ID starting with 7f95e00b5d020bcdb1c5829d53ac7528029001ae27469ac81c13e62c0b487c6c not found: ID does not exist" containerID="7f95e00b5d020bcdb1c5829d53ac7528029001ae27469ac81c13e62c0b487c6c" Mar 10 22:06:43 crc kubenswrapper[4919]: I0310 22:06:43.363705 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f95e00b5d020bcdb1c5829d53ac7528029001ae27469ac81c13e62c0b487c6c"} err="failed to get container status \"7f95e00b5d020bcdb1c5829d53ac7528029001ae27469ac81c13e62c0b487c6c\": rpc error: code = NotFound desc = could not find container \"7f95e00b5d020bcdb1c5829d53ac7528029001ae27469ac81c13e62c0b487c6c\": container with ID 
starting with 7f95e00b5d020bcdb1c5829d53ac7528029001ae27469ac81c13e62c0b487c6c not found: ID does not exist" Mar 10 22:06:43 crc kubenswrapper[4919]: I0310 22:06:43.363717 4919 scope.go:117] "RemoveContainer" containerID="96e555f21d491bdcc2c8de212f0fbc5658f193c41d380ab652a667c2bcaab745" Mar 10 22:06:43 crc kubenswrapper[4919]: E0310 22:06:43.364071 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"96e555f21d491bdcc2c8de212f0fbc5658f193c41d380ab652a667c2bcaab745\": container with ID starting with 96e555f21d491bdcc2c8de212f0fbc5658f193c41d380ab652a667c2bcaab745 not found: ID does not exist" containerID="96e555f21d491bdcc2c8de212f0fbc5658f193c41d380ab652a667c2bcaab745" Mar 10 22:06:43 crc kubenswrapper[4919]: I0310 22:06:43.364093 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96e555f21d491bdcc2c8de212f0fbc5658f193c41d380ab652a667c2bcaab745"} err="failed to get container status \"96e555f21d491bdcc2c8de212f0fbc5658f193c41d380ab652a667c2bcaab745\": rpc error: code = NotFound desc = could not find container \"96e555f21d491bdcc2c8de212f0fbc5658f193c41d380ab652a667c2bcaab745\": container with ID starting with 96e555f21d491bdcc2c8de212f0fbc5658f193c41d380ab652a667c2bcaab745 not found: ID does not exist" Mar 10 22:06:43 crc kubenswrapper[4919]: I0310 22:06:43.490670 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7398cf1-a635-46af-930c-b8fdebac62f0" path="/var/lib/kubelet/pods/c7398cf1-a635-46af-930c-b8fdebac62f0/volumes" Mar 10 22:06:53 crc kubenswrapper[4919]: I0310 22:06:53.493738 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-696968d477-7csvb" Mar 10 22:06:54 crc kubenswrapper[4919]: I0310 22:06:54.353322 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-tfrn4"] Mar 10 22:06:54 crc kubenswrapper[4919]: 
E0310 22:06:54.353844 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7398cf1-a635-46af-930c-b8fdebac62f0" containerName="extract-content" Mar 10 22:06:54 crc kubenswrapper[4919]: I0310 22:06:54.353892 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7398cf1-a635-46af-930c-b8fdebac62f0" containerName="extract-content" Mar 10 22:06:54 crc kubenswrapper[4919]: E0310 22:06:54.353951 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7398cf1-a635-46af-930c-b8fdebac62f0" containerName="registry-server" Mar 10 22:06:54 crc kubenswrapper[4919]: I0310 22:06:54.353970 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7398cf1-a635-46af-930c-b8fdebac62f0" containerName="registry-server" Mar 10 22:06:54 crc kubenswrapper[4919]: E0310 22:06:54.353994 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7398cf1-a635-46af-930c-b8fdebac62f0" containerName="extract-utilities" Mar 10 22:06:54 crc kubenswrapper[4919]: I0310 22:06:54.354011 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7398cf1-a635-46af-930c-b8fdebac62f0" containerName="extract-utilities" Mar 10 22:06:54 crc kubenswrapper[4919]: I0310 22:06:54.354296 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7398cf1-a635-46af-930c-b8fdebac62f0" containerName="registry-server" Mar 10 22:06:54 crc kubenswrapper[4919]: I0310 22:06:54.358363 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-tfrn4" Mar 10 22:06:54 crc kubenswrapper[4919]: I0310 22:06:54.360725 4919 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-zcvfg" Mar 10 22:06:54 crc kubenswrapper[4919]: I0310 22:06:54.360880 4919 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Mar 10 22:06:54 crc kubenswrapper[4919]: I0310 22:06:54.364778 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Mar 10 22:06:54 crc kubenswrapper[4919]: I0310 22:06:54.368497 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-7f989f654f-l4f9h"] Mar 10 22:06:54 crc kubenswrapper[4919]: I0310 22:06:54.369306 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-l4f9h" Mar 10 22:06:54 crc kubenswrapper[4919]: I0310 22:06:54.371011 4919 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Mar 10 22:06:54 crc kubenswrapper[4919]: I0310 22:06:54.387001 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7f989f654f-l4f9h"] Mar 10 22:06:54 crc kubenswrapper[4919]: I0310 22:06:54.416641 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/81fac61b-b480-427e-ba18-1c699bf5620a-frr-conf\") pod \"frr-k8s-tfrn4\" (UID: \"81fac61b-b480-427e-ba18-1c699bf5620a\") " pod="metallb-system/frr-k8s-tfrn4" Mar 10 22:06:54 crc kubenswrapper[4919]: I0310 22:06:54.416981 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/81fac61b-b480-427e-ba18-1c699bf5620a-reloader\") pod \"frr-k8s-tfrn4\" (UID: 
\"81fac61b-b480-427e-ba18-1c699bf5620a\") " pod="metallb-system/frr-k8s-tfrn4" Mar 10 22:06:54 crc kubenswrapper[4919]: I0310 22:06:54.416999 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/522e3074-d06d-4537-b6e4-cd60e9d7c216-cert\") pod \"frr-k8s-webhook-server-7f989f654f-l4f9h\" (UID: \"522e3074-d06d-4537-b6e4-cd60e9d7c216\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-l4f9h" Mar 10 22:06:54 crc kubenswrapper[4919]: I0310 22:06:54.417113 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/81fac61b-b480-427e-ba18-1c699bf5620a-frr-sockets\") pod \"frr-k8s-tfrn4\" (UID: \"81fac61b-b480-427e-ba18-1c699bf5620a\") " pod="metallb-system/frr-k8s-tfrn4" Mar 10 22:06:54 crc kubenswrapper[4919]: I0310 22:06:54.417217 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/81fac61b-b480-427e-ba18-1c699bf5620a-metrics-certs\") pod \"frr-k8s-tfrn4\" (UID: \"81fac61b-b480-427e-ba18-1c699bf5620a\") " pod="metallb-system/frr-k8s-tfrn4" Mar 10 22:06:54 crc kubenswrapper[4919]: I0310 22:06:54.417244 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27c8v\" (UniqueName: \"kubernetes.io/projected/81fac61b-b480-427e-ba18-1c699bf5620a-kube-api-access-27c8v\") pod \"frr-k8s-tfrn4\" (UID: \"81fac61b-b480-427e-ba18-1c699bf5620a\") " pod="metallb-system/frr-k8s-tfrn4" Mar 10 22:06:54 crc kubenswrapper[4919]: I0310 22:06:54.417274 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/81fac61b-b480-427e-ba18-1c699bf5620a-metrics\") pod \"frr-k8s-tfrn4\" (UID: \"81fac61b-b480-427e-ba18-1c699bf5620a\") " 
pod="metallb-system/frr-k8s-tfrn4" Mar 10 22:06:54 crc kubenswrapper[4919]: I0310 22:06:54.417300 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pncgk\" (UniqueName: \"kubernetes.io/projected/522e3074-d06d-4537-b6e4-cd60e9d7c216-kube-api-access-pncgk\") pod \"frr-k8s-webhook-server-7f989f654f-l4f9h\" (UID: \"522e3074-d06d-4537-b6e4-cd60e9d7c216\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-l4f9h" Mar 10 22:06:54 crc kubenswrapper[4919]: I0310 22:06:54.417364 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/81fac61b-b480-427e-ba18-1c699bf5620a-frr-startup\") pod \"frr-k8s-tfrn4\" (UID: \"81fac61b-b480-427e-ba18-1c699bf5620a\") " pod="metallb-system/frr-k8s-tfrn4" Mar 10 22:06:54 crc kubenswrapper[4919]: I0310 22:06:54.464509 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-j9g4p"] Mar 10 22:06:54 crc kubenswrapper[4919]: I0310 22:06:54.465310 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-j9g4p" Mar 10 22:06:54 crc kubenswrapper[4919]: I0310 22:06:54.467003 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Mar 10 22:06:54 crc kubenswrapper[4919]: I0310 22:06:54.467616 4919 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Mar 10 22:06:54 crc kubenswrapper[4919]: I0310 22:06:54.467830 4919 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Mar 10 22:06:54 crc kubenswrapper[4919]: I0310 22:06:54.468230 4919 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-92dpg" Mar 10 22:06:54 crc kubenswrapper[4919]: I0310 22:06:54.500151 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-86ddb6bd46-djm9l"] Mar 10 22:06:54 crc kubenswrapper[4919]: I0310 22:06:54.501089 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-86ddb6bd46-djm9l" Mar 10 22:06:54 crc kubenswrapper[4919]: I0310 22:06:54.502607 4919 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Mar 10 22:06:54 crc kubenswrapper[4919]: I0310 22:06:54.506993 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-86ddb6bd46-djm9l"] Mar 10 22:06:54 crc kubenswrapper[4919]: I0310 22:06:54.518120 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-czzcf\" (UniqueName: \"kubernetes.io/projected/e6950f65-bbda-4846-9826-042bd5dbaf87-kube-api-access-czzcf\") pod \"speaker-j9g4p\" (UID: \"e6950f65-bbda-4846-9826-042bd5dbaf87\") " pod="metallb-system/speaker-j9g4p" Mar 10 22:06:54 crc kubenswrapper[4919]: I0310 22:06:54.518175 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e6950f65-bbda-4846-9826-042bd5dbaf87-metrics-certs\") pod \"speaker-j9g4p\" (UID: \"e6950f65-bbda-4846-9826-042bd5dbaf87\") " pod="metallb-system/speaker-j9g4p" Mar 10 22:06:54 crc kubenswrapper[4919]: I0310 22:06:54.518198 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/e6950f65-bbda-4846-9826-042bd5dbaf87-metallb-excludel2\") pod \"speaker-j9g4p\" (UID: \"e6950f65-bbda-4846-9826-042bd5dbaf87\") " pod="metallb-system/speaker-j9g4p" Mar 10 22:06:54 crc kubenswrapper[4919]: I0310 22:06:54.518244 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/81fac61b-b480-427e-ba18-1c699bf5620a-metrics-certs\") pod \"frr-k8s-tfrn4\" (UID: \"81fac61b-b480-427e-ba18-1c699bf5620a\") " pod="metallb-system/frr-k8s-tfrn4" Mar 10 22:06:54 crc kubenswrapper[4919]: I0310 
22:06:54.518268 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-27c8v\" (UniqueName: \"kubernetes.io/projected/81fac61b-b480-427e-ba18-1c699bf5620a-kube-api-access-27c8v\") pod \"frr-k8s-tfrn4\" (UID: \"81fac61b-b480-427e-ba18-1c699bf5620a\") " pod="metallb-system/frr-k8s-tfrn4" Mar 10 22:06:54 crc kubenswrapper[4919]: I0310 22:06:54.518285 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/81fac61b-b480-427e-ba18-1c699bf5620a-metrics\") pod \"frr-k8s-tfrn4\" (UID: \"81fac61b-b480-427e-ba18-1c699bf5620a\") " pod="metallb-system/frr-k8s-tfrn4" Mar 10 22:06:54 crc kubenswrapper[4919]: I0310 22:06:54.518299 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pncgk\" (UniqueName: \"kubernetes.io/projected/522e3074-d06d-4537-b6e4-cd60e9d7c216-kube-api-access-pncgk\") pod \"frr-k8s-webhook-server-7f989f654f-l4f9h\" (UID: \"522e3074-d06d-4537-b6e4-cd60e9d7c216\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-l4f9h" Mar 10 22:06:54 crc kubenswrapper[4919]: I0310 22:06:54.518324 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/81fac61b-b480-427e-ba18-1c699bf5620a-frr-startup\") pod \"frr-k8s-tfrn4\" (UID: \"81fac61b-b480-427e-ba18-1c699bf5620a\") " pod="metallb-system/frr-k8s-tfrn4" Mar 10 22:06:54 crc kubenswrapper[4919]: I0310 22:06:54.518344 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1fc5122b-0945-47cf-8a35-cd496338269b-cert\") pod \"controller-86ddb6bd46-djm9l\" (UID: \"1fc5122b-0945-47cf-8a35-cd496338269b\") " pod="metallb-system/controller-86ddb6bd46-djm9l" Mar 10 22:06:54 crc kubenswrapper[4919]: I0310 22:06:54.518375 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/81fac61b-b480-427e-ba18-1c699bf5620a-frr-conf\") pod \"frr-k8s-tfrn4\" (UID: \"81fac61b-b480-427e-ba18-1c699bf5620a\") " pod="metallb-system/frr-k8s-tfrn4" Mar 10 22:06:54 crc kubenswrapper[4919]: I0310 22:06:54.518431 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/e6950f65-bbda-4846-9826-042bd5dbaf87-memberlist\") pod \"speaker-j9g4p\" (UID: \"e6950f65-bbda-4846-9826-042bd5dbaf87\") " pod="metallb-system/speaker-j9g4p" Mar 10 22:06:54 crc kubenswrapper[4919]: I0310 22:06:54.518475 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/81fac61b-b480-427e-ba18-1c699bf5620a-reloader\") pod \"frr-k8s-tfrn4\" (UID: \"81fac61b-b480-427e-ba18-1c699bf5620a\") " pod="metallb-system/frr-k8s-tfrn4" Mar 10 22:06:54 crc kubenswrapper[4919]: I0310 22:06:54.518503 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/522e3074-d06d-4537-b6e4-cd60e9d7c216-cert\") pod \"frr-k8s-webhook-server-7f989f654f-l4f9h\" (UID: \"522e3074-d06d-4537-b6e4-cd60e9d7c216\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-l4f9h" Mar 10 22:06:54 crc kubenswrapper[4919]: I0310 22:06:54.518541 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1fc5122b-0945-47cf-8a35-cd496338269b-metrics-certs\") pod \"controller-86ddb6bd46-djm9l\" (UID: \"1fc5122b-0945-47cf-8a35-cd496338269b\") " pod="metallb-system/controller-86ddb6bd46-djm9l" Mar 10 22:06:54 crc kubenswrapper[4919]: I0310 22:06:54.518569 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cgcnl\" (UniqueName: 
\"kubernetes.io/projected/1fc5122b-0945-47cf-8a35-cd496338269b-kube-api-access-cgcnl\") pod \"controller-86ddb6bd46-djm9l\" (UID: \"1fc5122b-0945-47cf-8a35-cd496338269b\") " pod="metallb-system/controller-86ddb6bd46-djm9l" Mar 10 22:06:54 crc kubenswrapper[4919]: I0310 22:06:54.518597 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/81fac61b-b480-427e-ba18-1c699bf5620a-frr-sockets\") pod \"frr-k8s-tfrn4\" (UID: \"81fac61b-b480-427e-ba18-1c699bf5620a\") " pod="metallb-system/frr-k8s-tfrn4" Mar 10 22:06:54 crc kubenswrapper[4919]: I0310 22:06:54.518964 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/81fac61b-b480-427e-ba18-1c699bf5620a-frr-sockets\") pod \"frr-k8s-tfrn4\" (UID: \"81fac61b-b480-427e-ba18-1c699bf5620a\") " pod="metallb-system/frr-k8s-tfrn4" Mar 10 22:06:54 crc kubenswrapper[4919]: E0310 22:06:54.520497 4919 secret.go:188] Couldn't get secret metallb-system/frr-k8s-certs-secret: secret "frr-k8s-certs-secret" not found Mar 10 22:06:54 crc kubenswrapper[4919]: I0310 22:06:54.520550 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/81fac61b-b480-427e-ba18-1c699bf5620a-metrics\") pod \"frr-k8s-tfrn4\" (UID: \"81fac61b-b480-427e-ba18-1c699bf5620a\") " pod="metallb-system/frr-k8s-tfrn4" Mar 10 22:06:54 crc kubenswrapper[4919]: E0310 22:06:54.520569 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/81fac61b-b480-427e-ba18-1c699bf5620a-metrics-certs podName:81fac61b-b480-427e-ba18-1c699bf5620a nodeName:}" failed. No retries permitted until 2026-03-10 22:06:55.020550033 +0000 UTC m=+1002.262430751 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/81fac61b-b480-427e-ba18-1c699bf5620a-metrics-certs") pod "frr-k8s-tfrn4" (UID: "81fac61b-b480-427e-ba18-1c699bf5620a") : secret "frr-k8s-certs-secret" not found Mar 10 22:06:54 crc kubenswrapper[4919]: I0310 22:06:54.521041 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/81fac61b-b480-427e-ba18-1c699bf5620a-reloader\") pod \"frr-k8s-tfrn4\" (UID: \"81fac61b-b480-427e-ba18-1c699bf5620a\") " pod="metallb-system/frr-k8s-tfrn4" Mar 10 22:06:54 crc kubenswrapper[4919]: I0310 22:06:54.521262 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/81fac61b-b480-427e-ba18-1c699bf5620a-frr-conf\") pod \"frr-k8s-tfrn4\" (UID: \"81fac61b-b480-427e-ba18-1c699bf5620a\") " pod="metallb-system/frr-k8s-tfrn4" Mar 10 22:06:54 crc kubenswrapper[4919]: I0310 22:06:54.521310 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/81fac61b-b480-427e-ba18-1c699bf5620a-frr-startup\") pod \"frr-k8s-tfrn4\" (UID: \"81fac61b-b480-427e-ba18-1c699bf5620a\") " pod="metallb-system/frr-k8s-tfrn4" Mar 10 22:06:54 crc kubenswrapper[4919]: I0310 22:06:54.530118 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/522e3074-d06d-4537-b6e4-cd60e9d7c216-cert\") pod \"frr-k8s-webhook-server-7f989f654f-l4f9h\" (UID: \"522e3074-d06d-4537-b6e4-cd60e9d7c216\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-l4f9h" Mar 10 22:06:54 crc kubenswrapper[4919]: I0310 22:06:54.538258 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pncgk\" (UniqueName: \"kubernetes.io/projected/522e3074-d06d-4537-b6e4-cd60e9d7c216-kube-api-access-pncgk\") pod \"frr-k8s-webhook-server-7f989f654f-l4f9h\" (UID: 
\"522e3074-d06d-4537-b6e4-cd60e9d7c216\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-l4f9h" Mar 10 22:06:54 crc kubenswrapper[4919]: I0310 22:06:54.539381 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-27c8v\" (UniqueName: \"kubernetes.io/projected/81fac61b-b480-427e-ba18-1c699bf5620a-kube-api-access-27c8v\") pod \"frr-k8s-tfrn4\" (UID: \"81fac61b-b480-427e-ba18-1c699bf5620a\") " pod="metallb-system/frr-k8s-tfrn4" Mar 10 22:06:54 crc kubenswrapper[4919]: I0310 22:06:54.619028 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/e6950f65-bbda-4846-9826-042bd5dbaf87-metallb-excludel2\") pod \"speaker-j9g4p\" (UID: \"e6950f65-bbda-4846-9826-042bd5dbaf87\") " pod="metallb-system/speaker-j9g4p" Mar 10 22:06:54 crc kubenswrapper[4919]: I0310 22:06:54.619105 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1fc5122b-0945-47cf-8a35-cd496338269b-cert\") pod \"controller-86ddb6bd46-djm9l\" (UID: \"1fc5122b-0945-47cf-8a35-cd496338269b\") " pod="metallb-system/controller-86ddb6bd46-djm9l" Mar 10 22:06:54 crc kubenswrapper[4919]: I0310 22:06:54.619165 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/e6950f65-bbda-4846-9826-042bd5dbaf87-memberlist\") pod \"speaker-j9g4p\" (UID: \"e6950f65-bbda-4846-9826-042bd5dbaf87\") " pod="metallb-system/speaker-j9g4p" Mar 10 22:06:54 crc kubenswrapper[4919]: I0310 22:06:54.619201 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1fc5122b-0945-47cf-8a35-cd496338269b-metrics-certs\") pod \"controller-86ddb6bd46-djm9l\" (UID: \"1fc5122b-0945-47cf-8a35-cd496338269b\") " pod="metallb-system/controller-86ddb6bd46-djm9l" Mar 10 22:06:54 crc kubenswrapper[4919]: 
I0310 22:06:54.619233 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cgcnl\" (UniqueName: \"kubernetes.io/projected/1fc5122b-0945-47cf-8a35-cd496338269b-kube-api-access-cgcnl\") pod \"controller-86ddb6bd46-djm9l\" (UID: \"1fc5122b-0945-47cf-8a35-cd496338269b\") " pod="metallb-system/controller-86ddb6bd46-djm9l" Mar 10 22:06:54 crc kubenswrapper[4919]: I0310 22:06:54.619263 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-czzcf\" (UniqueName: \"kubernetes.io/projected/e6950f65-bbda-4846-9826-042bd5dbaf87-kube-api-access-czzcf\") pod \"speaker-j9g4p\" (UID: \"e6950f65-bbda-4846-9826-042bd5dbaf87\") " pod="metallb-system/speaker-j9g4p" Mar 10 22:06:54 crc kubenswrapper[4919]: I0310 22:06:54.619289 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e6950f65-bbda-4846-9826-042bd5dbaf87-metrics-certs\") pod \"speaker-j9g4p\" (UID: \"e6950f65-bbda-4846-9826-042bd5dbaf87\") " pod="metallb-system/speaker-j9g4p" Mar 10 22:06:54 crc kubenswrapper[4919]: E0310 22:06:54.619315 4919 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 10 22:06:54 crc kubenswrapper[4919]: E0310 22:06:54.619413 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e6950f65-bbda-4846-9826-042bd5dbaf87-memberlist podName:e6950f65-bbda-4846-9826-042bd5dbaf87 nodeName:}" failed. No retries permitted until 2026-03-10 22:06:55.119371089 +0000 UTC m=+1002.361251777 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/e6950f65-bbda-4846-9826-042bd5dbaf87-memberlist") pod "speaker-j9g4p" (UID: "e6950f65-bbda-4846-9826-042bd5dbaf87") : secret "metallb-memberlist" not found Mar 10 22:06:54 crc kubenswrapper[4919]: E0310 22:06:54.619441 4919 secret.go:188] Couldn't get secret metallb-system/controller-certs-secret: secret "controller-certs-secret" not found Mar 10 22:06:54 crc kubenswrapper[4919]: E0310 22:06:54.619477 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1fc5122b-0945-47cf-8a35-cd496338269b-metrics-certs podName:1fc5122b-0945-47cf-8a35-cd496338269b nodeName:}" failed. No retries permitted until 2026-03-10 22:06:55.119467521 +0000 UTC m=+1002.361348129 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1fc5122b-0945-47cf-8a35-cd496338269b-metrics-certs") pod "controller-86ddb6bd46-djm9l" (UID: "1fc5122b-0945-47cf-8a35-cd496338269b") : secret "controller-certs-secret" not found Mar 10 22:06:54 crc kubenswrapper[4919]: I0310 22:06:54.619714 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/e6950f65-bbda-4846-9826-042bd5dbaf87-metallb-excludel2\") pod \"speaker-j9g4p\" (UID: \"e6950f65-bbda-4846-9826-042bd5dbaf87\") " pod="metallb-system/speaker-j9g4p" Mar 10 22:06:54 crc kubenswrapper[4919]: I0310 22:06:54.622864 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e6950f65-bbda-4846-9826-042bd5dbaf87-metrics-certs\") pod \"speaker-j9g4p\" (UID: \"e6950f65-bbda-4846-9826-042bd5dbaf87\") " pod="metallb-system/speaker-j9g4p" Mar 10 22:06:54 crc kubenswrapper[4919]: I0310 22:06:54.623778 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/1fc5122b-0945-47cf-8a35-cd496338269b-cert\") pod \"controller-86ddb6bd46-djm9l\" (UID: \"1fc5122b-0945-47cf-8a35-cd496338269b\") " pod="metallb-system/controller-86ddb6bd46-djm9l" Mar 10 22:06:54 crc kubenswrapper[4919]: I0310 22:06:54.636826 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-czzcf\" (UniqueName: \"kubernetes.io/projected/e6950f65-bbda-4846-9826-042bd5dbaf87-kube-api-access-czzcf\") pod \"speaker-j9g4p\" (UID: \"e6950f65-bbda-4846-9826-042bd5dbaf87\") " pod="metallb-system/speaker-j9g4p" Mar 10 22:06:54 crc kubenswrapper[4919]: I0310 22:06:54.640924 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cgcnl\" (UniqueName: \"kubernetes.io/projected/1fc5122b-0945-47cf-8a35-cd496338269b-kube-api-access-cgcnl\") pod \"controller-86ddb6bd46-djm9l\" (UID: \"1fc5122b-0945-47cf-8a35-cd496338269b\") " pod="metallb-system/controller-86ddb6bd46-djm9l" Mar 10 22:06:54 crc kubenswrapper[4919]: I0310 22:06:54.693611 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-l4f9h" Mar 10 22:06:55 crc kubenswrapper[4919]: I0310 22:06:55.029056 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/81fac61b-b480-427e-ba18-1c699bf5620a-metrics-certs\") pod \"frr-k8s-tfrn4\" (UID: \"81fac61b-b480-427e-ba18-1c699bf5620a\") " pod="metallb-system/frr-k8s-tfrn4" Mar 10 22:06:55 crc kubenswrapper[4919]: I0310 22:06:55.032683 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/81fac61b-b480-427e-ba18-1c699bf5620a-metrics-certs\") pod \"frr-k8s-tfrn4\" (UID: \"81fac61b-b480-427e-ba18-1c699bf5620a\") " pod="metallb-system/frr-k8s-tfrn4" Mar 10 22:06:55 crc kubenswrapper[4919]: I0310 22:06:55.130050 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1fc5122b-0945-47cf-8a35-cd496338269b-metrics-certs\") pod \"controller-86ddb6bd46-djm9l\" (UID: \"1fc5122b-0945-47cf-8a35-cd496338269b\") " pod="metallb-system/controller-86ddb6bd46-djm9l" Mar 10 22:06:55 crc kubenswrapper[4919]: I0310 22:06:55.130148 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/e6950f65-bbda-4846-9826-042bd5dbaf87-memberlist\") pod \"speaker-j9g4p\" (UID: \"e6950f65-bbda-4846-9826-042bd5dbaf87\") " pod="metallb-system/speaker-j9g4p" Mar 10 22:06:55 crc kubenswrapper[4919]: E0310 22:06:55.130250 4919 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 10 22:06:55 crc kubenswrapper[4919]: E0310 22:06:55.130300 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e6950f65-bbda-4846-9826-042bd5dbaf87-memberlist podName:e6950f65-bbda-4846-9826-042bd5dbaf87 nodeName:}" failed. 
No retries permitted until 2026-03-10 22:06:56.130285811 +0000 UTC m=+1003.372166419 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/e6950f65-bbda-4846-9826-042bd5dbaf87-memberlist") pod "speaker-j9g4p" (UID: "e6950f65-bbda-4846-9826-042bd5dbaf87") : secret "metallb-memberlist" not found Mar 10 22:06:55 crc kubenswrapper[4919]: I0310 22:06:55.133551 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1fc5122b-0945-47cf-8a35-cd496338269b-metrics-certs\") pod \"controller-86ddb6bd46-djm9l\" (UID: \"1fc5122b-0945-47cf-8a35-cd496338269b\") " pod="metallb-system/controller-86ddb6bd46-djm9l" Mar 10 22:06:55 crc kubenswrapper[4919]: I0310 22:06:55.145815 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7f989f654f-l4f9h"] Mar 10 22:06:55 crc kubenswrapper[4919]: I0310 22:06:55.282576 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-tfrn4" Mar 10 22:06:55 crc kubenswrapper[4919]: I0310 22:06:55.357049 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-l4f9h" event={"ID":"522e3074-d06d-4537-b6e4-cd60e9d7c216","Type":"ContainerStarted","Data":"086cec7628f6381c7a9c7c252eca0c476fa795fbfd9cc8b09700074c4d71e87a"} Mar 10 22:06:55 crc kubenswrapper[4919]: I0310 22:06:55.414930 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-86ddb6bd46-djm9l" Mar 10 22:06:55 crc kubenswrapper[4919]: I0310 22:06:55.839905 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-86ddb6bd46-djm9l"] Mar 10 22:06:56 crc kubenswrapper[4919]: I0310 22:06:56.150367 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/e6950f65-bbda-4846-9826-042bd5dbaf87-memberlist\") pod \"speaker-j9g4p\" (UID: \"e6950f65-bbda-4846-9826-042bd5dbaf87\") " pod="metallb-system/speaker-j9g4p" Mar 10 22:06:56 crc kubenswrapper[4919]: I0310 22:06:56.156470 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/e6950f65-bbda-4846-9826-042bd5dbaf87-memberlist\") pod \"speaker-j9g4p\" (UID: \"e6950f65-bbda-4846-9826-042bd5dbaf87\") " pod="metallb-system/speaker-j9g4p" Mar 10 22:06:56 crc kubenswrapper[4919]: I0310 22:06:56.280370 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-j9g4p" Mar 10 22:06:56 crc kubenswrapper[4919]: W0310 22:06:56.296989 4919 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6950f65_bbda_4846_9826_042bd5dbaf87.slice/crio-5d6ac4ea9505bc7a5f2801d9f240ff9f48ecc0de54e7f27b366a44b811ccad94 WatchSource:0}: Error finding container 5d6ac4ea9505bc7a5f2801d9f240ff9f48ecc0de54e7f27b366a44b811ccad94: Status 404 returned error can't find the container with id 5d6ac4ea9505bc7a5f2801d9f240ff9f48ecc0de54e7f27b366a44b811ccad94 Mar 10 22:06:56 crc kubenswrapper[4919]: I0310 22:06:56.363287 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-tfrn4" event={"ID":"81fac61b-b480-427e-ba18-1c699bf5620a","Type":"ContainerStarted","Data":"cc3baa7af6955785f2ef5393bc063b102a8aebb81354c564170620171f07a34c"} Mar 10 22:06:56 crc kubenswrapper[4919]: I0310 22:06:56.366863 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-86ddb6bd46-djm9l" event={"ID":"1fc5122b-0945-47cf-8a35-cd496338269b","Type":"ContainerStarted","Data":"7045c96e0ad03e66caf8be804b10774e79052a4a684d0ab1c46d52f23ff2d6ce"} Mar 10 22:06:56 crc kubenswrapper[4919]: I0310 22:06:56.366891 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-86ddb6bd46-djm9l" event={"ID":"1fc5122b-0945-47cf-8a35-cd496338269b","Type":"ContainerStarted","Data":"fa390517c5c6d0715b788f643d7dfb61368efa051dc45355a31063f3176ee39f"} Mar 10 22:06:56 crc kubenswrapper[4919]: I0310 22:06:56.366900 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-86ddb6bd46-djm9l" event={"ID":"1fc5122b-0945-47cf-8a35-cd496338269b","Type":"ContainerStarted","Data":"63e421bb330b1760ec9eb0915af2841c0fdad74c181dcfc81833c8c8a90269c4"} Mar 10 22:06:56 crc kubenswrapper[4919]: I0310 22:06:56.366927 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="metallb-system/controller-86ddb6bd46-djm9l" Mar 10 22:06:56 crc kubenswrapper[4919]: I0310 22:06:56.380311 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-j9g4p" event={"ID":"e6950f65-bbda-4846-9826-042bd5dbaf87","Type":"ContainerStarted","Data":"5d6ac4ea9505bc7a5f2801d9f240ff9f48ecc0de54e7f27b366a44b811ccad94"} Mar 10 22:06:56 crc kubenswrapper[4919]: I0310 22:06:56.385768 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-86ddb6bd46-djm9l" podStartSLOduration=2.38575581 podStartE2EDuration="2.38575581s" podCreationTimestamp="2026-03-10 22:06:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 22:06:56.384339072 +0000 UTC m=+1003.626219670" watchObservedRunningTime="2026-03-10 22:06:56.38575581 +0000 UTC m=+1003.627636418" Mar 10 22:06:57 crc kubenswrapper[4919]: I0310 22:06:57.394341 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-j9g4p" event={"ID":"e6950f65-bbda-4846-9826-042bd5dbaf87","Type":"ContainerStarted","Data":"16fcd65eb19d8a596d64fed38e791cad2fb42287065910dfd33a2394975728d7"} Mar 10 22:06:57 crc kubenswrapper[4919]: I0310 22:06:57.394717 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-j9g4p" event={"ID":"e6950f65-bbda-4846-9826-042bd5dbaf87","Type":"ContainerStarted","Data":"995393264a974c343c9b61b199f81a50bd7ba4d176eb1a35a48d0cb093074d5b"} Mar 10 22:06:57 crc kubenswrapper[4919]: I0310 22:06:57.394850 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-j9g4p" Mar 10 22:06:57 crc kubenswrapper[4919]: I0310 22:06:57.427022 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-j9g4p" podStartSLOduration=3.426998021 podStartE2EDuration="3.426998021s" podCreationTimestamp="2026-03-10 22:06:54 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 22:06:57.423342861 +0000 UTC m=+1004.665223469" watchObservedRunningTime="2026-03-10 22:06:57.426998021 +0000 UTC m=+1004.668878629" Mar 10 22:06:59 crc kubenswrapper[4919]: I0310 22:06:59.176324 4919 patch_prober.go:28] interesting pod/machine-config-daemon-z7v4t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 22:06:59 crc kubenswrapper[4919]: I0310 22:06:59.177212 4919 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 22:06:59 crc kubenswrapper[4919]: I0310 22:06:59.177467 4919 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" Mar 10 22:06:59 crc kubenswrapper[4919]: I0310 22:06:59.178180 4919 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5c0f64b8b2ef3b8561ca8ab7ca9e89321df88a87f472fe3592188e0b92020ed2"} pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 10 22:06:59 crc kubenswrapper[4919]: I0310 22:06:59.178350 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" containerName="machine-config-daemon" 
containerID="cri-o://5c0f64b8b2ef3b8561ca8ab7ca9e89321df88a87f472fe3592188e0b92020ed2" gracePeriod=600
Mar 10 22:06:59 crc kubenswrapper[4919]: I0310 22:06:59.434582 4919 generic.go:334] "Generic (PLEG): container finished" podID="566678d1-f416-4116-ab20-b30dceb86cdc" containerID="5c0f64b8b2ef3b8561ca8ab7ca9e89321df88a87f472fe3592188e0b92020ed2" exitCode=0
Mar 10 22:06:59 crc kubenswrapper[4919]: I0310 22:06:59.434785 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" event={"ID":"566678d1-f416-4116-ab20-b30dceb86cdc","Type":"ContainerDied","Data":"5c0f64b8b2ef3b8561ca8ab7ca9e89321df88a87f472fe3592188e0b92020ed2"}
Mar 10 22:06:59 crc kubenswrapper[4919]: I0310 22:06:59.435517 4919 scope.go:117] "RemoveContainer" containerID="1b4aa5b33b0728a2f664ae32a561328aa084e55b7c24f15b646a35b9a4014c13"
Mar 10 22:07:00 crc kubenswrapper[4919]: I0310 22:07:00.446799 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" event={"ID":"566678d1-f416-4116-ab20-b30dceb86cdc","Type":"ContainerStarted","Data":"fe6790b4b646495ea90afaa8908c36e512ca4c07ed60f10561e041c0f1b0c857"}
Mar 10 22:07:00 crc kubenswrapper[4919]: I0310 22:07:00.712600 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-c24vn"]
Mar 10 22:07:00 crc kubenswrapper[4919]: I0310 22:07:00.714126 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-c24vn"
Mar 10 22:07:00 crc kubenswrapper[4919]: I0310 22:07:00.720780 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-c24vn"]
Mar 10 22:07:00 crc kubenswrapper[4919]: I0310 22:07:00.821082 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/64f5e4b5-f486-4eaf-8d20-8c8cb719d3ed-utilities\") pod \"redhat-marketplace-c24vn\" (UID: \"64f5e4b5-f486-4eaf-8d20-8c8cb719d3ed\") " pod="openshift-marketplace/redhat-marketplace-c24vn"
Mar 10 22:07:00 crc kubenswrapper[4919]: I0310 22:07:00.821184 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/64f5e4b5-f486-4eaf-8d20-8c8cb719d3ed-catalog-content\") pod \"redhat-marketplace-c24vn\" (UID: \"64f5e4b5-f486-4eaf-8d20-8c8cb719d3ed\") " pod="openshift-marketplace/redhat-marketplace-c24vn"
Mar 10 22:07:00 crc kubenswrapper[4919]: I0310 22:07:00.821232 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nktwn\" (UniqueName: \"kubernetes.io/projected/64f5e4b5-f486-4eaf-8d20-8c8cb719d3ed-kube-api-access-nktwn\") pod \"redhat-marketplace-c24vn\" (UID: \"64f5e4b5-f486-4eaf-8d20-8c8cb719d3ed\") " pod="openshift-marketplace/redhat-marketplace-c24vn"
Mar 10 22:07:00 crc kubenswrapper[4919]: I0310 22:07:00.923858 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nktwn\" (UniqueName: \"kubernetes.io/projected/64f5e4b5-f486-4eaf-8d20-8c8cb719d3ed-kube-api-access-nktwn\") pod \"redhat-marketplace-c24vn\" (UID: \"64f5e4b5-f486-4eaf-8d20-8c8cb719d3ed\") " pod="openshift-marketplace/redhat-marketplace-c24vn"
Mar 10 22:07:00 crc kubenswrapper[4919]: I0310 22:07:00.923938 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/64f5e4b5-f486-4eaf-8d20-8c8cb719d3ed-utilities\") pod \"redhat-marketplace-c24vn\" (UID: \"64f5e4b5-f486-4eaf-8d20-8c8cb719d3ed\") " pod="openshift-marketplace/redhat-marketplace-c24vn"
Mar 10 22:07:00 crc kubenswrapper[4919]: I0310 22:07:00.923966 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/64f5e4b5-f486-4eaf-8d20-8c8cb719d3ed-catalog-content\") pod \"redhat-marketplace-c24vn\" (UID: \"64f5e4b5-f486-4eaf-8d20-8c8cb719d3ed\") " pod="openshift-marketplace/redhat-marketplace-c24vn"
Mar 10 22:07:00 crc kubenswrapper[4919]: I0310 22:07:00.924748 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/64f5e4b5-f486-4eaf-8d20-8c8cb719d3ed-catalog-content\") pod \"redhat-marketplace-c24vn\" (UID: \"64f5e4b5-f486-4eaf-8d20-8c8cb719d3ed\") " pod="openshift-marketplace/redhat-marketplace-c24vn"
Mar 10 22:07:00 crc kubenswrapper[4919]: I0310 22:07:00.925345 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/64f5e4b5-f486-4eaf-8d20-8c8cb719d3ed-utilities\") pod \"redhat-marketplace-c24vn\" (UID: \"64f5e4b5-f486-4eaf-8d20-8c8cb719d3ed\") " pod="openshift-marketplace/redhat-marketplace-c24vn"
Mar 10 22:07:00 crc kubenswrapper[4919]: I0310 22:07:00.967443 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nktwn\" (UniqueName: \"kubernetes.io/projected/64f5e4b5-f486-4eaf-8d20-8c8cb719d3ed-kube-api-access-nktwn\") pod \"redhat-marketplace-c24vn\" (UID: \"64f5e4b5-f486-4eaf-8d20-8c8cb719d3ed\") " pod="openshift-marketplace/redhat-marketplace-c24vn"
Mar 10 22:07:01 crc kubenswrapper[4919]: I0310 22:07:01.095750 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-c24vn"
Mar 10 22:07:03 crc kubenswrapper[4919]: I0310 22:07:03.465681 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-l4f9h" event={"ID":"522e3074-d06d-4537-b6e4-cd60e9d7c216","Type":"ContainerStarted","Data":"0c8f2341ec6f4c9e82d16f8ac0be359d5ca1e623f60b0ab685e3f46c69c498af"}
Mar 10 22:07:03 crc kubenswrapper[4919]: I0310 22:07:03.466899 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-l4f9h"
Mar 10 22:07:03 crc kubenswrapper[4919]: I0310 22:07:03.468206 4919 generic.go:334] "Generic (PLEG): container finished" podID="81fac61b-b480-427e-ba18-1c699bf5620a" containerID="49b5d6121bd858a8e42127e8a396d92915884f404f6e75ce32ae73dd0e8ff56d" exitCode=0
Mar 10 22:07:03 crc kubenswrapper[4919]: I0310 22:07:03.468232 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-tfrn4" event={"ID":"81fac61b-b480-427e-ba18-1c699bf5620a","Type":"ContainerDied","Data":"49b5d6121bd858a8e42127e8a396d92915884f404f6e75ce32ae73dd0e8ff56d"}
Mar 10 22:07:03 crc kubenswrapper[4919]: I0310 22:07:03.485093 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-l4f9h" podStartSLOduration=1.353600307 podStartE2EDuration="9.485073154s" podCreationTimestamp="2026-03-10 22:06:54 +0000 UTC" firstStartedPulling="2026-03-10 22:06:55.153956062 +0000 UTC m=+1002.395836670" lastFinishedPulling="2026-03-10 22:07:03.285428909 +0000 UTC m=+1010.527309517" observedRunningTime="2026-03-10 22:07:03.483641465 +0000 UTC m=+1010.725522083" watchObservedRunningTime="2026-03-10 22:07:03.485073154 +0000 UTC m=+1010.726953762"
Mar 10 22:07:03 crc kubenswrapper[4919]: I0310 22:07:03.562654 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-c24vn"]
Mar 10 22:07:04 crc kubenswrapper[4919]: I0310 22:07:04.487044 4919 generic.go:334] "Generic (PLEG): container finished" podID="81fac61b-b480-427e-ba18-1c699bf5620a" containerID="f9a9f5f34fedcbb0f34c0e34de2466607161f85db99f01318a711dce476475c7" exitCode=0
Mar 10 22:07:04 crc kubenswrapper[4919]: I0310 22:07:04.487335 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-tfrn4" event={"ID":"81fac61b-b480-427e-ba18-1c699bf5620a","Type":"ContainerDied","Data":"f9a9f5f34fedcbb0f34c0e34de2466607161f85db99f01318a711dce476475c7"}
Mar 10 22:07:04 crc kubenswrapper[4919]: I0310 22:07:04.491162 4919 generic.go:334] "Generic (PLEG): container finished" podID="64f5e4b5-f486-4eaf-8d20-8c8cb719d3ed" containerID="d4d7aeb42fb15be03be0d24d413d389aed06736ba73da1fc63e77b0aa21d4753" exitCode=0
Mar 10 22:07:04 crc kubenswrapper[4919]: I0310 22:07:04.491477 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c24vn" event={"ID":"64f5e4b5-f486-4eaf-8d20-8c8cb719d3ed","Type":"ContainerDied","Data":"d4d7aeb42fb15be03be0d24d413d389aed06736ba73da1fc63e77b0aa21d4753"}
Mar 10 22:07:04 crc kubenswrapper[4919]: I0310 22:07:04.491547 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c24vn" event={"ID":"64f5e4b5-f486-4eaf-8d20-8c8cb719d3ed","Type":"ContainerStarted","Data":"cda33e467ac37810d65e5317fce972ae93b336b41e4e33182181e1a300dd6a61"}
Mar 10 22:07:05 crc kubenswrapper[4919]: I0310 22:07:05.420080 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-86ddb6bd46-djm9l"
Mar 10 22:07:05 crc kubenswrapper[4919]: I0310 22:07:05.497150 4919 generic.go:334] "Generic (PLEG): container finished" podID="64f5e4b5-f486-4eaf-8d20-8c8cb719d3ed" containerID="232c6b59b8034701997a752bff1b1ba09bd176d447fce60f4d1ec5c224df5712" exitCode=0
Mar 10 22:07:05 crc kubenswrapper[4919]: I0310 22:07:05.497202 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c24vn" event={"ID":"64f5e4b5-f486-4eaf-8d20-8c8cb719d3ed","Type":"ContainerDied","Data":"232c6b59b8034701997a752bff1b1ba09bd176d447fce60f4d1ec5c224df5712"}
Mar 10 22:07:05 crc kubenswrapper[4919]: I0310 22:07:05.501660 4919 generic.go:334] "Generic (PLEG): container finished" podID="81fac61b-b480-427e-ba18-1c699bf5620a" containerID="dcfbf26dcb1df63f56a20afcf31b49d5c3250e4ecc6edbbebcdbaceeffa9fde1" exitCode=0
Mar 10 22:07:05 crc kubenswrapper[4919]: I0310 22:07:05.502330 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-tfrn4" event={"ID":"81fac61b-b480-427e-ba18-1c699bf5620a","Type":"ContainerDied","Data":"dcfbf26dcb1df63f56a20afcf31b49d5c3250e4ecc6edbbebcdbaceeffa9fde1"}
Mar 10 22:07:06 crc kubenswrapper[4919]: I0310 22:07:06.284309 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-j9g4p"
Mar 10 22:07:06 crc kubenswrapper[4919]: I0310 22:07:06.510150 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c24vn" event={"ID":"64f5e4b5-f486-4eaf-8d20-8c8cb719d3ed","Type":"ContainerStarted","Data":"a5831f46ae7d8b96d2d054be3e9a7e6504627c66e706c84fc96fd087315a93fd"}
Mar 10 22:07:06 crc kubenswrapper[4919]: I0310 22:07:06.514669 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-tfrn4" event={"ID":"81fac61b-b480-427e-ba18-1c699bf5620a","Type":"ContainerStarted","Data":"466d11cd72c5dd9183c0be48730fc0859f814e7c1d70d6be227102f70573db39"}
Mar 10 22:07:06 crc kubenswrapper[4919]: I0310 22:07:06.514706 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-tfrn4" event={"ID":"81fac61b-b480-427e-ba18-1c699bf5620a","Type":"ContainerStarted","Data":"e7fb5432c4b08ac69fe683d1cb6983b816f3b6633e6397bc4926d3651a1c2842"}
Mar 10 22:07:06 crc kubenswrapper[4919]: I0310 22:07:06.514716 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-tfrn4" event={"ID":"81fac61b-b480-427e-ba18-1c699bf5620a","Type":"ContainerStarted","Data":"8d93a1b8934a026ccf5c948b3c00b7a275546ea1fa309901354126094ded50c9"}
Mar 10 22:07:06 crc kubenswrapper[4919]: I0310 22:07:06.514724 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-tfrn4" event={"ID":"81fac61b-b480-427e-ba18-1c699bf5620a","Type":"ContainerStarted","Data":"4aefb5258c8f7817a7d8e0e6596cd576540066eaaf48dcddf05a3c9ea0cc4927"}
Mar 10 22:07:06 crc kubenswrapper[4919]: I0310 22:07:06.514732 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-tfrn4" event={"ID":"81fac61b-b480-427e-ba18-1c699bf5620a","Type":"ContainerStarted","Data":"36e6aeaf45bf54ee80d8a0797be39e3ba1960ee8df0cb0874fa81d09b0dca730"}
Mar 10 22:07:06 crc kubenswrapper[4919]: I0310 22:07:06.514740 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-tfrn4" event={"ID":"81fac61b-b480-427e-ba18-1c699bf5620a","Type":"ContainerStarted","Data":"6af5cf0b36adfd890ca42cfb0b1a94bfcc5956e411dbc3782ceb872952cc2ba5"}
Mar 10 22:07:06 crc kubenswrapper[4919]: I0310 22:07:06.515333 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-tfrn4"
Mar 10 22:07:06 crc kubenswrapper[4919]: I0310 22:07:06.566247 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-c24vn" podStartSLOduration=5.128527468 podStartE2EDuration="6.566228581s" podCreationTimestamp="2026-03-10 22:07:00 +0000 UTC" firstStartedPulling="2026-03-10 22:07:04.494832612 +0000 UTC m=+1011.736713260" lastFinishedPulling="2026-03-10 22:07:05.932533765 +0000 UTC m=+1013.174414373" observedRunningTime="2026-03-10 22:07:06.528896701 +0000 UTC m=+1013.770777309" watchObservedRunningTime="2026-03-10 22:07:06.566228581 +0000 UTC m=+1013.808109189"
Mar 10 22:07:08 crc kubenswrapper[4919]: I0310 22:07:08.074785 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-tfrn4" podStartSLOduration=6.205958817 podStartE2EDuration="14.074766893s" podCreationTimestamp="2026-03-10 22:06:54 +0000 UTC" firstStartedPulling="2026-03-10 22:06:55.399739796 +0000 UTC m=+1002.641620404" lastFinishedPulling="2026-03-10 22:07:03.268547872 +0000 UTC m=+1010.510428480" observedRunningTime="2026-03-10 22:07:06.564628248 +0000 UTC m=+1013.806508866" watchObservedRunningTime="2026-03-10 22:07:08.074766893 +0000 UTC m=+1015.316647501"
Mar 10 22:07:08 crc kubenswrapper[4919]: I0310 22:07:08.077939 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5lktf2"]
Mar 10 22:07:08 crc kubenswrapper[4919]: I0310 22:07:08.079168 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5lktf2"
Mar 10 22:07:08 crc kubenswrapper[4919]: I0310 22:07:08.081039 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc"
Mar 10 22:07:08 crc kubenswrapper[4919]: I0310 22:07:08.089036 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5lktf2"]
Mar 10 22:07:08 crc kubenswrapper[4919]: I0310 22:07:08.229281 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2lvv9\" (UniqueName: \"kubernetes.io/projected/e5fad1c3-3133-4fca-8614-ce814b312e72-kube-api-access-2lvv9\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5lktf2\" (UID: \"e5fad1c3-3133-4fca-8614-ce814b312e72\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5lktf2"
Mar 10 22:07:08 crc kubenswrapper[4919]: I0310 22:07:08.229352 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e5fad1c3-3133-4fca-8614-ce814b312e72-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5lktf2\" (UID: \"e5fad1c3-3133-4fca-8614-ce814b312e72\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5lktf2"
Mar 10 22:07:08 crc kubenswrapper[4919]: I0310 22:07:08.229452 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e5fad1c3-3133-4fca-8614-ce814b312e72-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5lktf2\" (UID: \"e5fad1c3-3133-4fca-8614-ce814b312e72\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5lktf2"
Mar 10 22:07:08 crc kubenswrapper[4919]: I0310 22:07:08.330829 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e5fad1c3-3133-4fca-8614-ce814b312e72-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5lktf2\" (UID: \"e5fad1c3-3133-4fca-8614-ce814b312e72\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5lktf2"
Mar 10 22:07:08 crc kubenswrapper[4919]: I0310 22:07:08.330897 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2lvv9\" (UniqueName: \"kubernetes.io/projected/e5fad1c3-3133-4fca-8614-ce814b312e72-kube-api-access-2lvv9\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5lktf2\" (UID: \"e5fad1c3-3133-4fca-8614-ce814b312e72\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5lktf2"
Mar 10 22:07:08 crc kubenswrapper[4919]: I0310 22:07:08.330938 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e5fad1c3-3133-4fca-8614-ce814b312e72-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5lktf2\" (UID: \"e5fad1c3-3133-4fca-8614-ce814b312e72\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5lktf2"
Mar 10 22:07:08 crc kubenswrapper[4919]: I0310 22:07:08.331454 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e5fad1c3-3133-4fca-8614-ce814b312e72-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5lktf2\" (UID: \"e5fad1c3-3133-4fca-8614-ce814b312e72\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5lktf2"
Mar 10 22:07:08 crc kubenswrapper[4919]: I0310 22:07:08.331477 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e5fad1c3-3133-4fca-8614-ce814b312e72-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5lktf2\" (UID: \"e5fad1c3-3133-4fca-8614-ce814b312e72\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5lktf2"
Mar 10 22:07:08 crc kubenswrapper[4919]: I0310 22:07:08.348201 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2lvv9\" (UniqueName: \"kubernetes.io/projected/e5fad1c3-3133-4fca-8614-ce814b312e72-kube-api-access-2lvv9\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5lktf2\" (UID: \"e5fad1c3-3133-4fca-8614-ce814b312e72\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5lktf2"
Mar 10 22:07:08 crc kubenswrapper[4919]: I0310 22:07:08.401611 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5lktf2"
Mar 10 22:07:08 crc kubenswrapper[4919]: I0310 22:07:08.607966 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5lktf2"]
Mar 10 22:07:08 crc kubenswrapper[4919]: W0310 22:07:08.612963 4919 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode5fad1c3_3133_4fca_8614_ce814b312e72.slice/crio-75dad33d05c9931d15f2c1f3771d583ca6d0122de55062013723846969d9b8f6 WatchSource:0}: Error finding container 75dad33d05c9931d15f2c1f3771d583ca6d0122de55062013723846969d9b8f6: Status 404 returned error can't find the container with id 75dad33d05c9931d15f2c1f3771d583ca6d0122de55062013723846969d9b8f6
Mar 10 22:07:09 crc kubenswrapper[4919]: I0310 22:07:09.537037 4919 generic.go:334] "Generic (PLEG): container finished" podID="e5fad1c3-3133-4fca-8614-ce814b312e72" containerID="3ca6ec70dd3838caa4581267976e782555bf6acbdc7bdc2625967cf16409d3e3" exitCode=0
Mar 10 22:07:09 crc kubenswrapper[4919]: I0310 22:07:09.537081 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5lktf2" event={"ID":"e5fad1c3-3133-4fca-8614-ce814b312e72","Type":"ContainerDied","Data":"3ca6ec70dd3838caa4581267976e782555bf6acbdc7bdc2625967cf16409d3e3"}
Mar 10 22:07:09 crc kubenswrapper[4919]: I0310 22:07:09.537110 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5lktf2" event={"ID":"e5fad1c3-3133-4fca-8614-ce814b312e72","Type":"ContainerStarted","Data":"75dad33d05c9931d15f2c1f3771d583ca6d0122de55062013723846969d9b8f6"}
Mar 10 22:07:10 crc kubenswrapper[4919]: I0310 22:07:10.283836 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-tfrn4"
Mar 10 22:07:10 crc kubenswrapper[4919]: I0310 22:07:10.325950 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-tfrn4"
Mar 10 22:07:11 crc kubenswrapper[4919]: I0310 22:07:11.096168 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-c24vn"
Mar 10 22:07:11 crc kubenswrapper[4919]: I0310 22:07:11.096322 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-c24vn"
Mar 10 22:07:11 crc kubenswrapper[4919]: I0310 22:07:11.148596 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-c24vn"
Mar 10 22:07:11 crc kubenswrapper[4919]: I0310 22:07:11.587189 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-c24vn"
Mar 10 22:07:12 crc kubenswrapper[4919]: I0310 22:07:12.558808 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5lktf2" event={"ID":"e5fad1c3-3133-4fca-8614-ce814b312e72","Type":"ContainerStarted","Data":"d618e64bc642ebfa01e0eec63e5d2ca4864f4c4e7041dc792dc346ffbd87b3f9"}
Mar 10 22:07:13 crc kubenswrapper[4919]: I0310 22:07:13.431375 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-c24vn"]
Mar 10 22:07:13 crc kubenswrapper[4919]: I0310 22:07:13.568436 4919 generic.go:334] "Generic (PLEG): container finished" podID="e5fad1c3-3133-4fca-8614-ce814b312e72" containerID="d618e64bc642ebfa01e0eec63e5d2ca4864f4c4e7041dc792dc346ffbd87b3f9" exitCode=0
Mar 10 22:07:13 crc kubenswrapper[4919]: I0310 22:07:13.568494 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5lktf2" event={"ID":"e5fad1c3-3133-4fca-8614-ce814b312e72","Type":"ContainerDied","Data":"d618e64bc642ebfa01e0eec63e5d2ca4864f4c4e7041dc792dc346ffbd87b3f9"}
Mar 10 22:07:14 crc kubenswrapper[4919]: I0310 22:07:14.575423 4919 generic.go:334] "Generic (PLEG): container finished" podID="e5fad1c3-3133-4fca-8614-ce814b312e72" containerID="9d602cb0fafb8e12df505e2f2c2aed263b56a8177e6d0d23ff73924bb8eb0dcc" exitCode=0
Mar 10 22:07:14 crc kubenswrapper[4919]: I0310 22:07:14.575620 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-c24vn" podUID="64f5e4b5-f486-4eaf-8d20-8c8cb719d3ed" containerName="registry-server" containerID="cri-o://a5831f46ae7d8b96d2d054be3e9a7e6504627c66e706c84fc96fd087315a93fd" gracePeriod=2
Mar 10 22:07:14 crc kubenswrapper[4919]: I0310 22:07:14.575956 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5lktf2" event={"ID":"e5fad1c3-3133-4fca-8614-ce814b312e72","Type":"ContainerDied","Data":"9d602cb0fafb8e12df505e2f2c2aed263b56a8177e6d0d23ff73924bb8eb0dcc"}
Mar 10 22:07:14 crc kubenswrapper[4919]: I0310 22:07:14.700762 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-l4f9h"
Mar 10 22:07:14 crc kubenswrapper[4919]: I0310 22:07:14.939266 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-c24vn"
Mar 10 22:07:15 crc kubenswrapper[4919]: I0310 22:07:15.025405 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/64f5e4b5-f486-4eaf-8d20-8c8cb719d3ed-catalog-content\") pod \"64f5e4b5-f486-4eaf-8d20-8c8cb719d3ed\" (UID: \"64f5e4b5-f486-4eaf-8d20-8c8cb719d3ed\") "
Mar 10 22:07:15 crc kubenswrapper[4919]: I0310 22:07:15.025521 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nktwn\" (UniqueName: \"kubernetes.io/projected/64f5e4b5-f486-4eaf-8d20-8c8cb719d3ed-kube-api-access-nktwn\") pod \"64f5e4b5-f486-4eaf-8d20-8c8cb719d3ed\" (UID: \"64f5e4b5-f486-4eaf-8d20-8c8cb719d3ed\") "
Mar 10 22:07:15 crc kubenswrapper[4919]: I0310 22:07:15.025573 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/64f5e4b5-f486-4eaf-8d20-8c8cb719d3ed-utilities\") pod \"64f5e4b5-f486-4eaf-8d20-8c8cb719d3ed\" (UID: \"64f5e4b5-f486-4eaf-8d20-8c8cb719d3ed\") "
Mar 10 22:07:15 crc kubenswrapper[4919]: I0310 22:07:15.026603 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/64f5e4b5-f486-4eaf-8d20-8c8cb719d3ed-utilities" (OuterVolumeSpecName: "utilities") pod "64f5e4b5-f486-4eaf-8d20-8c8cb719d3ed" (UID: "64f5e4b5-f486-4eaf-8d20-8c8cb719d3ed"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 22:07:15 crc kubenswrapper[4919]: I0310 22:07:15.035577 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64f5e4b5-f486-4eaf-8d20-8c8cb719d3ed-kube-api-access-nktwn" (OuterVolumeSpecName: "kube-api-access-nktwn") pod "64f5e4b5-f486-4eaf-8d20-8c8cb719d3ed" (UID: "64f5e4b5-f486-4eaf-8d20-8c8cb719d3ed"). InnerVolumeSpecName "kube-api-access-nktwn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 22:07:15 crc kubenswrapper[4919]: I0310 22:07:15.062256 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/64f5e4b5-f486-4eaf-8d20-8c8cb719d3ed-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "64f5e4b5-f486-4eaf-8d20-8c8cb719d3ed" (UID: "64f5e4b5-f486-4eaf-8d20-8c8cb719d3ed"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 22:07:15 crc kubenswrapper[4919]: I0310 22:07:15.126643 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nktwn\" (UniqueName: \"kubernetes.io/projected/64f5e4b5-f486-4eaf-8d20-8c8cb719d3ed-kube-api-access-nktwn\") on node \"crc\" DevicePath \"\""
Mar 10 22:07:15 crc kubenswrapper[4919]: I0310 22:07:15.126677 4919 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/64f5e4b5-f486-4eaf-8d20-8c8cb719d3ed-utilities\") on node \"crc\" DevicePath \"\""
Mar 10 22:07:15 crc kubenswrapper[4919]: I0310 22:07:15.126689 4919 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/64f5e4b5-f486-4eaf-8d20-8c8cb719d3ed-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 10 22:07:15 crc kubenswrapper[4919]: I0310 22:07:15.286379 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-tfrn4"
Mar 10 22:07:15 crc kubenswrapper[4919]: I0310 22:07:15.583647 4919 generic.go:334] "Generic (PLEG): container finished" podID="64f5e4b5-f486-4eaf-8d20-8c8cb719d3ed" containerID="a5831f46ae7d8b96d2d054be3e9a7e6504627c66e706c84fc96fd087315a93fd" exitCode=0
Mar 10 22:07:15 crc kubenswrapper[4919]: I0310 22:07:15.583734 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-c24vn"
Mar 10 22:07:15 crc kubenswrapper[4919]: I0310 22:07:15.583765 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c24vn" event={"ID":"64f5e4b5-f486-4eaf-8d20-8c8cb719d3ed","Type":"ContainerDied","Data":"a5831f46ae7d8b96d2d054be3e9a7e6504627c66e706c84fc96fd087315a93fd"}
Mar 10 22:07:15 crc kubenswrapper[4919]: I0310 22:07:15.583801 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c24vn" event={"ID":"64f5e4b5-f486-4eaf-8d20-8c8cb719d3ed","Type":"ContainerDied","Data":"cda33e467ac37810d65e5317fce972ae93b336b41e4e33182181e1a300dd6a61"}
Mar 10 22:07:15 crc kubenswrapper[4919]: I0310 22:07:15.583817 4919 scope.go:117] "RemoveContainer" containerID="a5831f46ae7d8b96d2d054be3e9a7e6504627c66e706c84fc96fd087315a93fd"
Mar 10 22:07:15 crc kubenswrapper[4919]: I0310 22:07:15.600500 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-c24vn"]
Mar 10 22:07:15 crc kubenswrapper[4919]: I0310 22:07:15.603839 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-c24vn"]
Mar 10 22:07:15 crc kubenswrapper[4919]: I0310 22:07:15.610575 4919 scope.go:117] "RemoveContainer" containerID="232c6b59b8034701997a752bff1b1ba09bd176d447fce60f4d1ec5c224df5712"
Mar 10 22:07:15 crc kubenswrapper[4919]: I0310 22:07:15.639665 4919 scope.go:117] "RemoveContainer" containerID="d4d7aeb42fb15be03be0d24d413d389aed06736ba73da1fc63e77b0aa21d4753"
Mar 10 22:07:15 crc kubenswrapper[4919]: I0310 22:07:15.653139 4919 scope.go:117] "RemoveContainer" containerID="a5831f46ae7d8b96d2d054be3e9a7e6504627c66e706c84fc96fd087315a93fd"
Mar 10 22:07:15 crc kubenswrapper[4919]: E0310 22:07:15.653541 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a5831f46ae7d8b96d2d054be3e9a7e6504627c66e706c84fc96fd087315a93fd\": container with ID starting with a5831f46ae7d8b96d2d054be3e9a7e6504627c66e706c84fc96fd087315a93fd not found: ID does not exist" containerID="a5831f46ae7d8b96d2d054be3e9a7e6504627c66e706c84fc96fd087315a93fd"
Mar 10 22:07:15 crc kubenswrapper[4919]: I0310 22:07:15.653623 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a5831f46ae7d8b96d2d054be3e9a7e6504627c66e706c84fc96fd087315a93fd"} err="failed to get container status \"a5831f46ae7d8b96d2d054be3e9a7e6504627c66e706c84fc96fd087315a93fd\": rpc error: code = NotFound desc = could not find container \"a5831f46ae7d8b96d2d054be3e9a7e6504627c66e706c84fc96fd087315a93fd\": container with ID starting with a5831f46ae7d8b96d2d054be3e9a7e6504627c66e706c84fc96fd087315a93fd not found: ID does not exist"
Mar 10 22:07:15 crc kubenswrapper[4919]: I0310 22:07:15.653652 4919 scope.go:117] "RemoveContainer" containerID="232c6b59b8034701997a752bff1b1ba09bd176d447fce60f4d1ec5c224df5712"
Mar 10 22:07:15 crc kubenswrapper[4919]: E0310 22:07:15.654095 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"232c6b59b8034701997a752bff1b1ba09bd176d447fce60f4d1ec5c224df5712\": container with ID starting with 232c6b59b8034701997a752bff1b1ba09bd176d447fce60f4d1ec5c224df5712 not found: ID does not exist" containerID="232c6b59b8034701997a752bff1b1ba09bd176d447fce60f4d1ec5c224df5712"
Mar 10 22:07:15 crc kubenswrapper[4919]: I0310 22:07:15.654122 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"232c6b59b8034701997a752bff1b1ba09bd176d447fce60f4d1ec5c224df5712"} err="failed to get container status \"232c6b59b8034701997a752bff1b1ba09bd176d447fce60f4d1ec5c224df5712\": rpc error: code = NotFound desc = could not find container \"232c6b59b8034701997a752bff1b1ba09bd176d447fce60f4d1ec5c224df5712\": container with ID starting with 232c6b59b8034701997a752bff1b1ba09bd176d447fce60f4d1ec5c224df5712 not found: ID does not exist"
Mar 10 22:07:15 crc kubenswrapper[4919]: I0310 22:07:15.654138 4919 scope.go:117] "RemoveContainer" containerID="d4d7aeb42fb15be03be0d24d413d389aed06736ba73da1fc63e77b0aa21d4753"
Mar 10 22:07:15 crc kubenswrapper[4919]: E0310 22:07:15.654407 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d4d7aeb42fb15be03be0d24d413d389aed06736ba73da1fc63e77b0aa21d4753\": container with ID starting with d4d7aeb42fb15be03be0d24d413d389aed06736ba73da1fc63e77b0aa21d4753 not found: ID does not exist" containerID="d4d7aeb42fb15be03be0d24d413d389aed06736ba73da1fc63e77b0aa21d4753"
Mar 10 22:07:15 crc kubenswrapper[4919]: I0310 22:07:15.654433 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4d7aeb42fb15be03be0d24d413d389aed06736ba73da1fc63e77b0aa21d4753"} err="failed to get container status \"d4d7aeb42fb15be03be0d24d413d389aed06736ba73da1fc63e77b0aa21d4753\": rpc error: code = NotFound desc = could not find container \"d4d7aeb42fb15be03be0d24d413d389aed06736ba73da1fc63e77b0aa21d4753\": container with ID starting with d4d7aeb42fb15be03be0d24d413d389aed06736ba73da1fc63e77b0aa21d4753 not found: ID does not exist"
Mar 10 22:07:15 crc kubenswrapper[4919]: I0310 22:07:15.896534 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5lktf2"
Mar 10 22:07:16 crc kubenswrapper[4919]: I0310 22:07:16.036920 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e5fad1c3-3133-4fca-8614-ce814b312e72-util\") pod \"e5fad1c3-3133-4fca-8614-ce814b312e72\" (UID: \"e5fad1c3-3133-4fca-8614-ce814b312e72\") "
Mar 10 22:07:16 crc kubenswrapper[4919]: I0310 22:07:16.036979 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e5fad1c3-3133-4fca-8614-ce814b312e72-bundle\") pod \"e5fad1c3-3133-4fca-8614-ce814b312e72\" (UID: \"e5fad1c3-3133-4fca-8614-ce814b312e72\") "
Mar 10 22:07:16 crc kubenswrapper[4919]: I0310 22:07:16.037064 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2lvv9\" (UniqueName: \"kubernetes.io/projected/e5fad1c3-3133-4fca-8614-ce814b312e72-kube-api-access-2lvv9\") pod \"e5fad1c3-3133-4fca-8614-ce814b312e72\" (UID: \"e5fad1c3-3133-4fca-8614-ce814b312e72\") "
Mar 10 22:07:16 crc kubenswrapper[4919]: I0310 22:07:16.038949 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5fad1c3-3133-4fca-8614-ce814b312e72-bundle" (OuterVolumeSpecName: "bundle") pod "e5fad1c3-3133-4fca-8614-ce814b312e72" (UID: "e5fad1c3-3133-4fca-8614-ce814b312e72"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 22:07:16 crc kubenswrapper[4919]: I0310 22:07:16.042263 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5fad1c3-3133-4fca-8614-ce814b312e72-kube-api-access-2lvv9" (OuterVolumeSpecName: "kube-api-access-2lvv9") pod "e5fad1c3-3133-4fca-8614-ce814b312e72" (UID: "e5fad1c3-3133-4fca-8614-ce814b312e72"). InnerVolumeSpecName "kube-api-access-2lvv9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 22:07:16 crc kubenswrapper[4919]: I0310 22:07:16.057209 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5fad1c3-3133-4fca-8614-ce814b312e72-util" (OuterVolumeSpecName: "util") pod "e5fad1c3-3133-4fca-8614-ce814b312e72" (UID: "e5fad1c3-3133-4fca-8614-ce814b312e72"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 22:07:16 crc kubenswrapper[4919]: I0310 22:07:16.139371 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2lvv9\" (UniqueName: \"kubernetes.io/projected/e5fad1c3-3133-4fca-8614-ce814b312e72-kube-api-access-2lvv9\") on node \"crc\" DevicePath \"\""
Mar 10 22:07:16 crc kubenswrapper[4919]: I0310 22:07:16.139846 4919 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e5fad1c3-3133-4fca-8614-ce814b312e72-util\") on node \"crc\" DevicePath \"\""
Mar 10 22:07:16 crc kubenswrapper[4919]: I0310 22:07:16.139910 4919 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e5fad1c3-3133-4fca-8614-ce814b312e72-bundle\") on node \"crc\" DevicePath \"\""
Mar 10 22:07:16 crc kubenswrapper[4919]: I0310 22:07:16.592604 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5lktf2" event={"ID":"e5fad1c3-3133-4fca-8614-ce814b312e72","Type":"ContainerDied","Data":"75dad33d05c9931d15f2c1f3771d583ca6d0122de55062013723846969d9b8f6"}
Mar 10 22:07:16 crc kubenswrapper[4919]: I0310 22:07:16.592640 4919 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="75dad33d05c9931d15f2c1f3771d583ca6d0122de55062013723846969d9b8f6"
Mar 10 22:07:16 crc kubenswrapper[4919]: I0310 22:07:16.592664 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5lktf2"
Mar 10 22:07:17 crc kubenswrapper[4919]: I0310 22:07:17.487657 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="64f5e4b5-f486-4eaf-8d20-8c8cb719d3ed" path="/var/lib/kubelet/pods/64f5e4b5-f486-4eaf-8d20-8c8cb719d3ed/volumes"
Mar 10 22:07:21 crc kubenswrapper[4919]: I0310 22:07:21.215846 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-vj2jp"]
Mar 10 22:07:21 crc kubenswrapper[4919]: E0310 22:07:21.216328 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5fad1c3-3133-4fca-8614-ce814b312e72" containerName="extract"
Mar 10 22:07:21 crc kubenswrapper[4919]: I0310 22:07:21.216341 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5fad1c3-3133-4fca-8614-ce814b312e72" containerName="extract"
Mar 10 22:07:21 crc kubenswrapper[4919]: E0310 22:07:21.216354 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64f5e4b5-f486-4eaf-8d20-8c8cb719d3ed" containerName="extract-utilities"
Mar 10 22:07:21 crc kubenswrapper[4919]: I0310 22:07:21.216360 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="64f5e4b5-f486-4eaf-8d20-8c8cb719d3ed" containerName="extract-utilities"
Mar 10 22:07:21 crc kubenswrapper[4919]: E0310 22:07:21.216373 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5fad1c3-3133-4fca-8614-ce814b312e72" containerName="pull"
Mar 10 22:07:21 crc kubenswrapper[4919]: I0310 22:07:21.216380 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5fad1c3-3133-4fca-8614-ce814b312e72" containerName="pull"
Mar 10 22:07:21 crc kubenswrapper[4919]: E0310 22:07:21.216399 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64f5e4b5-f486-4eaf-8d20-8c8cb719d3ed" containerName="registry-server"
Mar 10 22:07:21 crc kubenswrapper[4919]: I0310 22:07:21.216405 4919
state_mem.go:107] "Deleted CPUSet assignment" podUID="64f5e4b5-f486-4eaf-8d20-8c8cb719d3ed" containerName="registry-server" Mar 10 22:07:21 crc kubenswrapper[4919]: E0310 22:07:21.216413 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5fad1c3-3133-4fca-8614-ce814b312e72" containerName="util" Mar 10 22:07:21 crc kubenswrapper[4919]: I0310 22:07:21.216418 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5fad1c3-3133-4fca-8614-ce814b312e72" containerName="util" Mar 10 22:07:21 crc kubenswrapper[4919]: E0310 22:07:21.216427 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64f5e4b5-f486-4eaf-8d20-8c8cb719d3ed" containerName="extract-content" Mar 10 22:07:21 crc kubenswrapper[4919]: I0310 22:07:21.216432 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="64f5e4b5-f486-4eaf-8d20-8c8cb719d3ed" containerName="extract-content" Mar 10 22:07:21 crc kubenswrapper[4919]: I0310 22:07:21.216524 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="64f5e4b5-f486-4eaf-8d20-8c8cb719d3ed" containerName="registry-server" Mar 10 22:07:21 crc kubenswrapper[4919]: I0310 22:07:21.216540 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5fad1c3-3133-4fca-8614-ce814b312e72" containerName="extract" Mar 10 22:07:21 crc kubenswrapper[4919]: I0310 22:07:21.216947 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-vj2jp" Mar 10 22:07:21 crc kubenswrapper[4919]: I0310 22:07:21.218883 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"openshift-service-ca.crt" Mar 10 22:07:21 crc kubenswrapper[4919]: I0310 22:07:21.219182 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"kube-root-ca.crt" Mar 10 22:07:21 crc kubenswrapper[4919]: I0310 22:07:21.219382 4919 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager-operator"/"cert-manager-operator-controller-manager-dockercfg-brjgn" Mar 10 22:07:21 crc kubenswrapper[4919]: I0310 22:07:21.231424 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-vj2jp"] Mar 10 22:07:21 crc kubenswrapper[4919]: I0310 22:07:21.303266 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7vhkp\" (UniqueName: \"kubernetes.io/projected/96a62942-206a-4c00-abc6-1b1af187852b-kube-api-access-7vhkp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-vj2jp\" (UID: \"96a62942-206a-4c00-abc6-1b1af187852b\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-vj2jp" Mar 10 22:07:21 crc kubenswrapper[4919]: I0310 22:07:21.303350 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/96a62942-206a-4c00-abc6-1b1af187852b-tmp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-vj2jp\" (UID: \"96a62942-206a-4c00-abc6-1b1af187852b\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-vj2jp" Mar 10 22:07:21 crc kubenswrapper[4919]: I0310 22:07:21.404444 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-7vhkp\" (UniqueName: \"kubernetes.io/projected/96a62942-206a-4c00-abc6-1b1af187852b-kube-api-access-7vhkp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-vj2jp\" (UID: \"96a62942-206a-4c00-abc6-1b1af187852b\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-vj2jp" Mar 10 22:07:21 crc kubenswrapper[4919]: I0310 22:07:21.404529 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/96a62942-206a-4c00-abc6-1b1af187852b-tmp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-vj2jp\" (UID: \"96a62942-206a-4c00-abc6-1b1af187852b\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-vj2jp" Mar 10 22:07:21 crc kubenswrapper[4919]: I0310 22:07:21.405241 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/96a62942-206a-4c00-abc6-1b1af187852b-tmp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-vj2jp\" (UID: \"96a62942-206a-4c00-abc6-1b1af187852b\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-vj2jp" Mar 10 22:07:21 crc kubenswrapper[4919]: I0310 22:07:21.427136 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7vhkp\" (UniqueName: \"kubernetes.io/projected/96a62942-206a-4c00-abc6-1b1af187852b-kube-api-access-7vhkp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-vj2jp\" (UID: \"96a62942-206a-4c00-abc6-1b1af187852b\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-vj2jp" Mar 10 22:07:21 crc kubenswrapper[4919]: I0310 22:07:21.532647 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-vj2jp" Mar 10 22:07:22 crc kubenswrapper[4919]: I0310 22:07:22.009023 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-vj2jp"] Mar 10 22:07:22 crc kubenswrapper[4919]: W0310 22:07:22.016317 4919 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod96a62942_206a_4c00_abc6_1b1af187852b.slice/crio-7bc65e490f880f485fa256871737fe63fde6dc06a61abf1834633b220bb5d922 WatchSource:0}: Error finding container 7bc65e490f880f485fa256871737fe63fde6dc06a61abf1834633b220bb5d922: Status 404 returned error can't find the container with id 7bc65e490f880f485fa256871737fe63fde6dc06a61abf1834633b220bb5d922 Mar 10 22:07:22 crc kubenswrapper[4919]: I0310 22:07:22.633357 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-vj2jp" event={"ID":"96a62942-206a-4c00-abc6-1b1af187852b","Type":"ContainerStarted","Data":"7bc65e490f880f485fa256871737fe63fde6dc06a61abf1834633b220bb5d922"} Mar 10 22:07:24 crc kubenswrapper[4919]: I0310 22:07:24.639924 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-ph9tr"] Mar 10 22:07:24 crc kubenswrapper[4919]: I0310 22:07:24.642085 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ph9tr" Mar 10 22:07:24 crc kubenswrapper[4919]: I0310 22:07:24.660501 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ph9tr"] Mar 10 22:07:24 crc kubenswrapper[4919]: I0310 22:07:24.751611 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sdpzx\" (UniqueName: \"kubernetes.io/projected/bf21050c-82f0-4b84-871c-955d2e8071b2-kube-api-access-sdpzx\") pod \"community-operators-ph9tr\" (UID: \"bf21050c-82f0-4b84-871c-955d2e8071b2\") " pod="openshift-marketplace/community-operators-ph9tr" Mar 10 22:07:24 crc kubenswrapper[4919]: I0310 22:07:24.751655 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf21050c-82f0-4b84-871c-955d2e8071b2-utilities\") pod \"community-operators-ph9tr\" (UID: \"bf21050c-82f0-4b84-871c-955d2e8071b2\") " pod="openshift-marketplace/community-operators-ph9tr" Mar 10 22:07:24 crc kubenswrapper[4919]: I0310 22:07:24.751706 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf21050c-82f0-4b84-871c-955d2e8071b2-catalog-content\") pod \"community-operators-ph9tr\" (UID: \"bf21050c-82f0-4b84-871c-955d2e8071b2\") " pod="openshift-marketplace/community-operators-ph9tr" Mar 10 22:07:24 crc kubenswrapper[4919]: I0310 22:07:24.852825 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sdpzx\" (UniqueName: \"kubernetes.io/projected/bf21050c-82f0-4b84-871c-955d2e8071b2-kube-api-access-sdpzx\") pod \"community-operators-ph9tr\" (UID: \"bf21050c-82f0-4b84-871c-955d2e8071b2\") " pod="openshift-marketplace/community-operators-ph9tr" Mar 10 22:07:24 crc kubenswrapper[4919]: I0310 22:07:24.852862 4919 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf21050c-82f0-4b84-871c-955d2e8071b2-utilities\") pod \"community-operators-ph9tr\" (UID: \"bf21050c-82f0-4b84-871c-955d2e8071b2\") " pod="openshift-marketplace/community-operators-ph9tr" Mar 10 22:07:24 crc kubenswrapper[4919]: I0310 22:07:24.852900 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf21050c-82f0-4b84-871c-955d2e8071b2-catalog-content\") pod \"community-operators-ph9tr\" (UID: \"bf21050c-82f0-4b84-871c-955d2e8071b2\") " pod="openshift-marketplace/community-operators-ph9tr" Mar 10 22:07:24 crc kubenswrapper[4919]: I0310 22:07:24.853298 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf21050c-82f0-4b84-871c-955d2e8071b2-catalog-content\") pod \"community-operators-ph9tr\" (UID: \"bf21050c-82f0-4b84-871c-955d2e8071b2\") " pod="openshift-marketplace/community-operators-ph9tr" Mar 10 22:07:24 crc kubenswrapper[4919]: I0310 22:07:24.853774 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf21050c-82f0-4b84-871c-955d2e8071b2-utilities\") pod \"community-operators-ph9tr\" (UID: \"bf21050c-82f0-4b84-871c-955d2e8071b2\") " pod="openshift-marketplace/community-operators-ph9tr" Mar 10 22:07:24 crc kubenswrapper[4919]: I0310 22:07:24.875070 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sdpzx\" (UniqueName: \"kubernetes.io/projected/bf21050c-82f0-4b84-871c-955d2e8071b2-kube-api-access-sdpzx\") pod \"community-operators-ph9tr\" (UID: \"bf21050c-82f0-4b84-871c-955d2e8071b2\") " pod="openshift-marketplace/community-operators-ph9tr" Mar 10 22:07:24 crc kubenswrapper[4919]: I0310 22:07:24.963645 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ph9tr" Mar 10 22:07:26 crc kubenswrapper[4919]: I0310 22:07:26.193171 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ph9tr"] Mar 10 22:07:26 crc kubenswrapper[4919]: W0310 22:07:26.202690 4919 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbf21050c_82f0_4b84_871c_955d2e8071b2.slice/crio-8e1269082e33207019ffd49f9ec5a5239733c4bc27dbbfed3589019c67bd5468 WatchSource:0}: Error finding container 8e1269082e33207019ffd49f9ec5a5239733c4bc27dbbfed3589019c67bd5468: Status 404 returned error can't find the container with id 8e1269082e33207019ffd49f9ec5a5239733c4bc27dbbfed3589019c67bd5468 Mar 10 22:07:26 crc kubenswrapper[4919]: I0310 22:07:26.658020 4919 generic.go:334] "Generic (PLEG): container finished" podID="bf21050c-82f0-4b84-871c-955d2e8071b2" containerID="98e25df64499e012ffef377447dba0a42e693df6969e555f37f6f048ba633ac5" exitCode=0 Mar 10 22:07:26 crc kubenswrapper[4919]: I0310 22:07:26.658124 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ph9tr" event={"ID":"bf21050c-82f0-4b84-871c-955d2e8071b2","Type":"ContainerDied","Data":"98e25df64499e012ffef377447dba0a42e693df6969e555f37f6f048ba633ac5"} Mar 10 22:07:26 crc kubenswrapper[4919]: I0310 22:07:26.658269 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ph9tr" event={"ID":"bf21050c-82f0-4b84-871c-955d2e8071b2","Type":"ContainerStarted","Data":"8e1269082e33207019ffd49f9ec5a5239733c4bc27dbbfed3589019c67bd5468"} Mar 10 22:07:26 crc kubenswrapper[4919]: I0310 22:07:26.659857 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-vj2jp" 
event={"ID":"96a62942-206a-4c00-abc6-1b1af187852b","Type":"ContainerStarted","Data":"d26970635d756c45d0595520d85bc3d652ed9ce82e84d3fc08549a631cdf2492"} Mar 10 22:07:26 crc kubenswrapper[4919]: I0310 22:07:26.710512 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-vj2jp" podStartSLOduration=1.6902314330000001 podStartE2EDuration="5.710493497s" podCreationTimestamp="2026-03-10 22:07:21 +0000 UTC" firstStartedPulling="2026-03-10 22:07:22.019518465 +0000 UTC m=+1029.261399063" lastFinishedPulling="2026-03-10 22:07:26.039780519 +0000 UTC m=+1033.281661127" observedRunningTime="2026-03-10 22:07:26.70839919 +0000 UTC m=+1033.950279798" watchObservedRunningTime="2026-03-10 22:07:26.710493497 +0000 UTC m=+1033.952374105" Mar 10 22:07:28 crc kubenswrapper[4919]: I0310 22:07:28.687731 4919 generic.go:334] "Generic (PLEG): container finished" podID="bf21050c-82f0-4b84-871c-955d2e8071b2" containerID="d2b70d420dbb6d3fe6aa51388882da949936f799c72ab998faa13c0dc54308e6" exitCode=0 Mar 10 22:07:28 crc kubenswrapper[4919]: I0310 22:07:28.687785 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ph9tr" event={"ID":"bf21050c-82f0-4b84-871c-955d2e8071b2","Type":"ContainerDied","Data":"d2b70d420dbb6d3fe6aa51388882da949936f799c72ab998faa13c0dc54308e6"} Mar 10 22:07:29 crc kubenswrapper[4919]: I0310 22:07:29.694917 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ph9tr" event={"ID":"bf21050c-82f0-4b84-871c-955d2e8071b2","Type":"ContainerStarted","Data":"7c8d3fed7d49012c05fc41dee833f86c49fe89eb328be8ec59b58a072592e620"} Mar 10 22:07:29 crc kubenswrapper[4919]: I0310 22:07:29.710466 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-ph9tr" podStartSLOduration=3.281997809 podStartE2EDuration="5.710452146s" 
podCreationTimestamp="2026-03-10 22:07:24 +0000 UTC" firstStartedPulling="2026-03-10 22:07:26.659488926 +0000 UTC m=+1033.901369534" lastFinishedPulling="2026-03-10 22:07:29.087943263 +0000 UTC m=+1036.329823871" observedRunningTime="2026-03-10 22:07:29.708658868 +0000 UTC m=+1036.950539476" watchObservedRunningTime="2026-03-10 22:07:29.710452146 +0000 UTC m=+1036.952332754" Mar 10 22:07:32 crc kubenswrapper[4919]: I0310 22:07:32.093310 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-c6lpm"] Mar 10 22:07:32 crc kubenswrapper[4919]: I0310 22:07:32.094152 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-6888856db4-c6lpm" Mar 10 22:07:32 crc kubenswrapper[4919]: I0310 22:07:32.096566 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Mar 10 22:07:32 crc kubenswrapper[4919]: I0310 22:07:32.096906 4919 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-d25pm" Mar 10 22:07:32 crc kubenswrapper[4919]: I0310 22:07:32.097101 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Mar 10 22:07:32 crc kubenswrapper[4919]: I0310 22:07:32.108972 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-c6lpm"] Mar 10 22:07:32 crc kubenswrapper[4919]: I0310 22:07:32.249845 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-svlfd"] Mar 10 22:07:32 crc kubenswrapper[4919]: I0310 22:07:32.250709 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-5545bd876-svlfd" Mar 10 22:07:32 crc kubenswrapper[4919]: I0310 22:07:32.253787 4919 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-vh2ct" Mar 10 22:07:32 crc kubenswrapper[4919]: I0310 22:07:32.255329 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ck8s7\" (UniqueName: \"kubernetes.io/projected/11c3a36f-b65a-420e-aeaa-c1d372444660-kube-api-access-ck8s7\") pod \"cert-manager-webhook-6888856db4-c6lpm\" (UID: \"11c3a36f-b65a-420e-aeaa-c1d372444660\") " pod="cert-manager/cert-manager-webhook-6888856db4-c6lpm" Mar 10 22:07:32 crc kubenswrapper[4919]: I0310 22:07:32.255474 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/11c3a36f-b65a-420e-aeaa-c1d372444660-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-c6lpm\" (UID: \"11c3a36f-b65a-420e-aeaa-c1d372444660\") " pod="cert-manager/cert-manager-webhook-6888856db4-c6lpm" Mar 10 22:07:32 crc kubenswrapper[4919]: I0310 22:07:32.262588 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-svlfd"] Mar 10 22:07:32 crc kubenswrapper[4919]: I0310 22:07:32.357041 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/515ce4db-5f17-4b18-894e-f93e7f82459c-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-svlfd\" (UID: \"515ce4db-5f17-4b18-894e-f93e7f82459c\") " pod="cert-manager/cert-manager-cainjector-5545bd876-svlfd" Mar 10 22:07:32 crc kubenswrapper[4919]: I0310 22:07:32.357083 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bm7vm\" (UniqueName: 
\"kubernetes.io/projected/515ce4db-5f17-4b18-894e-f93e7f82459c-kube-api-access-bm7vm\") pod \"cert-manager-cainjector-5545bd876-svlfd\" (UID: \"515ce4db-5f17-4b18-894e-f93e7f82459c\") " pod="cert-manager/cert-manager-cainjector-5545bd876-svlfd" Mar 10 22:07:32 crc kubenswrapper[4919]: I0310 22:07:32.357116 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ck8s7\" (UniqueName: \"kubernetes.io/projected/11c3a36f-b65a-420e-aeaa-c1d372444660-kube-api-access-ck8s7\") pod \"cert-manager-webhook-6888856db4-c6lpm\" (UID: \"11c3a36f-b65a-420e-aeaa-c1d372444660\") " pod="cert-manager/cert-manager-webhook-6888856db4-c6lpm" Mar 10 22:07:32 crc kubenswrapper[4919]: I0310 22:07:32.357150 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/11c3a36f-b65a-420e-aeaa-c1d372444660-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-c6lpm\" (UID: \"11c3a36f-b65a-420e-aeaa-c1d372444660\") " pod="cert-manager/cert-manager-webhook-6888856db4-c6lpm" Mar 10 22:07:32 crc kubenswrapper[4919]: I0310 22:07:32.376240 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/11c3a36f-b65a-420e-aeaa-c1d372444660-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-c6lpm\" (UID: \"11c3a36f-b65a-420e-aeaa-c1d372444660\") " pod="cert-manager/cert-manager-webhook-6888856db4-c6lpm" Mar 10 22:07:32 crc kubenswrapper[4919]: I0310 22:07:32.381814 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ck8s7\" (UniqueName: \"kubernetes.io/projected/11c3a36f-b65a-420e-aeaa-c1d372444660-kube-api-access-ck8s7\") pod \"cert-manager-webhook-6888856db4-c6lpm\" (UID: \"11c3a36f-b65a-420e-aeaa-c1d372444660\") " pod="cert-manager/cert-manager-webhook-6888856db4-c6lpm" Mar 10 22:07:32 crc kubenswrapper[4919]: I0310 22:07:32.411142 4919 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-6888856db4-c6lpm" Mar 10 22:07:32 crc kubenswrapper[4919]: I0310 22:07:32.458530 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/515ce4db-5f17-4b18-894e-f93e7f82459c-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-svlfd\" (UID: \"515ce4db-5f17-4b18-894e-f93e7f82459c\") " pod="cert-manager/cert-manager-cainjector-5545bd876-svlfd" Mar 10 22:07:32 crc kubenswrapper[4919]: I0310 22:07:32.458892 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bm7vm\" (UniqueName: \"kubernetes.io/projected/515ce4db-5f17-4b18-894e-f93e7f82459c-kube-api-access-bm7vm\") pod \"cert-manager-cainjector-5545bd876-svlfd\" (UID: \"515ce4db-5f17-4b18-894e-f93e7f82459c\") " pod="cert-manager/cert-manager-cainjector-5545bd876-svlfd" Mar 10 22:07:32 crc kubenswrapper[4919]: I0310 22:07:32.489714 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/515ce4db-5f17-4b18-894e-f93e7f82459c-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-svlfd\" (UID: \"515ce4db-5f17-4b18-894e-f93e7f82459c\") " pod="cert-manager/cert-manager-cainjector-5545bd876-svlfd" Mar 10 22:07:32 crc kubenswrapper[4919]: I0310 22:07:32.490508 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bm7vm\" (UniqueName: \"kubernetes.io/projected/515ce4db-5f17-4b18-894e-f93e7f82459c-kube-api-access-bm7vm\") pod \"cert-manager-cainjector-5545bd876-svlfd\" (UID: \"515ce4db-5f17-4b18-894e-f93e7f82459c\") " pod="cert-manager/cert-manager-cainjector-5545bd876-svlfd" Mar 10 22:07:32 crc kubenswrapper[4919]: I0310 22:07:32.565714 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-5545bd876-svlfd" Mar 10 22:07:32 crc kubenswrapper[4919]: I0310 22:07:32.651680 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-c6lpm"] Mar 10 22:07:32 crc kubenswrapper[4919]: I0310 22:07:32.713893 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-6888856db4-c6lpm" event={"ID":"11c3a36f-b65a-420e-aeaa-c1d372444660","Type":"ContainerStarted","Data":"af7fec62ece7813a43efc082c8fa4c137bf2835670d2ebc3791468f669efbed9"} Mar 10 22:07:32 crc kubenswrapper[4919]: I0310 22:07:32.978408 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-svlfd"] Mar 10 22:07:32 crc kubenswrapper[4919]: W0310 22:07:32.981265 4919 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod515ce4db_5f17_4b18_894e_f93e7f82459c.slice/crio-a4d7ade48dbde6ac7c9318f8af718f3db8baac3486a4319ba9bed17a4332707e WatchSource:0}: Error finding container a4d7ade48dbde6ac7c9318f8af718f3db8baac3486a4319ba9bed17a4332707e: Status 404 returned error can't find the container with id a4d7ade48dbde6ac7c9318f8af718f3db8baac3486a4319ba9bed17a4332707e Mar 10 22:07:33 crc kubenswrapper[4919]: I0310 22:07:33.720563 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-5545bd876-svlfd" event={"ID":"515ce4db-5f17-4b18-894e-f93e7f82459c","Type":"ContainerStarted","Data":"a4d7ade48dbde6ac7c9318f8af718f3db8baac3486a4319ba9bed17a4332707e"} Mar 10 22:07:34 crc kubenswrapper[4919]: I0310 22:07:34.964179 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-ph9tr" Mar 10 22:07:34 crc kubenswrapper[4919]: I0310 22:07:34.964235 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/community-operators-ph9tr" Mar 10 22:07:35 crc kubenswrapper[4919]: I0310 22:07:35.005593 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-ph9tr" Mar 10 22:07:35 crc kubenswrapper[4919]: I0310 22:07:35.777908 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-ph9tr" Mar 10 22:07:37 crc kubenswrapper[4919]: I0310 22:07:37.230923 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ph9tr"] Mar 10 22:07:37 crc kubenswrapper[4919]: I0310 22:07:37.744435 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-ph9tr" podUID="bf21050c-82f0-4b84-871c-955d2e8071b2" containerName="registry-server" containerID="cri-o://7c8d3fed7d49012c05fc41dee833f86c49fe89eb328be8ec59b58a072592e620" gracePeriod=2 Mar 10 22:07:38 crc kubenswrapper[4919]: I0310 22:07:38.111598 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ph9tr" Mar 10 22:07:38 crc kubenswrapper[4919]: I0310 22:07:38.251932 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf21050c-82f0-4b84-871c-955d2e8071b2-utilities\") pod \"bf21050c-82f0-4b84-871c-955d2e8071b2\" (UID: \"bf21050c-82f0-4b84-871c-955d2e8071b2\") " Mar 10 22:07:38 crc kubenswrapper[4919]: I0310 22:07:38.251986 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf21050c-82f0-4b84-871c-955d2e8071b2-catalog-content\") pod \"bf21050c-82f0-4b84-871c-955d2e8071b2\" (UID: \"bf21050c-82f0-4b84-871c-955d2e8071b2\") " Mar 10 22:07:38 crc kubenswrapper[4919]: I0310 22:07:38.252068 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sdpzx\" (UniqueName: \"kubernetes.io/projected/bf21050c-82f0-4b84-871c-955d2e8071b2-kube-api-access-sdpzx\") pod \"bf21050c-82f0-4b84-871c-955d2e8071b2\" (UID: \"bf21050c-82f0-4b84-871c-955d2e8071b2\") " Mar 10 22:07:38 crc kubenswrapper[4919]: I0310 22:07:38.252798 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf21050c-82f0-4b84-871c-955d2e8071b2-utilities" (OuterVolumeSpecName: "utilities") pod "bf21050c-82f0-4b84-871c-955d2e8071b2" (UID: "bf21050c-82f0-4b84-871c-955d2e8071b2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 22:07:38 crc kubenswrapper[4919]: I0310 22:07:38.257904 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf21050c-82f0-4b84-871c-955d2e8071b2-kube-api-access-sdpzx" (OuterVolumeSpecName: "kube-api-access-sdpzx") pod "bf21050c-82f0-4b84-871c-955d2e8071b2" (UID: "bf21050c-82f0-4b84-871c-955d2e8071b2"). InnerVolumeSpecName "kube-api-access-sdpzx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 22:07:38 crc kubenswrapper[4919]: I0310 22:07:38.307853 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf21050c-82f0-4b84-871c-955d2e8071b2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bf21050c-82f0-4b84-871c-955d2e8071b2" (UID: "bf21050c-82f0-4b84-871c-955d2e8071b2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 22:07:38 crc kubenswrapper[4919]: I0310 22:07:38.353041 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sdpzx\" (UniqueName: \"kubernetes.io/projected/bf21050c-82f0-4b84-871c-955d2e8071b2-kube-api-access-sdpzx\") on node \"crc\" DevicePath \"\"" Mar 10 22:07:38 crc kubenswrapper[4919]: I0310 22:07:38.353072 4919 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf21050c-82f0-4b84-871c-955d2e8071b2-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 22:07:38 crc kubenswrapper[4919]: I0310 22:07:38.353083 4919 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf21050c-82f0-4b84-871c-955d2e8071b2-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 22:07:38 crc kubenswrapper[4919]: I0310 22:07:38.382442 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-545d4d4674-tvl6b"] Mar 10 22:07:38 crc kubenswrapper[4919]: E0310 22:07:38.382675 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf21050c-82f0-4b84-871c-955d2e8071b2" containerName="extract-utilities" Mar 10 22:07:38 crc kubenswrapper[4919]: I0310 22:07:38.382688 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf21050c-82f0-4b84-871c-955d2e8071b2" containerName="extract-utilities" Mar 10 22:07:38 crc kubenswrapper[4919]: E0310 22:07:38.382697 4919 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="bf21050c-82f0-4b84-871c-955d2e8071b2" containerName="registry-server" Mar 10 22:07:38 crc kubenswrapper[4919]: I0310 22:07:38.382703 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf21050c-82f0-4b84-871c-955d2e8071b2" containerName="registry-server" Mar 10 22:07:38 crc kubenswrapper[4919]: E0310 22:07:38.382727 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf21050c-82f0-4b84-871c-955d2e8071b2" containerName="extract-content" Mar 10 22:07:38 crc kubenswrapper[4919]: I0310 22:07:38.382734 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf21050c-82f0-4b84-871c-955d2e8071b2" containerName="extract-content" Mar 10 22:07:38 crc kubenswrapper[4919]: I0310 22:07:38.382830 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf21050c-82f0-4b84-871c-955d2e8071b2" containerName="registry-server" Mar 10 22:07:38 crc kubenswrapper[4919]: I0310 22:07:38.383183 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-545d4d4674-tvl6b" Mar 10 22:07:38 crc kubenswrapper[4919]: I0310 22:07:38.385222 4919 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-95clj" Mar 10 22:07:38 crc kubenswrapper[4919]: I0310 22:07:38.395343 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-545d4d4674-tvl6b"] Mar 10 22:07:38 crc kubenswrapper[4919]: I0310 22:07:38.555864 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/febfa755-4560-47ec-9358-7a73e1336fb9-bound-sa-token\") pod \"cert-manager-545d4d4674-tvl6b\" (UID: \"febfa755-4560-47ec-9358-7a73e1336fb9\") " pod="cert-manager/cert-manager-545d4d4674-tvl6b" Mar 10 22:07:38 crc kubenswrapper[4919]: I0310 22:07:38.555964 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-5dhz9\" (UniqueName: \"kubernetes.io/projected/febfa755-4560-47ec-9358-7a73e1336fb9-kube-api-access-5dhz9\") pod \"cert-manager-545d4d4674-tvl6b\" (UID: \"febfa755-4560-47ec-9358-7a73e1336fb9\") " pod="cert-manager/cert-manager-545d4d4674-tvl6b" Mar 10 22:07:38 crc kubenswrapper[4919]: I0310 22:07:38.657118 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5dhz9\" (UniqueName: \"kubernetes.io/projected/febfa755-4560-47ec-9358-7a73e1336fb9-kube-api-access-5dhz9\") pod \"cert-manager-545d4d4674-tvl6b\" (UID: \"febfa755-4560-47ec-9358-7a73e1336fb9\") " pod="cert-manager/cert-manager-545d4d4674-tvl6b" Mar 10 22:07:38 crc kubenswrapper[4919]: I0310 22:07:38.657515 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/febfa755-4560-47ec-9358-7a73e1336fb9-bound-sa-token\") pod \"cert-manager-545d4d4674-tvl6b\" (UID: \"febfa755-4560-47ec-9358-7a73e1336fb9\") " pod="cert-manager/cert-manager-545d4d4674-tvl6b" Mar 10 22:07:38 crc kubenswrapper[4919]: I0310 22:07:38.679636 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/febfa755-4560-47ec-9358-7a73e1336fb9-bound-sa-token\") pod \"cert-manager-545d4d4674-tvl6b\" (UID: \"febfa755-4560-47ec-9358-7a73e1336fb9\") " pod="cert-manager/cert-manager-545d4d4674-tvl6b" Mar 10 22:07:38 crc kubenswrapper[4919]: I0310 22:07:38.679976 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5dhz9\" (UniqueName: \"kubernetes.io/projected/febfa755-4560-47ec-9358-7a73e1336fb9-kube-api-access-5dhz9\") pod \"cert-manager-545d4d4674-tvl6b\" (UID: \"febfa755-4560-47ec-9358-7a73e1336fb9\") " pod="cert-manager/cert-manager-545d4d4674-tvl6b" Mar 10 22:07:38 crc kubenswrapper[4919]: I0310 22:07:38.713399 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-545d4d4674-tvl6b" Mar 10 22:07:38 crc kubenswrapper[4919]: I0310 22:07:38.751974 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-6888856db4-c6lpm" event={"ID":"11c3a36f-b65a-420e-aeaa-c1d372444660","Type":"ContainerStarted","Data":"0d66c793c184f5319f87c729ed8ad55469b6a093954b2df6cefb1f7c154ccaca"} Mar 10 22:07:38 crc kubenswrapper[4919]: I0310 22:07:38.752690 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-6888856db4-c6lpm" Mar 10 22:07:38 crc kubenswrapper[4919]: I0310 22:07:38.755731 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-5545bd876-svlfd" event={"ID":"515ce4db-5f17-4b18-894e-f93e7f82459c","Type":"ContainerStarted","Data":"ccc0de0f1f11890f8a2218ab69388ebca582f385d69b1620a761ff966058ff1f"} Mar 10 22:07:38 crc kubenswrapper[4919]: I0310 22:07:38.758507 4919 generic.go:334] "Generic (PLEG): container finished" podID="bf21050c-82f0-4b84-871c-955d2e8071b2" containerID="7c8d3fed7d49012c05fc41dee833f86c49fe89eb328be8ec59b58a072592e620" exitCode=0 Mar 10 22:07:38 crc kubenswrapper[4919]: I0310 22:07:38.758617 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ph9tr" Mar 10 22:07:38 crc kubenswrapper[4919]: I0310 22:07:38.758568 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ph9tr" event={"ID":"bf21050c-82f0-4b84-871c-955d2e8071b2","Type":"ContainerDied","Data":"7c8d3fed7d49012c05fc41dee833f86c49fe89eb328be8ec59b58a072592e620"} Mar 10 22:07:38 crc kubenswrapper[4919]: I0310 22:07:38.758956 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ph9tr" event={"ID":"bf21050c-82f0-4b84-871c-955d2e8071b2","Type":"ContainerDied","Data":"8e1269082e33207019ffd49f9ec5a5239733c4bc27dbbfed3589019c67bd5468"} Mar 10 22:07:38 crc kubenswrapper[4919]: I0310 22:07:38.759030 4919 scope.go:117] "RemoveContainer" containerID="7c8d3fed7d49012c05fc41dee833f86c49fe89eb328be8ec59b58a072592e620" Mar 10 22:07:38 crc kubenswrapper[4919]: I0310 22:07:38.767944 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-6888856db4-c6lpm" podStartSLOduration=1.673044194 podStartE2EDuration="6.767930443s" podCreationTimestamp="2026-03-10 22:07:32 +0000 UTC" firstStartedPulling="2026-03-10 22:07:32.664407216 +0000 UTC m=+1039.906287824" lastFinishedPulling="2026-03-10 22:07:37.759293465 +0000 UTC m=+1045.001174073" observedRunningTime="2026-03-10 22:07:38.766697019 +0000 UTC m=+1046.008577627" watchObservedRunningTime="2026-03-10 22:07:38.767930443 +0000 UTC m=+1046.009811051" Mar 10 22:07:38 crc kubenswrapper[4919]: I0310 22:07:38.786040 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-5545bd876-svlfd" podStartSLOduration=1.996062975 podStartE2EDuration="6.786026563s" podCreationTimestamp="2026-03-10 22:07:32 +0000 UTC" firstStartedPulling="2026-03-10 22:07:32.984172949 +0000 UTC m=+1040.226053577" lastFinishedPulling="2026-03-10 22:07:37.774136557 +0000 UTC 
m=+1045.016017165" observedRunningTime="2026-03-10 22:07:38.784479101 +0000 UTC m=+1046.026359709" watchObservedRunningTime="2026-03-10 22:07:38.786026563 +0000 UTC m=+1046.027907161" Mar 10 22:07:38 crc kubenswrapper[4919]: I0310 22:07:38.787334 4919 scope.go:117] "RemoveContainer" containerID="d2b70d420dbb6d3fe6aa51388882da949936f799c72ab998faa13c0dc54308e6" Mar 10 22:07:38 crc kubenswrapper[4919]: I0310 22:07:38.804104 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ph9tr"] Mar 10 22:07:38 crc kubenswrapper[4919]: I0310 22:07:38.811134 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-ph9tr"] Mar 10 22:07:38 crc kubenswrapper[4919]: I0310 22:07:38.861040 4919 scope.go:117] "RemoveContainer" containerID="98e25df64499e012ffef377447dba0a42e693df6969e555f37f6f048ba633ac5" Mar 10 22:07:38 crc kubenswrapper[4919]: I0310 22:07:38.908188 4919 scope.go:117] "RemoveContainer" containerID="7c8d3fed7d49012c05fc41dee833f86c49fe89eb328be8ec59b58a072592e620" Mar 10 22:07:38 crc kubenswrapper[4919]: E0310 22:07:38.908737 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7c8d3fed7d49012c05fc41dee833f86c49fe89eb328be8ec59b58a072592e620\": container with ID starting with 7c8d3fed7d49012c05fc41dee833f86c49fe89eb328be8ec59b58a072592e620 not found: ID does not exist" containerID="7c8d3fed7d49012c05fc41dee833f86c49fe89eb328be8ec59b58a072592e620" Mar 10 22:07:38 crc kubenswrapper[4919]: I0310 22:07:38.908766 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c8d3fed7d49012c05fc41dee833f86c49fe89eb328be8ec59b58a072592e620"} err="failed to get container status \"7c8d3fed7d49012c05fc41dee833f86c49fe89eb328be8ec59b58a072592e620\": rpc error: code = NotFound desc = could not find container \"7c8d3fed7d49012c05fc41dee833f86c49fe89eb328be8ec59b58a072592e620\": 
container with ID starting with 7c8d3fed7d49012c05fc41dee833f86c49fe89eb328be8ec59b58a072592e620 not found: ID does not exist" Mar 10 22:07:38 crc kubenswrapper[4919]: I0310 22:07:38.908784 4919 scope.go:117] "RemoveContainer" containerID="d2b70d420dbb6d3fe6aa51388882da949936f799c72ab998faa13c0dc54308e6" Mar 10 22:07:38 crc kubenswrapper[4919]: E0310 22:07:38.908989 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d2b70d420dbb6d3fe6aa51388882da949936f799c72ab998faa13c0dc54308e6\": container with ID starting with d2b70d420dbb6d3fe6aa51388882da949936f799c72ab998faa13c0dc54308e6 not found: ID does not exist" containerID="d2b70d420dbb6d3fe6aa51388882da949936f799c72ab998faa13c0dc54308e6" Mar 10 22:07:38 crc kubenswrapper[4919]: I0310 22:07:38.909010 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2b70d420dbb6d3fe6aa51388882da949936f799c72ab998faa13c0dc54308e6"} err="failed to get container status \"d2b70d420dbb6d3fe6aa51388882da949936f799c72ab998faa13c0dc54308e6\": rpc error: code = NotFound desc = could not find container \"d2b70d420dbb6d3fe6aa51388882da949936f799c72ab998faa13c0dc54308e6\": container with ID starting with d2b70d420dbb6d3fe6aa51388882da949936f799c72ab998faa13c0dc54308e6 not found: ID does not exist" Mar 10 22:07:38 crc kubenswrapper[4919]: I0310 22:07:38.909021 4919 scope.go:117] "RemoveContainer" containerID="98e25df64499e012ffef377447dba0a42e693df6969e555f37f6f048ba633ac5" Mar 10 22:07:38 crc kubenswrapper[4919]: E0310 22:07:38.909309 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"98e25df64499e012ffef377447dba0a42e693df6969e555f37f6f048ba633ac5\": container with ID starting with 98e25df64499e012ffef377447dba0a42e693df6969e555f37f6f048ba633ac5 not found: ID does not exist" 
containerID="98e25df64499e012ffef377447dba0a42e693df6969e555f37f6f048ba633ac5" Mar 10 22:07:38 crc kubenswrapper[4919]: I0310 22:07:38.909328 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"98e25df64499e012ffef377447dba0a42e693df6969e555f37f6f048ba633ac5"} err="failed to get container status \"98e25df64499e012ffef377447dba0a42e693df6969e555f37f6f048ba633ac5\": rpc error: code = NotFound desc = could not find container \"98e25df64499e012ffef377447dba0a42e693df6969e555f37f6f048ba633ac5\": container with ID starting with 98e25df64499e012ffef377447dba0a42e693df6969e555f37f6f048ba633ac5 not found: ID does not exist" Mar 10 22:07:39 crc kubenswrapper[4919]: I0310 22:07:39.185325 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-545d4d4674-tvl6b"] Mar 10 22:07:39 crc kubenswrapper[4919]: W0310 22:07:39.199154 4919 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfebfa755_4560_47ec_9358_7a73e1336fb9.slice/crio-4f08493c5623fa9eb73dc11db5ef1d9cce96037d27ef84096c9e0dc77ecbfbef WatchSource:0}: Error finding container 4f08493c5623fa9eb73dc11db5ef1d9cce96037d27ef84096c9e0dc77ecbfbef: Status 404 returned error can't find the container with id 4f08493c5623fa9eb73dc11db5ef1d9cce96037d27ef84096c9e0dc77ecbfbef Mar 10 22:07:39 crc kubenswrapper[4919]: I0310 22:07:39.490042 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf21050c-82f0-4b84-871c-955d2e8071b2" path="/var/lib/kubelet/pods/bf21050c-82f0-4b84-871c-955d2e8071b2/volumes" Mar 10 22:07:39 crc kubenswrapper[4919]: I0310 22:07:39.767531 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-545d4d4674-tvl6b" event={"ID":"febfa755-4560-47ec-9358-7a73e1336fb9","Type":"ContainerStarted","Data":"ae33493c95eed98ea45550c5cd2912e5085d650ca40ab3921e237846b9a1f42c"} Mar 10 22:07:39 crc kubenswrapper[4919]: I0310 
22:07:39.768000 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-545d4d4674-tvl6b" event={"ID":"febfa755-4560-47ec-9358-7a73e1336fb9","Type":"ContainerStarted","Data":"4f08493c5623fa9eb73dc11db5ef1d9cce96037d27ef84096c9e0dc77ecbfbef"} Mar 10 22:07:39 crc kubenswrapper[4919]: I0310 22:07:39.786856 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-545d4d4674-tvl6b" podStartSLOduration=1.786825358 podStartE2EDuration="1.786825358s" podCreationTimestamp="2026-03-10 22:07:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 22:07:39.782544233 +0000 UTC m=+1047.024424841" watchObservedRunningTime="2026-03-10 22:07:39.786825358 +0000 UTC m=+1047.028705996" Mar 10 22:07:42 crc kubenswrapper[4919]: I0310 22:07:42.414961 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-6888856db4-c6lpm" Mar 10 22:07:45 crc kubenswrapper[4919]: I0310 22:07:45.836433 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-l2bnk"] Mar 10 22:07:45 crc kubenswrapper[4919]: I0310 22:07:45.837628 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-l2bnk" Mar 10 22:07:45 crc kubenswrapper[4919]: I0310 22:07:45.844181 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Mar 10 22:07:45 crc kubenswrapper[4919]: I0310 22:07:45.844292 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Mar 10 22:07:45 crc kubenswrapper[4919]: I0310 22:07:45.844605 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-nszvr" Mar 10 22:07:45 crc kubenswrapper[4919]: I0310 22:07:45.865814 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-l2bnk"] Mar 10 22:07:45 crc kubenswrapper[4919]: I0310 22:07:45.971110 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6nncw\" (UniqueName: \"kubernetes.io/projected/f0473584-1d6e-48ba-bf97-3df3b6426cfc-kube-api-access-6nncw\") pod \"openstack-operator-index-l2bnk\" (UID: \"f0473584-1d6e-48ba-bf97-3df3b6426cfc\") " pod="openstack-operators/openstack-operator-index-l2bnk" Mar 10 22:07:46 crc kubenswrapper[4919]: I0310 22:07:46.072013 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6nncw\" (UniqueName: \"kubernetes.io/projected/f0473584-1d6e-48ba-bf97-3df3b6426cfc-kube-api-access-6nncw\") pod \"openstack-operator-index-l2bnk\" (UID: \"f0473584-1d6e-48ba-bf97-3df3b6426cfc\") " pod="openstack-operators/openstack-operator-index-l2bnk" Mar 10 22:07:46 crc kubenswrapper[4919]: I0310 22:07:46.090161 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6nncw\" (UniqueName: \"kubernetes.io/projected/f0473584-1d6e-48ba-bf97-3df3b6426cfc-kube-api-access-6nncw\") pod \"openstack-operator-index-l2bnk\" (UID: 
\"f0473584-1d6e-48ba-bf97-3df3b6426cfc\") " pod="openstack-operators/openstack-operator-index-l2bnk" Mar 10 22:07:46 crc kubenswrapper[4919]: I0310 22:07:46.171717 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-l2bnk" Mar 10 22:07:46 crc kubenswrapper[4919]: I0310 22:07:46.547639 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-l2bnk"] Mar 10 22:07:46 crc kubenswrapper[4919]: I0310 22:07:46.827426 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-l2bnk" event={"ID":"f0473584-1d6e-48ba-bf97-3df3b6426cfc","Type":"ContainerStarted","Data":"475e2f8951088818475e5e2b160346e77f8c074f9d78c5a47606a902986f81c0"} Mar 10 22:07:48 crc kubenswrapper[4919]: I0310 22:07:48.845086 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-l2bnk" event={"ID":"f0473584-1d6e-48ba-bf97-3df3b6426cfc","Type":"ContainerStarted","Data":"a634a29f9100ae42374ab130953a60f60de61e3faf7ef2eafd80bb7cf39524dd"} Mar 10 22:07:49 crc kubenswrapper[4919]: I0310 22:07:49.213443 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-l2bnk" podStartSLOduration=2.41522143 podStartE2EDuration="4.21342714s" podCreationTimestamp="2026-03-10 22:07:45 +0000 UTC" firstStartedPulling="2026-03-10 22:07:46.558232411 +0000 UTC m=+1053.800113019" lastFinishedPulling="2026-03-10 22:07:48.356438111 +0000 UTC m=+1055.598318729" observedRunningTime="2026-03-10 22:07:48.869968855 +0000 UTC m=+1056.111849513" watchObservedRunningTime="2026-03-10 22:07:49.21342714 +0000 UTC m=+1056.455307738" Mar 10 22:07:49 crc kubenswrapper[4919]: I0310 22:07:49.213834 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-l2bnk"] Mar 10 22:07:49 crc kubenswrapper[4919]: I0310 
22:07:49.820084 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-nh4zc"] Mar 10 22:07:49 crc kubenswrapper[4919]: I0310 22:07:49.820786 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-nh4zc" Mar 10 22:07:49 crc kubenswrapper[4919]: I0310 22:07:49.836354 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-nh4zc"] Mar 10 22:07:50 crc kubenswrapper[4919]: I0310 22:07:50.018073 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zljj6\" (UniqueName: \"kubernetes.io/projected/e0cd634b-f079-4a58-9ac7-7c4f7e90756f-kube-api-access-zljj6\") pod \"openstack-operator-index-nh4zc\" (UID: \"e0cd634b-f079-4a58-9ac7-7c4f7e90756f\") " pod="openstack-operators/openstack-operator-index-nh4zc" Mar 10 22:07:50 crc kubenswrapper[4919]: I0310 22:07:50.120276 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zljj6\" (UniqueName: \"kubernetes.io/projected/e0cd634b-f079-4a58-9ac7-7c4f7e90756f-kube-api-access-zljj6\") pod \"openstack-operator-index-nh4zc\" (UID: \"e0cd634b-f079-4a58-9ac7-7c4f7e90756f\") " pod="openstack-operators/openstack-operator-index-nh4zc" Mar 10 22:07:50 crc kubenswrapper[4919]: I0310 22:07:50.153054 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zljj6\" (UniqueName: \"kubernetes.io/projected/e0cd634b-f079-4a58-9ac7-7c4f7e90756f-kube-api-access-zljj6\") pod \"openstack-operator-index-nh4zc\" (UID: \"e0cd634b-f079-4a58-9ac7-7c4f7e90756f\") " pod="openstack-operators/openstack-operator-index-nh4zc" Mar 10 22:07:50 crc kubenswrapper[4919]: I0310 22:07:50.155353 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-nh4zc" Mar 10 22:07:50 crc kubenswrapper[4919]: I0310 22:07:50.606866 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-nh4zc"] Mar 10 22:07:50 crc kubenswrapper[4919]: W0310 22:07:50.615320 4919 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode0cd634b_f079_4a58_9ac7_7c4f7e90756f.slice/crio-633cbf0abbb95857be6bcb2005db0a66a1a35fce4f87c1e3285872873fdb63ca WatchSource:0}: Error finding container 633cbf0abbb95857be6bcb2005db0a66a1a35fce4f87c1e3285872873fdb63ca: Status 404 returned error can't find the container with id 633cbf0abbb95857be6bcb2005db0a66a1a35fce4f87c1e3285872873fdb63ca Mar 10 22:07:50 crc kubenswrapper[4919]: I0310 22:07:50.862936 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-nh4zc" event={"ID":"e0cd634b-f079-4a58-9ac7-7c4f7e90756f","Type":"ContainerStarted","Data":"633cbf0abbb95857be6bcb2005db0a66a1a35fce4f87c1e3285872873fdb63ca"} Mar 10 22:07:50 crc kubenswrapper[4919]: I0310 22:07:50.863137 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-l2bnk" podUID="f0473584-1d6e-48ba-bf97-3df3b6426cfc" containerName="registry-server" containerID="cri-o://a634a29f9100ae42374ab130953a60f60de61e3faf7ef2eafd80bb7cf39524dd" gracePeriod=2 Mar 10 22:07:51 crc kubenswrapper[4919]: I0310 22:07:51.281850 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-l2bnk" Mar 10 22:07:51 crc kubenswrapper[4919]: I0310 22:07:51.463652 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6nncw\" (UniqueName: \"kubernetes.io/projected/f0473584-1d6e-48ba-bf97-3df3b6426cfc-kube-api-access-6nncw\") pod \"f0473584-1d6e-48ba-bf97-3df3b6426cfc\" (UID: \"f0473584-1d6e-48ba-bf97-3df3b6426cfc\") " Mar 10 22:07:51 crc kubenswrapper[4919]: I0310 22:07:51.468895 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0473584-1d6e-48ba-bf97-3df3b6426cfc-kube-api-access-6nncw" (OuterVolumeSpecName: "kube-api-access-6nncw") pod "f0473584-1d6e-48ba-bf97-3df3b6426cfc" (UID: "f0473584-1d6e-48ba-bf97-3df3b6426cfc"). InnerVolumeSpecName "kube-api-access-6nncw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 22:07:51 crc kubenswrapper[4919]: I0310 22:07:51.564802 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6nncw\" (UniqueName: \"kubernetes.io/projected/f0473584-1d6e-48ba-bf97-3df3b6426cfc-kube-api-access-6nncw\") on node \"crc\" DevicePath \"\"" Mar 10 22:07:51 crc kubenswrapper[4919]: I0310 22:07:51.870890 4919 generic.go:334] "Generic (PLEG): container finished" podID="f0473584-1d6e-48ba-bf97-3df3b6426cfc" containerID="a634a29f9100ae42374ab130953a60f60de61e3faf7ef2eafd80bb7cf39524dd" exitCode=0 Mar 10 22:07:51 crc kubenswrapper[4919]: I0310 22:07:51.870934 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-l2bnk" event={"ID":"f0473584-1d6e-48ba-bf97-3df3b6426cfc","Type":"ContainerDied","Data":"a634a29f9100ae42374ab130953a60f60de61e3faf7ef2eafd80bb7cf39524dd"} Mar 10 22:07:51 crc kubenswrapper[4919]: I0310 22:07:51.870964 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-l2bnk" 
event={"ID":"f0473584-1d6e-48ba-bf97-3df3b6426cfc","Type":"ContainerDied","Data":"475e2f8951088818475e5e2b160346e77f8c074f9d78c5a47606a902986f81c0"} Mar 10 22:07:51 crc kubenswrapper[4919]: I0310 22:07:51.870976 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-l2bnk" Mar 10 22:07:51 crc kubenswrapper[4919]: I0310 22:07:51.870985 4919 scope.go:117] "RemoveContainer" containerID="a634a29f9100ae42374ab130953a60f60de61e3faf7ef2eafd80bb7cf39524dd" Mar 10 22:07:51 crc kubenswrapper[4919]: I0310 22:07:51.892731 4919 scope.go:117] "RemoveContainer" containerID="a634a29f9100ae42374ab130953a60f60de61e3faf7ef2eafd80bb7cf39524dd" Mar 10 22:07:51 crc kubenswrapper[4919]: E0310 22:07:51.893257 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a634a29f9100ae42374ab130953a60f60de61e3faf7ef2eafd80bb7cf39524dd\": container with ID starting with a634a29f9100ae42374ab130953a60f60de61e3faf7ef2eafd80bb7cf39524dd not found: ID does not exist" containerID="a634a29f9100ae42374ab130953a60f60de61e3faf7ef2eafd80bb7cf39524dd" Mar 10 22:07:51 crc kubenswrapper[4919]: I0310 22:07:51.893292 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a634a29f9100ae42374ab130953a60f60de61e3faf7ef2eafd80bb7cf39524dd"} err="failed to get container status \"a634a29f9100ae42374ab130953a60f60de61e3faf7ef2eafd80bb7cf39524dd\": rpc error: code = NotFound desc = could not find container \"a634a29f9100ae42374ab130953a60f60de61e3faf7ef2eafd80bb7cf39524dd\": container with ID starting with a634a29f9100ae42374ab130953a60f60de61e3faf7ef2eafd80bb7cf39524dd not found: ID does not exist" Mar 10 22:07:51 crc kubenswrapper[4919]: I0310 22:07:51.900997 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-l2bnk"] Mar 10 22:07:51 crc kubenswrapper[4919]: I0310 
22:07:51.909277 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-l2bnk"] Mar 10 22:07:52 crc kubenswrapper[4919]: I0310 22:07:52.882222 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-nh4zc" event={"ID":"e0cd634b-f079-4a58-9ac7-7c4f7e90756f","Type":"ContainerStarted","Data":"d62b38ca76888cc9d23e580a47dc974c01fe503bf36c65317a005b369b87e066"} Mar 10 22:07:52 crc kubenswrapper[4919]: I0310 22:07:52.904441 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-nh4zc" podStartSLOduration=2.414554568 podStartE2EDuration="3.904374132s" podCreationTimestamp="2026-03-10 22:07:49 +0000 UTC" firstStartedPulling="2026-03-10 22:07:50.61890307 +0000 UTC m=+1057.860783688" lastFinishedPulling="2026-03-10 22:07:52.108722624 +0000 UTC m=+1059.350603252" observedRunningTime="2026-03-10 22:07:52.90246661 +0000 UTC m=+1060.144347288" watchObservedRunningTime="2026-03-10 22:07:52.904374132 +0000 UTC m=+1060.146254810" Mar 10 22:07:53 crc kubenswrapper[4919]: I0310 22:07:53.493585 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f0473584-1d6e-48ba-bf97-3df3b6426cfc" path="/var/lib/kubelet/pods/f0473584-1d6e-48ba-bf97-3df3b6426cfc/volumes" Mar 10 22:08:00 crc kubenswrapper[4919]: I0310 22:08:00.129473 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553008-6m44j"] Mar 10 22:08:00 crc kubenswrapper[4919]: E0310 22:08:00.130174 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0473584-1d6e-48ba-bf97-3df3b6426cfc" containerName="registry-server" Mar 10 22:08:00 crc kubenswrapper[4919]: I0310 22:08:00.130248 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0473584-1d6e-48ba-bf97-3df3b6426cfc" containerName="registry-server" Mar 10 22:08:00 crc kubenswrapper[4919]: I0310 22:08:00.130354 4919 
memory_manager.go:354] "RemoveStaleState removing state" podUID="f0473584-1d6e-48ba-bf97-3df3b6426cfc" containerName="registry-server" Mar 10 22:08:00 crc kubenswrapper[4919]: I0310 22:08:00.130781 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553008-6m44j" Mar 10 22:08:00 crc kubenswrapper[4919]: I0310 22:08:00.133607 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wbvtv" Mar 10 22:08:00 crc kubenswrapper[4919]: I0310 22:08:00.134171 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 22:08:00 crc kubenswrapper[4919]: I0310 22:08:00.137692 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 22:08:00 crc kubenswrapper[4919]: I0310 22:08:00.142146 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553008-6m44j"] Mar 10 22:08:00 crc kubenswrapper[4919]: I0310 22:08:00.156236 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-nh4zc" Mar 10 22:08:00 crc kubenswrapper[4919]: I0310 22:08:00.156295 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-nh4zc" Mar 10 22:08:00 crc kubenswrapper[4919]: I0310 22:08:00.190309 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qzw5t\" (UniqueName: \"kubernetes.io/projected/a627d2b5-2999-44fc-a23a-af409711896c-kube-api-access-qzw5t\") pod \"auto-csr-approver-29553008-6m44j\" (UID: \"a627d2b5-2999-44fc-a23a-af409711896c\") " pod="openshift-infra/auto-csr-approver-29553008-6m44j" Mar 10 22:08:00 crc kubenswrapper[4919]: I0310 22:08:00.198168 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack-operators/openstack-operator-index-nh4zc"
Mar 10 22:08:00 crc kubenswrapper[4919]: I0310 22:08:00.291075 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qzw5t\" (UniqueName: \"kubernetes.io/projected/a627d2b5-2999-44fc-a23a-af409711896c-kube-api-access-qzw5t\") pod \"auto-csr-approver-29553008-6m44j\" (UID: \"a627d2b5-2999-44fc-a23a-af409711896c\") " pod="openshift-infra/auto-csr-approver-29553008-6m44j"
Mar 10 22:08:00 crc kubenswrapper[4919]: I0310 22:08:00.311588 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qzw5t\" (UniqueName: \"kubernetes.io/projected/a627d2b5-2999-44fc-a23a-af409711896c-kube-api-access-qzw5t\") pod \"auto-csr-approver-29553008-6m44j\" (UID: \"a627d2b5-2999-44fc-a23a-af409711896c\") " pod="openshift-infra/auto-csr-approver-29553008-6m44j"
Mar 10 22:08:00 crc kubenswrapper[4919]: I0310 22:08:00.449856 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553008-6m44j"
Mar 10 22:08:00 crc kubenswrapper[4919]: I0310 22:08:00.891960 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553008-6m44j"]
Mar 10 22:08:00 crc kubenswrapper[4919]: W0310 22:08:00.895819 4919 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda627d2b5_2999_44fc_a23a_af409711896c.slice/crio-1863cdb5c48dd10cea3bcb4437d508744ae898df27b1ab8658ae8473188c0630 WatchSource:0}: Error finding container 1863cdb5c48dd10cea3bcb4437d508744ae898df27b1ab8658ae8473188c0630: Status 404 returned error can't find the container with id 1863cdb5c48dd10cea3bcb4437d508744ae898df27b1ab8658ae8473188c0630
Mar 10 22:08:00 crc kubenswrapper[4919]: I0310 22:08:00.945633 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553008-6m44j" event={"ID":"a627d2b5-2999-44fc-a23a-af409711896c","Type":"ContainerStarted","Data":"1863cdb5c48dd10cea3bcb4437d508744ae898df27b1ab8658ae8473188c0630"}
Mar 10 22:08:00 crc kubenswrapper[4919]: I0310 22:08:00.975605 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-nh4zc"
Mar 10 22:08:02 crc kubenswrapper[4919]: I0310 22:08:02.969073 4919 generic.go:334] "Generic (PLEG): container finished" podID="a627d2b5-2999-44fc-a23a-af409711896c" containerID="67790202df1a8a341aa2320da216ec0d6f4a974001206eb49de031e559a5355d" exitCode=0
Mar 10 22:08:02 crc kubenswrapper[4919]: I0310 22:08:02.969145 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553008-6m44j" event={"ID":"a627d2b5-2999-44fc-a23a-af409711896c","Type":"ContainerDied","Data":"67790202df1a8a341aa2320da216ec0d6f4a974001206eb49de031e559a5355d"}
Mar 10 22:08:04 crc kubenswrapper[4919]: I0310 22:08:04.286692 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553008-6m44j"
Mar 10 22:08:04 crc kubenswrapper[4919]: I0310 22:08:04.342691 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qzw5t\" (UniqueName: \"kubernetes.io/projected/a627d2b5-2999-44fc-a23a-af409711896c-kube-api-access-qzw5t\") pod \"a627d2b5-2999-44fc-a23a-af409711896c\" (UID: \"a627d2b5-2999-44fc-a23a-af409711896c\") "
Mar 10 22:08:04 crc kubenswrapper[4919]: I0310 22:08:04.349682 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a627d2b5-2999-44fc-a23a-af409711896c-kube-api-access-qzw5t" (OuterVolumeSpecName: "kube-api-access-qzw5t") pod "a627d2b5-2999-44fc-a23a-af409711896c" (UID: "a627d2b5-2999-44fc-a23a-af409711896c"). InnerVolumeSpecName "kube-api-access-qzw5t". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 22:08:04 crc kubenswrapper[4919]: I0310 22:08:04.444424 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qzw5t\" (UniqueName: \"kubernetes.io/projected/a627d2b5-2999-44fc-a23a-af409711896c-kube-api-access-qzw5t\") on node \"crc\" DevicePath \"\""
Mar 10 22:08:04 crc kubenswrapper[4919]: I0310 22:08:04.985632 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553008-6m44j" event={"ID":"a627d2b5-2999-44fc-a23a-af409711896c","Type":"ContainerDied","Data":"1863cdb5c48dd10cea3bcb4437d508744ae898df27b1ab8658ae8473188c0630"}
Mar 10 22:08:04 crc kubenswrapper[4919]: I0310 22:08:04.985699 4919 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1863cdb5c48dd10cea3bcb4437d508744ae898df27b1ab8658ae8473188c0630"
Mar 10 22:08:04 crc kubenswrapper[4919]: I0310 22:08:04.985722 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553008-6m44j"
Mar 10 22:08:05 crc kubenswrapper[4919]: I0310 22:08:05.348570 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553002-sskqr"]
Mar 10 22:08:05 crc kubenswrapper[4919]: I0310 22:08:05.353230 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553002-sskqr"]
Mar 10 22:08:05 crc kubenswrapper[4919]: I0310 22:08:05.493698 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="572158b3-be80-481d-84ff-0f1d9759aea8" path="/var/lib/kubelet/pods/572158b3-be80-481d-84ff-0f1d9759aea8/volumes"
Mar 10 22:08:06 crc kubenswrapper[4919]: I0310 22:08:06.481948 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49ba6q872"]
Mar 10 22:08:06 crc kubenswrapper[4919]: E0310 22:08:06.482189 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a627d2b5-2999-44fc-a23a-af409711896c" containerName="oc"
Mar 10 22:08:06 crc kubenswrapper[4919]: I0310 22:08:06.482201 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="a627d2b5-2999-44fc-a23a-af409711896c" containerName="oc"
Mar 10 22:08:06 crc kubenswrapper[4919]: I0310 22:08:06.482306 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="a627d2b5-2999-44fc-a23a-af409711896c" containerName="oc"
Mar 10 22:08:06 crc kubenswrapper[4919]: I0310 22:08:06.483106 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49ba6q872"
Mar 10 22:08:06 crc kubenswrapper[4919]: I0310 22:08:06.484957 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-gg22z"
Mar 10 22:08:06 crc kubenswrapper[4919]: I0310 22:08:06.493525 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49ba6q872"]
Mar 10 22:08:06 crc kubenswrapper[4919]: I0310 22:08:06.676167 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b0b2027b-b395-4af2-967e-77bdd5ccc44e-bundle\") pod \"951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49ba6q872\" (UID: \"b0b2027b-b395-4af2-967e-77bdd5ccc44e\") " pod="openstack-operators/951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49ba6q872"
Mar 10 22:08:06 crc kubenswrapper[4919]: I0310 22:08:06.676245 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b0b2027b-b395-4af2-967e-77bdd5ccc44e-util\") pod \"951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49ba6q872\" (UID: \"b0b2027b-b395-4af2-967e-77bdd5ccc44e\") " pod="openstack-operators/951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49ba6q872"
Mar 10 22:08:06 crc kubenswrapper[4919]: I0310 22:08:06.676336 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6wmw6\" (UniqueName: \"kubernetes.io/projected/b0b2027b-b395-4af2-967e-77bdd5ccc44e-kube-api-access-6wmw6\") pod \"951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49ba6q872\" (UID: \"b0b2027b-b395-4af2-967e-77bdd5ccc44e\") " pod="openstack-operators/951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49ba6q872"
Mar 10 22:08:06 crc kubenswrapper[4919]: I0310 22:08:06.778041 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6wmw6\" (UniqueName: \"kubernetes.io/projected/b0b2027b-b395-4af2-967e-77bdd5ccc44e-kube-api-access-6wmw6\") pod \"951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49ba6q872\" (UID: \"b0b2027b-b395-4af2-967e-77bdd5ccc44e\") " pod="openstack-operators/951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49ba6q872"
Mar 10 22:08:06 crc kubenswrapper[4919]: I0310 22:08:06.778143 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b0b2027b-b395-4af2-967e-77bdd5ccc44e-bundle\") pod \"951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49ba6q872\" (UID: \"b0b2027b-b395-4af2-967e-77bdd5ccc44e\") " pod="openstack-operators/951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49ba6q872"
Mar 10 22:08:06 crc kubenswrapper[4919]: I0310 22:08:06.778172 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b0b2027b-b395-4af2-967e-77bdd5ccc44e-util\") pod \"951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49ba6q872\" (UID: \"b0b2027b-b395-4af2-967e-77bdd5ccc44e\") " pod="openstack-operators/951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49ba6q872"
Mar 10 22:08:06 crc kubenswrapper[4919]: I0310 22:08:06.778616 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b0b2027b-b395-4af2-967e-77bdd5ccc44e-util\") pod \"951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49ba6q872\" (UID: \"b0b2027b-b395-4af2-967e-77bdd5ccc44e\") " pod="openstack-operators/951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49ba6q872"
Mar 10 22:08:06 crc kubenswrapper[4919]: I0310 22:08:06.778770 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b0b2027b-b395-4af2-967e-77bdd5ccc44e-bundle\") pod \"951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49ba6q872\" (UID: \"b0b2027b-b395-4af2-967e-77bdd5ccc44e\") " pod="openstack-operators/951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49ba6q872"
Mar 10 22:08:06 crc kubenswrapper[4919]: I0310 22:08:06.797351 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6wmw6\" (UniqueName: \"kubernetes.io/projected/b0b2027b-b395-4af2-967e-77bdd5ccc44e-kube-api-access-6wmw6\") pod \"951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49ba6q872\" (UID: \"b0b2027b-b395-4af2-967e-77bdd5ccc44e\") " pod="openstack-operators/951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49ba6q872"
Mar 10 22:08:06 crc kubenswrapper[4919]: I0310 22:08:06.800176 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49ba6q872"
Mar 10 22:08:07 crc kubenswrapper[4919]: I0310 22:08:07.009540 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49ba6q872"]
Mar 10 22:08:07 crc kubenswrapper[4919]: W0310 22:08:07.016077 4919 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb0b2027b_b395_4af2_967e_77bdd5ccc44e.slice/crio-bc7d22be1fdec4fd71c800bad1736d6871af5f7fda49718adc3a4fd09d3f10b3 WatchSource:0}: Error finding container bc7d22be1fdec4fd71c800bad1736d6871af5f7fda49718adc3a4fd09d3f10b3: Status 404 returned error can't find the container with id bc7d22be1fdec4fd71c800bad1736d6871af5f7fda49718adc3a4fd09d3f10b3
Mar 10 22:08:08 crc kubenswrapper[4919]: I0310 22:08:08.004063 4919 generic.go:334] "Generic (PLEG): container finished" podID="b0b2027b-b395-4af2-967e-77bdd5ccc44e" containerID="280ff4ab06495daf0ab35d78424780a03556cd5027de697d3c068a5a4e0a742e" exitCode=0
Mar 10 22:08:08 crc kubenswrapper[4919]: I0310 22:08:08.004179 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49ba6q872" event={"ID":"b0b2027b-b395-4af2-967e-77bdd5ccc44e","Type":"ContainerDied","Data":"280ff4ab06495daf0ab35d78424780a03556cd5027de697d3c068a5a4e0a742e"}
Mar 10 22:08:08 crc kubenswrapper[4919]: I0310 22:08:08.004367 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49ba6q872" event={"ID":"b0b2027b-b395-4af2-967e-77bdd5ccc44e","Type":"ContainerStarted","Data":"bc7d22be1fdec4fd71c800bad1736d6871af5f7fda49718adc3a4fd09d3f10b3"}
Mar 10 22:08:09 crc kubenswrapper[4919]: I0310 22:08:09.012513 4919 generic.go:334] "Generic (PLEG): container finished" podID="b0b2027b-b395-4af2-967e-77bdd5ccc44e" containerID="75b55d47be05afb3bc08b19edbb9d7153cb22fad12354f58e7b36b8d0ff7e54b" exitCode=0
Mar 10 22:08:09 crc kubenswrapper[4919]: I0310 22:08:09.012563 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49ba6q872" event={"ID":"b0b2027b-b395-4af2-967e-77bdd5ccc44e","Type":"ContainerDied","Data":"75b55d47be05afb3bc08b19edbb9d7153cb22fad12354f58e7b36b8d0ff7e54b"}
Mar 10 22:08:10 crc kubenswrapper[4919]: I0310 22:08:10.025385 4919 generic.go:334] "Generic (PLEG): container finished" podID="b0b2027b-b395-4af2-967e-77bdd5ccc44e" containerID="3c7b6a7430f0b440bdb0c9c10a1a7d2b0fb8d5bd7bc425dd29ab3a196567ee67" exitCode=0
Mar 10 22:08:10 crc kubenswrapper[4919]: I0310 22:08:10.025497 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49ba6q872" event={"ID":"b0b2027b-b395-4af2-967e-77bdd5ccc44e","Type":"ContainerDied","Data":"3c7b6a7430f0b440bdb0c9c10a1a7d2b0fb8d5bd7bc425dd29ab3a196567ee67"}
Mar 10 22:08:11 crc kubenswrapper[4919]: I0310 22:08:11.379380 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49ba6q872"
Mar 10 22:08:11 crc kubenswrapper[4919]: I0310 22:08:11.539209 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b0b2027b-b395-4af2-967e-77bdd5ccc44e-util\") pod \"b0b2027b-b395-4af2-967e-77bdd5ccc44e\" (UID: \"b0b2027b-b395-4af2-967e-77bdd5ccc44e\") "
Mar 10 22:08:11 crc kubenswrapper[4919]: I0310 22:08:11.539287 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6wmw6\" (UniqueName: \"kubernetes.io/projected/b0b2027b-b395-4af2-967e-77bdd5ccc44e-kube-api-access-6wmw6\") pod \"b0b2027b-b395-4af2-967e-77bdd5ccc44e\" (UID: \"b0b2027b-b395-4af2-967e-77bdd5ccc44e\") "
Mar 10 22:08:11 crc kubenswrapper[4919]: I0310 22:08:11.539434 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b0b2027b-b395-4af2-967e-77bdd5ccc44e-bundle\") pod \"b0b2027b-b395-4af2-967e-77bdd5ccc44e\" (UID: \"b0b2027b-b395-4af2-967e-77bdd5ccc44e\") "
Mar 10 22:08:11 crc kubenswrapper[4919]: I0310 22:08:11.540091 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b0b2027b-b395-4af2-967e-77bdd5ccc44e-bundle" (OuterVolumeSpecName: "bundle") pod "b0b2027b-b395-4af2-967e-77bdd5ccc44e" (UID: "b0b2027b-b395-4af2-967e-77bdd5ccc44e"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 22:08:11 crc kubenswrapper[4919]: I0310 22:08:11.544852 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0b2027b-b395-4af2-967e-77bdd5ccc44e-kube-api-access-6wmw6" (OuterVolumeSpecName: "kube-api-access-6wmw6") pod "b0b2027b-b395-4af2-967e-77bdd5ccc44e" (UID: "b0b2027b-b395-4af2-967e-77bdd5ccc44e"). InnerVolumeSpecName "kube-api-access-6wmw6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 22:08:11 crc kubenswrapper[4919]: I0310 22:08:11.555488 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b0b2027b-b395-4af2-967e-77bdd5ccc44e-util" (OuterVolumeSpecName: "util") pod "b0b2027b-b395-4af2-967e-77bdd5ccc44e" (UID: "b0b2027b-b395-4af2-967e-77bdd5ccc44e"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 22:08:11 crc kubenswrapper[4919]: I0310 22:08:11.641243 4919 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b0b2027b-b395-4af2-967e-77bdd5ccc44e-util\") on node \"crc\" DevicePath \"\""
Mar 10 22:08:11 crc kubenswrapper[4919]: I0310 22:08:11.641294 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6wmw6\" (UniqueName: \"kubernetes.io/projected/b0b2027b-b395-4af2-967e-77bdd5ccc44e-kube-api-access-6wmw6\") on node \"crc\" DevicePath \"\""
Mar 10 22:08:11 crc kubenswrapper[4919]: I0310 22:08:11.641307 4919 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b0b2027b-b395-4af2-967e-77bdd5ccc44e-bundle\") on node \"crc\" DevicePath \"\""
Mar 10 22:08:12 crc kubenswrapper[4919]: I0310 22:08:12.046735 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49ba6q872" event={"ID":"b0b2027b-b395-4af2-967e-77bdd5ccc44e","Type":"ContainerDied","Data":"bc7d22be1fdec4fd71c800bad1736d6871af5f7fda49718adc3a4fd09d3f10b3"}
Mar 10 22:08:12 crc kubenswrapper[4919]: I0310 22:08:12.046792 4919 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bc7d22be1fdec4fd71c800bad1736d6871af5f7fda49718adc3a4fd09d3f10b3"
Mar 10 22:08:12 crc kubenswrapper[4919]: I0310 22:08:12.046789 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49ba6q872"
Mar 10 22:08:19 crc kubenswrapper[4919]: I0310 22:08:19.749325 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-6cf8df7788-mq4dj"]
Mar 10 22:08:19 crc kubenswrapper[4919]: E0310 22:08:19.750085 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0b2027b-b395-4af2-967e-77bdd5ccc44e" containerName="util"
Mar 10 22:08:19 crc kubenswrapper[4919]: I0310 22:08:19.750096 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0b2027b-b395-4af2-967e-77bdd5ccc44e" containerName="util"
Mar 10 22:08:19 crc kubenswrapper[4919]: E0310 22:08:19.750118 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0b2027b-b395-4af2-967e-77bdd5ccc44e" containerName="pull"
Mar 10 22:08:19 crc kubenswrapper[4919]: I0310 22:08:19.750125 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0b2027b-b395-4af2-967e-77bdd5ccc44e" containerName="pull"
Mar 10 22:08:19 crc kubenswrapper[4919]: E0310 22:08:19.750131 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0b2027b-b395-4af2-967e-77bdd5ccc44e" containerName="extract"
Mar 10 22:08:19 crc kubenswrapper[4919]: I0310 22:08:19.750138 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0b2027b-b395-4af2-967e-77bdd5ccc44e" containerName="extract"
Mar 10 22:08:19 crc kubenswrapper[4919]: I0310 22:08:19.750250 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0b2027b-b395-4af2-967e-77bdd5ccc44e" containerName="extract"
Mar 10 22:08:19 crc kubenswrapper[4919]: I0310 22:08:19.750674 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-6cf8df7788-mq4dj"
Mar 10 22:08:19 crc kubenswrapper[4919]: I0310 22:08:19.755273 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-zj46v"
Mar 10 22:08:19 crc kubenswrapper[4919]: I0310 22:08:19.804315 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-6cf8df7788-mq4dj"]
Mar 10 22:08:19 crc kubenswrapper[4919]: I0310 22:08:19.845725 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5sj9b\" (UniqueName: \"kubernetes.io/projected/679d8d06-0146-48c0-b8dc-c26063604a77-kube-api-access-5sj9b\") pod \"openstack-operator-controller-init-6cf8df7788-mq4dj\" (UID: \"679d8d06-0146-48c0-b8dc-c26063604a77\") " pod="openstack-operators/openstack-operator-controller-init-6cf8df7788-mq4dj"
Mar 10 22:08:19 crc kubenswrapper[4919]: I0310 22:08:19.947344 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5sj9b\" (UniqueName: \"kubernetes.io/projected/679d8d06-0146-48c0-b8dc-c26063604a77-kube-api-access-5sj9b\") pod \"openstack-operator-controller-init-6cf8df7788-mq4dj\" (UID: \"679d8d06-0146-48c0-b8dc-c26063604a77\") " pod="openstack-operators/openstack-operator-controller-init-6cf8df7788-mq4dj"
Mar 10 22:08:19 crc kubenswrapper[4919]: I0310 22:08:19.964467 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5sj9b\" (UniqueName: \"kubernetes.io/projected/679d8d06-0146-48c0-b8dc-c26063604a77-kube-api-access-5sj9b\") pod \"openstack-operator-controller-init-6cf8df7788-mq4dj\" (UID: \"679d8d06-0146-48c0-b8dc-c26063604a77\") " pod="openstack-operators/openstack-operator-controller-init-6cf8df7788-mq4dj"
Mar 10 22:08:20 crc kubenswrapper[4919]: I0310 22:08:20.065309 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-6cf8df7788-mq4dj"
Mar 10 22:08:20 crc kubenswrapper[4919]: I0310 22:08:20.301872 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-6cf8df7788-mq4dj"]
Mar 10 22:08:21 crc kubenswrapper[4919]: I0310 22:08:21.133691 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-6cf8df7788-mq4dj" event={"ID":"679d8d06-0146-48c0-b8dc-c26063604a77","Type":"ContainerStarted","Data":"a24fa89361f20b4affd8aa01de15b2852dde7a2beeec45cc4bc34e0de5df4c83"}
Mar 10 22:08:24 crc kubenswrapper[4919]: I0310 22:08:24.339566 4919 scope.go:117] "RemoveContainer" containerID="0c10e23c9b2882b4edd4df3ad3cf23e240305d6863d1e5596de8f4e2791db100"
Mar 10 22:08:26 crc kubenswrapper[4919]: I0310 22:08:26.168135 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-6cf8df7788-mq4dj" event={"ID":"679d8d06-0146-48c0-b8dc-c26063604a77","Type":"ContainerStarted","Data":"d30498628768e42f463338a88519e0a9387366c45f2edfe7a0b7e1a68212527f"}
Mar 10 22:08:26 crc kubenswrapper[4919]: I0310 22:08:26.168762 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-6cf8df7788-mq4dj"
Mar 10 22:08:26 crc kubenswrapper[4919]: I0310 22:08:26.210629 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-6cf8df7788-mq4dj" podStartSLOduration=2.211991315 podStartE2EDuration="7.210605525s" podCreationTimestamp="2026-03-10 22:08:19 +0000 UTC" firstStartedPulling="2026-03-10 22:08:20.311283102 +0000 UTC m=+1087.553163710" lastFinishedPulling="2026-03-10 22:08:25.309897312 +0000 UTC m=+1092.551777920" observedRunningTime="2026-03-10 22:08:26.205199629 +0000 UTC m=+1093.447080277" watchObservedRunningTime="2026-03-10 22:08:26.210605525 +0000 UTC m=+1093.452486143"
Mar 10 22:08:30 crc kubenswrapper[4919]: I0310 22:08:30.068510 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-6cf8df7788-mq4dj"
Mar 10 22:08:59 crc kubenswrapper[4919]: I0310 22:08:59.176181 4919 patch_prober.go:28] interesting pod/machine-config-daemon-z7v4t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 10 22:08:59 crc kubenswrapper[4919]: I0310 22:08:59.177527 4919 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 10 22:09:08 crc kubenswrapper[4919]: I0310 22:09:08.234282 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-677bd678f7-k9qtp"]
Mar 10 22:09:08 crc kubenswrapper[4919]: I0310 22:09:08.235553 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-k9qtp"
Mar 10 22:09:08 crc kubenswrapper[4919]: I0310 22:09:08.238833 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-984cd4dcf-tm5dc"]
Mar 10 22:09:08 crc kubenswrapper[4919]: I0310 22:09:08.239526 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-tm5dc"
Mar 10 22:09:08 crc kubenswrapper[4919]: I0310 22:09:08.246817 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-rkhjq"
Mar 10 22:09:08 crc kubenswrapper[4919]: I0310 22:09:08.246994 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-twqg9"
Mar 10 22:09:08 crc kubenswrapper[4919]: I0310 22:09:08.261814 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-677bd678f7-k9qtp"]
Mar 10 22:09:08 crc kubenswrapper[4919]: I0310 22:09:08.266619 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-984cd4dcf-tm5dc"]
Mar 10 22:09:08 crc kubenswrapper[4919]: I0310 22:09:08.282534 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-66d56f6ff4-tw8tv"]
Mar 10 22:09:08 crc kubenswrapper[4919]: I0310 22:09:08.283505 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-tw8tv"
Mar 10 22:09:08 crc kubenswrapper[4919]: I0310 22:09:08.287780 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-zp6zp"
Mar 10 22:09:08 crc kubenswrapper[4919]: I0310 22:09:08.307444 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-66d56f6ff4-tw8tv"]
Mar 10 22:09:08 crc kubenswrapper[4919]: I0310 22:09:08.334793 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-77b6666d85-4bbgm"]
Mar 10 22:09:08 crc kubenswrapper[4919]: I0310 22:09:08.336315 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-4bbgm"
Mar 10 22:09:08 crc kubenswrapper[4919]: I0310 22:09:08.338381 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-vddrn"
Mar 10 22:09:08 crc kubenswrapper[4919]: I0310 22:09:08.361986 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-5964f64c48-5jgr2"]
Mar 10 22:09:08 crc kubenswrapper[4919]: I0310 22:09:08.362708 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-5jgr2"
Mar 10 22:09:08 crc kubenswrapper[4919]: I0310 22:09:08.366843 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-rhtxx"
Mar 10 22:09:08 crc kubenswrapper[4919]: I0310 22:09:08.379958 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-77b6666d85-4bbgm"]
Mar 10 22:09:08 crc kubenswrapper[4919]: I0310 22:09:08.386503 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bxth4\" (UniqueName: \"kubernetes.io/projected/7fb7e7ce-d1a7-41e2-876e-42808a70c9e2-kube-api-access-bxth4\") pod \"cinder-operator-controller-manager-984cd4dcf-tm5dc\" (UID: \"7fb7e7ce-d1a7-41e2-876e-42808a70c9e2\") " pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-tm5dc"
Mar 10 22:09:08 crc kubenswrapper[4919]: I0310 22:09:08.386581 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pk2vl\" (UniqueName: \"kubernetes.io/projected/25bf7b22-52f2-40ef-bd19-efe9c061e8b8-kube-api-access-pk2vl\") pod \"designate-operator-controller-manager-66d56f6ff4-tw8tv\" (UID: \"25bf7b22-52f2-40ef-bd19-efe9c061e8b8\") " pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-tw8tv"
Mar 10 22:09:08 crc kubenswrapper[4919]: I0310 22:09:08.386605 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rvpd\" (UniqueName: \"kubernetes.io/projected/c6a398e1-5a44-4601-98c8-9ac478b0502c-kube-api-access-6rvpd\") pod \"barbican-operator-controller-manager-677bd678f7-k9qtp\" (UID: \"c6a398e1-5a44-4601-98c8-9ac478b0502c\") " pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-k9qtp"
Mar 10 22:09:08 crc kubenswrapper[4919]: I0310 22:09:08.398268 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-5964f64c48-5jgr2"]
Mar 10 22:09:08 crc kubenswrapper[4919]: I0310 22:09:08.412420 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6d9d6b584d-tsrgx"]
Mar 10 22:09:08 crc kubenswrapper[4919]: I0310 22:09:08.413092 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-tsrgx"
Mar 10 22:09:08 crc kubenswrapper[4919]: I0310 22:09:08.419662 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-ctcpg"
Mar 10 22:09:08 crc kubenswrapper[4919]: I0310 22:09:08.450483 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6d9d6b584d-tsrgx"]
Mar 10 22:09:08 crc kubenswrapper[4919]: I0310 22:09:08.459785 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-5995f4446f-sxg86"]
Mar 10 22:09:08 crc kubenswrapper[4919]: I0310 22:09:08.496534 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pcndr\" (UniqueName: \"kubernetes.io/projected/3de5f2ad-6f5f-4e54-99ad-0d00736dfdab-kube-api-access-pcndr\") pod \"horizon-operator-controller-manager-6d9d6b584d-tsrgx\" (UID: \"3de5f2ad-6f5f-4e54-99ad-0d00736dfdab\") " pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-tsrgx"
Mar 10 22:09:08 crc kubenswrapper[4919]: I0310 22:09:08.496599 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w45g7\" (UniqueName: \"kubernetes.io/projected/1b4083a5-cc88-4c92-9612-f06c0a36936d-kube-api-access-w45g7\") pod \"glance-operator-controller-manager-5964f64c48-5jgr2\" (UID: \"1b4083a5-cc88-4c92-9612-f06c0a36936d\") " pod="openstack-operators/glance-operator-controller-manager-5964f64c48-5jgr2"
Mar 10 22:09:08 crc kubenswrapper[4919]: I0310 22:09:08.496628 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bxth4\" (UniqueName: \"kubernetes.io/projected/7fb7e7ce-d1a7-41e2-876e-42808a70c9e2-kube-api-access-bxth4\") pod \"cinder-operator-controller-manager-984cd4dcf-tm5dc\" (UID: \"7fb7e7ce-d1a7-41e2-876e-42808a70c9e2\") " pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-tm5dc"
Mar 10 22:09:08 crc kubenswrapper[4919]: I0310 22:09:08.496663 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zf74f\" (UniqueName: \"kubernetes.io/projected/ce86157b-1544-4db8-8e8c-20d1ec8dde0d-kube-api-access-zf74f\") pod \"heat-operator-controller-manager-77b6666d85-4bbgm\" (UID: \"ce86157b-1544-4db8-8e8c-20d1ec8dde0d\") " pod="openstack-operators/heat-operator-controller-manager-77b6666d85-4bbgm"
Mar 10 22:09:08 crc kubenswrapper[4919]: I0310 22:09:08.496702 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pk2vl\" (UniqueName: \"kubernetes.io/projected/25bf7b22-52f2-40ef-bd19-efe9c061e8b8-kube-api-access-pk2vl\") pod \"designate-operator-controller-manager-66d56f6ff4-tw8tv\" (UID: \"25bf7b22-52f2-40ef-bd19-efe9c061e8b8\") " pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-tw8tv"
Mar 10 22:09:08 crc kubenswrapper[4919]: I0310 22:09:08.496724 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6rvpd\" (UniqueName: \"kubernetes.io/projected/c6a398e1-5a44-4601-98c8-9ac478b0502c-kube-api-access-6rvpd\") pod \"barbican-operator-controller-manager-677bd678f7-k9qtp\" (UID: \"c6a398e1-5a44-4601-98c8-9ac478b0502c\") " pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-k9qtp"
Mar 10 22:09:08 crc kubenswrapper[4919]: I0310 22:09:08.497427 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-5995f4446f-sxg86"
Mar 10 22:09:08 crc kubenswrapper[4919]: I0310 22:09:08.502690 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6bbb499bbc-7gm62"]
Mar 10 22:09:08 crc kubenswrapper[4919]: I0310 22:09:08.503531 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-7gm62"
Mar 10 22:09:08 crc kubenswrapper[4919]: I0310 22:09:08.516453 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert"
Mar 10 22:09:08 crc kubenswrapper[4919]: I0310 22:09:08.516576 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-7s57n"
Mar 10 22:09:08 crc kubenswrapper[4919]: I0310 22:09:08.517434 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-vd8jd"
Mar 10 22:09:08 crc kubenswrapper[4919]: I0310 22:09:08.538292 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rvpd\" (UniqueName: \"kubernetes.io/projected/c6a398e1-5a44-4601-98c8-9ac478b0502c-kube-api-access-6rvpd\") pod \"barbican-operator-controller-manager-677bd678f7-k9qtp\" (UID: \"c6a398e1-5a44-4601-98c8-9ac478b0502c\") " pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-k9qtp"
Mar 10 22:09:08 crc kubenswrapper[4919]: I0310 22:09:08.552882 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pk2vl\" (UniqueName: \"kubernetes.io/projected/25bf7b22-52f2-40ef-bd19-efe9c061e8b8-kube-api-access-pk2vl\") pod \"designate-operator-controller-manager-66d56f6ff4-tw8tv\" (UID: \"25bf7b22-52f2-40ef-bd19-efe9c061e8b8\") " pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-tw8tv"
Mar 10 22:09:08 crc kubenswrapper[4919]: I0310 22:09:08.571530 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-k9qtp"
Mar 10 22:09:08 crc kubenswrapper[4919]: I0310 22:09:08.585921 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-5995f4446f-sxg86"]
Mar 10 22:09:08 crc kubenswrapper[4919]: I0310 22:09:08.593218 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bxth4\" (UniqueName: \"kubernetes.io/projected/7fb7e7ce-d1a7-41e2-876e-42808a70c9e2-kube-api-access-bxth4\") pod \"cinder-operator-controller-manager-984cd4dcf-tm5dc\" (UID: \"7fb7e7ce-d1a7-41e2-876e-42808a70c9e2\") " pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-tm5dc"
Mar 10 22:09:08 crc kubenswrapper[4919]: I0310 22:09:08.596441 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-684f77d66d-b75jn"]
Mar 10 22:09:08 crc kubenswrapper[4919]: I0310 22:09:08.597493 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-b75jn"
Mar 10 22:09:08 crc kubenswrapper[4919]: I0310 22:09:08.597987 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w45g7\" (UniqueName: \"kubernetes.io/projected/1b4083a5-cc88-4c92-9612-f06c0a36936d-kube-api-access-w45g7\") pod \"glance-operator-controller-manager-5964f64c48-5jgr2\" (UID: \"1b4083a5-cc88-4c92-9612-f06c0a36936d\") " pod="openstack-operators/glance-operator-controller-manager-5964f64c48-5jgr2"
Mar 10 22:09:08 crc kubenswrapper[4919]: I0310 22:09:08.598053 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zf74f\" (UniqueName: \"kubernetes.io/projected/ce86157b-1544-4db8-8e8c-20d1ec8dde0d-kube-api-access-zf74f\") pod \"heat-operator-controller-manager-77b6666d85-4bbgm\" (UID: \"ce86157b-1544-4db8-8e8c-20d1ec8dde0d\") " pod="openstack-operators/heat-operator-controller-manager-77b6666d85-4bbgm"
Mar 10 22:09:08 crc kubenswrapper[4919]: I0310 22:09:08.598071 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a197f90a-0c8f-47e6-ad18-f3c61cd51445-cert\") pod \"infra-operator-controller-manager-5995f4446f-sxg86\" (UID: \"a197f90a-0c8f-47e6-ad18-f3c61cd51445\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-sxg86"
Mar 10 22:09:08 crc kubenswrapper[4919]: I0310 22:09:08.598121 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6zs4m\" (UniqueName: \"kubernetes.io/projected/a197f90a-0c8f-47e6-ad18-f3c61cd51445-kube-api-access-6zs4m\") pod \"infra-operator-controller-manager-5995f4446f-sxg86\" (UID: \"a197f90a-0c8f-47e6-ad18-f3c61cd51445\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-sxg86"
Mar 10 22:09:08 crc kubenswrapper[4919]: I0310 22:09:08.598162
4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pgzv4\" (UniqueName: \"kubernetes.io/projected/8b54176b-b55a-43cd-9492-6f7d10b4e637-kube-api-access-pgzv4\") pod \"ironic-operator-controller-manager-6bbb499bbc-7gm62\" (UID: \"8b54176b-b55a-43cd-9492-6f7d10b4e637\") " pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-7gm62" Mar 10 22:09:08 crc kubenswrapper[4919]: I0310 22:09:08.598184 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pcndr\" (UniqueName: \"kubernetes.io/projected/3de5f2ad-6f5f-4e54-99ad-0d00736dfdab-kube-api-access-pcndr\") pod \"horizon-operator-controller-manager-6d9d6b584d-tsrgx\" (UID: \"3de5f2ad-6f5f-4e54-99ad-0d00736dfdab\") " pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-tsrgx" Mar 10 22:09:08 crc kubenswrapper[4919]: I0310 22:09:08.608856 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-kkwzk" Mar 10 22:09:08 crc kubenswrapper[4919]: I0310 22:09:08.623826 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-tw8tv" Mar 10 22:09:08 crc kubenswrapper[4919]: I0310 22:09:08.631474 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w45g7\" (UniqueName: \"kubernetes.io/projected/1b4083a5-cc88-4c92-9612-f06c0a36936d-kube-api-access-w45g7\") pod \"glance-operator-controller-manager-5964f64c48-5jgr2\" (UID: \"1b4083a5-cc88-4c92-9612-f06c0a36936d\") " pod="openstack-operators/glance-operator-controller-manager-5964f64c48-5jgr2" Mar 10 22:09:08 crc kubenswrapper[4919]: I0310 22:09:08.653968 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zf74f\" (UniqueName: \"kubernetes.io/projected/ce86157b-1544-4db8-8e8c-20d1ec8dde0d-kube-api-access-zf74f\") pod \"heat-operator-controller-manager-77b6666d85-4bbgm\" (UID: \"ce86157b-1544-4db8-8e8c-20d1ec8dde0d\") " pod="openstack-operators/heat-operator-controller-manager-77b6666d85-4bbgm" Mar 10 22:09:08 crc kubenswrapper[4919]: I0310 22:09:08.672544 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6bbb499bbc-7gm62"] Mar 10 22:09:08 crc kubenswrapper[4919]: I0310 22:09:08.678158 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-4bbgm" Mar 10 22:09:08 crc kubenswrapper[4919]: I0310 22:09:08.683632 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pcndr\" (UniqueName: \"kubernetes.io/projected/3de5f2ad-6f5f-4e54-99ad-0d00736dfdab-kube-api-access-pcndr\") pod \"horizon-operator-controller-manager-6d9d6b584d-tsrgx\" (UID: \"3de5f2ad-6f5f-4e54-99ad-0d00736dfdab\") " pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-tsrgx" Mar 10 22:09:08 crc kubenswrapper[4919]: I0310 22:09:08.690067 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-5jgr2" Mar 10 22:09:08 crc kubenswrapper[4919]: I0310 22:09:08.699124 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a197f90a-0c8f-47e6-ad18-f3c61cd51445-cert\") pod \"infra-operator-controller-manager-5995f4446f-sxg86\" (UID: \"a197f90a-0c8f-47e6-ad18-f3c61cd51445\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-sxg86" Mar 10 22:09:08 crc kubenswrapper[4919]: I0310 22:09:08.699208 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6zs4m\" (UniqueName: \"kubernetes.io/projected/a197f90a-0c8f-47e6-ad18-f3c61cd51445-kube-api-access-6zs4m\") pod \"infra-operator-controller-manager-5995f4446f-sxg86\" (UID: \"a197f90a-0c8f-47e6-ad18-f3c61cd51445\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-sxg86" Mar 10 22:09:08 crc kubenswrapper[4919]: I0310 22:09:08.699244 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fr9rd\" (UniqueName: \"kubernetes.io/projected/200ec9b1-fcd0-4699-9002-7efdc5447a6d-kube-api-access-fr9rd\") pod \"keystone-operator-controller-manager-684f77d66d-b75jn\" (UID: \"200ec9b1-fcd0-4699-9002-7efdc5447a6d\") " pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-b75jn" Mar 10 22:09:08 crc kubenswrapper[4919]: I0310 22:09:08.699293 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pgzv4\" (UniqueName: \"kubernetes.io/projected/8b54176b-b55a-43cd-9492-6f7d10b4e637-kube-api-access-pgzv4\") pod \"ironic-operator-controller-manager-6bbb499bbc-7gm62\" (UID: \"8b54176b-b55a-43cd-9492-6f7d10b4e637\") " pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-7gm62" Mar 10 22:09:08 crc kubenswrapper[4919]: E0310 22:09:08.699303 4919 
secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 10 22:09:08 crc kubenswrapper[4919]: E0310 22:09:08.699358 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a197f90a-0c8f-47e6-ad18-f3c61cd51445-cert podName:a197f90a-0c8f-47e6-ad18-f3c61cd51445 nodeName:}" failed. No retries permitted until 2026-03-10 22:09:09.199340027 +0000 UTC m=+1136.441220625 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a197f90a-0c8f-47e6-ad18-f3c61cd51445-cert") pod "infra-operator-controller-manager-5995f4446f-sxg86" (UID: "a197f90a-0c8f-47e6-ad18-f3c61cd51445") : secret "infra-operator-webhook-server-cert" not found Mar 10 22:09:08 crc kubenswrapper[4919]: I0310 22:09:08.723151 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pgzv4\" (UniqueName: \"kubernetes.io/projected/8b54176b-b55a-43cd-9492-6f7d10b4e637-kube-api-access-pgzv4\") pod \"ironic-operator-controller-manager-6bbb499bbc-7gm62\" (UID: \"8b54176b-b55a-43cd-9492-6f7d10b4e637\") " pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-7gm62" Mar 10 22:09:08 crc kubenswrapper[4919]: I0310 22:09:08.723557 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6zs4m\" (UniqueName: \"kubernetes.io/projected/a197f90a-0c8f-47e6-ad18-f3c61cd51445-kube-api-access-6zs4m\") pod \"infra-operator-controller-manager-5995f4446f-sxg86\" (UID: \"a197f90a-0c8f-47e6-ad18-f3c61cd51445\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-sxg86" Mar 10 22:09:08 crc kubenswrapper[4919]: I0310 22:09:08.737647 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-tsrgx" Mar 10 22:09:08 crc kubenswrapper[4919]: I0310 22:09:08.744314 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-684f77d66d-b75jn"] Mar 10 22:09:08 crc kubenswrapper[4919]: I0310 22:09:08.759349 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-68f45f9d9f-7fkgj"] Mar 10 22:09:08 crc kubenswrapper[4919]: I0310 22:09:08.760605 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-7fkgj" Mar 10 22:09:08 crc kubenswrapper[4919]: I0310 22:09:08.765224 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-qpjzg" Mar 10 22:09:08 crc kubenswrapper[4919]: I0310 22:09:08.772846 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-776c5696bf-tvmsl"] Mar 10 22:09:08 crc kubenswrapper[4919]: I0310 22:09:08.773980 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-tvmsl" Mar 10 22:09:08 crc kubenswrapper[4919]: I0310 22:09:08.778250 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-hmpq7" Mar 10 22:09:08 crc kubenswrapper[4919]: I0310 22:09:08.783525 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-658d4cdd5-fjrhv"] Mar 10 22:09:08 crc kubenswrapper[4919]: I0310 22:09:08.784376 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-fjrhv" Mar 10 22:09:08 crc kubenswrapper[4919]: I0310 22:09:08.787144 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-zhbwx" Mar 10 22:09:08 crc kubenswrapper[4919]: I0310 22:09:08.792734 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-68f45f9d9f-7fkgj"] Mar 10 22:09:08 crc kubenswrapper[4919]: I0310 22:09:08.800886 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fr9rd\" (UniqueName: \"kubernetes.io/projected/200ec9b1-fcd0-4699-9002-7efdc5447a6d-kube-api-access-fr9rd\") pod \"keystone-operator-controller-manager-684f77d66d-b75jn\" (UID: \"200ec9b1-fcd0-4699-9002-7efdc5447a6d\") " pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-b75jn" Mar 10 22:09:08 crc kubenswrapper[4919]: I0310 22:09:08.801159 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-776c5696bf-tvmsl"] Mar 10 22:09:08 crc kubenswrapper[4919]: I0310 22:09:08.815451 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-658d4cdd5-fjrhv"] Mar 10 22:09:08 crc kubenswrapper[4919]: I0310 22:09:08.816900 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-569cc54c5-dk2nj"] Mar 10 22:09:08 crc kubenswrapper[4919]: I0310 22:09:08.818173 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-569cc54c5-dk2nj" Mar 10 22:09:08 crc kubenswrapper[4919]: I0310 22:09:08.826865 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-9pb2d" Mar 10 22:09:08 crc kubenswrapper[4919]: I0310 22:09:08.827930 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fr9rd\" (UniqueName: \"kubernetes.io/projected/200ec9b1-fcd0-4699-9002-7efdc5447a6d-kube-api-access-fr9rd\") pod \"keystone-operator-controller-manager-684f77d66d-b75jn\" (UID: \"200ec9b1-fcd0-4699-9002-7efdc5447a6d\") " pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-b75jn" Mar 10 22:09:08 crc kubenswrapper[4919]: I0310 22:09:08.863706 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-569cc54c5-dk2nj"] Mar 10 22:09:08 crc kubenswrapper[4919]: I0310 22:09:08.874430 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-7llx7"] Mar 10 22:09:08 crc kubenswrapper[4919]: I0310 22:09:08.875269 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-7llx7" Mar 10 22:09:08 crc kubenswrapper[4919]: I0310 22:09:08.880128 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-bpz6x" Mar 10 22:09:08 crc kubenswrapper[4919]: I0310 22:09:08.891892 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-tm5dc" Mar 10 22:09:08 crc kubenswrapper[4919]: I0310 22:09:08.895452 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-7llx7"] Mar 10 22:09:08 crc kubenswrapper[4919]: I0310 22:09:08.902664 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-bbc5b68f9-xxntv"] Mar 10 22:09:08 crc kubenswrapper[4919]: I0310 22:09:08.903608 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-xxntv" Mar 10 22:09:08 crc kubenswrapper[4919]: I0310 22:09:08.904679 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gz8hj\" (UniqueName: \"kubernetes.io/projected/7c849cfd-ade6-46ce-80f0-09df981cdafd-kube-api-access-gz8hj\") pod \"manila-operator-controller-manager-68f45f9d9f-7fkgj\" (UID: \"7c849cfd-ade6-46ce-80f0-09df981cdafd\") " pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-7fkgj" Mar 10 22:09:08 crc kubenswrapper[4919]: I0310 22:09:08.904734 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mgnzw\" (UniqueName: \"kubernetes.io/projected/d6ac04fc-f3ea-4b69-aba1-b27490967c0e-kube-api-access-mgnzw\") pod \"nova-operator-controller-manager-569cc54c5-dk2nj\" (UID: \"d6ac04fc-f3ea-4b69-aba1-b27490967c0e\") " pod="openstack-operators/nova-operator-controller-manager-569cc54c5-dk2nj" Mar 10 22:09:08 crc kubenswrapper[4919]: I0310 22:09:08.904760 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-64fkt\" (UniqueName: \"kubernetes.io/projected/588d49a0-7a32-4b7a-be73-ec3897d4653b-kube-api-access-64fkt\") pod 
\"mariadb-operator-controller-manager-658d4cdd5-fjrhv\" (UID: \"588d49a0-7a32-4b7a-be73-ec3897d4653b\") " pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-fjrhv" Mar 10 22:09:08 crc kubenswrapper[4919]: I0310 22:09:08.904813 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrkrf\" (UniqueName: \"kubernetes.io/projected/cbf180f9-e934-462e-926a-520b21f22550-kube-api-access-zrkrf\") pod \"neutron-operator-controller-manager-776c5696bf-tvmsl\" (UID: \"cbf180f9-e934-462e-926a-520b21f22550\") " pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-tvmsl" Mar 10 22:09:08 crc kubenswrapper[4919]: I0310 22:09:08.911065 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-ptb9s" Mar 10 22:09:08 crc kubenswrapper[4919]: I0310 22:09:08.912012 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-7gm62" Mar 10 22:09:08 crc kubenswrapper[4919]: I0310 22:09:08.914196 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-677c674df7-drwxk"] Mar 10 22:09:08 crc kubenswrapper[4919]: I0310 22:09:08.915116 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-677c674df7-drwxk" Mar 10 22:09:08 crc kubenswrapper[4919]: I0310 22:09:08.922790 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-cghkr" Mar 10 22:09:08 crc kubenswrapper[4919]: I0310 22:09:08.922998 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-bbc5b68f9-xxntv"] Mar 10 22:09:08 crc kubenswrapper[4919]: I0310 22:09:08.940280 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6647d7885f8bgkq"] Mar 10 22:09:08 crc kubenswrapper[4919]: I0310 22:09:08.941359 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6647d7885f8bgkq" Mar 10 22:09:08 crc kubenswrapper[4919]: I0310 22:09:08.944076 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Mar 10 22:09:08 crc kubenswrapper[4919]: I0310 22:09:08.944093 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-cbxv2" Mar 10 22:09:08 crc kubenswrapper[4919]: I0310 22:09:08.945374 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-677c674df7-drwxk"] Mar 10 22:09:08 crc kubenswrapper[4919]: I0310 22:09:08.972487 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-574d45c66c-9ccwt"] Mar 10 22:09:08 crc kubenswrapper[4919]: I0310 22:09:08.973313 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-9ccwt" Mar 10 22:09:08 crc kubenswrapper[4919]: I0310 22:09:08.976673 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-7t5q8" Mar 10 22:09:08 crc kubenswrapper[4919]: I0310 22:09:08.993459 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-dds4c"] Mar 10 22:09:08 crc kubenswrapper[4919]: I0310 22:09:08.994282 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-dds4c" Mar 10 22:09:08 crc kubenswrapper[4919]: I0310 22:09:08.997194 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-spmd9" Mar 10 22:09:08 crc kubenswrapper[4919]: I0310 22:09:08.999642 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-574d45c66c-9ccwt"] Mar 10 22:09:09 crc kubenswrapper[4919]: I0310 22:09:09.004230 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6647d7885f8bgkq"] Mar 10 22:09:09 crc kubenswrapper[4919]: I0310 22:09:09.006091 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/768191d9-b4b4-44da-a525-b2ba92e1ceea-cert\") pod \"openstack-baremetal-operator-controller-manager-6647d7885f8bgkq\" (UID: \"768191d9-b4b4-44da-a525-b2ba92e1ceea\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6647d7885f8bgkq" Mar 10 22:09:09 crc kubenswrapper[4919]: I0310 22:09:09.006137 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mgnzw\" (UniqueName: 
\"kubernetes.io/projected/d6ac04fc-f3ea-4b69-aba1-b27490967c0e-kube-api-access-mgnzw\") pod \"nova-operator-controller-manager-569cc54c5-dk2nj\" (UID: \"d6ac04fc-f3ea-4b69-aba1-b27490967c0e\") " pod="openstack-operators/nova-operator-controller-manager-569cc54c5-dk2nj" Mar 10 22:09:09 crc kubenswrapper[4919]: I0310 22:09:09.006169 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-64fkt\" (UniqueName: \"kubernetes.io/projected/588d49a0-7a32-4b7a-be73-ec3897d4653b-kube-api-access-64fkt\") pod \"mariadb-operator-controller-manager-658d4cdd5-fjrhv\" (UID: \"588d49a0-7a32-4b7a-be73-ec3897d4653b\") " pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-fjrhv" Mar 10 22:09:09 crc kubenswrapper[4919]: I0310 22:09:09.006192 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ll8c9\" (UniqueName: \"kubernetes.io/projected/d9571b0c-bd43-4789-942b-f833e4166418-kube-api-access-ll8c9\") pod \"octavia-operator-controller-manager-5f4f55cb5c-7llx7\" (UID: \"d9571b0c-bd43-4789-942b-f833e4166418\") " pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-7llx7" Mar 10 22:09:09 crc kubenswrapper[4919]: I0310 22:09:09.006217 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5gx2\" (UniqueName: \"kubernetes.io/projected/768191d9-b4b4-44da-a525-b2ba92e1ceea-kube-api-access-p5gx2\") pod \"openstack-baremetal-operator-controller-manager-6647d7885f8bgkq\" (UID: \"768191d9-b4b4-44da-a525-b2ba92e1ceea\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6647d7885f8bgkq" Mar 10 22:09:09 crc kubenswrapper[4919]: I0310 22:09:09.006274 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zrkrf\" (UniqueName: \"kubernetes.io/projected/cbf180f9-e934-462e-926a-520b21f22550-kube-api-access-zrkrf\") pod 
\"neutron-operator-controller-manager-776c5696bf-tvmsl\" (UID: \"cbf180f9-e934-462e-926a-520b21f22550\") " pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-tvmsl" Mar 10 22:09:09 crc kubenswrapper[4919]: I0310 22:09:09.006296 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mqrx2\" (UniqueName: \"kubernetes.io/projected/20945b47-e70f-45b2-b137-9525a0ec1b31-kube-api-access-mqrx2\") pod \"ovn-operator-controller-manager-bbc5b68f9-xxntv\" (UID: \"20945b47-e70f-45b2-b137-9525a0ec1b31\") " pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-xxntv" Mar 10 22:09:09 crc kubenswrapper[4919]: I0310 22:09:09.006315 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99xnb\" (UniqueName: \"kubernetes.io/projected/2da3cd1a-93e3-4487-ab1f-b022662e06c0-kube-api-access-99xnb\") pod \"swift-operator-controller-manager-677c674df7-drwxk\" (UID: \"2da3cd1a-93e3-4487-ab1f-b022662e06c0\") " pod="openstack-operators/swift-operator-controller-manager-677c674df7-drwxk" Mar 10 22:09:09 crc kubenswrapper[4919]: I0310 22:09:09.006344 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gz8hj\" (UniqueName: \"kubernetes.io/projected/7c849cfd-ade6-46ce-80f0-09df981cdafd-kube-api-access-gz8hj\") pod \"manila-operator-controller-manager-68f45f9d9f-7fkgj\" (UID: \"7c849cfd-ade6-46ce-80f0-09df981cdafd\") " pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-7fkgj" Mar 10 22:09:09 crc kubenswrapper[4919]: I0310 22:09:09.009470 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-dds4c"] Mar 10 22:09:09 crc kubenswrapper[4919]: I0310 22:09:09.015082 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-kl9vj"] Mar 10 22:09:09 
crc kubenswrapper[4919]: I0310 22:09:09.016147 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-kl9vj" Mar 10 22:09:09 crc kubenswrapper[4919]: I0310 22:09:09.020248 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-kl9vj"] Mar 10 22:09:09 crc kubenswrapper[4919]: I0310 22:09:09.020746 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-vmwh9" Mar 10 22:09:09 crc kubenswrapper[4919]: I0310 22:09:09.027928 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-b75jn" Mar 10 22:09:09 crc kubenswrapper[4919]: I0310 22:09:09.053865 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-64fkt\" (UniqueName: \"kubernetes.io/projected/588d49a0-7a32-4b7a-be73-ec3897d4653b-kube-api-access-64fkt\") pod \"mariadb-operator-controller-manager-658d4cdd5-fjrhv\" (UID: \"588d49a0-7a32-4b7a-be73-ec3897d4653b\") " pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-fjrhv" Mar 10 22:09:09 crc kubenswrapper[4919]: I0310 22:09:09.054976 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gz8hj\" (UniqueName: \"kubernetes.io/projected/7c849cfd-ade6-46ce-80f0-09df981cdafd-kube-api-access-gz8hj\") pod \"manila-operator-controller-manager-68f45f9d9f-7fkgj\" (UID: \"7c849cfd-ade6-46ce-80f0-09df981cdafd\") " pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-7fkgj" Mar 10 22:09:09 crc kubenswrapper[4919]: I0310 22:09:09.064930 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mgnzw\" (UniqueName: \"kubernetes.io/projected/d6ac04fc-f3ea-4b69-aba1-b27490967c0e-kube-api-access-mgnzw\") pod 
\"nova-operator-controller-manager-569cc54c5-dk2nj\" (UID: \"d6ac04fc-f3ea-4b69-aba1-b27490967c0e\") " pod="openstack-operators/nova-operator-controller-manager-569cc54c5-dk2nj" Mar 10 22:09:09 crc kubenswrapper[4919]: I0310 22:09:09.065251 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrkrf\" (UniqueName: \"kubernetes.io/projected/cbf180f9-e934-462e-926a-520b21f22550-kube-api-access-zrkrf\") pod \"neutron-operator-controller-manager-776c5696bf-tvmsl\" (UID: \"cbf180f9-e934-462e-926a-520b21f22550\") " pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-tvmsl" Mar 10 22:09:09 crc kubenswrapper[4919]: I0310 22:09:09.081469 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6dd88c6f67-lkpnv"] Mar 10 22:09:09 crc kubenswrapper[4919]: I0310 22:09:09.085115 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-lkpnv" Mar 10 22:09:09 crc kubenswrapper[4919]: I0310 22:09:09.087873 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-6mhkr" Mar 10 22:09:09 crc kubenswrapper[4919]: I0310 22:09:09.088019 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6dd88c6f67-lkpnv"] Mar 10 22:09:09 crc kubenswrapper[4919]: I0310 22:09:09.097278 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-7fkgj" Mar 10 22:09:09 crc kubenswrapper[4919]: I0310 22:09:09.107102 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d88zr\" (UniqueName: \"kubernetes.io/projected/512df763-915b-447f-b5c3-0756788212d6-kube-api-access-d88zr\") pod \"placement-operator-controller-manager-574d45c66c-9ccwt\" (UID: \"512df763-915b-447f-b5c3-0756788212d6\") " pod="openstack-operators/placement-operator-controller-manager-574d45c66c-9ccwt" Mar 10 22:09:09 crc kubenswrapper[4919]: I0310 22:09:09.107146 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ll8c9\" (UniqueName: \"kubernetes.io/projected/d9571b0c-bd43-4789-942b-f833e4166418-kube-api-access-ll8c9\") pod \"octavia-operator-controller-manager-5f4f55cb5c-7llx7\" (UID: \"d9571b0c-bd43-4789-942b-f833e4166418\") " pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-7llx7" Mar 10 22:09:09 crc kubenswrapper[4919]: I0310 22:09:09.107174 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p5gx2\" (UniqueName: \"kubernetes.io/projected/768191d9-b4b4-44da-a525-b2ba92e1ceea-kube-api-access-p5gx2\") pod \"openstack-baremetal-operator-controller-manager-6647d7885f8bgkq\" (UID: \"768191d9-b4b4-44da-a525-b2ba92e1ceea\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6647d7885f8bgkq" Mar 10 22:09:09 crc kubenswrapper[4919]: I0310 22:09:09.107194 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xv65f\" (UniqueName: \"kubernetes.io/projected/83ccb82d-2701-46e7-9aa9-3ed962bc31e0-kube-api-access-xv65f\") pod \"telemetry-operator-controller-manager-6cd66dbd4b-dds4c\" (UID: \"83ccb82d-2701-46e7-9aa9-3ed962bc31e0\") " 
pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-dds4c" Mar 10 22:09:09 crc kubenswrapper[4919]: I0310 22:09:09.107230 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7gfh\" (UniqueName: \"kubernetes.io/projected/fca37d2a-51b1-4b60-a7e5-0ebfbf87fb04-kube-api-access-s7gfh\") pod \"test-operator-controller-manager-5c5cb9c4d7-kl9vj\" (UID: \"fca37d2a-51b1-4b60-a7e5-0ebfbf87fb04\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-kl9vj" Mar 10 22:09:09 crc kubenswrapper[4919]: I0310 22:09:09.107262 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mqrx2\" (UniqueName: \"kubernetes.io/projected/20945b47-e70f-45b2-b137-9525a0ec1b31-kube-api-access-mqrx2\") pod \"ovn-operator-controller-manager-bbc5b68f9-xxntv\" (UID: \"20945b47-e70f-45b2-b137-9525a0ec1b31\") " pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-xxntv" Mar 10 22:09:09 crc kubenswrapper[4919]: I0310 22:09:09.107279 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-99xnb\" (UniqueName: \"kubernetes.io/projected/2da3cd1a-93e3-4487-ab1f-b022662e06c0-kube-api-access-99xnb\") pod \"swift-operator-controller-manager-677c674df7-drwxk\" (UID: \"2da3cd1a-93e3-4487-ab1f-b022662e06c0\") " pod="openstack-operators/swift-operator-controller-manager-677c674df7-drwxk" Mar 10 22:09:09 crc kubenswrapper[4919]: I0310 22:09:09.107324 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/768191d9-b4b4-44da-a525-b2ba92e1ceea-cert\") pod \"openstack-baremetal-operator-controller-manager-6647d7885f8bgkq\" (UID: \"768191d9-b4b4-44da-a525-b2ba92e1ceea\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6647d7885f8bgkq" Mar 10 22:09:09 crc kubenswrapper[4919]: E0310 22:09:09.107442 4919 secret.go:188] 
Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 10 22:09:09 crc kubenswrapper[4919]: E0310 22:09:09.107484 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/768191d9-b4b4-44da-a525-b2ba92e1ceea-cert podName:768191d9-b4b4-44da-a525-b2ba92e1ceea nodeName:}" failed. No retries permitted until 2026-03-10 22:09:09.607469745 +0000 UTC m=+1136.849350353 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/768191d9-b4b4-44da-a525-b2ba92e1ceea-cert") pod "openstack-baremetal-operator-controller-manager-6647d7885f8bgkq" (UID: "768191d9-b4b4-44da-a525-b2ba92e1ceea") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 10 22:09:09 crc kubenswrapper[4919]: I0310 22:09:09.109817 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-tvmsl" Mar 10 22:09:09 crc kubenswrapper[4919]: I0310 22:09:09.131874 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-fjrhv" Mar 10 22:09:09 crc kubenswrapper[4919]: I0310 22:09:09.132505 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5gx2\" (UniqueName: \"kubernetes.io/projected/768191d9-b4b4-44da-a525-b2ba92e1ceea-kube-api-access-p5gx2\") pod \"openstack-baremetal-operator-controller-manager-6647d7885f8bgkq\" (UID: \"768191d9-b4b4-44da-a525-b2ba92e1ceea\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6647d7885f8bgkq" Mar 10 22:09:09 crc kubenswrapper[4919]: I0310 22:09:09.135158 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-99xnb\" (UniqueName: \"kubernetes.io/projected/2da3cd1a-93e3-4487-ab1f-b022662e06c0-kube-api-access-99xnb\") pod \"swift-operator-controller-manager-677c674df7-drwxk\" (UID: \"2da3cd1a-93e3-4487-ab1f-b022662e06c0\") " pod="openstack-operators/swift-operator-controller-manager-677c674df7-drwxk" Mar 10 22:09:09 crc kubenswrapper[4919]: I0310 22:09:09.138086 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ll8c9\" (UniqueName: \"kubernetes.io/projected/d9571b0c-bd43-4789-942b-f833e4166418-kube-api-access-ll8c9\") pod \"octavia-operator-controller-manager-5f4f55cb5c-7llx7\" (UID: \"d9571b0c-bd43-4789-942b-f833e4166418\") " pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-7llx7" Mar 10 22:09:09 crc kubenswrapper[4919]: I0310 22:09:09.139735 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mqrx2\" (UniqueName: \"kubernetes.io/projected/20945b47-e70f-45b2-b137-9525a0ec1b31-kube-api-access-mqrx2\") pod \"ovn-operator-controller-manager-bbc5b68f9-xxntv\" (UID: \"20945b47-e70f-45b2-b137-9525a0ec1b31\") " pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-xxntv" Mar 10 22:09:09 crc kubenswrapper[4919]: I0310 22:09:09.173230 4919 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6679ddfdc7-pnxdc"] Mar 10 22:09:09 crc kubenswrapper[4919]: I0310 22:09:09.174109 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-569cc54c5-dk2nj" Mar 10 22:09:09 crc kubenswrapper[4919]: I0310 22:09:09.174342 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-6679ddfdc7-pnxdc" Mar 10 22:09:09 crc kubenswrapper[4919]: I0310 22:09:09.179252 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-2rtpf" Mar 10 22:09:09 crc kubenswrapper[4919]: I0310 22:09:09.179807 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Mar 10 22:09:09 crc kubenswrapper[4919]: I0310 22:09:09.187601 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Mar 10 22:09:09 crc kubenswrapper[4919]: I0310 22:09:09.203742 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6679ddfdc7-pnxdc"] Mar 10 22:09:09 crc kubenswrapper[4919]: I0310 22:09:09.212170 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ktzq2\" (UniqueName: \"kubernetes.io/projected/fb78ece6-180c-4237-8017-ec3087e0f47b-kube-api-access-ktzq2\") pod \"watcher-operator-controller-manager-6dd88c6f67-lkpnv\" (UID: \"fb78ece6-180c-4237-8017-ec3087e0f47b\") " pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-lkpnv" Mar 10 22:09:09 crc kubenswrapper[4919]: I0310 22:09:09.212253 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d88zr\" (UniqueName: 
\"kubernetes.io/projected/512df763-915b-447f-b5c3-0756788212d6-kube-api-access-d88zr\") pod \"placement-operator-controller-manager-574d45c66c-9ccwt\" (UID: \"512df763-915b-447f-b5c3-0756788212d6\") " pod="openstack-operators/placement-operator-controller-manager-574d45c66c-9ccwt" Mar 10 22:09:09 crc kubenswrapper[4919]: I0310 22:09:09.212399 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xv65f\" (UniqueName: \"kubernetes.io/projected/83ccb82d-2701-46e7-9aa9-3ed962bc31e0-kube-api-access-xv65f\") pod \"telemetry-operator-controller-manager-6cd66dbd4b-dds4c\" (UID: \"83ccb82d-2701-46e7-9aa9-3ed962bc31e0\") " pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-dds4c" Mar 10 22:09:09 crc kubenswrapper[4919]: I0310 22:09:09.212430 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a197f90a-0c8f-47e6-ad18-f3c61cd51445-cert\") pod \"infra-operator-controller-manager-5995f4446f-sxg86\" (UID: \"a197f90a-0c8f-47e6-ad18-f3c61cd51445\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-sxg86" Mar 10 22:09:09 crc kubenswrapper[4919]: I0310 22:09:09.212448 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s7gfh\" (UniqueName: \"kubernetes.io/projected/fca37d2a-51b1-4b60-a7e5-0ebfbf87fb04-kube-api-access-s7gfh\") pod \"test-operator-controller-manager-5c5cb9c4d7-kl9vj\" (UID: \"fca37d2a-51b1-4b60-a7e5-0ebfbf87fb04\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-kl9vj" Mar 10 22:09:09 crc kubenswrapper[4919]: E0310 22:09:09.213459 4919 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 10 22:09:09 crc kubenswrapper[4919]: E0310 22:09:09.214369 4919 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/a197f90a-0c8f-47e6-ad18-f3c61cd51445-cert podName:a197f90a-0c8f-47e6-ad18-f3c61cd51445 nodeName:}" failed. No retries permitted until 2026-03-10 22:09:10.21434657 +0000 UTC m=+1137.456227178 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a197f90a-0c8f-47e6-ad18-f3c61cd51445-cert") pod "infra-operator-controller-manager-5995f4446f-sxg86" (UID: "a197f90a-0c8f-47e6-ad18-f3c61cd51445") : secret "infra-operator-webhook-server-cert" not found Mar 10 22:09:09 crc kubenswrapper[4919]: I0310 22:09:09.215901 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-7llx7" Mar 10 22:09:09 crc kubenswrapper[4919]: I0310 22:09:09.233532 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d88zr\" (UniqueName: \"kubernetes.io/projected/512df763-915b-447f-b5c3-0756788212d6-kube-api-access-d88zr\") pod \"placement-operator-controller-manager-574d45c66c-9ccwt\" (UID: \"512df763-915b-447f-b5c3-0756788212d6\") " pod="openstack-operators/placement-operator-controller-manager-574d45c66c-9ccwt" Mar 10 22:09:09 crc kubenswrapper[4919]: I0310 22:09:09.240210 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xv65f\" (UniqueName: \"kubernetes.io/projected/83ccb82d-2701-46e7-9aa9-3ed962bc31e0-kube-api-access-xv65f\") pod \"telemetry-operator-controller-manager-6cd66dbd4b-dds4c\" (UID: \"83ccb82d-2701-46e7-9aa9-3ed962bc31e0\") " pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-dds4c" Mar 10 22:09:09 crc kubenswrapper[4919]: I0310 22:09:09.244612 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7gfh\" (UniqueName: \"kubernetes.io/projected/fca37d2a-51b1-4b60-a7e5-0ebfbf87fb04-kube-api-access-s7gfh\") pod \"test-operator-controller-manager-5c5cb9c4d7-kl9vj\" (UID: 
\"fca37d2a-51b1-4b60-a7e5-0ebfbf87fb04\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-kl9vj" Mar 10 22:09:09 crc kubenswrapper[4919]: I0310 22:09:09.257227 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-rmbn4"] Mar 10 22:09:09 crc kubenswrapper[4919]: I0310 22:09:09.260129 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-rmbn4" Mar 10 22:09:09 crc kubenswrapper[4919]: I0310 22:09:09.263836 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-rzphv" Mar 10 22:09:09 crc kubenswrapper[4919]: I0310 22:09:09.267751 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-rmbn4"] Mar 10 22:09:09 crc kubenswrapper[4919]: I0310 22:09:09.314612 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ktzq2\" (UniqueName: \"kubernetes.io/projected/fb78ece6-180c-4237-8017-ec3087e0f47b-kube-api-access-ktzq2\") pod \"watcher-operator-controller-manager-6dd88c6f67-lkpnv\" (UID: \"fb78ece6-180c-4237-8017-ec3087e0f47b\") " pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-lkpnv" Mar 10 22:09:09 crc kubenswrapper[4919]: I0310 22:09:09.314685 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bhq6v\" (UniqueName: \"kubernetes.io/projected/fd276fb4-a047-472f-903a-8b343ec3894b-kube-api-access-bhq6v\") pod \"openstack-operator-controller-manager-6679ddfdc7-pnxdc\" (UID: \"fd276fb4-a047-472f-903a-8b343ec3894b\") " pod="openstack-operators/openstack-operator-controller-manager-6679ddfdc7-pnxdc" Mar 10 22:09:09 crc kubenswrapper[4919]: I0310 22:09:09.314731 4919 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fd276fb4-a047-472f-903a-8b343ec3894b-metrics-certs\") pod \"openstack-operator-controller-manager-6679ddfdc7-pnxdc\" (UID: \"fd276fb4-a047-472f-903a-8b343ec3894b\") " pod="openstack-operators/openstack-operator-controller-manager-6679ddfdc7-pnxdc" Mar 10 22:09:09 crc kubenswrapper[4919]: I0310 22:09:09.314855 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/fd276fb4-a047-472f-903a-8b343ec3894b-webhook-certs\") pod \"openstack-operator-controller-manager-6679ddfdc7-pnxdc\" (UID: \"fd276fb4-a047-472f-903a-8b343ec3894b\") " pod="openstack-operators/openstack-operator-controller-manager-6679ddfdc7-pnxdc" Mar 10 22:09:09 crc kubenswrapper[4919]: I0310 22:09:09.336877 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ktzq2\" (UniqueName: \"kubernetes.io/projected/fb78ece6-180c-4237-8017-ec3087e0f47b-kube-api-access-ktzq2\") pod \"watcher-operator-controller-manager-6dd88c6f67-lkpnv\" (UID: \"fb78ece6-180c-4237-8017-ec3087e0f47b\") " pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-lkpnv" Mar 10 22:09:09 crc kubenswrapper[4919]: I0310 22:09:09.375883 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-xxntv" Mar 10 22:09:09 crc kubenswrapper[4919]: I0310 22:09:09.398210 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-677bd678f7-k9qtp"] Mar 10 22:09:09 crc kubenswrapper[4919]: I0310 22:09:09.418956 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-677c674df7-drwxk" Mar 10 22:09:09 crc kubenswrapper[4919]: I0310 22:09:09.419551 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/fd276fb4-a047-472f-903a-8b343ec3894b-webhook-certs\") pod \"openstack-operator-controller-manager-6679ddfdc7-pnxdc\" (UID: \"fd276fb4-a047-472f-903a-8b343ec3894b\") " pod="openstack-operators/openstack-operator-controller-manager-6679ddfdc7-pnxdc" Mar 10 22:09:09 crc kubenswrapper[4919]: I0310 22:09:09.419636 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bhq6v\" (UniqueName: \"kubernetes.io/projected/fd276fb4-a047-472f-903a-8b343ec3894b-kube-api-access-bhq6v\") pod \"openstack-operator-controller-manager-6679ddfdc7-pnxdc\" (UID: \"fd276fb4-a047-472f-903a-8b343ec3894b\") " pod="openstack-operators/openstack-operator-controller-manager-6679ddfdc7-pnxdc" Mar 10 22:09:09 crc kubenswrapper[4919]: I0310 22:09:09.419670 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-llkdl\" (UniqueName: \"kubernetes.io/projected/a60700c9-46a9-4e84-9c13-afbb96d42f55-kube-api-access-llkdl\") pod \"rabbitmq-cluster-operator-manager-668c99d594-rmbn4\" (UID: \"a60700c9-46a9-4e84-9c13-afbb96d42f55\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-rmbn4" Mar 10 22:09:09 crc kubenswrapper[4919]: I0310 22:09:09.419689 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fd276fb4-a047-472f-903a-8b343ec3894b-metrics-certs\") pod \"openstack-operator-controller-manager-6679ddfdc7-pnxdc\" (UID: \"fd276fb4-a047-472f-903a-8b343ec3894b\") " pod="openstack-operators/openstack-operator-controller-manager-6679ddfdc7-pnxdc" Mar 10 22:09:09 crc kubenswrapper[4919]: E0310 22:09:09.419766 
4919 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 10 22:09:09 crc kubenswrapper[4919]: E0310 22:09:09.419806 4919 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 10 22:09:09 crc kubenswrapper[4919]: E0310 22:09:09.419824 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fd276fb4-a047-472f-903a-8b343ec3894b-webhook-certs podName:fd276fb4-a047-472f-903a-8b343ec3894b nodeName:}" failed. No retries permitted until 2026-03-10 22:09:09.919806967 +0000 UTC m=+1137.161687575 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/fd276fb4-a047-472f-903a-8b343ec3894b-webhook-certs") pod "openstack-operator-controller-manager-6679ddfdc7-pnxdc" (UID: "fd276fb4-a047-472f-903a-8b343ec3894b") : secret "webhook-server-cert" not found Mar 10 22:09:09 crc kubenswrapper[4919]: E0310 22:09:09.419854 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fd276fb4-a047-472f-903a-8b343ec3894b-metrics-certs podName:fd276fb4-a047-472f-903a-8b343ec3894b nodeName:}" failed. No retries permitted until 2026-03-10 22:09:09.919838048 +0000 UTC m=+1137.161718656 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fd276fb4-a047-472f-903a-8b343ec3894b-metrics-certs") pod "openstack-operator-controller-manager-6679ddfdc7-pnxdc" (UID: "fd276fb4-a047-472f-903a-8b343ec3894b") : secret "metrics-server-cert" not found Mar 10 22:09:09 crc kubenswrapper[4919]: I0310 22:09:09.448654 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bhq6v\" (UniqueName: \"kubernetes.io/projected/fd276fb4-a047-472f-903a-8b343ec3894b-kube-api-access-bhq6v\") pod \"openstack-operator-controller-manager-6679ddfdc7-pnxdc\" (UID: \"fd276fb4-a047-472f-903a-8b343ec3894b\") " pod="openstack-operators/openstack-operator-controller-manager-6679ddfdc7-pnxdc" Mar 10 22:09:09 crc kubenswrapper[4919]: I0310 22:09:09.457787 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-9ccwt" Mar 10 22:09:09 crc kubenswrapper[4919]: I0310 22:09:09.466570 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-dds4c" Mar 10 22:09:09 crc kubenswrapper[4919]: I0310 22:09:09.492793 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-kl9vj" Mar 10 22:09:09 crc kubenswrapper[4919]: I0310 22:09:09.509650 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-66d56f6ff4-tw8tv"] Mar 10 22:09:09 crc kubenswrapper[4919]: I0310 22:09:09.509692 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6d9d6b584d-tsrgx"] Mar 10 22:09:09 crc kubenswrapper[4919]: I0310 22:09:09.520881 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-llkdl\" (UniqueName: \"kubernetes.io/projected/a60700c9-46a9-4e84-9c13-afbb96d42f55-kube-api-access-llkdl\") pod \"rabbitmq-cluster-operator-manager-668c99d594-rmbn4\" (UID: \"a60700c9-46a9-4e84-9c13-afbb96d42f55\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-rmbn4" Mar 10 22:09:09 crc kubenswrapper[4919]: I0310 22:09:09.526567 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-k9qtp" event={"ID":"c6a398e1-5a44-4601-98c8-9ac478b0502c","Type":"ContainerStarted","Data":"7705605bab2ff444607303a6c7d0dd8f43a9adcbb964b66168e0df99c0ae9edd"} Mar 10 22:09:09 crc kubenswrapper[4919]: I0310 22:09:09.543096 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-llkdl\" (UniqueName: \"kubernetes.io/projected/a60700c9-46a9-4e84-9c13-afbb96d42f55-kube-api-access-llkdl\") pod \"rabbitmq-cluster-operator-manager-668c99d594-rmbn4\" (UID: \"a60700c9-46a9-4e84-9c13-afbb96d42f55\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-rmbn4" Mar 10 22:09:09 crc kubenswrapper[4919]: I0310 22:09:09.543277 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-lkpnv" Mar 10 22:09:09 crc kubenswrapper[4919]: W0310 22:09:09.574257 4919 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3de5f2ad_6f5f_4e54_99ad_0d00736dfdab.slice/crio-f419d1f6adc55ad4eef56bd486a8f62fef63a2ebac74a96cb98984df221e04ed WatchSource:0}: Error finding container f419d1f6adc55ad4eef56bd486a8f62fef63a2ebac74a96cb98984df221e04ed: Status 404 returned error can't find the container with id f419d1f6adc55ad4eef56bd486a8f62fef63a2ebac74a96cb98984df221e04ed Mar 10 22:09:09 crc kubenswrapper[4919]: I0310 22:09:09.607521 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-rmbn4" Mar 10 22:09:09 crc kubenswrapper[4919]: I0310 22:09:09.629655 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/768191d9-b4b4-44da-a525-b2ba92e1ceea-cert\") pod \"openstack-baremetal-operator-controller-manager-6647d7885f8bgkq\" (UID: \"768191d9-b4b4-44da-a525-b2ba92e1ceea\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6647d7885f8bgkq" Mar 10 22:09:09 crc kubenswrapper[4919]: E0310 22:09:09.629981 4919 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 10 22:09:09 crc kubenswrapper[4919]: E0310 22:09:09.630051 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/768191d9-b4b4-44da-a525-b2ba92e1ceea-cert podName:768191d9-b4b4-44da-a525-b2ba92e1ceea nodeName:}" failed. No retries permitted until 2026-03-10 22:09:10.630029883 +0000 UTC m=+1137.871910571 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/768191d9-b4b4-44da-a525-b2ba92e1ceea-cert") pod "openstack-baremetal-operator-controller-manager-6647d7885f8bgkq" (UID: "768191d9-b4b4-44da-a525-b2ba92e1ceea") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 10 22:09:09 crc kubenswrapper[4919]: I0310 22:09:09.734235 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6bbb499bbc-7gm62"] Mar 10 22:09:09 crc kubenswrapper[4919]: I0310 22:09:09.763933 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-77b6666d85-4bbgm"] Mar 10 22:09:09 crc kubenswrapper[4919]: I0310 22:09:09.779059 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-5964f64c48-5jgr2"] Mar 10 22:09:09 crc kubenswrapper[4919]: I0310 22:09:09.805568 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-984cd4dcf-tm5dc"] Mar 10 22:09:09 crc kubenswrapper[4919]: I0310 22:09:09.863262 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-684f77d66d-b75jn"] Mar 10 22:09:09 crc kubenswrapper[4919]: W0310 22:09:09.868597 4919 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7c849cfd_ade6_46ce_80f0_09df981cdafd.slice/crio-fe200215447ea22731baa2fcebf05886974635124e61faeb3f8049363db05c1a WatchSource:0}: Error finding container fe200215447ea22731baa2fcebf05886974635124e61faeb3f8049363db05c1a: Status 404 returned error can't find the container with id fe200215447ea22731baa2fcebf05886974635124e61faeb3f8049363db05c1a Mar 10 22:09:09 crc kubenswrapper[4919]: I0310 22:09:09.870764 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/manila-operator-controller-manager-68f45f9d9f-7fkgj"] Mar 10 22:09:09 crc kubenswrapper[4919]: W0310 22:09:09.887764 4919 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod200ec9b1_fcd0_4699_9002_7efdc5447a6d.slice/crio-d6e6a9d03d30895f36582314fa4f32e3e944996c686ef5ee0a3c3eb301044c40 WatchSource:0}: Error finding container d6e6a9d03d30895f36582314fa4f32e3e944996c686ef5ee0a3c3eb301044c40: Status 404 returned error can't find the container with id d6e6a9d03d30895f36582314fa4f32e3e944996c686ef5ee0a3c3eb301044c40 Mar 10 22:09:09 crc kubenswrapper[4919]: I0310 22:09:09.928114 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-7llx7"] Mar 10 22:09:09 crc kubenswrapper[4919]: W0310 22:09:09.934178 4919 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd9571b0c_bd43_4789_942b_f833e4166418.slice/crio-b35f46cda76828e3ce0271ebf281c320c183a59eaaf0eb79ff5cfe3d5dd13c66 WatchSource:0}: Error finding container b35f46cda76828e3ce0271ebf281c320c183a59eaaf0eb79ff5cfe3d5dd13c66: Status 404 returned error can't find the container with id b35f46cda76828e3ce0271ebf281c320c183a59eaaf0eb79ff5cfe3d5dd13c66 Mar 10 22:09:09 crc kubenswrapper[4919]: I0310 22:09:09.936672 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fd276fb4-a047-472f-903a-8b343ec3894b-metrics-certs\") pod \"openstack-operator-controller-manager-6679ddfdc7-pnxdc\" (UID: \"fd276fb4-a047-472f-903a-8b343ec3894b\") " pod="openstack-operators/openstack-operator-controller-manager-6679ddfdc7-pnxdc" Mar 10 22:09:09 crc kubenswrapper[4919]: I0310 22:09:09.936766 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: 
\"kubernetes.io/secret/fd276fb4-a047-472f-903a-8b343ec3894b-webhook-certs\") pod \"openstack-operator-controller-manager-6679ddfdc7-pnxdc\" (UID: \"fd276fb4-a047-472f-903a-8b343ec3894b\") " pod="openstack-operators/openstack-operator-controller-manager-6679ddfdc7-pnxdc" Mar 10 22:09:09 crc kubenswrapper[4919]: E0310 22:09:09.936900 4919 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 10 22:09:09 crc kubenswrapper[4919]: E0310 22:09:09.936949 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fd276fb4-a047-472f-903a-8b343ec3894b-webhook-certs podName:fd276fb4-a047-472f-903a-8b343ec3894b nodeName:}" failed. No retries permitted until 2026-03-10 22:09:10.936933098 +0000 UTC m=+1138.178813706 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/fd276fb4-a047-472f-903a-8b343ec3894b-webhook-certs") pod "openstack-operator-controller-manager-6679ddfdc7-pnxdc" (UID: "fd276fb4-a047-472f-903a-8b343ec3894b") : secret "webhook-server-cert" not found Mar 10 22:09:09 crc kubenswrapper[4919]: E0310 22:09:09.936988 4919 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 10 22:09:09 crc kubenswrapper[4919]: E0310 22:09:09.937049 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fd276fb4-a047-472f-903a-8b343ec3894b-metrics-certs podName:fd276fb4-a047-472f-903a-8b343ec3894b nodeName:}" failed. No retries permitted until 2026-03-10 22:09:10.93702924 +0000 UTC m=+1138.178909888 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fd276fb4-a047-472f-903a-8b343ec3894b-metrics-certs") pod "openstack-operator-controller-manager-6679ddfdc7-pnxdc" (UID: "fd276fb4-a047-472f-903a-8b343ec3894b") : secret "metrics-server-cert" not found Mar 10 22:09:09 crc kubenswrapper[4919]: I0310 22:09:09.960018 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-658d4cdd5-fjrhv"] Mar 10 22:09:09 crc kubenswrapper[4919]: I0310 22:09:09.972253 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-569cc54c5-dk2nj"] Mar 10 22:09:09 crc kubenswrapper[4919]: E0310 22:09:09.980917 4919 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:2f63ddf5c95c6c82f6e04bc9f7f20d56dc003614647726ab00276239eec40b7f,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-mqrx2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-bbc5b68f9-xxntv_openstack-operators(20945b47-e70f-45b2-b137-9525a0ec1b31): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 10 22:09:09 crc kubenswrapper[4919]: I0310 22:09:09.981310 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-bbc5b68f9-xxntv"] Mar 10 22:09:09 crc kubenswrapper[4919]: E0310 22:09:09.982111 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-xxntv" podUID="20945b47-e70f-45b2-b137-9525a0ec1b31" Mar 10 22:09:10 crc kubenswrapper[4919]: I0310 22:09:10.091928 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/neutron-operator-controller-manager-776c5696bf-tvmsl"] Mar 10 22:09:10 crc kubenswrapper[4919]: W0310 22:09:10.097058 4919 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcbf180f9_e934_462e_926a_520b21f22550.slice/crio-d03c0f1210f576aacfe9774266b88d657176c615ed7e846cb9a3255214be69b1 WatchSource:0}: Error finding container d03c0f1210f576aacfe9774266b88d657176c615ed7e846cb9a3255214be69b1: Status 404 returned error can't find the container with id d03c0f1210f576aacfe9774266b88d657176c615ed7e846cb9a3255214be69b1 Mar 10 22:09:10 crc kubenswrapper[4919]: I0310 22:09:10.172141 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-574d45c66c-9ccwt"] Mar 10 22:09:10 crc kubenswrapper[4919]: I0310 22:09:10.182595 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-dds4c"] Mar 10 22:09:10 crc kubenswrapper[4919]: I0310 22:09:10.194544 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-677c674df7-drwxk"] Mar 10 22:09:10 crc kubenswrapper[4919]: W0310 22:09:10.198244 4919 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod83ccb82d_2701_46e7_9aa9_3ed962bc31e0.slice/crio-00bac32bd0e47d9ba967b73be71f21f87d75cf5e3e22e5d3cdf678681f0092c2 WatchSource:0}: Error finding container 00bac32bd0e47d9ba967b73be71f21f87d75cf5e3e22e5d3cdf678681f0092c2: Status 404 returned error can't find the container with id 00bac32bd0e47d9ba967b73be71f21f87d75cf5e3e22e5d3cdf678681f0092c2 Mar 10 22:09:10 crc kubenswrapper[4919]: W0310 22:09:10.198904 4919 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod512df763_915b_447f_b5c3_0756788212d6.slice/crio-9b3325067684a75341a931cf15f688e094c1a7dafc244a9d54ffb0b439f85573 WatchSource:0}: Error finding container 9b3325067684a75341a931cf15f688e094c1a7dafc244a9d54ffb0b439f85573: Status 404 returned error can't find the container with id 9b3325067684a75341a931cf15f688e094c1a7dafc244a9d54ffb0b439f85573 Mar 10 22:09:10 crc kubenswrapper[4919]: E0310 22:09:10.199832 4919 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:c223309f51714785bd878ad04080f7428567edad793be4f992d492abd77af44c,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-99xnb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-677c674df7-drwxk_openstack-operators(2da3cd1a-93e3-4487-ab1f-b022662e06c0): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 10 22:09:10 crc kubenswrapper[4919]: E0310 22:09:10.200579 4919 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:27c84b712abc2df6108e22636075eec25fea0229800f38594a492fd41b02c49d,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-xv65f,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-6cd66dbd4b-dds4c_openstack-operators(83ccb82d-2701-46e7-9aa9-3ed962bc31e0): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 10 22:09:10 crc kubenswrapper[4919]: E0310 22:09:10.201689 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-dds4c" podUID="83ccb82d-2701-46e7-9aa9-3ed962bc31e0" Mar 10 22:09:10 crc 
kubenswrapper[4919]: E0310 22:09:10.201693 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/swift-operator-controller-manager-677c674df7-drwxk" podUID="2da3cd1a-93e3-4487-ab1f-b022662e06c0" Mar 10 22:09:10 crc kubenswrapper[4919]: E0310 22:09:10.202537 4919 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:e7e865363955c670e41b6c042c4f87abceff78f5495ba5c5c82988baad45c978,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-d88zr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-574d45c66c-9ccwt_openstack-operators(512df763-915b-447f-b5c3-0756788212d6): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 10 22:09:10 crc kubenswrapper[4919]: E0310 22:09:10.203724 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-9ccwt" podUID="512df763-915b-447f-b5c3-0756788212d6" Mar 10 22:09:10 crc kubenswrapper[4919]: I0310 22:09:10.243316 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a197f90a-0c8f-47e6-ad18-f3c61cd51445-cert\") pod \"infra-operator-controller-manager-5995f4446f-sxg86\" (UID: \"a197f90a-0c8f-47e6-ad18-f3c61cd51445\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-sxg86" Mar 10 22:09:10 crc kubenswrapper[4919]: E0310 22:09:10.243550 4919 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 10 22:09:10 crc kubenswrapper[4919]: E0310 
22:09:10.243641 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a197f90a-0c8f-47e6-ad18-f3c61cd51445-cert podName:a197f90a-0c8f-47e6-ad18-f3c61cd51445 nodeName:}" failed. No retries permitted until 2026-03-10 22:09:12.243618897 +0000 UTC m=+1139.485499595 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a197f90a-0c8f-47e6-ad18-f3c61cd51445-cert") pod "infra-operator-controller-manager-5995f4446f-sxg86" (UID: "a197f90a-0c8f-47e6-ad18-f3c61cd51445") : secret "infra-operator-webhook-server-cert" not found Mar 10 22:09:10 crc kubenswrapper[4919]: I0310 22:09:10.286080 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-rmbn4"] Mar 10 22:09:10 crc kubenswrapper[4919]: W0310 22:09:10.289051 4919 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfca37d2a_51b1_4b60_a7e5_0ebfbf87fb04.slice/crio-767f79fd976951599970d4c3a137b6559b6c4565ab988d5948a476045fbd433a WatchSource:0}: Error finding container 767f79fd976951599970d4c3a137b6559b6c4565ab988d5948a476045fbd433a: Status 404 returned error can't find the container with id 767f79fd976951599970d4c3a137b6559b6c4565ab988d5948a476045fbd433a Mar 10 22:09:10 crc kubenswrapper[4919]: W0310 22:09:10.289942 4919 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda60700c9_46a9_4e84_9c13_afbb96d42f55.slice/crio-c3369ef55bbdf6d48fca86ade2f2f8906824061bb90c794d1924c027f38629e8 WatchSource:0}: Error finding container c3369ef55bbdf6d48fca86ade2f2f8906824061bb90c794d1924c027f38629e8: Status 404 returned error can't find the container with id c3369ef55bbdf6d48fca86ade2f2f8906824061bb90c794d1924c027f38629e8 Mar 10 22:09:10 crc kubenswrapper[4919]: I0310 22:09:10.292204 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-kl9vj"] Mar 10 22:09:10 crc kubenswrapper[4919]: W0310 22:09:10.292547 4919 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfb78ece6_180c_4237_8017_ec3087e0f47b.slice/crio-91c0b123eab17a97a399c83ec4ab9ecb5bbae52993517a8eb38a64c96d05a304 WatchSource:0}: Error finding container 91c0b123eab17a97a399c83ec4ab9ecb5bbae52993517a8eb38a64c96d05a304: Status 404 returned error can't find the container with id 91c0b123eab17a97a399c83ec4ab9ecb5bbae52993517a8eb38a64c96d05a304 Mar 10 22:09:10 crc kubenswrapper[4919]: E0310 22:09:10.292774 4919 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-llkdl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-rmbn4_openstack-operators(a60700c9-46a9-4e84-9c13-afbb96d42f55): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 10 22:09:10 crc kubenswrapper[4919]: E0310 22:09:10.293925 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-rmbn4" podUID="a60700c9-46a9-4e84-9c13-afbb96d42f55" Mar 10 22:09:10 crc kubenswrapper[4919]: E0310 22:09:10.297177 4919 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:4af709a2a6a1a1abb9659dbdd6fb3818122bdec7e66009fcced0bf0949f91554,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-ktzq2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-6dd88c6f67-lkpnv_openstack-operators(fb78ece6-180c-4237-8017-ec3087e0f47b): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 10 22:09:10 crc kubenswrapper[4919]: I0310 22:09:10.297986 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6dd88c6f67-lkpnv"] Mar 10 22:09:10 crc kubenswrapper[4919]: E0310 22:09:10.298434 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-lkpnv" podUID="fb78ece6-180c-4237-8017-ec3087e0f47b" Mar 10 22:09:10 crc kubenswrapper[4919]: I0310 22:09:10.535186 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-569cc54c5-dk2nj" event={"ID":"d6ac04fc-f3ea-4b69-aba1-b27490967c0e","Type":"ContainerStarted","Data":"e7f0283a22e16772f0ee2c7bf1149de22fa325b6c521de6d049a5b1f2e9c7cb7"} Mar 10 22:09:10 crc kubenswrapper[4919]: I0310 22:09:10.536847 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/heat-operator-controller-manager-77b6666d85-4bbgm" event={"ID":"ce86157b-1544-4db8-8e8c-20d1ec8dde0d","Type":"ContainerStarted","Data":"ca1ac95d8c9e18b20b3eb0dc9eb2b25d8e21047d293ddb57f71e44e75776626e"} Mar 10 22:09:10 crc kubenswrapper[4919]: I0310 22:09:10.539738 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-xxntv" event={"ID":"20945b47-e70f-45b2-b137-9525a0ec1b31","Type":"ContainerStarted","Data":"aee4a90b361150d94fd8a950a5b79eb3f2da7aa125f344e4a476c4eb68d021b6"} Mar 10 22:09:10 crc kubenswrapper[4919]: E0310 22:09:10.541190 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:2f63ddf5c95c6c82f6e04bc9f7f20d56dc003614647726ab00276239eec40b7f\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-xxntv" podUID="20945b47-e70f-45b2-b137-9525a0ec1b31" Mar 10 22:09:10 crc kubenswrapper[4919]: I0310 22:09:10.541611 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-5jgr2" event={"ID":"1b4083a5-cc88-4c92-9612-f06c0a36936d","Type":"ContainerStarted","Data":"02ca197b73ce78ae407e6633c48abc8622830558fcd8680b9184ecc16f4e2adc"} Mar 10 22:09:10 crc kubenswrapper[4919]: I0310 22:09:10.547184 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-tm5dc" event={"ID":"7fb7e7ce-d1a7-41e2-876e-42808a70c9e2","Type":"ContainerStarted","Data":"4208c56827ac4f2507f73552c2f36223f67f8cc17181dbca6463e7f583963a3d"} Mar 10 22:09:10 crc kubenswrapper[4919]: I0310 22:09:10.549007 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-lkpnv" 
event={"ID":"fb78ece6-180c-4237-8017-ec3087e0f47b","Type":"ContainerStarted","Data":"91c0b123eab17a97a399c83ec4ab9ecb5bbae52993517a8eb38a64c96d05a304"} Mar 10 22:09:10 crc kubenswrapper[4919]: E0310 22:09:10.553117 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:4af709a2a6a1a1abb9659dbdd6fb3818122bdec7e66009fcced0bf0949f91554\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-lkpnv" podUID="fb78ece6-180c-4237-8017-ec3087e0f47b" Mar 10 22:09:10 crc kubenswrapper[4919]: I0310 22:09:10.553811 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-9ccwt" event={"ID":"512df763-915b-447f-b5c3-0756788212d6","Type":"ContainerStarted","Data":"9b3325067684a75341a931cf15f688e094c1a7dafc244a9d54ffb0b439f85573"} Mar 10 22:09:10 crc kubenswrapper[4919]: E0310 22:09:10.555077 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:e7e865363955c670e41b6c042c4f87abceff78f5495ba5c5c82988baad45c978\\\"\"" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-9ccwt" podUID="512df763-915b-447f-b5c3-0756788212d6" Mar 10 22:09:10 crc kubenswrapper[4919]: I0310 22:09:10.560416 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-677c674df7-drwxk" event={"ID":"2da3cd1a-93e3-4487-ab1f-b022662e06c0","Type":"ContainerStarted","Data":"bcbbe5ac76c9043f37b0048d142229caf8938793ef74f3fe7908db208596fbc3"} Mar 10 22:09:10 crc kubenswrapper[4919]: E0310 22:09:10.561711 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: 
\"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:c223309f51714785bd878ad04080f7428567edad793be4f992d492abd77af44c\\\"\"" pod="openstack-operators/swift-operator-controller-manager-677c674df7-drwxk" podUID="2da3cd1a-93e3-4487-ab1f-b022662e06c0" Mar 10 22:09:10 crc kubenswrapper[4919]: I0310 22:09:10.563618 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-fjrhv" event={"ID":"588d49a0-7a32-4b7a-be73-ec3897d4653b","Type":"ContainerStarted","Data":"fd9e21d0ce6be8296828df28a2cc7d5239c6ec6466cc5df0cd9e44a301f5b74f"} Mar 10 22:09:10 crc kubenswrapper[4919]: I0310 22:09:10.565170 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-tvmsl" event={"ID":"cbf180f9-e934-462e-926a-520b21f22550","Type":"ContainerStarted","Data":"d03c0f1210f576aacfe9774266b88d657176c615ed7e846cb9a3255214be69b1"} Mar 10 22:09:10 crc kubenswrapper[4919]: I0310 22:09:10.568157 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-7gm62" event={"ID":"8b54176b-b55a-43cd-9492-6f7d10b4e637","Type":"ContainerStarted","Data":"25d82e16486891dfb0bb7c3525079bd903fed02370e697a824889574ef5ec979"} Mar 10 22:09:10 crc kubenswrapper[4919]: I0310 22:09:10.572808 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-tw8tv" event={"ID":"25bf7b22-52f2-40ef-bd19-efe9c061e8b8","Type":"ContainerStarted","Data":"74a1c4ea2ddaa5e4559fadac4b01426765c3a23a24760e64f38fc32c8be1dc77"} Mar 10 22:09:10 crc kubenswrapper[4919]: I0310 22:09:10.576331 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-rmbn4" 
event={"ID":"a60700c9-46a9-4e84-9c13-afbb96d42f55","Type":"ContainerStarted","Data":"c3369ef55bbdf6d48fca86ade2f2f8906824061bb90c794d1924c027f38629e8"} Mar 10 22:09:10 crc kubenswrapper[4919]: E0310 22:09:10.578538 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-rmbn4" podUID="a60700c9-46a9-4e84-9c13-afbb96d42f55" Mar 10 22:09:10 crc kubenswrapper[4919]: I0310 22:09:10.579579 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-dds4c" event={"ID":"83ccb82d-2701-46e7-9aa9-3ed962bc31e0","Type":"ContainerStarted","Data":"00bac32bd0e47d9ba967b73be71f21f87d75cf5e3e22e5d3cdf678681f0092c2"} Mar 10 22:09:10 crc kubenswrapper[4919]: E0310 22:09:10.582624 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:27c84b712abc2df6108e22636075eec25fea0229800f38594a492fd41b02c49d\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-dds4c" podUID="83ccb82d-2701-46e7-9aa9-3ed962bc31e0" Mar 10 22:09:10 crc kubenswrapper[4919]: I0310 22:09:10.595113 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-tsrgx" event={"ID":"3de5f2ad-6f5f-4e54-99ad-0d00736dfdab","Type":"ContainerStarted","Data":"f419d1f6adc55ad4eef56bd486a8f62fef63a2ebac74a96cb98984df221e04ed"} Mar 10 22:09:10 crc kubenswrapper[4919]: I0310 22:09:10.600623 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-7fkgj" event={"ID":"7c849cfd-ade6-46ce-80f0-09df981cdafd","Type":"ContainerStarted","Data":"fe200215447ea22731baa2fcebf05886974635124e61faeb3f8049363db05c1a"} Mar 10 22:09:10 crc kubenswrapper[4919]: I0310 22:09:10.604318 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-kl9vj" event={"ID":"fca37d2a-51b1-4b60-a7e5-0ebfbf87fb04","Type":"ContainerStarted","Data":"767f79fd976951599970d4c3a137b6559b6c4565ab988d5948a476045fbd433a"} Mar 10 22:09:10 crc kubenswrapper[4919]: I0310 22:09:10.607147 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-7llx7" event={"ID":"d9571b0c-bd43-4789-942b-f833e4166418","Type":"ContainerStarted","Data":"b35f46cda76828e3ce0271ebf281c320c183a59eaaf0eb79ff5cfe3d5dd13c66"} Mar 10 22:09:10 crc kubenswrapper[4919]: I0310 22:09:10.608311 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-b75jn" event={"ID":"200ec9b1-fcd0-4699-9002-7efdc5447a6d","Type":"ContainerStarted","Data":"d6e6a9d03d30895f36582314fa4f32e3e944996c686ef5ee0a3c3eb301044c40"} Mar 10 22:09:10 crc kubenswrapper[4919]: I0310 22:09:10.651933 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/768191d9-b4b4-44da-a525-b2ba92e1ceea-cert\") pod \"openstack-baremetal-operator-controller-manager-6647d7885f8bgkq\" (UID: \"768191d9-b4b4-44da-a525-b2ba92e1ceea\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6647d7885f8bgkq" Mar 10 22:09:10 crc kubenswrapper[4919]: E0310 22:09:10.652875 4919 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 10 22:09:10 crc 
kubenswrapper[4919]: E0310 22:09:10.652924 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/768191d9-b4b4-44da-a525-b2ba92e1ceea-cert podName:768191d9-b4b4-44da-a525-b2ba92e1ceea nodeName:}" failed. No retries permitted until 2026-03-10 22:09:12.652910897 +0000 UTC m=+1139.894791505 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/768191d9-b4b4-44da-a525-b2ba92e1ceea-cert") pod "openstack-baremetal-operator-controller-manager-6647d7885f8bgkq" (UID: "768191d9-b4b4-44da-a525-b2ba92e1ceea") : secret "openstack-baremetal-operator-webhook-server-cert" not found
Mar 10 22:09:10 crc kubenswrapper[4919]: E0310 22:09:10.956848 4919 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found
Mar 10 22:09:10 crc kubenswrapper[4919]: E0310 22:09:10.956940 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fd276fb4-a047-472f-903a-8b343ec3894b-metrics-certs podName:fd276fb4-a047-472f-903a-8b343ec3894b nodeName:}" failed. No retries permitted until 2026-03-10 22:09:12.956920613 +0000 UTC m=+1140.198801221 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fd276fb4-a047-472f-903a-8b343ec3894b-metrics-certs") pod "openstack-operator-controller-manager-6679ddfdc7-pnxdc" (UID: "fd276fb4-a047-472f-903a-8b343ec3894b") : secret "metrics-server-cert" not found
Mar 10 22:09:10 crc kubenswrapper[4919]: I0310 22:09:10.957230 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fd276fb4-a047-472f-903a-8b343ec3894b-metrics-certs\") pod \"openstack-operator-controller-manager-6679ddfdc7-pnxdc\" (UID: \"fd276fb4-a047-472f-903a-8b343ec3894b\") " pod="openstack-operators/openstack-operator-controller-manager-6679ddfdc7-pnxdc"
Mar 10 22:09:10 crc kubenswrapper[4919]: I0310 22:09:10.957322 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/fd276fb4-a047-472f-903a-8b343ec3894b-webhook-certs\") pod \"openstack-operator-controller-manager-6679ddfdc7-pnxdc\" (UID: \"fd276fb4-a047-472f-903a-8b343ec3894b\") " pod="openstack-operators/openstack-operator-controller-manager-6679ddfdc7-pnxdc"
Mar 10 22:09:10 crc kubenswrapper[4919]: E0310 22:09:10.957465 4919 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found
Mar 10 22:09:10 crc kubenswrapper[4919]: E0310 22:09:10.957492 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fd276fb4-a047-472f-903a-8b343ec3894b-webhook-certs podName:fd276fb4-a047-472f-903a-8b343ec3894b nodeName:}" failed. No retries permitted until 2026-03-10 22:09:12.957483468 +0000 UTC m=+1140.199364066 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/fd276fb4-a047-472f-903a-8b343ec3894b-webhook-certs") pod "openstack-operator-controller-manager-6679ddfdc7-pnxdc" (UID: "fd276fb4-a047-472f-903a-8b343ec3894b") : secret "webhook-server-cert" not found
Mar 10 22:09:11 crc kubenswrapper[4919]: E0310 22:09:11.617484 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:c223309f51714785bd878ad04080f7428567edad793be4f992d492abd77af44c\\\"\"" pod="openstack-operators/swift-operator-controller-manager-677c674df7-drwxk" podUID="2da3cd1a-93e3-4487-ab1f-b022662e06c0"
Mar 10 22:09:11 crc kubenswrapper[4919]: E0310 22:09:11.617490 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:27c84b712abc2df6108e22636075eec25fea0229800f38594a492fd41b02c49d\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-dds4c" podUID="83ccb82d-2701-46e7-9aa9-3ed962bc31e0"
Mar 10 22:09:11 crc kubenswrapper[4919]: E0310 22:09:11.617487 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-rmbn4" podUID="a60700c9-46a9-4e84-9c13-afbb96d42f55"
Mar 10 22:09:11 crc kubenswrapper[4919]: E0310 22:09:11.619618 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:4af709a2a6a1a1abb9659dbdd6fb3818122bdec7e66009fcced0bf0949f91554\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-lkpnv" podUID="fb78ece6-180c-4237-8017-ec3087e0f47b"
Mar 10 22:09:11 crc kubenswrapper[4919]: E0310 22:09:11.619904 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:e7e865363955c670e41b6c042c4f87abceff78f5495ba5c5c82988baad45c978\\\"\"" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-9ccwt" podUID="512df763-915b-447f-b5c3-0756788212d6"
Mar 10 22:09:11 crc kubenswrapper[4919]: E0310 22:09:11.619941 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:2f63ddf5c95c6c82f6e04bc9f7f20d56dc003614647726ab00276239eec40b7f\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-xxntv" podUID="20945b47-e70f-45b2-b137-9525a0ec1b31"
Mar 10 22:09:12 crc kubenswrapper[4919]: I0310 22:09:12.281211 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a197f90a-0c8f-47e6-ad18-f3c61cd51445-cert\") pod \"infra-operator-controller-manager-5995f4446f-sxg86\" (UID: \"a197f90a-0c8f-47e6-ad18-f3c61cd51445\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-sxg86"
Mar 10 22:09:12 crc kubenswrapper[4919]: E0310 22:09:12.281416 4919 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Mar 10 22:09:12 crc kubenswrapper[4919]: E0310 22:09:12.281487 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a197f90a-0c8f-47e6-ad18-f3c61cd51445-cert podName:a197f90a-0c8f-47e6-ad18-f3c61cd51445 nodeName:}" failed. No retries permitted until 2026-03-10 22:09:16.28146748 +0000 UTC m=+1143.523348088 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a197f90a-0c8f-47e6-ad18-f3c61cd51445-cert") pod "infra-operator-controller-manager-5995f4446f-sxg86" (UID: "a197f90a-0c8f-47e6-ad18-f3c61cd51445") : secret "infra-operator-webhook-server-cert" not found
Mar 10 22:09:12 crc kubenswrapper[4919]: I0310 22:09:12.688402 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/768191d9-b4b4-44da-a525-b2ba92e1ceea-cert\") pod \"openstack-baremetal-operator-controller-manager-6647d7885f8bgkq\" (UID: \"768191d9-b4b4-44da-a525-b2ba92e1ceea\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6647d7885f8bgkq"
Mar 10 22:09:12 crc kubenswrapper[4919]: E0310 22:09:12.688613 4919 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found
Mar 10 22:09:12 crc kubenswrapper[4919]: E0310 22:09:12.688658 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/768191d9-b4b4-44da-a525-b2ba92e1ceea-cert podName:768191d9-b4b4-44da-a525-b2ba92e1ceea nodeName:}" failed. No retries permitted until 2026-03-10 22:09:16.688644101 +0000 UTC m=+1143.930524709 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/768191d9-b4b4-44da-a525-b2ba92e1ceea-cert") pod "openstack-baremetal-operator-controller-manager-6647d7885f8bgkq" (UID: "768191d9-b4b4-44da-a525-b2ba92e1ceea") : secret "openstack-baremetal-operator-webhook-server-cert" not found
Mar 10 22:09:12 crc kubenswrapper[4919]: I0310 22:09:12.993404 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/fd276fb4-a047-472f-903a-8b343ec3894b-webhook-certs\") pod \"openstack-operator-controller-manager-6679ddfdc7-pnxdc\" (UID: \"fd276fb4-a047-472f-903a-8b343ec3894b\") " pod="openstack-operators/openstack-operator-controller-manager-6679ddfdc7-pnxdc"
Mar 10 22:09:12 crc kubenswrapper[4919]: I0310 22:09:12.993502 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fd276fb4-a047-472f-903a-8b343ec3894b-metrics-certs\") pod \"openstack-operator-controller-manager-6679ddfdc7-pnxdc\" (UID: \"fd276fb4-a047-472f-903a-8b343ec3894b\") " pod="openstack-operators/openstack-operator-controller-manager-6679ddfdc7-pnxdc"
Mar 10 22:09:12 crc kubenswrapper[4919]: E0310 22:09:12.993620 4919 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found
Mar 10 22:09:12 crc kubenswrapper[4919]: E0310 22:09:12.993663 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fd276fb4-a047-472f-903a-8b343ec3894b-metrics-certs podName:fd276fb4-a047-472f-903a-8b343ec3894b nodeName:}" failed. No retries permitted until 2026-03-10 22:09:16.993649625 +0000 UTC m=+1144.235530233 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fd276fb4-a047-472f-903a-8b343ec3894b-metrics-certs") pod "openstack-operator-controller-manager-6679ddfdc7-pnxdc" (UID: "fd276fb4-a047-472f-903a-8b343ec3894b") : secret "metrics-server-cert" not found
Mar 10 22:09:12 crc kubenswrapper[4919]: E0310 22:09:12.994001 4919 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found
Mar 10 22:09:12 crc kubenswrapper[4919]: E0310 22:09:12.994087 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fd276fb4-a047-472f-903a-8b343ec3894b-webhook-certs podName:fd276fb4-a047-472f-903a-8b343ec3894b nodeName:}" failed. No retries permitted until 2026-03-10 22:09:16.994068407 +0000 UTC m=+1144.235949015 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/fd276fb4-a047-472f-903a-8b343ec3894b-webhook-certs") pod "openstack-operator-controller-manager-6679ddfdc7-pnxdc" (UID: "fd276fb4-a047-472f-903a-8b343ec3894b") : secret "webhook-server-cert" not found
Mar 10 22:09:16 crc kubenswrapper[4919]: I0310 22:09:16.348731 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a197f90a-0c8f-47e6-ad18-f3c61cd51445-cert\") pod \"infra-operator-controller-manager-5995f4446f-sxg86\" (UID: \"a197f90a-0c8f-47e6-ad18-f3c61cd51445\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-sxg86"
Mar 10 22:09:16 crc kubenswrapper[4919]: E0310 22:09:16.348963 4919 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Mar 10 22:09:16 crc kubenswrapper[4919]: E0310 22:09:16.349042 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a197f90a-0c8f-47e6-ad18-f3c61cd51445-cert podName:a197f90a-0c8f-47e6-ad18-f3c61cd51445 nodeName:}" failed. No retries permitted until 2026-03-10 22:09:24.349024485 +0000 UTC m=+1151.590905093 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a197f90a-0c8f-47e6-ad18-f3c61cd51445-cert") pod "infra-operator-controller-manager-5995f4446f-sxg86" (UID: "a197f90a-0c8f-47e6-ad18-f3c61cd51445") : secret "infra-operator-webhook-server-cert" not found
Mar 10 22:09:16 crc kubenswrapper[4919]: I0310 22:09:16.753563 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/768191d9-b4b4-44da-a525-b2ba92e1ceea-cert\") pod \"openstack-baremetal-operator-controller-manager-6647d7885f8bgkq\" (UID: \"768191d9-b4b4-44da-a525-b2ba92e1ceea\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6647d7885f8bgkq"
Mar 10 22:09:16 crc kubenswrapper[4919]: E0310 22:09:16.753742 4919 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found
Mar 10 22:09:16 crc kubenswrapper[4919]: E0310 22:09:16.753887 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/768191d9-b4b4-44da-a525-b2ba92e1ceea-cert podName:768191d9-b4b4-44da-a525-b2ba92e1ceea nodeName:}" failed. No retries permitted until 2026-03-10 22:09:24.753866694 +0000 UTC m=+1151.995747302 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/768191d9-b4b4-44da-a525-b2ba92e1ceea-cert") pod "openstack-baremetal-operator-controller-manager-6647d7885f8bgkq" (UID: "768191d9-b4b4-44da-a525-b2ba92e1ceea") : secret "openstack-baremetal-operator-webhook-server-cert" not found
Mar 10 22:09:17 crc kubenswrapper[4919]: I0310 22:09:17.057521 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fd276fb4-a047-472f-903a-8b343ec3894b-metrics-certs\") pod \"openstack-operator-controller-manager-6679ddfdc7-pnxdc\" (UID: \"fd276fb4-a047-472f-903a-8b343ec3894b\") " pod="openstack-operators/openstack-operator-controller-manager-6679ddfdc7-pnxdc"
Mar 10 22:09:17 crc kubenswrapper[4919]: E0310 22:09:17.057691 4919 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found
Mar 10 22:09:17 crc kubenswrapper[4919]: I0310 22:09:17.058012 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/fd276fb4-a047-472f-903a-8b343ec3894b-webhook-certs\") pod \"openstack-operator-controller-manager-6679ddfdc7-pnxdc\" (UID: \"fd276fb4-a047-472f-903a-8b343ec3894b\") " pod="openstack-operators/openstack-operator-controller-manager-6679ddfdc7-pnxdc"
Mar 10 22:09:17 crc kubenswrapper[4919]: E0310 22:09:17.058029 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fd276fb4-a047-472f-903a-8b343ec3894b-metrics-certs podName:fd276fb4-a047-472f-903a-8b343ec3894b nodeName:}" failed. No retries permitted until 2026-03-10 22:09:25.058011664 +0000 UTC m=+1152.299892272 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fd276fb4-a047-472f-903a-8b343ec3894b-metrics-certs") pod "openstack-operator-controller-manager-6679ddfdc7-pnxdc" (UID: "fd276fb4-a047-472f-903a-8b343ec3894b") : secret "metrics-server-cert" not found
Mar 10 22:09:17 crc kubenswrapper[4919]: E0310 22:09:17.058181 4919 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found
Mar 10 22:09:17 crc kubenswrapper[4919]: E0310 22:09:17.058242 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fd276fb4-a047-472f-903a-8b343ec3894b-webhook-certs podName:fd276fb4-a047-472f-903a-8b343ec3894b nodeName:}" failed. No retries permitted until 2026-03-10 22:09:25.05822432 +0000 UTC m=+1152.300104938 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/fd276fb4-a047-472f-903a-8b343ec3894b-webhook-certs") pod "openstack-operator-controller-manager-6679ddfdc7-pnxdc" (UID: "fd276fb4-a047-472f-903a-8b343ec3894b") : secret "webhook-server-cert" not found
Mar 10 22:09:22 crc kubenswrapper[4919]: E0310 22:09:22.102782 4919 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/ironic-operator@sha256:9182d1816c6fdb093d6328f1b0bf39296b9eccfa495f35e2198ec4764fa6288f"
Mar 10 22:09:22 crc kubenswrapper[4919]: E0310 22:09:22.103244 4919 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ironic-operator@sha256:9182d1816c6fdb093d6328f1b0bf39296b9eccfa495f35e2198ec4764fa6288f,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-pgzv4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ironic-operator-controller-manager-6bbb499bbc-7gm62_openstack-operators(8b54176b-b55a-43cd-9492-6f7d10b4e637): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Mar 10 22:09:22 crc kubenswrapper[4919]: E0310 22:09:22.104505 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-7gm62" podUID="8b54176b-b55a-43cd-9492-6f7d10b4e637"
Mar 10 22:09:22 crc kubenswrapper[4919]: E0310 22:09:22.595511 4919 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/test-operator@sha256:43bd420bc05b4789243740bc75f61e10c7aac7883fc2f82b2d4d50085bc96c42"
Mar 10 22:09:22 crc kubenswrapper[4919]: E0310 22:09:22.595694 4919 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:43bd420bc05b4789243740bc75f61e10c7aac7883fc2f82b2d4d50085bc96c42,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-s7gfh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5c5cb9c4d7-kl9vj_openstack-operators(fca37d2a-51b1-4b60-a7e5-0ebfbf87fb04): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Mar 10 22:09:22 crc kubenswrapper[4919]: E0310 22:09:22.596889 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-kl9vj" podUID="fca37d2a-51b1-4b60-a7e5-0ebfbf87fb04"
Mar 10 22:09:22 crc kubenswrapper[4919]: E0310 22:09:22.690661 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:43bd420bc05b4789243740bc75f61e10c7aac7883fc2f82b2d4d50085bc96c42\\\"\"" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-kl9vj" podUID="fca37d2a-51b1-4b60-a7e5-0ebfbf87fb04"
Mar 10 22:09:22 crc kubenswrapper[4919]: E0310 22:09:22.691179 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ironic-operator@sha256:9182d1816c6fdb093d6328f1b0bf39296b9eccfa495f35e2198ec4764fa6288f\\\"\"" pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-7gm62" podUID="8b54176b-b55a-43cd-9492-6f7d10b4e637"
Mar 10 22:09:23 crc kubenswrapper[4919]: E0310 22:09:23.123811 4919 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/octavia-operator@sha256:18fe6f2f0be7e736db86ff2d600af12a753e14b0a03232ce4f03629a89905571"
Mar 10 22:09:23 crc kubenswrapper[4919]: E0310 22:09:23.124329 4919 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:18fe6f2f0be7e736db86ff2d600af12a753e14b0a03232ce4f03629a89905571,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-ll8c9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-5f4f55cb5c-7llx7_openstack-operators(d9571b0c-bd43-4789-942b-f833e4166418): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Mar 10 22:09:23 crc kubenswrapper[4919]: E0310 22:09:23.126125 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-7llx7" podUID="d9571b0c-bd43-4789-942b-f833e4166418"
Mar 10 22:09:23 crc kubenswrapper[4919]: E0310 22:09:23.577064 4919 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/nova-operator@sha256:2bd37bdd917e3abe72613a734ce5021330242ec8cae9b8da76c57a0765152922"
Mar 10 22:09:23 crc kubenswrapper[4919]: E0310 22:09:23.577213 4919 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:2bd37bdd917e3abe72613a734ce5021330242ec8cae9b8da76c57a0765152922,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-mgnzw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-569cc54c5-dk2nj_openstack-operators(d6ac04fc-f3ea-4b69-aba1-b27490967c0e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Mar 10 22:09:23 crc kubenswrapper[4919]: E0310 22:09:23.578931 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/nova-operator-controller-manager-569cc54c5-dk2nj" podUID="d6ac04fc-f3ea-4b69-aba1-b27490967c0e"
Mar 10 22:09:23 crc kubenswrapper[4919]: E0310 22:09:23.697161 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:18fe6f2f0be7e736db86ff2d600af12a753e14b0a03232ce4f03629a89905571\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-7llx7" podUID="d9571b0c-bd43-4789-942b-f833e4166418"
Mar 10 22:09:23 crc kubenswrapper[4919]: E0310 22:09:23.697512 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:2bd37bdd917e3abe72613a734ce5021330242ec8cae9b8da76c57a0765152922\\\"\"" pod="openstack-operators/nova-operator-controller-manager-569cc54c5-dk2nj" podUID="d6ac04fc-f3ea-4b69-aba1-b27490967c0e"
Mar 10 22:09:24 crc kubenswrapper[4919]: E0310 22:09:24.170435 4919 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/keystone-operator@sha256:40b84319f2f12a1c7ee478fd86a8b1aa5ac2ea8e24f5ce0f1ca78ad879dea8ca"
Mar 10 22:09:24 crc kubenswrapper[4919]: E0310 22:09:24.170642 4919 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:40b84319f2f12a1c7ee478fd86a8b1aa5ac2ea8e24f5ce0f1ca78ad879dea8ca,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-fr9rd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-684f77d66d-b75jn_openstack-operators(200ec9b1-fcd0-4699-9002-7efdc5447a6d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Mar 10 22:09:24 crc kubenswrapper[4919]: E0310 22:09:24.171717 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-b75jn" podUID="200ec9b1-fcd0-4699-9002-7efdc5447a6d"
Mar 10 22:09:24 crc kubenswrapper[4919]: I0310 22:09:24.379870 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a197f90a-0c8f-47e6-ad18-f3c61cd51445-cert\") pod \"infra-operator-controller-manager-5995f4446f-sxg86\" (UID: \"a197f90a-0c8f-47e6-ad18-f3c61cd51445\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-sxg86"
Mar 10 22:09:24 crc kubenswrapper[4919]: E0310 22:09:24.380007 4919 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Mar 10 22:09:24 crc kubenswrapper[4919]: E0310 22:09:24.380053 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a197f90a-0c8f-47e6-ad18-f3c61cd51445-cert podName:a197f90a-0c8f-47e6-ad18-f3c61cd51445 nodeName:}" failed. No retries permitted until 2026-03-10 22:09:40.380040865 +0000 UTC m=+1167.621921473 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a197f90a-0c8f-47e6-ad18-f3c61cd51445-cert") pod "infra-operator-controller-manager-5995f4446f-sxg86" (UID: "a197f90a-0c8f-47e6-ad18-f3c61cd51445") : secret "infra-operator-webhook-server-cert" not found
Mar 10 22:09:24 crc kubenswrapper[4919]: I0310 22:09:24.712518 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-4bbgm" event={"ID":"ce86157b-1544-4db8-8e8c-20d1ec8dde0d","Type":"ContainerStarted","Data":"335576e66081bdabe923bf6e69643682732837a23ac3f5a6ec38de38b1b6c501"}
Mar 10 22:09:24 crc kubenswrapper[4919]: I0310 22:09:24.712890 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-4bbgm"
Mar 10 22:09:24 crc kubenswrapper[4919]: I0310 22:09:24.715954 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-7fkgj" event={"ID":"7c849cfd-ade6-46ce-80f0-09df981cdafd","Type":"ContainerStarted","Data":"f139061633e6888d06033cf734217267eb6a8f95830c13547397a27e120bec9c"}
Mar 10 22:09:24 crc kubenswrapper[4919]: I0310
22:09:24.716329 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-7fkgj" Mar 10 22:09:24 crc kubenswrapper[4919]: I0310 22:09:24.717809 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-fjrhv" event={"ID":"588d49a0-7a32-4b7a-be73-ec3897d4653b","Type":"ContainerStarted","Data":"f79f2408202aae3e75b9f042a8011b0678f842afcd843e2f275b4877a56e23c1"} Mar 10 22:09:24 crc kubenswrapper[4919]: I0310 22:09:24.717951 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-fjrhv" Mar 10 22:09:24 crc kubenswrapper[4919]: I0310 22:09:24.718741 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-tw8tv" event={"ID":"25bf7b22-52f2-40ef-bd19-efe9c061e8b8","Type":"ContainerStarted","Data":"dd5ea7db397314581b04c2db74eca9e4df0ef00c129871be4dd1cf9a3672e607"} Mar 10 22:09:24 crc kubenswrapper[4919]: I0310 22:09:24.718883 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-tw8tv" Mar 10 22:09:24 crc kubenswrapper[4919]: I0310 22:09:24.719646 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-tsrgx" event={"ID":"3de5f2ad-6f5f-4e54-99ad-0d00736dfdab","Type":"ContainerStarted","Data":"d12878c4a39ccbb2732afccde754d6899233e219060ad0b7461e66799bbc14a6"} Mar 10 22:09:24 crc kubenswrapper[4919]: I0310 22:09:24.719741 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-tsrgx" Mar 10 22:09:24 crc kubenswrapper[4919]: I0310 22:09:24.720554 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-k9qtp" event={"ID":"c6a398e1-5a44-4601-98c8-9ac478b0502c","Type":"ContainerStarted","Data":"a59bd7f85e457b5ffe605cee66d129a200fffdd4f2aa4dc92a43d3d261ad0478"} Mar 10 22:09:24 crc kubenswrapper[4919]: I0310 22:09:24.720688 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-k9qtp" Mar 10 22:09:24 crc kubenswrapper[4919]: I0310 22:09:24.721560 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-tvmsl" event={"ID":"cbf180f9-e934-462e-926a-520b21f22550","Type":"ContainerStarted","Data":"2284bdebc81adc40df15415d72b01d167a1abf1f45c7966aa56e1899c6575b37"} Mar 10 22:09:24 crc kubenswrapper[4919]: I0310 22:09:24.721620 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-tvmsl" Mar 10 22:09:24 crc kubenswrapper[4919]: I0310 22:09:24.722425 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-5jgr2" event={"ID":"1b4083a5-cc88-4c92-9612-f06c0a36936d","Type":"ContainerStarted","Data":"7c3f94043f2af0cd0a4ac52b4dc58ca7a38247ded312b734ae400b54e7df713a"} Mar 10 22:09:24 crc kubenswrapper[4919]: I0310 22:09:24.722534 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-5jgr2" Mar 10 22:09:24 crc kubenswrapper[4919]: I0310 22:09:24.723217 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-tm5dc" event={"ID":"7fb7e7ce-d1a7-41e2-876e-42808a70c9e2","Type":"ContainerStarted","Data":"f72e2741a623f23abc63aa2fa10b61ec0777f4d73bdd86dc62e1cb17376d7c44"} Mar 10 22:09:24 crc kubenswrapper[4919]: E0310 22:09:24.732986 4919 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:40b84319f2f12a1c7ee478fd86a8b1aa5ac2ea8e24f5ce0f1ca78ad879dea8ca\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-b75jn" podUID="200ec9b1-fcd0-4699-9002-7efdc5447a6d" Mar 10 22:09:24 crc kubenswrapper[4919]: I0310 22:09:24.733172 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-4bbgm" podStartSLOduration=2.415493053 podStartE2EDuration="16.733162542s" podCreationTimestamp="2026-03-10 22:09:08 +0000 UTC" firstStartedPulling="2026-03-10 22:09:09.823757191 +0000 UTC m=+1137.065637799" lastFinishedPulling="2026-03-10 22:09:24.14142668 +0000 UTC m=+1151.383307288" observedRunningTime="2026-03-10 22:09:24.732101443 +0000 UTC m=+1151.973982051" watchObservedRunningTime="2026-03-10 22:09:24.733162542 +0000 UTC m=+1151.975043150" Mar 10 22:09:24 crc kubenswrapper[4919]: I0310 22:09:24.754162 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-fjrhv" podStartSLOduration=2.592650974 podStartE2EDuration="16.754145711s" podCreationTimestamp="2026-03-10 22:09:08 +0000 UTC" firstStartedPulling="2026-03-10 22:09:09.978161635 +0000 UTC m=+1137.220042243" lastFinishedPulling="2026-03-10 22:09:24.139656372 +0000 UTC m=+1151.381536980" observedRunningTime="2026-03-10 22:09:24.750100051 +0000 UTC m=+1151.991980659" watchObservedRunningTime="2026-03-10 22:09:24.754145711 +0000 UTC m=+1151.996026319" Mar 10 22:09:24 crc kubenswrapper[4919]: I0310 22:09:24.772540 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-7fkgj" podStartSLOduration=2.505952106 podStartE2EDuration="16.772525599s" 
podCreationTimestamp="2026-03-10 22:09:08 +0000 UTC" firstStartedPulling="2026-03-10 22:09:09.873622813 +0000 UTC m=+1137.115503421" lastFinishedPulling="2026-03-10 22:09:24.140196306 +0000 UTC m=+1151.382076914" observedRunningTime="2026-03-10 22:09:24.770797582 +0000 UTC m=+1152.012678190" watchObservedRunningTime="2026-03-10 22:09:24.772525599 +0000 UTC m=+1152.014406207" Mar 10 22:09:24 crc kubenswrapper[4919]: I0310 22:09:24.786080 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/768191d9-b4b4-44da-a525-b2ba92e1ceea-cert\") pod \"openstack-baremetal-operator-controller-manager-6647d7885f8bgkq\" (UID: \"768191d9-b4b4-44da-a525-b2ba92e1ceea\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6647d7885f8bgkq" Mar 10 22:09:24 crc kubenswrapper[4919]: E0310 22:09:24.786219 4919 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 10 22:09:24 crc kubenswrapper[4919]: E0310 22:09:24.786315 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/768191d9-b4b4-44da-a525-b2ba92e1ceea-cert podName:768191d9-b4b4-44da-a525-b2ba92e1ceea nodeName:}" failed. No retries permitted until 2026-03-10 22:09:40.786292012 +0000 UTC m=+1168.028172620 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/768191d9-b4b4-44da-a525-b2ba92e1ceea-cert") pod "openstack-baremetal-operator-controller-manager-6647d7885f8bgkq" (UID: "768191d9-b4b4-44da-a525-b2ba92e1ceea") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 10 22:09:24 crc kubenswrapper[4919]: I0310 22:09:24.796328 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-tsrgx" podStartSLOduration=2.242901478 podStartE2EDuration="16.796304123s" podCreationTimestamp="2026-03-10 22:09:08 +0000 UTC" firstStartedPulling="2026-03-10 22:09:09.587608894 +0000 UTC m=+1136.829489502" lastFinishedPulling="2026-03-10 22:09:24.141011539 +0000 UTC m=+1151.382892147" observedRunningTime="2026-03-10 22:09:24.790695641 +0000 UTC m=+1152.032576259" watchObservedRunningTime="2026-03-10 22:09:24.796304123 +0000 UTC m=+1152.038184731" Mar 10 22:09:24 crc kubenswrapper[4919]: I0310 22:09:24.833887 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-tvmsl" podStartSLOduration=2.794238236 podStartE2EDuration="16.833869951s" podCreationTimestamp="2026-03-10 22:09:08 +0000 UTC" firstStartedPulling="2026-03-10 22:09:10.100030317 +0000 UTC m=+1137.341910945" lastFinishedPulling="2026-03-10 22:09:24.139662052 +0000 UTC m=+1151.381542660" observedRunningTime="2026-03-10 22:09:24.833465619 +0000 UTC m=+1152.075346227" watchObservedRunningTime="2026-03-10 22:09:24.833869951 +0000 UTC m=+1152.075750559" Mar 10 22:09:24 crc kubenswrapper[4919]: I0310 22:09:24.857674 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-5jgr2" podStartSLOduration=2.54049659 podStartE2EDuration="16.857649435s" podCreationTimestamp="2026-03-10 22:09:08 +0000 UTC" 
firstStartedPulling="2026-03-10 22:09:09.823748971 +0000 UTC m=+1137.065629579" lastFinishedPulling="2026-03-10 22:09:24.140901816 +0000 UTC m=+1151.382782424" observedRunningTime="2026-03-10 22:09:24.851665042 +0000 UTC m=+1152.093545640" watchObservedRunningTime="2026-03-10 22:09:24.857649435 +0000 UTC m=+1152.099530043" Mar 10 22:09:24 crc kubenswrapper[4919]: I0310 22:09:24.864846 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-k9qtp" podStartSLOduration=2.163598489 podStartE2EDuration="16.86482535s" podCreationTimestamp="2026-03-10 22:09:08 +0000 UTC" firstStartedPulling="2026-03-10 22:09:09.440123427 +0000 UTC m=+1136.682004035" lastFinishedPulling="2026-03-10 22:09:24.141350288 +0000 UTC m=+1151.383230896" observedRunningTime="2026-03-10 22:09:24.862179398 +0000 UTC m=+1152.104060026" watchObservedRunningTime="2026-03-10 22:09:24.86482535 +0000 UTC m=+1152.106705968" Mar 10 22:09:24 crc kubenswrapper[4919]: I0310 22:09:24.885068 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-tw8tv" podStartSLOduration=2.314290731 podStartE2EDuration="16.885048057s" podCreationTimestamp="2026-03-10 22:09:08 +0000 UTC" firstStartedPulling="2026-03-10 22:09:09.568940817 +0000 UTC m=+1136.810821425" lastFinishedPulling="2026-03-10 22:09:24.139698143 +0000 UTC m=+1151.381578751" observedRunningTime="2026-03-10 22:09:24.879563838 +0000 UTC m=+1152.121444466" watchObservedRunningTime="2026-03-10 22:09:24.885048057 +0000 UTC m=+1152.126928665" Mar 10 22:09:24 crc kubenswrapper[4919]: I0310 22:09:24.905806 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-tm5dc" podStartSLOduration=2.585391677 podStartE2EDuration="16.905783089s" podCreationTimestamp="2026-03-10 22:09:08 +0000 UTC" 
firstStartedPulling="2026-03-10 22:09:09.820984556 +0000 UTC m=+1137.062865164" lastFinishedPulling="2026-03-10 22:09:24.141375968 +0000 UTC m=+1151.383256576" observedRunningTime="2026-03-10 22:09:24.901286077 +0000 UTC m=+1152.143166685" watchObservedRunningTime="2026-03-10 22:09:24.905783089 +0000 UTC m=+1152.147663697" Mar 10 22:09:25 crc kubenswrapper[4919]: I0310 22:09:25.090491 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/fd276fb4-a047-472f-903a-8b343ec3894b-webhook-certs\") pod \"openstack-operator-controller-manager-6679ddfdc7-pnxdc\" (UID: \"fd276fb4-a047-472f-903a-8b343ec3894b\") " pod="openstack-operators/openstack-operator-controller-manager-6679ddfdc7-pnxdc" Mar 10 22:09:25 crc kubenswrapper[4919]: I0310 22:09:25.090595 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fd276fb4-a047-472f-903a-8b343ec3894b-metrics-certs\") pod \"openstack-operator-controller-manager-6679ddfdc7-pnxdc\" (UID: \"fd276fb4-a047-472f-903a-8b343ec3894b\") " pod="openstack-operators/openstack-operator-controller-manager-6679ddfdc7-pnxdc" Mar 10 22:09:25 crc kubenswrapper[4919]: E0310 22:09:25.090659 4919 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 10 22:09:25 crc kubenswrapper[4919]: E0310 22:09:25.090732 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fd276fb4-a047-472f-903a-8b343ec3894b-webhook-certs podName:fd276fb4-a047-472f-903a-8b343ec3894b nodeName:}" failed. No retries permitted until 2026-03-10 22:09:41.090711239 +0000 UTC m=+1168.332591847 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/fd276fb4-a047-472f-903a-8b343ec3894b-webhook-certs") pod "openstack-operator-controller-manager-6679ddfdc7-pnxdc" (UID: "fd276fb4-a047-472f-903a-8b343ec3894b") : secret "webhook-server-cert" not found Mar 10 22:09:25 crc kubenswrapper[4919]: E0310 22:09:25.090788 4919 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 10 22:09:25 crc kubenswrapper[4919]: E0310 22:09:25.090867 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fd276fb4-a047-472f-903a-8b343ec3894b-metrics-certs podName:fd276fb4-a047-472f-903a-8b343ec3894b nodeName:}" failed. No retries permitted until 2026-03-10 22:09:41.090849103 +0000 UTC m=+1168.332729711 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fd276fb4-a047-472f-903a-8b343ec3894b-metrics-certs") pod "openstack-operator-controller-manager-6679ddfdc7-pnxdc" (UID: "fd276fb4-a047-472f-903a-8b343ec3894b") : secret "metrics-server-cert" not found Mar 10 22:09:25 crc kubenswrapper[4919]: I0310 22:09:25.730227 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-tm5dc" Mar 10 22:09:27 crc kubenswrapper[4919]: I0310 22:09:27.747279 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-9ccwt" event={"ID":"512df763-915b-447f-b5c3-0756788212d6","Type":"ContainerStarted","Data":"1f8bc24c4c3eb7b7c9c89d60f24e13b17289882101385f2f02ede3a93f3f0c6e"} Mar 10 22:09:27 crc kubenswrapper[4919]: I0310 22:09:27.748413 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-9ccwt" Mar 10 22:09:27 crc kubenswrapper[4919]: I0310 22:09:27.766750 4919 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-9ccwt" podStartSLOduration=3.30190998 podStartE2EDuration="19.766732092s" podCreationTimestamp="2026-03-10 22:09:08 +0000 UTC" firstStartedPulling="2026-03-10 22:09:10.202406111 +0000 UTC m=+1137.444286719" lastFinishedPulling="2026-03-10 22:09:26.667228223 +0000 UTC m=+1153.909108831" observedRunningTime="2026-03-10 22:09:27.763892975 +0000 UTC m=+1155.005773583" watchObservedRunningTime="2026-03-10 22:09:27.766732092 +0000 UTC m=+1155.008612700" Mar 10 22:09:29 crc kubenswrapper[4919]: I0310 22:09:29.099680 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-7fkgj" Mar 10 22:09:29 crc kubenswrapper[4919]: I0310 22:09:29.125738 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-tvmsl" Mar 10 22:09:29 crc kubenswrapper[4919]: I0310 22:09:29.134601 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-fjrhv" Mar 10 22:09:29 crc kubenswrapper[4919]: I0310 22:09:29.175951 4919 patch_prober.go:28] interesting pod/machine-config-daemon-z7v4t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 22:09:29 crc kubenswrapper[4919]: I0310 22:09:29.176005 4919 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 
22:09:32 crc kubenswrapper[4919]: I0310 22:09:32.786259 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-dds4c" event={"ID":"83ccb82d-2701-46e7-9aa9-3ed962bc31e0","Type":"ContainerStarted","Data":"5e285c0e4b7af20784fd36695af739d1c6c34ff62187d9c20a2061128ff0c7b6"} Mar 10 22:09:32 crc kubenswrapper[4919]: I0310 22:09:32.786910 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-dds4c" Mar 10 22:09:32 crc kubenswrapper[4919]: I0310 22:09:32.788606 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-lkpnv" event={"ID":"fb78ece6-180c-4237-8017-ec3087e0f47b","Type":"ContainerStarted","Data":"2436bf2c371bea7abbe300a333af5556306da08a502b974a05d180a3b9604454"} Mar 10 22:09:32 crc kubenswrapper[4919]: I0310 22:09:32.788745 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-lkpnv" Mar 10 22:09:32 crc kubenswrapper[4919]: I0310 22:09:32.790311 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-xxntv" event={"ID":"20945b47-e70f-45b2-b137-9525a0ec1b31","Type":"ContainerStarted","Data":"5e7f1a1dc3eda63fed33a28e1a7c0014fe4354acc29df92e1e7f8e807f2fe341"} Mar 10 22:09:32 crc kubenswrapper[4919]: I0310 22:09:32.790426 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-xxntv" Mar 10 22:09:32 crc kubenswrapper[4919]: I0310 22:09:32.791909 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-677c674df7-drwxk" 
event={"ID":"2da3cd1a-93e3-4487-ab1f-b022662e06c0","Type":"ContainerStarted","Data":"f94d878502c23c45ae4dc87b84890f08a8ecb9019a8ffeb63328a6fe99ddcb1a"} Mar 10 22:09:32 crc kubenswrapper[4919]: I0310 22:09:32.792124 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-677c674df7-drwxk" Mar 10 22:09:32 crc kubenswrapper[4919]: I0310 22:09:32.800586 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-rmbn4" event={"ID":"a60700c9-46a9-4e84-9c13-afbb96d42f55","Type":"ContainerStarted","Data":"d2938892c87db073cdcb00c3ada890d7b863e3bb4105f51b9dfe307a0b58f191"} Mar 10 22:09:32 crc kubenswrapper[4919]: I0310 22:09:32.811251 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-dds4c" podStartSLOduration=3.199273299 podStartE2EDuration="24.811239257s" podCreationTimestamp="2026-03-10 22:09:08 +0000 UTC" firstStartedPulling="2026-03-10 22:09:10.200482128 +0000 UTC m=+1137.442362736" lastFinishedPulling="2026-03-10 22:09:31.812448086 +0000 UTC m=+1159.054328694" observedRunningTime="2026-03-10 22:09:32.810439016 +0000 UTC m=+1160.052319624" watchObservedRunningTime="2026-03-10 22:09:32.811239257 +0000 UTC m=+1160.053119865" Mar 10 22:09:32 crc kubenswrapper[4919]: I0310 22:09:32.831889 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-lkpnv" podStartSLOduration=3.317552784 podStartE2EDuration="24.831874356s" podCreationTimestamp="2026-03-10 22:09:08 +0000 UTC" firstStartedPulling="2026-03-10 22:09:10.297017884 +0000 UTC m=+1137.538898492" lastFinishedPulling="2026-03-10 22:09:31.811339456 +0000 UTC m=+1159.053220064" observedRunningTime="2026-03-10 22:09:32.827453706 +0000 UTC m=+1160.069334314" watchObservedRunningTime="2026-03-10 
22:09:32.831874356 +0000 UTC m=+1160.073754964" Mar 10 22:09:32 crc kubenswrapper[4919]: I0310 22:09:32.871892 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-rmbn4" podStartSLOduration=2.267080107 podStartE2EDuration="23.87184806s" podCreationTimestamp="2026-03-10 22:09:09 +0000 UTC" firstStartedPulling="2026-03-10 22:09:10.292640125 +0000 UTC m=+1137.534520743" lastFinishedPulling="2026-03-10 22:09:31.897408088 +0000 UTC m=+1159.139288696" observedRunningTime="2026-03-10 22:09:32.859191457 +0000 UTC m=+1160.101072065" watchObservedRunningTime="2026-03-10 22:09:32.87184806 +0000 UTC m=+1160.113728668" Mar 10 22:09:32 crc kubenswrapper[4919]: I0310 22:09:32.873601 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-677c674df7-drwxk" podStartSLOduration=3.261205807 podStartE2EDuration="24.873581216s" podCreationTimestamp="2026-03-10 22:09:08 +0000 UTC" firstStartedPulling="2026-03-10 22:09:10.199677497 +0000 UTC m=+1137.441558115" lastFinishedPulling="2026-03-10 22:09:31.812052906 +0000 UTC m=+1159.053933524" observedRunningTime="2026-03-10 22:09:32.841776594 +0000 UTC m=+1160.083657202" watchObservedRunningTime="2026-03-10 22:09:32.873581216 +0000 UTC m=+1160.115461824" Mar 10 22:09:32 crc kubenswrapper[4919]: I0310 22:09:32.896876 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-xxntv" podStartSLOduration=3.065761431 podStartE2EDuration="24.896855746s" podCreationTimestamp="2026-03-10 22:09:08 +0000 UTC" firstStartedPulling="2026-03-10 22:09:09.980706873 +0000 UTC m=+1137.222587481" lastFinishedPulling="2026-03-10 22:09:31.811801188 +0000 UTC m=+1159.053681796" observedRunningTime="2026-03-10 22:09:32.895705446 +0000 UTC m=+1160.137586054" watchObservedRunningTime="2026-03-10 22:09:32.896855746 +0000 
UTC m=+1160.138736374" Mar 10 22:09:37 crc kubenswrapper[4919]: I0310 22:09:37.836914 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-b75jn" event={"ID":"200ec9b1-fcd0-4699-9002-7efdc5447a6d","Type":"ContainerStarted","Data":"2aa54a94b695bc4adf017323467f68d2096c2beaa1e49905c000dfbe5e40b19a"} Mar 10 22:09:37 crc kubenswrapper[4919]: I0310 22:09:37.838857 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-b75jn" Mar 10 22:09:37 crc kubenswrapper[4919]: I0310 22:09:37.864815 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-b75jn" podStartSLOduration=2.856347408 podStartE2EDuration="29.864789367s" podCreationTimestamp="2026-03-10 22:09:08 +0000 UTC" firstStartedPulling="2026-03-10 22:09:09.889565964 +0000 UTC m=+1137.131446572" lastFinishedPulling="2026-03-10 22:09:36.898007923 +0000 UTC m=+1164.139888531" observedRunningTime="2026-03-10 22:09:37.863314117 +0000 UTC m=+1165.105194725" watchObservedRunningTime="2026-03-10 22:09:37.864789367 +0000 UTC m=+1165.106670005" Mar 10 22:09:38 crc kubenswrapper[4919]: I0310 22:09:38.573822 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-k9qtp" Mar 10 22:09:38 crc kubenswrapper[4919]: I0310 22:09:38.629588 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-tw8tv" Mar 10 22:09:38 crc kubenswrapper[4919]: I0310 22:09:38.680709 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-4bbgm" Mar 10 22:09:38 crc kubenswrapper[4919]: I0310 22:09:38.696177 4919 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-5jgr2" Mar 10 22:09:38 crc kubenswrapper[4919]: I0310 22:09:38.740353 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-tsrgx" Mar 10 22:09:38 crc kubenswrapper[4919]: I0310 22:09:38.844737 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-7llx7" event={"ID":"d9571b0c-bd43-4789-942b-f833e4166418","Type":"ContainerStarted","Data":"c6c38dfc282ace669112a19879b8f0b309901798a606857b42a50af68779a9d7"} Mar 10 22:09:38 crc kubenswrapper[4919]: I0310 22:09:38.844966 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-7llx7" Mar 10 22:09:38 crc kubenswrapper[4919]: I0310 22:09:38.845887 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-7gm62" event={"ID":"8b54176b-b55a-43cd-9492-6f7d10b4e637","Type":"ContainerStarted","Data":"3db7eb4144c7472256bfa6af78a129fa3bd7b660ef1fdceaf8015b285c2823ce"} Mar 10 22:09:38 crc kubenswrapper[4919]: I0310 22:09:38.846194 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-7gm62" Mar 10 22:09:38 crc kubenswrapper[4919]: I0310 22:09:38.863633 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-7llx7" podStartSLOduration=2.696952989 podStartE2EDuration="30.863611248s" podCreationTimestamp="2026-03-10 22:09:08 +0000 UTC" firstStartedPulling="2026-03-10 22:09:09.936596579 +0000 UTC m=+1137.178477187" lastFinishedPulling="2026-03-10 22:09:38.103254838 +0000 UTC m=+1165.345135446" observedRunningTime="2026-03-10 22:09:38.858177871 +0000 
UTC m=+1166.100058479" watchObservedRunningTime="2026-03-10 22:09:38.863611248 +0000 UTC m=+1166.105491856" Mar 10 22:09:38 crc kubenswrapper[4919]: I0310 22:09:38.876529 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-7gm62" podStartSLOduration=2.5361156019999997 podStartE2EDuration="30.876507387s" podCreationTimestamp="2026-03-10 22:09:08 +0000 UTC" firstStartedPulling="2026-03-10 22:09:09.762021969 +0000 UTC m=+1137.003902587" lastFinishedPulling="2026-03-10 22:09:38.102413764 +0000 UTC m=+1165.344294372" observedRunningTime="2026-03-10 22:09:38.871816911 +0000 UTC m=+1166.113697519" watchObservedRunningTime="2026-03-10 22:09:38.876507387 +0000 UTC m=+1166.118388005" Mar 10 22:09:38 crc kubenswrapper[4919]: I0310 22:09:38.902311 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-tm5dc" Mar 10 22:09:39 crc kubenswrapper[4919]: I0310 22:09:39.381915 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-xxntv" Mar 10 22:09:39 crc kubenswrapper[4919]: I0310 22:09:39.421809 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-677c674df7-drwxk" Mar 10 22:09:39 crc kubenswrapper[4919]: I0310 22:09:39.459849 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-9ccwt" Mar 10 22:09:39 crc kubenswrapper[4919]: I0310 22:09:39.475466 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-dds4c" Mar 10 22:09:39 crc kubenswrapper[4919]: I0310 22:09:39.548902 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-lkpnv" Mar 10 22:09:39 crc kubenswrapper[4919]: I0310 22:09:39.852510 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-569cc54c5-dk2nj" event={"ID":"d6ac04fc-f3ea-4b69-aba1-b27490967c0e","Type":"ContainerStarted","Data":"ea625ec660dfef004bce709b3631e9b7f58b9da603b8be3f0dbd52d82c025359"} Mar 10 22:09:39 crc kubenswrapper[4919]: I0310 22:09:39.852701 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-569cc54c5-dk2nj" Mar 10 22:09:39 crc kubenswrapper[4919]: I0310 22:09:39.854265 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-kl9vj" event={"ID":"fca37d2a-51b1-4b60-a7e5-0ebfbf87fb04","Type":"ContainerStarted","Data":"7a948b0f675cae91d42d2aefdc86e2af24c6cd23a57f70155ce72cf24a855551"} Mar 10 22:09:39 crc kubenswrapper[4919]: I0310 22:09:39.867082 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-569cc54c5-dk2nj" podStartSLOduration=2.818784511 podStartE2EDuration="31.867059156s" podCreationTimestamp="2026-03-10 22:09:08 +0000 UTC" firstStartedPulling="2026-03-10 22:09:09.978863814 +0000 UTC m=+1137.220744422" lastFinishedPulling="2026-03-10 22:09:39.027138459 +0000 UTC m=+1166.269019067" observedRunningTime="2026-03-10 22:09:39.865181584 +0000 UTC m=+1167.107062192" watchObservedRunningTime="2026-03-10 22:09:39.867059156 +0000 UTC m=+1167.108939784" Mar 10 22:09:39 crc kubenswrapper[4919]: I0310 22:09:39.892083 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-kl9vj" podStartSLOduration=3.127161504 podStartE2EDuration="31.892059432s" podCreationTimestamp="2026-03-10 22:09:08 +0000 UTC" firstStartedPulling="2026-03-10 
22:09:10.291588936 +0000 UTC m=+1137.533469544" lastFinishedPulling="2026-03-10 22:09:39.056486864 +0000 UTC m=+1166.298367472" observedRunningTime="2026-03-10 22:09:39.876168652 +0000 UTC m=+1167.118049270" watchObservedRunningTime="2026-03-10 22:09:39.892059432 +0000 UTC m=+1167.133940050" Mar 10 22:09:40 crc kubenswrapper[4919]: I0310 22:09:40.471489 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a197f90a-0c8f-47e6-ad18-f3c61cd51445-cert\") pod \"infra-operator-controller-manager-5995f4446f-sxg86\" (UID: \"a197f90a-0c8f-47e6-ad18-f3c61cd51445\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-sxg86" Mar 10 22:09:40 crc kubenswrapper[4919]: I0310 22:09:40.491003 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a197f90a-0c8f-47e6-ad18-f3c61cd51445-cert\") pod \"infra-operator-controller-manager-5995f4446f-sxg86\" (UID: \"a197f90a-0c8f-47e6-ad18-f3c61cd51445\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-sxg86" Mar 10 22:09:40 crc kubenswrapper[4919]: I0310 22:09:40.640133 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-7s57n" Mar 10 22:09:40 crc kubenswrapper[4919]: I0310 22:09:40.648908 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-5995f4446f-sxg86" Mar 10 22:09:40 crc kubenswrapper[4919]: I0310 22:09:40.885576 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/768191d9-b4b4-44da-a525-b2ba92e1ceea-cert\") pod \"openstack-baremetal-operator-controller-manager-6647d7885f8bgkq\" (UID: \"768191d9-b4b4-44da-a525-b2ba92e1ceea\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6647d7885f8bgkq" Mar 10 22:09:40 crc kubenswrapper[4919]: I0310 22:09:40.892235 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/768191d9-b4b4-44da-a525-b2ba92e1ceea-cert\") pod \"openstack-baremetal-operator-controller-manager-6647d7885f8bgkq\" (UID: \"768191d9-b4b4-44da-a525-b2ba92e1ceea\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6647d7885f8bgkq" Mar 10 22:09:40 crc kubenswrapper[4919]: I0310 22:09:40.905266 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-cbxv2" Mar 10 22:09:40 crc kubenswrapper[4919]: I0310 22:09:40.914642 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6647d7885f8bgkq" Mar 10 22:09:41 crc kubenswrapper[4919]: I0310 22:09:41.099901 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-5995f4446f-sxg86"] Mar 10 22:09:41 crc kubenswrapper[4919]: I0310 22:09:41.190339 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fd276fb4-a047-472f-903a-8b343ec3894b-metrics-certs\") pod \"openstack-operator-controller-manager-6679ddfdc7-pnxdc\" (UID: \"fd276fb4-a047-472f-903a-8b343ec3894b\") " pod="openstack-operators/openstack-operator-controller-manager-6679ddfdc7-pnxdc" Mar 10 22:09:41 crc kubenswrapper[4919]: I0310 22:09:41.190491 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/fd276fb4-a047-472f-903a-8b343ec3894b-webhook-certs\") pod \"openstack-operator-controller-manager-6679ddfdc7-pnxdc\" (UID: \"fd276fb4-a047-472f-903a-8b343ec3894b\") " pod="openstack-operators/openstack-operator-controller-manager-6679ddfdc7-pnxdc" Mar 10 22:09:41 crc kubenswrapper[4919]: I0310 22:09:41.195249 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/fd276fb4-a047-472f-903a-8b343ec3894b-webhook-certs\") pod \"openstack-operator-controller-manager-6679ddfdc7-pnxdc\" (UID: \"fd276fb4-a047-472f-903a-8b343ec3894b\") " pod="openstack-operators/openstack-operator-controller-manager-6679ddfdc7-pnxdc" Mar 10 22:09:41 crc kubenswrapper[4919]: I0310 22:09:41.195335 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fd276fb4-a047-472f-903a-8b343ec3894b-metrics-certs\") pod \"openstack-operator-controller-manager-6679ddfdc7-pnxdc\" (UID: \"fd276fb4-a047-472f-903a-8b343ec3894b\") " 
pod="openstack-operators/openstack-operator-controller-manager-6679ddfdc7-pnxdc" Mar 10 22:09:41 crc kubenswrapper[4919]: I0310 22:09:41.348574 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6647d7885f8bgkq"] Mar 10 22:09:41 crc kubenswrapper[4919]: W0310 22:09:41.352625 4919 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod768191d9_b4b4_44da_a525_b2ba92e1ceea.slice/crio-29fab0b3ec06ce0027d60bc9297f8fe9c5ede6d135a0976df7e790248ca8f4fb WatchSource:0}: Error finding container 29fab0b3ec06ce0027d60bc9297f8fe9c5ede6d135a0976df7e790248ca8f4fb: Status 404 returned error can't find the container with id 29fab0b3ec06ce0027d60bc9297f8fe9c5ede6d135a0976df7e790248ca8f4fb Mar 10 22:09:41 crc kubenswrapper[4919]: I0310 22:09:41.399262 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-2rtpf" Mar 10 22:09:41 crc kubenswrapper[4919]: I0310 22:09:41.408421 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-6679ddfdc7-pnxdc" Mar 10 22:09:41 crc kubenswrapper[4919]: I0310 22:09:41.847445 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6679ddfdc7-pnxdc"] Mar 10 22:09:41 crc kubenswrapper[4919]: I0310 22:09:41.872081 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-6679ddfdc7-pnxdc" event={"ID":"fd276fb4-a047-472f-903a-8b343ec3894b","Type":"ContainerStarted","Data":"0b62a12150060a044a060d2beb029e91e660fa3da36db6d8a5779a71c6a3b9fc"} Mar 10 22:09:41 crc kubenswrapper[4919]: I0310 22:09:41.873528 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6647d7885f8bgkq" event={"ID":"768191d9-b4b4-44da-a525-b2ba92e1ceea","Type":"ContainerStarted","Data":"29fab0b3ec06ce0027d60bc9297f8fe9c5ede6d135a0976df7e790248ca8f4fb"} Mar 10 22:09:41 crc kubenswrapper[4919]: I0310 22:09:41.875471 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-5995f4446f-sxg86" event={"ID":"a197f90a-0c8f-47e6-ad18-f3c61cd51445","Type":"ContainerStarted","Data":"d296ed5f167a1cf4cfb4b669c281096c6f945d708935f9631a5bebac43d9ddc2"} Mar 10 22:09:46 crc kubenswrapper[4919]: I0310 22:09:46.906440 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-6679ddfdc7-pnxdc" event={"ID":"fd276fb4-a047-472f-903a-8b343ec3894b","Type":"ContainerStarted","Data":"9ae0cf40b0ea502fca74ff91d466a8593584c2769a774d4611d2ed59b1ab7403"} Mar 10 22:09:47 crc kubenswrapper[4919]: I0310 22:09:47.914028 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-6679ddfdc7-pnxdc" Mar 10 22:09:47 crc kubenswrapper[4919]: I0310 22:09:47.940337 4919 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-6679ddfdc7-pnxdc" podStartSLOduration=39.94031733 podStartE2EDuration="39.94031733s" podCreationTimestamp="2026-03-10 22:09:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 22:09:47.936781444 +0000 UTC m=+1175.178662062" watchObservedRunningTime="2026-03-10 22:09:47.94031733 +0000 UTC m=+1175.182197938" Mar 10 22:09:48 crc kubenswrapper[4919]: I0310 22:09:48.916937 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-7gm62" Mar 10 22:09:49 crc kubenswrapper[4919]: I0310 22:09:49.034213 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-b75jn" Mar 10 22:09:49 crc kubenswrapper[4919]: I0310 22:09:49.180928 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-569cc54c5-dk2nj" Mar 10 22:09:49 crc kubenswrapper[4919]: I0310 22:09:49.218991 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-7llx7" Mar 10 22:09:49 crc kubenswrapper[4919]: I0310 22:09:49.493983 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-kl9vj" Mar 10 22:09:49 crc kubenswrapper[4919]: I0310 22:09:49.496039 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-kl9vj" Mar 10 22:09:49 crc kubenswrapper[4919]: I0310 22:09:49.926062 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-6647d7885f8bgkq" event={"ID":"768191d9-b4b4-44da-a525-b2ba92e1ceea","Type":"ContainerStarted","Data":"205ff4f8f02fa1d94c9d99c75ddee3824195063f2f7c4359669d325840516a46"} Mar 10 22:09:49 crc kubenswrapper[4919]: I0310 22:09:49.926916 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6647d7885f8bgkq" Mar 10 22:09:49 crc kubenswrapper[4919]: I0310 22:09:49.937302 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-5995f4446f-sxg86" event={"ID":"a197f90a-0c8f-47e6-ad18-f3c61cd51445","Type":"ContainerStarted","Data":"7a5ec4da3bb3362a74cc75982dbdc07eee9be5423523f4f5a5ed19a3d9ca27bc"} Mar 10 22:09:49 crc kubenswrapper[4919]: I0310 22:09:49.964330 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6647d7885f8bgkq" podStartSLOduration=33.891238338 podStartE2EDuration="41.964304777s" podCreationTimestamp="2026-03-10 22:09:08 +0000 UTC" firstStartedPulling="2026-03-10 22:09:41.355567005 +0000 UTC m=+1168.597447613" lastFinishedPulling="2026-03-10 22:09:49.428633444 +0000 UTC m=+1176.670514052" observedRunningTime="2026-03-10 22:09:49.957575234 +0000 UTC m=+1177.199455862" watchObservedRunningTime="2026-03-10 22:09:49.964304777 +0000 UTC m=+1177.206185385" Mar 10 22:09:49 crc kubenswrapper[4919]: I0310 22:09:49.982740 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-5995f4446f-sxg86" podStartSLOduration=33.610514992 podStartE2EDuration="41.982714075s" podCreationTimestamp="2026-03-10 22:09:08 +0000 UTC" firstStartedPulling="2026-03-10 22:09:41.107812642 +0000 UTC m=+1168.349693260" lastFinishedPulling="2026-03-10 22:09:49.480011735 +0000 UTC m=+1176.721892343" 
observedRunningTime="2026-03-10 22:09:49.973822085 +0000 UTC m=+1177.215702693" watchObservedRunningTime="2026-03-10 22:09:49.982714075 +0000 UTC m=+1177.224594683" Mar 10 22:09:50 crc kubenswrapper[4919]: I0310 22:09:50.650139 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-5995f4446f-sxg86" Mar 10 22:09:51 crc kubenswrapper[4919]: I0310 22:09:51.417040 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-6679ddfdc7-pnxdc" Mar 10 22:09:59 crc kubenswrapper[4919]: I0310 22:09:59.175920 4919 patch_prober.go:28] interesting pod/machine-config-daemon-z7v4t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 22:09:59 crc kubenswrapper[4919]: I0310 22:09:59.176660 4919 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 22:09:59 crc kubenswrapper[4919]: I0310 22:09:59.176735 4919 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" Mar 10 22:09:59 crc kubenswrapper[4919]: I0310 22:09:59.177675 4919 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"fe6790b4b646495ea90afaa8908c36e512ca4c07ed60f10561e041c0f1b0c857"} pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 10 22:09:59 crc 
kubenswrapper[4919]: I0310 22:09:59.177823 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" containerName="machine-config-daemon" containerID="cri-o://fe6790b4b646495ea90afaa8908c36e512ca4c07ed60f10561e041c0f1b0c857" gracePeriod=600 Mar 10 22:10:00 crc kubenswrapper[4919]: I0310 22:10:00.012808 4919 generic.go:334] "Generic (PLEG): container finished" podID="566678d1-f416-4116-ab20-b30dceb86cdc" containerID="fe6790b4b646495ea90afaa8908c36e512ca4c07ed60f10561e041c0f1b0c857" exitCode=0 Mar 10 22:10:00 crc kubenswrapper[4919]: I0310 22:10:00.012917 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" event={"ID":"566678d1-f416-4116-ab20-b30dceb86cdc","Type":"ContainerDied","Data":"fe6790b4b646495ea90afaa8908c36e512ca4c07ed60f10561e041c0f1b0c857"} Mar 10 22:10:00 crc kubenswrapper[4919]: I0310 22:10:00.013369 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" event={"ID":"566678d1-f416-4116-ab20-b30dceb86cdc","Type":"ContainerStarted","Data":"1dccae4c12e9eba18bc8d7756e50538a70d75c0bc02ce7c79c284d496783301e"} Mar 10 22:10:00 crc kubenswrapper[4919]: I0310 22:10:00.013452 4919 scope.go:117] "RemoveContainer" containerID="5c0f64b8b2ef3b8561ca8ab7ca9e89321df88a87f472fe3592188e0b92020ed2" Mar 10 22:10:00 crc kubenswrapper[4919]: I0310 22:10:00.149275 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553010-qj5k8"] Mar 10 22:10:00 crc kubenswrapper[4919]: I0310 22:10:00.151095 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553010-qj5k8" Mar 10 22:10:00 crc kubenswrapper[4919]: I0310 22:10:00.153447 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wbvtv" Mar 10 22:10:00 crc kubenswrapper[4919]: I0310 22:10:00.153561 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 22:10:00 crc kubenswrapper[4919]: I0310 22:10:00.154456 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 22:10:00 crc kubenswrapper[4919]: I0310 22:10:00.159350 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553010-qj5k8"] Mar 10 22:10:00 crc kubenswrapper[4919]: I0310 22:10:00.168899 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wnrdq\" (UniqueName: \"kubernetes.io/projected/a8340b4f-e490-40e1-bb64-103f5fe20225-kube-api-access-wnrdq\") pod \"auto-csr-approver-29553010-qj5k8\" (UID: \"a8340b4f-e490-40e1-bb64-103f5fe20225\") " pod="openshift-infra/auto-csr-approver-29553010-qj5k8" Mar 10 22:10:00 crc kubenswrapper[4919]: I0310 22:10:00.270443 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wnrdq\" (UniqueName: \"kubernetes.io/projected/a8340b4f-e490-40e1-bb64-103f5fe20225-kube-api-access-wnrdq\") pod \"auto-csr-approver-29553010-qj5k8\" (UID: \"a8340b4f-e490-40e1-bb64-103f5fe20225\") " pod="openshift-infra/auto-csr-approver-29553010-qj5k8" Mar 10 22:10:00 crc kubenswrapper[4919]: I0310 22:10:00.297990 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wnrdq\" (UniqueName: \"kubernetes.io/projected/a8340b4f-e490-40e1-bb64-103f5fe20225-kube-api-access-wnrdq\") pod \"auto-csr-approver-29553010-qj5k8\" (UID: \"a8340b4f-e490-40e1-bb64-103f5fe20225\") " 
pod="openshift-infra/auto-csr-approver-29553010-qj5k8" Mar 10 22:10:00 crc kubenswrapper[4919]: I0310 22:10:00.478204 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553010-qj5k8" Mar 10 22:10:00 crc kubenswrapper[4919]: I0310 22:10:00.655688 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-5995f4446f-sxg86" Mar 10 22:10:00 crc kubenswrapper[4919]: I0310 22:10:00.922690 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553010-qj5k8"] Mar 10 22:10:00 crc kubenswrapper[4919]: I0310 22:10:00.947385 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6647d7885f8bgkq" Mar 10 22:10:01 crc kubenswrapper[4919]: I0310 22:10:01.019441 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553010-qj5k8" event={"ID":"a8340b4f-e490-40e1-bb64-103f5fe20225","Type":"ContainerStarted","Data":"5722f17285bd157b9c9b9b49cef3e7332c00294c5ee43fea219569495b9139e4"} Mar 10 22:10:03 crc kubenswrapper[4919]: I0310 22:10:03.038289 4919 generic.go:334] "Generic (PLEG): container finished" podID="a8340b4f-e490-40e1-bb64-103f5fe20225" containerID="2dd8adb46bd856ff5f07b26aebb46c78dd38855252e1e7b88749967541cb88e9" exitCode=0 Mar 10 22:10:03 crc kubenswrapper[4919]: I0310 22:10:03.038556 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553010-qj5k8" event={"ID":"a8340b4f-e490-40e1-bb64-103f5fe20225","Type":"ContainerDied","Data":"2dd8adb46bd856ff5f07b26aebb46c78dd38855252e1e7b88749967541cb88e9"} Mar 10 22:10:04 crc kubenswrapper[4919]: I0310 22:10:04.362489 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553010-qj5k8" Mar 10 22:10:04 crc kubenswrapper[4919]: I0310 22:10:04.442687 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wnrdq\" (UniqueName: \"kubernetes.io/projected/a8340b4f-e490-40e1-bb64-103f5fe20225-kube-api-access-wnrdq\") pod \"a8340b4f-e490-40e1-bb64-103f5fe20225\" (UID: \"a8340b4f-e490-40e1-bb64-103f5fe20225\") " Mar 10 22:10:04 crc kubenswrapper[4919]: I0310 22:10:04.449309 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8340b4f-e490-40e1-bb64-103f5fe20225-kube-api-access-wnrdq" (OuterVolumeSpecName: "kube-api-access-wnrdq") pod "a8340b4f-e490-40e1-bb64-103f5fe20225" (UID: "a8340b4f-e490-40e1-bb64-103f5fe20225"). InnerVolumeSpecName "kube-api-access-wnrdq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 22:10:04 crc kubenswrapper[4919]: I0310 22:10:04.552301 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wnrdq\" (UniqueName: \"kubernetes.io/projected/a8340b4f-e490-40e1-bb64-103f5fe20225-kube-api-access-wnrdq\") on node \"crc\" DevicePath \"\"" Mar 10 22:10:05 crc kubenswrapper[4919]: I0310 22:10:05.059742 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553010-qj5k8" event={"ID":"a8340b4f-e490-40e1-bb64-103f5fe20225","Type":"ContainerDied","Data":"5722f17285bd157b9c9b9b49cef3e7332c00294c5ee43fea219569495b9139e4"} Mar 10 22:10:05 crc kubenswrapper[4919]: I0310 22:10:05.060130 4919 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5722f17285bd157b9c9b9b49cef3e7332c00294c5ee43fea219569495b9139e4" Mar 10 22:10:05 crc kubenswrapper[4919]: I0310 22:10:05.059826 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553010-qj5k8" Mar 10 22:10:05 crc kubenswrapper[4919]: I0310 22:10:05.431183 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553004-2n2t4"] Mar 10 22:10:05 crc kubenswrapper[4919]: I0310 22:10:05.437884 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553004-2n2t4"] Mar 10 22:10:05 crc kubenswrapper[4919]: I0310 22:10:05.489526 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="572d0cfe-f016-44c7-baa9-83166b19a691" path="/var/lib/kubelet/pods/572d0cfe-f016-44c7-baa9-83166b19a691/volumes" Mar 10 22:10:16 crc kubenswrapper[4919]: I0310 22:10:16.574206 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-589db6c89c-tbhrn"] Mar 10 22:10:16 crc kubenswrapper[4919]: E0310 22:10:16.575065 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8340b4f-e490-40e1-bb64-103f5fe20225" containerName="oc" Mar 10 22:10:16 crc kubenswrapper[4919]: I0310 22:10:16.575084 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8340b4f-e490-40e1-bb64-103f5fe20225" containerName="oc" Mar 10 22:10:16 crc kubenswrapper[4919]: I0310 22:10:16.575275 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8340b4f-e490-40e1-bb64-103f5fe20225" containerName="oc" Mar 10 22:10:16 crc kubenswrapper[4919]: I0310 22:10:16.578433 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-589db6c89c-tbhrn" Mar 10 22:10:16 crc kubenswrapper[4919]: I0310 22:10:16.580432 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Mar 10 22:10:16 crc kubenswrapper[4919]: I0310 22:10:16.581652 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Mar 10 22:10:16 crc kubenswrapper[4919]: I0310 22:10:16.581706 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Mar 10 22:10:16 crc kubenswrapper[4919]: I0310 22:10:16.581798 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-ds65k" Mar 10 22:10:16 crc kubenswrapper[4919]: I0310 22:10:16.593880 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-589db6c89c-tbhrn"] Mar 10 22:10:16 crc kubenswrapper[4919]: I0310 22:10:16.622078 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3e8d845-c249-45ce-9182-3eca0f0e98f6-config\") pod \"dnsmasq-dns-589db6c89c-tbhrn\" (UID: \"a3e8d845-c249-45ce-9182-3eca0f0e98f6\") " pod="openstack/dnsmasq-dns-589db6c89c-tbhrn" Mar 10 22:10:16 crc kubenswrapper[4919]: I0310 22:10:16.622136 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxd89\" (UniqueName: \"kubernetes.io/projected/a3e8d845-c249-45ce-9182-3eca0f0e98f6-kube-api-access-gxd89\") pod \"dnsmasq-dns-589db6c89c-tbhrn\" (UID: \"a3e8d845-c249-45ce-9182-3eca0f0e98f6\") " pod="openstack/dnsmasq-dns-589db6c89c-tbhrn" Mar 10 22:10:16 crc kubenswrapper[4919]: I0310 22:10:16.643005 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86bbd886cf-4hc2l"] Mar 10 22:10:16 crc kubenswrapper[4919]: I0310 22:10:16.644373 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86bbd886cf-4hc2l" Mar 10 22:10:16 crc kubenswrapper[4919]: I0310 22:10:16.648891 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Mar 10 22:10:16 crc kubenswrapper[4919]: I0310 22:10:16.663274 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86bbd886cf-4hc2l"] Mar 10 22:10:16 crc kubenswrapper[4919]: I0310 22:10:16.723522 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b139c9e1-5c8d-4bfa-a222-c1538328e8c9-config\") pod \"dnsmasq-dns-86bbd886cf-4hc2l\" (UID: \"b139c9e1-5c8d-4bfa-a222-c1538328e8c9\") " pod="openstack/dnsmasq-dns-86bbd886cf-4hc2l" Mar 10 22:10:16 crc kubenswrapper[4919]: I0310 22:10:16.723564 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5vkfb\" (UniqueName: \"kubernetes.io/projected/b139c9e1-5c8d-4bfa-a222-c1538328e8c9-kube-api-access-5vkfb\") pod \"dnsmasq-dns-86bbd886cf-4hc2l\" (UID: \"b139c9e1-5c8d-4bfa-a222-c1538328e8c9\") " pod="openstack/dnsmasq-dns-86bbd886cf-4hc2l" Mar 10 22:10:16 crc kubenswrapper[4919]: I0310 22:10:16.723594 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gxd89\" (UniqueName: \"kubernetes.io/projected/a3e8d845-c249-45ce-9182-3eca0f0e98f6-kube-api-access-gxd89\") pod \"dnsmasq-dns-589db6c89c-tbhrn\" (UID: \"a3e8d845-c249-45ce-9182-3eca0f0e98f6\") " pod="openstack/dnsmasq-dns-589db6c89c-tbhrn" Mar 10 22:10:16 crc kubenswrapper[4919]: I0310 22:10:16.723710 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b139c9e1-5c8d-4bfa-a222-c1538328e8c9-dns-svc\") pod \"dnsmasq-dns-86bbd886cf-4hc2l\" (UID: \"b139c9e1-5c8d-4bfa-a222-c1538328e8c9\") " pod="openstack/dnsmasq-dns-86bbd886cf-4hc2l" 
Mar 10 22:10:16 crc kubenswrapper[4919]: I0310 22:10:16.723862 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3e8d845-c249-45ce-9182-3eca0f0e98f6-config\") pod \"dnsmasq-dns-589db6c89c-tbhrn\" (UID: \"a3e8d845-c249-45ce-9182-3eca0f0e98f6\") " pod="openstack/dnsmasq-dns-589db6c89c-tbhrn" Mar 10 22:10:16 crc kubenswrapper[4919]: I0310 22:10:16.725103 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3e8d845-c249-45ce-9182-3eca0f0e98f6-config\") pod \"dnsmasq-dns-589db6c89c-tbhrn\" (UID: \"a3e8d845-c249-45ce-9182-3eca0f0e98f6\") " pod="openstack/dnsmasq-dns-589db6c89c-tbhrn" Mar 10 22:10:16 crc kubenswrapper[4919]: I0310 22:10:16.740868 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxd89\" (UniqueName: \"kubernetes.io/projected/a3e8d845-c249-45ce-9182-3eca0f0e98f6-kube-api-access-gxd89\") pod \"dnsmasq-dns-589db6c89c-tbhrn\" (UID: \"a3e8d845-c249-45ce-9182-3eca0f0e98f6\") " pod="openstack/dnsmasq-dns-589db6c89c-tbhrn" Mar 10 22:10:16 crc kubenswrapper[4919]: I0310 22:10:16.825721 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b139c9e1-5c8d-4bfa-a222-c1538328e8c9-dns-svc\") pod \"dnsmasq-dns-86bbd886cf-4hc2l\" (UID: \"b139c9e1-5c8d-4bfa-a222-c1538328e8c9\") " pod="openstack/dnsmasq-dns-86bbd886cf-4hc2l" Mar 10 22:10:16 crc kubenswrapper[4919]: I0310 22:10:16.825848 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b139c9e1-5c8d-4bfa-a222-c1538328e8c9-config\") pod \"dnsmasq-dns-86bbd886cf-4hc2l\" (UID: \"b139c9e1-5c8d-4bfa-a222-c1538328e8c9\") " pod="openstack/dnsmasq-dns-86bbd886cf-4hc2l" Mar 10 22:10:16 crc kubenswrapper[4919]: I0310 22:10:16.825879 4919 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-5vkfb\" (UniqueName: \"kubernetes.io/projected/b139c9e1-5c8d-4bfa-a222-c1538328e8c9-kube-api-access-5vkfb\") pod \"dnsmasq-dns-86bbd886cf-4hc2l\" (UID: \"b139c9e1-5c8d-4bfa-a222-c1538328e8c9\") " pod="openstack/dnsmasq-dns-86bbd886cf-4hc2l" Mar 10 22:10:16 crc kubenswrapper[4919]: I0310 22:10:16.826670 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b139c9e1-5c8d-4bfa-a222-c1538328e8c9-dns-svc\") pod \"dnsmasq-dns-86bbd886cf-4hc2l\" (UID: \"b139c9e1-5c8d-4bfa-a222-c1538328e8c9\") " pod="openstack/dnsmasq-dns-86bbd886cf-4hc2l" Mar 10 22:10:16 crc kubenswrapper[4919]: I0310 22:10:16.826810 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b139c9e1-5c8d-4bfa-a222-c1538328e8c9-config\") pod \"dnsmasq-dns-86bbd886cf-4hc2l\" (UID: \"b139c9e1-5c8d-4bfa-a222-c1538328e8c9\") " pod="openstack/dnsmasq-dns-86bbd886cf-4hc2l" Mar 10 22:10:16 crc kubenswrapper[4919]: I0310 22:10:16.841972 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5vkfb\" (UniqueName: \"kubernetes.io/projected/b139c9e1-5c8d-4bfa-a222-c1538328e8c9-kube-api-access-5vkfb\") pod \"dnsmasq-dns-86bbd886cf-4hc2l\" (UID: \"b139c9e1-5c8d-4bfa-a222-c1538328e8c9\") " pod="openstack/dnsmasq-dns-86bbd886cf-4hc2l" Mar 10 22:10:16 crc kubenswrapper[4919]: I0310 22:10:16.900897 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-589db6c89c-tbhrn" Mar 10 22:10:16 crc kubenswrapper[4919]: I0310 22:10:16.966094 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86bbd886cf-4hc2l" Mar 10 22:10:17 crc kubenswrapper[4919]: W0310 22:10:17.368757 4919 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda3e8d845_c249_45ce_9182_3eca0f0e98f6.slice/crio-cc5993ee0e3ed564c557fb69f3eaf0e0d97c45f6a4f999adb5073b021b1e65f2 WatchSource:0}: Error finding container cc5993ee0e3ed564c557fb69f3eaf0e0d97c45f6a4f999adb5073b021b1e65f2: Status 404 returned error can't find the container with id cc5993ee0e3ed564c557fb69f3eaf0e0d97c45f6a4f999adb5073b021b1e65f2 Mar 10 22:10:17 crc kubenswrapper[4919]: I0310 22:10:17.370198 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-589db6c89c-tbhrn"] Mar 10 22:10:17 crc kubenswrapper[4919]: I0310 22:10:17.371092 4919 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 10 22:10:17 crc kubenswrapper[4919]: W0310 22:10:17.426790 4919 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb139c9e1_5c8d_4bfa_a222_c1538328e8c9.slice/crio-0a3b99714a84830a629dd46381de7e487361a88601a2ef8aea9b68d7a4ecbe7d WatchSource:0}: Error finding container 0a3b99714a84830a629dd46381de7e487361a88601a2ef8aea9b68d7a4ecbe7d: Status 404 returned error can't find the container with id 0a3b99714a84830a629dd46381de7e487361a88601a2ef8aea9b68d7a4ecbe7d Mar 10 22:10:17 crc kubenswrapper[4919]: I0310 22:10:17.431113 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86bbd886cf-4hc2l"] Mar 10 22:10:18 crc kubenswrapper[4919]: I0310 22:10:18.161680 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-589db6c89c-tbhrn" event={"ID":"a3e8d845-c249-45ce-9182-3eca0f0e98f6","Type":"ContainerStarted","Data":"cc5993ee0e3ed564c557fb69f3eaf0e0d97c45f6a4f999adb5073b021b1e65f2"} Mar 10 22:10:18 crc kubenswrapper[4919]: 
I0310 22:10:18.163443 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86bbd886cf-4hc2l" event={"ID":"b139c9e1-5c8d-4bfa-a222-c1538328e8c9","Type":"ContainerStarted","Data":"0a3b99714a84830a629dd46381de7e487361a88601a2ef8aea9b68d7a4ecbe7d"} Mar 10 22:10:19 crc kubenswrapper[4919]: I0310 22:10:19.173747 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-589db6c89c-tbhrn"] Mar 10 22:10:19 crc kubenswrapper[4919]: I0310 22:10:19.198231 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78cb4465c9-9mcfd"] Mar 10 22:10:19 crc kubenswrapper[4919]: I0310 22:10:19.199617 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78cb4465c9-9mcfd" Mar 10 22:10:19 crc kubenswrapper[4919]: I0310 22:10:19.210105 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78cb4465c9-9mcfd"] Mar 10 22:10:19 crc kubenswrapper[4919]: I0310 22:10:19.258737 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b3e7e9b6-1fd1-4fff-a8a2-2991997c8e91-config\") pod \"dnsmasq-dns-78cb4465c9-9mcfd\" (UID: \"b3e7e9b6-1fd1-4fff-a8a2-2991997c8e91\") " pod="openstack/dnsmasq-dns-78cb4465c9-9mcfd" Mar 10 22:10:19 crc kubenswrapper[4919]: I0310 22:10:19.259065 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjlgk\" (UniqueName: \"kubernetes.io/projected/b3e7e9b6-1fd1-4fff-a8a2-2991997c8e91-kube-api-access-cjlgk\") pod \"dnsmasq-dns-78cb4465c9-9mcfd\" (UID: \"b3e7e9b6-1fd1-4fff-a8a2-2991997c8e91\") " pod="openstack/dnsmasq-dns-78cb4465c9-9mcfd" Mar 10 22:10:19 crc kubenswrapper[4919]: I0310 22:10:19.259138 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/b3e7e9b6-1fd1-4fff-a8a2-2991997c8e91-dns-svc\") pod \"dnsmasq-dns-78cb4465c9-9mcfd\" (UID: \"b3e7e9b6-1fd1-4fff-a8a2-2991997c8e91\") " pod="openstack/dnsmasq-dns-78cb4465c9-9mcfd" Mar 10 22:10:19 crc kubenswrapper[4919]: I0310 22:10:19.359929 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cjlgk\" (UniqueName: \"kubernetes.io/projected/b3e7e9b6-1fd1-4fff-a8a2-2991997c8e91-kube-api-access-cjlgk\") pod \"dnsmasq-dns-78cb4465c9-9mcfd\" (UID: \"b3e7e9b6-1fd1-4fff-a8a2-2991997c8e91\") " pod="openstack/dnsmasq-dns-78cb4465c9-9mcfd" Mar 10 22:10:19 crc kubenswrapper[4919]: I0310 22:10:19.360024 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b3e7e9b6-1fd1-4fff-a8a2-2991997c8e91-dns-svc\") pod \"dnsmasq-dns-78cb4465c9-9mcfd\" (UID: \"b3e7e9b6-1fd1-4fff-a8a2-2991997c8e91\") " pod="openstack/dnsmasq-dns-78cb4465c9-9mcfd" Mar 10 22:10:19 crc kubenswrapper[4919]: I0310 22:10:19.360061 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b3e7e9b6-1fd1-4fff-a8a2-2991997c8e91-config\") pod \"dnsmasq-dns-78cb4465c9-9mcfd\" (UID: \"b3e7e9b6-1fd1-4fff-a8a2-2991997c8e91\") " pod="openstack/dnsmasq-dns-78cb4465c9-9mcfd" Mar 10 22:10:19 crc kubenswrapper[4919]: I0310 22:10:19.360952 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b3e7e9b6-1fd1-4fff-a8a2-2991997c8e91-dns-svc\") pod \"dnsmasq-dns-78cb4465c9-9mcfd\" (UID: \"b3e7e9b6-1fd1-4fff-a8a2-2991997c8e91\") " pod="openstack/dnsmasq-dns-78cb4465c9-9mcfd" Mar 10 22:10:19 crc kubenswrapper[4919]: I0310 22:10:19.360963 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b3e7e9b6-1fd1-4fff-a8a2-2991997c8e91-config\") pod \"dnsmasq-dns-78cb4465c9-9mcfd\" (UID: 
\"b3e7e9b6-1fd1-4fff-a8a2-2991997c8e91\") " pod="openstack/dnsmasq-dns-78cb4465c9-9mcfd" Mar 10 22:10:19 crc kubenswrapper[4919]: I0310 22:10:19.388013 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cjlgk\" (UniqueName: \"kubernetes.io/projected/b3e7e9b6-1fd1-4fff-a8a2-2991997c8e91-kube-api-access-cjlgk\") pod \"dnsmasq-dns-78cb4465c9-9mcfd\" (UID: \"b3e7e9b6-1fd1-4fff-a8a2-2991997c8e91\") " pod="openstack/dnsmasq-dns-78cb4465c9-9mcfd" Mar 10 22:10:19 crc kubenswrapper[4919]: I0310 22:10:19.436499 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86bbd886cf-4hc2l"] Mar 10 22:10:19 crc kubenswrapper[4919]: I0310 22:10:19.469674 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7c47bcb9f9-jmrlb"] Mar 10 22:10:19 crc kubenswrapper[4919]: I0310 22:10:19.481365 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7c47bcb9f9-jmrlb" Mar 10 22:10:19 crc kubenswrapper[4919]: I0310 22:10:19.524660 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7c47bcb9f9-jmrlb"] Mar 10 22:10:19 crc kubenswrapper[4919]: I0310 22:10:19.531139 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78cb4465c9-9mcfd" Mar 10 22:10:19 crc kubenswrapper[4919]: I0310 22:10:19.564267 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4a25dadc-79b5-4535-a9b2-92a9b184119c-dns-svc\") pod \"dnsmasq-dns-7c47bcb9f9-jmrlb\" (UID: \"4a25dadc-79b5-4535-a9b2-92a9b184119c\") " pod="openstack/dnsmasq-dns-7c47bcb9f9-jmrlb" Mar 10 22:10:19 crc kubenswrapper[4919]: I0310 22:10:19.564348 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xslp5\" (UniqueName: \"kubernetes.io/projected/4a25dadc-79b5-4535-a9b2-92a9b184119c-kube-api-access-xslp5\") pod \"dnsmasq-dns-7c47bcb9f9-jmrlb\" (UID: \"4a25dadc-79b5-4535-a9b2-92a9b184119c\") " pod="openstack/dnsmasq-dns-7c47bcb9f9-jmrlb" Mar 10 22:10:19 crc kubenswrapper[4919]: I0310 22:10:19.564462 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a25dadc-79b5-4535-a9b2-92a9b184119c-config\") pod \"dnsmasq-dns-7c47bcb9f9-jmrlb\" (UID: \"4a25dadc-79b5-4535-a9b2-92a9b184119c\") " pod="openstack/dnsmasq-dns-7c47bcb9f9-jmrlb" Mar 10 22:10:19 crc kubenswrapper[4919]: I0310 22:10:19.665468 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a25dadc-79b5-4535-a9b2-92a9b184119c-config\") pod \"dnsmasq-dns-7c47bcb9f9-jmrlb\" (UID: \"4a25dadc-79b5-4535-a9b2-92a9b184119c\") " pod="openstack/dnsmasq-dns-7c47bcb9f9-jmrlb" Mar 10 22:10:19 crc kubenswrapper[4919]: I0310 22:10:19.665533 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4a25dadc-79b5-4535-a9b2-92a9b184119c-dns-svc\") pod \"dnsmasq-dns-7c47bcb9f9-jmrlb\" (UID: \"4a25dadc-79b5-4535-a9b2-92a9b184119c\") " 
pod="openstack/dnsmasq-dns-7c47bcb9f9-jmrlb" Mar 10 22:10:19 crc kubenswrapper[4919]: I0310 22:10:19.665562 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xslp5\" (UniqueName: \"kubernetes.io/projected/4a25dadc-79b5-4535-a9b2-92a9b184119c-kube-api-access-xslp5\") pod \"dnsmasq-dns-7c47bcb9f9-jmrlb\" (UID: \"4a25dadc-79b5-4535-a9b2-92a9b184119c\") " pod="openstack/dnsmasq-dns-7c47bcb9f9-jmrlb" Mar 10 22:10:19 crc kubenswrapper[4919]: I0310 22:10:19.667017 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4a25dadc-79b5-4535-a9b2-92a9b184119c-dns-svc\") pod \"dnsmasq-dns-7c47bcb9f9-jmrlb\" (UID: \"4a25dadc-79b5-4535-a9b2-92a9b184119c\") " pod="openstack/dnsmasq-dns-7c47bcb9f9-jmrlb" Mar 10 22:10:19 crc kubenswrapper[4919]: I0310 22:10:19.667314 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a25dadc-79b5-4535-a9b2-92a9b184119c-config\") pod \"dnsmasq-dns-7c47bcb9f9-jmrlb\" (UID: \"4a25dadc-79b5-4535-a9b2-92a9b184119c\") " pod="openstack/dnsmasq-dns-7c47bcb9f9-jmrlb" Mar 10 22:10:19 crc kubenswrapper[4919]: I0310 22:10:19.693166 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xslp5\" (UniqueName: \"kubernetes.io/projected/4a25dadc-79b5-4535-a9b2-92a9b184119c-kube-api-access-xslp5\") pod \"dnsmasq-dns-7c47bcb9f9-jmrlb\" (UID: \"4a25dadc-79b5-4535-a9b2-92a9b184119c\") " pod="openstack/dnsmasq-dns-7c47bcb9f9-jmrlb" Mar 10 22:10:19 crc kubenswrapper[4919]: I0310 22:10:19.812105 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7c47bcb9f9-jmrlb" Mar 10 22:10:20 crc kubenswrapper[4919]: I0310 22:10:20.023097 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78cb4465c9-9mcfd"] Mar 10 22:10:20 crc kubenswrapper[4919]: W0310 22:10:20.044442 4919 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb3e7e9b6_1fd1_4fff_a8a2_2991997c8e91.slice/crio-56c0efb538fc45caec58c062843e8358cb5f0f5e822a31f9130e7aa0a92df118 WatchSource:0}: Error finding container 56c0efb538fc45caec58c062843e8358cb5f0f5e822a31f9130e7aa0a92df118: Status 404 returned error can't find the container with id 56c0efb538fc45caec58c062843e8358cb5f0f5e822a31f9130e7aa0a92df118 Mar 10 22:10:20 crc kubenswrapper[4919]: I0310 22:10:20.184144 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78cb4465c9-9mcfd" event={"ID":"b3e7e9b6-1fd1-4fff-a8a2-2991997c8e91","Type":"ContainerStarted","Data":"56c0efb538fc45caec58c062843e8358cb5f0f5e822a31f9130e7aa0a92df118"} Mar 10 22:10:20 crc kubenswrapper[4919]: I0310 22:10:20.295693 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7c47bcb9f9-jmrlb"] Mar 10 22:10:20 crc kubenswrapper[4919]: I0310 22:10:20.340914 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 10 22:10:20 crc kubenswrapper[4919]: I0310 22:10:20.342105 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 10 22:10:20 crc kubenswrapper[4919]: I0310 22:10:20.345654 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 10 22:10:20 crc kubenswrapper[4919]: I0310 22:10:20.347068 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Mar 10 22:10:20 crc kubenswrapper[4919]: I0310 22:10:20.347328 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Mar 10 22:10:20 crc kubenswrapper[4919]: I0310 22:10:20.347471 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Mar 10 22:10:20 crc kubenswrapper[4919]: I0310 22:10:20.347642 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-hx9cz" Mar 10 22:10:20 crc kubenswrapper[4919]: I0310 22:10:20.347778 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Mar 10 22:10:20 crc kubenswrapper[4919]: I0310 22:10:20.347934 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Mar 10 22:10:20 crc kubenswrapper[4919]: I0310 22:10:20.348081 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Mar 10 22:10:20 crc kubenswrapper[4919]: I0310 22:10:20.375766 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54x5x\" (UniqueName: \"kubernetes.io/projected/3fe05756-9202-4514-8eea-0c786a2b6d56-kube-api-access-54x5x\") pod \"rabbitmq-cell1-server-0\" (UID: \"3fe05756-9202-4514-8eea-0c786a2b6d56\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 22:10:20 crc kubenswrapper[4919]: I0310 22:10:20.376140 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3fe05756-9202-4514-8eea-0c786a2b6d56-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3fe05756-9202-4514-8eea-0c786a2b6d56\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 22:10:20 crc kubenswrapper[4919]: I0310 22:10:20.376174 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3fe05756-9202-4514-8eea-0c786a2b6d56-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"3fe05756-9202-4514-8eea-0c786a2b6d56\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 22:10:20 crc kubenswrapper[4919]: I0310 22:10:20.376195 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3fe05756-9202-4514-8eea-0c786a2b6d56-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"3fe05756-9202-4514-8eea-0c786a2b6d56\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 22:10:20 crc kubenswrapper[4919]: I0310 22:10:20.376307 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3fe05756-9202-4514-8eea-0c786a2b6d56-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"3fe05756-9202-4514-8eea-0c786a2b6d56\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 22:10:20 crc kubenswrapper[4919]: I0310 22:10:20.376358 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3fe05756-9202-4514-8eea-0c786a2b6d56-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"3fe05756-9202-4514-8eea-0c786a2b6d56\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 22:10:20 crc kubenswrapper[4919]: I0310 22:10:20.376413 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3fe05756-9202-4514-8eea-0c786a2b6d56-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"3fe05756-9202-4514-8eea-0c786a2b6d56\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 22:10:20 crc kubenswrapper[4919]: I0310 22:10:20.376456 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3fe05756-9202-4514-8eea-0c786a2b6d56-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"3fe05756-9202-4514-8eea-0c786a2b6d56\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 22:10:20 crc kubenswrapper[4919]: I0310 22:10:20.376482 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"3fe05756-9202-4514-8eea-0c786a2b6d56\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 22:10:20 crc kubenswrapper[4919]: I0310 22:10:20.376546 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3fe05756-9202-4514-8eea-0c786a2b6d56-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3fe05756-9202-4514-8eea-0c786a2b6d56\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 22:10:20 crc kubenswrapper[4919]: I0310 22:10:20.376611 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3fe05756-9202-4514-8eea-0c786a2b6d56-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"3fe05756-9202-4514-8eea-0c786a2b6d56\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 22:10:20 crc kubenswrapper[4919]: I0310 22:10:20.477793 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/3fe05756-9202-4514-8eea-0c786a2b6d56-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3fe05756-9202-4514-8eea-0c786a2b6d56\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 22:10:20 crc kubenswrapper[4919]: I0310 22:10:20.477864 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3fe05756-9202-4514-8eea-0c786a2b6d56-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"3fe05756-9202-4514-8eea-0c786a2b6d56\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 22:10:20 crc kubenswrapper[4919]: I0310 22:10:20.477891 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-54x5x\" (UniqueName: \"kubernetes.io/projected/3fe05756-9202-4514-8eea-0c786a2b6d56-kube-api-access-54x5x\") pod \"rabbitmq-cell1-server-0\" (UID: \"3fe05756-9202-4514-8eea-0c786a2b6d56\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 22:10:20 crc kubenswrapper[4919]: I0310 22:10:20.477922 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3fe05756-9202-4514-8eea-0c786a2b6d56-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3fe05756-9202-4514-8eea-0c786a2b6d56\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 22:10:20 crc kubenswrapper[4919]: I0310 22:10:20.477953 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3fe05756-9202-4514-8eea-0c786a2b6d56-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"3fe05756-9202-4514-8eea-0c786a2b6d56\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 22:10:20 crc kubenswrapper[4919]: I0310 22:10:20.477971 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3fe05756-9202-4514-8eea-0c786a2b6d56-rabbitmq-confd\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"3fe05756-9202-4514-8eea-0c786a2b6d56\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 22:10:20 crc kubenswrapper[4919]: I0310 22:10:20.477995 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3fe05756-9202-4514-8eea-0c786a2b6d56-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"3fe05756-9202-4514-8eea-0c786a2b6d56\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 22:10:20 crc kubenswrapper[4919]: I0310 22:10:20.478016 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3fe05756-9202-4514-8eea-0c786a2b6d56-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"3fe05756-9202-4514-8eea-0c786a2b6d56\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 22:10:20 crc kubenswrapper[4919]: I0310 22:10:20.478034 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3fe05756-9202-4514-8eea-0c786a2b6d56-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"3fe05756-9202-4514-8eea-0c786a2b6d56\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 22:10:20 crc kubenswrapper[4919]: I0310 22:10:20.478061 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3fe05756-9202-4514-8eea-0c786a2b6d56-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"3fe05756-9202-4514-8eea-0c786a2b6d56\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 22:10:20 crc kubenswrapper[4919]: I0310 22:10:20.478085 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"3fe05756-9202-4514-8eea-0c786a2b6d56\") " 
pod="openstack/rabbitmq-cell1-server-0" Mar 10 22:10:20 crc kubenswrapper[4919]: I0310 22:10:20.478495 4919 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"3fe05756-9202-4514-8eea-0c786a2b6d56\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/rabbitmq-cell1-server-0" Mar 10 22:10:20 crc kubenswrapper[4919]: I0310 22:10:20.479284 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3fe05756-9202-4514-8eea-0c786a2b6d56-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"3fe05756-9202-4514-8eea-0c786a2b6d56\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 22:10:20 crc kubenswrapper[4919]: I0310 22:10:20.479567 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3fe05756-9202-4514-8eea-0c786a2b6d56-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"3fe05756-9202-4514-8eea-0c786a2b6d56\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 22:10:20 crc kubenswrapper[4919]: I0310 22:10:20.479857 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3fe05756-9202-4514-8eea-0c786a2b6d56-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"3fe05756-9202-4514-8eea-0c786a2b6d56\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 22:10:20 crc kubenswrapper[4919]: I0310 22:10:20.480222 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3fe05756-9202-4514-8eea-0c786a2b6d56-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3fe05756-9202-4514-8eea-0c786a2b6d56\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 22:10:20 crc kubenswrapper[4919]: I0310 22:10:20.480452 4919 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3fe05756-9202-4514-8eea-0c786a2b6d56-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3fe05756-9202-4514-8eea-0c786a2b6d56\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 22:10:20 crc kubenswrapper[4919]: I0310 22:10:20.486812 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3fe05756-9202-4514-8eea-0c786a2b6d56-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"3fe05756-9202-4514-8eea-0c786a2b6d56\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 22:10:20 crc kubenswrapper[4919]: I0310 22:10:20.486858 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3fe05756-9202-4514-8eea-0c786a2b6d56-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"3fe05756-9202-4514-8eea-0c786a2b6d56\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 22:10:20 crc kubenswrapper[4919]: I0310 22:10:20.487616 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3fe05756-9202-4514-8eea-0c786a2b6d56-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"3fe05756-9202-4514-8eea-0c786a2b6d56\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 22:10:20 crc kubenswrapper[4919]: I0310 22:10:20.497174 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3fe05756-9202-4514-8eea-0c786a2b6d56-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"3fe05756-9202-4514-8eea-0c786a2b6d56\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 22:10:20 crc kubenswrapper[4919]: I0310 22:10:20.499964 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-54x5x\" (UniqueName: 
\"kubernetes.io/projected/3fe05756-9202-4514-8eea-0c786a2b6d56-kube-api-access-54x5x\") pod \"rabbitmq-cell1-server-0\" (UID: \"3fe05756-9202-4514-8eea-0c786a2b6d56\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 22:10:20 crc kubenswrapper[4919]: I0310 22:10:20.503638 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"3fe05756-9202-4514-8eea-0c786a2b6d56\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 22:10:20 crc kubenswrapper[4919]: I0310 22:10:20.570383 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Mar 10 22:10:20 crc kubenswrapper[4919]: I0310 22:10:20.572461 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 10 22:10:20 crc kubenswrapper[4919]: I0310 22:10:20.576058 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Mar 10 22:10:20 crc kubenswrapper[4919]: I0310 22:10:20.576135 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Mar 10 22:10:20 crc kubenswrapper[4919]: I0310 22:10:20.576153 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Mar 10 22:10:20 crc kubenswrapper[4919]: I0310 22:10:20.576182 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-qcvjq" Mar 10 22:10:20 crc kubenswrapper[4919]: I0310 22:10:20.576219 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Mar 10 22:10:20 crc kubenswrapper[4919]: I0310 22:10:20.576375 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Mar 10 22:10:20 crc kubenswrapper[4919]: I0310 22:10:20.576540 4919 reflector.go:368] Caches populated 
for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Mar 10 22:10:20 crc kubenswrapper[4919]: I0310 22:10:20.582667 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 10 22:10:20 crc kubenswrapper[4919]: I0310 22:10:20.663536 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 10 22:10:20 crc kubenswrapper[4919]: I0310 22:10:20.680085 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/fa3e6892-7a97-4563-b339-6c3acfd36dd3-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"fa3e6892-7a97-4563-b339-6c3acfd36dd3\") " pod="openstack/rabbitmq-server-0" Mar 10 22:10:20 crc kubenswrapper[4919]: I0310 22:10:20.680125 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/fa3e6892-7a97-4563-b339-6c3acfd36dd3-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"fa3e6892-7a97-4563-b339-6c3acfd36dd3\") " pod="openstack/rabbitmq-server-0" Mar 10 22:10:20 crc kubenswrapper[4919]: I0310 22:10:20.680233 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/fa3e6892-7a97-4563-b339-6c3acfd36dd3-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"fa3e6892-7a97-4563-b339-6c3acfd36dd3\") " pod="openstack/rabbitmq-server-0" Mar 10 22:10:20 crc kubenswrapper[4919]: I0310 22:10:20.681485 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r9rjm\" (UniqueName: \"kubernetes.io/projected/fa3e6892-7a97-4563-b339-6c3acfd36dd3-kube-api-access-r9rjm\") pod \"rabbitmq-server-0\" (UID: \"fa3e6892-7a97-4563-b339-6c3acfd36dd3\") " pod="openstack/rabbitmq-server-0" Mar 10 22:10:20 crc kubenswrapper[4919]: I0310 
22:10:20.681518 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/fa3e6892-7a97-4563-b339-6c3acfd36dd3-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"fa3e6892-7a97-4563-b339-6c3acfd36dd3\") " pod="openstack/rabbitmq-server-0" Mar 10 22:10:20 crc kubenswrapper[4919]: I0310 22:10:20.681547 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/fa3e6892-7a97-4563-b339-6c3acfd36dd3-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"fa3e6892-7a97-4563-b339-6c3acfd36dd3\") " pod="openstack/rabbitmq-server-0" Mar 10 22:10:20 crc kubenswrapper[4919]: I0310 22:10:20.681567 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-server-0\" (UID: \"fa3e6892-7a97-4563-b339-6c3acfd36dd3\") " pod="openstack/rabbitmq-server-0" Mar 10 22:10:20 crc kubenswrapper[4919]: I0310 22:10:20.681583 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/fa3e6892-7a97-4563-b339-6c3acfd36dd3-pod-info\") pod \"rabbitmq-server-0\" (UID: \"fa3e6892-7a97-4563-b339-6c3acfd36dd3\") " pod="openstack/rabbitmq-server-0" Mar 10 22:10:20 crc kubenswrapper[4919]: I0310 22:10:20.681609 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/fa3e6892-7a97-4563-b339-6c3acfd36dd3-server-conf\") pod \"rabbitmq-server-0\" (UID: \"fa3e6892-7a97-4563-b339-6c3acfd36dd3\") " pod="openstack/rabbitmq-server-0" Mar 10 22:10:20 crc kubenswrapper[4919]: I0310 22:10:20.681629 4919 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fa3e6892-7a97-4563-b339-6c3acfd36dd3-config-data\") pod \"rabbitmq-server-0\" (UID: \"fa3e6892-7a97-4563-b339-6c3acfd36dd3\") " pod="openstack/rabbitmq-server-0" Mar 10 22:10:20 crc kubenswrapper[4919]: I0310 22:10:20.681650 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/fa3e6892-7a97-4563-b339-6c3acfd36dd3-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"fa3e6892-7a97-4563-b339-6c3acfd36dd3\") " pod="openstack/rabbitmq-server-0" Mar 10 22:10:20 crc kubenswrapper[4919]: I0310 22:10:20.784187 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/fa3e6892-7a97-4563-b339-6c3acfd36dd3-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"fa3e6892-7a97-4563-b339-6c3acfd36dd3\") " pod="openstack/rabbitmq-server-0" Mar 10 22:10:20 crc kubenswrapper[4919]: I0310 22:10:20.784240 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-server-0\" (UID: \"fa3e6892-7a97-4563-b339-6c3acfd36dd3\") " pod="openstack/rabbitmq-server-0" Mar 10 22:10:20 crc kubenswrapper[4919]: I0310 22:10:20.784266 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/fa3e6892-7a97-4563-b339-6c3acfd36dd3-pod-info\") pod \"rabbitmq-server-0\" (UID: \"fa3e6892-7a97-4563-b339-6c3acfd36dd3\") " pod="openstack/rabbitmq-server-0" Mar 10 22:10:20 crc kubenswrapper[4919]: I0310 22:10:20.784293 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/fa3e6892-7a97-4563-b339-6c3acfd36dd3-server-conf\") pod 
\"rabbitmq-server-0\" (UID: \"fa3e6892-7a97-4563-b339-6c3acfd36dd3\") " pod="openstack/rabbitmq-server-0" Mar 10 22:10:20 crc kubenswrapper[4919]: I0310 22:10:20.784314 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fa3e6892-7a97-4563-b339-6c3acfd36dd3-config-data\") pod \"rabbitmq-server-0\" (UID: \"fa3e6892-7a97-4563-b339-6c3acfd36dd3\") " pod="openstack/rabbitmq-server-0" Mar 10 22:10:20 crc kubenswrapper[4919]: I0310 22:10:20.784342 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/fa3e6892-7a97-4563-b339-6c3acfd36dd3-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"fa3e6892-7a97-4563-b339-6c3acfd36dd3\") " pod="openstack/rabbitmq-server-0" Mar 10 22:10:20 crc kubenswrapper[4919]: I0310 22:10:20.784426 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/fa3e6892-7a97-4563-b339-6c3acfd36dd3-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"fa3e6892-7a97-4563-b339-6c3acfd36dd3\") " pod="openstack/rabbitmq-server-0" Mar 10 22:10:20 crc kubenswrapper[4919]: I0310 22:10:20.784452 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/fa3e6892-7a97-4563-b339-6c3acfd36dd3-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"fa3e6892-7a97-4563-b339-6c3acfd36dd3\") " pod="openstack/rabbitmq-server-0" Mar 10 22:10:20 crc kubenswrapper[4919]: I0310 22:10:20.784460 4919 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-server-0\" (UID: \"fa3e6892-7a97-4563-b339-6c3acfd36dd3\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/rabbitmq-server-0" Mar 10 22:10:20 crc kubenswrapper[4919]: 
I0310 22:10:20.784476 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/fa3e6892-7a97-4563-b339-6c3acfd36dd3-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"fa3e6892-7a97-4563-b339-6c3acfd36dd3\") " pod="openstack/rabbitmq-server-0"
Mar 10 22:10:20 crc kubenswrapper[4919]: I0310 22:10:20.784522 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r9rjm\" (UniqueName: \"kubernetes.io/projected/fa3e6892-7a97-4563-b339-6c3acfd36dd3-kube-api-access-r9rjm\") pod \"rabbitmq-server-0\" (UID: \"fa3e6892-7a97-4563-b339-6c3acfd36dd3\") " pod="openstack/rabbitmq-server-0"
Mar 10 22:10:20 crc kubenswrapper[4919]: I0310 22:10:20.784545 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/fa3e6892-7a97-4563-b339-6c3acfd36dd3-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"fa3e6892-7a97-4563-b339-6c3acfd36dd3\") " pod="openstack/rabbitmq-server-0"
Mar 10 22:10:20 crc kubenswrapper[4919]: I0310 22:10:20.785243 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/fa3e6892-7a97-4563-b339-6c3acfd36dd3-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"fa3e6892-7a97-4563-b339-6c3acfd36dd3\") " pod="openstack/rabbitmq-server-0"
Mar 10 22:10:20 crc kubenswrapper[4919]: I0310 22:10:20.785854 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/fa3e6892-7a97-4563-b339-6c3acfd36dd3-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"fa3e6892-7a97-4563-b339-6c3acfd36dd3\") " pod="openstack/rabbitmq-server-0"
Mar 10 22:10:20 crc kubenswrapper[4919]: I0310 22:10:20.786049 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName:
\"kubernetes.io/configmap/fa3e6892-7a97-4563-b339-6c3acfd36dd3-server-conf\") pod \"rabbitmq-server-0\" (UID: \"fa3e6892-7a97-4563-b339-6c3acfd36dd3\") " pod="openstack/rabbitmq-server-0" Mar 10 22:10:20 crc kubenswrapper[4919]: I0310 22:10:20.787360 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/fa3e6892-7a97-4563-b339-6c3acfd36dd3-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"fa3e6892-7a97-4563-b339-6c3acfd36dd3\") " pod="openstack/rabbitmq-server-0" Mar 10 22:10:20 crc kubenswrapper[4919]: I0310 22:10:20.787979 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fa3e6892-7a97-4563-b339-6c3acfd36dd3-config-data\") pod \"rabbitmq-server-0\" (UID: \"fa3e6892-7a97-4563-b339-6c3acfd36dd3\") " pod="openstack/rabbitmq-server-0" Mar 10 22:10:20 crc kubenswrapper[4919]: I0310 22:10:20.794331 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/fa3e6892-7a97-4563-b339-6c3acfd36dd3-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"fa3e6892-7a97-4563-b339-6c3acfd36dd3\") " pod="openstack/rabbitmq-server-0" Mar 10 22:10:20 crc kubenswrapper[4919]: I0310 22:10:20.794863 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/fa3e6892-7a97-4563-b339-6c3acfd36dd3-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"fa3e6892-7a97-4563-b339-6c3acfd36dd3\") " pod="openstack/rabbitmq-server-0" Mar 10 22:10:20 crc kubenswrapper[4919]: I0310 22:10:20.799471 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/fa3e6892-7a97-4563-b339-6c3acfd36dd3-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"fa3e6892-7a97-4563-b339-6c3acfd36dd3\") " pod="openstack/rabbitmq-server-0" Mar 10 
22:10:20 crc kubenswrapper[4919]: I0310 22:10:20.802713 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/fa3e6892-7a97-4563-b339-6c3acfd36dd3-pod-info\") pod \"rabbitmq-server-0\" (UID: \"fa3e6892-7a97-4563-b339-6c3acfd36dd3\") " pod="openstack/rabbitmq-server-0"
Mar 10 22:10:20 crc kubenswrapper[4919]: I0310 22:10:20.807855 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-server-0\" (UID: \"fa3e6892-7a97-4563-b339-6c3acfd36dd3\") " pod="openstack/rabbitmq-server-0"
Mar 10 22:10:20 crc kubenswrapper[4919]: I0310 22:10:20.816447 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r9rjm\" (UniqueName: \"kubernetes.io/projected/fa3e6892-7a97-4563-b339-6c3acfd36dd3-kube-api-access-r9rjm\") pod \"rabbitmq-server-0\" (UID: \"fa3e6892-7a97-4563-b339-6c3acfd36dd3\") " pod="openstack/rabbitmq-server-0"
Mar 10 22:10:20 crc kubenswrapper[4919]: I0310 22:10:20.907921 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Mar 10 22:10:21 crc kubenswrapper[4919]: I0310 22:10:21.196596 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c47bcb9f9-jmrlb" event={"ID":"4a25dadc-79b5-4535-a9b2-92a9b184119c","Type":"ContainerStarted","Data":"cfa8aa21c47f0d6277c154940903ea74991445112432f667004c73d07e790aa0"}
Mar 10 22:10:21 crc kubenswrapper[4919]: I0310 22:10:21.918434 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"]
Mar 10 22:10:21 crc kubenswrapper[4919]: I0310 22:10:21.920216 4919 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/openstack-galera-0"
Mar 10 22:10:21 crc kubenswrapper[4919]: I0310 22:10:21.923609 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc"
Mar 10 22:10:21 crc kubenswrapper[4919]: I0310 22:10:21.923996 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data"
Mar 10 22:10:21 crc kubenswrapper[4919]: I0310 22:10:21.924222 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-nbkzl"
Mar 10 22:10:21 crc kubenswrapper[4919]: I0310 22:10:21.925683 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts"
Mar 10 22:10:21 crc kubenswrapper[4919]: I0310 22:10:21.943643 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle"
Mar 10 22:10:21 crc kubenswrapper[4919]: I0310 22:10:21.954484 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"]
Mar 10 22:10:22 crc kubenswrapper[4919]: I0310 22:10:22.107437 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f6qtw\" (UniqueName: \"kubernetes.io/projected/9372011b-416f-484d-a873-fdda67baf9fe-kube-api-access-f6qtw\") pod \"openstack-galera-0\" (UID: \"9372011b-416f-484d-a873-fdda67baf9fe\") " pod="openstack/openstack-galera-0"
Mar 10 22:10:22 crc kubenswrapper[4919]: I0310 22:10:22.109963 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/9372011b-416f-484d-a873-fdda67baf9fe-kolla-config\") pod \"openstack-galera-0\" (UID: \"9372011b-416f-484d-a873-fdda67baf9fe\") " pod="openstack/openstack-galera-0"
Mar 10 22:10:22 crc kubenswrapper[4919]: I0310 22:10:22.110088 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for
volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9372011b-416f-484d-a873-fdda67baf9fe-operator-scripts\") pod \"openstack-galera-0\" (UID: \"9372011b-416f-484d-a873-fdda67baf9fe\") " pod="openstack/openstack-galera-0" Mar 10 22:10:22 crc kubenswrapper[4919]: I0310 22:10:22.110179 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9372011b-416f-484d-a873-fdda67baf9fe-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"9372011b-416f-484d-a873-fdda67baf9fe\") " pod="openstack/openstack-galera-0" Mar 10 22:10:22 crc kubenswrapper[4919]: I0310 22:10:22.110270 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/9372011b-416f-484d-a873-fdda67baf9fe-config-data-default\") pod \"openstack-galera-0\" (UID: \"9372011b-416f-484d-a873-fdda67baf9fe\") " pod="openstack/openstack-galera-0" Mar 10 22:10:22 crc kubenswrapper[4919]: I0310 22:10:22.110382 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-galera-0\" (UID: \"9372011b-416f-484d-a873-fdda67baf9fe\") " pod="openstack/openstack-galera-0" Mar 10 22:10:22 crc kubenswrapper[4919]: I0310 22:10:22.110468 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/9372011b-416f-484d-a873-fdda67baf9fe-config-data-generated\") pod \"openstack-galera-0\" (UID: \"9372011b-416f-484d-a873-fdda67baf9fe\") " pod="openstack/openstack-galera-0" Mar 10 22:10:22 crc kubenswrapper[4919]: I0310 22:10:22.113100 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/9372011b-416f-484d-a873-fdda67baf9fe-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"9372011b-416f-484d-a873-fdda67baf9fe\") " pod="openstack/openstack-galera-0" Mar 10 22:10:22 crc kubenswrapper[4919]: I0310 22:10:22.215083 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/9372011b-416f-484d-a873-fdda67baf9fe-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"9372011b-416f-484d-a873-fdda67baf9fe\") " pod="openstack/openstack-galera-0" Mar 10 22:10:22 crc kubenswrapper[4919]: I0310 22:10:22.215362 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f6qtw\" (UniqueName: \"kubernetes.io/projected/9372011b-416f-484d-a873-fdda67baf9fe-kube-api-access-f6qtw\") pod \"openstack-galera-0\" (UID: \"9372011b-416f-484d-a873-fdda67baf9fe\") " pod="openstack/openstack-galera-0" Mar 10 22:10:22 crc kubenswrapper[4919]: I0310 22:10:22.215458 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/9372011b-416f-484d-a873-fdda67baf9fe-kolla-config\") pod \"openstack-galera-0\" (UID: \"9372011b-416f-484d-a873-fdda67baf9fe\") " pod="openstack/openstack-galera-0" Mar 10 22:10:22 crc kubenswrapper[4919]: I0310 22:10:22.215505 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9372011b-416f-484d-a873-fdda67baf9fe-operator-scripts\") pod \"openstack-galera-0\" (UID: \"9372011b-416f-484d-a873-fdda67baf9fe\") " pod="openstack/openstack-galera-0" Mar 10 22:10:22 crc kubenswrapper[4919]: I0310 22:10:22.215545 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9372011b-416f-484d-a873-fdda67baf9fe-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: 
\"9372011b-416f-484d-a873-fdda67baf9fe\") " pod="openstack/openstack-galera-0" Mar 10 22:10:22 crc kubenswrapper[4919]: I0310 22:10:22.215593 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/9372011b-416f-484d-a873-fdda67baf9fe-config-data-default\") pod \"openstack-galera-0\" (UID: \"9372011b-416f-484d-a873-fdda67baf9fe\") " pod="openstack/openstack-galera-0" Mar 10 22:10:22 crc kubenswrapper[4919]: I0310 22:10:22.215644 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-galera-0\" (UID: \"9372011b-416f-484d-a873-fdda67baf9fe\") " pod="openstack/openstack-galera-0" Mar 10 22:10:22 crc kubenswrapper[4919]: I0310 22:10:22.215672 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/9372011b-416f-484d-a873-fdda67baf9fe-config-data-generated\") pod \"openstack-galera-0\" (UID: \"9372011b-416f-484d-a873-fdda67baf9fe\") " pod="openstack/openstack-galera-0" Mar 10 22:10:22 crc kubenswrapper[4919]: I0310 22:10:22.216508 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/9372011b-416f-484d-a873-fdda67baf9fe-config-data-generated\") pod \"openstack-galera-0\" (UID: \"9372011b-416f-484d-a873-fdda67baf9fe\") " pod="openstack/openstack-galera-0" Mar 10 22:10:22 crc kubenswrapper[4919]: I0310 22:10:22.217692 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/9372011b-416f-484d-a873-fdda67baf9fe-kolla-config\") pod \"openstack-galera-0\" (UID: \"9372011b-416f-484d-a873-fdda67baf9fe\") " pod="openstack/openstack-galera-0" Mar 10 22:10:22 crc kubenswrapper[4919]: I0310 22:10:22.218494 4919 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/9372011b-416f-484d-a873-fdda67baf9fe-config-data-default\") pod \"openstack-galera-0\" (UID: \"9372011b-416f-484d-a873-fdda67baf9fe\") " pod="openstack/openstack-galera-0"
Mar 10 22:10:22 crc kubenswrapper[4919]: I0310 22:10:22.218545 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9372011b-416f-484d-a873-fdda67baf9fe-operator-scripts\") pod \"openstack-galera-0\" (UID: \"9372011b-416f-484d-a873-fdda67baf9fe\") " pod="openstack/openstack-galera-0"
Mar 10 22:10:22 crc kubenswrapper[4919]: I0310 22:10:22.218791 4919 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-galera-0\" (UID: \"9372011b-416f-484d-a873-fdda67baf9fe\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/openstack-galera-0"
Mar 10 22:10:22 crc kubenswrapper[4919]: I0310 22:10:22.221700 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9372011b-416f-484d-a873-fdda67baf9fe-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"9372011b-416f-484d-a873-fdda67baf9fe\") " pod="openstack/openstack-galera-0"
Mar 10 22:10:22 crc kubenswrapper[4919]: I0310 22:10:22.237045 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f6qtw\" (UniqueName: \"kubernetes.io/projected/9372011b-416f-484d-a873-fdda67baf9fe-kube-api-access-f6qtw\") pod \"openstack-galera-0\" (UID: \"9372011b-416f-484d-a873-fdda67baf9fe\") " pod="openstack/openstack-galera-0"
Mar 10 22:10:22 crc kubenswrapper[4919]: I0310 22:10:22.241995 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName:
\"kubernetes.io/secret/9372011b-416f-484d-a873-fdda67baf9fe-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"9372011b-416f-484d-a873-fdda67baf9fe\") " pod="openstack/openstack-galera-0" Mar 10 22:10:22 crc kubenswrapper[4919]: I0310 22:10:22.251984 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-galera-0\" (UID: \"9372011b-416f-484d-a873-fdda67baf9fe\") " pod="openstack/openstack-galera-0" Mar 10 22:10:22 crc kubenswrapper[4919]: I0310 22:10:22.258193 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Mar 10 22:10:23 crc kubenswrapper[4919]: I0310 22:10:23.213384 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 10 22:10:23 crc kubenswrapper[4919]: I0310 22:10:23.215063 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 10 22:10:23 crc kubenswrapper[4919]: I0310 22:10:23.220046 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Mar 10 22:10:23 crc kubenswrapper[4919]: I0310 22:10:23.220308 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Mar 10 22:10:23 crc kubenswrapper[4919]: I0310 22:10:23.220490 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Mar 10 22:10:23 crc kubenswrapper[4919]: I0310 22:10:23.220675 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-rsnlb" Mar 10 22:10:23 crc kubenswrapper[4919]: I0310 22:10:23.227070 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 10 22:10:23 crc kubenswrapper[4919]: I0310 22:10:23.329716 4919 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c85b9\" (UniqueName: \"kubernetes.io/projected/37ef9179-69db-49ab-a4e6-2e2b815fc260-kube-api-access-c85b9\") pod \"openstack-cell1-galera-0\" (UID: \"37ef9179-69db-49ab-a4e6-2e2b815fc260\") " pod="openstack/openstack-cell1-galera-0" Mar 10 22:10:23 crc kubenswrapper[4919]: I0310 22:10:23.329801 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/37ef9179-69db-49ab-a4e6-2e2b815fc260-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"37ef9179-69db-49ab-a4e6-2e2b815fc260\") " pod="openstack/openstack-cell1-galera-0" Mar 10 22:10:23 crc kubenswrapper[4919]: I0310 22:10:23.329847 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/37ef9179-69db-49ab-a4e6-2e2b815fc260-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"37ef9179-69db-49ab-a4e6-2e2b815fc260\") " pod="openstack/openstack-cell1-galera-0" Mar 10 22:10:23 crc kubenswrapper[4919]: I0310 22:10:23.329883 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/37ef9179-69db-49ab-a4e6-2e2b815fc260-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"37ef9179-69db-49ab-a4e6-2e2b815fc260\") " pod="openstack/openstack-cell1-galera-0" Mar 10 22:10:23 crc kubenswrapper[4919]: I0310 22:10:23.330050 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37ef9179-69db-49ab-a4e6-2e2b815fc260-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"37ef9179-69db-49ab-a4e6-2e2b815fc260\") " pod="openstack/openstack-cell1-galera-0" Mar 10 22:10:23 crc kubenswrapper[4919]: I0310 
22:10:23.330109 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/37ef9179-69db-49ab-a4e6-2e2b815fc260-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"37ef9179-69db-49ab-a4e6-2e2b815fc260\") " pod="openstack/openstack-cell1-galera-0"
Mar 10 22:10:23 crc kubenswrapper[4919]: I0310 22:10:23.330136 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/37ef9179-69db-49ab-a4e6-2e2b815fc260-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"37ef9179-69db-49ab-a4e6-2e2b815fc260\") " pod="openstack/openstack-cell1-galera-0"
Mar 10 22:10:23 crc kubenswrapper[4919]: I0310 22:10:23.330158 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-cell1-galera-0\" (UID: \"37ef9179-69db-49ab-a4e6-2e2b815fc260\") " pod="openstack/openstack-cell1-galera-0"
Mar 10 22:10:23 crc kubenswrapper[4919]: I0310 22:10:23.431643 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/37ef9179-69db-49ab-a4e6-2e2b815fc260-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"37ef9179-69db-49ab-a4e6-2e2b815fc260\") " pod="openstack/openstack-cell1-galera-0"
Mar 10 22:10:23 crc kubenswrapper[4919]: I0310 22:10:23.431704 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/37ef9179-69db-49ab-a4e6-2e2b815fc260-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"37ef9179-69db-49ab-a4e6-2e2b815fc260\") " pod="openstack/openstack-cell1-galera-0"
Mar 10 22:10:23 crc kubenswrapper[4919]: I0310 22:10:23.431783 4919 reconciler_common.go:218]
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37ef9179-69db-49ab-a4e6-2e2b815fc260-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"37ef9179-69db-49ab-a4e6-2e2b815fc260\") " pod="openstack/openstack-cell1-galera-0" Mar 10 22:10:23 crc kubenswrapper[4919]: I0310 22:10:23.431840 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/37ef9179-69db-49ab-a4e6-2e2b815fc260-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"37ef9179-69db-49ab-a4e6-2e2b815fc260\") " pod="openstack/openstack-cell1-galera-0" Mar 10 22:10:23 crc kubenswrapper[4919]: I0310 22:10:23.431869 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/37ef9179-69db-49ab-a4e6-2e2b815fc260-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"37ef9179-69db-49ab-a4e6-2e2b815fc260\") " pod="openstack/openstack-cell1-galera-0" Mar 10 22:10:23 crc kubenswrapper[4919]: I0310 22:10:23.431895 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-cell1-galera-0\" (UID: \"37ef9179-69db-49ab-a4e6-2e2b815fc260\") " pod="openstack/openstack-cell1-galera-0" Mar 10 22:10:23 crc kubenswrapper[4919]: I0310 22:10:23.431931 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c85b9\" (UniqueName: \"kubernetes.io/projected/37ef9179-69db-49ab-a4e6-2e2b815fc260-kube-api-access-c85b9\") pod \"openstack-cell1-galera-0\" (UID: \"37ef9179-69db-49ab-a4e6-2e2b815fc260\") " pod="openstack/openstack-cell1-galera-0" Mar 10 22:10:23 crc kubenswrapper[4919]: I0310 22:10:23.431978 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: 
\"kubernetes.io/configmap/37ef9179-69db-49ab-a4e6-2e2b815fc260-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"37ef9179-69db-49ab-a4e6-2e2b815fc260\") " pod="openstack/openstack-cell1-galera-0" Mar 10 22:10:23 crc kubenswrapper[4919]: I0310 22:10:23.432357 4919 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-cell1-galera-0\" (UID: \"37ef9179-69db-49ab-a4e6-2e2b815fc260\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/openstack-cell1-galera-0" Mar 10 22:10:23 crc kubenswrapper[4919]: I0310 22:10:23.432649 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/37ef9179-69db-49ab-a4e6-2e2b815fc260-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"37ef9179-69db-49ab-a4e6-2e2b815fc260\") " pod="openstack/openstack-cell1-galera-0" Mar 10 22:10:23 crc kubenswrapper[4919]: I0310 22:10:23.433560 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/37ef9179-69db-49ab-a4e6-2e2b815fc260-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"37ef9179-69db-49ab-a4e6-2e2b815fc260\") " pod="openstack/openstack-cell1-galera-0" Mar 10 22:10:23 crc kubenswrapper[4919]: I0310 22:10:23.434450 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/37ef9179-69db-49ab-a4e6-2e2b815fc260-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"37ef9179-69db-49ab-a4e6-2e2b815fc260\") " pod="openstack/openstack-cell1-galera-0" Mar 10 22:10:23 crc kubenswrapper[4919]: I0310 22:10:23.435163 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/37ef9179-69db-49ab-a4e6-2e2b815fc260-config-data-default\") 
pod \"openstack-cell1-galera-0\" (UID: \"37ef9179-69db-49ab-a4e6-2e2b815fc260\") " pod="openstack/openstack-cell1-galera-0" Mar 10 22:10:23 crc kubenswrapper[4919]: I0310 22:10:23.440097 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37ef9179-69db-49ab-a4e6-2e2b815fc260-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"37ef9179-69db-49ab-a4e6-2e2b815fc260\") " pod="openstack/openstack-cell1-galera-0" Mar 10 22:10:23 crc kubenswrapper[4919]: I0310 22:10:23.445551 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/37ef9179-69db-49ab-a4e6-2e2b815fc260-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"37ef9179-69db-49ab-a4e6-2e2b815fc260\") " pod="openstack/openstack-cell1-galera-0" Mar 10 22:10:23 crc kubenswrapper[4919]: I0310 22:10:23.456192 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c85b9\" (UniqueName: \"kubernetes.io/projected/37ef9179-69db-49ab-a4e6-2e2b815fc260-kube-api-access-c85b9\") pod \"openstack-cell1-galera-0\" (UID: \"37ef9179-69db-49ab-a4e6-2e2b815fc260\") " pod="openstack/openstack-cell1-galera-0" Mar 10 22:10:23 crc kubenswrapper[4919]: I0310 22:10:23.461383 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-cell1-galera-0\" (UID: \"37ef9179-69db-49ab-a4e6-2e2b815fc260\") " pod="openstack/openstack-cell1-galera-0" Mar 10 22:10:23 crc kubenswrapper[4919]: I0310 22:10:23.498436 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Mar 10 22:10:23 crc kubenswrapper[4919]: I0310 22:10:23.499504 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0"
Mar 10 22:10:23 crc kubenswrapper[4919]: I0310 22:10:23.502830 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data"
Mar 10 22:10:23 crc kubenswrapper[4919]: I0310 22:10:23.503002 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-k84j5"
Mar 10 22:10:23 crc kubenswrapper[4919]: I0310 22:10:23.503568 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc"
Mar 10 22:10:23 crc kubenswrapper[4919]: I0310 22:10:23.523200 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"]
Mar 10 22:10:23 crc kubenswrapper[4919]: I0310 22:10:23.534522 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0"
Mar 10 22:10:23 crc kubenswrapper[4919]: I0310 22:10:23.636190 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a4a88061-cba8-4535-bf01-5285d8cbb79f-config-data\") pod \"memcached-0\" (UID: \"a4a88061-cba8-4535-bf01-5285d8cbb79f\") " pod="openstack/memcached-0"
Mar 10 22:10:23 crc kubenswrapper[4919]: I0310 22:10:23.636242 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4a88061-cba8-4535-bf01-5285d8cbb79f-memcached-tls-certs\") pod \"memcached-0\" (UID: \"a4a88061-cba8-4535-bf01-5285d8cbb79f\") " pod="openstack/memcached-0"
Mar 10 22:10:23 crc kubenswrapper[4919]: I0310 22:10:23.636320 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a4a88061-cba8-4535-bf01-5285d8cbb79f-kolla-config\") pod \"memcached-0\" (UID: \"a4a88061-cba8-4535-bf01-5285d8cbb79f\") "
pod="openstack/memcached-0"
Mar 10 22:10:23 crc kubenswrapper[4919]: I0310 22:10:23.636522 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sj8lc\" (UniqueName: \"kubernetes.io/projected/a4a88061-cba8-4535-bf01-5285d8cbb79f-kube-api-access-sj8lc\") pod \"memcached-0\" (UID: \"a4a88061-cba8-4535-bf01-5285d8cbb79f\") " pod="openstack/memcached-0"
Mar 10 22:10:23 crc kubenswrapper[4919]: I0310 22:10:23.636588 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4a88061-cba8-4535-bf01-5285d8cbb79f-combined-ca-bundle\") pod \"memcached-0\" (UID: \"a4a88061-cba8-4535-bf01-5285d8cbb79f\") " pod="openstack/memcached-0"
Mar 10 22:10:23 crc kubenswrapper[4919]: I0310 22:10:23.738317 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a4a88061-cba8-4535-bf01-5285d8cbb79f-kolla-config\") pod \"memcached-0\" (UID: \"a4a88061-cba8-4535-bf01-5285d8cbb79f\") " pod="openstack/memcached-0"
Mar 10 22:10:23 crc kubenswrapper[4919]: I0310 22:10:23.738495 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sj8lc\" (UniqueName: \"kubernetes.io/projected/a4a88061-cba8-4535-bf01-5285d8cbb79f-kube-api-access-sj8lc\") pod \"memcached-0\" (UID: \"a4a88061-cba8-4535-bf01-5285d8cbb79f\") " pod="openstack/memcached-0"
Mar 10 22:10:23 crc kubenswrapper[4919]: I0310 22:10:23.738554 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4a88061-cba8-4535-bf01-5285d8cbb79f-combined-ca-bundle\") pod \"memcached-0\" (UID: \"a4a88061-cba8-4535-bf01-5285d8cbb79f\") " pod="openstack/memcached-0"
Mar 10 22:10:23 crc kubenswrapper[4919]: I0310 22:10:23.738687 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a4a88061-cba8-4535-bf01-5285d8cbb79f-config-data\") pod \"memcached-0\" (UID: \"a4a88061-cba8-4535-bf01-5285d8cbb79f\") " pod="openstack/memcached-0"
Mar 10 22:10:23 crc kubenswrapper[4919]: I0310 22:10:23.738746 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4a88061-cba8-4535-bf01-5285d8cbb79f-memcached-tls-certs\") pod \"memcached-0\" (UID: \"a4a88061-cba8-4535-bf01-5285d8cbb79f\") " pod="openstack/memcached-0"
Mar 10 22:10:23 crc kubenswrapper[4919]: I0310 22:10:23.740583 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a4a88061-cba8-4535-bf01-5285d8cbb79f-config-data\") pod \"memcached-0\" (UID: \"a4a88061-cba8-4535-bf01-5285d8cbb79f\") " pod="openstack/memcached-0"
Mar 10 22:10:23 crc kubenswrapper[4919]: I0310 22:10:23.740723 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a4a88061-cba8-4535-bf01-5285d8cbb79f-kolla-config\") pod \"memcached-0\" (UID: \"a4a88061-cba8-4535-bf01-5285d8cbb79f\") " pod="openstack/memcached-0"
Mar 10 22:10:23 crc kubenswrapper[4919]: I0310 22:10:23.742017 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4a88061-cba8-4535-bf01-5285d8cbb79f-memcached-tls-certs\") pod \"memcached-0\" (UID: \"a4a88061-cba8-4535-bf01-5285d8cbb79f\") " pod="openstack/memcached-0"
Mar 10 22:10:23 crc kubenswrapper[4919]: I0310 22:10:23.742709 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4a88061-cba8-4535-bf01-5285d8cbb79f-combined-ca-bundle\") pod \"memcached-0\" (UID: \"a4a88061-cba8-4535-bf01-5285d8cbb79f\") " pod="openstack/memcached-0"
Mar 10 22:10:23 crc kubenswrapper[4919]: I0310 22:10:23.756967 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sj8lc\" (UniqueName: \"kubernetes.io/projected/a4a88061-cba8-4535-bf01-5285d8cbb79f-kube-api-access-sj8lc\") pod \"memcached-0\" (UID: \"a4a88061-cba8-4535-bf01-5285d8cbb79f\") " pod="openstack/memcached-0"
Mar 10 22:10:23 crc kubenswrapper[4919]: I0310 22:10:23.826111 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0"
Mar 10 22:10:25 crc kubenswrapper[4919]: I0310 22:10:25.311883 4919 scope.go:117] "RemoveContainer" containerID="1f9fd9c850ec222a6a45fed07ceeb8945bc2d1922898ba7879522d01facf1fa3"
Mar 10 22:10:25 crc kubenswrapper[4919]: I0310 22:10:25.539769 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"]
Mar 10 22:10:25 crc kubenswrapper[4919]: I0310 22:10:25.540973 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Mar 10 22:10:25 crc kubenswrapper[4919]: I0310 22:10:25.543991 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-tmld7"
Mar 10 22:10:25 crc kubenswrapper[4919]: I0310 22:10:25.560804 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"]
Mar 10 22:10:25 crc kubenswrapper[4919]: I0310 22:10:25.667153 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrcds\" (UniqueName: \"kubernetes.io/projected/28cf0af6-9a5f-445a-98fc-2251bcd48109-kube-api-access-zrcds\") pod \"kube-state-metrics-0\" (UID: \"28cf0af6-9a5f-445a-98fc-2251bcd48109\") " pod="openstack/kube-state-metrics-0"
Mar 10 22:10:25 crc kubenswrapper[4919]: I0310 22:10:25.773639 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zrcds\" (UniqueName: \"kubernetes.io/projected/28cf0af6-9a5f-445a-98fc-2251bcd48109-kube-api-access-zrcds\") pod \"kube-state-metrics-0\" (UID: \"28cf0af6-9a5f-445a-98fc-2251bcd48109\") " pod="openstack/kube-state-metrics-0"
Mar 10 22:10:25 crc kubenswrapper[4919]: I0310 22:10:25.793841 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrcds\" (UniqueName: \"kubernetes.io/projected/28cf0af6-9a5f-445a-98fc-2251bcd48109-kube-api-access-zrcds\") pod \"kube-state-metrics-0\" (UID: \"28cf0af6-9a5f-445a-98fc-2251bcd48109\") " pod="openstack/kube-state-metrics-0"
Mar 10 22:10:25 crc kubenswrapper[4919]: I0310 22:10:25.864743 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Mar 10 22:10:28 crc kubenswrapper[4919]: I0310 22:10:28.710712 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"]
Mar 10 22:10:28 crc kubenswrapper[4919]: I0310 22:10:28.717189 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0"
Mar 10 22:10:28 crc kubenswrapper[4919]: I0310 22:10:28.719132 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs"
Mar 10 22:10:28 crc kubenswrapper[4919]: I0310 22:10:28.719624 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-k4sh9"
Mar 10 22:10:28 crc kubenswrapper[4919]: I0310 22:10:28.720343 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts"
Mar 10 22:10:28 crc kubenswrapper[4919]: I0310 22:10:28.720445 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"]
Mar 10 22:10:28 crc kubenswrapper[4919]: I0310 22:10:28.720608 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics"
Mar 10 22:10:28 crc kubenswrapper[4919]: I0310 22:10:28.722022 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config"
Mar 10 22:10:28 crc kubenswrapper[4919]: I0310 22:10:28.818694 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e69198f-c4ad-40c4-b0f4-1a6e9dd17940-config\") pod \"ovsdbserver-nb-0\" (UID: \"9e69198f-c4ad-40c4-b0f4-1a6e9dd17940\") " pod="openstack/ovsdbserver-nb-0"
Mar 10 22:10:28 crc kubenswrapper[4919]: I0310 22:10:28.818842 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9e69198f-c4ad-40c4-b0f4-1a6e9dd17940-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"9e69198f-c4ad-40c4-b0f4-1a6e9dd17940\") " pod="openstack/ovsdbserver-nb-0"
Mar 10 22:10:28 crc kubenswrapper[4919]: I0310 22:10:28.818910 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9e69198f-c4ad-40c4-b0f4-1a6e9dd17940-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"9e69198f-c4ad-40c4-b0f4-1a6e9dd17940\") " pod="openstack/ovsdbserver-nb-0"
Mar 10 22:10:28 crc kubenswrapper[4919]: I0310 22:10:28.818939 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e69198f-c4ad-40c4-b0f4-1a6e9dd17940-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"9e69198f-c4ad-40c4-b0f4-1a6e9dd17940\") " pod="openstack/ovsdbserver-nb-0"
Mar 10 22:10:28 crc kubenswrapper[4919]: I0310 22:10:28.818987 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-nb-0\" (UID: \"9e69198f-c4ad-40c4-b0f4-1a6e9dd17940\") " pod="openstack/ovsdbserver-nb-0"
Mar 10 22:10:28 crc kubenswrapper[4919]: I0310 22:10:28.819163 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9e69198f-c4ad-40c4-b0f4-1a6e9dd17940-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"9e69198f-c4ad-40c4-b0f4-1a6e9dd17940\") " pod="openstack/ovsdbserver-nb-0"
Mar 10 22:10:28 crc kubenswrapper[4919]: I0310 22:10:28.819258 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xzs87\" (UniqueName: \"kubernetes.io/projected/9e69198f-c4ad-40c4-b0f4-1a6e9dd17940-kube-api-access-xzs87\") pod \"ovsdbserver-nb-0\" (UID: \"9e69198f-c4ad-40c4-b0f4-1a6e9dd17940\") " pod="openstack/ovsdbserver-nb-0"
Mar 10 22:10:28 crc kubenswrapper[4919]: I0310 22:10:28.819313 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9e69198f-c4ad-40c4-b0f4-1a6e9dd17940-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"9e69198f-c4ad-40c4-b0f4-1a6e9dd17940\") " pod="openstack/ovsdbserver-nb-0"
Mar 10 22:10:28 crc kubenswrapper[4919]: I0310 22:10:28.921129 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9e69198f-c4ad-40c4-b0f4-1a6e9dd17940-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"9e69198f-c4ad-40c4-b0f4-1a6e9dd17940\") " pod="openstack/ovsdbserver-nb-0"
Mar 10 22:10:28 crc kubenswrapper[4919]: I0310 22:10:28.921183 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9e69198f-c4ad-40c4-b0f4-1a6e9dd17940-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"9e69198f-c4ad-40c4-b0f4-1a6e9dd17940\") " pod="openstack/ovsdbserver-nb-0"
Mar 10 22:10:28 crc kubenswrapper[4919]: I0310 22:10:28.921204 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e69198f-c4ad-40c4-b0f4-1a6e9dd17940-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"9e69198f-c4ad-40c4-b0f4-1a6e9dd17940\") " pod="openstack/ovsdbserver-nb-0"
Mar 10 22:10:28 crc kubenswrapper[4919]: I0310 22:10:28.921225 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-nb-0\" (UID: \"9e69198f-c4ad-40c4-b0f4-1a6e9dd17940\") " pod="openstack/ovsdbserver-nb-0"
Mar 10 22:10:28 crc kubenswrapper[4919]: I0310 22:10:28.921303 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9e69198f-c4ad-40c4-b0f4-1a6e9dd17940-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"9e69198f-c4ad-40c4-b0f4-1a6e9dd17940\") " pod="openstack/ovsdbserver-nb-0"
Mar 10 22:10:28 crc kubenswrapper[4919]: I0310 22:10:28.921336 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xzs87\" (UniqueName: \"kubernetes.io/projected/9e69198f-c4ad-40c4-b0f4-1a6e9dd17940-kube-api-access-xzs87\") pod \"ovsdbserver-nb-0\" (UID: \"9e69198f-c4ad-40c4-b0f4-1a6e9dd17940\") " pod="openstack/ovsdbserver-nb-0"
Mar 10 22:10:28 crc kubenswrapper[4919]: I0310 22:10:28.921355 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9e69198f-c4ad-40c4-b0f4-1a6e9dd17940-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"9e69198f-c4ad-40c4-b0f4-1a6e9dd17940\") " pod="openstack/ovsdbserver-nb-0"
Mar 10 22:10:28 crc kubenswrapper[4919]: I0310 22:10:28.921386 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e69198f-c4ad-40c4-b0f4-1a6e9dd17940-config\") pod \"ovsdbserver-nb-0\" (UID: \"9e69198f-c4ad-40c4-b0f4-1a6e9dd17940\") " pod="openstack/ovsdbserver-nb-0"
Mar 10 22:10:28 crc kubenswrapper[4919]: I0310 22:10:28.921688 4919 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-nb-0\" (UID: \"9e69198f-c4ad-40c4-b0f4-1a6e9dd17940\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/ovsdbserver-nb-0"
Mar 10 22:10:28 crc kubenswrapper[4919]: I0310 22:10:28.922049 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e69198f-c4ad-40c4-b0f4-1a6e9dd17940-config\") pod \"ovsdbserver-nb-0\" (UID: \"9e69198f-c4ad-40c4-b0f4-1a6e9dd17940\") " pod="openstack/ovsdbserver-nb-0"
Mar 10 22:10:28 crc kubenswrapper[4919]: I0310 22:10:28.922175 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9e69198f-c4ad-40c4-b0f4-1a6e9dd17940-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"9e69198f-c4ad-40c4-b0f4-1a6e9dd17940\") " pod="openstack/ovsdbserver-nb-0"
Mar 10 22:10:28 crc kubenswrapper[4919]: I0310 22:10:28.922628 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9e69198f-c4ad-40c4-b0f4-1a6e9dd17940-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"9e69198f-c4ad-40c4-b0f4-1a6e9dd17940\") " pod="openstack/ovsdbserver-nb-0"
Mar 10 22:10:28 crc kubenswrapper[4919]: I0310 22:10:28.932034 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9e69198f-c4ad-40c4-b0f4-1a6e9dd17940-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"9e69198f-c4ad-40c4-b0f4-1a6e9dd17940\") " pod="openstack/ovsdbserver-nb-0"
Mar 10 22:10:28 crc kubenswrapper[4919]: I0310 22:10:28.932034 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e69198f-c4ad-40c4-b0f4-1a6e9dd17940-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"9e69198f-c4ad-40c4-b0f4-1a6e9dd17940\") " pod="openstack/ovsdbserver-nb-0"
Mar 10 22:10:28 crc kubenswrapper[4919]: I0310 22:10:28.943747 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9e69198f-c4ad-40c4-b0f4-1a6e9dd17940-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"9e69198f-c4ad-40c4-b0f4-1a6e9dd17940\") " pod="openstack/ovsdbserver-nb-0"
Mar 10 22:10:28 crc kubenswrapper[4919]: I0310 22:10:28.944469 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-nb-0\" (UID: \"9e69198f-c4ad-40c4-b0f4-1a6e9dd17940\") " pod="openstack/ovsdbserver-nb-0"
Mar 10 22:10:28 crc kubenswrapper[4919]: I0310 22:10:28.951765 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xzs87\" (UniqueName: \"kubernetes.io/projected/9e69198f-c4ad-40c4-b0f4-1a6e9dd17940-kube-api-access-xzs87\") pod \"ovsdbserver-nb-0\" (UID: \"9e69198f-c4ad-40c4-b0f4-1a6e9dd17940\") " pod="openstack/ovsdbserver-nb-0"
Mar 10 22:10:29 crc kubenswrapper[4919]: I0310 22:10:29.045159 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0"
Mar 10 22:10:29 crc kubenswrapper[4919]: I0310 22:10:29.987194 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-fbfnm"]
Mar 10 22:10:29 crc kubenswrapper[4919]: I0310 22:10:29.988691 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-fbfnm"
Mar 10 22:10:29 crc kubenswrapper[4919]: I0310 22:10:29.992010 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-pg8rm"
Mar 10 22:10:29 crc kubenswrapper[4919]: I0310 22:10:29.992354 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs"
Mar 10 22:10:29 crc kubenswrapper[4919]: I0310 22:10:29.992425 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts"
Mar 10 22:10:30 crc kubenswrapper[4919]: I0310 22:10:30.000247 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-fbfnm"]
Mar 10 22:10:30 crc kubenswrapper[4919]: I0310 22:10:30.035097 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-5wz82"]
Mar 10 22:10:30 crc kubenswrapper[4919]: I0310 22:10:30.036518 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-5wz82"
Mar 10 22:10:30 crc kubenswrapper[4919]: I0310 22:10:30.059047 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-5wz82"]
Mar 10 22:10:30 crc kubenswrapper[4919]: I0310 22:10:30.142890 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lq5jc\" (UniqueName: \"kubernetes.io/projected/783e3f3a-7a6f-4b95-a7d2-6988c8a6149b-kube-api-access-lq5jc\") pod \"ovn-controller-fbfnm\" (UID: \"783e3f3a-7a6f-4b95-a7d2-6988c8a6149b\") " pod="openstack/ovn-controller-fbfnm"
Mar 10 22:10:30 crc kubenswrapper[4919]: I0310 22:10:30.142944 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/783e3f3a-7a6f-4b95-a7d2-6988c8a6149b-scripts\") pod \"ovn-controller-fbfnm\" (UID: \"783e3f3a-7a6f-4b95-a7d2-6988c8a6149b\") " pod="openstack/ovn-controller-fbfnm"
Mar 10 22:10:30 crc kubenswrapper[4919]: I0310 22:10:30.142977 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/783e3f3a-7a6f-4b95-a7d2-6988c8a6149b-var-run\") pod \"ovn-controller-fbfnm\" (UID: \"783e3f3a-7a6f-4b95-a7d2-6988c8a6149b\") " pod="openstack/ovn-controller-fbfnm"
Mar 10 22:10:30 crc kubenswrapper[4919]: I0310 22:10:30.143002 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/783e3f3a-7a6f-4b95-a7d2-6988c8a6149b-var-log-ovn\") pod \"ovn-controller-fbfnm\" (UID: \"783e3f3a-7a6f-4b95-a7d2-6988c8a6149b\") " pod="openstack/ovn-controller-fbfnm"
Mar 10 22:10:30 crc kubenswrapper[4919]: I0310 22:10:30.143158 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/a525725f-407a-4e99-96a1-a0eaba714487-etc-ovs\") pod \"ovn-controller-ovs-5wz82\" (UID: \"a525725f-407a-4e99-96a1-a0eaba714487\") " pod="openstack/ovn-controller-ovs-5wz82"
Mar 10 22:10:30 crc kubenswrapper[4919]: I0310 22:10:30.143199 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/a525725f-407a-4e99-96a1-a0eaba714487-var-lib\") pod \"ovn-controller-ovs-5wz82\" (UID: \"a525725f-407a-4e99-96a1-a0eaba714487\") " pod="openstack/ovn-controller-ovs-5wz82"
Mar 10 22:10:30 crc kubenswrapper[4919]: I0310 22:10:30.143268 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/783e3f3a-7a6f-4b95-a7d2-6988c8a6149b-ovn-controller-tls-certs\") pod \"ovn-controller-fbfnm\" (UID: \"783e3f3a-7a6f-4b95-a7d2-6988c8a6149b\") " pod="openstack/ovn-controller-fbfnm"
Mar 10 22:10:30 crc kubenswrapper[4919]: I0310 22:10:30.143325 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a525725f-407a-4e99-96a1-a0eaba714487-scripts\") pod \"ovn-controller-ovs-5wz82\" (UID: \"a525725f-407a-4e99-96a1-a0eaba714487\") " pod="openstack/ovn-controller-ovs-5wz82"
Mar 10 22:10:30 crc kubenswrapper[4919]: I0310 22:10:30.143370 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/a525725f-407a-4e99-96a1-a0eaba714487-var-log\") pod \"ovn-controller-ovs-5wz82\" (UID: \"a525725f-407a-4e99-96a1-a0eaba714487\") " pod="openstack/ovn-controller-ovs-5wz82"
Mar 10 22:10:30 crc kubenswrapper[4919]: I0310 22:10:30.143467 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/783e3f3a-7a6f-4b95-a7d2-6988c8a6149b-combined-ca-bundle\") pod \"ovn-controller-fbfnm\" (UID: \"783e3f3a-7a6f-4b95-a7d2-6988c8a6149b\") " pod="openstack/ovn-controller-fbfnm"
Mar 10 22:10:30 crc kubenswrapper[4919]: I0310 22:10:30.143516 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5tgb2\" (UniqueName: \"kubernetes.io/projected/a525725f-407a-4e99-96a1-a0eaba714487-kube-api-access-5tgb2\") pod \"ovn-controller-ovs-5wz82\" (UID: \"a525725f-407a-4e99-96a1-a0eaba714487\") " pod="openstack/ovn-controller-ovs-5wz82"
Mar 10 22:10:30 crc kubenswrapper[4919]: I0310 22:10:30.143559 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a525725f-407a-4e99-96a1-a0eaba714487-var-run\") pod \"ovn-controller-ovs-5wz82\" (UID: \"a525725f-407a-4e99-96a1-a0eaba714487\") " pod="openstack/ovn-controller-ovs-5wz82"
Mar 10 22:10:30 crc kubenswrapper[4919]: I0310 22:10:30.143584 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/783e3f3a-7a6f-4b95-a7d2-6988c8a6149b-var-run-ovn\") pod \"ovn-controller-fbfnm\" (UID: \"783e3f3a-7a6f-4b95-a7d2-6988c8a6149b\") " pod="openstack/ovn-controller-fbfnm"
Mar 10 22:10:30 crc kubenswrapper[4919]: I0310 22:10:30.245123 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/a525725f-407a-4e99-96a1-a0eaba714487-etc-ovs\") pod \"ovn-controller-ovs-5wz82\" (UID: \"a525725f-407a-4e99-96a1-a0eaba714487\") " pod="openstack/ovn-controller-ovs-5wz82"
Mar 10 22:10:30 crc kubenswrapper[4919]: I0310 22:10:30.245302 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/a525725f-407a-4e99-96a1-a0eaba714487-var-lib\") pod \"ovn-controller-ovs-5wz82\" (UID: \"a525725f-407a-4e99-96a1-a0eaba714487\") " pod="openstack/ovn-controller-ovs-5wz82"
Mar 10 22:10:30 crc kubenswrapper[4919]: I0310 22:10:30.245357 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/783e3f3a-7a6f-4b95-a7d2-6988c8a6149b-ovn-controller-tls-certs\") pod \"ovn-controller-fbfnm\" (UID: \"783e3f3a-7a6f-4b95-a7d2-6988c8a6149b\") " pod="openstack/ovn-controller-fbfnm"
Mar 10 22:10:30 crc kubenswrapper[4919]: I0310 22:10:30.245442 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a525725f-407a-4e99-96a1-a0eaba714487-scripts\") pod \"ovn-controller-ovs-5wz82\" (UID: \"a525725f-407a-4e99-96a1-a0eaba714487\") " pod="openstack/ovn-controller-ovs-5wz82"
Mar 10 22:10:30 crc kubenswrapper[4919]: I0310 22:10:30.245489 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/a525725f-407a-4e99-96a1-a0eaba714487-var-log\") pod \"ovn-controller-ovs-5wz82\" (UID: \"a525725f-407a-4e99-96a1-a0eaba714487\") " pod="openstack/ovn-controller-ovs-5wz82"
Mar 10 22:10:30 crc kubenswrapper[4919]: I0310 22:10:30.245568 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/783e3f3a-7a6f-4b95-a7d2-6988c8a6149b-combined-ca-bundle\") pod \"ovn-controller-fbfnm\" (UID: \"783e3f3a-7a6f-4b95-a7d2-6988c8a6149b\") " pod="openstack/ovn-controller-fbfnm"
Mar 10 22:10:30 crc kubenswrapper[4919]: I0310 22:10:30.245611 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5tgb2\" (UniqueName: \"kubernetes.io/projected/a525725f-407a-4e99-96a1-a0eaba714487-kube-api-access-5tgb2\") pod \"ovn-controller-ovs-5wz82\" (UID: \"a525725f-407a-4e99-96a1-a0eaba714487\") " pod="openstack/ovn-controller-ovs-5wz82"
Mar 10 22:10:30 crc kubenswrapper[4919]: I0310 22:10:30.245664 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a525725f-407a-4e99-96a1-a0eaba714487-var-run\") pod \"ovn-controller-ovs-5wz82\" (UID: \"a525725f-407a-4e99-96a1-a0eaba714487\") " pod="openstack/ovn-controller-ovs-5wz82"
Mar 10 22:10:30 crc kubenswrapper[4919]: I0310 22:10:30.245704 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/783e3f3a-7a6f-4b95-a7d2-6988c8a6149b-var-run-ovn\") pod \"ovn-controller-fbfnm\" (UID: \"783e3f3a-7a6f-4b95-a7d2-6988c8a6149b\") " pod="openstack/ovn-controller-fbfnm"
Mar 10 22:10:30 crc kubenswrapper[4919]: I0310 22:10:30.245723 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/a525725f-407a-4e99-96a1-a0eaba714487-etc-ovs\") pod \"ovn-controller-ovs-5wz82\" (UID: \"a525725f-407a-4e99-96a1-a0eaba714487\") " pod="openstack/ovn-controller-ovs-5wz82"
Mar 10 22:10:30 crc kubenswrapper[4919]: I0310 22:10:30.245793 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/a525725f-407a-4e99-96a1-a0eaba714487-var-lib\") pod \"ovn-controller-ovs-5wz82\" (UID: \"a525725f-407a-4e99-96a1-a0eaba714487\") " pod="openstack/ovn-controller-ovs-5wz82"
Mar 10 22:10:30 crc kubenswrapper[4919]: I0310 22:10:30.245812 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lq5jc\" (UniqueName: \"kubernetes.io/projected/783e3f3a-7a6f-4b95-a7d2-6988c8a6149b-kube-api-access-lq5jc\") pod \"ovn-controller-fbfnm\" (UID: \"783e3f3a-7a6f-4b95-a7d2-6988c8a6149b\") " pod="openstack/ovn-controller-fbfnm"
Mar 10 22:10:30 crc kubenswrapper[4919]: I0310 22:10:30.245842 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/783e3f3a-7a6f-4b95-a7d2-6988c8a6149b-scripts\") pod \"ovn-controller-fbfnm\" (UID: \"783e3f3a-7a6f-4b95-a7d2-6988c8a6149b\") " pod="openstack/ovn-controller-fbfnm"
Mar 10 22:10:30 crc kubenswrapper[4919]: I0310 22:10:30.245878 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/783e3f3a-7a6f-4b95-a7d2-6988c8a6149b-var-run\") pod \"ovn-controller-fbfnm\" (UID: \"783e3f3a-7a6f-4b95-a7d2-6988c8a6149b\") " pod="openstack/ovn-controller-fbfnm"
Mar 10 22:10:30 crc kubenswrapper[4919]: I0310 22:10:30.245917 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a525725f-407a-4e99-96a1-a0eaba714487-var-run\") pod \"ovn-controller-ovs-5wz82\" (UID: \"a525725f-407a-4e99-96a1-a0eaba714487\") " pod="openstack/ovn-controller-ovs-5wz82"
Mar 10 22:10:30 crc kubenswrapper[4919]: I0310 22:10:30.245963 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/783e3f3a-7a6f-4b95-a7d2-6988c8a6149b-var-log-ovn\") pod \"ovn-controller-fbfnm\" (UID: \"783e3f3a-7a6f-4b95-a7d2-6988c8a6149b\") " pod="openstack/ovn-controller-fbfnm"
Mar 10 22:10:30 crc kubenswrapper[4919]: I0310 22:10:30.246268 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/783e3f3a-7a6f-4b95-a7d2-6988c8a6149b-var-log-ovn\") pod \"ovn-controller-fbfnm\" (UID: \"783e3f3a-7a6f-4b95-a7d2-6988c8a6149b\") " pod="openstack/ovn-controller-fbfnm"
Mar 10 22:10:30 crc kubenswrapper[4919]: I0310 22:10:30.246384 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/783e3f3a-7a6f-4b95-a7d2-6988c8a6149b-var-run-ovn\") pod \"ovn-controller-fbfnm\" (UID: \"783e3f3a-7a6f-4b95-a7d2-6988c8a6149b\") " pod="openstack/ovn-controller-fbfnm"
Mar 10 22:10:30 crc kubenswrapper[4919]: I0310 22:10:30.246405 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/a525725f-407a-4e99-96a1-a0eaba714487-var-log\") pod \"ovn-controller-ovs-5wz82\" (UID: \"a525725f-407a-4e99-96a1-a0eaba714487\") " pod="openstack/ovn-controller-ovs-5wz82"
Mar 10 22:10:30 crc kubenswrapper[4919]: I0310 22:10:30.246427 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/783e3f3a-7a6f-4b95-a7d2-6988c8a6149b-var-run\") pod \"ovn-controller-fbfnm\" (UID: \"783e3f3a-7a6f-4b95-a7d2-6988c8a6149b\") " pod="openstack/ovn-controller-fbfnm"
Mar 10 22:10:30 crc kubenswrapper[4919]: I0310 22:10:30.248160 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a525725f-407a-4e99-96a1-a0eaba714487-scripts\") pod \"ovn-controller-ovs-5wz82\" (UID: \"a525725f-407a-4e99-96a1-a0eaba714487\") " pod="openstack/ovn-controller-ovs-5wz82"
Mar 10 22:10:30 crc kubenswrapper[4919]: I0310 22:10:30.248289 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/783e3f3a-7a6f-4b95-a7d2-6988c8a6149b-scripts\") pod \"ovn-controller-fbfnm\" (UID: \"783e3f3a-7a6f-4b95-a7d2-6988c8a6149b\") " pod="openstack/ovn-controller-fbfnm"
Mar 10 22:10:30 crc kubenswrapper[4919]: I0310 22:10:30.258061 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/783e3f3a-7a6f-4b95-a7d2-6988c8a6149b-ovn-controller-tls-certs\") pod \"ovn-controller-fbfnm\" (UID: \"783e3f3a-7a6f-4b95-a7d2-6988c8a6149b\") " pod="openstack/ovn-controller-fbfnm"
Mar 10 22:10:30 crc kubenswrapper[4919]: I0310 22:10:30.278051 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/783e3f3a-7a6f-4b95-a7d2-6988c8a6149b-combined-ca-bundle\") pod \"ovn-controller-fbfnm\" (UID: \"783e3f3a-7a6f-4b95-a7d2-6988c8a6149b\") " pod="openstack/ovn-controller-fbfnm"
Mar 10 22:10:30 crc kubenswrapper[4919]: I0310 22:10:30.279618 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lq5jc\" (UniqueName: \"kubernetes.io/projected/783e3f3a-7a6f-4b95-a7d2-6988c8a6149b-kube-api-access-lq5jc\") pod \"ovn-controller-fbfnm\" (UID: \"783e3f3a-7a6f-4b95-a7d2-6988c8a6149b\") " pod="openstack/ovn-controller-fbfnm"
Mar 10 22:10:30 crc kubenswrapper[4919]: I0310 22:10:30.281692 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5tgb2\" (UniqueName: \"kubernetes.io/projected/a525725f-407a-4e99-96a1-a0eaba714487-kube-api-access-5tgb2\") pod \"ovn-controller-ovs-5wz82\" (UID: \"a525725f-407a-4e99-96a1-a0eaba714487\") " pod="openstack/ovn-controller-ovs-5wz82"
Mar 10 22:10:30 crc kubenswrapper[4919]: I0310 22:10:30.339820 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-fbfnm"
Mar 10 22:10:30 crc kubenswrapper[4919]: I0310 22:10:30.356986 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-5wz82"
Mar 10 22:10:33 crc kubenswrapper[4919]: I0310 22:10:33.085549 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"]
Mar 10 22:10:33 crc kubenswrapper[4919]: I0310 22:10:33.087448 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0"
Mar 10 22:10:33 crc kubenswrapper[4919]: I0310 22:10:33.090978 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-lfp99"
Mar 10 22:10:33 crc kubenswrapper[4919]: I0310 22:10:33.091130 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs"
Mar 10 22:10:33 crc kubenswrapper[4919]: I0310 22:10:33.091257 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config"
Mar 10 22:10:33 crc kubenswrapper[4919]: I0310 22:10:33.092102 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts"
Mar 10 22:10:33 crc kubenswrapper[4919]: I0310 22:10:33.093092 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"]
Mar 10 22:10:33 crc kubenswrapper[4919]: I0310 22:10:33.197384 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8782e985-ff23-4580-bdfb-ef2dd9b540bc-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"8782e985-ff23-4580-bdfb-ef2dd9b540bc\") " pod="openstack/ovsdbserver-sb-0"
Mar 10 22:10:33 crc kubenswrapper[4919]: I0310 22:10:33.197678 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/8782e985-ff23-4580-bdfb-ef2dd9b540bc-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"8782e985-ff23-4580-bdfb-ef2dd9b540bc\") " pod="openstack/ovsdbserver-sb-0"
Mar 10 22:10:33 crc kubenswrapper[4919]: I0310 22:10:33.197708 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8782e985-ff23-4580-bdfb-ef2dd9b540bc-config\") pod \"ovsdbserver-sb-0\" (UID: \"8782e985-ff23-4580-bdfb-ef2dd9b540bc\") " pod="openstack/ovsdbserver-sb-0"
Mar 10 22:10:33 crc kubenswrapper[4919]: I0310 22:10:33.197747 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8782e985-ff23-4580-bdfb-ef2dd9b540bc-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"8782e985-ff23-4580-bdfb-ef2dd9b540bc\") " pod="openstack/ovsdbserver-sb-0"
Mar 10 22:10:33 crc kubenswrapper[4919]: I0310 22:10:33.197785 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-sb-0\" (UID: \"8782e985-ff23-4580-bdfb-ef2dd9b540bc\") " pod="openstack/ovsdbserver-sb-0"
Mar 10 22:10:33 crc kubenswrapper[4919]: I0310 22:10:33.197817 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/8782e985-ff23-4580-bdfb-ef2dd9b540bc-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"8782e985-ff23-4580-bdfb-ef2dd9b540bc\") " pod="openstack/ovsdbserver-sb-0"
Mar 10 22:10:33 crc kubenswrapper[4919]: I0310 22:10:33.197862 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8782e985-ff23-4580-bdfb-ef2dd9b540bc-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"8782e985-ff23-4580-bdfb-ef2dd9b540bc\") " pod="openstack/ovsdbserver-sb-0"
Mar 10 22:10:33 crc kubenswrapper[4919]: I0310 22:10:33.197912 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-htt2d\" (UniqueName: \"kubernetes.io/projected/8782e985-ff23-4580-bdfb-ef2dd9b540bc-kube-api-access-htt2d\") pod \"ovsdbserver-sb-0\" (UID: \"8782e985-ff23-4580-bdfb-ef2dd9b540bc\") "
pod="openstack/ovsdbserver-sb-0" Mar 10 22:10:33 crc kubenswrapper[4919]: I0310 22:10:33.299284 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-sb-0\" (UID: \"8782e985-ff23-4580-bdfb-ef2dd9b540bc\") " pod="openstack/ovsdbserver-sb-0" Mar 10 22:10:33 crc kubenswrapper[4919]: I0310 22:10:33.299624 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/8782e985-ff23-4580-bdfb-ef2dd9b540bc-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"8782e985-ff23-4580-bdfb-ef2dd9b540bc\") " pod="openstack/ovsdbserver-sb-0" Mar 10 22:10:33 crc kubenswrapper[4919]: I0310 22:10:33.299634 4919 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-sb-0\" (UID: \"8782e985-ff23-4580-bdfb-ef2dd9b540bc\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/ovsdbserver-sb-0" Mar 10 22:10:33 crc kubenswrapper[4919]: I0310 22:10:33.299764 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8782e985-ff23-4580-bdfb-ef2dd9b540bc-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"8782e985-ff23-4580-bdfb-ef2dd9b540bc\") " pod="openstack/ovsdbserver-sb-0" Mar 10 22:10:33 crc kubenswrapper[4919]: I0310 22:10:33.299790 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-htt2d\" (UniqueName: \"kubernetes.io/projected/8782e985-ff23-4580-bdfb-ef2dd9b540bc-kube-api-access-htt2d\") pod \"ovsdbserver-sb-0\" (UID: \"8782e985-ff23-4580-bdfb-ef2dd9b540bc\") " pod="openstack/ovsdbserver-sb-0" Mar 10 22:10:33 crc kubenswrapper[4919]: I0310 22:10:33.299904 4919 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8782e985-ff23-4580-bdfb-ef2dd9b540bc-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"8782e985-ff23-4580-bdfb-ef2dd9b540bc\") " pod="openstack/ovsdbserver-sb-0" Mar 10 22:10:33 crc kubenswrapper[4919]: I0310 22:10:33.299941 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/8782e985-ff23-4580-bdfb-ef2dd9b540bc-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"8782e985-ff23-4580-bdfb-ef2dd9b540bc\") " pod="openstack/ovsdbserver-sb-0" Mar 10 22:10:33 crc kubenswrapper[4919]: I0310 22:10:33.299978 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8782e985-ff23-4580-bdfb-ef2dd9b540bc-config\") pod \"ovsdbserver-sb-0\" (UID: \"8782e985-ff23-4580-bdfb-ef2dd9b540bc\") " pod="openstack/ovsdbserver-sb-0" Mar 10 22:10:33 crc kubenswrapper[4919]: I0310 22:10:33.300003 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8782e985-ff23-4580-bdfb-ef2dd9b540bc-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"8782e985-ff23-4580-bdfb-ef2dd9b540bc\") " pod="openstack/ovsdbserver-sb-0" Mar 10 22:10:33 crc kubenswrapper[4919]: I0310 22:10:33.303759 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/8782e985-ff23-4580-bdfb-ef2dd9b540bc-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"8782e985-ff23-4580-bdfb-ef2dd9b540bc\") " pod="openstack/ovsdbserver-sb-0" Mar 10 22:10:33 crc kubenswrapper[4919]: I0310 22:10:33.304072 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8782e985-ff23-4580-bdfb-ef2dd9b540bc-config\") pod \"ovsdbserver-sb-0\" (UID: \"8782e985-ff23-4580-bdfb-ef2dd9b540bc\") " 
pod="openstack/ovsdbserver-sb-0" Mar 10 22:10:33 crc kubenswrapper[4919]: I0310 22:10:33.304570 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8782e985-ff23-4580-bdfb-ef2dd9b540bc-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"8782e985-ff23-4580-bdfb-ef2dd9b540bc\") " pod="openstack/ovsdbserver-sb-0" Mar 10 22:10:33 crc kubenswrapper[4919]: I0310 22:10:33.314620 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8782e985-ff23-4580-bdfb-ef2dd9b540bc-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"8782e985-ff23-4580-bdfb-ef2dd9b540bc\") " pod="openstack/ovsdbserver-sb-0" Mar 10 22:10:33 crc kubenswrapper[4919]: I0310 22:10:33.314629 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/8782e985-ff23-4580-bdfb-ef2dd9b540bc-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"8782e985-ff23-4580-bdfb-ef2dd9b540bc\") " pod="openstack/ovsdbserver-sb-0" Mar 10 22:10:33 crc kubenswrapper[4919]: I0310 22:10:33.324572 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8782e985-ff23-4580-bdfb-ef2dd9b540bc-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"8782e985-ff23-4580-bdfb-ef2dd9b540bc\") " pod="openstack/ovsdbserver-sb-0" Mar 10 22:10:33 crc kubenswrapper[4919]: I0310 22:10:33.330009 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-htt2d\" (UniqueName: \"kubernetes.io/projected/8782e985-ff23-4580-bdfb-ef2dd9b540bc-kube-api-access-htt2d\") pod \"ovsdbserver-sb-0\" (UID: \"8782e985-ff23-4580-bdfb-ef2dd9b540bc\") " pod="openstack/ovsdbserver-sb-0" Mar 10 22:10:33 crc kubenswrapper[4919]: I0310 22:10:33.330609 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-sb-0\" (UID: \"8782e985-ff23-4580-bdfb-ef2dd9b540bc\") " pod="openstack/ovsdbserver-sb-0" Mar 10 22:10:33 crc kubenswrapper[4919]: I0310 22:10:33.427354 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 10 22:10:33 crc kubenswrapper[4919]: I0310 22:10:33.511267 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 10 22:10:33 crc kubenswrapper[4919]: I0310 22:10:33.696828 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 10 22:10:33 crc kubenswrapper[4919]: I0310 22:10:33.900021 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 10 22:10:33 crc kubenswrapper[4919]: W0310 22:10:33.916119 4919 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9372011b_416f_484d_a873_fdda67baf9fe.slice/crio-a81aced24f8ba96209b6fd098a3080c41e122cc2baa8d4abc6cd37a16a08dc96 WatchSource:0}: Error finding container a81aced24f8ba96209b6fd098a3080c41e122cc2baa8d4abc6cd37a16a08dc96: Status 404 returned error can't find the container with id a81aced24f8ba96209b6fd098a3080c41e122cc2baa8d4abc6cd37a16a08dc96 Mar 10 22:10:33 crc kubenswrapper[4919]: I0310 22:10:33.921105 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 10 22:10:33 crc kubenswrapper[4919]: W0310 22:10:33.923454 4919 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod37ef9179_69db_49ab_a4e6_2e2b815fc260.slice/crio-5526a0e2dbd7db7dba05ceea6c12956ec629b63fa1bcbee8601c93257949fe8b WatchSource:0}: Error finding container 5526a0e2dbd7db7dba05ceea6c12956ec629b63fa1bcbee8601c93257949fe8b: Status 404 returned error can't find the container with id 
5526a0e2dbd7db7dba05ceea6c12956ec629b63fa1bcbee8601c93257949fe8b Mar 10 22:10:33 crc kubenswrapper[4919]: I0310 22:10:33.928946 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 10 22:10:33 crc kubenswrapper[4919]: W0310 22:10:33.933617 4919 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod28cf0af6_9a5f_445a_98fc_2251bcd48109.slice/crio-9e143d8bdc4a66e7c6dc9baeb6f30a72c138caba4c1f6093b7ab272417424c34 WatchSource:0}: Error finding container 9e143d8bdc4a66e7c6dc9baeb6f30a72c138caba4c1f6093b7ab272417424c34: Status 404 returned error can't find the container with id 9e143d8bdc4a66e7c6dc9baeb6f30a72c138caba4c1f6093b7ab272417424c34 Mar 10 22:10:33 crc kubenswrapper[4919]: I0310 22:10:33.940107 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 10 22:10:33 crc kubenswrapper[4919]: I0310 22:10:33.946198 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-fbfnm"] Mar 10 22:10:33 crc kubenswrapper[4919]: W0310 22:10:33.966248 4919 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod783e3f3a_7a6f_4b95_a7d2_6988c8a6149b.slice/crio-3bc851905a66630d9ff057ffa31ad0069018b7e623a4e77511edc9d62258d0db WatchSource:0}: Error finding container 3bc851905a66630d9ff057ffa31ad0069018b7e623a4e77511edc9d62258d0db: Status 404 returned error can't find the container with id 3bc851905a66630d9ff057ffa31ad0069018b7e623a4e77511edc9d62258d0db Mar 10 22:10:34 crc kubenswrapper[4919]: I0310 22:10:34.021284 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 10 22:10:34 crc kubenswrapper[4919]: W0310 22:10:34.033377 4919 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9e69198f_c4ad_40c4_b0f4_1a6e9dd17940.slice/crio-3113a147d726107b80aa072cdf5b7d9c13082cb21a706f52b9eed76ba756222d WatchSource:0}: Error finding container 3113a147d726107b80aa072cdf5b7d9c13082cb21a706f52b9eed76ba756222d: Status 404 returned error can't find the container with id 3113a147d726107b80aa072cdf5b7d9c13082cb21a706f52b9eed76ba756222d Mar 10 22:10:34 crc kubenswrapper[4919]: I0310 22:10:34.171636 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 10 22:10:34 crc kubenswrapper[4919]: W0310 22:10:34.173851 4919 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8782e985_ff23_4580_bdfb_ef2dd9b540bc.slice/crio-499235ca0c8240a2e542faae80a04221ced8eed0d502e7f5ba10a70192104c54 WatchSource:0}: Error finding container 499235ca0c8240a2e542faae80a04221ced8eed0d502e7f5ba10a70192104c54: Status 404 returned error can't find the container with id 499235ca0c8240a2e542faae80a04221ced8eed0d502e7f5ba10a70192104c54 Mar 10 22:10:34 crc kubenswrapper[4919]: I0310 22:10:34.281475 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-5wz82"] Mar 10 22:10:34 crc kubenswrapper[4919]: I0310 22:10:34.323737 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"a4a88061-cba8-4535-bf01-5285d8cbb79f","Type":"ContainerStarted","Data":"f59a877853ec059b4132b0cedbe4db1e550e4486c4e86be55ab299173c3b9e43"} Mar 10 22:10:34 crc kubenswrapper[4919]: I0310 22:10:34.325658 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"3fe05756-9202-4514-8eea-0c786a2b6d56","Type":"ContainerStarted","Data":"13e0fb5aa9484b55f86cc7cbbfe20d7f1e22a61a4af9e0dc5e34c9452254e23c"} Mar 10 22:10:34 crc kubenswrapper[4919]: I0310 22:10:34.327513 4919 generic.go:334] "Generic (PLEG): container finished" 
podID="4a25dadc-79b5-4535-a9b2-92a9b184119c" containerID="eef2bfb96f5001b5a4f0a48dc8ac5642b4cf3ba5ea4c38597d2216a8321e012b" exitCode=0 Mar 10 22:10:34 crc kubenswrapper[4919]: I0310 22:10:34.327558 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c47bcb9f9-jmrlb" event={"ID":"4a25dadc-79b5-4535-a9b2-92a9b184119c","Type":"ContainerDied","Data":"eef2bfb96f5001b5a4f0a48dc8ac5642b4cf3ba5ea4c38597d2216a8321e012b"} Mar 10 22:10:34 crc kubenswrapper[4919]: I0310 22:10:34.331162 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"28cf0af6-9a5f-445a-98fc-2251bcd48109","Type":"ContainerStarted","Data":"9e143d8bdc4a66e7c6dc9baeb6f30a72c138caba4c1f6093b7ab272417424c34"} Mar 10 22:10:34 crc kubenswrapper[4919]: I0310 22:10:34.334137 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"fa3e6892-7a97-4563-b339-6c3acfd36dd3","Type":"ContainerStarted","Data":"42e225731776032e5374a48a26b6d33c41f6b89f012d9128261ba5561d589429"} Mar 10 22:10:34 crc kubenswrapper[4919]: I0310 22:10:34.336940 4919 generic.go:334] "Generic (PLEG): container finished" podID="b3e7e9b6-1fd1-4fff-a8a2-2991997c8e91" containerID="bade2a22652b6d49a5de1198cfde0f6d2e41dde87bd9f776ec58431dca5b16ae" exitCode=0 Mar 10 22:10:34 crc kubenswrapper[4919]: I0310 22:10:34.337025 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78cb4465c9-9mcfd" event={"ID":"b3e7e9b6-1fd1-4fff-a8a2-2991997c8e91","Type":"ContainerDied","Data":"bade2a22652b6d49a5de1198cfde0f6d2e41dde87bd9f776ec58431dca5b16ae"} Mar 10 22:10:34 crc kubenswrapper[4919]: I0310 22:10:34.341001 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"9e69198f-c4ad-40c4-b0f4-1a6e9dd17940","Type":"ContainerStarted","Data":"3113a147d726107b80aa072cdf5b7d9c13082cb21a706f52b9eed76ba756222d"} Mar 10 22:10:34 crc kubenswrapper[4919]: I0310 22:10:34.351361 4919 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86bbd886cf-4hc2l" event={"ID":"b139c9e1-5c8d-4bfa-a222-c1538328e8c9","Type":"ContainerDied","Data":"bec738b8a2d49fa787c5e8121c5f4e4b2d0a225c469458762d7945aad671cd87"} Mar 10 22:10:34 crc kubenswrapper[4919]: I0310 22:10:34.351448 4919 generic.go:334] "Generic (PLEG): container finished" podID="b139c9e1-5c8d-4bfa-a222-c1538328e8c9" containerID="bec738b8a2d49fa787c5e8121c5f4e4b2d0a225c469458762d7945aad671cd87" exitCode=0 Mar 10 22:10:34 crc kubenswrapper[4919]: I0310 22:10:34.354029 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-5wz82" event={"ID":"a525725f-407a-4e99-96a1-a0eaba714487","Type":"ContainerStarted","Data":"8fac25358721d7097b2809907958892a7fbbaccefea7a7762f3c56886df1d01d"} Mar 10 22:10:34 crc kubenswrapper[4919]: I0310 22:10:34.355314 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-fbfnm" event={"ID":"783e3f3a-7a6f-4b95-a7d2-6988c8a6149b","Type":"ContainerStarted","Data":"3bc851905a66630d9ff057ffa31ad0069018b7e623a4e77511edc9d62258d0db"} Mar 10 22:10:34 crc kubenswrapper[4919]: I0310 22:10:34.356510 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"37ef9179-69db-49ab-a4e6-2e2b815fc260","Type":"ContainerStarted","Data":"5526a0e2dbd7db7dba05ceea6c12956ec629b63fa1bcbee8601c93257949fe8b"} Mar 10 22:10:34 crc kubenswrapper[4919]: I0310 22:10:34.357681 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"9372011b-416f-484d-a873-fdda67baf9fe","Type":"ContainerStarted","Data":"a81aced24f8ba96209b6fd098a3080c41e122cc2baa8d4abc6cd37a16a08dc96"} Mar 10 22:10:34 crc kubenswrapper[4919]: I0310 22:10:34.361912 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" 
event={"ID":"8782e985-ff23-4580-bdfb-ef2dd9b540bc","Type":"ContainerStarted","Data":"499235ca0c8240a2e542faae80a04221ced8eed0d502e7f5ba10a70192104c54"} Mar 10 22:10:34 crc kubenswrapper[4919]: I0310 22:10:34.386552 4919 generic.go:334] "Generic (PLEG): container finished" podID="a3e8d845-c249-45ce-9182-3eca0f0e98f6" containerID="ca0c59595e37857cc76b5edb377f2545e6b937a24d7c26d051bd309ead8861d8" exitCode=0 Mar 10 22:10:34 crc kubenswrapper[4919]: I0310 22:10:34.386604 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-589db6c89c-tbhrn" event={"ID":"a3e8d845-c249-45ce-9182-3eca0f0e98f6","Type":"ContainerDied","Data":"ca0c59595e37857cc76b5edb377f2545e6b937a24d7c26d051bd309ead8861d8"} Mar 10 22:10:34 crc kubenswrapper[4919]: E0310 22:10:34.601765 4919 log.go:32] "CreateContainer in sandbox from runtime service failed" err=< Mar 10 22:10:34 crc kubenswrapper[4919]: rpc error: code = Unknown desc = container create failed: mount `/var/lib/kubelet/pods/b3e7e9b6-1fd1-4fff-a8a2-2991997c8e91/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Mar 10 22:10:34 crc kubenswrapper[4919]: > podSandboxID="56c0efb538fc45caec58c062843e8358cb5f0f5e822a31f9130e7aa0a92df118" Mar 10 22:10:34 crc kubenswrapper[4919]: E0310 22:10:34.602291 4919 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 10 22:10:34 crc kubenswrapper[4919]: container &Container{Name:dnsmasq-dns,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:944833b50342d462c10637342bc85197a8cf099a3650df12e23854dde99af514,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv 
--log-queries],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nfdh5dfhb6h64h676hc4h78h97h669h54chfbh696hb5h54bh5d4h6bh64h644h677h584h5cbh698h9dh5bbh5f8h5b8hcdh644h5c7h694hbfh589q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cjlgk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 
},Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78cb4465c9-9mcfd_openstack(b3e7e9b6-1fd1-4fff-a8a2-2991997c8e91): CreateContainerError: container create failed: mount `/var/lib/kubelet/pods/b3e7e9b6-1fd1-4fff-a8a2-2991997c8e91/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Mar 10 22:10:34 crc kubenswrapper[4919]: > logger="UnhandledError" Mar 10 22:10:34 crc kubenswrapper[4919]: E0310 22:10:34.603667 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dnsmasq-dns\" with CreateContainerError: \"container create failed: mount `/var/lib/kubelet/pods/b3e7e9b6-1fd1-4fff-a8a2-2991997c8e91/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory\\n\"" pod="openstack/dnsmasq-dns-78cb4465c9-9mcfd" podUID="b3e7e9b6-1fd1-4fff-a8a2-2991997c8e91" Mar 10 22:10:35 crc kubenswrapper[4919]: I0310 22:10:35.172948 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86bbd886cf-4hc2l" Mar 10 22:10:35 crc kubenswrapper[4919]: I0310 22:10:35.189332 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-589db6c89c-tbhrn" Mar 10 22:10:35 crc kubenswrapper[4919]: I0310 22:10:35.246869 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b139c9e1-5c8d-4bfa-a222-c1538328e8c9-dns-svc\") pod \"b139c9e1-5c8d-4bfa-a222-c1538328e8c9\" (UID: \"b139c9e1-5c8d-4bfa-a222-c1538328e8c9\") " Mar 10 22:10:35 crc kubenswrapper[4919]: I0310 22:10:35.246998 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b139c9e1-5c8d-4bfa-a222-c1538328e8c9-config\") pod \"b139c9e1-5c8d-4bfa-a222-c1538328e8c9\" (UID: \"b139c9e1-5c8d-4bfa-a222-c1538328e8c9\") " Mar 10 22:10:35 crc kubenswrapper[4919]: I0310 22:10:35.247044 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3e8d845-c249-45ce-9182-3eca0f0e98f6-config\") pod \"a3e8d845-c249-45ce-9182-3eca0f0e98f6\" (UID: \"a3e8d845-c249-45ce-9182-3eca0f0e98f6\") " Mar 10 22:10:35 crc kubenswrapper[4919]: I0310 22:10:35.247072 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5vkfb\" (UniqueName: \"kubernetes.io/projected/b139c9e1-5c8d-4bfa-a222-c1538328e8c9-kube-api-access-5vkfb\") pod \"b139c9e1-5c8d-4bfa-a222-c1538328e8c9\" (UID: \"b139c9e1-5c8d-4bfa-a222-c1538328e8c9\") " Mar 10 22:10:35 crc kubenswrapper[4919]: I0310 22:10:35.247159 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gxd89\" (UniqueName: \"kubernetes.io/projected/a3e8d845-c249-45ce-9182-3eca0f0e98f6-kube-api-access-gxd89\") pod \"a3e8d845-c249-45ce-9182-3eca0f0e98f6\" (UID: \"a3e8d845-c249-45ce-9182-3eca0f0e98f6\") " Mar 10 22:10:35 crc kubenswrapper[4919]: I0310 22:10:35.252598 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/a3e8d845-c249-45ce-9182-3eca0f0e98f6-kube-api-access-gxd89" (OuterVolumeSpecName: "kube-api-access-gxd89") pod "a3e8d845-c249-45ce-9182-3eca0f0e98f6" (UID: "a3e8d845-c249-45ce-9182-3eca0f0e98f6"). InnerVolumeSpecName "kube-api-access-gxd89". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 22:10:35 crc kubenswrapper[4919]: I0310 22:10:35.253483 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b139c9e1-5c8d-4bfa-a222-c1538328e8c9-kube-api-access-5vkfb" (OuterVolumeSpecName: "kube-api-access-5vkfb") pod "b139c9e1-5c8d-4bfa-a222-c1538328e8c9" (UID: "b139c9e1-5c8d-4bfa-a222-c1538328e8c9"). InnerVolumeSpecName "kube-api-access-5vkfb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 22:10:35 crc kubenswrapper[4919]: I0310 22:10:35.266261 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3e8d845-c249-45ce-9182-3eca0f0e98f6-config" (OuterVolumeSpecName: "config") pod "a3e8d845-c249-45ce-9182-3eca0f0e98f6" (UID: "a3e8d845-c249-45ce-9182-3eca0f0e98f6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 22:10:35 crc kubenswrapper[4919]: I0310 22:10:35.267311 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b139c9e1-5c8d-4bfa-a222-c1538328e8c9-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b139c9e1-5c8d-4bfa-a222-c1538328e8c9" (UID: "b139c9e1-5c8d-4bfa-a222-c1538328e8c9"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 22:10:35 crc kubenswrapper[4919]: I0310 22:10:35.268449 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b139c9e1-5c8d-4bfa-a222-c1538328e8c9-config" (OuterVolumeSpecName: "config") pod "b139c9e1-5c8d-4bfa-a222-c1538328e8c9" (UID: "b139c9e1-5c8d-4bfa-a222-c1538328e8c9"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 22:10:35 crc kubenswrapper[4919]: I0310 22:10:35.349482 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gxd89\" (UniqueName: \"kubernetes.io/projected/a3e8d845-c249-45ce-9182-3eca0f0e98f6-kube-api-access-gxd89\") on node \"crc\" DevicePath \"\""
Mar 10 22:10:35 crc kubenswrapper[4919]: I0310 22:10:35.349518 4919 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b139c9e1-5c8d-4bfa-a222-c1538328e8c9-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 10 22:10:35 crc kubenswrapper[4919]: I0310 22:10:35.349530 4919 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b139c9e1-5c8d-4bfa-a222-c1538328e8c9-config\") on node \"crc\" DevicePath \"\""
Mar 10 22:10:35 crc kubenswrapper[4919]: I0310 22:10:35.349541 4919 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3e8d845-c249-45ce-9182-3eca0f0e98f6-config\") on node \"crc\" DevicePath \"\""
Mar 10 22:10:35 crc kubenswrapper[4919]: I0310 22:10:35.349553 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5vkfb\" (UniqueName: \"kubernetes.io/projected/b139c9e1-5c8d-4bfa-a222-c1538328e8c9-kube-api-access-5vkfb\") on node \"crc\" DevicePath \"\""
Mar 10 22:10:35 crc kubenswrapper[4919]: I0310 22:10:35.405025 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-589db6c89c-tbhrn" event={"ID":"a3e8d845-c249-45ce-9182-3eca0f0e98f6","Type":"ContainerDied","Data":"cc5993ee0e3ed564c557fb69f3eaf0e0d97c45f6a4f999adb5073b021b1e65f2"}
Mar 10 22:10:35 crc kubenswrapper[4919]: I0310 22:10:35.405068 4919 scope.go:117] "RemoveContainer" containerID="ca0c59595e37857cc76b5edb377f2545e6b937a24d7c26d051bd309ead8861d8"
Mar 10 22:10:35 crc kubenswrapper[4919]: I0310 22:10:35.405173 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-589db6c89c-tbhrn"
Mar 10 22:10:35 crc kubenswrapper[4919]: I0310 22:10:35.412877 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86bbd886cf-4hc2l"
Mar 10 22:10:35 crc kubenswrapper[4919]: I0310 22:10:35.413217 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86bbd886cf-4hc2l" event={"ID":"b139c9e1-5c8d-4bfa-a222-c1538328e8c9","Type":"ContainerDied","Data":"0a3b99714a84830a629dd46381de7e487361a88601a2ef8aea9b68d7a4ecbe7d"}
Mar 10 22:10:35 crc kubenswrapper[4919]: I0310 22:10:35.418761 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c47bcb9f9-jmrlb" event={"ID":"4a25dadc-79b5-4535-a9b2-92a9b184119c","Type":"ContainerStarted","Data":"f8d6edf939a388528ddce5c41e055ae0f34219e27be8043c3bf3739f6e3bf5ea"}
Mar 10 22:10:35 crc kubenswrapper[4919]: I0310 22:10:35.419295 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7c47bcb9f9-jmrlb"
Mar 10 22:10:35 crc kubenswrapper[4919]: I0310 22:10:35.443296 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7c47bcb9f9-jmrlb" podStartSLOduration=3.475998349 podStartE2EDuration="16.443278572s" podCreationTimestamp="2026-03-10 22:10:19 +0000 UTC" firstStartedPulling="2026-03-10 22:10:20.302882641 +0000 UTC m=+1207.544763259" lastFinishedPulling="2026-03-10 22:10:33.270162874 +0000 UTC m=+1220.512043482" observedRunningTime="2026-03-10 22:10:35.436908589 +0000 UTC m=+1222.678789197" watchObservedRunningTime="2026-03-10 22:10:35.443278572 +0000 UTC m=+1222.685159180"
Mar 10 22:10:35 crc kubenswrapper[4919]: I0310 22:10:35.515847 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86bbd886cf-4hc2l"]
Mar 10 22:10:35 crc kubenswrapper[4919]: I0310 22:10:35.536742 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86bbd886cf-4hc2l"]
Mar 10 22:10:35 crc kubenswrapper[4919]: I0310 22:10:35.552705 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-589db6c89c-tbhrn"]
Mar 10 22:10:35 crc kubenswrapper[4919]: I0310 22:10:35.586339 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-589db6c89c-tbhrn"]
Mar 10 22:10:37 crc kubenswrapper[4919]: I0310 22:10:37.493225 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3e8d845-c249-45ce-9182-3eca0f0e98f6" path="/var/lib/kubelet/pods/a3e8d845-c249-45ce-9182-3eca0f0e98f6/volumes"
Mar 10 22:10:37 crc kubenswrapper[4919]: I0310 22:10:37.494076 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b139c9e1-5c8d-4bfa-a222-c1538328e8c9" path="/var/lib/kubelet/pods/b139c9e1-5c8d-4bfa-a222-c1538328e8c9/volumes"
Mar 10 22:10:37 crc kubenswrapper[4919]: I0310 22:10:37.721820 4919 scope.go:117] "RemoveContainer" containerID="bec738b8a2d49fa787c5e8121c5f4e4b2d0a225c469458762d7945aad671cd87"
Mar 10 22:10:39 crc kubenswrapper[4919]: I0310 22:10:39.814578 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7c47bcb9f9-jmrlb"
Mar 10 22:10:39 crc kubenswrapper[4919]: I0310 22:10:39.884063 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78cb4465c9-9mcfd"]
Mar 10 22:10:44 crc kubenswrapper[4919]: I0310 22:10:44.483201 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78cb4465c9-9mcfd" event={"ID":"b3e7e9b6-1fd1-4fff-a8a2-2991997c8e91","Type":"ContainerStarted","Data":"9b9fdd2754f19c4d1279c77600be9b16074de7c9a36e67b013fc30a94f349743"}
Mar 10 22:10:44 crc kubenswrapper[4919]: I0310 22:10:44.483738 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-78cb4465c9-9mcfd"
Mar 10 22:10:44 crc kubenswrapper[4919]: I0310 22:10:44.483318 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-78cb4465c9-9mcfd" podUID="b3e7e9b6-1fd1-4fff-a8a2-2991997c8e91" containerName="dnsmasq-dns" containerID="cri-o://9b9fdd2754f19c4d1279c77600be9b16074de7c9a36e67b013fc30a94f349743" gracePeriod=10
Mar 10 22:10:44 crc kubenswrapper[4919]: I0310 22:10:44.919508 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78cb4465c9-9mcfd"
Mar 10 22:10:45 crc kubenswrapper[4919]: I0310 22:10:45.090387 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b3e7e9b6-1fd1-4fff-a8a2-2991997c8e91-config\") pod \"b3e7e9b6-1fd1-4fff-a8a2-2991997c8e91\" (UID: \"b3e7e9b6-1fd1-4fff-a8a2-2991997c8e91\") "
Mar 10 22:10:45 crc kubenswrapper[4919]: I0310 22:10:45.090749 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b3e7e9b6-1fd1-4fff-a8a2-2991997c8e91-dns-svc\") pod \"b3e7e9b6-1fd1-4fff-a8a2-2991997c8e91\" (UID: \"b3e7e9b6-1fd1-4fff-a8a2-2991997c8e91\") "
Mar 10 22:10:45 crc kubenswrapper[4919]: I0310 22:10:45.090792 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cjlgk\" (UniqueName: \"kubernetes.io/projected/b3e7e9b6-1fd1-4fff-a8a2-2991997c8e91-kube-api-access-cjlgk\") pod \"b3e7e9b6-1fd1-4fff-a8a2-2991997c8e91\" (UID: \"b3e7e9b6-1fd1-4fff-a8a2-2991997c8e91\") "
Mar 10 22:10:45 crc kubenswrapper[4919]: I0310 22:10:45.181650 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b3e7e9b6-1fd1-4fff-a8a2-2991997c8e91-kube-api-access-cjlgk" (OuterVolumeSpecName: "kube-api-access-cjlgk") pod "b3e7e9b6-1fd1-4fff-a8a2-2991997c8e91" (UID: "b3e7e9b6-1fd1-4fff-a8a2-2991997c8e91"). InnerVolumeSpecName "kube-api-access-cjlgk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 22:10:45 crc kubenswrapper[4919]: I0310 22:10:45.192010 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cjlgk\" (UniqueName: \"kubernetes.io/projected/b3e7e9b6-1fd1-4fff-a8a2-2991997c8e91-kube-api-access-cjlgk\") on node \"crc\" DevicePath \"\""
Mar 10 22:10:45 crc kubenswrapper[4919]: I0310 22:10:45.231208 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b3e7e9b6-1fd1-4fff-a8a2-2991997c8e91-config" (OuterVolumeSpecName: "config") pod "b3e7e9b6-1fd1-4fff-a8a2-2991997c8e91" (UID: "b3e7e9b6-1fd1-4fff-a8a2-2991997c8e91"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 22:10:45 crc kubenswrapper[4919]: I0310 22:10:45.294198 4919 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b3e7e9b6-1fd1-4fff-a8a2-2991997c8e91-config\") on node \"crc\" DevicePath \"\""
Mar 10 22:10:45 crc kubenswrapper[4919]: I0310 22:10:45.416185 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b3e7e9b6-1fd1-4fff-a8a2-2991997c8e91-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b3e7e9b6-1fd1-4fff-a8a2-2991997c8e91" (UID: "b3e7e9b6-1fd1-4fff-a8a2-2991997c8e91"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 22:10:45 crc kubenswrapper[4919]: I0310 22:10:45.496369 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"8782e985-ff23-4580-bdfb-ef2dd9b540bc","Type":"ContainerStarted","Data":"27c978f4a20203cc94acb177ddbfd66f76de9f4ddcde70ca6c04fb027e8101f9"}
Mar 10 22:10:45 crc kubenswrapper[4919]: I0310 22:10:45.497277 4919 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b3e7e9b6-1fd1-4fff-a8a2-2991997c8e91-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 10 22:10:45 crc kubenswrapper[4919]: I0310 22:10:45.500078 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"9372011b-416f-484d-a873-fdda67baf9fe","Type":"ContainerStarted","Data":"9ceea61b15efc0e45f1d8d250401b28f29cdb09cb74682cd8386f87edef5f74c"}
Mar 10 22:10:45 crc kubenswrapper[4919]: I0310 22:10:45.502820 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"a4a88061-cba8-4535-bf01-5285d8cbb79f","Type":"ContainerStarted","Data":"fb28d2d8eb9c98873f08b6f1830499a051d20df33a610f4a9e6624fa224475b0"}
Mar 10 22:10:45 crc kubenswrapper[4919]: I0310 22:10:45.503011 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0"
Mar 10 22:10:45 crc kubenswrapper[4919]: I0310 22:10:45.504776 4919 generic.go:334] "Generic (PLEG): container finished" podID="b3e7e9b6-1fd1-4fff-a8a2-2991997c8e91" containerID="9b9fdd2754f19c4d1279c77600be9b16074de7c9a36e67b013fc30a94f349743" exitCode=0
Mar 10 22:10:45 crc kubenswrapper[4919]: I0310 22:10:45.504835 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78cb4465c9-9mcfd"
Mar 10 22:10:45 crc kubenswrapper[4919]: I0310 22:10:45.504843 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78cb4465c9-9mcfd" event={"ID":"b3e7e9b6-1fd1-4fff-a8a2-2991997c8e91","Type":"ContainerDied","Data":"9b9fdd2754f19c4d1279c77600be9b16074de7c9a36e67b013fc30a94f349743"}
Mar 10 22:10:45 crc kubenswrapper[4919]: I0310 22:10:45.504866 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78cb4465c9-9mcfd" event={"ID":"b3e7e9b6-1fd1-4fff-a8a2-2991997c8e91","Type":"ContainerDied","Data":"56c0efb538fc45caec58c062843e8358cb5f0f5e822a31f9130e7aa0a92df118"}
Mar 10 22:10:45 crc kubenswrapper[4919]: I0310 22:10:45.504887 4919 scope.go:117] "RemoveContainer" containerID="9b9fdd2754f19c4d1279c77600be9b16074de7c9a36e67b013fc30a94f349743"
Mar 10 22:10:45 crc kubenswrapper[4919]: I0310 22:10:45.506886 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-fbfnm" event={"ID":"783e3f3a-7a6f-4b95-a7d2-6988c8a6149b","Type":"ContainerStarted","Data":"f38ac54b5abf8ebe29460d44b16de61bf12705b9f6a4ac5d48ab1694b6482b7e"}
Mar 10 22:10:45 crc kubenswrapper[4919]: I0310 22:10:45.507083 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-fbfnm"
Mar 10 22:10:45 crc kubenswrapper[4919]: I0310 22:10:45.509294 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"37ef9179-69db-49ab-a4e6-2e2b815fc260","Type":"ContainerStarted","Data":"8c17bba3a2afe23e23c08f322962c8fa5f82fede04d8db1307f9c7ffd15139b9"}
Mar 10 22:10:45 crc kubenswrapper[4919]: I0310 22:10:45.539808 4919 scope.go:117] "RemoveContainer" containerID="bade2a22652b6d49a5de1198cfde0f6d2e41dde87bd9f776ec58431dca5b16ae"
Mar 10 22:10:45 crc kubenswrapper[4919]: I0310 22:10:45.568591 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=13.088169074 podStartE2EDuration="22.568572593s" podCreationTimestamp="2026-03-10 22:10:23 +0000 UTC" firstStartedPulling="2026-03-10 22:10:33.918070469 +0000 UTC m=+1221.159951077" lastFinishedPulling="2026-03-10 22:10:43.398473988 +0000 UTC m=+1230.640354596" observedRunningTime="2026-03-10 22:10:45.562046627 +0000 UTC m=+1232.803927235" watchObservedRunningTime="2026-03-10 22:10:45.568572593 +0000 UTC m=+1232.810453201"
Mar 10 22:10:45 crc kubenswrapper[4919]: I0310 22:10:45.582515 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-fbfnm" podStartSLOduration=6.574625979 podStartE2EDuration="16.58249965s" podCreationTimestamp="2026-03-10 22:10:29 +0000 UTC" firstStartedPulling="2026-03-10 22:10:33.969724027 +0000 UTC m=+1221.211604625" lastFinishedPulling="2026-03-10 22:10:43.977597688 +0000 UTC m=+1231.219478296" observedRunningTime="2026-03-10 22:10:45.578121222 +0000 UTC m=+1232.820001830" watchObservedRunningTime="2026-03-10 22:10:45.58249965 +0000 UTC m=+1232.824380248"
Mar 10 22:10:45 crc kubenswrapper[4919]: I0310 22:10:45.599892 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78cb4465c9-9mcfd"]
Mar 10 22:10:45 crc kubenswrapper[4919]: I0310 22:10:45.602710 4919 scope.go:117] "RemoveContainer" containerID="9b9fdd2754f19c4d1279c77600be9b16074de7c9a36e67b013fc30a94f349743"
Mar 10 22:10:45 crc kubenswrapper[4919]: E0310 22:10:45.606061 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b9fdd2754f19c4d1279c77600be9b16074de7c9a36e67b013fc30a94f349743\": container with ID starting with 9b9fdd2754f19c4d1279c77600be9b16074de7c9a36e67b013fc30a94f349743 not found: ID does not exist" containerID="9b9fdd2754f19c4d1279c77600be9b16074de7c9a36e67b013fc30a94f349743"
Mar 10 22:10:45 crc kubenswrapper[4919]: I0310 22:10:45.606104 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b9fdd2754f19c4d1279c77600be9b16074de7c9a36e67b013fc30a94f349743"} err="failed to get container status \"9b9fdd2754f19c4d1279c77600be9b16074de7c9a36e67b013fc30a94f349743\": rpc error: code = NotFound desc = could not find container \"9b9fdd2754f19c4d1279c77600be9b16074de7c9a36e67b013fc30a94f349743\": container with ID starting with 9b9fdd2754f19c4d1279c77600be9b16074de7c9a36e67b013fc30a94f349743 not found: ID does not exist"
Mar 10 22:10:45 crc kubenswrapper[4919]: I0310 22:10:45.606162 4919 scope.go:117] "RemoveContainer" containerID="bade2a22652b6d49a5de1198cfde0f6d2e41dde87bd9f776ec58431dca5b16ae"
Mar 10 22:10:45 crc kubenswrapper[4919]: E0310 22:10:45.607897 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bade2a22652b6d49a5de1198cfde0f6d2e41dde87bd9f776ec58431dca5b16ae\": container with ID starting with bade2a22652b6d49a5de1198cfde0f6d2e41dde87bd9f776ec58431dca5b16ae not found: ID does not exist" containerID="bade2a22652b6d49a5de1198cfde0f6d2e41dde87bd9f776ec58431dca5b16ae"
Mar 10 22:10:45 crc kubenswrapper[4919]: I0310 22:10:45.607924 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bade2a22652b6d49a5de1198cfde0f6d2e41dde87bd9f776ec58431dca5b16ae"} err="failed to get container status \"bade2a22652b6d49a5de1198cfde0f6d2e41dde87bd9f776ec58431dca5b16ae\": rpc error: code = NotFound desc = could not find container \"bade2a22652b6d49a5de1198cfde0f6d2e41dde87bd9f776ec58431dca5b16ae\": container with ID starting with bade2a22652b6d49a5de1198cfde0f6d2e41dde87bd9f776ec58431dca5b16ae not found: ID does not exist"
Mar 10 22:10:45 crc kubenswrapper[4919]: I0310 22:10:45.608163 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78cb4465c9-9mcfd"]
Mar 10 22:10:46 crc kubenswrapper[4919]: I0310 22:10:46.518580 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"fa3e6892-7a97-4563-b339-6c3acfd36dd3","Type":"ContainerStarted","Data":"1ed5abf42f687ad1c4876f258add313618dad5f265e35efc4895ebc955fec9a3"}
Mar 10 22:10:46 crc kubenswrapper[4919]: I0310 22:10:46.524155 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"3fe05756-9202-4514-8eea-0c786a2b6d56","Type":"ContainerStarted","Data":"751af40a46c32202c740dcad6ce5d333888d6711ba4fa0cefd841e26d404db99"}
Mar 10 22:10:46 crc kubenswrapper[4919]: I0310 22:10:46.525925 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"9e69198f-c4ad-40c4-b0f4-1a6e9dd17940","Type":"ContainerStarted","Data":"7f67034bb3fc927d4123b73a239402611e74fab959e166ee008ced152b098c44"}
Mar 10 22:10:46 crc kubenswrapper[4919]: I0310 22:10:46.527918 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"28cf0af6-9a5f-445a-98fc-2251bcd48109","Type":"ContainerStarted","Data":"10a8e5e7617ef25582ab8c22f76dbeffa74f59e87e46ef9e71ef4b2667694b2c"}
Mar 10 22:10:46 crc kubenswrapper[4919]: I0310 22:10:46.528348 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0"
Mar 10 22:10:46 crc kubenswrapper[4919]: I0310 22:10:46.530129 4919 generic.go:334] "Generic (PLEG): container finished" podID="a525725f-407a-4e99-96a1-a0eaba714487" containerID="d7ed0dd7cc3ffd81be36edf66e7963bea6d12e052bbb3d488461d7172dab3b1a" exitCode=0
Mar 10 22:10:46 crc kubenswrapper[4919]: I0310 22:10:46.531680 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-5wz82" event={"ID":"a525725f-407a-4e99-96a1-a0eaba714487","Type":"ContainerDied","Data":"d7ed0dd7cc3ffd81be36edf66e7963bea6d12e052bbb3d488461d7172dab3b1a"}
Mar 10 22:10:46 crc kubenswrapper[4919]: I0310 22:10:46.569664 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=10.999159733 podStartE2EDuration="21.569642806s" podCreationTimestamp="2026-03-10 22:10:25 +0000 UTC" firstStartedPulling="2026-03-10 22:10:33.940747653 +0000 UTC m=+1221.182628261" lastFinishedPulling="2026-03-10 22:10:44.511230726 +0000 UTC m=+1231.753111334" observedRunningTime="2026-03-10 22:10:46.567680623 +0000 UTC m=+1233.809561241" watchObservedRunningTime="2026-03-10 22:10:46.569642806 +0000 UTC m=+1233.811523414"
Mar 10 22:10:47 crc kubenswrapper[4919]: I0310 22:10:47.490289 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b3e7e9b6-1fd1-4fff-a8a2-2991997c8e91" path="/var/lib/kubelet/pods/b3e7e9b6-1fd1-4fff-a8a2-2991997c8e91/volumes"
Mar 10 22:10:47 crc kubenswrapper[4919]: I0310 22:10:47.540436 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-5wz82" event={"ID":"a525725f-407a-4e99-96a1-a0eaba714487","Type":"ContainerStarted","Data":"85e85aa8a7e78a2f2c6fc7044ebf9c1dcb554abd0c952c366102b9a1f2fa0880"}
Mar 10 22:10:49 crc kubenswrapper[4919]: I0310 22:10:49.555777 4919 generic.go:334] "Generic (PLEG): container finished" podID="37ef9179-69db-49ab-a4e6-2e2b815fc260" containerID="8c17bba3a2afe23e23c08f322962c8fa5f82fede04d8db1307f9c7ffd15139b9" exitCode=0
Mar 10 22:10:49 crc kubenswrapper[4919]: I0310 22:10:49.555870 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"37ef9179-69db-49ab-a4e6-2e2b815fc260","Type":"ContainerDied","Data":"8c17bba3a2afe23e23c08f322962c8fa5f82fede04d8db1307f9c7ffd15139b9"}
Mar 10 22:10:49 crc kubenswrapper[4919]: I0310 22:10:49.558835 4919 generic.go:334] "Generic (PLEG): container finished" podID="9372011b-416f-484d-a873-fdda67baf9fe" containerID="9ceea61b15efc0e45f1d8d250401b28f29cdb09cb74682cd8386f87edef5f74c" exitCode=0
Mar 10 22:10:49 crc kubenswrapper[4919]: I0310 22:10:49.558909 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"9372011b-416f-484d-a873-fdda67baf9fe","Type":"ContainerDied","Data":"9ceea61b15efc0e45f1d8d250401b28f29cdb09cb74682cd8386f87edef5f74c"}
Mar 10 22:10:49 crc kubenswrapper[4919]: I0310 22:10:49.562709 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-5wz82" event={"ID":"a525725f-407a-4e99-96a1-a0eaba714487","Type":"ContainerStarted","Data":"e7e7fcd09cc0c969ac9f3c21aebb85f2b23a2c01eb8ef776f788577ffa3c96d5"}
Mar 10 22:10:50 crc kubenswrapper[4919]: I0310 22:10:50.571374 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-5wz82"
Mar 10 22:10:50 crc kubenswrapper[4919]: I0310 22:10:50.602225 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-5wz82" podStartSLOduration=10.907058386 podStartE2EDuration="20.602205693s" podCreationTimestamp="2026-03-10 22:10:30 +0000 UTC" firstStartedPulling="2026-03-10 22:10:34.293411368 +0000 UTC m=+1221.535291986" lastFinishedPulling="2026-03-10 22:10:43.988558685 +0000 UTC m=+1231.230439293" observedRunningTime="2026-03-10 22:10:50.596412726 +0000 UTC m=+1237.838293354" watchObservedRunningTime="2026-03-10 22:10:50.602205693 +0000 UTC m=+1237.844086321"
Mar 10 22:10:51 crc kubenswrapper[4919]: I0310 22:10:51.579207 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-5wz82"
Mar 10 22:10:54 crc kubenswrapper[4919]: I0310 22:10:53.828650 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0"
Mar 10 22:10:56 crc kubenswrapper[4919]: I0310 22:10:56.388938 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0"
Mar 10 22:10:56 crc kubenswrapper[4919]: I0310 22:10:56.531007 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7df9cc56b7-zv8nm"]
Mar 10 22:10:56 crc kubenswrapper[4919]: E0310 22:10:56.531407 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b139c9e1-5c8d-4bfa-a222-c1538328e8c9" containerName="init"
Mar 10 22:10:56 crc kubenswrapper[4919]: I0310 22:10:56.531429 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="b139c9e1-5c8d-4bfa-a222-c1538328e8c9" containerName="init"
Mar 10 22:10:56 crc kubenswrapper[4919]: E0310 22:10:56.531456 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3e7e9b6-1fd1-4fff-a8a2-2991997c8e91" containerName="dnsmasq-dns"
Mar 10 22:10:56 crc kubenswrapper[4919]: I0310 22:10:56.531464 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3e7e9b6-1fd1-4fff-a8a2-2991997c8e91" containerName="dnsmasq-dns"
Mar 10 22:10:56 crc kubenswrapper[4919]: E0310 22:10:56.531483 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3e8d845-c249-45ce-9182-3eca0f0e98f6" containerName="init"
Mar 10 22:10:56 crc kubenswrapper[4919]: I0310 22:10:56.531489 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3e8d845-c249-45ce-9182-3eca0f0e98f6" containerName="init"
Mar 10 22:10:56 crc kubenswrapper[4919]: E0310 22:10:56.531507 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3e7e9b6-1fd1-4fff-a8a2-2991997c8e91" containerName="init"
Mar 10 22:10:56 crc kubenswrapper[4919]: I0310 22:10:56.531514 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3e7e9b6-1fd1-4fff-a8a2-2991997c8e91" containerName="init"
Mar 10 22:10:56 crc kubenswrapper[4919]: I0310 22:10:56.531644 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="b3e7e9b6-1fd1-4fff-a8a2-2991997c8e91" containerName="dnsmasq-dns"
Mar 10 22:10:56 crc kubenswrapper[4919]: I0310 22:10:56.531653 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="b139c9e1-5c8d-4bfa-a222-c1538328e8c9" containerName="init"
Mar 10 22:10:56 crc kubenswrapper[4919]: I0310 22:10:56.531671 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3e8d845-c249-45ce-9182-3eca0f0e98f6" containerName="init"
Mar 10 22:10:56 crc kubenswrapper[4919]: I0310 22:10:56.532644 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7df9cc56b7-zv8nm"
Mar 10 22:10:56 crc kubenswrapper[4919]: I0310 22:10:56.540202 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7df9cc56b7-zv8nm"]
Mar 10 22:10:56 crc kubenswrapper[4919]: I0310 22:10:56.548599 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2wwt2\" (UniqueName: \"kubernetes.io/projected/021a8737-7317-441c-835f-e9cdc78ad6c4-kube-api-access-2wwt2\") pod \"dnsmasq-dns-7df9cc56b7-zv8nm\" (UID: \"021a8737-7317-441c-835f-e9cdc78ad6c4\") " pod="openstack/dnsmasq-dns-7df9cc56b7-zv8nm"
Mar 10 22:10:56 crc kubenswrapper[4919]: I0310 22:10:56.548652 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/021a8737-7317-441c-835f-e9cdc78ad6c4-dns-svc\") pod \"dnsmasq-dns-7df9cc56b7-zv8nm\" (UID: \"021a8737-7317-441c-835f-e9cdc78ad6c4\") " pod="openstack/dnsmasq-dns-7df9cc56b7-zv8nm"
Mar 10 22:10:56 crc kubenswrapper[4919]: I0310 22:10:56.548690 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/021a8737-7317-441c-835f-e9cdc78ad6c4-config\") pod \"dnsmasq-dns-7df9cc56b7-zv8nm\" (UID: \"021a8737-7317-441c-835f-e9cdc78ad6c4\") " pod="openstack/dnsmasq-dns-7df9cc56b7-zv8nm"
Mar 10 22:10:56 crc kubenswrapper[4919]: I0310 22:10:56.649888 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2wwt2\" (UniqueName: \"kubernetes.io/projected/021a8737-7317-441c-835f-e9cdc78ad6c4-kube-api-access-2wwt2\") pod \"dnsmasq-dns-7df9cc56b7-zv8nm\" (UID: \"021a8737-7317-441c-835f-e9cdc78ad6c4\") " pod="openstack/dnsmasq-dns-7df9cc56b7-zv8nm"
Mar 10 22:10:56 crc kubenswrapper[4919]: I0310 22:10:56.649962 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/021a8737-7317-441c-835f-e9cdc78ad6c4-dns-svc\") pod \"dnsmasq-dns-7df9cc56b7-zv8nm\" (UID: \"021a8737-7317-441c-835f-e9cdc78ad6c4\") " pod="openstack/dnsmasq-dns-7df9cc56b7-zv8nm"
Mar 10 22:10:56 crc kubenswrapper[4919]: I0310 22:10:56.649994 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/021a8737-7317-441c-835f-e9cdc78ad6c4-config\") pod \"dnsmasq-dns-7df9cc56b7-zv8nm\" (UID: \"021a8737-7317-441c-835f-e9cdc78ad6c4\") " pod="openstack/dnsmasq-dns-7df9cc56b7-zv8nm"
Mar 10 22:10:56 crc kubenswrapper[4919]: I0310 22:10:56.650835 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/021a8737-7317-441c-835f-e9cdc78ad6c4-config\") pod \"dnsmasq-dns-7df9cc56b7-zv8nm\" (UID: \"021a8737-7317-441c-835f-e9cdc78ad6c4\") " pod="openstack/dnsmasq-dns-7df9cc56b7-zv8nm"
Mar 10 22:10:56 crc kubenswrapper[4919]: I0310 22:10:56.650847 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/021a8737-7317-441c-835f-e9cdc78ad6c4-dns-svc\") pod \"dnsmasq-dns-7df9cc56b7-zv8nm\" (UID: \"021a8737-7317-441c-835f-e9cdc78ad6c4\") " pod="openstack/dnsmasq-dns-7df9cc56b7-zv8nm"
Mar 10 22:10:56 crc kubenswrapper[4919]: I0310 22:10:56.667534 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2wwt2\" (UniqueName: \"kubernetes.io/projected/021a8737-7317-441c-835f-e9cdc78ad6c4-kube-api-access-2wwt2\") pod \"dnsmasq-dns-7df9cc56b7-zv8nm\" (UID: \"021a8737-7317-441c-835f-e9cdc78ad6c4\") " pod="openstack/dnsmasq-dns-7df9cc56b7-zv8nm"
Mar 10 22:10:56 crc kubenswrapper[4919]: I0310 22:10:56.853860 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7df9cc56b7-zv8nm"
Mar 10 22:10:57 crc kubenswrapper[4919]: E0310 22:10:57.317334 4919 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c"
Mar 10 22:10:57 crc kubenswrapper[4919]: E0310 22:10:57.317806 4919 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:openstack-network-exporter,Image:quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c,Command:[/app/openstack-network-exporter],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:OPENSTACK_NETWORK_EXPORTER_YAML,Value:/etc/config/openstack-network-exporter.yaml,ValueFrom:nil,},EnvVar{Name:CONFIG_HASH,Value:nd6h7bhb9h65h57fh559h57bhc5h577h67bh584h67bh5ffh665hc9hfdh557h8hch699h546hb4h5bch684h9fh587h67h557h86h5f9h5bdh5dq,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:ovsdb-rundir,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-certs-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ovnmetrics.crt,SubPath:tls.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-certs-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/private/ovnmetrics.key,SubPath:tls.key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-certs-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ovndbca.crt,SubPath:ca.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xzs87,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovsdbserver-nb-0_openstack(9e69198f-c4ad-40c4-b0f4-1a6e9dd17940): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Mar 10 22:10:57 crc kubenswrapper[4919]: E0310 22:10:57.318998 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"openstack-network-exporter\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ovsdbserver-nb-0" podUID="9e69198f-c4ad-40c4-b0f4-1a6e9dd17940"
Mar 10 22:10:57 crc kubenswrapper[4919]: I0310 22:10:57.768179 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7df9cc56b7-zv8nm"]
Mar 10 22:10:57 crc kubenswrapper[4919]: W0310 22:10:57.769983 4919 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod021a8737_7317_441c_835f_e9cdc78ad6c4.slice/crio-a8e31304a4028fc19fbac1eae7fae1eef5c1c788e8b71d0a61c916babc928ef8 WatchSource:0}: Error finding container a8e31304a4028fc19fbac1eae7fae1eef5c1c788e8b71d0a61c916babc928ef8: Status 404 returned error can't find the container with id a8e31304a4028fc19fbac1eae7fae1eef5c1c788e8b71d0a61c916babc928ef8
Mar 10 22:10:57 crc kubenswrapper[4919]: I0310 22:10:57.953639 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"]
Mar 10 22:10:57 crc kubenswrapper[4919]: I0310 22:10:57.984926 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0"
Mar 10 22:10:57 crc kubenswrapper[4919]: I0310 22:10:57.987175 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data"
Mar 10 22:10:57 crc kubenswrapper[4919]: I0310 22:10:57.988491 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-rcmv5"
Mar 10 22:10:57 crc kubenswrapper[4919]: I0310 22:10:57.988677 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files"
Mar 10 22:10:57 crc kubenswrapper[4919]: I0310 22:10:57.988808 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf"
Mar 10 22:10:57 crc kubenswrapper[4919]: I0310 22:10:57.994801 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"]
Mar 10 22:10:58 crc kubenswrapper[4919]: I0310 22:10:58.175321 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/91c8bbf6-8824-4e21-a491-86f2f657549a-lock\") pod \"swift-storage-0\" (UID: \"91c8bbf6-8824-4e21-a491-86f2f657549a\") " pod="openstack/swift-storage-0"
Mar 10 22:10:58 crc kubenswrapper[4919]: I0310 22:10:58.175373 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"swift-storage-0\" (UID: \"91c8bbf6-8824-4e21-a491-86f2f657549a\") " pod="openstack/swift-storage-0"
Mar 10 22:10:58 crc kubenswrapper[4919]: I0310 22:10:58.175563 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4lj4w\" (UniqueName: \"kubernetes.io/projected/91c8bbf6-8824-4e21-a491-86f2f657549a-kube-api-access-4lj4w\") pod \"swift-storage-0\" (UID: \"91c8bbf6-8824-4e21-a491-86f2f657549a\") " pod="openstack/swift-storage-0"
Mar 10 22:10:58 crc kubenswrapper[4919]: I0310 22:10:58.175595 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/91c8bbf6-8824-4e21-a491-86f2f657549a-cache\") pod \"swift-storage-0\" (UID: \"91c8bbf6-8824-4e21-a491-86f2f657549a\") " pod="openstack/swift-storage-0"
Mar 10 22:10:58 crc kubenswrapper[4919]: I0310 22:10:58.175674 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/91c8bbf6-8824-4e21-a491-86f2f657549a-etc-swift\") pod \"swift-storage-0\" (UID: \"91c8bbf6-8824-4e21-a491-86f2f657549a\") " pod="openstack/swift-storage-0"
Mar 10 22:10:58 crc kubenswrapper[4919]: I0310 22:10:58.175937 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91c8bbf6-8824-4e21-a491-86f2f657549a-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"91c8bbf6-8824-4e21-a491-86f2f657549a\") " pod="openstack/swift-storage-0"
Mar 10 22:10:58 crc kubenswrapper[4919]: I0310 22:10:58.277756 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4lj4w\" (UniqueName: \"kubernetes.io/projected/91c8bbf6-8824-4e21-a491-86f2f657549a-kube-api-access-4lj4w\") pod \"swift-storage-0\" (UID: \"91c8bbf6-8824-4e21-a491-86f2f657549a\") " pod="openstack/swift-storage-0"
Mar 10 22:10:58 crc kubenswrapper[4919]: I0310 22:10:58.277826 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/91c8bbf6-8824-4e21-a491-86f2f657549a-cache\") pod \"swift-storage-0\" (UID: \"91c8bbf6-8824-4e21-a491-86f2f657549a\") " pod="openstack/swift-storage-0"
Mar 10 22:10:58 crc kubenswrapper[4919]: I0310 22:10:58.277857 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/91c8bbf6-8824-4e21-a491-86f2f657549a-etc-swift\") pod \"swift-storage-0\" (UID: \"91c8bbf6-8824-4e21-a491-86f2f657549a\") " pod="openstack/swift-storage-0"
Mar 10 22:10:58 crc kubenswrapper[4919]: I0310 22:10:58.277941 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91c8bbf6-8824-4e21-a491-86f2f657549a-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"91c8bbf6-8824-4e21-a491-86f2f657549a\") " pod="openstack/swift-storage-0"
Mar 10 22:10:58 crc kubenswrapper[4919]: I0310 22:10:58.277977 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"swift-storage-0\" (UID: \"91c8bbf6-8824-4e21-a491-86f2f657549a\") " pod="openstack/swift-storage-0"
Mar 10 22:10:58 crc kubenswrapper[4919]: I0310 22:10:58.277998 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/91c8bbf6-8824-4e21-a491-86f2f657549a-lock\") pod \"swift-storage-0\" (UID: \"91c8bbf6-8824-4e21-a491-86f2f657549a\") " pod="openstack/swift-storage-0"
Mar 10 22:10:58 crc kubenswrapper[4919]: I0310 22:10:58.279468 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/91c8bbf6-8824-4e21-a491-86f2f657549a-cache\") pod \"swift-storage-0\" (UID: \"91c8bbf6-8824-4e21-a491-86f2f657549a\") " pod="openstack/swift-storage-0"
Mar 10 22:10:58 crc kubenswrapper[4919]: E0310 22:10:58.279672 4919 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Mar 10 22:10:58 crc kubenswrapper[4919]: E0310 22:10:58.279715 4919 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Mar 10 22:10:58 crc kubenswrapper[4919]: I0310 22:10:58.279773 4919 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"swift-storage-0\" (UID: \"91c8bbf6-8824-4e21-a491-86f2f657549a\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/swift-storage-0"
Mar 10 22:10:58 crc kubenswrapper[4919]: E0310 22:10:58.279801 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/91c8bbf6-8824-4e21-a491-86f2f657549a-etc-swift podName:91c8bbf6-8824-4e21-a491-86f2f657549a nodeName:}" failed. No retries permitted until 2026-03-10 22:10:58.779778977 +0000 UTC m=+1246.021659585 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/91c8bbf6-8824-4e21-a491-86f2f657549a-etc-swift") pod "swift-storage-0" (UID: "91c8bbf6-8824-4e21-a491-86f2f657549a") : configmap "swift-ring-files" not found Mar 10 22:10:58 crc kubenswrapper[4919]: I0310 22:10:58.280601 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/91c8bbf6-8824-4e21-a491-86f2f657549a-lock\") pod \"swift-storage-0\" (UID: \"91c8bbf6-8824-4e21-a491-86f2f657549a\") " pod="openstack/swift-storage-0" Mar 10 22:10:58 crc kubenswrapper[4919]: I0310 22:10:58.289275 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91c8bbf6-8824-4e21-a491-86f2f657549a-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"91c8bbf6-8824-4e21-a491-86f2f657549a\") " pod="openstack/swift-storage-0" Mar 10 22:10:58 crc kubenswrapper[4919]: I0310 22:10:58.299501 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4lj4w\" (UniqueName: \"kubernetes.io/projected/91c8bbf6-8824-4e21-a491-86f2f657549a-kube-api-access-4lj4w\") pod \"swift-storage-0\" (UID: \"91c8bbf6-8824-4e21-a491-86f2f657549a\") " pod="openstack/swift-storage-0" Mar 10 22:10:58 crc kubenswrapper[4919]: I0310 22:10:58.320617 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"swift-storage-0\" (UID: \"91c8bbf6-8824-4e21-a491-86f2f657549a\") " pod="openstack/swift-storage-0" Mar 10 22:10:58 crc kubenswrapper[4919]: I0310 22:10:58.372889 4919 generic.go:334] "Generic (PLEG): container finished" podID="021a8737-7317-441c-835f-e9cdc78ad6c4" containerID="4ca47abb5626cac1353b95fccc6a37d01a6f5ecaabb850ee8c399099faf42f23" exitCode=0 Mar 10 22:10:58 crc kubenswrapper[4919]: I0310 22:10:58.372944 4919 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/dnsmasq-dns-7df9cc56b7-zv8nm" event={"ID":"021a8737-7317-441c-835f-e9cdc78ad6c4","Type":"ContainerDied","Data":"4ca47abb5626cac1353b95fccc6a37d01a6f5ecaabb850ee8c399099faf42f23"} Mar 10 22:10:58 crc kubenswrapper[4919]: I0310 22:10:58.372969 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7df9cc56b7-zv8nm" event={"ID":"021a8737-7317-441c-835f-e9cdc78ad6c4","Type":"ContainerStarted","Data":"a8e31304a4028fc19fbac1eae7fae1eef5c1c788e8b71d0a61c916babc928ef8"} Mar 10 22:10:58 crc kubenswrapper[4919]: I0310 22:10:58.375614 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"37ef9179-69db-49ab-a4e6-2e2b815fc260","Type":"ContainerStarted","Data":"3a27cd1d1040a8d26e69883aaeb7f8132c3852f88812a287ec745156fd408f80"} Mar 10 22:10:58 crc kubenswrapper[4919]: I0310 22:10:58.377955 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"8782e985-ff23-4580-bdfb-ef2dd9b540bc","Type":"ContainerStarted","Data":"fa482cd6ab218644c93efa05d2493a56acc39a947142b9aa93cab897fbd8faee"} Mar 10 22:10:58 crc kubenswrapper[4919]: I0310 22:10:58.381103 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"9e69198f-c4ad-40c4-b0f4-1a6e9dd17940","Type":"ContainerStarted","Data":"1cbaf6bf606b116c1dcc6c8ceb022ba5bf13acdeef539d78bf1c08e4dba722aa"} Mar 10 22:10:58 crc kubenswrapper[4919]: I0310 22:10:58.383214 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"9372011b-416f-484d-a873-fdda67baf9fe","Type":"ContainerStarted","Data":"c4f0d5a04934f6107a3721bf5a429219c7956700786a5ddf3b089b5208e91ed4"} Mar 10 22:10:58 crc kubenswrapper[4919]: I0310 22:10:58.423335 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=26.283536381 podStartE2EDuration="36.423295495s" 
podCreationTimestamp="2026-03-10 22:10:22 +0000 UTC" firstStartedPulling="2026-03-10 22:10:33.924914553 +0000 UTC m=+1221.166795161" lastFinishedPulling="2026-03-10 22:10:44.064673667 +0000 UTC m=+1231.306554275" observedRunningTime="2026-03-10 22:10:58.418263239 +0000 UTC m=+1245.660143847" watchObservedRunningTime="2026-03-10 22:10:58.423295495 +0000 UTC m=+1245.665176133" Mar 10 22:10:58 crc kubenswrapper[4919]: I0310 22:10:58.428144 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Mar 10 22:10:58 crc kubenswrapper[4919]: I0310 22:10:58.443544 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=28.443168958 podStartE2EDuration="38.443523674s" podCreationTimestamp="2026-03-10 22:10:20 +0000 UTC" firstStartedPulling="2026-03-10 22:10:33.91850044 +0000 UTC m=+1221.160381048" lastFinishedPulling="2026-03-10 22:10:43.918855156 +0000 UTC m=+1231.160735764" observedRunningTime="2026-03-10 22:10:58.439123115 +0000 UTC m=+1245.681003723" watchObservedRunningTime="2026-03-10 22:10:58.443523674 +0000 UTC m=+1245.685404282" Mar 10 22:10:58 crc kubenswrapper[4919]: I0310 22:10:58.462889 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=3.29147927 podStartE2EDuration="26.462873768s" podCreationTimestamp="2026-03-10 22:10:32 +0000 UTC" firstStartedPulling="2026-03-10 22:10:34.175838663 +0000 UTC m=+1221.417719271" lastFinishedPulling="2026-03-10 22:10:57.347233161 +0000 UTC m=+1244.589113769" observedRunningTime="2026-03-10 22:10:58.457487042 +0000 UTC m=+1245.699367650" watchObservedRunningTime="2026-03-10 22:10:58.462873768 +0000 UTC m=+1245.704754376" Mar 10 22:10:58 crc kubenswrapper[4919]: I0310 22:10:58.483261 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=21.59905253 
podStartE2EDuration="31.483244819s" podCreationTimestamp="2026-03-10 22:10:27 +0000 UTC" firstStartedPulling="2026-03-10 22:10:34.034680868 +0000 UTC m=+1221.276561476" lastFinishedPulling="2026-03-10 22:10:43.918873147 +0000 UTC m=+1231.160753765" observedRunningTime="2026-03-10 22:10:58.476366053 +0000 UTC m=+1245.718246661" watchObservedRunningTime="2026-03-10 22:10:58.483244819 +0000 UTC m=+1245.725125427" Mar 10 22:10:58 crc kubenswrapper[4919]: I0310 22:10:58.505784 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-jm7n6"] Mar 10 22:10:58 crc kubenswrapper[4919]: I0310 22:10:58.506786 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-jm7n6" Mar 10 22:10:58 crc kubenswrapper[4919]: I0310 22:10:58.508620 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Mar 10 22:10:58 crc kubenswrapper[4919]: I0310 22:10:58.508717 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Mar 10 22:10:58 crc kubenswrapper[4919]: I0310 22:10:58.513569 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Mar 10 22:10:58 crc kubenswrapper[4919]: I0310 22:10:58.517972 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-jm7n6"] Mar 10 22:10:58 crc kubenswrapper[4919]: I0310 22:10:58.685171 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/0067a7fe-f5db-4832-a519-848ac8b771c0-etc-swift\") pod \"swift-ring-rebalance-jm7n6\" (UID: \"0067a7fe-f5db-4832-a519-848ac8b771c0\") " pod="openstack/swift-ring-rebalance-jm7n6" Mar 10 22:10:58 crc kubenswrapper[4919]: I0310 22:10:58.685245 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" 
(UniqueName: \"kubernetes.io/secret/0067a7fe-f5db-4832-a519-848ac8b771c0-dispersionconf\") pod \"swift-ring-rebalance-jm7n6\" (UID: \"0067a7fe-f5db-4832-a519-848ac8b771c0\") " pod="openstack/swift-ring-rebalance-jm7n6" Mar 10 22:10:58 crc kubenswrapper[4919]: I0310 22:10:58.685267 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/0067a7fe-f5db-4832-a519-848ac8b771c0-ring-data-devices\") pod \"swift-ring-rebalance-jm7n6\" (UID: \"0067a7fe-f5db-4832-a519-848ac8b771c0\") " pod="openstack/swift-ring-rebalance-jm7n6" Mar 10 22:10:58 crc kubenswrapper[4919]: I0310 22:10:58.685314 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vrfgj\" (UniqueName: \"kubernetes.io/projected/0067a7fe-f5db-4832-a519-848ac8b771c0-kube-api-access-vrfgj\") pod \"swift-ring-rebalance-jm7n6\" (UID: \"0067a7fe-f5db-4832-a519-848ac8b771c0\") " pod="openstack/swift-ring-rebalance-jm7n6" Mar 10 22:10:58 crc kubenswrapper[4919]: I0310 22:10:58.685367 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0067a7fe-f5db-4832-a519-848ac8b771c0-scripts\") pod \"swift-ring-rebalance-jm7n6\" (UID: \"0067a7fe-f5db-4832-a519-848ac8b771c0\") " pod="openstack/swift-ring-rebalance-jm7n6" Mar 10 22:10:58 crc kubenswrapper[4919]: I0310 22:10:58.685383 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/0067a7fe-f5db-4832-a519-848ac8b771c0-swiftconf\") pod \"swift-ring-rebalance-jm7n6\" (UID: \"0067a7fe-f5db-4832-a519-848ac8b771c0\") " pod="openstack/swift-ring-rebalance-jm7n6" Mar 10 22:10:58 crc kubenswrapper[4919]: I0310 22:10:58.685452 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0067a7fe-f5db-4832-a519-848ac8b771c0-combined-ca-bundle\") pod \"swift-ring-rebalance-jm7n6\" (UID: \"0067a7fe-f5db-4832-a519-848ac8b771c0\") " pod="openstack/swift-ring-rebalance-jm7n6" Mar 10 22:10:58 crc kubenswrapper[4919]: I0310 22:10:58.786279 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/0067a7fe-f5db-4832-a519-848ac8b771c0-dispersionconf\") pod \"swift-ring-rebalance-jm7n6\" (UID: \"0067a7fe-f5db-4832-a519-848ac8b771c0\") " pod="openstack/swift-ring-rebalance-jm7n6" Mar 10 22:10:58 crc kubenswrapper[4919]: I0310 22:10:58.786337 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/0067a7fe-f5db-4832-a519-848ac8b771c0-ring-data-devices\") pod \"swift-ring-rebalance-jm7n6\" (UID: \"0067a7fe-f5db-4832-a519-848ac8b771c0\") " pod="openstack/swift-ring-rebalance-jm7n6" Mar 10 22:10:58 crc kubenswrapper[4919]: I0310 22:10:58.786354 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vrfgj\" (UniqueName: \"kubernetes.io/projected/0067a7fe-f5db-4832-a519-848ac8b771c0-kube-api-access-vrfgj\") pod \"swift-ring-rebalance-jm7n6\" (UID: \"0067a7fe-f5db-4832-a519-848ac8b771c0\") " pod="openstack/swift-ring-rebalance-jm7n6" Mar 10 22:10:58 crc kubenswrapper[4919]: I0310 22:10:58.786472 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0067a7fe-f5db-4832-a519-848ac8b771c0-scripts\") pod \"swift-ring-rebalance-jm7n6\" (UID: \"0067a7fe-f5db-4832-a519-848ac8b771c0\") " pod="openstack/swift-ring-rebalance-jm7n6" Mar 10 22:10:58 crc kubenswrapper[4919]: I0310 22:10:58.786504 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: 
\"kubernetes.io/secret/0067a7fe-f5db-4832-a519-848ac8b771c0-swiftconf\") pod \"swift-ring-rebalance-jm7n6\" (UID: \"0067a7fe-f5db-4832-a519-848ac8b771c0\") " pod="openstack/swift-ring-rebalance-jm7n6" Mar 10 22:10:58 crc kubenswrapper[4919]: I0310 22:10:58.786652 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0067a7fe-f5db-4832-a519-848ac8b771c0-combined-ca-bundle\") pod \"swift-ring-rebalance-jm7n6\" (UID: \"0067a7fe-f5db-4832-a519-848ac8b771c0\") " pod="openstack/swift-ring-rebalance-jm7n6" Mar 10 22:10:58 crc kubenswrapper[4919]: I0310 22:10:58.786712 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/0067a7fe-f5db-4832-a519-848ac8b771c0-etc-swift\") pod \"swift-ring-rebalance-jm7n6\" (UID: \"0067a7fe-f5db-4832-a519-848ac8b771c0\") " pod="openstack/swift-ring-rebalance-jm7n6" Mar 10 22:10:58 crc kubenswrapper[4919]: I0310 22:10:58.786736 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/91c8bbf6-8824-4e21-a491-86f2f657549a-etc-swift\") pod \"swift-storage-0\" (UID: \"91c8bbf6-8824-4e21-a491-86f2f657549a\") " pod="openstack/swift-storage-0" Mar 10 22:10:58 crc kubenswrapper[4919]: E0310 22:10:58.786847 4919 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 10 22:10:58 crc kubenswrapper[4919]: E0310 22:10:58.786860 4919 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 10 22:10:58 crc kubenswrapper[4919]: E0310 22:10:58.786908 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/91c8bbf6-8824-4e21-a491-86f2f657549a-etc-swift podName:91c8bbf6-8824-4e21-a491-86f2f657549a nodeName:}" failed. 
No retries permitted until 2026-03-10 22:10:59.786895127 +0000 UTC m=+1247.028775735 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/91c8bbf6-8824-4e21-a491-86f2f657549a-etc-swift") pod "swift-storage-0" (UID: "91c8bbf6-8824-4e21-a491-86f2f657549a") : configmap "swift-ring-files" not found Mar 10 22:10:58 crc kubenswrapper[4919]: I0310 22:10:58.787244 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/0067a7fe-f5db-4832-a519-848ac8b771c0-etc-swift\") pod \"swift-ring-rebalance-jm7n6\" (UID: \"0067a7fe-f5db-4832-a519-848ac8b771c0\") " pod="openstack/swift-ring-rebalance-jm7n6" Mar 10 22:10:58 crc kubenswrapper[4919]: I0310 22:10:58.787950 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/0067a7fe-f5db-4832-a519-848ac8b771c0-ring-data-devices\") pod \"swift-ring-rebalance-jm7n6\" (UID: \"0067a7fe-f5db-4832-a519-848ac8b771c0\") " pod="openstack/swift-ring-rebalance-jm7n6" Mar 10 22:10:58 crc kubenswrapper[4919]: I0310 22:10:58.787983 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0067a7fe-f5db-4832-a519-848ac8b771c0-scripts\") pod \"swift-ring-rebalance-jm7n6\" (UID: \"0067a7fe-f5db-4832-a519-848ac8b771c0\") " pod="openstack/swift-ring-rebalance-jm7n6" Mar 10 22:10:58 crc kubenswrapper[4919]: I0310 22:10:58.791832 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/0067a7fe-f5db-4832-a519-848ac8b771c0-dispersionconf\") pod \"swift-ring-rebalance-jm7n6\" (UID: \"0067a7fe-f5db-4832-a519-848ac8b771c0\") " pod="openstack/swift-ring-rebalance-jm7n6" Mar 10 22:10:58 crc kubenswrapper[4919]: I0310 22:10:58.791815 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" 
(UniqueName: \"kubernetes.io/secret/0067a7fe-f5db-4832-a519-848ac8b771c0-swiftconf\") pod \"swift-ring-rebalance-jm7n6\" (UID: \"0067a7fe-f5db-4832-a519-848ac8b771c0\") " pod="openstack/swift-ring-rebalance-jm7n6" Mar 10 22:10:58 crc kubenswrapper[4919]: I0310 22:10:58.793034 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0067a7fe-f5db-4832-a519-848ac8b771c0-combined-ca-bundle\") pod \"swift-ring-rebalance-jm7n6\" (UID: \"0067a7fe-f5db-4832-a519-848ac8b771c0\") " pod="openstack/swift-ring-rebalance-jm7n6" Mar 10 22:10:58 crc kubenswrapper[4919]: I0310 22:10:58.805000 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vrfgj\" (UniqueName: \"kubernetes.io/projected/0067a7fe-f5db-4832-a519-848ac8b771c0-kube-api-access-vrfgj\") pod \"swift-ring-rebalance-jm7n6\" (UID: \"0067a7fe-f5db-4832-a519-848ac8b771c0\") " pod="openstack/swift-ring-rebalance-jm7n6" Mar 10 22:10:58 crc kubenswrapper[4919]: I0310 22:10:58.824106 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-jm7n6" Mar 10 22:10:59 crc kubenswrapper[4919]: I0310 22:10:59.046179 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Mar 10 22:10:59 crc kubenswrapper[4919]: I0310 22:10:59.046236 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Mar 10 22:10:59 crc kubenswrapper[4919]: I0310 22:10:59.088667 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Mar 10 22:10:59 crc kubenswrapper[4919]: I0310 22:10:59.253157 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-jm7n6"] Mar 10 22:10:59 crc kubenswrapper[4919]: W0310 22:10:59.257985 4919 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0067a7fe_f5db_4832_a519_848ac8b771c0.slice/crio-09942638125ea4ed026cc395ea628cc10577f4a19c9585458a93c22d51766da8 WatchSource:0}: Error finding container 09942638125ea4ed026cc395ea628cc10577f4a19c9585458a93c22d51766da8: Status 404 returned error can't find the container with id 09942638125ea4ed026cc395ea628cc10577f4a19c9585458a93c22d51766da8 Mar 10 22:10:59 crc kubenswrapper[4919]: I0310 22:10:59.391206 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-jm7n6" event={"ID":"0067a7fe-f5db-4832-a519-848ac8b771c0","Type":"ContainerStarted","Data":"09942638125ea4ed026cc395ea628cc10577f4a19c9585458a93c22d51766da8"} Mar 10 22:10:59 crc kubenswrapper[4919]: I0310 22:10:59.393023 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7df9cc56b7-zv8nm" event={"ID":"021a8737-7317-441c-835f-e9cdc78ad6c4","Type":"ContainerStarted","Data":"8032a1de5b9b38c6b121f564d0062bf5450157445837b30318ebc77c5d1c9240"} Mar 10 22:10:59 crc kubenswrapper[4919]: I0310 22:10:59.465428 4919 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Mar 10 22:10:59 crc kubenswrapper[4919]: I0310 22:10:59.501325 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7df9cc56b7-zv8nm" podStartSLOduration=3.501306552 podStartE2EDuration="3.501306552s" podCreationTimestamp="2026-03-10 22:10:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 22:10:59.416763282 +0000 UTC m=+1246.658643910" watchObservedRunningTime="2026-03-10 22:10:59.501306552 +0000 UTC m=+1246.743187170" Mar 10 22:10:59 crc kubenswrapper[4919]: I0310 22:10:59.699575 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7df9cc56b7-zv8nm"] Mar 10 22:10:59 crc kubenswrapper[4919]: I0310 22:10:59.727483 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5fdc8d4b69-2c5p8"] Mar 10 22:10:59 crc kubenswrapper[4919]: I0310 22:10:59.729110 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5fdc8d4b69-2c5p8" Mar 10 22:10:59 crc kubenswrapper[4919]: I0310 22:10:59.731307 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Mar 10 22:10:59 crc kubenswrapper[4919]: I0310 22:10:59.744122 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5fdc8d4b69-2c5p8"] Mar 10 22:10:59 crc kubenswrapper[4919]: I0310 22:10:59.803247 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/474d4be4-538a-4463-a73c-68bbf7999276-config\") pod \"dnsmasq-dns-5fdc8d4b69-2c5p8\" (UID: \"474d4be4-538a-4463-a73c-68bbf7999276\") " pod="openstack/dnsmasq-dns-5fdc8d4b69-2c5p8" Mar 10 22:10:59 crc kubenswrapper[4919]: I0310 22:10:59.803294 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-52hn7\" (UniqueName: \"kubernetes.io/projected/474d4be4-538a-4463-a73c-68bbf7999276-kube-api-access-52hn7\") pod \"dnsmasq-dns-5fdc8d4b69-2c5p8\" (UID: \"474d4be4-538a-4463-a73c-68bbf7999276\") " pod="openstack/dnsmasq-dns-5fdc8d4b69-2c5p8" Mar 10 22:10:59 crc kubenswrapper[4919]: I0310 22:10:59.803338 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/91c8bbf6-8824-4e21-a491-86f2f657549a-etc-swift\") pod \"swift-storage-0\" (UID: \"91c8bbf6-8824-4e21-a491-86f2f657549a\") " pod="openstack/swift-storage-0" Mar 10 22:10:59 crc kubenswrapper[4919]: I0310 22:10:59.803455 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/474d4be4-538a-4463-a73c-68bbf7999276-dns-svc\") pod \"dnsmasq-dns-5fdc8d4b69-2c5p8\" (UID: \"474d4be4-538a-4463-a73c-68bbf7999276\") " pod="openstack/dnsmasq-dns-5fdc8d4b69-2c5p8" Mar 10 22:10:59 crc kubenswrapper[4919]: 
I0310 22:10:59.803510 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/474d4be4-538a-4463-a73c-68bbf7999276-ovsdbserver-nb\") pod \"dnsmasq-dns-5fdc8d4b69-2c5p8\" (UID: \"474d4be4-538a-4463-a73c-68bbf7999276\") " pod="openstack/dnsmasq-dns-5fdc8d4b69-2c5p8" Mar 10 22:10:59 crc kubenswrapper[4919]: E0310 22:10:59.803696 4919 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 10 22:10:59 crc kubenswrapper[4919]: E0310 22:10:59.803712 4919 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 10 22:10:59 crc kubenswrapper[4919]: E0310 22:10:59.803756 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/91c8bbf6-8824-4e21-a491-86f2f657549a-etc-swift podName:91c8bbf6-8824-4e21-a491-86f2f657549a nodeName:}" failed. No retries permitted until 2026-03-10 22:11:01.803739156 +0000 UTC m=+1249.045619764 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/91c8bbf6-8824-4e21-a491-86f2f657549a-etc-swift") pod "swift-storage-0" (UID: "91c8bbf6-8824-4e21-a491-86f2f657549a") : configmap "swift-ring-files" not found Mar 10 22:10:59 crc kubenswrapper[4919]: I0310 22:10:59.874526 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-wgnf2"] Mar 10 22:10:59 crc kubenswrapper[4919]: I0310 22:10:59.876420 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-wgnf2" Mar 10 22:10:59 crc kubenswrapper[4919]: I0310 22:10:59.880533 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Mar 10 22:10:59 crc kubenswrapper[4919]: I0310 22:10:59.882754 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-wgnf2"] Mar 10 22:10:59 crc kubenswrapper[4919]: I0310 22:10:59.906851 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/474d4be4-538a-4463-a73c-68bbf7999276-dns-svc\") pod \"dnsmasq-dns-5fdc8d4b69-2c5p8\" (UID: \"474d4be4-538a-4463-a73c-68bbf7999276\") " pod="openstack/dnsmasq-dns-5fdc8d4b69-2c5p8" Mar 10 22:10:59 crc kubenswrapper[4919]: I0310 22:10:59.906913 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/474d4be4-538a-4463-a73c-68bbf7999276-ovsdbserver-nb\") pod \"dnsmasq-dns-5fdc8d4b69-2c5p8\" (UID: \"474d4be4-538a-4463-a73c-68bbf7999276\") " pod="openstack/dnsmasq-dns-5fdc8d4b69-2c5p8" Mar 10 22:10:59 crc kubenswrapper[4919]: I0310 22:10:59.906971 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/474d4be4-538a-4463-a73c-68bbf7999276-config\") pod \"dnsmasq-dns-5fdc8d4b69-2c5p8\" (UID: \"474d4be4-538a-4463-a73c-68bbf7999276\") " pod="openstack/dnsmasq-dns-5fdc8d4b69-2c5p8" Mar 10 22:10:59 crc kubenswrapper[4919]: I0310 22:10:59.906988 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-52hn7\" (UniqueName: \"kubernetes.io/projected/474d4be4-538a-4463-a73c-68bbf7999276-kube-api-access-52hn7\") pod \"dnsmasq-dns-5fdc8d4b69-2c5p8\" (UID: \"474d4be4-538a-4463-a73c-68bbf7999276\") " pod="openstack/dnsmasq-dns-5fdc8d4b69-2c5p8" Mar 10 22:10:59 crc kubenswrapper[4919]: I0310 
22:10:59.907898 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/474d4be4-538a-4463-a73c-68bbf7999276-dns-svc\") pod \"dnsmasq-dns-5fdc8d4b69-2c5p8\" (UID: \"474d4be4-538a-4463-a73c-68bbf7999276\") " pod="openstack/dnsmasq-dns-5fdc8d4b69-2c5p8"
Mar 10 22:10:59 crc kubenswrapper[4919]: I0310 22:10:59.907966 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/474d4be4-538a-4463-a73c-68bbf7999276-config\") pod \"dnsmasq-dns-5fdc8d4b69-2c5p8\" (UID: \"474d4be4-538a-4463-a73c-68bbf7999276\") " pod="openstack/dnsmasq-dns-5fdc8d4b69-2c5p8"
Mar 10 22:10:59 crc kubenswrapper[4919]: I0310 22:10:59.908372 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/474d4be4-538a-4463-a73c-68bbf7999276-ovsdbserver-nb\") pod \"dnsmasq-dns-5fdc8d4b69-2c5p8\" (UID: \"474d4be4-538a-4463-a73c-68bbf7999276\") " pod="openstack/dnsmasq-dns-5fdc8d4b69-2c5p8"
Mar 10 22:10:59 crc kubenswrapper[4919]: I0310 22:10:59.950211 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-52hn7\" (UniqueName: \"kubernetes.io/projected/474d4be4-538a-4463-a73c-68bbf7999276-kube-api-access-52hn7\") pod \"dnsmasq-dns-5fdc8d4b69-2c5p8\" (UID: \"474d4be4-538a-4463-a73c-68bbf7999276\") " pod="openstack/dnsmasq-dns-5fdc8d4b69-2c5p8"
Mar 10 22:11:00 crc kubenswrapper[4919]: I0310 22:11:00.010229 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e61256d0-9ca6-4524-a19d-7efd32ab9724-combined-ca-bundle\") pod \"ovn-controller-metrics-wgnf2\" (UID: \"e61256d0-9ca6-4524-a19d-7efd32ab9724\") " pod="openstack/ovn-controller-metrics-wgnf2"
Mar 10 22:11:00 crc kubenswrapper[4919]: I0310 22:11:00.010288 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e61256d0-9ca6-4524-a19d-7efd32ab9724-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-wgnf2\" (UID: \"e61256d0-9ca6-4524-a19d-7efd32ab9724\") " pod="openstack/ovn-controller-metrics-wgnf2"
Mar 10 22:11:00 crc kubenswrapper[4919]: I0310 22:11:00.010338 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e61256d0-9ca6-4524-a19d-7efd32ab9724-config\") pod \"ovn-controller-metrics-wgnf2\" (UID: \"e61256d0-9ca6-4524-a19d-7efd32ab9724\") " pod="openstack/ovn-controller-metrics-wgnf2"
Mar 10 22:11:00 crc kubenswrapper[4919]: I0310 22:11:00.010359 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/e61256d0-9ca6-4524-a19d-7efd32ab9724-ovn-rundir\") pod \"ovn-controller-metrics-wgnf2\" (UID: \"e61256d0-9ca6-4524-a19d-7efd32ab9724\") " pod="openstack/ovn-controller-metrics-wgnf2"
Mar 10 22:11:00 crc kubenswrapper[4919]: I0310 22:11:00.010382 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ftt6c\" (UniqueName: \"kubernetes.io/projected/e61256d0-9ca6-4524-a19d-7efd32ab9724-kube-api-access-ftt6c\") pod \"ovn-controller-metrics-wgnf2\" (UID: \"e61256d0-9ca6-4524-a19d-7efd32ab9724\") " pod="openstack/ovn-controller-metrics-wgnf2"
Mar 10 22:11:00 crc kubenswrapper[4919]: I0310 22:11:00.010517 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/e61256d0-9ca6-4524-a19d-7efd32ab9724-ovs-rundir\") pod \"ovn-controller-metrics-wgnf2\" (UID: \"e61256d0-9ca6-4524-a19d-7efd32ab9724\") " pod="openstack/ovn-controller-metrics-wgnf2"
Mar 10 22:11:00 crc kubenswrapper[4919]: I0310 22:11:00.031101 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5fdc8d4b69-2c5p8"]
Mar 10 22:11:00 crc kubenswrapper[4919]: I0310 22:11:00.031691 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5fdc8d4b69-2c5p8"
Mar 10 22:11:00 crc kubenswrapper[4919]: I0310 22:11:00.056591 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f7dd995-tz7jv"]
Mar 10 22:11:00 crc kubenswrapper[4919]: I0310 22:11:00.059583 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f7dd995-tz7jv"
Mar 10 22:11:00 crc kubenswrapper[4919]: I0310 22:11:00.063704 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb"
Mar 10 22:11:00 crc kubenswrapper[4919]: I0310 22:11:00.070080 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f7dd995-tz7jv"]
Mar 10 22:11:00 crc kubenswrapper[4919]: I0310 22:11:00.111386 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0fdc6e48-f4e3-4c70-8cab-78ad58edd483-ovsdbserver-nb\") pod \"dnsmasq-dns-675f7dd995-tz7jv\" (UID: \"0fdc6e48-f4e3-4c70-8cab-78ad58edd483\") " pod="openstack/dnsmasq-dns-675f7dd995-tz7jv"
Mar 10 22:11:00 crc kubenswrapper[4919]: I0310 22:11:00.111462 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0fdc6e48-f4e3-4c70-8cab-78ad58edd483-config\") pod \"dnsmasq-dns-675f7dd995-tz7jv\" (UID: \"0fdc6e48-f4e3-4c70-8cab-78ad58edd483\") " pod="openstack/dnsmasq-dns-675f7dd995-tz7jv"
Mar 10 22:11:00 crc kubenswrapper[4919]: I0310 22:11:00.111490 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/e61256d0-9ca6-4524-a19d-7efd32ab9724-ovs-rundir\") pod \"ovn-controller-metrics-wgnf2\" (UID: \"e61256d0-9ca6-4524-a19d-7efd32ab9724\") " pod="openstack/ovn-controller-metrics-wgnf2"
Mar 10 22:11:00 crc kubenswrapper[4919]: I0310 22:11:00.111513 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e61256d0-9ca6-4524-a19d-7efd32ab9724-combined-ca-bundle\") pod \"ovn-controller-metrics-wgnf2\" (UID: \"e61256d0-9ca6-4524-a19d-7efd32ab9724\") " pod="openstack/ovn-controller-metrics-wgnf2"
Mar 10 22:11:00 crc kubenswrapper[4919]: I0310 22:11:00.111643 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e61256d0-9ca6-4524-a19d-7efd32ab9724-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-wgnf2\" (UID: \"e61256d0-9ca6-4524-a19d-7efd32ab9724\") " pod="openstack/ovn-controller-metrics-wgnf2"
Mar 10 22:11:00 crc kubenswrapper[4919]: I0310 22:11:00.111809 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e61256d0-9ca6-4524-a19d-7efd32ab9724-config\") pod \"ovn-controller-metrics-wgnf2\" (UID: \"e61256d0-9ca6-4524-a19d-7efd32ab9724\") " pod="openstack/ovn-controller-metrics-wgnf2"
Mar 10 22:11:00 crc kubenswrapper[4919]: I0310 22:11:00.111839 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/e61256d0-9ca6-4524-a19d-7efd32ab9724-ovn-rundir\") pod \"ovn-controller-metrics-wgnf2\" (UID: \"e61256d0-9ca6-4524-a19d-7efd32ab9724\") " pod="openstack/ovn-controller-metrics-wgnf2"
Mar 10 22:11:00 crc kubenswrapper[4919]: I0310 22:11:00.111885 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ftt6c\" (UniqueName: \"kubernetes.io/projected/e61256d0-9ca6-4524-a19d-7efd32ab9724-kube-api-access-ftt6c\") pod \"ovn-controller-metrics-wgnf2\" (UID: \"e61256d0-9ca6-4524-a19d-7efd32ab9724\") " pod="openstack/ovn-controller-metrics-wgnf2"
Mar 10 22:11:00 crc kubenswrapper[4919]: I0310 22:11:00.111925 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0fdc6e48-f4e3-4c70-8cab-78ad58edd483-dns-svc\") pod \"dnsmasq-dns-675f7dd995-tz7jv\" (UID: \"0fdc6e48-f4e3-4c70-8cab-78ad58edd483\") " pod="openstack/dnsmasq-dns-675f7dd995-tz7jv"
Mar 10 22:11:00 crc kubenswrapper[4919]: I0310 22:11:00.112042 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n8frt\" (UniqueName: \"kubernetes.io/projected/0fdc6e48-f4e3-4c70-8cab-78ad58edd483-kube-api-access-n8frt\") pod \"dnsmasq-dns-675f7dd995-tz7jv\" (UID: \"0fdc6e48-f4e3-4c70-8cab-78ad58edd483\") " pod="openstack/dnsmasq-dns-675f7dd995-tz7jv"
Mar 10 22:11:00 crc kubenswrapper[4919]: I0310 22:11:00.112098 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0fdc6e48-f4e3-4c70-8cab-78ad58edd483-ovsdbserver-sb\") pod \"dnsmasq-dns-675f7dd995-tz7jv\" (UID: \"0fdc6e48-f4e3-4c70-8cab-78ad58edd483\") " pod="openstack/dnsmasq-dns-675f7dd995-tz7jv"
Mar 10 22:11:00 crc kubenswrapper[4919]: I0310 22:11:00.112686 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/e61256d0-9ca6-4524-a19d-7efd32ab9724-ovs-rundir\") pod \"ovn-controller-metrics-wgnf2\" (UID: \"e61256d0-9ca6-4524-a19d-7efd32ab9724\") " pod="openstack/ovn-controller-metrics-wgnf2"
Mar 10 22:11:00 crc kubenswrapper[4919]: I0310 22:11:00.112685 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/e61256d0-9ca6-4524-a19d-7efd32ab9724-ovn-rundir\") pod \"ovn-controller-metrics-wgnf2\" (UID: \"e61256d0-9ca6-4524-a19d-7efd32ab9724\") " pod="openstack/ovn-controller-metrics-wgnf2"
Mar 10 22:11:00 crc kubenswrapper[4919]: I0310 22:11:00.113518 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e61256d0-9ca6-4524-a19d-7efd32ab9724-config\") pod \"ovn-controller-metrics-wgnf2\" (UID: \"e61256d0-9ca6-4524-a19d-7efd32ab9724\") " pod="openstack/ovn-controller-metrics-wgnf2"
Mar 10 22:11:00 crc kubenswrapper[4919]: I0310 22:11:00.116207 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e61256d0-9ca6-4524-a19d-7efd32ab9724-combined-ca-bundle\") pod \"ovn-controller-metrics-wgnf2\" (UID: \"e61256d0-9ca6-4524-a19d-7efd32ab9724\") " pod="openstack/ovn-controller-metrics-wgnf2"
Mar 10 22:11:00 crc kubenswrapper[4919]: I0310 22:11:00.119441 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e61256d0-9ca6-4524-a19d-7efd32ab9724-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-wgnf2\" (UID: \"e61256d0-9ca6-4524-a19d-7efd32ab9724\") " pod="openstack/ovn-controller-metrics-wgnf2"
Mar 10 22:11:00 crc kubenswrapper[4919]: I0310 22:11:00.128975 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ftt6c\" (UniqueName: \"kubernetes.io/projected/e61256d0-9ca6-4524-a19d-7efd32ab9724-kube-api-access-ftt6c\") pod \"ovn-controller-metrics-wgnf2\" (UID: \"e61256d0-9ca6-4524-a19d-7efd32ab9724\") " pod="openstack/ovn-controller-metrics-wgnf2"
Mar 10 22:11:00 crc kubenswrapper[4919]: I0310 22:11:00.192694 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-wgnf2"
Mar 10 22:11:00 crc kubenswrapper[4919]: I0310 22:11:00.213819 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n8frt\" (UniqueName: \"kubernetes.io/projected/0fdc6e48-f4e3-4c70-8cab-78ad58edd483-kube-api-access-n8frt\") pod \"dnsmasq-dns-675f7dd995-tz7jv\" (UID: \"0fdc6e48-f4e3-4c70-8cab-78ad58edd483\") " pod="openstack/dnsmasq-dns-675f7dd995-tz7jv"
Mar 10 22:11:00 crc kubenswrapper[4919]: I0310 22:11:00.213895 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0fdc6e48-f4e3-4c70-8cab-78ad58edd483-ovsdbserver-sb\") pod \"dnsmasq-dns-675f7dd995-tz7jv\" (UID: \"0fdc6e48-f4e3-4c70-8cab-78ad58edd483\") " pod="openstack/dnsmasq-dns-675f7dd995-tz7jv"
Mar 10 22:11:00 crc kubenswrapper[4919]: I0310 22:11:00.213928 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0fdc6e48-f4e3-4c70-8cab-78ad58edd483-ovsdbserver-nb\") pod \"dnsmasq-dns-675f7dd995-tz7jv\" (UID: \"0fdc6e48-f4e3-4c70-8cab-78ad58edd483\") " pod="openstack/dnsmasq-dns-675f7dd995-tz7jv"
Mar 10 22:11:00 crc kubenswrapper[4919]: I0310 22:11:00.213981 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0fdc6e48-f4e3-4c70-8cab-78ad58edd483-config\") pod \"dnsmasq-dns-675f7dd995-tz7jv\" (UID: \"0fdc6e48-f4e3-4c70-8cab-78ad58edd483\") " pod="openstack/dnsmasq-dns-675f7dd995-tz7jv"
Mar 10 22:11:00 crc kubenswrapper[4919]: I0310 22:11:00.214083 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0fdc6e48-f4e3-4c70-8cab-78ad58edd483-dns-svc\") pod \"dnsmasq-dns-675f7dd995-tz7jv\" (UID: \"0fdc6e48-f4e3-4c70-8cab-78ad58edd483\") " pod="openstack/dnsmasq-dns-675f7dd995-tz7jv"
Mar 10 22:11:00 crc kubenswrapper[4919]: I0310 22:11:00.215257 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0fdc6e48-f4e3-4c70-8cab-78ad58edd483-ovsdbserver-sb\") pod \"dnsmasq-dns-675f7dd995-tz7jv\" (UID: \"0fdc6e48-f4e3-4c70-8cab-78ad58edd483\") " pod="openstack/dnsmasq-dns-675f7dd995-tz7jv"
Mar 10 22:11:00 crc kubenswrapper[4919]: I0310 22:11:00.216000 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0fdc6e48-f4e3-4c70-8cab-78ad58edd483-config\") pod \"dnsmasq-dns-675f7dd995-tz7jv\" (UID: \"0fdc6e48-f4e3-4c70-8cab-78ad58edd483\") " pod="openstack/dnsmasq-dns-675f7dd995-tz7jv"
Mar 10 22:11:00 crc kubenswrapper[4919]: I0310 22:11:00.216073 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0fdc6e48-f4e3-4c70-8cab-78ad58edd483-dns-svc\") pod \"dnsmasq-dns-675f7dd995-tz7jv\" (UID: \"0fdc6e48-f4e3-4c70-8cab-78ad58edd483\") " pod="openstack/dnsmasq-dns-675f7dd995-tz7jv"
Mar 10 22:11:00 crc kubenswrapper[4919]: I0310 22:11:00.219795 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0fdc6e48-f4e3-4c70-8cab-78ad58edd483-ovsdbserver-nb\") pod \"dnsmasq-dns-675f7dd995-tz7jv\" (UID: \"0fdc6e48-f4e3-4c70-8cab-78ad58edd483\") " pod="openstack/dnsmasq-dns-675f7dd995-tz7jv"
Mar 10 22:11:00 crc kubenswrapper[4919]: I0310 22:11:00.232459 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n8frt\" (UniqueName: \"kubernetes.io/projected/0fdc6e48-f4e3-4c70-8cab-78ad58edd483-kube-api-access-n8frt\") pod \"dnsmasq-dns-675f7dd995-tz7jv\" (UID: \"0fdc6e48-f4e3-4c70-8cab-78ad58edd483\") " pod="openstack/dnsmasq-dns-675f7dd995-tz7jv"
Mar 10 22:11:00 crc kubenswrapper[4919]: I0310 22:11:00.405158 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7df9cc56b7-zv8nm"
Mar 10 22:11:00 crc kubenswrapper[4919]: I0310 22:11:00.427663 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0"
Mar 10 22:11:00 crc kubenswrapper[4919]: I0310 22:11:00.453329 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f7dd995-tz7jv"
Mar 10 22:11:00 crc kubenswrapper[4919]: I0310 22:11:00.470542 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0"
Mar 10 22:11:00 crc kubenswrapper[4919]: I0310 22:11:00.538669 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5fdc8d4b69-2c5p8"]
Mar 10 22:11:00 crc kubenswrapper[4919]: W0310 22:11:00.541054 4919 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod474d4be4_538a_4463_a73c_68bbf7999276.slice/crio-e230c38942f56d291a1e30b0b2c61bfa432de26c666f390fb081b9033ade6744 WatchSource:0}: Error finding container e230c38942f56d291a1e30b0b2c61bfa432de26c666f390fb081b9033ade6744: Status 404 returned error can't find the container with id e230c38942f56d291a1e30b0b2c61bfa432de26c666f390fb081b9033ade6744
Mar 10 22:11:00 crc kubenswrapper[4919]: I0310 22:11:00.671714 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-wgnf2"]
Mar 10 22:11:00 crc kubenswrapper[4919]: I0310 22:11:00.889471 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f7dd995-tz7jv"]
Mar 10 22:11:01 crc kubenswrapper[4919]: I0310 22:11:01.427844 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-wgnf2" event={"ID":"e61256d0-9ca6-4524-a19d-7efd32ab9724","Type":"ContainerStarted","Data":"1bb99faa0c9dbd8195614221f50fa8ce1965d14085b52684adacebddfa8030f0"}
Mar 10 22:11:01 crc kubenswrapper[4919]: I0310 22:11:01.427887 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-wgnf2" event={"ID":"e61256d0-9ca6-4524-a19d-7efd32ab9724","Type":"ContainerStarted","Data":"9c5bddf690b8b0b70c899f52c552126c69a3e2fab725a171603e151e6b5627c5"}
Mar 10 22:11:01 crc kubenswrapper[4919]: I0310 22:11:01.430896 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f7dd995-tz7jv" event={"ID":"0fdc6e48-f4e3-4c70-8cab-78ad58edd483","Type":"ContainerStarted","Data":"822a032f8bc949b5c80d4aa4dc1301f9ff5b6767194a5fcb56a7dcf39d4d4c21"}
Mar 10 22:11:01 crc kubenswrapper[4919]: I0310 22:11:01.432818 4919 generic.go:334] "Generic (PLEG): container finished" podID="474d4be4-538a-4463-a73c-68bbf7999276" containerID="0bcca0341a220b7ebd9d7f86a75a36ef6571beed14498faa8fa899702ee01901" exitCode=0
Mar 10 22:11:01 crc kubenswrapper[4919]: I0310 22:11:01.433050 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7df9cc56b7-zv8nm" podUID="021a8737-7317-441c-835f-e9cdc78ad6c4" containerName="dnsmasq-dns" containerID="cri-o://8032a1de5b9b38c6b121f564d0062bf5450157445837b30318ebc77c5d1c9240" gracePeriod=10
Mar 10 22:11:01 crc kubenswrapper[4919]: I0310 22:11:01.433461 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fdc8d4b69-2c5p8" event={"ID":"474d4be4-538a-4463-a73c-68bbf7999276","Type":"ContainerDied","Data":"0bcca0341a220b7ebd9d7f86a75a36ef6571beed14498faa8fa899702ee01901"}
Mar 10 22:11:01 crc kubenswrapper[4919]: I0310 22:11:01.433511 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fdc8d4b69-2c5p8" event={"ID":"474d4be4-538a-4463-a73c-68bbf7999276","Type":"ContainerStarted","Data":"e230c38942f56d291a1e30b0b2c61bfa432de26c666f390fb081b9033ade6744"}
Mar 10 22:11:01 crc kubenswrapper[4919]: I0310 22:11:01.454912 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-wgnf2" podStartSLOduration=2.454894823 podStartE2EDuration="2.454894823s" podCreationTimestamp="2026-03-10 22:10:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 22:11:01.446934257 +0000 UTC m=+1248.688814875" watchObservedRunningTime="2026-03-10 22:11:01.454894823 +0000 UTC m=+1248.696775431"
Mar 10 22:11:01 crc kubenswrapper[4919]: I0310 22:11:01.502712 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0"
Mar 10 22:11:01 crc kubenswrapper[4919]: I0310 22:11:01.708211 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"]
Mar 10 22:11:01 crc kubenswrapper[4919]: I0310 22:11:01.713625 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0"
Mar 10 22:11:01 crc kubenswrapper[4919]: I0310 22:11:01.721266 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts"
Mar 10 22:11:01 crc kubenswrapper[4919]: I0310 22:11:01.721481 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config"
Mar 10 22:11:01 crc kubenswrapper[4919]: I0310 22:11:01.721611 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs"
Mar 10 22:11:01 crc kubenswrapper[4919]: I0310 22:11:01.722220 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-f4hvs"
Mar 10 22:11:01 crc kubenswrapper[4919]: I0310 22:11:01.731222 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"]
Mar 10 22:11:01 crc kubenswrapper[4919]: I0310 22:11:01.748201 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rmhrb\" (UniqueName: \"kubernetes.io/projected/f4c91bdf-1945-4c8a-ab05-3b619e4bdb2b-kube-api-access-rmhrb\") pod \"ovn-northd-0\" (UID: \"f4c91bdf-1945-4c8a-ab05-3b619e4bdb2b\") " pod="openstack/ovn-northd-0"
Mar 10 22:11:01 crc kubenswrapper[4919]: I0310 22:11:01.748247 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f4c91bdf-1945-4c8a-ab05-3b619e4bdb2b-config\") pod \"ovn-northd-0\" (UID: \"f4c91bdf-1945-4c8a-ab05-3b619e4bdb2b\") " pod="openstack/ovn-northd-0"
Mar 10 22:11:01 crc kubenswrapper[4919]: I0310 22:11:01.748263 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/f4c91bdf-1945-4c8a-ab05-3b619e4bdb2b-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"f4c91bdf-1945-4c8a-ab05-3b619e4bdb2b\") " pod="openstack/ovn-northd-0"
Mar 10 22:11:01 crc kubenswrapper[4919]: I0310 22:11:01.748310 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f4c91bdf-1945-4c8a-ab05-3b619e4bdb2b-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"f4c91bdf-1945-4c8a-ab05-3b619e4bdb2b\") " pod="openstack/ovn-northd-0"
Mar 10 22:11:01 crc kubenswrapper[4919]: I0310 22:11:01.748402 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4c91bdf-1945-4c8a-ab05-3b619e4bdb2b-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"f4c91bdf-1945-4c8a-ab05-3b619e4bdb2b\") " pod="openstack/ovn-northd-0"
Mar 10 22:11:01 crc kubenswrapper[4919]: I0310 22:11:01.748436 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f4c91bdf-1945-4c8a-ab05-3b619e4bdb2b-scripts\") pod \"ovn-northd-0\" (UID: \"f4c91bdf-1945-4c8a-ab05-3b619e4bdb2b\") " pod="openstack/ovn-northd-0"
Mar 10 22:11:01 crc kubenswrapper[4919]: I0310 22:11:01.748476 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/f4c91bdf-1945-4c8a-ab05-3b619e4bdb2b-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"f4c91bdf-1945-4c8a-ab05-3b619e4bdb2b\") " pod="openstack/ovn-northd-0"
Mar 10 22:11:01 crc kubenswrapper[4919]: I0310 22:11:01.849657 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f4c91bdf-1945-4c8a-ab05-3b619e4bdb2b-scripts\") pod \"ovn-northd-0\" (UID: \"f4c91bdf-1945-4c8a-ab05-3b619e4bdb2b\") " pod="openstack/ovn-northd-0"
Mar 10 22:11:01 crc kubenswrapper[4919]: I0310 22:11:01.849726 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/f4c91bdf-1945-4c8a-ab05-3b619e4bdb2b-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"f4c91bdf-1945-4c8a-ab05-3b619e4bdb2b\") " pod="openstack/ovn-northd-0"
Mar 10 22:11:01 crc kubenswrapper[4919]: I0310 22:11:01.849796 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/91c8bbf6-8824-4e21-a491-86f2f657549a-etc-swift\") pod \"swift-storage-0\" (UID: \"91c8bbf6-8824-4e21-a491-86f2f657549a\") " pod="openstack/swift-storage-0"
Mar 10 22:11:01 crc kubenswrapper[4919]: I0310 22:11:01.849820 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rmhrb\" (UniqueName: \"kubernetes.io/projected/f4c91bdf-1945-4c8a-ab05-3b619e4bdb2b-kube-api-access-rmhrb\") pod \"ovn-northd-0\" (UID: \"f4c91bdf-1945-4c8a-ab05-3b619e4bdb2b\") " pod="openstack/ovn-northd-0"
Mar 10 22:11:01 crc kubenswrapper[4919]: I0310 22:11:01.849851 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f4c91bdf-1945-4c8a-ab05-3b619e4bdb2b-config\") pod \"ovn-northd-0\" (UID: \"f4c91bdf-1945-4c8a-ab05-3b619e4bdb2b\") " pod="openstack/ovn-northd-0"
Mar 10 22:11:01 crc kubenswrapper[4919]: I0310 22:11:01.849872 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/f4c91bdf-1945-4c8a-ab05-3b619e4bdb2b-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"f4c91bdf-1945-4c8a-ab05-3b619e4bdb2b\") " pod="openstack/ovn-northd-0"
Mar 10 22:11:01 crc kubenswrapper[4919]: I0310 22:11:01.849921 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f4c91bdf-1945-4c8a-ab05-3b619e4bdb2b-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"f4c91bdf-1945-4c8a-ab05-3b619e4bdb2b\") " pod="openstack/ovn-northd-0"
Mar 10 22:11:01 crc kubenswrapper[4919]: I0310 22:11:01.850020 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4c91bdf-1945-4c8a-ab05-3b619e4bdb2b-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"f4c91bdf-1945-4c8a-ab05-3b619e4bdb2b\") " pod="openstack/ovn-northd-0"
Mar 10 22:11:01 crc kubenswrapper[4919]: I0310 22:11:01.850527 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f4c91bdf-1945-4c8a-ab05-3b619e4bdb2b-scripts\") pod \"ovn-northd-0\" (UID: \"f4c91bdf-1945-4c8a-ab05-3b619e4bdb2b\") " pod="openstack/ovn-northd-0"
Mar 10 22:11:01 crc kubenswrapper[4919]: E0310 22:11:01.850837 4919 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Mar 10 22:11:01 crc kubenswrapper[4919]: E0310 22:11:01.850856 4919 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Mar 10 22:11:01 crc kubenswrapper[4919]: E0310 22:11:01.850887 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/91c8bbf6-8824-4e21-a491-86f2f657549a-etc-swift podName:91c8bbf6-8824-4e21-a491-86f2f657549a nodeName:}" failed. No retries permitted until 2026-03-10 22:11:05.850876301 +0000 UTC m=+1253.092756909 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/91c8bbf6-8824-4e21-a491-86f2f657549a-etc-swift") pod "swift-storage-0" (UID: "91c8bbf6-8824-4e21-a491-86f2f657549a") : configmap "swift-ring-files" not found
Mar 10 22:11:01 crc kubenswrapper[4919]: I0310 22:11:01.851025 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/f4c91bdf-1945-4c8a-ab05-3b619e4bdb2b-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"f4c91bdf-1945-4c8a-ab05-3b619e4bdb2b\") " pod="openstack/ovn-northd-0"
Mar 10 22:11:01 crc kubenswrapper[4919]: I0310 22:11:01.854818 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f4c91bdf-1945-4c8a-ab05-3b619e4bdb2b-config\") pod \"ovn-northd-0\" (UID: \"f4c91bdf-1945-4c8a-ab05-3b619e4bdb2b\") " pod="openstack/ovn-northd-0"
Mar 10 22:11:01 crc kubenswrapper[4919]: I0310 22:11:01.856776 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f4c91bdf-1945-4c8a-ab05-3b619e4bdb2b-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"f4c91bdf-1945-4c8a-ab05-3b619e4bdb2b\") " pod="openstack/ovn-northd-0"
Mar 10 22:11:01 crc kubenswrapper[4919]: I0310 22:11:01.859027 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4c91bdf-1945-4c8a-ab05-3b619e4bdb2b-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"f4c91bdf-1945-4c8a-ab05-3b619e4bdb2b\") " pod="openstack/ovn-northd-0"
Mar 10 22:11:01 crc kubenswrapper[4919]: I0310 22:11:01.859526 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/f4c91bdf-1945-4c8a-ab05-3b619e4bdb2b-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"f4c91bdf-1945-4c8a-ab05-3b619e4bdb2b\") " pod="openstack/ovn-northd-0"
Mar 10 22:11:01 crc kubenswrapper[4919]: I0310 22:11:01.868205 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rmhrb\" (UniqueName: \"kubernetes.io/projected/f4c91bdf-1945-4c8a-ab05-3b619e4bdb2b-kube-api-access-rmhrb\") pod \"ovn-northd-0\" (UID: \"f4c91bdf-1945-4c8a-ab05-3b619e4bdb2b\") " pod="openstack/ovn-northd-0"
Mar 10 22:11:02 crc kubenswrapper[4919]: I0310 22:11:02.064106 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0"
Mar 10 22:11:02 crc kubenswrapper[4919]: I0310 22:11:02.198444 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5fdc8d4b69-2c5p8"
Mar 10 22:11:02 crc kubenswrapper[4919]: I0310 22:11:02.254837 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/474d4be4-538a-4463-a73c-68bbf7999276-dns-svc\") pod \"474d4be4-538a-4463-a73c-68bbf7999276\" (UID: \"474d4be4-538a-4463-a73c-68bbf7999276\") "
Mar 10 22:11:02 crc kubenswrapper[4919]: I0310 22:11:02.255700 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/474d4be4-538a-4463-a73c-68bbf7999276-ovsdbserver-nb\") pod \"474d4be4-538a-4463-a73c-68bbf7999276\" (UID: \"474d4be4-538a-4463-a73c-68bbf7999276\") "
Mar 10 22:11:02 crc kubenswrapper[4919]: I0310 22:11:02.255780 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-52hn7\" (UniqueName: \"kubernetes.io/projected/474d4be4-538a-4463-a73c-68bbf7999276-kube-api-access-52hn7\") pod \"474d4be4-538a-4463-a73c-68bbf7999276\" (UID: \"474d4be4-538a-4463-a73c-68bbf7999276\") "
Mar 10 22:11:02 crc kubenswrapper[4919]: I0310 22:11:02.255925 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/474d4be4-538a-4463-a73c-68bbf7999276-config\") pod \"474d4be4-538a-4463-a73c-68bbf7999276\" (UID: \"474d4be4-538a-4463-a73c-68bbf7999276\") "
Mar 10 22:11:02 crc kubenswrapper[4919]: I0310 22:11:02.258589 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0"
Mar 10 22:11:02 crc kubenswrapper[4919]: I0310 22:11:02.258631 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0"
Mar 10 22:11:02 crc kubenswrapper[4919]: I0310 22:11:02.261303 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/474d4be4-538a-4463-a73c-68bbf7999276-kube-api-access-52hn7" (OuterVolumeSpecName: "kube-api-access-52hn7") pod "474d4be4-538a-4463-a73c-68bbf7999276" (UID: "474d4be4-538a-4463-a73c-68bbf7999276"). InnerVolumeSpecName "kube-api-access-52hn7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 22:11:02 crc kubenswrapper[4919]: I0310 22:11:02.276978 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/474d4be4-538a-4463-a73c-68bbf7999276-config" (OuterVolumeSpecName: "config") pod "474d4be4-538a-4463-a73c-68bbf7999276" (UID: "474d4be4-538a-4463-a73c-68bbf7999276"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 22:11:02 crc kubenswrapper[4919]: I0310 22:11:02.280643 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/474d4be4-538a-4463-a73c-68bbf7999276-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "474d4be4-538a-4463-a73c-68bbf7999276" (UID: "474d4be4-538a-4463-a73c-68bbf7999276"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 22:11:02 crc kubenswrapper[4919]: I0310 22:11:02.283083 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/474d4be4-538a-4463-a73c-68bbf7999276-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "474d4be4-538a-4463-a73c-68bbf7999276" (UID: "474d4be4-538a-4463-a73c-68bbf7999276"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 22:11:02 crc kubenswrapper[4919]: I0310 22:11:02.358062 4919 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/474d4be4-538a-4463-a73c-68bbf7999276-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 10 22:11:02 crc kubenswrapper[4919]: I0310 22:11:02.358102 4919 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/474d4be4-538a-4463-a73c-68bbf7999276-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Mar 10 22:11:02 crc kubenswrapper[4919]: I0310 22:11:02.358113 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-52hn7\" (UniqueName: \"kubernetes.io/projected/474d4be4-538a-4463-a73c-68bbf7999276-kube-api-access-52hn7\") on node \"crc\" DevicePath \"\""
Mar 10 22:11:02 crc kubenswrapper[4919]: I0310 22:11:02.358122 4919 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/474d4be4-538a-4463-a73c-68bbf7999276-config\") on node \"crc\" DevicePath \"\""
Mar 10 22:11:02 crc kubenswrapper[4919]: I0310 22:11:02.443053 4919 generic.go:334] "Generic (PLEG): container finished" podID="0fdc6e48-f4e3-4c70-8cab-78ad58edd483" containerID="6f95126f36aeff559b6794298af240695704556b3b7aeebc326d0171c3e41fa0" exitCode=0
Mar 10 22:11:02 crc kubenswrapper[4919]: I0310 22:11:02.443128 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f7dd995-tz7jv" event={"ID":"0fdc6e48-f4e3-4c70-8cab-78ad58edd483","Type":"ContainerDied","Data":"6f95126f36aeff559b6794298af240695704556b3b7aeebc326d0171c3e41fa0"}
Mar 10 22:11:02 crc kubenswrapper[4919]: I0310 22:11:02.445919 4919 generic.go:334] "Generic (PLEG): container finished" podID="021a8737-7317-441c-835f-e9cdc78ad6c4" containerID="8032a1de5b9b38c6b121f564d0062bf5450157445837b30318ebc77c5d1c9240" exitCode=0
Mar 10 22:11:02 crc kubenswrapper[4919]: I0310 22:11:02.445961 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7df9cc56b7-zv8nm" event={"ID":"021a8737-7317-441c-835f-e9cdc78ad6c4","Type":"ContainerDied","Data":"8032a1de5b9b38c6b121f564d0062bf5450157445837b30318ebc77c5d1c9240"}
Mar 10 22:11:02 crc kubenswrapper[4919]: I0310 22:11:02.448130 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fdc8d4b69-2c5p8" event={"ID":"474d4be4-538a-4463-a73c-68bbf7999276","Type":"ContainerDied","Data":"e230c38942f56d291a1e30b0b2c61bfa432de26c666f390fb081b9033ade6744"}
Mar 10 22:11:02 crc kubenswrapper[4919]: I0310 22:11:02.448168 4919 scope.go:117] "RemoveContainer" containerID="0bcca0341a220b7ebd9d7f86a75a36ef6571beed14498faa8fa899702ee01901"
Mar 10 22:11:02 crc kubenswrapper[4919]: I0310 22:11:02.449782 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5fdc8d4b69-2c5p8"
Mar 10 22:11:02 crc kubenswrapper[4919]: I0310 22:11:02.512497 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5fdc8d4b69-2c5p8"]
Mar 10 22:11:02 crc kubenswrapper[4919]: I0310 22:11:02.517955 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5fdc8d4b69-2c5p8"]
Mar 10 22:11:03 crc kubenswrapper[4919]: I0310 22:11:03.492004 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="474d4be4-538a-4463-a73c-68bbf7999276" path="/var/lib/kubelet/pods/474d4be4-538a-4463-a73c-68bbf7999276/volumes"
Mar 10 22:11:03 crc kubenswrapper[4919]: I0310 22:11:03.535475 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0"
Mar 10 22:11:03 crc kubenswrapper[4919]: I0310 22:11:03.535518 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0"
Mar 10 22:11:03 crc kubenswrapper[4919]: I0310 22:11:03.616385 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0"
Mar 10 22:11:03 crc kubenswrapper[4919]: I0310 22:11:03.927964 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7df9cc56b7-zv8nm"
Mar 10 22:11:04 crc kubenswrapper[4919]: I0310 22:11:04.098746 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/021a8737-7317-441c-835f-e9cdc78ad6c4-dns-svc\") pod \"021a8737-7317-441c-835f-e9cdc78ad6c4\" (UID: \"021a8737-7317-441c-835f-e9cdc78ad6c4\") "
Mar 10 22:11:04 crc kubenswrapper[4919]: I0310 22:11:04.098929 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/021a8737-7317-441c-835f-e9cdc78ad6c4-config\") pod \"021a8737-7317-441c-835f-e9cdc78ad6c4\" (UID: \"021a8737-7317-441c-835f-e9cdc78ad6c4\") "
Mar 10 22:11:04 crc kubenswrapper[4919]: I0310 22:11:04.098956 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2wwt2\" (UniqueName: \"kubernetes.io/projected/021a8737-7317-441c-835f-e9cdc78ad6c4-kube-api-access-2wwt2\") pod \"021a8737-7317-441c-835f-e9cdc78ad6c4\" (UID: \"021a8737-7317-441c-835f-e9cdc78ad6c4\") "
Mar 10 22:11:04 crc kubenswrapper[4919]: I0310 22:11:04.104305 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/021a8737-7317-441c-835f-e9cdc78ad6c4-kube-api-access-2wwt2" (OuterVolumeSpecName: "kube-api-access-2wwt2") pod "021a8737-7317-441c-835f-e9cdc78ad6c4" (UID: "021a8737-7317-441c-835f-e9cdc78ad6c4"). InnerVolumeSpecName "kube-api-access-2wwt2".
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 22:11:04 crc kubenswrapper[4919]: I0310 22:11:04.134616 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/021a8737-7317-441c-835f-e9cdc78ad6c4-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "021a8737-7317-441c-835f-e9cdc78ad6c4" (UID: "021a8737-7317-441c-835f-e9cdc78ad6c4"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 22:11:04 crc kubenswrapper[4919]: I0310 22:11:04.138013 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/021a8737-7317-441c-835f-e9cdc78ad6c4-config" (OuterVolumeSpecName: "config") pod "021a8737-7317-441c-835f-e9cdc78ad6c4" (UID: "021a8737-7317-441c-835f-e9cdc78ad6c4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 22:11:04 crc kubenswrapper[4919]: I0310 22:11:04.202116 4919 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/021a8737-7317-441c-835f-e9cdc78ad6c4-config\") on node \"crc\" DevicePath \"\"" Mar 10 22:11:04 crc kubenswrapper[4919]: I0310 22:11:04.202171 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2wwt2\" (UniqueName: \"kubernetes.io/projected/021a8737-7317-441c-835f-e9cdc78ad6c4-kube-api-access-2wwt2\") on node \"crc\" DevicePath \"\"" Mar 10 22:11:04 crc kubenswrapper[4919]: I0310 22:11:04.202192 4919 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/021a8737-7317-441c-835f-e9cdc78ad6c4-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 10 22:11:04 crc kubenswrapper[4919]: I0310 22:11:04.254523 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 10 22:11:04 crc kubenswrapper[4919]: W0310 22:11:04.256830 4919 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf4c91bdf_1945_4c8a_ab05_3b619e4bdb2b.slice/crio-7eb0e5753f33d6c038d98c271a29b6f581084fd55a480e34dff0747a3ff91a53 WatchSource:0}: Error finding container 7eb0e5753f33d6c038d98c271a29b6f581084fd55a480e34dff0747a3ff91a53: Status 404 returned error can't find the container with id 7eb0e5753f33d6c038d98c271a29b6f581084fd55a480e34dff0747a3ff91a53 Mar 10 22:11:04 crc kubenswrapper[4919]: I0310 22:11:04.472779 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f7dd995-tz7jv" event={"ID":"0fdc6e48-f4e3-4c70-8cab-78ad58edd483","Type":"ContainerStarted","Data":"cb2bc5565b179fcfd14f52b85ea1c6e89021235076d927650d687bdcde2adaa2"} Mar 10 22:11:04 crc kubenswrapper[4919]: I0310 22:11:04.473694 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-675f7dd995-tz7jv" Mar 10 22:11:04 crc kubenswrapper[4919]: I0310 22:11:04.475422 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7df9cc56b7-zv8nm" event={"ID":"021a8737-7317-441c-835f-e9cdc78ad6c4","Type":"ContainerDied","Data":"a8e31304a4028fc19fbac1eae7fae1eef5c1c788e8b71d0a61c916babc928ef8"} Mar 10 22:11:04 crc kubenswrapper[4919]: I0310 22:11:04.475460 4919 scope.go:117] "RemoveContainer" containerID="8032a1de5b9b38c6b121f564d0062bf5450157445837b30318ebc77c5d1c9240" Mar 10 22:11:04 crc kubenswrapper[4919]: I0310 22:11:04.475545 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7df9cc56b7-zv8nm" Mar 10 22:11:04 crc kubenswrapper[4919]: I0310 22:11:04.485287 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-jm7n6" event={"ID":"0067a7fe-f5db-4832-a519-848ac8b771c0","Type":"ContainerStarted","Data":"99ab815c089630a419af1f8eee02aa01e609ba044e18bf1dea4c234c8e02a57b"} Mar 10 22:11:04 crc kubenswrapper[4919]: I0310 22:11:04.486602 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"f4c91bdf-1945-4c8a-ab05-3b619e4bdb2b","Type":"ContainerStarted","Data":"7eb0e5753f33d6c038d98c271a29b6f581084fd55a480e34dff0747a3ff91a53"} Mar 10 22:11:04 crc kubenswrapper[4919]: I0310 22:11:04.498619 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-675f7dd995-tz7jv" podStartSLOduration=4.498598187 podStartE2EDuration="4.498598187s" podCreationTimestamp="2026-03-10 22:11:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 22:11:04.49608456 +0000 UTC m=+1251.737965178" watchObservedRunningTime="2026-03-10 22:11:04.498598187 +0000 UTC m=+1251.740478825" Mar 10 22:11:04 crc kubenswrapper[4919]: I0310 22:11:04.515010 4919 scope.go:117] "RemoveContainer" containerID="4ca47abb5626cac1353b95fccc6a37d01a6f5ecaabb850ee8c399099faf42f23" Mar 10 22:11:04 crc kubenswrapper[4919]: I0310 22:11:04.531888 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-jm7n6" podStartSLOduration=1.95739974 podStartE2EDuration="6.531862109s" podCreationTimestamp="2026-03-10 22:10:58 +0000 UTC" firstStartedPulling="2026-03-10 22:10:59.260432206 +0000 UTC m=+1246.502312814" lastFinishedPulling="2026-03-10 22:11:03.834894575 +0000 UTC m=+1251.076775183" observedRunningTime="2026-03-10 22:11:04.520708367 +0000 UTC m=+1251.762588975" 
watchObservedRunningTime="2026-03-10 22:11:04.531862109 +0000 UTC m=+1251.773742727" Mar 10 22:11:04 crc kubenswrapper[4919]: I0310 22:11:04.543865 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7df9cc56b7-zv8nm"] Mar 10 22:11:04 crc kubenswrapper[4919]: I0310 22:11:04.552090 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7df9cc56b7-zv8nm"] Mar 10 22:11:04 crc kubenswrapper[4919]: I0310 22:11:04.582136 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Mar 10 22:11:04 crc kubenswrapper[4919]: I0310 22:11:04.843932 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Mar 10 22:11:04 crc kubenswrapper[4919]: I0310 22:11:04.957799 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Mar 10 22:11:05 crc kubenswrapper[4919]: I0310 22:11:05.498708 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="021a8737-7317-441c-835f-e9cdc78ad6c4" path="/var/lib/kubelet/pods/021a8737-7317-441c-835f-e9cdc78ad6c4/volumes" Mar 10 22:11:05 crc kubenswrapper[4919]: I0310 22:11:05.941319 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/91c8bbf6-8824-4e21-a491-86f2f657549a-etc-swift\") pod \"swift-storage-0\" (UID: \"91c8bbf6-8824-4e21-a491-86f2f657549a\") " pod="openstack/swift-storage-0" Mar 10 22:11:05 crc kubenswrapper[4919]: E0310 22:11:05.941672 4919 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 10 22:11:05 crc kubenswrapper[4919]: E0310 22:11:05.941701 4919 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 10 22:11:05 crc kubenswrapper[4919]: E0310 22:11:05.941751 4919 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/91c8bbf6-8824-4e21-a491-86f2f657549a-etc-swift podName:91c8bbf6-8824-4e21-a491-86f2f657549a nodeName:}" failed. No retries permitted until 2026-03-10 22:11:13.941734627 +0000 UTC m=+1261.183615255 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/91c8bbf6-8824-4e21-a491-86f2f657549a-etc-swift") pod "swift-storage-0" (UID: "91c8bbf6-8824-4e21-a491-86f2f657549a") : configmap "swift-ring-files" not found Mar 10 22:11:06 crc kubenswrapper[4919]: I0310 22:11:06.423287 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bbb7-account-create-update-tn746"] Mar 10 22:11:06 crc kubenswrapper[4919]: E0310 22:11:06.424038 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="021a8737-7317-441c-835f-e9cdc78ad6c4" containerName="dnsmasq-dns" Mar 10 22:11:06 crc kubenswrapper[4919]: I0310 22:11:06.424062 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="021a8737-7317-441c-835f-e9cdc78ad6c4" containerName="dnsmasq-dns" Mar 10 22:11:06 crc kubenswrapper[4919]: E0310 22:11:06.424108 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="021a8737-7317-441c-835f-e9cdc78ad6c4" containerName="init" Mar 10 22:11:06 crc kubenswrapper[4919]: I0310 22:11:06.424118 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="021a8737-7317-441c-835f-e9cdc78ad6c4" containerName="init" Mar 10 22:11:06 crc kubenswrapper[4919]: E0310 22:11:06.424133 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="474d4be4-538a-4463-a73c-68bbf7999276" containerName="init" Mar 10 22:11:06 crc kubenswrapper[4919]: I0310 22:11:06.424142 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="474d4be4-538a-4463-a73c-68bbf7999276" containerName="init" Mar 10 22:11:06 crc kubenswrapper[4919]: I0310 22:11:06.424305 4919 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="474d4be4-538a-4463-a73c-68bbf7999276" containerName="init" Mar 10 22:11:06 crc kubenswrapper[4919]: I0310 22:11:06.424327 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="021a8737-7317-441c-835f-e9cdc78ad6c4" containerName="dnsmasq-dns" Mar 10 22:11:06 crc kubenswrapper[4919]: I0310 22:11:06.424981 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bbb7-account-create-update-tn746" Mar 10 22:11:06 crc kubenswrapper[4919]: I0310 22:11:06.427005 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Mar 10 22:11:06 crc kubenswrapper[4919]: I0310 22:11:06.435716 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bbb7-account-create-update-tn746"] Mar 10 22:11:06 crc kubenswrapper[4919]: I0310 22:11:06.489479 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3ce2ca0f-3cf5-43aa-9ea4-687557fbc1ca-operator-scripts\") pod \"keystone-bbb7-account-create-update-tn746\" (UID: \"3ce2ca0f-3cf5-43aa-9ea4-687557fbc1ca\") " pod="openstack/keystone-bbb7-account-create-update-tn746" Mar 10 22:11:06 crc kubenswrapper[4919]: I0310 22:11:06.489881 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4sdh\" (UniqueName: \"kubernetes.io/projected/3ce2ca0f-3cf5-43aa-9ea4-687557fbc1ca-kube-api-access-x4sdh\") pod \"keystone-bbb7-account-create-update-tn746\" (UID: \"3ce2ca0f-3cf5-43aa-9ea4-687557fbc1ca\") " pod="openstack/keystone-bbb7-account-create-update-tn746" Mar 10 22:11:06 crc kubenswrapper[4919]: I0310 22:11:06.493376 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-h94vd"] Mar 10 22:11:06 crc kubenswrapper[4919]: I0310 22:11:06.494739 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-h94vd" Mar 10 22:11:06 crc kubenswrapper[4919]: I0310 22:11:06.536164 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"f4c91bdf-1945-4c8a-ab05-3b619e4bdb2b","Type":"ContainerStarted","Data":"95aec50666a5cbb9eb2fc08bcc44915e765c29007dcbf5a2bca002bcee7be03b"} Mar 10 22:11:06 crc kubenswrapper[4919]: I0310 22:11:06.536221 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-h94vd"] Mar 10 22:11:06 crc kubenswrapper[4919]: I0310 22:11:06.537657 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"f4c91bdf-1945-4c8a-ab05-3b619e4bdb2b","Type":"ContainerStarted","Data":"b0faad1e090c5549c73d125d875997379a03b88c3b976099f2f3aa7ea1f1ca9a"} Mar 10 22:11:06 crc kubenswrapper[4919]: I0310 22:11:06.537705 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Mar 10 22:11:06 crc kubenswrapper[4919]: I0310 22:11:06.564566 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-j6dcn"] Mar 10 22:11:06 crc kubenswrapper[4919]: I0310 22:11:06.565674 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-j6dcn" Mar 10 22:11:06 crc kubenswrapper[4919]: I0310 22:11:06.572999 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-j6dcn"] Mar 10 22:11:06 crc kubenswrapper[4919]: I0310 22:11:06.582041 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-5e3c-account-create-update-m2swx"] Mar 10 22:11:06 crc kubenswrapper[4919]: I0310 22:11:06.583034 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-5e3c-account-create-update-m2swx" Mar 10 22:11:06 crc kubenswrapper[4919]: I0310 22:11:06.587946 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Mar 10 22:11:06 crc kubenswrapper[4919]: I0310 22:11:06.591409 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8fqc\" (UniqueName: \"kubernetes.io/projected/f60753e3-36e6-4155-8fcd-7460f2803ea4-kube-api-access-l8fqc\") pod \"keystone-db-create-h94vd\" (UID: \"f60753e3-36e6-4155-8fcd-7460f2803ea4\") " pod="openstack/keystone-db-create-h94vd" Mar 10 22:11:06 crc kubenswrapper[4919]: I0310 22:11:06.591458 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3ce2ca0f-3cf5-43aa-9ea4-687557fbc1ca-operator-scripts\") pod \"keystone-bbb7-account-create-update-tn746\" (UID: \"3ce2ca0f-3cf5-43aa-9ea4-687557fbc1ca\") " pod="openstack/keystone-bbb7-account-create-update-tn746" Mar 10 22:11:06 crc kubenswrapper[4919]: I0310 22:11:06.591477 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xvkb9\" (UniqueName: \"kubernetes.io/projected/9d920fbb-265a-4fbd-8a5b-02a8fdcf216f-kube-api-access-xvkb9\") pod \"placement-5e3c-account-create-update-m2swx\" (UID: \"9d920fbb-265a-4fbd-8a5b-02a8fdcf216f\") " pod="openstack/placement-5e3c-account-create-update-m2swx" Mar 10 22:11:06 crc kubenswrapper[4919]: I0310 22:11:06.591536 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a4d007fb-1bda-48dd-ad03-6601dc770a2a-operator-scripts\") pod \"placement-db-create-j6dcn\" (UID: \"a4d007fb-1bda-48dd-ad03-6601dc770a2a\") " pod="openstack/placement-db-create-j6dcn" Mar 10 22:11:06 crc kubenswrapper[4919]: I0310 22:11:06.591561 
4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f60753e3-36e6-4155-8fcd-7460f2803ea4-operator-scripts\") pod \"keystone-db-create-h94vd\" (UID: \"f60753e3-36e6-4155-8fcd-7460f2803ea4\") " pod="openstack/keystone-db-create-h94vd" Mar 10 22:11:06 crc kubenswrapper[4919]: I0310 22:11:06.591589 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9d920fbb-265a-4fbd-8a5b-02a8fdcf216f-operator-scripts\") pod \"placement-5e3c-account-create-update-m2swx\" (UID: \"9d920fbb-265a-4fbd-8a5b-02a8fdcf216f\") " pod="openstack/placement-5e3c-account-create-update-m2swx" Mar 10 22:11:06 crc kubenswrapper[4919]: I0310 22:11:06.591619 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5r47d\" (UniqueName: \"kubernetes.io/projected/a4d007fb-1bda-48dd-ad03-6601dc770a2a-kube-api-access-5r47d\") pod \"placement-db-create-j6dcn\" (UID: \"a4d007fb-1bda-48dd-ad03-6601dc770a2a\") " pod="openstack/placement-db-create-j6dcn" Mar 10 22:11:06 crc kubenswrapper[4919]: I0310 22:11:06.591643 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x4sdh\" (UniqueName: \"kubernetes.io/projected/3ce2ca0f-3cf5-43aa-9ea4-687557fbc1ca-kube-api-access-x4sdh\") pod \"keystone-bbb7-account-create-update-tn746\" (UID: \"3ce2ca0f-3cf5-43aa-9ea4-687557fbc1ca\") " pod="openstack/keystone-bbb7-account-create-update-tn746" Mar 10 22:11:06 crc kubenswrapper[4919]: I0310 22:11:06.592016 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-5e3c-account-create-update-m2swx"] Mar 10 22:11:06 crc kubenswrapper[4919]: I0310 22:11:06.592667 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/3ce2ca0f-3cf5-43aa-9ea4-687557fbc1ca-operator-scripts\") pod \"keystone-bbb7-account-create-update-tn746\" (UID: \"3ce2ca0f-3cf5-43aa-9ea4-687557fbc1ca\") " pod="openstack/keystone-bbb7-account-create-update-tn746" Mar 10 22:11:06 crc kubenswrapper[4919]: I0310 22:11:06.594125 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=3.93975963 podStartE2EDuration="5.594107253s" podCreationTimestamp="2026-03-10 22:11:01 +0000 UTC" firstStartedPulling="2026-03-10 22:11:04.25877355 +0000 UTC m=+1251.500654158" lastFinishedPulling="2026-03-10 22:11:05.913121173 +0000 UTC m=+1253.155001781" observedRunningTime="2026-03-10 22:11:06.558721694 +0000 UTC m=+1253.800602312" watchObservedRunningTime="2026-03-10 22:11:06.594107253 +0000 UTC m=+1253.835987861" Mar 10 22:11:06 crc kubenswrapper[4919]: I0310 22:11:06.610952 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4sdh\" (UniqueName: \"kubernetes.io/projected/3ce2ca0f-3cf5-43aa-9ea4-687557fbc1ca-kube-api-access-x4sdh\") pod \"keystone-bbb7-account-create-update-tn746\" (UID: \"3ce2ca0f-3cf5-43aa-9ea4-687557fbc1ca\") " pod="openstack/keystone-bbb7-account-create-update-tn746" Mar 10 22:11:06 crc kubenswrapper[4919]: I0310 22:11:06.692973 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a4d007fb-1bda-48dd-ad03-6601dc770a2a-operator-scripts\") pod \"placement-db-create-j6dcn\" (UID: \"a4d007fb-1bda-48dd-ad03-6601dc770a2a\") " pod="openstack/placement-db-create-j6dcn" Mar 10 22:11:06 crc kubenswrapper[4919]: I0310 22:11:06.693023 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f60753e3-36e6-4155-8fcd-7460f2803ea4-operator-scripts\") pod \"keystone-db-create-h94vd\" (UID: \"f60753e3-36e6-4155-8fcd-7460f2803ea4\") 
" pod="openstack/keystone-db-create-h94vd" Mar 10 22:11:06 crc kubenswrapper[4919]: I0310 22:11:06.693045 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9d920fbb-265a-4fbd-8a5b-02a8fdcf216f-operator-scripts\") pod \"placement-5e3c-account-create-update-m2swx\" (UID: \"9d920fbb-265a-4fbd-8a5b-02a8fdcf216f\") " pod="openstack/placement-5e3c-account-create-update-m2swx" Mar 10 22:11:06 crc kubenswrapper[4919]: I0310 22:11:06.693086 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5r47d\" (UniqueName: \"kubernetes.io/projected/a4d007fb-1bda-48dd-ad03-6601dc770a2a-kube-api-access-5r47d\") pod \"placement-db-create-j6dcn\" (UID: \"a4d007fb-1bda-48dd-ad03-6601dc770a2a\") " pod="openstack/placement-db-create-j6dcn" Mar 10 22:11:06 crc kubenswrapper[4919]: I0310 22:11:06.693185 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l8fqc\" (UniqueName: \"kubernetes.io/projected/f60753e3-36e6-4155-8fcd-7460f2803ea4-kube-api-access-l8fqc\") pod \"keystone-db-create-h94vd\" (UID: \"f60753e3-36e6-4155-8fcd-7460f2803ea4\") " pod="openstack/keystone-db-create-h94vd" Mar 10 22:11:06 crc kubenswrapper[4919]: I0310 22:11:06.693244 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xvkb9\" (UniqueName: \"kubernetes.io/projected/9d920fbb-265a-4fbd-8a5b-02a8fdcf216f-kube-api-access-xvkb9\") pod \"placement-5e3c-account-create-update-m2swx\" (UID: \"9d920fbb-265a-4fbd-8a5b-02a8fdcf216f\") " pod="openstack/placement-5e3c-account-create-update-m2swx" Mar 10 22:11:06 crc kubenswrapper[4919]: I0310 22:11:06.693993 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a4d007fb-1bda-48dd-ad03-6601dc770a2a-operator-scripts\") pod \"placement-db-create-j6dcn\" (UID: 
\"a4d007fb-1bda-48dd-ad03-6601dc770a2a\") " pod="openstack/placement-db-create-j6dcn" Mar 10 22:11:06 crc kubenswrapper[4919]: I0310 22:11:06.694040 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9d920fbb-265a-4fbd-8a5b-02a8fdcf216f-operator-scripts\") pod \"placement-5e3c-account-create-update-m2swx\" (UID: \"9d920fbb-265a-4fbd-8a5b-02a8fdcf216f\") " pod="openstack/placement-5e3c-account-create-update-m2swx" Mar 10 22:11:06 crc kubenswrapper[4919]: I0310 22:11:06.694006 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f60753e3-36e6-4155-8fcd-7460f2803ea4-operator-scripts\") pod \"keystone-db-create-h94vd\" (UID: \"f60753e3-36e6-4155-8fcd-7460f2803ea4\") " pod="openstack/keystone-db-create-h94vd" Mar 10 22:11:06 crc kubenswrapper[4919]: I0310 22:11:06.715910 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5r47d\" (UniqueName: \"kubernetes.io/projected/a4d007fb-1bda-48dd-ad03-6601dc770a2a-kube-api-access-5r47d\") pod \"placement-db-create-j6dcn\" (UID: \"a4d007fb-1bda-48dd-ad03-6601dc770a2a\") " pod="openstack/placement-db-create-j6dcn" Mar 10 22:11:06 crc kubenswrapper[4919]: I0310 22:11:06.716149 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8fqc\" (UniqueName: \"kubernetes.io/projected/f60753e3-36e6-4155-8fcd-7460f2803ea4-kube-api-access-l8fqc\") pod \"keystone-db-create-h94vd\" (UID: \"f60753e3-36e6-4155-8fcd-7460f2803ea4\") " pod="openstack/keystone-db-create-h94vd" Mar 10 22:11:06 crc kubenswrapper[4919]: I0310 22:11:06.716601 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xvkb9\" (UniqueName: \"kubernetes.io/projected/9d920fbb-265a-4fbd-8a5b-02a8fdcf216f-kube-api-access-xvkb9\") pod \"placement-5e3c-account-create-update-m2swx\" (UID: 
\"9d920fbb-265a-4fbd-8a5b-02a8fdcf216f\") " pod="openstack/placement-5e3c-account-create-update-m2swx" Mar 10 22:11:06 crc kubenswrapper[4919]: I0310 22:11:06.744422 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bbb7-account-create-update-tn746" Mar 10 22:11:06 crc kubenswrapper[4919]: I0310 22:11:06.828265 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-h94vd" Mar 10 22:11:06 crc kubenswrapper[4919]: I0310 22:11:06.889061 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-j6dcn" Mar 10 22:11:06 crc kubenswrapper[4919]: I0310 22:11:06.906831 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-5e3c-account-create-update-m2swx" Mar 10 22:11:07 crc kubenswrapper[4919]: I0310 22:11:07.169346 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bbb7-account-create-update-tn746"] Mar 10 22:11:07 crc kubenswrapper[4919]: I0310 22:11:07.262130 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-h94vd"] Mar 10 22:11:07 crc kubenswrapper[4919]: I0310 22:11:07.379910 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-5e3c-account-create-update-m2swx"] Mar 10 22:11:07 crc kubenswrapper[4919]: W0310 22:11:07.463568 4919 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda4d007fb_1bda_48dd_ad03_6601dc770a2a.slice/crio-363f6819d085526eb1a39f2aafa015665debf0fa62d7f430ee0dc8eb5c76dc3d WatchSource:0}: Error finding container 363f6819d085526eb1a39f2aafa015665debf0fa62d7f430ee0dc8eb5c76dc3d: Status 404 returned error can't find the container with id 363f6819d085526eb1a39f2aafa015665debf0fa62d7f430ee0dc8eb5c76dc3d Mar 10 22:11:07 crc kubenswrapper[4919]: I0310 22:11:07.464496 4919 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-j6dcn"] Mar 10 22:11:07 crc kubenswrapper[4919]: I0310 22:11:07.552943 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-h94vd" event={"ID":"f60753e3-36e6-4155-8fcd-7460f2803ea4","Type":"ContainerStarted","Data":"6348edd095c11b2dfbe689495aad97ef5ace707e77fe77df248c5adecd0d8c19"} Mar 10 22:11:07 crc kubenswrapper[4919]: I0310 22:11:07.552987 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-h94vd" event={"ID":"f60753e3-36e6-4155-8fcd-7460f2803ea4","Type":"ContainerStarted","Data":"a63918678542760553737a88347611e93c4cf5c24d81e1a86a869955014a19e4"} Mar 10 22:11:07 crc kubenswrapper[4919]: I0310 22:11:07.554315 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bbb7-account-create-update-tn746" event={"ID":"3ce2ca0f-3cf5-43aa-9ea4-687557fbc1ca","Type":"ContainerStarted","Data":"05a129c14cde6b4a62a3b17eec8fd6a1dbb19c5c9e8518261a398182a085ee06"} Mar 10 22:11:07 crc kubenswrapper[4919]: I0310 22:11:07.554358 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bbb7-account-create-update-tn746" event={"ID":"3ce2ca0f-3cf5-43aa-9ea4-687557fbc1ca","Type":"ContainerStarted","Data":"cf1d035627101a79d6f7a4713cd5e41be16fe6512d85f70ecd76c985a819ae43"} Mar 10 22:11:07 crc kubenswrapper[4919]: I0310 22:11:07.561353 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5e3c-account-create-update-m2swx" event={"ID":"9d920fbb-265a-4fbd-8a5b-02a8fdcf216f","Type":"ContainerStarted","Data":"b2fdb3586f051235e67dbb934d3c6c9b5f0c26c39b2ae5622c9ee6f8f6dee668"} Mar 10 22:11:07 crc kubenswrapper[4919]: I0310 22:11:07.564881 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-j6dcn" event={"ID":"a4d007fb-1bda-48dd-ad03-6601dc770a2a","Type":"ContainerStarted","Data":"363f6819d085526eb1a39f2aafa015665debf0fa62d7f430ee0dc8eb5c76dc3d"} Mar 
10 22:11:07 crc kubenswrapper[4919]: I0310 22:11:07.578705 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-create-h94vd" podStartSLOduration=1.578680268 podStartE2EDuration="1.578680268s" podCreationTimestamp="2026-03-10 22:11:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 22:11:07.567016883 +0000 UTC m=+1254.808897501" watchObservedRunningTime="2026-03-10 22:11:07.578680268 +0000 UTC m=+1254.820560866" Mar 10 22:11:07 crc kubenswrapper[4919]: I0310 22:11:07.604599 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bbb7-account-create-update-tn746" podStartSLOduration=1.604578231 podStartE2EDuration="1.604578231s" podCreationTimestamp="2026-03-10 22:11:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 22:11:07.599112482 +0000 UTC m=+1254.840993080" watchObservedRunningTime="2026-03-10 22:11:07.604578231 +0000 UTC m=+1254.846458839" Mar 10 22:11:07 crc kubenswrapper[4919]: I0310 22:11:07.605775 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-5e3c-account-create-update-m2swx" podStartSLOduration=1.6057671230000001 podStartE2EDuration="1.605767123s" podCreationTimestamp="2026-03-10 22:11:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 22:11:07.584231349 +0000 UTC m=+1254.826111957" watchObservedRunningTime="2026-03-10 22:11:07.605767123 +0000 UTC m=+1254.847647731" Mar 10 22:11:08 crc kubenswrapper[4919]: I0310 22:11:08.577453 4919 generic.go:334] "Generic (PLEG): container finished" podID="3ce2ca0f-3cf5-43aa-9ea4-687557fbc1ca" containerID="05a129c14cde6b4a62a3b17eec8fd6a1dbb19c5c9e8518261a398182a085ee06" exitCode=0 Mar 10 
22:11:08 crc kubenswrapper[4919]: I0310 22:11:08.577828 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bbb7-account-create-update-tn746" event={"ID":"3ce2ca0f-3cf5-43aa-9ea4-687557fbc1ca","Type":"ContainerDied","Data":"05a129c14cde6b4a62a3b17eec8fd6a1dbb19c5c9e8518261a398182a085ee06"} Mar 10 22:11:08 crc kubenswrapper[4919]: I0310 22:11:08.580649 4919 generic.go:334] "Generic (PLEG): container finished" podID="9d920fbb-265a-4fbd-8a5b-02a8fdcf216f" containerID="fd7c8f0b306e73496827ddf94aa00307911195ef6e69e874d9d894cdee7e72a3" exitCode=0 Mar 10 22:11:08 crc kubenswrapper[4919]: I0310 22:11:08.580715 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5e3c-account-create-update-m2swx" event={"ID":"9d920fbb-265a-4fbd-8a5b-02a8fdcf216f","Type":"ContainerDied","Data":"fd7c8f0b306e73496827ddf94aa00307911195ef6e69e874d9d894cdee7e72a3"} Mar 10 22:11:08 crc kubenswrapper[4919]: I0310 22:11:08.587010 4919 generic.go:334] "Generic (PLEG): container finished" podID="a4d007fb-1bda-48dd-ad03-6601dc770a2a" containerID="0bd792ea437cac42db6d52a44652671ab655e56b2f2eee19cc85a6060157c0a3" exitCode=0 Mar 10 22:11:08 crc kubenswrapper[4919]: I0310 22:11:08.587158 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-j6dcn" event={"ID":"a4d007fb-1bda-48dd-ad03-6601dc770a2a","Type":"ContainerDied","Data":"0bd792ea437cac42db6d52a44652671ab655e56b2f2eee19cc85a6060157c0a3"} Mar 10 22:11:08 crc kubenswrapper[4919]: I0310 22:11:08.593116 4919 generic.go:334] "Generic (PLEG): container finished" podID="f60753e3-36e6-4155-8fcd-7460f2803ea4" containerID="6348edd095c11b2dfbe689495aad97ef5ace707e77fe77df248c5adecd0d8c19" exitCode=0 Mar 10 22:11:08 crc kubenswrapper[4919]: I0310 22:11:08.593297 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-h94vd" 
event={"ID":"f60753e3-36e6-4155-8fcd-7460f2803ea4","Type":"ContainerDied","Data":"6348edd095c11b2dfbe689495aad97ef5ace707e77fe77df248c5adecd0d8c19"} Mar 10 22:11:10 crc kubenswrapper[4919]: I0310 22:11:10.029256 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bbb7-account-create-update-tn746" Mar 10 22:11:10 crc kubenswrapper[4919]: I0310 22:11:10.150267 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3ce2ca0f-3cf5-43aa-9ea4-687557fbc1ca-operator-scripts\") pod \"3ce2ca0f-3cf5-43aa-9ea4-687557fbc1ca\" (UID: \"3ce2ca0f-3cf5-43aa-9ea4-687557fbc1ca\") " Mar 10 22:11:10 crc kubenswrapper[4919]: I0310 22:11:10.150362 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4sdh\" (UniqueName: \"kubernetes.io/projected/3ce2ca0f-3cf5-43aa-9ea4-687557fbc1ca-kube-api-access-x4sdh\") pod \"3ce2ca0f-3cf5-43aa-9ea4-687557fbc1ca\" (UID: \"3ce2ca0f-3cf5-43aa-9ea4-687557fbc1ca\") " Mar 10 22:11:10 crc kubenswrapper[4919]: I0310 22:11:10.151471 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3ce2ca0f-3cf5-43aa-9ea4-687557fbc1ca-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3ce2ca0f-3cf5-43aa-9ea4-687557fbc1ca" (UID: "3ce2ca0f-3cf5-43aa-9ea4-687557fbc1ca"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 22:11:10 crc kubenswrapper[4919]: I0310 22:11:10.156180 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ce2ca0f-3cf5-43aa-9ea4-687557fbc1ca-kube-api-access-x4sdh" (OuterVolumeSpecName: "kube-api-access-x4sdh") pod "3ce2ca0f-3cf5-43aa-9ea4-687557fbc1ca" (UID: "3ce2ca0f-3cf5-43aa-9ea4-687557fbc1ca"). InnerVolumeSpecName "kube-api-access-x4sdh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 22:11:10 crc kubenswrapper[4919]: I0310 22:11:10.188059 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-h94vd" Mar 10 22:11:10 crc kubenswrapper[4919]: I0310 22:11:10.196043 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-5e3c-account-create-update-m2swx" Mar 10 22:11:10 crc kubenswrapper[4919]: I0310 22:11:10.206522 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-j6dcn" Mar 10 22:11:10 crc kubenswrapper[4919]: I0310 22:11:10.253283 4919 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3ce2ca0f-3cf5-43aa-9ea4-687557fbc1ca-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 22:11:10 crc kubenswrapper[4919]: I0310 22:11:10.253314 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4sdh\" (UniqueName: \"kubernetes.io/projected/3ce2ca0f-3cf5-43aa-9ea4-687557fbc1ca-kube-api-access-x4sdh\") on node \"crc\" DevicePath \"\"" Mar 10 22:11:10 crc kubenswrapper[4919]: I0310 22:11:10.358064 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l8fqc\" (UniqueName: \"kubernetes.io/projected/f60753e3-36e6-4155-8fcd-7460f2803ea4-kube-api-access-l8fqc\") pod \"f60753e3-36e6-4155-8fcd-7460f2803ea4\" (UID: \"f60753e3-36e6-4155-8fcd-7460f2803ea4\") " Mar 10 22:11:10 crc kubenswrapper[4919]: I0310 22:11:10.358156 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f60753e3-36e6-4155-8fcd-7460f2803ea4-operator-scripts\") pod \"f60753e3-36e6-4155-8fcd-7460f2803ea4\" (UID: \"f60753e3-36e6-4155-8fcd-7460f2803ea4\") " Mar 10 22:11:10 crc kubenswrapper[4919]: I0310 22:11:10.358227 4919 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a4d007fb-1bda-48dd-ad03-6601dc770a2a-operator-scripts\") pod \"a4d007fb-1bda-48dd-ad03-6601dc770a2a\" (UID: \"a4d007fb-1bda-48dd-ad03-6601dc770a2a\") " Mar 10 22:11:10 crc kubenswrapper[4919]: I0310 22:11:10.358282 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9d920fbb-265a-4fbd-8a5b-02a8fdcf216f-operator-scripts\") pod \"9d920fbb-265a-4fbd-8a5b-02a8fdcf216f\" (UID: \"9d920fbb-265a-4fbd-8a5b-02a8fdcf216f\") " Mar 10 22:11:10 crc kubenswrapper[4919]: I0310 22:11:10.358323 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5r47d\" (UniqueName: \"kubernetes.io/projected/a4d007fb-1bda-48dd-ad03-6601dc770a2a-kube-api-access-5r47d\") pod \"a4d007fb-1bda-48dd-ad03-6601dc770a2a\" (UID: \"a4d007fb-1bda-48dd-ad03-6601dc770a2a\") " Mar 10 22:11:10 crc kubenswrapper[4919]: I0310 22:11:10.358358 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xvkb9\" (UniqueName: \"kubernetes.io/projected/9d920fbb-265a-4fbd-8a5b-02a8fdcf216f-kube-api-access-xvkb9\") pod \"9d920fbb-265a-4fbd-8a5b-02a8fdcf216f\" (UID: \"9d920fbb-265a-4fbd-8a5b-02a8fdcf216f\") " Mar 10 22:11:10 crc kubenswrapper[4919]: I0310 22:11:10.358861 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a4d007fb-1bda-48dd-ad03-6601dc770a2a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a4d007fb-1bda-48dd-ad03-6601dc770a2a" (UID: "a4d007fb-1bda-48dd-ad03-6601dc770a2a"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 22:11:10 crc kubenswrapper[4919]: I0310 22:11:10.359480 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d920fbb-265a-4fbd-8a5b-02a8fdcf216f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9d920fbb-265a-4fbd-8a5b-02a8fdcf216f" (UID: "9d920fbb-265a-4fbd-8a5b-02a8fdcf216f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 22:11:10 crc kubenswrapper[4919]: I0310 22:11:10.359818 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f60753e3-36e6-4155-8fcd-7460f2803ea4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f60753e3-36e6-4155-8fcd-7460f2803ea4" (UID: "f60753e3-36e6-4155-8fcd-7460f2803ea4"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 22:11:10 crc kubenswrapper[4919]: I0310 22:11:10.363788 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f60753e3-36e6-4155-8fcd-7460f2803ea4-kube-api-access-l8fqc" (OuterVolumeSpecName: "kube-api-access-l8fqc") pod "f60753e3-36e6-4155-8fcd-7460f2803ea4" (UID: "f60753e3-36e6-4155-8fcd-7460f2803ea4"). InnerVolumeSpecName "kube-api-access-l8fqc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 22:11:10 crc kubenswrapper[4919]: I0310 22:11:10.364269 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4d007fb-1bda-48dd-ad03-6601dc770a2a-kube-api-access-5r47d" (OuterVolumeSpecName: "kube-api-access-5r47d") pod "a4d007fb-1bda-48dd-ad03-6601dc770a2a" (UID: "a4d007fb-1bda-48dd-ad03-6601dc770a2a"). InnerVolumeSpecName "kube-api-access-5r47d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 22:11:10 crc kubenswrapper[4919]: I0310 22:11:10.366218 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d920fbb-265a-4fbd-8a5b-02a8fdcf216f-kube-api-access-xvkb9" (OuterVolumeSpecName: "kube-api-access-xvkb9") pod "9d920fbb-265a-4fbd-8a5b-02a8fdcf216f" (UID: "9d920fbb-265a-4fbd-8a5b-02a8fdcf216f"). InnerVolumeSpecName "kube-api-access-xvkb9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 22:11:10 crc kubenswrapper[4919]: I0310 22:11:10.454852 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-675f7dd995-tz7jv" Mar 10 22:11:10 crc kubenswrapper[4919]: I0310 22:11:10.461308 4919 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9d920fbb-265a-4fbd-8a5b-02a8fdcf216f-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 22:11:10 crc kubenswrapper[4919]: I0310 22:11:10.461371 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5r47d\" (UniqueName: \"kubernetes.io/projected/a4d007fb-1bda-48dd-ad03-6601dc770a2a-kube-api-access-5r47d\") on node \"crc\" DevicePath \"\"" Mar 10 22:11:10 crc kubenswrapper[4919]: I0310 22:11:10.461402 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xvkb9\" (UniqueName: \"kubernetes.io/projected/9d920fbb-265a-4fbd-8a5b-02a8fdcf216f-kube-api-access-xvkb9\") on node \"crc\" DevicePath \"\"" Mar 10 22:11:10 crc kubenswrapper[4919]: I0310 22:11:10.461420 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l8fqc\" (UniqueName: \"kubernetes.io/projected/f60753e3-36e6-4155-8fcd-7460f2803ea4-kube-api-access-l8fqc\") on node \"crc\" DevicePath \"\"" Mar 10 22:11:10 crc kubenswrapper[4919]: I0310 22:11:10.461433 4919 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/f60753e3-36e6-4155-8fcd-7460f2803ea4-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 22:11:10 crc kubenswrapper[4919]: I0310 22:11:10.461446 4919 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a4d007fb-1bda-48dd-ad03-6601dc770a2a-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 22:11:10 crc kubenswrapper[4919]: I0310 22:11:10.511126 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7c47bcb9f9-jmrlb"] Mar 10 22:11:10 crc kubenswrapper[4919]: I0310 22:11:10.511410 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7c47bcb9f9-jmrlb" podUID="4a25dadc-79b5-4535-a9b2-92a9b184119c" containerName="dnsmasq-dns" containerID="cri-o://f8d6edf939a388528ddce5c41e055ae0f34219e27be8043c3bf3739f6e3bf5ea" gracePeriod=10 Mar 10 22:11:10 crc kubenswrapper[4919]: I0310 22:11:10.610741 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5e3c-account-create-update-m2swx" event={"ID":"9d920fbb-265a-4fbd-8a5b-02a8fdcf216f","Type":"ContainerDied","Data":"b2fdb3586f051235e67dbb934d3c6c9b5f0c26c39b2ae5622c9ee6f8f6dee668"} Mar 10 22:11:10 crc kubenswrapper[4919]: I0310 22:11:10.611048 4919 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b2fdb3586f051235e67dbb934d3c6c9b5f0c26c39b2ae5622c9ee6f8f6dee668" Mar 10 22:11:10 crc kubenswrapper[4919]: I0310 22:11:10.610791 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-5e3c-account-create-update-m2swx" Mar 10 22:11:10 crc kubenswrapper[4919]: I0310 22:11:10.612052 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-j6dcn" event={"ID":"a4d007fb-1bda-48dd-ad03-6601dc770a2a","Type":"ContainerDied","Data":"363f6819d085526eb1a39f2aafa015665debf0fa62d7f430ee0dc8eb5c76dc3d"} Mar 10 22:11:10 crc kubenswrapper[4919]: I0310 22:11:10.612076 4919 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="363f6819d085526eb1a39f2aafa015665debf0fa62d7f430ee0dc8eb5c76dc3d" Mar 10 22:11:10 crc kubenswrapper[4919]: I0310 22:11:10.612119 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-j6dcn" Mar 10 22:11:10 crc kubenswrapper[4919]: I0310 22:11:10.625831 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-h94vd" Mar 10 22:11:10 crc kubenswrapper[4919]: I0310 22:11:10.625835 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-h94vd" event={"ID":"f60753e3-36e6-4155-8fcd-7460f2803ea4","Type":"ContainerDied","Data":"a63918678542760553737a88347611e93c4cf5c24d81e1a86a869955014a19e4"} Mar 10 22:11:10 crc kubenswrapper[4919]: I0310 22:11:10.625872 4919 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a63918678542760553737a88347611e93c4cf5c24d81e1a86a869955014a19e4" Mar 10 22:11:10 crc kubenswrapper[4919]: I0310 22:11:10.627820 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bbb7-account-create-update-tn746" event={"ID":"3ce2ca0f-3cf5-43aa-9ea4-687557fbc1ca","Type":"ContainerDied","Data":"cf1d035627101a79d6f7a4713cd5e41be16fe6512d85f70ecd76c985a819ae43"} Mar 10 22:11:10 crc kubenswrapper[4919]: I0310 22:11:10.627862 4919 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="cf1d035627101a79d6f7a4713cd5e41be16fe6512d85f70ecd76c985a819ae43" Mar 10 22:11:10 crc kubenswrapper[4919]: I0310 22:11:10.627926 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bbb7-account-create-update-tn746" Mar 10 22:11:10 crc kubenswrapper[4919]: I0310 22:11:10.883706 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-sdm6h"] Mar 10 22:11:10 crc kubenswrapper[4919]: E0310 22:11:10.884110 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ce2ca0f-3cf5-43aa-9ea4-687557fbc1ca" containerName="mariadb-account-create-update" Mar 10 22:11:10 crc kubenswrapper[4919]: I0310 22:11:10.884130 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ce2ca0f-3cf5-43aa-9ea4-687557fbc1ca" containerName="mariadb-account-create-update" Mar 10 22:11:10 crc kubenswrapper[4919]: E0310 22:11:10.884143 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f60753e3-36e6-4155-8fcd-7460f2803ea4" containerName="mariadb-database-create" Mar 10 22:11:10 crc kubenswrapper[4919]: I0310 22:11:10.884149 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="f60753e3-36e6-4155-8fcd-7460f2803ea4" containerName="mariadb-database-create" Mar 10 22:11:10 crc kubenswrapper[4919]: E0310 22:11:10.884166 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4d007fb-1bda-48dd-ad03-6601dc770a2a" containerName="mariadb-database-create" Mar 10 22:11:10 crc kubenswrapper[4919]: I0310 22:11:10.884173 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4d007fb-1bda-48dd-ad03-6601dc770a2a" containerName="mariadb-database-create" Mar 10 22:11:10 crc kubenswrapper[4919]: E0310 22:11:10.884195 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d920fbb-265a-4fbd-8a5b-02a8fdcf216f" containerName="mariadb-account-create-update" Mar 10 22:11:10 crc kubenswrapper[4919]: I0310 22:11:10.884201 4919 
state_mem.go:107] "Deleted CPUSet assignment" podUID="9d920fbb-265a-4fbd-8a5b-02a8fdcf216f" containerName="mariadb-account-create-update" Mar 10 22:11:10 crc kubenswrapper[4919]: I0310 22:11:10.884367 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="f60753e3-36e6-4155-8fcd-7460f2803ea4" containerName="mariadb-database-create" Mar 10 22:11:10 crc kubenswrapper[4919]: I0310 22:11:10.884414 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ce2ca0f-3cf5-43aa-9ea4-687557fbc1ca" containerName="mariadb-account-create-update" Mar 10 22:11:10 crc kubenswrapper[4919]: I0310 22:11:10.884426 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4d007fb-1bda-48dd-ad03-6601dc770a2a" containerName="mariadb-database-create" Mar 10 22:11:10 crc kubenswrapper[4919]: I0310 22:11:10.884434 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d920fbb-265a-4fbd-8a5b-02a8fdcf216f" containerName="mariadb-account-create-update" Mar 10 22:11:10 crc kubenswrapper[4919]: I0310 22:11:10.884984 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-sdm6h" Mar 10 22:11:10 crc kubenswrapper[4919]: I0310 22:11:10.887977 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Mar 10 22:11:10 crc kubenswrapper[4919]: I0310 22:11:10.896795 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-sdm6h"] Mar 10 22:11:10 crc kubenswrapper[4919]: I0310 22:11:10.930564 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7c47bcb9f9-jmrlb" Mar 10 22:11:10 crc kubenswrapper[4919]: I0310 22:11:10.968088 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4a25dadc-79b5-4535-a9b2-92a9b184119c-dns-svc\") pod \"4a25dadc-79b5-4535-a9b2-92a9b184119c\" (UID: \"4a25dadc-79b5-4535-a9b2-92a9b184119c\") " Mar 10 22:11:10 crc kubenswrapper[4919]: I0310 22:11:10.968199 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a25dadc-79b5-4535-a9b2-92a9b184119c-config\") pod \"4a25dadc-79b5-4535-a9b2-92a9b184119c\" (UID: \"4a25dadc-79b5-4535-a9b2-92a9b184119c\") " Mar 10 22:11:10 crc kubenswrapper[4919]: I0310 22:11:10.968254 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xslp5\" (UniqueName: \"kubernetes.io/projected/4a25dadc-79b5-4535-a9b2-92a9b184119c-kube-api-access-xslp5\") pod \"4a25dadc-79b5-4535-a9b2-92a9b184119c\" (UID: \"4a25dadc-79b5-4535-a9b2-92a9b184119c\") " Mar 10 22:11:10 crc kubenswrapper[4919]: I0310 22:11:10.968969 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jxstx\" (UniqueName: \"kubernetes.io/projected/cce777ff-58ad-40ac-83cd-d8a9993d77b3-kube-api-access-jxstx\") pod \"root-account-create-update-sdm6h\" (UID: \"cce777ff-58ad-40ac-83cd-d8a9993d77b3\") " pod="openstack/root-account-create-update-sdm6h" Mar 10 22:11:10 crc kubenswrapper[4919]: I0310 22:11:10.969109 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cce777ff-58ad-40ac-83cd-d8a9993d77b3-operator-scripts\") pod \"root-account-create-update-sdm6h\" (UID: \"cce777ff-58ad-40ac-83cd-d8a9993d77b3\") " pod="openstack/root-account-create-update-sdm6h" Mar 10 22:11:10 crc 
kubenswrapper[4919]: I0310 22:11:10.973017 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a25dadc-79b5-4535-a9b2-92a9b184119c-kube-api-access-xslp5" (OuterVolumeSpecName: "kube-api-access-xslp5") pod "4a25dadc-79b5-4535-a9b2-92a9b184119c" (UID: "4a25dadc-79b5-4535-a9b2-92a9b184119c"). InnerVolumeSpecName "kube-api-access-xslp5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 22:11:11 crc kubenswrapper[4919]: I0310 22:11:11.010491 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a25dadc-79b5-4535-a9b2-92a9b184119c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4a25dadc-79b5-4535-a9b2-92a9b184119c" (UID: "4a25dadc-79b5-4535-a9b2-92a9b184119c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 22:11:11 crc kubenswrapper[4919]: I0310 22:11:11.012662 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a25dadc-79b5-4535-a9b2-92a9b184119c-config" (OuterVolumeSpecName: "config") pod "4a25dadc-79b5-4535-a9b2-92a9b184119c" (UID: "4a25dadc-79b5-4535-a9b2-92a9b184119c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 22:11:11 crc kubenswrapper[4919]: I0310 22:11:11.069806 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cce777ff-58ad-40ac-83cd-d8a9993d77b3-operator-scripts\") pod \"root-account-create-update-sdm6h\" (UID: \"cce777ff-58ad-40ac-83cd-d8a9993d77b3\") " pod="openstack/root-account-create-update-sdm6h" Mar 10 22:11:11 crc kubenswrapper[4919]: I0310 22:11:11.069860 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jxstx\" (UniqueName: \"kubernetes.io/projected/cce777ff-58ad-40ac-83cd-d8a9993d77b3-kube-api-access-jxstx\") pod \"root-account-create-update-sdm6h\" (UID: \"cce777ff-58ad-40ac-83cd-d8a9993d77b3\") " pod="openstack/root-account-create-update-sdm6h" Mar 10 22:11:11 crc kubenswrapper[4919]: I0310 22:11:11.069994 4919 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4a25dadc-79b5-4535-a9b2-92a9b184119c-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 10 22:11:11 crc kubenswrapper[4919]: I0310 22:11:11.070012 4919 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a25dadc-79b5-4535-a9b2-92a9b184119c-config\") on node \"crc\" DevicePath \"\"" Mar 10 22:11:11 crc kubenswrapper[4919]: I0310 22:11:11.070021 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xslp5\" (UniqueName: \"kubernetes.io/projected/4a25dadc-79b5-4535-a9b2-92a9b184119c-kube-api-access-xslp5\") on node \"crc\" DevicePath \"\"" Mar 10 22:11:11 crc kubenswrapper[4919]: I0310 22:11:11.070623 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cce777ff-58ad-40ac-83cd-d8a9993d77b3-operator-scripts\") pod \"root-account-create-update-sdm6h\" (UID: \"cce777ff-58ad-40ac-83cd-d8a9993d77b3\") " 
pod="openstack/root-account-create-update-sdm6h" Mar 10 22:11:11 crc kubenswrapper[4919]: I0310 22:11:11.097867 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jxstx\" (UniqueName: \"kubernetes.io/projected/cce777ff-58ad-40ac-83cd-d8a9993d77b3-kube-api-access-jxstx\") pod \"root-account-create-update-sdm6h\" (UID: \"cce777ff-58ad-40ac-83cd-d8a9993d77b3\") " pod="openstack/root-account-create-update-sdm6h" Mar 10 22:11:11 crc kubenswrapper[4919]: I0310 22:11:11.207025 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-sdm6h" Mar 10 22:11:11 crc kubenswrapper[4919]: I0310 22:11:11.613059 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-sdm6h"] Mar 10 22:11:11 crc kubenswrapper[4919]: I0310 22:11:11.635498 4919 generic.go:334] "Generic (PLEG): container finished" podID="4a25dadc-79b5-4535-a9b2-92a9b184119c" containerID="f8d6edf939a388528ddce5c41e055ae0f34219e27be8043c3bf3739f6e3bf5ea" exitCode=0 Mar 10 22:11:11 crc kubenswrapper[4919]: I0310 22:11:11.635554 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7c47bcb9f9-jmrlb" Mar 10 22:11:11 crc kubenswrapper[4919]: I0310 22:11:11.635570 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c47bcb9f9-jmrlb" event={"ID":"4a25dadc-79b5-4535-a9b2-92a9b184119c","Type":"ContainerDied","Data":"f8d6edf939a388528ddce5c41e055ae0f34219e27be8043c3bf3739f6e3bf5ea"} Mar 10 22:11:11 crc kubenswrapper[4919]: I0310 22:11:11.635598 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c47bcb9f9-jmrlb" event={"ID":"4a25dadc-79b5-4535-a9b2-92a9b184119c","Type":"ContainerDied","Data":"cfa8aa21c47f0d6277c154940903ea74991445112432f667004c73d07e790aa0"} Mar 10 22:11:11 crc kubenswrapper[4919]: I0310 22:11:11.635614 4919 scope.go:117] "RemoveContainer" containerID="f8d6edf939a388528ddce5c41e055ae0f34219e27be8043c3bf3739f6e3bf5ea" Mar 10 22:11:11 crc kubenswrapper[4919]: I0310 22:11:11.637943 4919 generic.go:334] "Generic (PLEG): container finished" podID="0067a7fe-f5db-4832-a519-848ac8b771c0" containerID="99ab815c089630a419af1f8eee02aa01e609ba044e18bf1dea4c234c8e02a57b" exitCode=0 Mar 10 22:11:11 crc kubenswrapper[4919]: I0310 22:11:11.637999 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-jm7n6" event={"ID":"0067a7fe-f5db-4832-a519-848ac8b771c0","Type":"ContainerDied","Data":"99ab815c089630a419af1f8eee02aa01e609ba044e18bf1dea4c234c8e02a57b"} Mar 10 22:11:11 crc kubenswrapper[4919]: I0310 22:11:11.642435 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-sdm6h" event={"ID":"cce777ff-58ad-40ac-83cd-d8a9993d77b3","Type":"ContainerStarted","Data":"c345cae823837ade4cb5b9c4b9cf1aa1de35b95e4ddbd47237073e0843bb8d17"} Mar 10 22:11:11 crc kubenswrapper[4919]: I0310 22:11:11.660191 4919 scope.go:117] "RemoveContainer" containerID="eef2bfb96f5001b5a4f0a48dc8ac5642b4cf3ba5ea4c38597d2216a8321e012b" Mar 10 22:11:11 crc kubenswrapper[4919]: I0310 22:11:11.684873 
4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7c47bcb9f9-jmrlb"] Mar 10 22:11:11 crc kubenswrapper[4919]: I0310 22:11:11.688835 4919 scope.go:117] "RemoveContainer" containerID="f8d6edf939a388528ddce5c41e055ae0f34219e27be8043c3bf3739f6e3bf5ea" Mar 10 22:11:11 crc kubenswrapper[4919]: E0310 22:11:11.689320 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f8d6edf939a388528ddce5c41e055ae0f34219e27be8043c3bf3739f6e3bf5ea\": container with ID starting with f8d6edf939a388528ddce5c41e055ae0f34219e27be8043c3bf3739f6e3bf5ea not found: ID does not exist" containerID="f8d6edf939a388528ddce5c41e055ae0f34219e27be8043c3bf3739f6e3bf5ea" Mar 10 22:11:11 crc kubenswrapper[4919]: I0310 22:11:11.689360 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8d6edf939a388528ddce5c41e055ae0f34219e27be8043c3bf3739f6e3bf5ea"} err="failed to get container status \"f8d6edf939a388528ddce5c41e055ae0f34219e27be8043c3bf3739f6e3bf5ea\": rpc error: code = NotFound desc = could not find container \"f8d6edf939a388528ddce5c41e055ae0f34219e27be8043c3bf3739f6e3bf5ea\": container with ID starting with f8d6edf939a388528ddce5c41e055ae0f34219e27be8043c3bf3739f6e3bf5ea not found: ID does not exist" Mar 10 22:11:11 crc kubenswrapper[4919]: I0310 22:11:11.689380 4919 scope.go:117] "RemoveContainer" containerID="eef2bfb96f5001b5a4f0a48dc8ac5642b4cf3ba5ea4c38597d2216a8321e012b" Mar 10 22:11:11 crc kubenswrapper[4919]: E0310 22:11:11.690164 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eef2bfb96f5001b5a4f0a48dc8ac5642b4cf3ba5ea4c38597d2216a8321e012b\": container with ID starting with eef2bfb96f5001b5a4f0a48dc8ac5642b4cf3ba5ea4c38597d2216a8321e012b not found: ID does not exist" containerID="eef2bfb96f5001b5a4f0a48dc8ac5642b4cf3ba5ea4c38597d2216a8321e012b" Mar 10 22:11:11 
crc kubenswrapper[4919]: I0310 22:11:11.690182 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eef2bfb96f5001b5a4f0a48dc8ac5642b4cf3ba5ea4c38597d2216a8321e012b"} err="failed to get container status \"eef2bfb96f5001b5a4f0a48dc8ac5642b4cf3ba5ea4c38597d2216a8321e012b\": rpc error: code = NotFound desc = could not find container \"eef2bfb96f5001b5a4f0a48dc8ac5642b4cf3ba5ea4c38597d2216a8321e012b\": container with ID starting with eef2bfb96f5001b5a4f0a48dc8ac5642b4cf3ba5ea4c38597d2216a8321e012b not found: ID does not exist"
Mar 10 22:11:11 crc kubenswrapper[4919]: I0310 22:11:11.691187 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7c47bcb9f9-jmrlb"]
Mar 10 22:11:12 crc kubenswrapper[4919]: I0310 22:11:12.654288 4919 generic.go:334] "Generic (PLEG): container finished" podID="cce777ff-58ad-40ac-83cd-d8a9993d77b3" containerID="24ec58e2a80c23809125a023c1cce17ddcc1f52fe9b8a224d0e7dd4ad8e312a9" exitCode=0
Mar 10 22:11:12 crc kubenswrapper[4919]: I0310 22:11:12.654361 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-sdm6h" event={"ID":"cce777ff-58ad-40ac-83cd-d8a9993d77b3","Type":"ContainerDied","Data":"24ec58e2a80c23809125a023c1cce17ddcc1f52fe9b8a224d0e7dd4ad8e312a9"}
Mar 10 22:11:13 crc kubenswrapper[4919]: I0310 22:11:13.200425 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-jm7n6"
Mar 10 22:11:13 crc kubenswrapper[4919]: I0310 22:11:13.216696 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0067a7fe-f5db-4832-a519-848ac8b771c0-combined-ca-bundle\") pod \"0067a7fe-f5db-4832-a519-848ac8b771c0\" (UID: \"0067a7fe-f5db-4832-a519-848ac8b771c0\") "
Mar 10 22:11:13 crc kubenswrapper[4919]: I0310 22:11:13.216813 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0067a7fe-f5db-4832-a519-848ac8b771c0-scripts\") pod \"0067a7fe-f5db-4832-a519-848ac8b771c0\" (UID: \"0067a7fe-f5db-4832-a519-848ac8b771c0\") "
Mar 10 22:11:13 crc kubenswrapper[4919]: I0310 22:11:13.216876 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/0067a7fe-f5db-4832-a519-848ac8b771c0-swiftconf\") pod \"0067a7fe-f5db-4832-a519-848ac8b771c0\" (UID: \"0067a7fe-f5db-4832-a519-848ac8b771c0\") "
Mar 10 22:11:13 crc kubenswrapper[4919]: I0310 22:11:13.217121 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/0067a7fe-f5db-4832-a519-848ac8b771c0-dispersionconf\") pod \"0067a7fe-f5db-4832-a519-848ac8b771c0\" (UID: \"0067a7fe-f5db-4832-a519-848ac8b771c0\") "
Mar 10 22:11:13 crc kubenswrapper[4919]: I0310 22:11:13.217152 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/0067a7fe-f5db-4832-a519-848ac8b771c0-etc-swift\") pod \"0067a7fe-f5db-4832-a519-848ac8b771c0\" (UID: \"0067a7fe-f5db-4832-a519-848ac8b771c0\") "
Mar 10 22:11:13 crc kubenswrapper[4919]: I0310 22:11:13.217174 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/0067a7fe-f5db-4832-a519-848ac8b771c0-ring-data-devices\") pod \"0067a7fe-f5db-4832-a519-848ac8b771c0\" (UID: \"0067a7fe-f5db-4832-a519-848ac8b771c0\") "
Mar 10 22:11:13 crc kubenswrapper[4919]: I0310 22:11:13.217216 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vrfgj\" (UniqueName: \"kubernetes.io/projected/0067a7fe-f5db-4832-a519-848ac8b771c0-kube-api-access-vrfgj\") pod \"0067a7fe-f5db-4832-a519-848ac8b771c0\" (UID: \"0067a7fe-f5db-4832-a519-848ac8b771c0\") "
Mar 10 22:11:13 crc kubenswrapper[4919]: I0310 22:11:13.219676 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0067a7fe-f5db-4832-a519-848ac8b771c0-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "0067a7fe-f5db-4832-a519-848ac8b771c0" (UID: "0067a7fe-f5db-4832-a519-848ac8b771c0"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 22:11:13 crc kubenswrapper[4919]: I0310 22:11:13.220695 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0067a7fe-f5db-4832-a519-848ac8b771c0-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "0067a7fe-f5db-4832-a519-848ac8b771c0" (UID: "0067a7fe-f5db-4832-a519-848ac8b771c0"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 22:11:13 crc kubenswrapper[4919]: I0310 22:11:13.225203 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0067a7fe-f5db-4832-a519-848ac8b771c0-kube-api-access-vrfgj" (OuterVolumeSpecName: "kube-api-access-vrfgj") pod "0067a7fe-f5db-4832-a519-848ac8b771c0" (UID: "0067a7fe-f5db-4832-a519-848ac8b771c0"). InnerVolumeSpecName "kube-api-access-vrfgj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 22:11:13 crc kubenswrapper[4919]: I0310 22:11:13.234066 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0067a7fe-f5db-4832-a519-848ac8b771c0-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "0067a7fe-f5db-4832-a519-848ac8b771c0" (UID: "0067a7fe-f5db-4832-a519-848ac8b771c0"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 22:11:13 crc kubenswrapper[4919]: I0310 22:11:13.250805 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0067a7fe-f5db-4832-a519-848ac8b771c0-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "0067a7fe-f5db-4832-a519-848ac8b771c0" (UID: "0067a7fe-f5db-4832-a519-848ac8b771c0"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 22:11:13 crc kubenswrapper[4919]: I0310 22:11:13.264221 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0067a7fe-f5db-4832-a519-848ac8b771c0-scripts" (OuterVolumeSpecName: "scripts") pod "0067a7fe-f5db-4832-a519-848ac8b771c0" (UID: "0067a7fe-f5db-4832-a519-848ac8b771c0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 22:11:13 crc kubenswrapper[4919]: I0310 22:11:13.268233 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0067a7fe-f5db-4832-a519-848ac8b771c0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0067a7fe-f5db-4832-a519-848ac8b771c0" (UID: "0067a7fe-f5db-4832-a519-848ac8b771c0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 22:11:13 crc kubenswrapper[4919]: I0310 22:11:13.320719 4919 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0067a7fe-f5db-4832-a519-848ac8b771c0-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 10 22:11:13 crc kubenswrapper[4919]: I0310 22:11:13.320805 4919 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0067a7fe-f5db-4832-a519-848ac8b771c0-scripts\") on node \"crc\" DevicePath \"\""
Mar 10 22:11:13 crc kubenswrapper[4919]: I0310 22:11:13.320827 4919 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/0067a7fe-f5db-4832-a519-848ac8b771c0-swiftconf\") on node \"crc\" DevicePath \"\""
Mar 10 22:11:13 crc kubenswrapper[4919]: I0310 22:11:13.320845 4919 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/0067a7fe-f5db-4832-a519-848ac8b771c0-dispersionconf\") on node \"crc\" DevicePath \"\""
Mar 10 22:11:13 crc kubenswrapper[4919]: I0310 22:11:13.320862 4919 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/0067a7fe-f5db-4832-a519-848ac8b771c0-etc-swift\") on node \"crc\" DevicePath \"\""
Mar 10 22:11:13 crc kubenswrapper[4919]: I0310 22:11:13.320881 4919 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/0067a7fe-f5db-4832-a519-848ac8b771c0-ring-data-devices\") on node \"crc\" DevicePath \"\""
Mar 10 22:11:13 crc kubenswrapper[4919]: I0310 22:11:13.320973 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vrfgj\" (UniqueName: \"kubernetes.io/projected/0067a7fe-f5db-4832-a519-848ac8b771c0-kube-api-access-vrfgj\") on node \"crc\" DevicePath \"\""
Mar 10 22:11:13 crc kubenswrapper[4919]: I0310 22:11:13.523626 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a25dadc-79b5-4535-a9b2-92a9b184119c" path="/var/lib/kubelet/pods/4a25dadc-79b5-4535-a9b2-92a9b184119c/volumes"
Mar 10 22:11:13 crc kubenswrapper[4919]: I0310 22:11:13.664418 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-jm7n6"
Mar 10 22:11:13 crc kubenswrapper[4919]: I0310 22:11:13.664430 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-jm7n6" event={"ID":"0067a7fe-f5db-4832-a519-848ac8b771c0","Type":"ContainerDied","Data":"09942638125ea4ed026cc395ea628cc10577f4a19c9585458a93c22d51766da8"}
Mar 10 22:11:13 crc kubenswrapper[4919]: I0310 22:11:13.664488 4919 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="09942638125ea4ed026cc395ea628cc10577f4a19c9585458a93c22d51766da8"
Mar 10 22:11:13 crc kubenswrapper[4919]: I0310 22:11:13.938765 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-sdm6h"
Mar 10 22:11:14 crc kubenswrapper[4919]: I0310 22:11:14.037249 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jxstx\" (UniqueName: \"kubernetes.io/projected/cce777ff-58ad-40ac-83cd-d8a9993d77b3-kube-api-access-jxstx\") pod \"cce777ff-58ad-40ac-83cd-d8a9993d77b3\" (UID: \"cce777ff-58ad-40ac-83cd-d8a9993d77b3\") "
Mar 10 22:11:14 crc kubenswrapper[4919]: I0310 22:11:14.037350 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cce777ff-58ad-40ac-83cd-d8a9993d77b3-operator-scripts\") pod \"cce777ff-58ad-40ac-83cd-d8a9993d77b3\" (UID: \"cce777ff-58ad-40ac-83cd-d8a9993d77b3\") "
Mar 10 22:11:14 crc kubenswrapper[4919]: I0310 22:11:14.037799 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/91c8bbf6-8824-4e21-a491-86f2f657549a-etc-swift\") pod \"swift-storage-0\" (UID: \"91c8bbf6-8824-4e21-a491-86f2f657549a\") " pod="openstack/swift-storage-0"
Mar 10 22:11:14 crc kubenswrapper[4919]: I0310 22:11:14.037905 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cce777ff-58ad-40ac-83cd-d8a9993d77b3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "cce777ff-58ad-40ac-83cd-d8a9993d77b3" (UID: "cce777ff-58ad-40ac-83cd-d8a9993d77b3"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 22:11:14 crc kubenswrapper[4919]: I0310 22:11:14.038322 4919 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cce777ff-58ad-40ac-83cd-d8a9993d77b3-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 10 22:11:14 crc kubenswrapper[4919]: I0310 22:11:14.043230 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cce777ff-58ad-40ac-83cd-d8a9993d77b3-kube-api-access-jxstx" (OuterVolumeSpecName: "kube-api-access-jxstx") pod "cce777ff-58ad-40ac-83cd-d8a9993d77b3" (UID: "cce777ff-58ad-40ac-83cd-d8a9993d77b3"). InnerVolumeSpecName "kube-api-access-jxstx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 22:11:14 crc kubenswrapper[4919]: I0310 22:11:14.044232 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/91c8bbf6-8824-4e21-a491-86f2f657549a-etc-swift\") pod \"swift-storage-0\" (UID: \"91c8bbf6-8824-4e21-a491-86f2f657549a\") " pod="openstack/swift-storage-0"
Mar 10 22:11:14 crc kubenswrapper[4919]: I0310 22:11:14.139969 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jxstx\" (UniqueName: \"kubernetes.io/projected/cce777ff-58ad-40ac-83cd-d8a9993d77b3-kube-api-access-jxstx\") on node \"crc\" DevicePath \"\""
Mar 10 22:11:14 crc kubenswrapper[4919]: I0310 22:11:14.206787 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0"
Mar 10 22:11:15 crc kubenswrapper[4919]: I0310 22:11:14.680486 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-sdm6h" event={"ID":"cce777ff-58ad-40ac-83cd-d8a9993d77b3","Type":"ContainerDied","Data":"c345cae823837ade4cb5b9c4b9cf1aa1de35b95e4ddbd47237073e0843bb8d17"}
Mar 10 22:11:15 crc kubenswrapper[4919]: I0310 22:11:14.680858 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-sdm6h"
Mar 10 22:11:15 crc kubenswrapper[4919]: I0310 22:11:14.682138 4919 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c345cae823837ade4cb5b9c4b9cf1aa1de35b95e4ddbd47237073e0843bb8d17"
Mar 10 22:11:15 crc kubenswrapper[4919]: I0310 22:11:14.692294 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-4bjxd"]
Mar 10 22:11:15 crc kubenswrapper[4919]: E0310 22:11:14.692804 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a25dadc-79b5-4535-a9b2-92a9b184119c" containerName="dnsmasq-dns"
Mar 10 22:11:15 crc kubenswrapper[4919]: I0310 22:11:14.692821 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a25dadc-79b5-4535-a9b2-92a9b184119c" containerName="dnsmasq-dns"
Mar 10 22:11:15 crc kubenswrapper[4919]: E0310 22:11:14.692844 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cce777ff-58ad-40ac-83cd-d8a9993d77b3" containerName="mariadb-account-create-update"
Mar 10 22:11:15 crc kubenswrapper[4919]: I0310 22:11:14.692906 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="cce777ff-58ad-40ac-83cd-d8a9993d77b3" containerName="mariadb-account-create-update"
Mar 10 22:11:15 crc kubenswrapper[4919]: E0310 22:11:14.692920 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a25dadc-79b5-4535-a9b2-92a9b184119c" containerName="init"
Mar 10 22:11:15 crc kubenswrapper[4919]: I0310 22:11:14.692929 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a25dadc-79b5-4535-a9b2-92a9b184119c" containerName="init"
Mar 10 22:11:15 crc kubenswrapper[4919]: E0310 22:11:14.692945 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0067a7fe-f5db-4832-a519-848ac8b771c0" containerName="swift-ring-rebalance"
Mar 10 22:11:15 crc kubenswrapper[4919]: I0310 22:11:14.692952 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="0067a7fe-f5db-4832-a519-848ac8b771c0" containerName="swift-ring-rebalance"
Mar 10 22:11:15 crc kubenswrapper[4919]: I0310 22:11:14.693123 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="0067a7fe-f5db-4832-a519-848ac8b771c0" containerName="swift-ring-rebalance"
Mar 10 22:11:15 crc kubenswrapper[4919]: I0310 22:11:14.693142 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="cce777ff-58ad-40ac-83cd-d8a9993d77b3" containerName="mariadb-account-create-update"
Mar 10 22:11:15 crc kubenswrapper[4919]: I0310 22:11:14.693163 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a25dadc-79b5-4535-a9b2-92a9b184119c" containerName="dnsmasq-dns"
Mar 10 22:11:15 crc kubenswrapper[4919]: I0310 22:11:14.693835 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-4bjxd"
Mar 10 22:11:15 crc kubenswrapper[4919]: I0310 22:11:14.700027 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-9e8c-account-create-update-prqsg"]
Mar 10 22:11:15 crc kubenswrapper[4919]: I0310 22:11:14.701080 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-9e8c-account-create-update-prqsg"
Mar 10 22:11:15 crc kubenswrapper[4919]: I0310 22:11:14.703310 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret"
Mar 10 22:11:15 crc kubenswrapper[4919]: I0310 22:11:14.706555 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-4bjxd"]
Mar 10 22:11:15 crc kubenswrapper[4919]: I0310 22:11:14.715642 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-9e8c-account-create-update-prqsg"]
Mar 10 22:11:15 crc kubenswrapper[4919]: I0310 22:11:14.851971 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c25ebb1f-15ad-48b3-b7a4-7bdb1fd40b88-operator-scripts\") pod \"glance-db-create-4bjxd\" (UID: \"c25ebb1f-15ad-48b3-b7a4-7bdb1fd40b88\") " pod="openstack/glance-db-create-4bjxd"
Mar 10 22:11:15 crc kubenswrapper[4919]: I0310 22:11:14.852016 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gbxvf\" (UniqueName: \"kubernetes.io/projected/c25ebb1f-15ad-48b3-b7a4-7bdb1fd40b88-kube-api-access-gbxvf\") pod \"glance-db-create-4bjxd\" (UID: \"c25ebb1f-15ad-48b3-b7a4-7bdb1fd40b88\") " pod="openstack/glance-db-create-4bjxd"
Mar 10 22:11:15 crc kubenswrapper[4919]: I0310 22:11:14.852061 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8662d67d-6dbb-4156-8a34-a13e650bb745-operator-scripts\") pod \"glance-9e8c-account-create-update-prqsg\" (UID: \"8662d67d-6dbb-4156-8a34-a13e650bb745\") " pod="openstack/glance-9e8c-account-create-update-prqsg"
Mar 10 22:11:15 crc kubenswrapper[4919]: I0310 22:11:14.852090 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hkmms\" (UniqueName: \"kubernetes.io/projected/8662d67d-6dbb-4156-8a34-a13e650bb745-kube-api-access-hkmms\") pod \"glance-9e8c-account-create-update-prqsg\" (UID: \"8662d67d-6dbb-4156-8a34-a13e650bb745\") " pod="openstack/glance-9e8c-account-create-update-prqsg"
Mar 10 22:11:15 crc kubenswrapper[4919]: I0310 22:11:14.953981 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c25ebb1f-15ad-48b3-b7a4-7bdb1fd40b88-operator-scripts\") pod \"glance-db-create-4bjxd\" (UID: \"c25ebb1f-15ad-48b3-b7a4-7bdb1fd40b88\") " pod="openstack/glance-db-create-4bjxd"
Mar 10 22:11:15 crc kubenswrapper[4919]: I0310 22:11:14.954083 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gbxvf\" (UniqueName: \"kubernetes.io/projected/c25ebb1f-15ad-48b3-b7a4-7bdb1fd40b88-kube-api-access-gbxvf\") pod \"glance-db-create-4bjxd\" (UID: \"c25ebb1f-15ad-48b3-b7a4-7bdb1fd40b88\") " pod="openstack/glance-db-create-4bjxd"
Mar 10 22:11:15 crc kubenswrapper[4919]: I0310 22:11:14.954176 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8662d67d-6dbb-4156-8a34-a13e650bb745-operator-scripts\") pod \"glance-9e8c-account-create-update-prqsg\" (UID: \"8662d67d-6dbb-4156-8a34-a13e650bb745\") " pod="openstack/glance-9e8c-account-create-update-prqsg"
Mar 10 22:11:15 crc kubenswrapper[4919]: I0310 22:11:14.954251 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hkmms\" (UniqueName: \"kubernetes.io/projected/8662d67d-6dbb-4156-8a34-a13e650bb745-kube-api-access-hkmms\") pod \"glance-9e8c-account-create-update-prqsg\" (UID: \"8662d67d-6dbb-4156-8a34-a13e650bb745\") " pod="openstack/glance-9e8c-account-create-update-prqsg"
Mar 10 22:11:15 crc kubenswrapper[4919]: I0310 22:11:14.954922 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8662d67d-6dbb-4156-8a34-a13e650bb745-operator-scripts\") pod \"glance-9e8c-account-create-update-prqsg\" (UID: \"8662d67d-6dbb-4156-8a34-a13e650bb745\") " pod="openstack/glance-9e8c-account-create-update-prqsg"
Mar 10 22:11:15 crc kubenswrapper[4919]: I0310 22:11:14.955099 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c25ebb1f-15ad-48b3-b7a4-7bdb1fd40b88-operator-scripts\") pod \"glance-db-create-4bjxd\" (UID: \"c25ebb1f-15ad-48b3-b7a4-7bdb1fd40b88\") " pod="openstack/glance-db-create-4bjxd"
Mar 10 22:11:15 crc kubenswrapper[4919]: I0310 22:11:14.975067 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gbxvf\" (UniqueName: \"kubernetes.io/projected/c25ebb1f-15ad-48b3-b7a4-7bdb1fd40b88-kube-api-access-gbxvf\") pod \"glance-db-create-4bjxd\" (UID: \"c25ebb1f-15ad-48b3-b7a4-7bdb1fd40b88\") " pod="openstack/glance-db-create-4bjxd"
Mar 10 22:11:15 crc kubenswrapper[4919]: I0310 22:11:14.988520 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hkmms\" (UniqueName: \"kubernetes.io/projected/8662d67d-6dbb-4156-8a34-a13e650bb745-kube-api-access-hkmms\") pod \"glance-9e8c-account-create-update-prqsg\" (UID: \"8662d67d-6dbb-4156-8a34-a13e650bb745\") " pod="openstack/glance-9e8c-account-create-update-prqsg"
Mar 10 22:11:15 crc kubenswrapper[4919]: I0310 22:11:15.020110 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-4bjxd"
Mar 10 22:11:15 crc kubenswrapper[4919]: I0310 22:11:15.025148 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-9e8c-account-create-update-prqsg"
Mar 10 22:11:15 crc kubenswrapper[4919]: I0310 22:11:15.378056 4919 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-fbfnm" podUID="783e3f3a-7a6f-4b95-a7d2-6988c8a6149b" containerName="ovn-controller" probeResult="failure" output=<
Mar 10 22:11:15 crc kubenswrapper[4919]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status
Mar 10 22:11:15 crc kubenswrapper[4919]: >
Mar 10 22:11:15 crc kubenswrapper[4919]: I0310 22:11:15.403206 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"]
Mar 10 22:11:15 crc kubenswrapper[4919]: W0310 22:11:15.408529 4919 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod91c8bbf6_8824_4e21_a491_86f2f657549a.slice/crio-1d4948d46b248f570bd67c036de6ebdf57bd25a57b9b44ba8b1416c577c5779d WatchSource:0}: Error finding container 1d4948d46b248f570bd67c036de6ebdf57bd25a57b9b44ba8b1416c577c5779d: Status 404 returned error can't find the container with id 1d4948d46b248f570bd67c036de6ebdf57bd25a57b9b44ba8b1416c577c5779d
Mar 10 22:11:15 crc kubenswrapper[4919]: I0310 22:11:15.468459 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-9e8c-account-create-update-prqsg"]
Mar 10 22:11:15 crc kubenswrapper[4919]: W0310 22:11:15.473676 4919 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8662d67d_6dbb_4156_8a34_a13e650bb745.slice/crio-788f2cd6404100153b97574a47435e20c74195e7a314ebe1865920c07c496efa WatchSource:0}: Error finding container 788f2cd6404100153b97574a47435e20c74195e7a314ebe1865920c07c496efa: Status 404 returned error can't find the container with id 788f2cd6404100153b97574a47435e20c74195e7a314ebe1865920c07c496efa
Mar 10 22:11:15 crc kubenswrapper[4919]: I0310 22:11:15.537704 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-4bjxd"]
Mar 10 22:11:15 crc kubenswrapper[4919]: I0310 22:11:15.690755 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-9e8c-account-create-update-prqsg" event={"ID":"8662d67d-6dbb-4156-8a34-a13e650bb745","Type":"ContainerStarted","Data":"d260e7d4813d4ebda28235e9d31249a155784dc638f6003f6270f056f891383a"}
Mar 10 22:11:15 crc kubenswrapper[4919]: I0310 22:11:15.691014 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-9e8c-account-create-update-prqsg" event={"ID":"8662d67d-6dbb-4156-8a34-a13e650bb745","Type":"ContainerStarted","Data":"788f2cd6404100153b97574a47435e20c74195e7a314ebe1865920c07c496efa"}
Mar 10 22:11:15 crc kubenswrapper[4919]: I0310 22:11:15.692497 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-4bjxd" event={"ID":"c25ebb1f-15ad-48b3-b7a4-7bdb1fd40b88","Type":"ContainerStarted","Data":"f19fe15f09263bcc48907f378bd5f9fe918a057a8af224365585cbf7e60cb441"}
Mar 10 22:11:15 crc kubenswrapper[4919]: I0310 22:11:15.692541 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-4bjxd" event={"ID":"c25ebb1f-15ad-48b3-b7a4-7bdb1fd40b88","Type":"ContainerStarted","Data":"963b58b26f571757b983927ddce31a05aefb0042c5854c936e8583563d4a5cb8"}
Mar 10 22:11:15 crc kubenswrapper[4919]: I0310 22:11:15.694355 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"91c8bbf6-8824-4e21-a491-86f2f657549a","Type":"ContainerStarted","Data":"1d4948d46b248f570bd67c036de6ebdf57bd25a57b9b44ba8b1416c577c5779d"}
Mar 10 22:11:15 crc kubenswrapper[4919]: I0310 22:11:15.704804 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-9e8c-account-create-update-prqsg" podStartSLOduration=1.704783826 podStartE2EDuration="1.704783826s" podCreationTimestamp="2026-03-10 22:11:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 22:11:15.701911989 +0000 UTC m=+1262.943792617" watchObservedRunningTime="2026-03-10 22:11:15.704783826 +0000 UTC m=+1262.946664434"
Mar 10 22:11:15 crc kubenswrapper[4919]: I0310 22:11:15.720159 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-create-4bjxd" podStartSLOduration=1.720138733 podStartE2EDuration="1.720138733s" podCreationTimestamp="2026-03-10 22:11:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 22:11:15.71597691 +0000 UTC m=+1262.957857528" watchObservedRunningTime="2026-03-10 22:11:15.720138733 +0000 UTC m=+1262.962019341"
Mar 10 22:11:16 crc kubenswrapper[4919]: I0310 22:11:16.710131 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"91c8bbf6-8824-4e21-a491-86f2f657549a","Type":"ContainerStarted","Data":"2c39ba342378f6fa53b3c9079da22731dd39cfe3e751a36987c68151f2ee3a38"}
Mar 10 22:11:16 crc kubenswrapper[4919]: I0310 22:11:16.710782 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"91c8bbf6-8824-4e21-a491-86f2f657549a","Type":"ContainerStarted","Data":"d87f46352498df2659c2cb9f7933207812d2b038ce489fc1d9131bb625190926"}
Mar 10 22:11:16 crc kubenswrapper[4919]: I0310 22:11:16.712874 4919 generic.go:334] "Generic (PLEG): container finished" podID="8662d67d-6dbb-4156-8a34-a13e650bb745" containerID="d260e7d4813d4ebda28235e9d31249a155784dc638f6003f6270f056f891383a" exitCode=0
Mar 10 22:11:16 crc kubenswrapper[4919]: I0310 22:11:16.713066 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-9e8c-account-create-update-prqsg" event={"ID":"8662d67d-6dbb-4156-8a34-a13e650bb745","Type":"ContainerDied","Data":"d260e7d4813d4ebda28235e9d31249a155784dc638f6003f6270f056f891383a"}
Mar 10 22:11:16 crc kubenswrapper[4919]: I0310 22:11:16.717977 4919 generic.go:334] "Generic (PLEG): container finished" podID="c25ebb1f-15ad-48b3-b7a4-7bdb1fd40b88" containerID="f19fe15f09263bcc48907f378bd5f9fe918a057a8af224365585cbf7e60cb441" exitCode=0
Mar 10 22:11:16 crc kubenswrapper[4919]: I0310 22:11:16.718032 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-4bjxd" event={"ID":"c25ebb1f-15ad-48b3-b7a4-7bdb1fd40b88","Type":"ContainerDied","Data":"f19fe15f09263bcc48907f378bd5f9fe918a057a8af224365585cbf7e60cb441"}
Mar 10 22:11:17 crc kubenswrapper[4919]: I0310 22:11:17.208425 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-sdm6h"]
Mar 10 22:11:17 crc kubenswrapper[4919]: I0310 22:11:17.214602 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-sdm6h"]
Mar 10 22:11:17 crc kubenswrapper[4919]: I0310 22:11:17.489250 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cce777ff-58ad-40ac-83cd-d8a9993d77b3" path="/var/lib/kubelet/pods/cce777ff-58ad-40ac-83cd-d8a9993d77b3/volumes"
Mar 10 22:11:17 crc kubenswrapper[4919]: I0310 22:11:17.729555 4919 generic.go:334] "Generic (PLEG): container finished" podID="fa3e6892-7a97-4563-b339-6c3acfd36dd3" containerID="1ed5abf42f687ad1c4876f258add313618dad5f265e35efc4895ebc955fec9a3" exitCode=0
Mar 10 22:11:17 crc kubenswrapper[4919]: I0310 22:11:17.729635 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"fa3e6892-7a97-4563-b339-6c3acfd36dd3","Type":"ContainerDied","Data":"1ed5abf42f687ad1c4876f258add313618dad5f265e35efc4895ebc955fec9a3"}
Mar 10 22:11:17 crc kubenswrapper[4919]: I0310 22:11:17.733012 4919 generic.go:334] "Generic (PLEG): container finished" podID="3fe05756-9202-4514-8eea-0c786a2b6d56" containerID="751af40a46c32202c740dcad6ce5d333888d6711ba4fa0cefd841e26d404db99" exitCode=0
Mar 10 22:11:17 crc kubenswrapper[4919]: I0310 22:11:17.733069 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"3fe05756-9202-4514-8eea-0c786a2b6d56","Type":"ContainerDied","Data":"751af40a46c32202c740dcad6ce5d333888d6711ba4fa0cefd841e26d404db99"}
Mar 10 22:11:17 crc kubenswrapper[4919]: I0310 22:11:17.743123 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"91c8bbf6-8824-4e21-a491-86f2f657549a","Type":"ContainerStarted","Data":"f84605aa41e553170463de5ffc1ffb79d9063aeb307fca1c7396bcab45897e17"}
Mar 10 22:11:17 crc kubenswrapper[4919]: I0310 22:11:17.743221 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"91c8bbf6-8824-4e21-a491-86f2f657549a","Type":"ContainerStarted","Data":"8406c65e24e9f21076831c641cab8e21b69c623c1d92a90d609d4a3c30670852"}
Mar 10 22:11:18 crc kubenswrapper[4919]: I0310 22:11:18.352626 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-4bjxd"
Mar 10 22:11:18 crc kubenswrapper[4919]: I0310 22:11:18.357790 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-9e8c-account-create-update-prqsg"
Mar 10 22:11:18 crc kubenswrapper[4919]: I0310 22:11:18.539224 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c25ebb1f-15ad-48b3-b7a4-7bdb1fd40b88-operator-scripts\") pod \"c25ebb1f-15ad-48b3-b7a4-7bdb1fd40b88\" (UID: \"c25ebb1f-15ad-48b3-b7a4-7bdb1fd40b88\") "
Mar 10 22:11:18 crc kubenswrapper[4919]: I0310 22:11:18.539313 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gbxvf\" (UniqueName: \"kubernetes.io/projected/c25ebb1f-15ad-48b3-b7a4-7bdb1fd40b88-kube-api-access-gbxvf\") pod \"c25ebb1f-15ad-48b3-b7a4-7bdb1fd40b88\" (UID: \"c25ebb1f-15ad-48b3-b7a4-7bdb1fd40b88\") "
Mar 10 22:11:18 crc kubenswrapper[4919]: I0310 22:11:18.539385 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hkmms\" (UniqueName: \"kubernetes.io/projected/8662d67d-6dbb-4156-8a34-a13e650bb745-kube-api-access-hkmms\") pod \"8662d67d-6dbb-4156-8a34-a13e650bb745\" (UID: \"8662d67d-6dbb-4156-8a34-a13e650bb745\") "
Mar 10 22:11:18 crc kubenswrapper[4919]: I0310 22:11:18.539638 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8662d67d-6dbb-4156-8a34-a13e650bb745-operator-scripts\") pod \"8662d67d-6dbb-4156-8a34-a13e650bb745\" (UID: \"8662d67d-6dbb-4156-8a34-a13e650bb745\") "
Mar 10 22:11:18 crc kubenswrapper[4919]: I0310 22:11:18.540236 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c25ebb1f-15ad-48b3-b7a4-7bdb1fd40b88-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c25ebb1f-15ad-48b3-b7a4-7bdb1fd40b88" (UID: "c25ebb1f-15ad-48b3-b7a4-7bdb1fd40b88"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 22:11:18 crc kubenswrapper[4919]: I0310 22:11:18.540278 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8662d67d-6dbb-4156-8a34-a13e650bb745-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8662d67d-6dbb-4156-8a34-a13e650bb745" (UID: "8662d67d-6dbb-4156-8a34-a13e650bb745"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 22:11:18 crc kubenswrapper[4919]: I0310 22:11:18.543193 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8662d67d-6dbb-4156-8a34-a13e650bb745-kube-api-access-hkmms" (OuterVolumeSpecName: "kube-api-access-hkmms") pod "8662d67d-6dbb-4156-8a34-a13e650bb745" (UID: "8662d67d-6dbb-4156-8a34-a13e650bb745"). InnerVolumeSpecName "kube-api-access-hkmms". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 22:11:18 crc kubenswrapper[4919]: I0310 22:11:18.545317 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c25ebb1f-15ad-48b3-b7a4-7bdb1fd40b88-kube-api-access-gbxvf" (OuterVolumeSpecName: "kube-api-access-gbxvf") pod "c25ebb1f-15ad-48b3-b7a4-7bdb1fd40b88" (UID: "c25ebb1f-15ad-48b3-b7a4-7bdb1fd40b88"). InnerVolumeSpecName "kube-api-access-gbxvf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 22:11:18 crc kubenswrapper[4919]: I0310 22:11:18.641518 4919 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8662d67d-6dbb-4156-8a34-a13e650bb745-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 10 22:11:18 crc kubenswrapper[4919]: I0310 22:11:18.641558 4919 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c25ebb1f-15ad-48b3-b7a4-7bdb1fd40b88-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 10 22:11:18 crc kubenswrapper[4919]: I0310 22:11:18.641569 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gbxvf\" (UniqueName: \"kubernetes.io/projected/c25ebb1f-15ad-48b3-b7a4-7bdb1fd40b88-kube-api-access-gbxvf\") on node \"crc\" DevicePath \"\""
Mar 10 22:11:18 crc kubenswrapper[4919]: I0310 22:11:18.641583 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hkmms\" (UniqueName: \"kubernetes.io/projected/8662d67d-6dbb-4156-8a34-a13e650bb745-kube-api-access-hkmms\") on node \"crc\" DevicePath \"\""
Mar 10 22:11:18 crc kubenswrapper[4919]: I0310 22:11:18.753465 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-9e8c-account-create-update-prqsg" event={"ID":"8662d67d-6dbb-4156-8a34-a13e650bb745","Type":"ContainerDied","Data":"788f2cd6404100153b97574a47435e20c74195e7a314ebe1865920c07c496efa"}
Mar 10 22:11:18 crc kubenswrapper[4919]: I0310 22:11:18.753817 4919 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="788f2cd6404100153b97574a47435e20c74195e7a314ebe1865920c07c496efa"
Mar 10 22:11:18 crc kubenswrapper[4919]: I0310 22:11:18.753492 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-9e8c-account-create-update-prqsg"
Mar 10 22:11:18 crc kubenswrapper[4919]: I0310 22:11:18.756060 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-4bjxd" event={"ID":"c25ebb1f-15ad-48b3-b7a4-7bdb1fd40b88","Type":"ContainerDied","Data":"963b58b26f571757b983927ddce31a05aefb0042c5854c936e8583563d4a5cb8"}
Mar 10 22:11:18 crc kubenswrapper[4919]: I0310 22:11:18.756090 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-4bjxd"
Mar 10 22:11:18 crc kubenswrapper[4919]: I0310 22:11:18.756101 4919 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="963b58b26f571757b983927ddce31a05aefb0042c5854c936e8583563d4a5cb8"
Mar 10 22:11:18 crc kubenswrapper[4919]: I0310 22:11:18.759639 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"91c8bbf6-8824-4e21-a491-86f2f657549a","Type":"ContainerStarted","Data":"8e08fa3055d8aa50b60d141bc84fe0f51f25e9fc45728c0dd1491d1bc7a66860"}
Mar 10 22:11:18 crc kubenswrapper[4919]: I0310 22:11:18.759698 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"91c8bbf6-8824-4e21-a491-86f2f657549a","Type":"ContainerStarted","Data":"75261d329b8223b1135cf1458a80f97a9d45e26831f1353325eb35731b37f5d2"}
Mar 10 22:11:18 crc kubenswrapper[4919]: I0310 22:11:18.761647 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"fa3e6892-7a97-4563-b339-6c3acfd36dd3","Type":"ContainerStarted","Data":"8e9a7cee8d15c0ec29a2604cb6af26be2d7540dda5209902519b1a0222c5362d"}
Mar 10 22:11:18 crc kubenswrapper[4919]: I0310 22:11:18.761861 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0"
Mar 10 22:11:18 crc kubenswrapper[4919]: I0310 22:11:18.784707 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"3fe05756-9202-4514-8eea-0c786a2b6d56","Type":"ContainerStarted","Data":"574989ae30539b8ab9b813ac1ebeaa0a635b60aa0d2eba085c37e336b3216913"}
Mar 10 22:11:18 crc kubenswrapper[4919]: I0310 22:11:18.784961 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0"
Mar 10 22:11:18 crc kubenswrapper[4919]: I0310 22:11:18.801247 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=49.965865774 podStartE2EDuration="59.801232801s" podCreationTimestamp="2026-03-10 22:10:19 +0000 UTC" firstStartedPulling="2026-03-10 22:10:33.918995833 +0000 UTC m=+1221.160876441" lastFinishedPulling="2026-03-10 22:10:43.75436286 +0000 UTC m=+1230.996243468" observedRunningTime="2026-03-10 22:11:18.799905185 +0000 UTC m=+1266.041785793" watchObservedRunningTime="2026-03-10 22:11:18.801232801 +0000 UTC m=+1266.043113399"
Mar 10 22:11:18 crc kubenswrapper[4919]: I0310 22:11:18.831537 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=49.608993505 podStartE2EDuration="59.831517221s" podCreationTimestamp="2026-03-10 22:10:19 +0000 UTC" firstStartedPulling="2026-03-10 22:10:33.520233329 +0000 UTC m=+1220.762113937" lastFinishedPulling="2026-03-10 22:10:43.742757035 +0000 UTC m=+1230.984637653" observedRunningTime="2026-03-10 22:11:18.82003017 +0000 UTC m=+1266.061910798" watchObservedRunningTime="2026-03-10 22:11:18.831517221 +0000 UTC m=+1266.073397829"
Mar 10 22:11:19 crc kubenswrapper[4919]: I0310 22:11:19.798496 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"91c8bbf6-8824-4e21-a491-86f2f657549a","Type":"ContainerStarted","Data":"ea01dce7b06601c355d1e0c1f5bc23af1e7381afe0f09aa8a98efd1dceeab0e9"}
Mar 10 22:11:19 crc kubenswrapper[4919]: I0310 22:11:19.798800 4919 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"91c8bbf6-8824-4e21-a491-86f2f657549a","Type":"ContainerStarted","Data":"5f25a2d98dbdc44c683281bef776d43cbc4121e4fa9cb5254edd868686128030"} Mar 10 22:11:20 crc kubenswrapper[4919]: I0310 22:11:20.005209 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-rcb6h"] Mar 10 22:11:20 crc kubenswrapper[4919]: E0310 22:11:20.005629 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c25ebb1f-15ad-48b3-b7a4-7bdb1fd40b88" containerName="mariadb-database-create" Mar 10 22:11:20 crc kubenswrapper[4919]: I0310 22:11:20.005651 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="c25ebb1f-15ad-48b3-b7a4-7bdb1fd40b88" containerName="mariadb-database-create" Mar 10 22:11:20 crc kubenswrapper[4919]: E0310 22:11:20.005684 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8662d67d-6dbb-4156-8a34-a13e650bb745" containerName="mariadb-account-create-update" Mar 10 22:11:20 crc kubenswrapper[4919]: I0310 22:11:20.005694 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="8662d67d-6dbb-4156-8a34-a13e650bb745" containerName="mariadb-account-create-update" Mar 10 22:11:20 crc kubenswrapper[4919]: I0310 22:11:20.005915 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="c25ebb1f-15ad-48b3-b7a4-7bdb1fd40b88" containerName="mariadb-database-create" Mar 10 22:11:20 crc kubenswrapper[4919]: I0310 22:11:20.005938 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="8662d67d-6dbb-4156-8a34-a13e650bb745" containerName="mariadb-account-create-update" Mar 10 22:11:20 crc kubenswrapper[4919]: I0310 22:11:20.006583 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-rcb6h" Mar 10 22:11:20 crc kubenswrapper[4919]: I0310 22:11:20.009505 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-xjqfl" Mar 10 22:11:20 crc kubenswrapper[4919]: I0310 22:11:20.009574 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Mar 10 22:11:20 crc kubenswrapper[4919]: I0310 22:11:20.019282 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-rcb6h"] Mar 10 22:11:20 crc kubenswrapper[4919]: I0310 22:11:20.066214 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14a85d0b-0c54-4da3-8646-b3e0ca6fe5d5-combined-ca-bundle\") pod \"glance-db-sync-rcb6h\" (UID: \"14a85d0b-0c54-4da3-8646-b3e0ca6fe5d5\") " pod="openstack/glance-db-sync-rcb6h" Mar 10 22:11:20 crc kubenswrapper[4919]: I0310 22:11:20.066273 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5fsrk\" (UniqueName: \"kubernetes.io/projected/14a85d0b-0c54-4da3-8646-b3e0ca6fe5d5-kube-api-access-5fsrk\") pod \"glance-db-sync-rcb6h\" (UID: \"14a85d0b-0c54-4da3-8646-b3e0ca6fe5d5\") " pod="openstack/glance-db-sync-rcb6h" Mar 10 22:11:20 crc kubenswrapper[4919]: I0310 22:11:20.066420 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/14a85d0b-0c54-4da3-8646-b3e0ca6fe5d5-db-sync-config-data\") pod \"glance-db-sync-rcb6h\" (UID: \"14a85d0b-0c54-4da3-8646-b3e0ca6fe5d5\") " pod="openstack/glance-db-sync-rcb6h" Mar 10 22:11:20 crc kubenswrapper[4919]: I0310 22:11:20.066455 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/14a85d0b-0c54-4da3-8646-b3e0ca6fe5d5-config-data\") pod \"glance-db-sync-rcb6h\" (UID: \"14a85d0b-0c54-4da3-8646-b3e0ca6fe5d5\") " pod="openstack/glance-db-sync-rcb6h" Mar 10 22:11:20 crc kubenswrapper[4919]: I0310 22:11:20.168039 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14a85d0b-0c54-4da3-8646-b3e0ca6fe5d5-combined-ca-bundle\") pod \"glance-db-sync-rcb6h\" (UID: \"14a85d0b-0c54-4da3-8646-b3e0ca6fe5d5\") " pod="openstack/glance-db-sync-rcb6h" Mar 10 22:11:20 crc kubenswrapper[4919]: I0310 22:11:20.168910 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5fsrk\" (UniqueName: \"kubernetes.io/projected/14a85d0b-0c54-4da3-8646-b3e0ca6fe5d5-kube-api-access-5fsrk\") pod \"glance-db-sync-rcb6h\" (UID: \"14a85d0b-0c54-4da3-8646-b3e0ca6fe5d5\") " pod="openstack/glance-db-sync-rcb6h" Mar 10 22:11:20 crc kubenswrapper[4919]: I0310 22:11:20.168979 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/14a85d0b-0c54-4da3-8646-b3e0ca6fe5d5-db-sync-config-data\") pod \"glance-db-sync-rcb6h\" (UID: \"14a85d0b-0c54-4da3-8646-b3e0ca6fe5d5\") " pod="openstack/glance-db-sync-rcb6h" Mar 10 22:11:20 crc kubenswrapper[4919]: I0310 22:11:20.169007 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14a85d0b-0c54-4da3-8646-b3e0ca6fe5d5-config-data\") pod \"glance-db-sync-rcb6h\" (UID: \"14a85d0b-0c54-4da3-8646-b3e0ca6fe5d5\") " pod="openstack/glance-db-sync-rcb6h" Mar 10 22:11:20 crc kubenswrapper[4919]: I0310 22:11:20.173735 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14a85d0b-0c54-4da3-8646-b3e0ca6fe5d5-combined-ca-bundle\") pod \"glance-db-sync-rcb6h\" (UID: 
\"14a85d0b-0c54-4da3-8646-b3e0ca6fe5d5\") " pod="openstack/glance-db-sync-rcb6h" Mar 10 22:11:20 crc kubenswrapper[4919]: I0310 22:11:20.174649 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/14a85d0b-0c54-4da3-8646-b3e0ca6fe5d5-db-sync-config-data\") pod \"glance-db-sync-rcb6h\" (UID: \"14a85d0b-0c54-4da3-8646-b3e0ca6fe5d5\") " pod="openstack/glance-db-sync-rcb6h" Mar 10 22:11:20 crc kubenswrapper[4919]: I0310 22:11:20.175297 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14a85d0b-0c54-4da3-8646-b3e0ca6fe5d5-config-data\") pod \"glance-db-sync-rcb6h\" (UID: \"14a85d0b-0c54-4da3-8646-b3e0ca6fe5d5\") " pod="openstack/glance-db-sync-rcb6h" Mar 10 22:11:20 crc kubenswrapper[4919]: I0310 22:11:20.191081 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5fsrk\" (UniqueName: \"kubernetes.io/projected/14a85d0b-0c54-4da3-8646-b3e0ca6fe5d5-kube-api-access-5fsrk\") pod \"glance-db-sync-rcb6h\" (UID: \"14a85d0b-0c54-4da3-8646-b3e0ca6fe5d5\") " pod="openstack/glance-db-sync-rcb6h" Mar 10 22:11:20 crc kubenswrapper[4919]: I0310 22:11:20.328913 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-rcb6h" Mar 10 22:11:20 crc kubenswrapper[4919]: I0310 22:11:20.390876 4919 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-fbfnm" podUID="783e3f3a-7a6f-4b95-a7d2-6988c8a6149b" containerName="ovn-controller" probeResult="failure" output=< Mar 10 22:11:20 crc kubenswrapper[4919]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Mar 10 22:11:20 crc kubenswrapper[4919]: > Mar 10 22:11:20 crc kubenswrapper[4919]: I0310 22:11:20.441027 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-5wz82" Mar 10 22:11:20 crc kubenswrapper[4919]: I0310 22:11:20.456969 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-5wz82" Mar 10 22:11:20 crc kubenswrapper[4919]: I0310 22:11:20.719405 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-fbfnm-config-mkwp5"] Mar 10 22:11:20 crc kubenswrapper[4919]: I0310 22:11:20.720470 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-fbfnm-config-mkwp5" Mar 10 22:11:20 crc kubenswrapper[4919]: I0310 22:11:20.724653 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Mar 10 22:11:20 crc kubenswrapper[4919]: I0310 22:11:20.726195 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-fbfnm-config-mkwp5"] Mar 10 22:11:20 crc kubenswrapper[4919]: I0310 22:11:20.790342 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/1ccb77be-0e60-4e99-8810-955d5d445b48-var-log-ovn\") pod \"ovn-controller-fbfnm-config-mkwp5\" (UID: \"1ccb77be-0e60-4e99-8810-955d5d445b48\") " pod="openstack/ovn-controller-fbfnm-config-mkwp5" Mar 10 22:11:20 crc kubenswrapper[4919]: I0310 22:11:20.790806 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/1ccb77be-0e60-4e99-8810-955d5d445b48-additional-scripts\") pod \"ovn-controller-fbfnm-config-mkwp5\" (UID: \"1ccb77be-0e60-4e99-8810-955d5d445b48\") " pod="openstack/ovn-controller-fbfnm-config-mkwp5" Mar 10 22:11:20 crc kubenswrapper[4919]: I0310 22:11:20.790851 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/1ccb77be-0e60-4e99-8810-955d5d445b48-var-run-ovn\") pod \"ovn-controller-fbfnm-config-mkwp5\" (UID: \"1ccb77be-0e60-4e99-8810-955d5d445b48\") " pod="openstack/ovn-controller-fbfnm-config-mkwp5" Mar 10 22:11:20 crc kubenswrapper[4919]: I0310 22:11:20.790907 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1ccb77be-0e60-4e99-8810-955d5d445b48-scripts\") pod \"ovn-controller-fbfnm-config-mkwp5\" (UID: 
\"1ccb77be-0e60-4e99-8810-955d5d445b48\") " pod="openstack/ovn-controller-fbfnm-config-mkwp5" Mar 10 22:11:20 crc kubenswrapper[4919]: I0310 22:11:20.791102 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xt9tj\" (UniqueName: \"kubernetes.io/projected/1ccb77be-0e60-4e99-8810-955d5d445b48-kube-api-access-xt9tj\") pod \"ovn-controller-fbfnm-config-mkwp5\" (UID: \"1ccb77be-0e60-4e99-8810-955d5d445b48\") " pod="openstack/ovn-controller-fbfnm-config-mkwp5" Mar 10 22:11:20 crc kubenswrapper[4919]: I0310 22:11:20.791131 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/1ccb77be-0e60-4e99-8810-955d5d445b48-var-run\") pod \"ovn-controller-fbfnm-config-mkwp5\" (UID: \"1ccb77be-0e60-4e99-8810-955d5d445b48\") " pod="openstack/ovn-controller-fbfnm-config-mkwp5" Mar 10 22:11:20 crc kubenswrapper[4919]: I0310 22:11:20.824141 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"91c8bbf6-8824-4e21-a491-86f2f657549a","Type":"ContainerStarted","Data":"ae368876337c9d3b40fae442133c17eb2b857ea33f295ec663500c6b4ebd5fb3"} Mar 10 22:11:20 crc kubenswrapper[4919]: I0310 22:11:20.824185 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"91c8bbf6-8824-4e21-a491-86f2f657549a","Type":"ContainerStarted","Data":"48c8ed23f216106829b3f2774da116741d89515f11a103cf91248086130b0d18"} Mar 10 22:11:20 crc kubenswrapper[4919]: I0310 22:11:20.824195 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"91c8bbf6-8824-4e21-a491-86f2f657549a","Type":"ContainerStarted","Data":"fe3a820aaaecc3dbdc2c00e963ea20455282c2cd0ff782ad5b053eaa18c2a728"} Mar 10 22:11:20 crc kubenswrapper[4919]: I0310 22:11:20.824203 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"91c8bbf6-8824-4e21-a491-86f2f657549a","Type":"ContainerStarted","Data":"1ab1280d2d068201d8bde5433767f956a9a5d9d033ba36ade9a16c74b928af27"} Mar 10 22:11:20 crc kubenswrapper[4919]: I0310 22:11:20.893346 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/1ccb77be-0e60-4e99-8810-955d5d445b48-var-log-ovn\") pod \"ovn-controller-fbfnm-config-mkwp5\" (UID: \"1ccb77be-0e60-4e99-8810-955d5d445b48\") " pod="openstack/ovn-controller-fbfnm-config-mkwp5" Mar 10 22:11:20 crc kubenswrapper[4919]: I0310 22:11:20.893437 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/1ccb77be-0e60-4e99-8810-955d5d445b48-additional-scripts\") pod \"ovn-controller-fbfnm-config-mkwp5\" (UID: \"1ccb77be-0e60-4e99-8810-955d5d445b48\") " pod="openstack/ovn-controller-fbfnm-config-mkwp5" Mar 10 22:11:20 crc kubenswrapper[4919]: I0310 22:11:20.893468 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/1ccb77be-0e60-4e99-8810-955d5d445b48-var-run-ovn\") pod \"ovn-controller-fbfnm-config-mkwp5\" (UID: \"1ccb77be-0e60-4e99-8810-955d5d445b48\") " pod="openstack/ovn-controller-fbfnm-config-mkwp5" Mar 10 22:11:20 crc kubenswrapper[4919]: I0310 22:11:20.893491 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1ccb77be-0e60-4e99-8810-955d5d445b48-scripts\") pod \"ovn-controller-fbfnm-config-mkwp5\" (UID: \"1ccb77be-0e60-4e99-8810-955d5d445b48\") " pod="openstack/ovn-controller-fbfnm-config-mkwp5" Mar 10 22:11:20 crc kubenswrapper[4919]: I0310 22:11:20.893577 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xt9tj\" (UniqueName: 
\"kubernetes.io/projected/1ccb77be-0e60-4e99-8810-955d5d445b48-kube-api-access-xt9tj\") pod \"ovn-controller-fbfnm-config-mkwp5\" (UID: \"1ccb77be-0e60-4e99-8810-955d5d445b48\") " pod="openstack/ovn-controller-fbfnm-config-mkwp5" Mar 10 22:11:20 crc kubenswrapper[4919]: I0310 22:11:20.893615 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/1ccb77be-0e60-4e99-8810-955d5d445b48-var-run\") pod \"ovn-controller-fbfnm-config-mkwp5\" (UID: \"1ccb77be-0e60-4e99-8810-955d5d445b48\") " pod="openstack/ovn-controller-fbfnm-config-mkwp5" Mar 10 22:11:20 crc kubenswrapper[4919]: I0310 22:11:20.893635 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/1ccb77be-0e60-4e99-8810-955d5d445b48-var-log-ovn\") pod \"ovn-controller-fbfnm-config-mkwp5\" (UID: \"1ccb77be-0e60-4e99-8810-955d5d445b48\") " pod="openstack/ovn-controller-fbfnm-config-mkwp5" Mar 10 22:11:20 crc kubenswrapper[4919]: I0310 22:11:20.893724 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/1ccb77be-0e60-4e99-8810-955d5d445b48-var-run\") pod \"ovn-controller-fbfnm-config-mkwp5\" (UID: \"1ccb77be-0e60-4e99-8810-955d5d445b48\") " pod="openstack/ovn-controller-fbfnm-config-mkwp5" Mar 10 22:11:20 crc kubenswrapper[4919]: I0310 22:11:20.894468 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/1ccb77be-0e60-4e99-8810-955d5d445b48-var-run-ovn\") pod \"ovn-controller-fbfnm-config-mkwp5\" (UID: \"1ccb77be-0e60-4e99-8810-955d5d445b48\") " pod="openstack/ovn-controller-fbfnm-config-mkwp5" Mar 10 22:11:20 crc kubenswrapper[4919]: I0310 22:11:20.894872 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: 
\"kubernetes.io/configmap/1ccb77be-0e60-4e99-8810-955d5d445b48-additional-scripts\") pod \"ovn-controller-fbfnm-config-mkwp5\" (UID: \"1ccb77be-0e60-4e99-8810-955d5d445b48\") " pod="openstack/ovn-controller-fbfnm-config-mkwp5" Mar 10 22:11:20 crc kubenswrapper[4919]: I0310 22:11:20.895607 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1ccb77be-0e60-4e99-8810-955d5d445b48-scripts\") pod \"ovn-controller-fbfnm-config-mkwp5\" (UID: \"1ccb77be-0e60-4e99-8810-955d5d445b48\") " pod="openstack/ovn-controller-fbfnm-config-mkwp5" Mar 10 22:11:20 crc kubenswrapper[4919]: I0310 22:11:20.914585 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xt9tj\" (UniqueName: \"kubernetes.io/projected/1ccb77be-0e60-4e99-8810-955d5d445b48-kube-api-access-xt9tj\") pod \"ovn-controller-fbfnm-config-mkwp5\" (UID: \"1ccb77be-0e60-4e99-8810-955d5d445b48\") " pod="openstack/ovn-controller-fbfnm-config-mkwp5" Mar 10 22:11:20 crc kubenswrapper[4919]: I0310 22:11:20.983580 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-rcb6h"] Mar 10 22:11:20 crc kubenswrapper[4919]: W0310 22:11:20.987333 4919 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod14a85d0b_0c54_4da3_8646_b3e0ca6fe5d5.slice/crio-e983971d2011f0819a459aa42e888c5f455b8dc884665ad05e34206351b9fe20 WatchSource:0}: Error finding container e983971d2011f0819a459aa42e888c5f455b8dc884665ad05e34206351b9fe20: Status 404 returned error can't find the container with id e983971d2011f0819a459aa42e888c5f455b8dc884665ad05e34206351b9fe20 Mar 10 22:11:21 crc kubenswrapper[4919]: I0310 22:11:21.051377 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-fbfnm-config-mkwp5" Mar 10 22:11:21 crc kubenswrapper[4919]: W0310 22:11:21.490374 4919 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1ccb77be_0e60_4e99_8810_955d5d445b48.slice/crio-ae62c8cd2de578aa46172e8f7492f381552d982b6f3a7b71c4cb83a62fef5b6e WatchSource:0}: Error finding container ae62c8cd2de578aa46172e8f7492f381552d982b6f3a7b71c4cb83a62fef5b6e: Status 404 returned error can't find the container with id ae62c8cd2de578aa46172e8f7492f381552d982b6f3a7b71c4cb83a62fef5b6e Mar 10 22:11:21 crc kubenswrapper[4919]: I0310 22:11:21.494653 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-fbfnm-config-mkwp5"] Mar 10 22:11:21 crc kubenswrapper[4919]: I0310 22:11:21.830860 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-fbfnm-config-mkwp5" event={"ID":"1ccb77be-0e60-4e99-8810-955d5d445b48","Type":"ContainerStarted","Data":"ae62c8cd2de578aa46172e8f7492f381552d982b6f3a7b71c4cb83a62fef5b6e"} Mar 10 22:11:21 crc kubenswrapper[4919]: I0310 22:11:21.837092 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"91c8bbf6-8824-4e21-a491-86f2f657549a","Type":"ContainerStarted","Data":"b9b5e9d2ec6d050219cb0112cd09d62a15be687c7cb44e610e55e8bc795ce60f"} Mar 10 22:11:21 crc kubenswrapper[4919]: I0310 22:11:21.838165 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-rcb6h" event={"ID":"14a85d0b-0c54-4da3-8646-b3e0ca6fe5d5","Type":"ContainerStarted","Data":"e983971d2011f0819a459aa42e888c5f455b8dc884665ad05e34206351b9fe20"} Mar 10 22:11:22 crc kubenswrapper[4919]: I0310 22:11:22.150346 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Mar 10 22:11:22 crc kubenswrapper[4919]: I0310 22:11:22.223535 4919 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/root-account-create-update-dn8pf"] Mar 10 22:11:22 crc kubenswrapper[4919]: I0310 22:11:22.226461 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-dn8pf" Mar 10 22:11:22 crc kubenswrapper[4919]: I0310 22:11:22.232073 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Mar 10 22:11:22 crc kubenswrapper[4919]: I0310 22:11:22.241653 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-dn8pf"] Mar 10 22:11:22 crc kubenswrapper[4919]: I0310 22:11:22.314324 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zbdkz\" (UniqueName: \"kubernetes.io/projected/6399309a-f5f6-4a74-ac8c-8e806984cee9-kube-api-access-zbdkz\") pod \"root-account-create-update-dn8pf\" (UID: \"6399309a-f5f6-4a74-ac8c-8e806984cee9\") " pod="openstack/root-account-create-update-dn8pf" Mar 10 22:11:22 crc kubenswrapper[4919]: I0310 22:11:22.314382 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6399309a-f5f6-4a74-ac8c-8e806984cee9-operator-scripts\") pod \"root-account-create-update-dn8pf\" (UID: \"6399309a-f5f6-4a74-ac8c-8e806984cee9\") " pod="openstack/root-account-create-update-dn8pf" Mar 10 22:11:22 crc kubenswrapper[4919]: I0310 22:11:22.415656 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zbdkz\" (UniqueName: \"kubernetes.io/projected/6399309a-f5f6-4a74-ac8c-8e806984cee9-kube-api-access-zbdkz\") pod \"root-account-create-update-dn8pf\" (UID: \"6399309a-f5f6-4a74-ac8c-8e806984cee9\") " pod="openstack/root-account-create-update-dn8pf" Mar 10 22:11:22 crc kubenswrapper[4919]: I0310 22:11:22.415711 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6399309a-f5f6-4a74-ac8c-8e806984cee9-operator-scripts\") pod \"root-account-create-update-dn8pf\" (UID: \"6399309a-f5f6-4a74-ac8c-8e806984cee9\") " pod="openstack/root-account-create-update-dn8pf" Mar 10 22:11:22 crc kubenswrapper[4919]: I0310 22:11:22.416384 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6399309a-f5f6-4a74-ac8c-8e806984cee9-operator-scripts\") pod \"root-account-create-update-dn8pf\" (UID: \"6399309a-f5f6-4a74-ac8c-8e806984cee9\") " pod="openstack/root-account-create-update-dn8pf" Mar 10 22:11:22 crc kubenswrapper[4919]: I0310 22:11:22.433169 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zbdkz\" (UniqueName: \"kubernetes.io/projected/6399309a-f5f6-4a74-ac8c-8e806984cee9-kube-api-access-zbdkz\") pod \"root-account-create-update-dn8pf\" (UID: \"6399309a-f5f6-4a74-ac8c-8e806984cee9\") " pod="openstack/root-account-create-update-dn8pf" Mar 10 22:11:22 crc kubenswrapper[4919]: I0310 22:11:22.563061 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-dn8pf" Mar 10 22:11:22 crc kubenswrapper[4919]: I0310 22:11:22.854866 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"91c8bbf6-8824-4e21-a491-86f2f657549a","Type":"ContainerStarted","Data":"d4f089226b859cf9e472a23e4abfff98df12043dbab19a145b4c4fdfb8923fe6"} Mar 10 22:11:23 crc kubenswrapper[4919]: I0310 22:11:23.041543 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-dn8pf"] Mar 10 22:11:23 crc kubenswrapper[4919]: I0310 22:11:23.871779 4919 generic.go:334] "Generic (PLEG): container finished" podID="6399309a-f5f6-4a74-ac8c-8e806984cee9" containerID="4f99e6cf36c4ed13e551e93f2fa19e48205056c672df3ef2acaf8d7e18c63129" exitCode=0 Mar 10 22:11:23 crc kubenswrapper[4919]: I0310 22:11:23.871931 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-dn8pf" event={"ID":"6399309a-f5f6-4a74-ac8c-8e806984cee9","Type":"ContainerDied","Data":"4f99e6cf36c4ed13e551e93f2fa19e48205056c672df3ef2acaf8d7e18c63129"} Mar 10 22:11:23 crc kubenswrapper[4919]: I0310 22:11:23.872080 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-dn8pf" event={"ID":"6399309a-f5f6-4a74-ac8c-8e806984cee9","Type":"ContainerStarted","Data":"bdbbeab0c650205269153ab6b66a442b828067d60b62538b08aeeca0bbce080a"} Mar 10 22:11:23 crc kubenswrapper[4919]: I0310 22:11:23.874478 4919 generic.go:334] "Generic (PLEG): container finished" podID="1ccb77be-0e60-4e99-8810-955d5d445b48" containerID="8e1955911205622f71307953f67465667ebb2fac9ec9d00869e93c81c2720854" exitCode=0 Mar 10 22:11:23 crc kubenswrapper[4919]: I0310 22:11:23.874565 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-fbfnm-config-mkwp5" 
event={"ID":"1ccb77be-0e60-4e99-8810-955d5d445b48","Type":"ContainerDied","Data":"8e1955911205622f71307953f67465667ebb2fac9ec9d00869e93c81c2720854"} Mar 10 22:11:23 crc kubenswrapper[4919]: I0310 22:11:23.883050 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"91c8bbf6-8824-4e21-a491-86f2f657549a","Type":"ContainerStarted","Data":"7d6f077cbbd4ed720f528f14962aa759a22f7956036b9e97a87b6414a73da0ba"} Mar 10 22:11:23 crc kubenswrapper[4919]: I0310 22:11:23.944168 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=23.490713532 podStartE2EDuration="27.944144832s" podCreationTimestamp="2026-03-10 22:10:56 +0000 UTC" firstStartedPulling="2026-03-10 22:11:15.410546025 +0000 UTC m=+1262.652426633" lastFinishedPulling="2026-03-10 22:11:19.863977325 +0000 UTC m=+1267.105857933" observedRunningTime="2026-03-10 22:11:23.939910707 +0000 UTC m=+1271.181791335" watchObservedRunningTime="2026-03-10 22:11:23.944144832 +0000 UTC m=+1271.186025440" Mar 10 22:11:24 crc kubenswrapper[4919]: I0310 22:11:24.204094 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6d74f8fb89-bxf6w"] Mar 10 22:11:24 crc kubenswrapper[4919]: I0310 22:11:24.207012 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6d74f8fb89-bxf6w" Mar 10 22:11:24 crc kubenswrapper[4919]: I0310 22:11:24.209855 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Mar 10 22:11:24 crc kubenswrapper[4919]: I0310 22:11:24.222328 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d74f8fb89-bxf6w"] Mar 10 22:11:24 crc kubenswrapper[4919]: I0310 22:11:24.351730 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d0851c6f-1817-40b4-a9b1-b0d3b95e9b5b-ovsdbserver-nb\") pod \"dnsmasq-dns-6d74f8fb89-bxf6w\" (UID: \"d0851c6f-1817-40b4-a9b1-b0d3b95e9b5b\") " pod="openstack/dnsmasq-dns-6d74f8fb89-bxf6w" Mar 10 22:11:24 crc kubenswrapper[4919]: I0310 22:11:24.351782 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d0851c6f-1817-40b4-a9b1-b0d3b95e9b5b-dns-swift-storage-0\") pod \"dnsmasq-dns-6d74f8fb89-bxf6w\" (UID: \"d0851c6f-1817-40b4-a9b1-b0d3b95e9b5b\") " pod="openstack/dnsmasq-dns-6d74f8fb89-bxf6w" Mar 10 22:11:24 crc kubenswrapper[4919]: I0310 22:11:24.351802 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d0851c6f-1817-40b4-a9b1-b0d3b95e9b5b-dns-svc\") pod \"dnsmasq-dns-6d74f8fb89-bxf6w\" (UID: \"d0851c6f-1817-40b4-a9b1-b0d3b95e9b5b\") " pod="openstack/dnsmasq-dns-6d74f8fb89-bxf6w" Mar 10 22:11:24 crc kubenswrapper[4919]: I0310 22:11:24.352136 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49nvv\" (UniqueName: \"kubernetes.io/projected/d0851c6f-1817-40b4-a9b1-b0d3b95e9b5b-kube-api-access-49nvv\") pod \"dnsmasq-dns-6d74f8fb89-bxf6w\" (UID: \"d0851c6f-1817-40b4-a9b1-b0d3b95e9b5b\") " 
pod="openstack/dnsmasq-dns-6d74f8fb89-bxf6w" Mar 10 22:11:24 crc kubenswrapper[4919]: I0310 22:11:24.352303 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d0851c6f-1817-40b4-a9b1-b0d3b95e9b5b-ovsdbserver-sb\") pod \"dnsmasq-dns-6d74f8fb89-bxf6w\" (UID: \"d0851c6f-1817-40b4-a9b1-b0d3b95e9b5b\") " pod="openstack/dnsmasq-dns-6d74f8fb89-bxf6w" Mar 10 22:11:24 crc kubenswrapper[4919]: I0310 22:11:24.352383 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0851c6f-1817-40b4-a9b1-b0d3b95e9b5b-config\") pod \"dnsmasq-dns-6d74f8fb89-bxf6w\" (UID: \"d0851c6f-1817-40b4-a9b1-b0d3b95e9b5b\") " pod="openstack/dnsmasq-dns-6d74f8fb89-bxf6w" Mar 10 22:11:24 crc kubenswrapper[4919]: I0310 22:11:24.456709 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d0851c6f-1817-40b4-a9b1-b0d3b95e9b5b-ovsdbserver-sb\") pod \"dnsmasq-dns-6d74f8fb89-bxf6w\" (UID: \"d0851c6f-1817-40b4-a9b1-b0d3b95e9b5b\") " pod="openstack/dnsmasq-dns-6d74f8fb89-bxf6w" Mar 10 22:11:24 crc kubenswrapper[4919]: I0310 22:11:24.456755 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0851c6f-1817-40b4-a9b1-b0d3b95e9b5b-config\") pod \"dnsmasq-dns-6d74f8fb89-bxf6w\" (UID: \"d0851c6f-1817-40b4-a9b1-b0d3b95e9b5b\") " pod="openstack/dnsmasq-dns-6d74f8fb89-bxf6w" Mar 10 22:11:24 crc kubenswrapper[4919]: I0310 22:11:24.456805 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d0851c6f-1817-40b4-a9b1-b0d3b95e9b5b-ovsdbserver-nb\") pod \"dnsmasq-dns-6d74f8fb89-bxf6w\" (UID: \"d0851c6f-1817-40b4-a9b1-b0d3b95e9b5b\") " pod="openstack/dnsmasq-dns-6d74f8fb89-bxf6w" Mar 
10 22:11:24 crc kubenswrapper[4919]: I0310 22:11:24.456844 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d0851c6f-1817-40b4-a9b1-b0d3b95e9b5b-dns-swift-storage-0\") pod \"dnsmasq-dns-6d74f8fb89-bxf6w\" (UID: \"d0851c6f-1817-40b4-a9b1-b0d3b95e9b5b\") " pod="openstack/dnsmasq-dns-6d74f8fb89-bxf6w" Mar 10 22:11:24 crc kubenswrapper[4919]: I0310 22:11:24.456862 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d0851c6f-1817-40b4-a9b1-b0d3b95e9b5b-dns-svc\") pod \"dnsmasq-dns-6d74f8fb89-bxf6w\" (UID: \"d0851c6f-1817-40b4-a9b1-b0d3b95e9b5b\") " pod="openstack/dnsmasq-dns-6d74f8fb89-bxf6w" Mar 10 22:11:24 crc kubenswrapper[4919]: I0310 22:11:24.456933 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-49nvv\" (UniqueName: \"kubernetes.io/projected/d0851c6f-1817-40b4-a9b1-b0d3b95e9b5b-kube-api-access-49nvv\") pod \"dnsmasq-dns-6d74f8fb89-bxf6w\" (UID: \"d0851c6f-1817-40b4-a9b1-b0d3b95e9b5b\") " pod="openstack/dnsmasq-dns-6d74f8fb89-bxf6w" Mar 10 22:11:24 crc kubenswrapper[4919]: I0310 22:11:24.457916 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0851c6f-1817-40b4-a9b1-b0d3b95e9b5b-config\") pod \"dnsmasq-dns-6d74f8fb89-bxf6w\" (UID: \"d0851c6f-1817-40b4-a9b1-b0d3b95e9b5b\") " pod="openstack/dnsmasq-dns-6d74f8fb89-bxf6w" Mar 10 22:11:24 crc kubenswrapper[4919]: I0310 22:11:24.457932 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d0851c6f-1817-40b4-a9b1-b0d3b95e9b5b-ovsdbserver-nb\") pod \"dnsmasq-dns-6d74f8fb89-bxf6w\" (UID: \"d0851c6f-1817-40b4-a9b1-b0d3b95e9b5b\") " pod="openstack/dnsmasq-dns-6d74f8fb89-bxf6w" Mar 10 22:11:24 crc kubenswrapper[4919]: I0310 22:11:24.457928 4919 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d0851c6f-1817-40b4-a9b1-b0d3b95e9b5b-ovsdbserver-sb\") pod \"dnsmasq-dns-6d74f8fb89-bxf6w\" (UID: \"d0851c6f-1817-40b4-a9b1-b0d3b95e9b5b\") " pod="openstack/dnsmasq-dns-6d74f8fb89-bxf6w" Mar 10 22:11:24 crc kubenswrapper[4919]: I0310 22:11:24.458488 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d0851c6f-1817-40b4-a9b1-b0d3b95e9b5b-dns-swift-storage-0\") pod \"dnsmasq-dns-6d74f8fb89-bxf6w\" (UID: \"d0851c6f-1817-40b4-a9b1-b0d3b95e9b5b\") " pod="openstack/dnsmasq-dns-6d74f8fb89-bxf6w" Mar 10 22:11:24 crc kubenswrapper[4919]: I0310 22:11:24.458683 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d0851c6f-1817-40b4-a9b1-b0d3b95e9b5b-dns-svc\") pod \"dnsmasq-dns-6d74f8fb89-bxf6w\" (UID: \"d0851c6f-1817-40b4-a9b1-b0d3b95e9b5b\") " pod="openstack/dnsmasq-dns-6d74f8fb89-bxf6w" Mar 10 22:11:24 crc kubenswrapper[4919]: I0310 22:11:24.485156 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-49nvv\" (UniqueName: \"kubernetes.io/projected/d0851c6f-1817-40b4-a9b1-b0d3b95e9b5b-kube-api-access-49nvv\") pod \"dnsmasq-dns-6d74f8fb89-bxf6w\" (UID: \"d0851c6f-1817-40b4-a9b1-b0d3b95e9b5b\") " pod="openstack/dnsmasq-dns-6d74f8fb89-bxf6w" Mar 10 22:11:24 crc kubenswrapper[4919]: I0310 22:11:24.524813 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6d74f8fb89-bxf6w" Mar 10 22:11:25 crc kubenswrapper[4919]: I0310 22:11:25.066411 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d74f8fb89-bxf6w"] Mar 10 22:11:25 crc kubenswrapper[4919]: W0310 22:11:25.075130 4919 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd0851c6f_1817_40b4_a9b1_b0d3b95e9b5b.slice/crio-65db7f15ece34513ec2cf14efea17490829cd3b67811ba9ec02c293c7c36bc96 WatchSource:0}: Error finding container 65db7f15ece34513ec2cf14efea17490829cd3b67811ba9ec02c293c7c36bc96: Status 404 returned error can't find the container with id 65db7f15ece34513ec2cf14efea17490829cd3b67811ba9ec02c293c7c36bc96 Mar 10 22:11:25 crc kubenswrapper[4919]: I0310 22:11:25.168542 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-dn8pf" Mar 10 22:11:25 crc kubenswrapper[4919]: I0310 22:11:25.250539 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-fbfnm-config-mkwp5" Mar 10 22:11:25 crc kubenswrapper[4919]: I0310 22:11:25.280480 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6399309a-f5f6-4a74-ac8c-8e806984cee9-operator-scripts\") pod \"6399309a-f5f6-4a74-ac8c-8e806984cee9\" (UID: \"6399309a-f5f6-4a74-ac8c-8e806984cee9\") " Mar 10 22:11:25 crc kubenswrapper[4919]: I0310 22:11:25.280561 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zbdkz\" (UniqueName: \"kubernetes.io/projected/6399309a-f5f6-4a74-ac8c-8e806984cee9-kube-api-access-zbdkz\") pod \"6399309a-f5f6-4a74-ac8c-8e806984cee9\" (UID: \"6399309a-f5f6-4a74-ac8c-8e806984cee9\") " Mar 10 22:11:25 crc kubenswrapper[4919]: I0310 22:11:25.281270 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6399309a-f5f6-4a74-ac8c-8e806984cee9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6399309a-f5f6-4a74-ac8c-8e806984cee9" (UID: "6399309a-f5f6-4a74-ac8c-8e806984cee9"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 22:11:25 crc kubenswrapper[4919]: I0310 22:11:25.284548 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6399309a-f5f6-4a74-ac8c-8e806984cee9-kube-api-access-zbdkz" (OuterVolumeSpecName: "kube-api-access-zbdkz") pod "6399309a-f5f6-4a74-ac8c-8e806984cee9" (UID: "6399309a-f5f6-4a74-ac8c-8e806984cee9"). InnerVolumeSpecName "kube-api-access-zbdkz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 22:11:25 crc kubenswrapper[4919]: I0310 22:11:25.379364 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-fbfnm" Mar 10 22:11:25 crc kubenswrapper[4919]: I0310 22:11:25.381738 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xt9tj\" (UniqueName: \"kubernetes.io/projected/1ccb77be-0e60-4e99-8810-955d5d445b48-kube-api-access-xt9tj\") pod \"1ccb77be-0e60-4e99-8810-955d5d445b48\" (UID: \"1ccb77be-0e60-4e99-8810-955d5d445b48\") " Mar 10 22:11:25 crc kubenswrapper[4919]: I0310 22:11:25.381776 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/1ccb77be-0e60-4e99-8810-955d5d445b48-var-log-ovn\") pod \"1ccb77be-0e60-4e99-8810-955d5d445b48\" (UID: \"1ccb77be-0e60-4e99-8810-955d5d445b48\") " Mar 10 22:11:25 crc kubenswrapper[4919]: I0310 22:11:25.381838 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/1ccb77be-0e60-4e99-8810-955d5d445b48-additional-scripts\") pod \"1ccb77be-0e60-4e99-8810-955d5d445b48\" (UID: \"1ccb77be-0e60-4e99-8810-955d5d445b48\") " Mar 10 22:11:25 crc kubenswrapper[4919]: I0310 22:11:25.381895 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/1ccb77be-0e60-4e99-8810-955d5d445b48-var-run\") pod \"1ccb77be-0e60-4e99-8810-955d5d445b48\" (UID: \"1ccb77be-0e60-4e99-8810-955d5d445b48\") " Mar 10 22:11:25 crc kubenswrapper[4919]: I0310 22:11:25.381929 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1ccb77be-0e60-4e99-8810-955d5d445b48-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "1ccb77be-0e60-4e99-8810-955d5d445b48" (UID: "1ccb77be-0e60-4e99-8810-955d5d445b48"). 
InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 22:11:25 crc kubenswrapper[4919]: I0310 22:11:25.381941 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1ccb77be-0e60-4e99-8810-955d5d445b48-scripts\") pod \"1ccb77be-0e60-4e99-8810-955d5d445b48\" (UID: \"1ccb77be-0e60-4e99-8810-955d5d445b48\") " Mar 10 22:11:25 crc kubenswrapper[4919]: I0310 22:11:25.381973 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/1ccb77be-0e60-4e99-8810-955d5d445b48-var-run-ovn\") pod \"1ccb77be-0e60-4e99-8810-955d5d445b48\" (UID: \"1ccb77be-0e60-4e99-8810-955d5d445b48\") " Mar 10 22:11:25 crc kubenswrapper[4919]: I0310 22:11:25.382439 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zbdkz\" (UniqueName: \"kubernetes.io/projected/6399309a-f5f6-4a74-ac8c-8e806984cee9-kube-api-access-zbdkz\") on node \"crc\" DevicePath \"\"" Mar 10 22:11:25 crc kubenswrapper[4919]: I0310 22:11:25.382463 4919 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/1ccb77be-0e60-4e99-8810-955d5d445b48-var-log-ovn\") on node \"crc\" DevicePath \"\"" Mar 10 22:11:25 crc kubenswrapper[4919]: I0310 22:11:25.382476 4919 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6399309a-f5f6-4a74-ac8c-8e806984cee9-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 22:11:25 crc kubenswrapper[4919]: I0310 22:11:25.382500 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1ccb77be-0e60-4e99-8810-955d5d445b48-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "1ccb77be-0e60-4e99-8810-955d5d445b48" (UID: "1ccb77be-0e60-4e99-8810-955d5d445b48"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 22:11:25 crc kubenswrapper[4919]: I0310 22:11:25.382926 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1ccb77be-0e60-4e99-8810-955d5d445b48-var-run" (OuterVolumeSpecName: "var-run") pod "1ccb77be-0e60-4e99-8810-955d5d445b48" (UID: "1ccb77be-0e60-4e99-8810-955d5d445b48"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 22:11:25 crc kubenswrapper[4919]: I0310 22:11:25.382953 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1ccb77be-0e60-4e99-8810-955d5d445b48-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "1ccb77be-0e60-4e99-8810-955d5d445b48" (UID: "1ccb77be-0e60-4e99-8810-955d5d445b48"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 22:11:25 crc kubenswrapper[4919]: I0310 22:11:25.383696 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1ccb77be-0e60-4e99-8810-955d5d445b48-scripts" (OuterVolumeSpecName: "scripts") pod "1ccb77be-0e60-4e99-8810-955d5d445b48" (UID: "1ccb77be-0e60-4e99-8810-955d5d445b48"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 22:11:25 crc kubenswrapper[4919]: I0310 22:11:25.389791 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ccb77be-0e60-4e99-8810-955d5d445b48-kube-api-access-xt9tj" (OuterVolumeSpecName: "kube-api-access-xt9tj") pod "1ccb77be-0e60-4e99-8810-955d5d445b48" (UID: "1ccb77be-0e60-4e99-8810-955d5d445b48"). InnerVolumeSpecName "kube-api-access-xt9tj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 22:11:25 crc kubenswrapper[4919]: I0310 22:11:25.483858 4919 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1ccb77be-0e60-4e99-8810-955d5d445b48-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 22:11:25 crc kubenswrapper[4919]: I0310 22:11:25.484157 4919 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/1ccb77be-0e60-4e99-8810-955d5d445b48-var-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 10 22:11:25 crc kubenswrapper[4919]: I0310 22:11:25.484215 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xt9tj\" (UniqueName: \"kubernetes.io/projected/1ccb77be-0e60-4e99-8810-955d5d445b48-kube-api-access-xt9tj\") on node \"crc\" DevicePath \"\"" Mar 10 22:11:25 crc kubenswrapper[4919]: I0310 22:11:25.484230 4919 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/1ccb77be-0e60-4e99-8810-955d5d445b48-additional-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 22:11:25 crc kubenswrapper[4919]: I0310 22:11:25.484241 4919 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/1ccb77be-0e60-4e99-8810-955d5d445b48-var-run\") on node \"crc\" DevicePath \"\"" Mar 10 22:11:25 crc kubenswrapper[4919]: I0310 22:11:25.915620 4919 generic.go:334] "Generic (PLEG): container finished" podID="d0851c6f-1817-40b4-a9b1-b0d3b95e9b5b" containerID="9ff133bacb3b951d06d4a258f0292d4d3d314a4d3d137bbfe2ab96bb435b84c7" exitCode=0 Mar 10 22:11:25 crc kubenswrapper[4919]: I0310 22:11:25.915894 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d74f8fb89-bxf6w" event={"ID":"d0851c6f-1817-40b4-a9b1-b0d3b95e9b5b","Type":"ContainerDied","Data":"9ff133bacb3b951d06d4a258f0292d4d3d314a4d3d137bbfe2ab96bb435b84c7"} Mar 10 22:11:25 crc kubenswrapper[4919]: I0310 
22:11:25.916224 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d74f8fb89-bxf6w" event={"ID":"d0851c6f-1817-40b4-a9b1-b0d3b95e9b5b","Type":"ContainerStarted","Data":"65db7f15ece34513ec2cf14efea17490829cd3b67811ba9ec02c293c7c36bc96"} Mar 10 22:11:25 crc kubenswrapper[4919]: I0310 22:11:25.919562 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-fbfnm-config-mkwp5" Mar 10 22:11:25 crc kubenswrapper[4919]: I0310 22:11:25.919775 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-fbfnm-config-mkwp5" event={"ID":"1ccb77be-0e60-4e99-8810-955d5d445b48","Type":"ContainerDied","Data":"ae62c8cd2de578aa46172e8f7492f381552d982b6f3a7b71c4cb83a62fef5b6e"} Mar 10 22:11:25 crc kubenswrapper[4919]: I0310 22:11:25.919822 4919 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ae62c8cd2de578aa46172e8f7492f381552d982b6f3a7b71c4cb83a62fef5b6e" Mar 10 22:11:25 crc kubenswrapper[4919]: I0310 22:11:25.922660 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-dn8pf" event={"ID":"6399309a-f5f6-4a74-ac8c-8e806984cee9","Type":"ContainerDied","Data":"bdbbeab0c650205269153ab6b66a442b828067d60b62538b08aeeca0bbce080a"} Mar 10 22:11:25 crc kubenswrapper[4919]: I0310 22:11:25.922702 4919 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bdbbeab0c650205269153ab6b66a442b828067d60b62538b08aeeca0bbce080a" Mar 10 22:11:25 crc kubenswrapper[4919]: I0310 22:11:25.922742 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-dn8pf" Mar 10 22:11:26 crc kubenswrapper[4919]: I0310 22:11:26.354695 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-fbfnm-config-mkwp5"] Mar 10 22:11:26 crc kubenswrapper[4919]: I0310 22:11:26.369194 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-fbfnm-config-mkwp5"] Mar 10 22:11:26 crc kubenswrapper[4919]: I0310 22:11:26.935462 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d74f8fb89-bxf6w" event={"ID":"d0851c6f-1817-40b4-a9b1-b0d3b95e9b5b","Type":"ContainerStarted","Data":"a1c98cb48521f20303a44e0f87321846803fe8c851e0a4d4b5f139c570f1c0b6"} Mar 10 22:11:26 crc kubenswrapper[4919]: I0310 22:11:26.936882 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6d74f8fb89-bxf6w" Mar 10 22:11:26 crc kubenswrapper[4919]: I0310 22:11:26.959632 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6d74f8fb89-bxf6w" podStartSLOduration=2.959612151 podStartE2EDuration="2.959612151s" podCreationTimestamp="2026-03-10 22:11:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 22:11:26.95511265 +0000 UTC m=+1274.196993248" watchObservedRunningTime="2026-03-10 22:11:26.959612151 +0000 UTC m=+1274.201492759" Mar 10 22:11:27 crc kubenswrapper[4919]: I0310 22:11:27.490302 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ccb77be-0e60-4e99-8810-955d5d445b48" path="/var/lib/kubelet/pods/1ccb77be-0e60-4e99-8810-955d5d445b48/volumes" Mar 10 22:11:30 crc kubenswrapper[4919]: I0310 22:11:30.667562 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Mar 10 22:11:30 crc kubenswrapper[4919]: I0310 22:11:30.910958 4919 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Mar 10 22:11:32 crc kubenswrapper[4919]: I0310 22:11:32.447647 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-jdgb7"] Mar 10 22:11:32 crc kubenswrapper[4919]: E0310 22:11:32.448242 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ccb77be-0e60-4e99-8810-955d5d445b48" containerName="ovn-config" Mar 10 22:11:32 crc kubenswrapper[4919]: I0310 22:11:32.448254 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ccb77be-0e60-4e99-8810-955d5d445b48" containerName="ovn-config" Mar 10 22:11:32 crc kubenswrapper[4919]: E0310 22:11:32.448277 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6399309a-f5f6-4a74-ac8c-8e806984cee9" containerName="mariadb-account-create-update" Mar 10 22:11:32 crc kubenswrapper[4919]: I0310 22:11:32.448285 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="6399309a-f5f6-4a74-ac8c-8e806984cee9" containerName="mariadb-account-create-update" Mar 10 22:11:32 crc kubenswrapper[4919]: I0310 22:11:32.448454 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ccb77be-0e60-4e99-8810-955d5d445b48" containerName="ovn-config" Mar 10 22:11:32 crc kubenswrapper[4919]: I0310 22:11:32.448474 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="6399309a-f5f6-4a74-ac8c-8e806984cee9" containerName="mariadb-account-create-update" Mar 10 22:11:32 crc kubenswrapper[4919]: I0310 22:11:32.448956 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-jdgb7" Mar 10 22:11:32 crc kubenswrapper[4919]: I0310 22:11:32.463228 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-jdgb7"] Mar 10 22:11:32 crc kubenswrapper[4919]: I0310 22:11:32.560632 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-105f-account-create-update-g2tcv"] Mar 10 22:11:32 crc kubenswrapper[4919]: I0310 22:11:32.561698 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-105f-account-create-update-g2tcv" Mar 10 22:11:32 crc kubenswrapper[4919]: I0310 22:11:32.564330 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Mar 10 22:11:32 crc kubenswrapper[4919]: I0310 22:11:32.572407 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-105f-account-create-update-g2tcv"] Mar 10 22:11:32 crc kubenswrapper[4919]: I0310 22:11:32.613232 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/53c9d4c1-5253-49d5-8ade-272d01956b72-operator-scripts\") pod \"cinder-db-create-jdgb7\" (UID: \"53c9d4c1-5253-49d5-8ade-272d01956b72\") " pod="openstack/cinder-db-create-jdgb7" Mar 10 22:11:32 crc kubenswrapper[4919]: I0310 22:11:32.613308 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b2d7q\" (UniqueName: \"kubernetes.io/projected/53c9d4c1-5253-49d5-8ade-272d01956b72-kube-api-access-b2d7q\") pod \"cinder-db-create-jdgb7\" (UID: \"53c9d4c1-5253-49d5-8ade-272d01956b72\") " pod="openstack/cinder-db-create-jdgb7" Mar 10 22:11:32 crc kubenswrapper[4919]: I0310 22:11:32.649297 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-7whbn"] Mar 10 22:11:32 crc kubenswrapper[4919]: I0310 22:11:32.650471 4919 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/barbican-db-create-7whbn" Mar 10 22:11:32 crc kubenswrapper[4919]: I0310 22:11:32.660576 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-8e40-account-create-update-c8zqv"] Mar 10 22:11:32 crc kubenswrapper[4919]: I0310 22:11:32.661638 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-8e40-account-create-update-c8zqv" Mar 10 22:11:32 crc kubenswrapper[4919]: I0310 22:11:32.663322 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Mar 10 22:11:32 crc kubenswrapper[4919]: I0310 22:11:32.675910 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-7whbn"] Mar 10 22:11:32 crc kubenswrapper[4919]: I0310 22:11:32.683982 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-8e40-account-create-update-c8zqv"] Mar 10 22:11:32 crc kubenswrapper[4919]: I0310 22:11:32.715887 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rdcpr\" (UniqueName: \"kubernetes.io/projected/dbaea2e3-ed2a-41ca-96d1-9a837b1b2b15-kube-api-access-rdcpr\") pod \"cinder-105f-account-create-update-g2tcv\" (UID: \"dbaea2e3-ed2a-41ca-96d1-9a837b1b2b15\") " pod="openstack/cinder-105f-account-create-update-g2tcv" Mar 10 22:11:32 crc kubenswrapper[4919]: I0310 22:11:32.715941 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b2d7q\" (UniqueName: \"kubernetes.io/projected/53c9d4c1-5253-49d5-8ade-272d01956b72-kube-api-access-b2d7q\") pod \"cinder-db-create-jdgb7\" (UID: \"53c9d4c1-5253-49d5-8ade-272d01956b72\") " pod="openstack/cinder-db-create-jdgb7" Mar 10 22:11:32 crc kubenswrapper[4919]: I0310 22:11:32.716042 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/dbaea2e3-ed2a-41ca-96d1-9a837b1b2b15-operator-scripts\") pod \"cinder-105f-account-create-update-g2tcv\" (UID: \"dbaea2e3-ed2a-41ca-96d1-9a837b1b2b15\") " pod="openstack/cinder-105f-account-create-update-g2tcv" Mar 10 22:11:32 crc kubenswrapper[4919]: I0310 22:11:32.716082 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/53c9d4c1-5253-49d5-8ade-272d01956b72-operator-scripts\") pod \"cinder-db-create-jdgb7\" (UID: \"53c9d4c1-5253-49d5-8ade-272d01956b72\") " pod="openstack/cinder-db-create-jdgb7" Mar 10 22:11:32 crc kubenswrapper[4919]: I0310 22:11:32.717168 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/53c9d4c1-5253-49d5-8ade-272d01956b72-operator-scripts\") pod \"cinder-db-create-jdgb7\" (UID: \"53c9d4c1-5253-49d5-8ade-272d01956b72\") " pod="openstack/cinder-db-create-jdgb7" Mar 10 22:11:32 crc kubenswrapper[4919]: I0310 22:11:32.750599 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b2d7q\" (UniqueName: \"kubernetes.io/projected/53c9d4c1-5253-49d5-8ade-272d01956b72-kube-api-access-b2d7q\") pod \"cinder-db-create-jdgb7\" (UID: \"53c9d4c1-5253-49d5-8ade-272d01956b72\") " pod="openstack/cinder-db-create-jdgb7" Mar 10 22:11:32 crc kubenswrapper[4919]: I0310 22:11:32.770013 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-jdgb7" Mar 10 22:11:32 crc kubenswrapper[4919]: I0310 22:11:32.810024 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-ctpbj"] Mar 10 22:11:32 crc kubenswrapper[4919]: I0310 22:11:32.811039 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-ctpbj" Mar 10 22:11:32 crc kubenswrapper[4919]: I0310 22:11:32.813572 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 10 22:11:32 crc kubenswrapper[4919]: I0310 22:11:32.814651 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-nbmmg" Mar 10 22:11:32 crc kubenswrapper[4919]: I0310 22:11:32.814815 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 10 22:11:32 crc kubenswrapper[4919]: I0310 22:11:32.814973 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 10 22:11:32 crc kubenswrapper[4919]: I0310 22:11:32.818084 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0c190e44-b111-4a65-9700-d0255aa11800-operator-scripts\") pod \"barbican-db-create-7whbn\" (UID: \"0c190e44-b111-4a65-9700-d0255aa11800\") " pod="openstack/barbican-db-create-7whbn" Mar 10 22:11:32 crc kubenswrapper[4919]: I0310 22:11:32.818163 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78xz7\" (UniqueName: \"kubernetes.io/projected/0c190e44-b111-4a65-9700-d0255aa11800-kube-api-access-78xz7\") pod \"barbican-db-create-7whbn\" (UID: \"0c190e44-b111-4a65-9700-d0255aa11800\") " pod="openstack/barbican-db-create-7whbn" Mar 10 22:11:32 crc kubenswrapper[4919]: I0310 22:11:32.818181 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mw88c\" (UniqueName: \"kubernetes.io/projected/5759d1d2-d713-4e24-a2fb-c1c6804a4c39-kube-api-access-mw88c\") pod \"barbican-8e40-account-create-update-c8zqv\" (UID: \"5759d1d2-d713-4e24-a2fb-c1c6804a4c39\") " pod="openstack/barbican-8e40-account-create-update-c8zqv" Mar 10 
22:11:32 crc kubenswrapper[4919]: I0310 22:11:32.818238 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5759d1d2-d713-4e24-a2fb-c1c6804a4c39-operator-scripts\") pod \"barbican-8e40-account-create-update-c8zqv\" (UID: \"5759d1d2-d713-4e24-a2fb-c1c6804a4c39\") " pod="openstack/barbican-8e40-account-create-update-c8zqv"
Mar 10 22:11:32 crc kubenswrapper[4919]: I0310 22:11:32.818269 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dbaea2e3-ed2a-41ca-96d1-9a837b1b2b15-operator-scripts\") pod \"cinder-105f-account-create-update-g2tcv\" (UID: \"dbaea2e3-ed2a-41ca-96d1-9a837b1b2b15\") " pod="openstack/cinder-105f-account-create-update-g2tcv"
Mar 10 22:11:32 crc kubenswrapper[4919]: I0310 22:11:32.818351 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdcpr\" (UniqueName: \"kubernetes.io/projected/dbaea2e3-ed2a-41ca-96d1-9a837b1b2b15-kube-api-access-rdcpr\") pod \"cinder-105f-account-create-update-g2tcv\" (UID: \"dbaea2e3-ed2a-41ca-96d1-9a837b1b2b15\") " pod="openstack/cinder-105f-account-create-update-g2tcv"
Mar 10 22:11:32 crc kubenswrapper[4919]: I0310 22:11:32.819969 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dbaea2e3-ed2a-41ca-96d1-9a837b1b2b15-operator-scripts\") pod \"cinder-105f-account-create-update-g2tcv\" (UID: \"dbaea2e3-ed2a-41ca-96d1-9a837b1b2b15\") " pod="openstack/cinder-105f-account-create-update-g2tcv"
Mar 10 22:11:32 crc kubenswrapper[4919]: I0310 22:11:32.825317 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-ctpbj"]
Mar 10 22:11:32 crc kubenswrapper[4919]: I0310 22:11:32.861119 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdcpr\" (UniqueName: \"kubernetes.io/projected/dbaea2e3-ed2a-41ca-96d1-9a837b1b2b15-kube-api-access-rdcpr\") pod \"cinder-105f-account-create-update-g2tcv\" (UID: \"dbaea2e3-ed2a-41ca-96d1-9a837b1b2b15\") " pod="openstack/cinder-105f-account-create-update-g2tcv"
Mar 10 22:11:32 crc kubenswrapper[4919]: I0310 22:11:32.877620 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-105f-account-create-update-g2tcv"
Mar 10 22:11:32 crc kubenswrapper[4919]: I0310 22:11:32.920000 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5759d1d2-d713-4e24-a2fb-c1c6804a4c39-operator-scripts\") pod \"barbican-8e40-account-create-update-c8zqv\" (UID: \"5759d1d2-d713-4e24-a2fb-c1c6804a4c39\") " pod="openstack/barbican-8e40-account-create-update-c8zqv"
Mar 10 22:11:32 crc kubenswrapper[4919]: I0310 22:11:32.920119 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8965b29-3ef4-4db7-a67f-d905fe2e8c2c-combined-ca-bundle\") pod \"keystone-db-sync-ctpbj\" (UID: \"f8965b29-3ef4-4db7-a67f-d905fe2e8c2c\") " pod="openstack/keystone-db-sync-ctpbj"
Mar 10 22:11:32 crc kubenswrapper[4919]: I0310 22:11:32.920149 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0c190e44-b111-4a65-9700-d0255aa11800-operator-scripts\") pod \"barbican-db-create-7whbn\" (UID: \"0c190e44-b111-4a65-9700-d0255aa11800\") " pod="openstack/barbican-db-create-7whbn"
Mar 10 22:11:32 crc kubenswrapper[4919]: I0310 22:11:32.920180 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h5z2x\" (UniqueName: \"kubernetes.io/projected/f8965b29-3ef4-4db7-a67f-d905fe2e8c2c-kube-api-access-h5z2x\") pod \"keystone-db-sync-ctpbj\" (UID: \"f8965b29-3ef4-4db7-a67f-d905fe2e8c2c\") " pod="openstack/keystone-db-sync-ctpbj"
Mar 10 22:11:32 crc kubenswrapper[4919]: I0310 22:11:32.920219 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-78xz7\" (UniqueName: \"kubernetes.io/projected/0c190e44-b111-4a65-9700-d0255aa11800-kube-api-access-78xz7\") pod \"barbican-db-create-7whbn\" (UID: \"0c190e44-b111-4a65-9700-d0255aa11800\") " pod="openstack/barbican-db-create-7whbn"
Mar 10 22:11:32 crc kubenswrapper[4919]: I0310 22:11:32.920256 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mw88c\" (UniqueName: \"kubernetes.io/projected/5759d1d2-d713-4e24-a2fb-c1c6804a4c39-kube-api-access-mw88c\") pod \"barbican-8e40-account-create-update-c8zqv\" (UID: \"5759d1d2-d713-4e24-a2fb-c1c6804a4c39\") " pod="openstack/barbican-8e40-account-create-update-c8zqv"
Mar 10 22:11:32 crc kubenswrapper[4919]: I0310 22:11:32.920307 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8965b29-3ef4-4db7-a67f-d905fe2e8c2c-config-data\") pod \"keystone-db-sync-ctpbj\" (UID: \"f8965b29-3ef4-4db7-a67f-d905fe2e8c2c\") " pod="openstack/keystone-db-sync-ctpbj"
Mar 10 22:11:32 crc kubenswrapper[4919]: I0310 22:11:32.921010 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5759d1d2-d713-4e24-a2fb-c1c6804a4c39-operator-scripts\") pod \"barbican-8e40-account-create-update-c8zqv\" (UID: \"5759d1d2-d713-4e24-a2fb-c1c6804a4c39\") " pod="openstack/barbican-8e40-account-create-update-c8zqv"
Mar 10 22:11:32 crc kubenswrapper[4919]: I0310 22:11:32.921529 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0c190e44-b111-4a65-9700-d0255aa11800-operator-scripts\") pod \"barbican-db-create-7whbn\" (UID: \"0c190e44-b111-4a65-9700-d0255aa11800\") " pod="openstack/barbican-db-create-7whbn"
Mar 10 22:11:32 crc kubenswrapper[4919]: I0310 22:11:32.957787 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-zcrjg"]
Mar 10 22:11:32 crc kubenswrapper[4919]: I0310 22:11:32.958814 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-zcrjg"
Mar 10 22:11:32 crc kubenswrapper[4919]: I0310 22:11:32.965375 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-78xz7\" (UniqueName: \"kubernetes.io/projected/0c190e44-b111-4a65-9700-d0255aa11800-kube-api-access-78xz7\") pod \"barbican-db-create-7whbn\" (UID: \"0c190e44-b111-4a65-9700-d0255aa11800\") " pod="openstack/barbican-db-create-7whbn"
Mar 10 22:11:32 crc kubenswrapper[4919]: I0310 22:11:32.973085 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mw88c\" (UniqueName: \"kubernetes.io/projected/5759d1d2-d713-4e24-a2fb-c1c6804a4c39-kube-api-access-mw88c\") pod \"barbican-8e40-account-create-update-c8zqv\" (UID: \"5759d1d2-d713-4e24-a2fb-c1c6804a4c39\") " pod="openstack/barbican-8e40-account-create-update-c8zqv"
Mar 10 22:11:32 crc kubenswrapper[4919]: I0310 22:11:32.973139 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-45c6-account-create-update-cz7k8"]
Mar 10 22:11:32 crc kubenswrapper[4919]: I0310 22:11:32.974270 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-45c6-account-create-update-cz7k8"
Mar 10 22:11:32 crc kubenswrapper[4919]: I0310 22:11:32.974408 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-zcrjg"]
Mar 10 22:11:32 crc kubenswrapper[4919]: I0310 22:11:32.974859 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-7whbn"
Mar 10 22:11:32 crc kubenswrapper[4919]: I0310 22:11:32.977836 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret"
Mar 10 22:11:32 crc kubenswrapper[4919]: I0310 22:11:32.982152 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-45c6-account-create-update-cz7k8"]
Mar 10 22:11:32 crc kubenswrapper[4919]: I0310 22:11:32.988812 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-8e40-account-create-update-c8zqv"
Mar 10 22:11:33 crc kubenswrapper[4919]: I0310 22:11:33.025133 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8965b29-3ef4-4db7-a67f-d905fe2e8c2c-config-data\") pod \"keystone-db-sync-ctpbj\" (UID: \"f8965b29-3ef4-4db7-a67f-d905fe2e8c2c\") " pod="openstack/keystone-db-sync-ctpbj"
Mar 10 22:11:33 crc kubenswrapper[4919]: I0310 22:11:33.026492 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8965b29-3ef4-4db7-a67f-d905fe2e8c2c-combined-ca-bundle\") pod \"keystone-db-sync-ctpbj\" (UID: \"f8965b29-3ef4-4db7-a67f-d905fe2e8c2c\") " pod="openstack/keystone-db-sync-ctpbj"
Mar 10 22:11:33 crc kubenswrapper[4919]: I0310 22:11:33.026658 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h5z2x\" (UniqueName: \"kubernetes.io/projected/f8965b29-3ef4-4db7-a67f-d905fe2e8c2c-kube-api-access-h5z2x\") pod \"keystone-db-sync-ctpbj\" (UID: \"f8965b29-3ef4-4db7-a67f-d905fe2e8c2c\") " pod="openstack/keystone-db-sync-ctpbj"
Mar 10 22:11:33 crc kubenswrapper[4919]: I0310 22:11:33.030817 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8965b29-3ef4-4db7-a67f-d905fe2e8c2c-combined-ca-bundle\") pod \"keystone-db-sync-ctpbj\" (UID: \"f8965b29-3ef4-4db7-a67f-d905fe2e8c2c\") " pod="openstack/keystone-db-sync-ctpbj"
Mar 10 22:11:33 crc kubenswrapper[4919]: I0310 22:11:33.041688 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8965b29-3ef4-4db7-a67f-d905fe2e8c2c-config-data\") pod \"keystone-db-sync-ctpbj\" (UID: \"f8965b29-3ef4-4db7-a67f-d905fe2e8c2c\") " pod="openstack/keystone-db-sync-ctpbj"
Mar 10 22:11:33 crc kubenswrapper[4919]: I0310 22:11:33.047139 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h5z2x\" (UniqueName: \"kubernetes.io/projected/f8965b29-3ef4-4db7-a67f-d905fe2e8c2c-kube-api-access-h5z2x\") pod \"keystone-db-sync-ctpbj\" (UID: \"f8965b29-3ef4-4db7-a67f-d905fe2e8c2c\") " pod="openstack/keystone-db-sync-ctpbj"
Mar 10 22:11:33 crc kubenswrapper[4919]: I0310 22:11:33.128041 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1bad7d2b-c98f-49e7-86a8-3467f75830f2-operator-scripts\") pod \"neutron-db-create-zcrjg\" (UID: \"1bad7d2b-c98f-49e7-86a8-3467f75830f2\") " pod="openstack/neutron-db-create-zcrjg"
Mar 10 22:11:33 crc kubenswrapper[4919]: I0310 22:11:33.128102 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/31e6114e-103b-4653-b4f0-ba3c216e3437-operator-scripts\") pod \"neutron-45c6-account-create-update-cz7k8\" (UID: \"31e6114e-103b-4653-b4f0-ba3c216e3437\") " pod="openstack/neutron-45c6-account-create-update-cz7k8"
Mar 10 22:11:33 crc kubenswrapper[4919]: I0310 22:11:33.128129 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4ht4h\" (UniqueName: \"kubernetes.io/projected/31e6114e-103b-4653-b4f0-ba3c216e3437-kube-api-access-4ht4h\") pod \"neutron-45c6-account-create-update-cz7k8\" (UID: \"31e6114e-103b-4653-b4f0-ba3c216e3437\") " pod="openstack/neutron-45c6-account-create-update-cz7k8"
Mar 10 22:11:33 crc kubenswrapper[4919]: I0310 22:11:33.128447 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxxhz\" (UniqueName: \"kubernetes.io/projected/1bad7d2b-c98f-49e7-86a8-3467f75830f2-kube-api-access-xxxhz\") pod \"neutron-db-create-zcrjg\" (UID: \"1bad7d2b-c98f-49e7-86a8-3467f75830f2\") " pod="openstack/neutron-db-create-zcrjg"
Mar 10 22:11:33 crc kubenswrapper[4919]: I0310 22:11:33.135532 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-ctpbj"
Mar 10 22:11:33 crc kubenswrapper[4919]: I0310 22:11:33.229867 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xxxhz\" (UniqueName: \"kubernetes.io/projected/1bad7d2b-c98f-49e7-86a8-3467f75830f2-kube-api-access-xxxhz\") pod \"neutron-db-create-zcrjg\" (UID: \"1bad7d2b-c98f-49e7-86a8-3467f75830f2\") " pod="openstack/neutron-db-create-zcrjg"
Mar 10 22:11:33 crc kubenswrapper[4919]: I0310 22:11:33.229966 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1bad7d2b-c98f-49e7-86a8-3467f75830f2-operator-scripts\") pod \"neutron-db-create-zcrjg\" (UID: \"1bad7d2b-c98f-49e7-86a8-3467f75830f2\") " pod="openstack/neutron-db-create-zcrjg"
Mar 10 22:11:33 crc kubenswrapper[4919]: I0310 22:11:33.229996 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/31e6114e-103b-4653-b4f0-ba3c216e3437-operator-scripts\") pod \"neutron-45c6-account-create-update-cz7k8\" (UID: \"31e6114e-103b-4653-b4f0-ba3c216e3437\") " pod="openstack/neutron-45c6-account-create-update-cz7k8"
Mar 10 22:11:33 crc kubenswrapper[4919]: I0310 22:11:33.230016 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4ht4h\" (UniqueName: \"kubernetes.io/projected/31e6114e-103b-4653-b4f0-ba3c216e3437-kube-api-access-4ht4h\") pod \"neutron-45c6-account-create-update-cz7k8\" (UID: \"31e6114e-103b-4653-b4f0-ba3c216e3437\") " pod="openstack/neutron-45c6-account-create-update-cz7k8"
Mar 10 22:11:33 crc kubenswrapper[4919]: I0310 22:11:33.231532 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1bad7d2b-c98f-49e7-86a8-3467f75830f2-operator-scripts\") pod \"neutron-db-create-zcrjg\" (UID: \"1bad7d2b-c98f-49e7-86a8-3467f75830f2\") " pod="openstack/neutron-db-create-zcrjg"
Mar 10 22:11:33 crc kubenswrapper[4919]: I0310 22:11:33.231732 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/31e6114e-103b-4653-b4f0-ba3c216e3437-operator-scripts\") pod \"neutron-45c6-account-create-update-cz7k8\" (UID: \"31e6114e-103b-4653-b4f0-ba3c216e3437\") " pod="openstack/neutron-45c6-account-create-update-cz7k8"
Mar 10 22:11:33 crc kubenswrapper[4919]: I0310 22:11:33.253250 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxxhz\" (UniqueName: \"kubernetes.io/projected/1bad7d2b-c98f-49e7-86a8-3467f75830f2-kube-api-access-xxxhz\") pod \"neutron-db-create-zcrjg\" (UID: \"1bad7d2b-c98f-49e7-86a8-3467f75830f2\") " pod="openstack/neutron-db-create-zcrjg"
Mar 10 22:11:33 crc kubenswrapper[4919]: I0310 22:11:33.255001 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4ht4h\" (UniqueName: \"kubernetes.io/projected/31e6114e-103b-4653-b4f0-ba3c216e3437-kube-api-access-4ht4h\") pod \"neutron-45c6-account-create-update-cz7k8\" (UID: \"31e6114e-103b-4653-b4f0-ba3c216e3437\") " pod="openstack/neutron-45c6-account-create-update-cz7k8"
Mar 10 22:11:33 crc kubenswrapper[4919]: I0310 22:11:33.402463 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-zcrjg"
Mar 10 22:11:33 crc kubenswrapper[4919]: I0310 22:11:33.412606 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-45c6-account-create-update-cz7k8"
Mar 10 22:11:34 crc kubenswrapper[4919]: I0310 22:11:34.528194 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6d74f8fb89-bxf6w"
Mar 10 22:11:34 crc kubenswrapper[4919]: I0310 22:11:34.572783 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f7dd995-tz7jv"]
Mar 10 22:11:34 crc kubenswrapper[4919]: I0310 22:11:34.574368 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-675f7dd995-tz7jv" podUID="0fdc6e48-f4e3-4c70-8cab-78ad58edd483" containerName="dnsmasq-dns" containerID="cri-o://cb2bc5565b179fcfd14f52b85ea1c6e89021235076d927650d687bdcde2adaa2" gracePeriod=10
Mar 10 22:11:35 crc kubenswrapper[4919]: I0310 22:11:35.024465 4919 generic.go:334] "Generic (PLEG): container finished" podID="0fdc6e48-f4e3-4c70-8cab-78ad58edd483" containerID="cb2bc5565b179fcfd14f52b85ea1c6e89021235076d927650d687bdcde2adaa2" exitCode=0
Mar 10 22:11:35 crc kubenswrapper[4919]: I0310 22:11:35.024510 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f7dd995-tz7jv" event={"ID":"0fdc6e48-f4e3-4c70-8cab-78ad58edd483","Type":"ContainerDied","Data":"cb2bc5565b179fcfd14f52b85ea1c6e89021235076d927650d687bdcde2adaa2"}
Mar 10 22:11:35 crc kubenswrapper[4919]: I0310 22:11:35.453986 4919 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-675f7dd995-tz7jv" podUID="0fdc6e48-f4e3-4c70-8cab-78ad58edd483" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.119:5353: connect: connection refused"
Mar 10 22:11:36 crc kubenswrapper[4919]: I0310 22:11:36.418146 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f7dd995-tz7jv"
Mar 10 22:11:36 crc kubenswrapper[4919]: I0310 22:11:36.494668 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n8frt\" (UniqueName: \"kubernetes.io/projected/0fdc6e48-f4e3-4c70-8cab-78ad58edd483-kube-api-access-n8frt\") pod \"0fdc6e48-f4e3-4c70-8cab-78ad58edd483\" (UID: \"0fdc6e48-f4e3-4c70-8cab-78ad58edd483\") "
Mar 10 22:11:36 crc kubenswrapper[4919]: I0310 22:11:36.494761 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0fdc6e48-f4e3-4c70-8cab-78ad58edd483-config\") pod \"0fdc6e48-f4e3-4c70-8cab-78ad58edd483\" (UID: \"0fdc6e48-f4e3-4c70-8cab-78ad58edd483\") "
Mar 10 22:11:36 crc kubenswrapper[4919]: I0310 22:11:36.494790 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0fdc6e48-f4e3-4c70-8cab-78ad58edd483-ovsdbserver-sb\") pod \"0fdc6e48-f4e3-4c70-8cab-78ad58edd483\" (UID: \"0fdc6e48-f4e3-4c70-8cab-78ad58edd483\") "
Mar 10 22:11:36 crc kubenswrapper[4919]: I0310 22:11:36.494895 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0fdc6e48-f4e3-4c70-8cab-78ad58edd483-dns-svc\") pod \"0fdc6e48-f4e3-4c70-8cab-78ad58edd483\" (UID: \"0fdc6e48-f4e3-4c70-8cab-78ad58edd483\") "
Mar 10 22:11:36 crc kubenswrapper[4919]: I0310 22:11:36.494947 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0fdc6e48-f4e3-4c70-8cab-78ad58edd483-ovsdbserver-nb\") pod \"0fdc6e48-f4e3-4c70-8cab-78ad58edd483\" (UID: \"0fdc6e48-f4e3-4c70-8cab-78ad58edd483\") "
Mar 10 22:11:36 crc kubenswrapper[4919]: I0310 22:11:36.510721 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0fdc6e48-f4e3-4c70-8cab-78ad58edd483-kube-api-access-n8frt" (OuterVolumeSpecName: "kube-api-access-n8frt") pod "0fdc6e48-f4e3-4c70-8cab-78ad58edd483" (UID: "0fdc6e48-f4e3-4c70-8cab-78ad58edd483"). InnerVolumeSpecName "kube-api-access-n8frt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 22:11:36 crc kubenswrapper[4919]: I0310 22:11:36.556151 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0fdc6e48-f4e3-4c70-8cab-78ad58edd483-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0fdc6e48-f4e3-4c70-8cab-78ad58edd483" (UID: "0fdc6e48-f4e3-4c70-8cab-78ad58edd483"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 22:11:36 crc kubenswrapper[4919]: I0310 22:11:36.556164 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0fdc6e48-f4e3-4c70-8cab-78ad58edd483-config" (OuterVolumeSpecName: "config") pod "0fdc6e48-f4e3-4c70-8cab-78ad58edd483" (UID: "0fdc6e48-f4e3-4c70-8cab-78ad58edd483"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 22:11:36 crc kubenswrapper[4919]: I0310 22:11:36.558202 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0fdc6e48-f4e3-4c70-8cab-78ad58edd483-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "0fdc6e48-f4e3-4c70-8cab-78ad58edd483" (UID: "0fdc6e48-f4e3-4c70-8cab-78ad58edd483"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 22:11:36 crc kubenswrapper[4919]: I0310 22:11:36.559020 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0fdc6e48-f4e3-4c70-8cab-78ad58edd483-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "0fdc6e48-f4e3-4c70-8cab-78ad58edd483" (UID: "0fdc6e48-f4e3-4c70-8cab-78ad58edd483"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 22:11:36 crc kubenswrapper[4919]: I0310 22:11:36.597163 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n8frt\" (UniqueName: \"kubernetes.io/projected/0fdc6e48-f4e3-4c70-8cab-78ad58edd483-kube-api-access-n8frt\") on node \"crc\" DevicePath \"\""
Mar 10 22:11:36 crc kubenswrapper[4919]: I0310 22:11:36.597197 4919 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0fdc6e48-f4e3-4c70-8cab-78ad58edd483-config\") on node \"crc\" DevicePath \"\""
Mar 10 22:11:36 crc kubenswrapper[4919]: I0310 22:11:36.597208 4919 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0fdc6e48-f4e3-4c70-8cab-78ad58edd483-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Mar 10 22:11:36 crc kubenswrapper[4919]: I0310 22:11:36.597219 4919 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0fdc6e48-f4e3-4c70-8cab-78ad58edd483-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 10 22:11:36 crc kubenswrapper[4919]: I0310 22:11:36.597228 4919 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0fdc6e48-f4e3-4c70-8cab-78ad58edd483-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Mar 10 22:11:36 crc kubenswrapper[4919]: W0310 22:11:36.755412 4919 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1bad7d2b_c98f_49e7_86a8_3467f75830f2.slice/crio-7ddc2509da92c7ba946824f607e34a5a1184f16921bdd0185f775d7912b2a013 WatchSource:0}: Error finding container 7ddc2509da92c7ba946824f607e34a5a1184f16921bdd0185f775d7912b2a013: Status 404 returned error can't find the container with id 7ddc2509da92c7ba946824f607e34a5a1184f16921bdd0185f775d7912b2a013
Mar 10 22:11:36 crc kubenswrapper[4919]: I0310 22:11:36.755855 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-zcrjg"]
Mar 10 22:11:36 crc kubenswrapper[4919]: I0310 22:11:36.948555 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-jdgb7"]
Mar 10 22:11:36 crc kubenswrapper[4919]: W0310 22:11:36.953662 4919 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod53c9d4c1_5253_49d5_8ade_272d01956b72.slice/crio-12fe1dcc737ec06977a8a15200cdd61c9dd0ffe1e07591140fad4f06efaf40ec WatchSource:0}: Error finding container 12fe1dcc737ec06977a8a15200cdd61c9dd0ffe1e07591140fad4f06efaf40ec: Status 404 returned error can't find the container with id 12fe1dcc737ec06977a8a15200cdd61c9dd0ffe1e07591140fad4f06efaf40ec
Mar 10 22:11:36 crc kubenswrapper[4919]: I0310 22:11:36.959869 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-8e40-account-create-update-c8zqv"]
Mar 10 22:11:36 crc kubenswrapper[4919]: I0310 22:11:36.983521 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-105f-account-create-update-g2tcv"]
Mar 10 22:11:37 crc kubenswrapper[4919]: I0310 22:11:37.047165 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-rcb6h" event={"ID":"14a85d0b-0c54-4da3-8646-b3e0ca6fe5d5","Type":"ContainerStarted","Data":"df3255a2d0759e4981bfd7e03b09952206e74469af65e2de0fe714948a7f652b"}
Mar 10 22:11:37 crc kubenswrapper[4919]: I0310 22:11:37.051113 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-105f-account-create-update-g2tcv" event={"ID":"dbaea2e3-ed2a-41ca-96d1-9a837b1b2b15","Type":"ContainerStarted","Data":"b2d9c4b3f013d4658b906a60873db0224adb1772515ba4abfc3a8a7b6f63c888"}
Mar 10 22:11:37 crc kubenswrapper[4919]: I0310 22:11:37.054329 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-zcrjg" event={"ID":"1bad7d2b-c98f-49e7-86a8-3467f75830f2","Type":"ContainerStarted","Data":"f1cfb380cb13d63aaa674f6ff137ef0d4367d24b15d6edf7643da148391493ad"}
Mar 10 22:11:37 crc kubenswrapper[4919]: I0310 22:11:37.054432 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-zcrjg" event={"ID":"1bad7d2b-c98f-49e7-86a8-3467f75830f2","Type":"ContainerStarted","Data":"7ddc2509da92c7ba946824f607e34a5a1184f16921bdd0185f775d7912b2a013"}
Mar 10 22:11:37 crc kubenswrapper[4919]: I0310 22:11:37.057594 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f7dd995-tz7jv" event={"ID":"0fdc6e48-f4e3-4c70-8cab-78ad58edd483","Type":"ContainerDied","Data":"822a032f8bc949b5c80d4aa4dc1301f9ff5b6767194a5fcb56a7dcf39d4d4c21"}
Mar 10 22:11:37 crc kubenswrapper[4919]: I0310 22:11:37.057773 4919 scope.go:117] "RemoveContainer" containerID="cb2bc5565b179fcfd14f52b85ea1c6e89021235076d927650d687bdcde2adaa2"
Mar 10 22:11:37 crc kubenswrapper[4919]: I0310 22:11:37.058067 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f7dd995-tz7jv"
Mar 10 22:11:37 crc kubenswrapper[4919]: I0310 22:11:37.064236 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-jdgb7" event={"ID":"53c9d4c1-5253-49d5-8ade-272d01956b72","Type":"ContainerStarted","Data":"12fe1dcc737ec06977a8a15200cdd61c9dd0ffe1e07591140fad4f06efaf40ec"}
Mar 10 22:11:37 crc kubenswrapper[4919]: I0310 22:11:37.065620 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-rcb6h" podStartSLOduration=2.747504215 podStartE2EDuration="18.06560771s" podCreationTimestamp="2026-03-10 22:11:19 +0000 UTC" firstStartedPulling="2026-03-10 22:11:20.989414796 +0000 UTC m=+1268.231295414" lastFinishedPulling="2026-03-10 22:11:36.307518301 +0000 UTC m=+1283.549398909" observedRunningTime="2026-03-10 22:11:37.063797841 +0000 UTC m=+1284.305678449" watchObservedRunningTime="2026-03-10 22:11:37.06560771 +0000 UTC m=+1284.307488318"
Mar 10 22:11:37 crc kubenswrapper[4919]: I0310 22:11:37.071811 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-8e40-account-create-update-c8zqv" event={"ID":"5759d1d2-d713-4e24-a2fb-c1c6804a4c39","Type":"ContainerStarted","Data":"7824b329785f2f383c95161a6e5a5441162ca8f13e4af13a30f3de135a379142"}
Mar 10 22:11:37 crc kubenswrapper[4919]: I0310 22:11:37.086655 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-7whbn"]
Mar 10 22:11:37 crc kubenswrapper[4919]: I0310 22:11:37.118602 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-ctpbj"]
Mar 10 22:11:37 crc kubenswrapper[4919]: I0310 22:11:37.128742 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-create-zcrjg" podStartSLOduration=5.12872149 podStartE2EDuration="5.12872149s" podCreationTimestamp="2026-03-10 22:11:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 22:11:37.085764327 +0000 UTC m=+1284.327644935" watchObservedRunningTime="2026-03-10 22:11:37.12872149 +0000 UTC m=+1284.370602098"
Mar 10 22:11:37 crc kubenswrapper[4919]: I0310 22:11:37.134304 4919 scope.go:117] "RemoveContainer" containerID="6f95126f36aeff559b6794298af240695704556b3b7aeebc326d0171c3e41fa0"
Mar 10 22:11:37 crc kubenswrapper[4919]: I0310 22:11:37.157501 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-45c6-account-create-update-cz7k8"]
Mar 10 22:11:37 crc kubenswrapper[4919]: W0310 22:11:37.167649 4919 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod31e6114e_103b_4653_b4f0_ba3c216e3437.slice/crio-6d8c836f039aa4e043bc5a7235765b1f07dcf9d616866b7884e04f81d18b6d90 WatchSource:0}: Error finding container 6d8c836f039aa4e043bc5a7235765b1f07dcf9d616866b7884e04f81d18b6d90: Status 404 returned error can't find the container with id 6d8c836f039aa4e043bc5a7235765b1f07dcf9d616866b7884e04f81d18b6d90
Mar 10 22:11:37 crc kubenswrapper[4919]: I0310 22:11:37.170610 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f7dd995-tz7jv"]
Mar 10 22:11:37 crc kubenswrapper[4919]: I0310 22:11:37.178801 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f7dd995-tz7jv"]
Mar 10 22:11:37 crc kubenswrapper[4919]: I0310 22:11:37.495141 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0fdc6e48-f4e3-4c70-8cab-78ad58edd483" path="/var/lib/kubelet/pods/0fdc6e48-f4e3-4c70-8cab-78ad58edd483/volumes"
Mar 10 22:11:38 crc kubenswrapper[4919]: I0310 22:11:38.082409 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-ctpbj" event={"ID":"f8965b29-3ef4-4db7-a67f-d905fe2e8c2c","Type":"ContainerStarted","Data":"5c69c9373fd806d24fc0ce727b9c0866247f8de31bc961caf61513638ae51d31"}
Mar 10 22:11:38 crc kubenswrapper[4919]: I0310 22:11:38.084431 4919 generic.go:334] "Generic (PLEG): container finished" podID="31e6114e-103b-4653-b4f0-ba3c216e3437" containerID="17c37ab7d9a4d15efad770799d8f1b0cb204d67a5cf81f20458d249be5df213d" exitCode=0
Mar 10 22:11:38 crc kubenswrapper[4919]: I0310 22:11:38.084472 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-45c6-account-create-update-cz7k8" event={"ID":"31e6114e-103b-4653-b4f0-ba3c216e3437","Type":"ContainerDied","Data":"17c37ab7d9a4d15efad770799d8f1b0cb204d67a5cf81f20458d249be5df213d"}
Mar 10 22:11:38 crc kubenswrapper[4919]: I0310 22:11:38.084488 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-45c6-account-create-update-cz7k8" event={"ID":"31e6114e-103b-4653-b4f0-ba3c216e3437","Type":"ContainerStarted","Data":"6d8c836f039aa4e043bc5a7235765b1f07dcf9d616866b7884e04f81d18b6d90"}
Mar 10 22:11:38 crc kubenswrapper[4919]: I0310 22:11:38.086535 4919 generic.go:334] "Generic (PLEG): container finished" podID="0c190e44-b111-4a65-9700-d0255aa11800" containerID="677037ec6fc3ac6df63ece5407c9f84f545fc7d8e698c763107303f155ca69f6" exitCode=0
Mar 10 22:11:38 crc kubenswrapper[4919]: I0310 22:11:38.086632 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-7whbn" event={"ID":"0c190e44-b111-4a65-9700-d0255aa11800","Type":"ContainerDied","Data":"677037ec6fc3ac6df63ece5407c9f84f545fc7d8e698c763107303f155ca69f6"}
Mar 10 22:11:38 crc kubenswrapper[4919]: I0310 22:11:38.086650 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-7whbn" event={"ID":"0c190e44-b111-4a65-9700-d0255aa11800","Type":"ContainerStarted","Data":"f1dd8035173539e4ab9327d3e4e791f42f17052c744126cac9d87ef908b6612e"}
Mar 10 22:11:38 crc kubenswrapper[4919]: I0310 22:11:38.094080 4919 generic.go:334] "Generic (PLEG): container finished" podID="dbaea2e3-ed2a-41ca-96d1-9a837b1b2b15" containerID="4acacbb159ddc3d2dcc05f7a2181c4a0198ea30fd738a3acdcd1ed55e18abf6b" exitCode=0
Mar 10 22:11:38 crc kubenswrapper[4919]: I0310 22:11:38.094956 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-105f-account-create-update-g2tcv" event={"ID":"dbaea2e3-ed2a-41ca-96d1-9a837b1b2b15","Type":"ContainerDied","Data":"4acacbb159ddc3d2dcc05f7a2181c4a0198ea30fd738a3acdcd1ed55e18abf6b"}
Mar 10 22:11:38 crc kubenswrapper[4919]: I0310 22:11:38.101592 4919 generic.go:334] "Generic (PLEG): container finished" podID="53c9d4c1-5253-49d5-8ade-272d01956b72" containerID="f8a9ebb35a9684e930bb7e44a0621ffb15e0b669cb105b45857980e222bb8a9c" exitCode=0
Mar 10 22:11:38 crc kubenswrapper[4919]: I0310 22:11:38.101699 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-jdgb7" event={"ID":"53c9d4c1-5253-49d5-8ade-272d01956b72","Type":"ContainerDied","Data":"f8a9ebb35a9684e930bb7e44a0621ffb15e0b669cb105b45857980e222bb8a9c"}
Mar 10 22:11:38 crc kubenswrapper[4919]: I0310 22:11:38.103689 4919 generic.go:334] "Generic (PLEG): container finished" podID="1bad7d2b-c98f-49e7-86a8-3467f75830f2" containerID="f1cfb380cb13d63aaa674f6ff137ef0d4367d24b15d6edf7643da148391493ad" exitCode=0
Mar 10 22:11:38 crc kubenswrapper[4919]: I0310 22:11:38.103774 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-zcrjg" event={"ID":"1bad7d2b-c98f-49e7-86a8-3467f75830f2","Type":"ContainerDied","Data":"f1cfb380cb13d63aaa674f6ff137ef0d4367d24b15d6edf7643da148391493ad"}
Mar 10 22:11:38 crc kubenswrapper[4919]: I0310 22:11:38.105606 4919 generic.go:334] "Generic (PLEG): container finished" podID="5759d1d2-d713-4e24-a2fb-c1c6804a4c39" containerID="55306eee2a2abd644ed6e52709eaa16a01bae893d58ffb9f0076970f427c4738" exitCode=0
Mar 10 22:11:38 crc kubenswrapper[4919]: I0310 22:11:38.106358 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-8e40-account-create-update-c8zqv" event={"ID":"5759d1d2-d713-4e24-a2fb-c1c6804a4c39","Type":"ContainerDied","Data":"55306eee2a2abd644ed6e52709eaa16a01bae893d58ffb9f0076970f427c4738"}
Mar 10 22:11:41 crc kubenswrapper[4919]: I0310 22:11:41.139521 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-zcrjg" event={"ID":"1bad7d2b-c98f-49e7-86a8-3467f75830f2","Type":"ContainerDied","Data":"7ddc2509da92c7ba946824f607e34a5a1184f16921bdd0185f775d7912b2a013"}
Mar 10 22:11:41 crc kubenswrapper[4919]: I0310 22:11:41.139997 4919 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7ddc2509da92c7ba946824f607e34a5a1184f16921bdd0185f775d7912b2a013"
Mar 10 22:11:41 crc kubenswrapper[4919]: I0310 22:11:41.341187 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-zcrjg"
Mar 10 22:11:41 crc kubenswrapper[4919]: I0310 22:11:41.354711 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-7whbn"
Mar 10 22:11:41 crc kubenswrapper[4919]: I0310 22:11:41.382579 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-105f-account-create-update-g2tcv"
Mar 10 22:11:41 crc kubenswrapper[4919]: I0310 22:11:41.393756 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-jdgb7"
Mar 10 22:11:41 crc kubenswrapper[4919]: I0310 22:11:41.402604 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-45c6-account-create-update-cz7k8"
Mar 10 22:11:41 crc kubenswrapper[4919]: I0310 22:11:41.414744 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-8e40-account-create-update-c8zqv"
Mar 10 22:11:41 crc kubenswrapper[4919]: I0310 22:11:41.481211 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b2d7q\" (UniqueName: \"kubernetes.io/projected/53c9d4c1-5253-49d5-8ade-272d01956b72-kube-api-access-b2d7q\") pod \"53c9d4c1-5253-49d5-8ade-272d01956b72\" (UID: \"53c9d4c1-5253-49d5-8ade-272d01956b72\") "
Mar 10 22:11:41 crc kubenswrapper[4919]: I0310 22:11:41.481337 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5759d1d2-d713-4e24-a2fb-c1c6804a4c39-operator-scripts\") pod \"5759d1d2-d713-4e24-a2fb-c1c6804a4c39\" (UID: \"5759d1d2-d713-4e24-a2fb-c1c6804a4c39\") "
Mar 10 22:11:41 crc kubenswrapper[4919]: I0310 22:11:41.481411 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4ht4h\" (UniqueName: \"kubernetes.io/projected/31e6114e-103b-4653-b4f0-ba3c216e3437-kube-api-access-4ht4h\") pod \"31e6114e-103b-4653-b4f0-ba3c216e3437\" (UID: \"31e6114e-103b-4653-b4f0-ba3c216e3437\") "
Mar 10 22:11:41 crc kubenswrapper[4919]: I0310 22:11:41.481464 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mw88c\" (UniqueName: \"kubernetes.io/projected/5759d1d2-d713-4e24-a2fb-c1c6804a4c39-kube-api-access-mw88c\") pod \"5759d1d2-d713-4e24-a2fb-c1c6804a4c39\" (UID: \"5759d1d2-d713-4e24-a2fb-c1c6804a4c39\") "
Mar 10 22:11:41 crc kubenswrapper[4919]: I0310 22:11:41.481489 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dbaea2e3-ed2a-41ca-96d1-9a837b1b2b15-operator-scripts\") pod \"dbaea2e3-ed2a-41ca-96d1-9a837b1b2b15\" (UID: \"dbaea2e3-ed2a-41ca-96d1-9a837b1b2b15\") "
Mar 10 22:11:41 crc kubenswrapper[4919]: I0310 22:11:41.481523 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xxxhz\" (UniqueName: \"kubernetes.io/projected/1bad7d2b-c98f-49e7-86a8-3467f75830f2-kube-api-access-xxxhz\") pod \"1bad7d2b-c98f-49e7-86a8-3467f75830f2\" (UID: \"1bad7d2b-c98f-49e7-86a8-3467f75830f2\") "
Mar 10 22:11:41 crc kubenswrapper[4919]: I0310 22:11:41.481560 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1bad7d2b-c98f-49e7-86a8-3467f75830f2-operator-scripts\") pod \"1bad7d2b-c98f-49e7-86a8-3467f75830f2\" (UID: \"1bad7d2b-c98f-49e7-86a8-3467f75830f2\") "
Mar 10 22:11:41 crc kubenswrapper[4919]: I0310 22:11:41.481606 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0c190e44-b111-4a65-9700-d0255aa11800-operator-scripts\") pod \"0c190e44-b111-4a65-9700-d0255aa11800\" (UID: \"0c190e44-b111-4a65-9700-d0255aa11800\") "
Mar 10 22:11:41 crc kubenswrapper[4919]: I0310 22:11:41.481650 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/31e6114e-103b-4653-b4f0-ba3c216e3437-operator-scripts\") pod \"31e6114e-103b-4653-b4f0-ba3c216e3437\" (UID: \"31e6114e-103b-4653-b4f0-ba3c216e3437\") "
Mar 10 22:11:41 crc kubenswrapper[4919]: I0310 22:11:41.481690 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rdcpr\" (UniqueName: \"kubernetes.io/projected/dbaea2e3-ed2a-41ca-96d1-9a837b1b2b15-kube-api-access-rdcpr\") pod \"dbaea2e3-ed2a-41ca-96d1-9a837b1b2b15\" (UID: \"dbaea2e3-ed2a-41ca-96d1-9a837b1b2b15\") "
Mar 10 22:11:41 crc kubenswrapper[4919]: I0310 22:11:41.481772 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/53c9d4c1-5253-49d5-8ade-272d01956b72-operator-scripts\") pod 
\"53c9d4c1-5253-49d5-8ade-272d01956b72\" (UID: \"53c9d4c1-5253-49d5-8ade-272d01956b72\") " Mar 10 22:11:41 crc kubenswrapper[4919]: I0310 22:11:41.481804 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-78xz7\" (UniqueName: \"kubernetes.io/projected/0c190e44-b111-4a65-9700-d0255aa11800-kube-api-access-78xz7\") pod \"0c190e44-b111-4a65-9700-d0255aa11800\" (UID: \"0c190e44-b111-4a65-9700-d0255aa11800\") " Mar 10 22:11:41 crc kubenswrapper[4919]: I0310 22:11:41.482481 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dbaea2e3-ed2a-41ca-96d1-9a837b1b2b15-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "dbaea2e3-ed2a-41ca-96d1-9a837b1b2b15" (UID: "dbaea2e3-ed2a-41ca-96d1-9a837b1b2b15"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 22:11:41 crc kubenswrapper[4919]: I0310 22:11:41.482589 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53c9d4c1-5253-49d5-8ade-272d01956b72-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "53c9d4c1-5253-49d5-8ade-272d01956b72" (UID: "53c9d4c1-5253-49d5-8ade-272d01956b72"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 22:11:41 crc kubenswrapper[4919]: I0310 22:11:41.482697 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0c190e44-b111-4a65-9700-d0255aa11800-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0c190e44-b111-4a65-9700-d0255aa11800" (UID: "0c190e44-b111-4a65-9700-d0255aa11800"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 22:11:41 crc kubenswrapper[4919]: I0310 22:11:41.483239 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bad7d2b-c98f-49e7-86a8-3467f75830f2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1bad7d2b-c98f-49e7-86a8-3467f75830f2" (UID: "1bad7d2b-c98f-49e7-86a8-3467f75830f2"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 22:11:41 crc kubenswrapper[4919]: I0310 22:11:41.483728 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5759d1d2-d713-4e24-a2fb-c1c6804a4c39-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5759d1d2-d713-4e24-a2fb-c1c6804a4c39" (UID: "5759d1d2-d713-4e24-a2fb-c1c6804a4c39"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 22:11:41 crc kubenswrapper[4919]: I0310 22:11:41.485211 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31e6114e-103b-4653-b4f0-ba3c216e3437-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "31e6114e-103b-4653-b4f0-ba3c216e3437" (UID: "31e6114e-103b-4653-b4f0-ba3c216e3437"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 22:11:41 crc kubenswrapper[4919]: I0310 22:11:41.486669 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dbaea2e3-ed2a-41ca-96d1-9a837b1b2b15-kube-api-access-rdcpr" (OuterVolumeSpecName: "kube-api-access-rdcpr") pod "dbaea2e3-ed2a-41ca-96d1-9a837b1b2b15" (UID: "dbaea2e3-ed2a-41ca-96d1-9a837b1b2b15"). InnerVolumeSpecName "kube-api-access-rdcpr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 22:11:41 crc kubenswrapper[4919]: I0310 22:11:41.486722 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bad7d2b-c98f-49e7-86a8-3467f75830f2-kube-api-access-xxxhz" (OuterVolumeSpecName: "kube-api-access-xxxhz") pod "1bad7d2b-c98f-49e7-86a8-3467f75830f2" (UID: "1bad7d2b-c98f-49e7-86a8-3467f75830f2"). InnerVolumeSpecName "kube-api-access-xxxhz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 22:11:41 crc kubenswrapper[4919]: I0310 22:11:41.487228 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c190e44-b111-4a65-9700-d0255aa11800-kube-api-access-78xz7" (OuterVolumeSpecName: "kube-api-access-78xz7") pod "0c190e44-b111-4a65-9700-d0255aa11800" (UID: "0c190e44-b111-4a65-9700-d0255aa11800"). InnerVolumeSpecName "kube-api-access-78xz7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 22:11:41 crc kubenswrapper[4919]: I0310 22:11:41.487798 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5759d1d2-d713-4e24-a2fb-c1c6804a4c39-kube-api-access-mw88c" (OuterVolumeSpecName: "kube-api-access-mw88c") pod "5759d1d2-d713-4e24-a2fb-c1c6804a4c39" (UID: "5759d1d2-d713-4e24-a2fb-c1c6804a4c39"). InnerVolumeSpecName "kube-api-access-mw88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 22:11:41 crc kubenswrapper[4919]: I0310 22:11:41.488362 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31e6114e-103b-4653-b4f0-ba3c216e3437-kube-api-access-4ht4h" (OuterVolumeSpecName: "kube-api-access-4ht4h") pod "31e6114e-103b-4653-b4f0-ba3c216e3437" (UID: "31e6114e-103b-4653-b4f0-ba3c216e3437"). InnerVolumeSpecName "kube-api-access-4ht4h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 22:11:41 crc kubenswrapper[4919]: I0310 22:11:41.488844 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53c9d4c1-5253-49d5-8ade-272d01956b72-kube-api-access-b2d7q" (OuterVolumeSpecName: "kube-api-access-b2d7q") pod "53c9d4c1-5253-49d5-8ade-272d01956b72" (UID: "53c9d4c1-5253-49d5-8ade-272d01956b72"). InnerVolumeSpecName "kube-api-access-b2d7q". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 22:11:41 crc kubenswrapper[4919]: I0310 22:11:41.584222 4919 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/31e6114e-103b-4653-b4f0-ba3c216e3437-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 22:11:41 crc kubenswrapper[4919]: I0310 22:11:41.584263 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rdcpr\" (UniqueName: \"kubernetes.io/projected/dbaea2e3-ed2a-41ca-96d1-9a837b1b2b15-kube-api-access-rdcpr\") on node \"crc\" DevicePath \"\"" Mar 10 22:11:41 crc kubenswrapper[4919]: I0310 22:11:41.584279 4919 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/53c9d4c1-5253-49d5-8ade-272d01956b72-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 22:11:41 crc kubenswrapper[4919]: I0310 22:11:41.584310 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-78xz7\" (UniqueName: \"kubernetes.io/projected/0c190e44-b111-4a65-9700-d0255aa11800-kube-api-access-78xz7\") on node \"crc\" DevicePath \"\"" Mar 10 22:11:41 crc kubenswrapper[4919]: I0310 22:11:41.584325 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b2d7q\" (UniqueName: \"kubernetes.io/projected/53c9d4c1-5253-49d5-8ade-272d01956b72-kube-api-access-b2d7q\") on node \"crc\" DevicePath \"\"" Mar 10 22:11:41 crc kubenswrapper[4919]: I0310 22:11:41.584336 4919 
reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5759d1d2-d713-4e24-a2fb-c1c6804a4c39-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 22:11:41 crc kubenswrapper[4919]: I0310 22:11:41.584347 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4ht4h\" (UniqueName: \"kubernetes.io/projected/31e6114e-103b-4653-b4f0-ba3c216e3437-kube-api-access-4ht4h\") on node \"crc\" DevicePath \"\"" Mar 10 22:11:41 crc kubenswrapper[4919]: I0310 22:11:41.584358 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mw88c\" (UniqueName: \"kubernetes.io/projected/5759d1d2-d713-4e24-a2fb-c1c6804a4c39-kube-api-access-mw88c\") on node \"crc\" DevicePath \"\"" Mar 10 22:11:41 crc kubenswrapper[4919]: I0310 22:11:41.584368 4919 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dbaea2e3-ed2a-41ca-96d1-9a837b1b2b15-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 22:11:41 crc kubenswrapper[4919]: I0310 22:11:41.584378 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xxxhz\" (UniqueName: \"kubernetes.io/projected/1bad7d2b-c98f-49e7-86a8-3467f75830f2-kube-api-access-xxxhz\") on node \"crc\" DevicePath \"\"" Mar 10 22:11:41 crc kubenswrapper[4919]: I0310 22:11:41.584412 4919 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1bad7d2b-c98f-49e7-86a8-3467f75830f2-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 22:11:41 crc kubenswrapper[4919]: I0310 22:11:41.584431 4919 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0c190e44-b111-4a65-9700-d0255aa11800-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 22:11:42 crc kubenswrapper[4919]: I0310 22:11:42.149758 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/cinder-db-create-jdgb7" event={"ID":"53c9d4c1-5253-49d5-8ade-272d01956b72","Type":"ContainerDied","Data":"12fe1dcc737ec06977a8a15200cdd61c9dd0ffe1e07591140fad4f06efaf40ec"} Mar 10 22:11:42 crc kubenswrapper[4919]: I0310 22:11:42.149793 4919 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="12fe1dcc737ec06977a8a15200cdd61c9dd0ffe1e07591140fad4f06efaf40ec" Mar 10 22:11:42 crc kubenswrapper[4919]: I0310 22:11:42.149817 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-jdgb7" Mar 10 22:11:42 crc kubenswrapper[4919]: I0310 22:11:42.152469 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-8e40-account-create-update-c8zqv" event={"ID":"5759d1d2-d713-4e24-a2fb-c1c6804a4c39","Type":"ContainerDied","Data":"7824b329785f2f383c95161a6e5a5441162ca8f13e4af13a30f3de135a379142"} Mar 10 22:11:42 crc kubenswrapper[4919]: I0310 22:11:42.152497 4919 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7824b329785f2f383c95161a6e5a5441162ca8f13e4af13a30f3de135a379142" Mar 10 22:11:42 crc kubenswrapper[4919]: I0310 22:11:42.152498 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-8e40-account-create-update-c8zqv" Mar 10 22:11:42 crc kubenswrapper[4919]: I0310 22:11:42.154260 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-ctpbj" event={"ID":"f8965b29-3ef4-4db7-a67f-d905fe2e8c2c","Type":"ContainerStarted","Data":"38f8de90e88a2382619b837ef63b573edfd9808dacdc190631ae0849effe576d"} Mar 10 22:11:42 crc kubenswrapper[4919]: I0310 22:11:42.157203 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-45c6-account-create-update-cz7k8" Mar 10 22:11:42 crc kubenswrapper[4919]: I0310 22:11:42.157209 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-45c6-account-create-update-cz7k8" event={"ID":"31e6114e-103b-4653-b4f0-ba3c216e3437","Type":"ContainerDied","Data":"6d8c836f039aa4e043bc5a7235765b1f07dcf9d616866b7884e04f81d18b6d90"} Mar 10 22:11:42 crc kubenswrapper[4919]: I0310 22:11:42.157359 4919 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6d8c836f039aa4e043bc5a7235765b1f07dcf9d616866b7884e04f81d18b6d90" Mar 10 22:11:42 crc kubenswrapper[4919]: I0310 22:11:42.160287 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-7whbn" event={"ID":"0c190e44-b111-4a65-9700-d0255aa11800","Type":"ContainerDied","Data":"f1dd8035173539e4ab9327d3e4e791f42f17052c744126cac9d87ef908b6612e"} Mar 10 22:11:42 crc kubenswrapper[4919]: I0310 22:11:42.160339 4919 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f1dd8035173539e4ab9327d3e4e791f42f17052c744126cac9d87ef908b6612e" Mar 10 22:11:42 crc kubenswrapper[4919]: I0310 22:11:42.160467 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-7whbn" Mar 10 22:11:42 crc kubenswrapper[4919]: I0310 22:11:42.163343 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-105f-account-create-update-g2tcv" Mar 10 22:11:42 crc kubenswrapper[4919]: I0310 22:11:42.163369 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-105f-account-create-update-g2tcv" event={"ID":"dbaea2e3-ed2a-41ca-96d1-9a837b1b2b15","Type":"ContainerDied","Data":"b2d9c4b3f013d4658b906a60873db0224adb1772515ba4abfc3a8a7b6f63c888"} Mar 10 22:11:42 crc kubenswrapper[4919]: I0310 22:11:42.163417 4919 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b2d9c4b3f013d4658b906a60873db0224adb1772515ba4abfc3a8a7b6f63c888" Mar 10 22:11:42 crc kubenswrapper[4919]: I0310 22:11:42.163344 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-zcrjg" Mar 10 22:11:42 crc kubenswrapper[4919]: I0310 22:11:42.176936 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-ctpbj" podStartSLOduration=6.122098124 podStartE2EDuration="10.176909845s" podCreationTimestamp="2026-03-10 22:11:32 +0000 UTC" firstStartedPulling="2026-03-10 22:11:37.152811343 +0000 UTC m=+1284.394691951" lastFinishedPulling="2026-03-10 22:11:41.207623064 +0000 UTC m=+1288.449503672" observedRunningTime="2026-03-10 22:11:42.168649271 +0000 UTC m=+1289.410529879" watchObservedRunningTime="2026-03-10 22:11:42.176909845 +0000 UTC m=+1289.418790473" Mar 10 22:11:45 crc kubenswrapper[4919]: I0310 22:11:45.202027 4919 generic.go:334] "Generic (PLEG): container finished" podID="f8965b29-3ef4-4db7-a67f-d905fe2e8c2c" containerID="38f8de90e88a2382619b837ef63b573edfd9808dacdc190631ae0849effe576d" exitCode=0 Mar 10 22:11:45 crc kubenswrapper[4919]: I0310 22:11:45.202110 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-ctpbj" event={"ID":"f8965b29-3ef4-4db7-a67f-d905fe2e8c2c","Type":"ContainerDied","Data":"38f8de90e88a2382619b837ef63b573edfd9808dacdc190631ae0849effe576d"} Mar 10 
22:11:46 crc kubenswrapper[4919]: I0310 22:11:46.523100 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-ctpbj" Mar 10 22:11:46 crc kubenswrapper[4919]: I0310 22:11:46.668746 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h5z2x\" (UniqueName: \"kubernetes.io/projected/f8965b29-3ef4-4db7-a67f-d905fe2e8c2c-kube-api-access-h5z2x\") pod \"f8965b29-3ef4-4db7-a67f-d905fe2e8c2c\" (UID: \"f8965b29-3ef4-4db7-a67f-d905fe2e8c2c\") " Mar 10 22:11:46 crc kubenswrapper[4919]: I0310 22:11:46.668853 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8965b29-3ef4-4db7-a67f-d905fe2e8c2c-config-data\") pod \"f8965b29-3ef4-4db7-a67f-d905fe2e8c2c\" (UID: \"f8965b29-3ef4-4db7-a67f-d905fe2e8c2c\") " Mar 10 22:11:46 crc kubenswrapper[4919]: I0310 22:11:46.668872 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8965b29-3ef4-4db7-a67f-d905fe2e8c2c-combined-ca-bundle\") pod \"f8965b29-3ef4-4db7-a67f-d905fe2e8c2c\" (UID: \"f8965b29-3ef4-4db7-a67f-d905fe2e8c2c\") " Mar 10 22:11:46 crc kubenswrapper[4919]: I0310 22:11:46.675314 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8965b29-3ef4-4db7-a67f-d905fe2e8c2c-kube-api-access-h5z2x" (OuterVolumeSpecName: "kube-api-access-h5z2x") pod "f8965b29-3ef4-4db7-a67f-d905fe2e8c2c" (UID: "f8965b29-3ef4-4db7-a67f-d905fe2e8c2c"). InnerVolumeSpecName "kube-api-access-h5z2x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 22:11:46 crc kubenswrapper[4919]: I0310 22:11:46.704066 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8965b29-3ef4-4db7-a67f-d905fe2e8c2c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f8965b29-3ef4-4db7-a67f-d905fe2e8c2c" (UID: "f8965b29-3ef4-4db7-a67f-d905fe2e8c2c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:11:46 crc kubenswrapper[4919]: I0310 22:11:46.711718 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8965b29-3ef4-4db7-a67f-d905fe2e8c2c-config-data" (OuterVolumeSpecName: "config-data") pod "f8965b29-3ef4-4db7-a67f-d905fe2e8c2c" (UID: "f8965b29-3ef4-4db7-a67f-d905fe2e8c2c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:11:46 crc kubenswrapper[4919]: I0310 22:11:46.771031 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h5z2x\" (UniqueName: \"kubernetes.io/projected/f8965b29-3ef4-4db7-a67f-d905fe2e8c2c-kube-api-access-h5z2x\") on node \"crc\" DevicePath \"\"" Mar 10 22:11:46 crc kubenswrapper[4919]: I0310 22:11:46.771263 4919 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8965b29-3ef4-4db7-a67f-d905fe2e8c2c-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 22:11:46 crc kubenswrapper[4919]: I0310 22:11:46.771342 4919 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8965b29-3ef4-4db7-a67f-d905fe2e8c2c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 22:11:47 crc kubenswrapper[4919]: I0310 22:11:47.224501 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-ctpbj" 
event={"ID":"f8965b29-3ef4-4db7-a67f-d905fe2e8c2c","Type":"ContainerDied","Data":"5c69c9373fd806d24fc0ce727b9c0866247f8de31bc961caf61513638ae51d31"} Mar 10 22:11:47 crc kubenswrapper[4919]: I0310 22:11:47.224548 4919 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5c69c9373fd806d24fc0ce727b9c0866247f8de31bc961caf61513638ae51d31" Mar 10 22:11:47 crc kubenswrapper[4919]: I0310 22:11:47.224552 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-ctpbj" Mar 10 22:11:47 crc kubenswrapper[4919]: I0310 22:11:47.510959 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-m7z2j"] Mar 10 22:11:47 crc kubenswrapper[4919]: E0310 22:11:47.511370 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1bad7d2b-c98f-49e7-86a8-3467f75830f2" containerName="mariadb-database-create" Mar 10 22:11:47 crc kubenswrapper[4919]: I0310 22:11:47.511385 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="1bad7d2b-c98f-49e7-86a8-3467f75830f2" containerName="mariadb-database-create" Mar 10 22:11:47 crc kubenswrapper[4919]: E0310 22:11:47.511430 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0fdc6e48-f4e3-4c70-8cab-78ad58edd483" containerName="init" Mar 10 22:11:47 crc kubenswrapper[4919]: I0310 22:11:47.511439 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fdc6e48-f4e3-4c70-8cab-78ad58edd483" containerName="init" Mar 10 22:11:47 crc kubenswrapper[4919]: E0310 22:11:47.511465 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0fdc6e48-f4e3-4c70-8cab-78ad58edd483" containerName="dnsmasq-dns" Mar 10 22:11:47 crc kubenswrapper[4919]: I0310 22:11:47.511476 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fdc6e48-f4e3-4c70-8cab-78ad58edd483" containerName="dnsmasq-dns" Mar 10 22:11:47 crc kubenswrapper[4919]: E0310 22:11:47.511492 4919 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="f8965b29-3ef4-4db7-a67f-d905fe2e8c2c" containerName="keystone-db-sync" Mar 10 22:11:47 crc kubenswrapper[4919]: I0310 22:11:47.511503 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8965b29-3ef4-4db7-a67f-d905fe2e8c2c" containerName="keystone-db-sync" Mar 10 22:11:47 crc kubenswrapper[4919]: E0310 22:11:47.511513 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5759d1d2-d713-4e24-a2fb-c1c6804a4c39" containerName="mariadb-account-create-update" Mar 10 22:11:47 crc kubenswrapper[4919]: I0310 22:11:47.511522 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="5759d1d2-d713-4e24-a2fb-c1c6804a4c39" containerName="mariadb-account-create-update" Mar 10 22:11:47 crc kubenswrapper[4919]: E0310 22:11:47.511537 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c190e44-b111-4a65-9700-d0255aa11800" containerName="mariadb-database-create" Mar 10 22:11:47 crc kubenswrapper[4919]: I0310 22:11:47.511545 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c190e44-b111-4a65-9700-d0255aa11800" containerName="mariadb-database-create" Mar 10 22:11:47 crc kubenswrapper[4919]: E0310 22:11:47.511561 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbaea2e3-ed2a-41ca-96d1-9a837b1b2b15" containerName="mariadb-account-create-update" Mar 10 22:11:47 crc kubenswrapper[4919]: I0310 22:11:47.511571 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbaea2e3-ed2a-41ca-96d1-9a837b1b2b15" containerName="mariadb-account-create-update" Mar 10 22:11:47 crc kubenswrapper[4919]: E0310 22:11:47.511584 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31e6114e-103b-4653-b4f0-ba3c216e3437" containerName="mariadb-account-create-update" Mar 10 22:11:47 crc kubenswrapper[4919]: I0310 22:11:47.511592 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="31e6114e-103b-4653-b4f0-ba3c216e3437" containerName="mariadb-account-create-update" Mar 10 22:11:47 crc 
kubenswrapper[4919]: E0310 22:11:47.511607 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53c9d4c1-5253-49d5-8ade-272d01956b72" containerName="mariadb-database-create" Mar 10 22:11:47 crc kubenswrapper[4919]: I0310 22:11:47.511616 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="53c9d4c1-5253-49d5-8ade-272d01956b72" containerName="mariadb-database-create" Mar 10 22:11:47 crc kubenswrapper[4919]: I0310 22:11:47.511809 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c190e44-b111-4a65-9700-d0255aa11800" containerName="mariadb-database-create" Mar 10 22:11:47 crc kubenswrapper[4919]: I0310 22:11:47.511830 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="0fdc6e48-f4e3-4c70-8cab-78ad58edd483" containerName="dnsmasq-dns" Mar 10 22:11:47 crc kubenswrapper[4919]: I0310 22:11:47.511839 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="5759d1d2-d713-4e24-a2fb-c1c6804a4c39" containerName="mariadb-account-create-update" Mar 10 22:11:47 crc kubenswrapper[4919]: I0310 22:11:47.511854 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="53c9d4c1-5253-49d5-8ade-272d01956b72" containerName="mariadb-database-create" Mar 10 22:11:47 crc kubenswrapper[4919]: I0310 22:11:47.511870 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8965b29-3ef4-4db7-a67f-d905fe2e8c2c" containerName="keystone-db-sync" Mar 10 22:11:47 crc kubenswrapper[4919]: I0310 22:11:47.511879 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="1bad7d2b-c98f-49e7-86a8-3467f75830f2" containerName="mariadb-database-create" Mar 10 22:11:47 crc kubenswrapper[4919]: I0310 22:11:47.511889 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="31e6114e-103b-4653-b4f0-ba3c216e3437" containerName="mariadb-account-create-update" Mar 10 22:11:47 crc kubenswrapper[4919]: I0310 22:11:47.511904 4919 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="dbaea2e3-ed2a-41ca-96d1-9a837b1b2b15" containerName="mariadb-account-create-update" Mar 10 22:11:47 crc kubenswrapper[4919]: I0310 22:11:47.512538 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-m7z2j" Mar 10 22:11:47 crc kubenswrapper[4919]: I0310 22:11:47.515889 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 10 22:11:47 crc kubenswrapper[4919]: I0310 22:11:47.515949 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 10 22:11:47 crc kubenswrapper[4919]: I0310 22:11:47.516360 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-nbmmg" Mar 10 22:11:47 crc kubenswrapper[4919]: I0310 22:11:47.516533 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 10 22:11:47 crc kubenswrapper[4919]: I0310 22:11:47.525367 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-m7z2j"] Mar 10 22:11:47 crc kubenswrapper[4919]: I0310 22:11:47.525529 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 10 22:11:47 crc kubenswrapper[4919]: I0310 22:11:47.561220 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-666fdfc96f-2jjfq"] Mar 10 22:11:47 crc kubenswrapper[4919]: I0310 22:11:47.562788 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666fdfc96f-2jjfq" Mar 10 22:11:47 crc kubenswrapper[4919]: I0310 22:11:47.592160 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/689c6775-e11f-4644-bc89-dbef2174b343-credential-keys\") pod \"keystone-bootstrap-m7z2j\" (UID: \"689c6775-e11f-4644-bc89-dbef2174b343\") " pod="openstack/keystone-bootstrap-m7z2j" Mar 10 22:11:47 crc kubenswrapper[4919]: I0310 22:11:47.592292 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pnl6c\" (UniqueName: \"kubernetes.io/projected/689c6775-e11f-4644-bc89-dbef2174b343-kube-api-access-pnl6c\") pod \"keystone-bootstrap-m7z2j\" (UID: \"689c6775-e11f-4644-bc89-dbef2174b343\") " pod="openstack/keystone-bootstrap-m7z2j" Mar 10 22:11:47 crc kubenswrapper[4919]: I0310 22:11:47.592322 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/689c6775-e11f-4644-bc89-dbef2174b343-combined-ca-bundle\") pod \"keystone-bootstrap-m7z2j\" (UID: \"689c6775-e11f-4644-bc89-dbef2174b343\") " pod="openstack/keystone-bootstrap-m7z2j" Mar 10 22:11:47 crc kubenswrapper[4919]: I0310 22:11:47.592467 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/689c6775-e11f-4644-bc89-dbef2174b343-scripts\") pod \"keystone-bootstrap-m7z2j\" (UID: \"689c6775-e11f-4644-bc89-dbef2174b343\") " pod="openstack/keystone-bootstrap-m7z2j" Mar 10 22:11:47 crc kubenswrapper[4919]: I0310 22:11:47.592497 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/689c6775-e11f-4644-bc89-dbef2174b343-fernet-keys\") pod \"keystone-bootstrap-m7z2j\" (UID: 
\"689c6775-e11f-4644-bc89-dbef2174b343\") " pod="openstack/keystone-bootstrap-m7z2j"
Mar 10 22:11:47 crc kubenswrapper[4919]: I0310 22:11:47.592525 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/689c6775-e11f-4644-bc89-dbef2174b343-config-data\") pod \"keystone-bootstrap-m7z2j\" (UID: \"689c6775-e11f-4644-bc89-dbef2174b343\") " pod="openstack/keystone-bootstrap-m7z2j"
Mar 10 22:11:47 crc kubenswrapper[4919]: I0310 22:11:47.594101 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666fdfc96f-2jjfq"]
Mar 10 22:11:47 crc kubenswrapper[4919]: I0310 22:11:47.681678 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-jbk56"]
Mar 10 22:11:47 crc kubenswrapper[4919]: I0310 22:11:47.684455 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-jbk56"
Mar 10 22:11:47 crc kubenswrapper[4919]: I0310 22:11:47.688871 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-phplg"
Mar 10 22:11:47 crc kubenswrapper[4919]: I0310 22:11:47.688871 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config"
Mar 10 22:11:47 crc kubenswrapper[4919]: I0310 22:11:47.689202 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config"
Mar 10 22:11:47 crc kubenswrapper[4919]: I0310 22:11:47.696699 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/689c6775-e11f-4644-bc89-dbef2174b343-config-data\") pod \"keystone-bootstrap-m7z2j\" (UID: \"689c6775-e11f-4644-bc89-dbef2174b343\") " pod="openstack/keystone-bootstrap-m7z2j"
Mar 10 22:11:47 crc kubenswrapper[4919]: I0310 22:11:47.696796 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ttpk\" (UniqueName: \"kubernetes.io/projected/1d67c30e-12d3-4a7d-ae8a-eb62d08b4175-kube-api-access-7ttpk\") pod \"dnsmasq-dns-666fdfc96f-2jjfq\" (UID: \"1d67c30e-12d3-4a7d-ae8a-eb62d08b4175\") " pod="openstack/dnsmasq-dns-666fdfc96f-2jjfq"
Mar 10 22:11:47 crc kubenswrapper[4919]: I0310 22:11:47.696882 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/689c6775-e11f-4644-bc89-dbef2174b343-credential-keys\") pod \"keystone-bootstrap-m7z2j\" (UID: \"689c6775-e11f-4644-bc89-dbef2174b343\") " pod="openstack/keystone-bootstrap-m7z2j"
Mar 10 22:11:47 crc kubenswrapper[4919]: I0310 22:11:47.696944 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1d67c30e-12d3-4a7d-ae8a-eb62d08b4175-dns-svc\") pod \"dnsmasq-dns-666fdfc96f-2jjfq\" (UID: \"1d67c30e-12d3-4a7d-ae8a-eb62d08b4175\") " pod="openstack/dnsmasq-dns-666fdfc96f-2jjfq"
Mar 10 22:11:47 crc kubenswrapper[4919]: I0310 22:11:47.697061 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1d67c30e-12d3-4a7d-ae8a-eb62d08b4175-ovsdbserver-sb\") pod \"dnsmasq-dns-666fdfc96f-2jjfq\" (UID: \"1d67c30e-12d3-4a7d-ae8a-eb62d08b4175\") " pod="openstack/dnsmasq-dns-666fdfc96f-2jjfq"
Mar 10 22:11:47 crc kubenswrapper[4919]: I0310 22:11:47.697146 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1d67c30e-12d3-4a7d-ae8a-eb62d08b4175-ovsdbserver-nb\") pod \"dnsmasq-dns-666fdfc96f-2jjfq\" (UID: \"1d67c30e-12d3-4a7d-ae8a-eb62d08b4175\") " pod="openstack/dnsmasq-dns-666fdfc96f-2jjfq"
Mar 10 22:11:47 crc kubenswrapper[4919]: I0310 22:11:47.697209 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pnl6c\" (UniqueName: \"kubernetes.io/projected/689c6775-e11f-4644-bc89-dbef2174b343-kube-api-access-pnl6c\") pod \"keystone-bootstrap-m7z2j\" (UID: \"689c6775-e11f-4644-bc89-dbef2174b343\") " pod="openstack/keystone-bootstrap-m7z2j"
Mar 10 22:11:47 crc kubenswrapper[4919]: I0310 22:11:47.697266 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/689c6775-e11f-4644-bc89-dbef2174b343-combined-ca-bundle\") pod \"keystone-bootstrap-m7z2j\" (UID: \"689c6775-e11f-4644-bc89-dbef2174b343\") " pod="openstack/keystone-bootstrap-m7z2j"
Mar 10 22:11:47 crc kubenswrapper[4919]: I0310 22:11:47.697364 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1d67c30e-12d3-4a7d-ae8a-eb62d08b4175-dns-swift-storage-0\") pod \"dnsmasq-dns-666fdfc96f-2jjfq\" (UID: \"1d67c30e-12d3-4a7d-ae8a-eb62d08b4175\") " pod="openstack/dnsmasq-dns-666fdfc96f-2jjfq"
Mar 10 22:11:47 crc kubenswrapper[4919]: I0310 22:11:47.697439 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d67c30e-12d3-4a7d-ae8a-eb62d08b4175-config\") pod \"dnsmasq-dns-666fdfc96f-2jjfq\" (UID: \"1d67c30e-12d3-4a7d-ae8a-eb62d08b4175\") " pod="openstack/dnsmasq-dns-666fdfc96f-2jjfq"
Mar 10 22:11:47 crc kubenswrapper[4919]: I0310 22:11:47.697515 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/689c6775-e11f-4644-bc89-dbef2174b343-scripts\") pod \"keystone-bootstrap-m7z2j\" (UID: \"689c6775-e11f-4644-bc89-dbef2174b343\") " pod="openstack/keystone-bootstrap-m7z2j"
Mar 10 22:11:47 crc kubenswrapper[4919]: I0310 22:11:47.697554 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/689c6775-e11f-4644-bc89-dbef2174b343-fernet-keys\") pod \"keystone-bootstrap-m7z2j\" (UID: \"689c6775-e11f-4644-bc89-dbef2174b343\") " pod="openstack/keystone-bootstrap-m7z2j"
Mar 10 22:11:47 crc kubenswrapper[4919]: I0310 22:11:47.706057 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-jbk56"]
Mar 10 22:11:47 crc kubenswrapper[4919]: I0310 22:11:47.715516 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/689c6775-e11f-4644-bc89-dbef2174b343-credential-keys\") pod \"keystone-bootstrap-m7z2j\" (UID: \"689c6775-e11f-4644-bc89-dbef2174b343\") " pod="openstack/keystone-bootstrap-m7z2j"
Mar 10 22:11:47 crc kubenswrapper[4919]: I0310 22:11:47.718794 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/689c6775-e11f-4644-bc89-dbef2174b343-config-data\") pod \"keystone-bootstrap-m7z2j\" (UID: \"689c6775-e11f-4644-bc89-dbef2174b343\") " pod="openstack/keystone-bootstrap-m7z2j"
Mar 10 22:11:47 crc kubenswrapper[4919]: I0310 22:11:47.726542 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/689c6775-e11f-4644-bc89-dbef2174b343-fernet-keys\") pod \"keystone-bootstrap-m7z2j\" (UID: \"689c6775-e11f-4644-bc89-dbef2174b343\") " pod="openstack/keystone-bootstrap-m7z2j"
Mar 10 22:11:47 crc kubenswrapper[4919]: I0310 22:11:47.731261 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/689c6775-e11f-4644-bc89-dbef2174b343-combined-ca-bundle\") pod \"keystone-bootstrap-m7z2j\" (UID: \"689c6775-e11f-4644-bc89-dbef2174b343\") " pod="openstack/keystone-bootstrap-m7z2j"
Mar 10 22:11:47 crc kubenswrapper[4919]: I0310 22:11:47.743990 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/689c6775-e11f-4644-bc89-dbef2174b343-scripts\") pod \"keystone-bootstrap-m7z2j\" (UID: \"689c6775-e11f-4644-bc89-dbef2174b343\") " pod="openstack/keystone-bootstrap-m7z2j"
Mar 10 22:11:47 crc kubenswrapper[4919]: I0310 22:11:47.744379 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pnl6c\" (UniqueName: \"kubernetes.io/projected/689c6775-e11f-4644-bc89-dbef2174b343-kube-api-access-pnl6c\") pod \"keystone-bootstrap-m7z2j\" (UID: \"689c6775-e11f-4644-bc89-dbef2174b343\") " pod="openstack/keystone-bootstrap-m7z2j"
Mar 10 22:11:47 crc kubenswrapper[4919]: I0310 22:11:47.765466 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Mar 10 22:11:47 crc kubenswrapper[4919]: I0310 22:11:47.767769 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 10 22:11:47 crc kubenswrapper[4919]: I0310 22:11:47.774846 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Mar 10 22:11:47 crc kubenswrapper[4919]: I0310 22:11:47.775014 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Mar 10 22:11:47 crc kubenswrapper[4919]: I0310 22:11:47.775814 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 10 22:11:47 crc kubenswrapper[4919]: I0310 22:11:47.798784 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/5384b251-66f7-451a-ab29-0b88b8207838-config\") pod \"neutron-db-sync-jbk56\" (UID: \"5384b251-66f7-451a-ab29-0b88b8207838\") " pod="openstack/neutron-db-sync-jbk56"
Mar 10 22:11:47 crc kubenswrapper[4919]: I0310 22:11:47.798833 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1d67c30e-12d3-4a7d-ae8a-eb62d08b4175-ovsdbserver-nb\") pod \"dnsmasq-dns-666fdfc96f-2jjfq\" (UID: \"1d67c30e-12d3-4a7d-ae8a-eb62d08b4175\") " pod="openstack/dnsmasq-dns-666fdfc96f-2jjfq"
Mar 10 22:11:47 crc kubenswrapper[4919]: I0310 22:11:47.798887 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1d67c30e-12d3-4a7d-ae8a-eb62d08b4175-dns-swift-storage-0\") pod \"dnsmasq-dns-666fdfc96f-2jjfq\" (UID: \"1d67c30e-12d3-4a7d-ae8a-eb62d08b4175\") " pod="openstack/dnsmasq-dns-666fdfc96f-2jjfq"
Mar 10 22:11:47 crc kubenswrapper[4919]: I0310 22:11:47.798906 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d67c30e-12d3-4a7d-ae8a-eb62d08b4175-config\") pod \"dnsmasq-dns-666fdfc96f-2jjfq\" (UID: \"1d67c30e-12d3-4a7d-ae8a-eb62d08b4175\") " pod="openstack/dnsmasq-dns-666fdfc96f-2jjfq"
Mar 10 22:11:47 crc kubenswrapper[4919]: I0310 22:11:47.798951 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7ttpk\" (UniqueName: \"kubernetes.io/projected/1d67c30e-12d3-4a7d-ae8a-eb62d08b4175-kube-api-access-7ttpk\") pod \"dnsmasq-dns-666fdfc96f-2jjfq\" (UID: \"1d67c30e-12d3-4a7d-ae8a-eb62d08b4175\") " pod="openstack/dnsmasq-dns-666fdfc96f-2jjfq"
Mar 10 22:11:47 crc kubenswrapper[4919]: I0310 22:11:47.798980 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5384b251-66f7-451a-ab29-0b88b8207838-combined-ca-bundle\") pod \"neutron-db-sync-jbk56\" (UID: \"5384b251-66f7-451a-ab29-0b88b8207838\") " pod="openstack/neutron-db-sync-jbk56"
Mar 10 22:11:47 crc kubenswrapper[4919]: I0310 22:11:47.799002 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1d67c30e-12d3-4a7d-ae8a-eb62d08b4175-dns-svc\") pod \"dnsmasq-dns-666fdfc96f-2jjfq\" (UID: \"1d67c30e-12d3-4a7d-ae8a-eb62d08b4175\") " pod="openstack/dnsmasq-dns-666fdfc96f-2jjfq"
Mar 10 22:11:47 crc kubenswrapper[4919]: I0310 22:11:47.799024 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kmwzx\" (UniqueName: \"kubernetes.io/projected/5384b251-66f7-451a-ab29-0b88b8207838-kube-api-access-kmwzx\") pod \"neutron-db-sync-jbk56\" (UID: \"5384b251-66f7-451a-ab29-0b88b8207838\") " pod="openstack/neutron-db-sync-jbk56"
Mar 10 22:11:47 crc kubenswrapper[4919]: I0310 22:11:47.799050 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1d67c30e-12d3-4a7d-ae8a-eb62d08b4175-ovsdbserver-sb\") pod \"dnsmasq-dns-666fdfc96f-2jjfq\" (UID: \"1d67c30e-12d3-4a7d-ae8a-eb62d08b4175\") " pod="openstack/dnsmasq-dns-666fdfc96f-2jjfq"
Mar 10 22:11:47 crc kubenswrapper[4919]: I0310 22:11:47.800098 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1d67c30e-12d3-4a7d-ae8a-eb62d08b4175-ovsdbserver-sb\") pod \"dnsmasq-dns-666fdfc96f-2jjfq\" (UID: \"1d67c30e-12d3-4a7d-ae8a-eb62d08b4175\") " pod="openstack/dnsmasq-dns-666fdfc96f-2jjfq"
Mar 10 22:11:47 crc kubenswrapper[4919]: I0310 22:11:47.801020 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1d67c30e-12d3-4a7d-ae8a-eb62d08b4175-ovsdbserver-nb\") pod \"dnsmasq-dns-666fdfc96f-2jjfq\" (UID: \"1d67c30e-12d3-4a7d-ae8a-eb62d08b4175\") " pod="openstack/dnsmasq-dns-666fdfc96f-2jjfq"
Mar 10 22:11:47 crc kubenswrapper[4919]: I0310 22:11:47.801576 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1d67c30e-12d3-4a7d-ae8a-eb62d08b4175-dns-swift-storage-0\") pod \"dnsmasq-dns-666fdfc96f-2jjfq\" (UID: \"1d67c30e-12d3-4a7d-ae8a-eb62d08b4175\") " pod="openstack/dnsmasq-dns-666fdfc96f-2jjfq"
Mar 10 22:11:47 crc kubenswrapper[4919]: I0310 22:11:47.802080 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d67c30e-12d3-4a7d-ae8a-eb62d08b4175-config\") pod \"dnsmasq-dns-666fdfc96f-2jjfq\" (UID: \"1d67c30e-12d3-4a7d-ae8a-eb62d08b4175\") " pod="openstack/dnsmasq-dns-666fdfc96f-2jjfq"
Mar 10 22:11:47 crc kubenswrapper[4919]: I0310 22:11:47.816818 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1d67c30e-12d3-4a7d-ae8a-eb62d08b4175-dns-svc\") pod \"dnsmasq-dns-666fdfc96f-2jjfq\" (UID: \"1d67c30e-12d3-4a7d-ae8a-eb62d08b4175\") " pod="openstack/dnsmasq-dns-666fdfc96f-2jjfq"
Mar 10 22:11:47 crc kubenswrapper[4919]: I0310 22:11:47.833706 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-7xmvf"]
Mar 10 22:11:47 crc kubenswrapper[4919]: I0310 22:11:47.834437 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-m7z2j"
Mar 10 22:11:47 crc kubenswrapper[4919]: I0310 22:11:47.835080 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-7xmvf"
Mar 10 22:11:47 crc kubenswrapper[4919]: I0310 22:11:47.843920 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-pxw4p"
Mar 10 22:11:47 crc kubenswrapper[4919]: I0310 22:11:47.843971 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts"
Mar 10 22:11:47 crc kubenswrapper[4919]: I0310 22:11:47.851748 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data"
Mar 10 22:11:47 crc kubenswrapper[4919]: I0310 22:11:47.863049 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ttpk\" (UniqueName: \"kubernetes.io/projected/1d67c30e-12d3-4a7d-ae8a-eb62d08b4175-kube-api-access-7ttpk\") pod \"dnsmasq-dns-666fdfc96f-2jjfq\" (UID: \"1d67c30e-12d3-4a7d-ae8a-eb62d08b4175\") " pod="openstack/dnsmasq-dns-666fdfc96f-2jjfq"
Mar 10 22:11:47 crc kubenswrapper[4919]: I0310 22:11:47.883299 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666fdfc96f-2jjfq"
Mar 10 22:11:47 crc kubenswrapper[4919]: I0310 22:11:47.895798 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-7xmvf"]
Mar 10 22:11:47 crc kubenswrapper[4919]: I0310 22:11:47.900076 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a35812c5-ffc7-4307-ab31-390c9ee39262-run-httpd\") pod \"ceilometer-0\" (UID: \"a35812c5-ffc7-4307-ab31-390c9ee39262\") " pod="openstack/ceilometer-0"
Mar 10 22:11:47 crc kubenswrapper[4919]: I0310 22:11:47.900159 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a35812c5-ffc7-4307-ab31-390c9ee39262-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a35812c5-ffc7-4307-ab31-390c9ee39262\") " pod="openstack/ceilometer-0"
Mar 10 22:11:47 crc kubenswrapper[4919]: I0310 22:11:47.900195 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4h9lm\" (UniqueName: \"kubernetes.io/projected/a35812c5-ffc7-4307-ab31-390c9ee39262-kube-api-access-4h9lm\") pod \"ceilometer-0\" (UID: \"a35812c5-ffc7-4307-ab31-390c9ee39262\") " pod="openstack/ceilometer-0"
Mar 10 22:11:47 crc kubenswrapper[4919]: I0310 22:11:47.900228 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a35812c5-ffc7-4307-ab31-390c9ee39262-scripts\") pod \"ceilometer-0\" (UID: \"a35812c5-ffc7-4307-ab31-390c9ee39262\") " pod="openstack/ceilometer-0"
Mar 10 22:11:47 crc kubenswrapper[4919]: I0310 22:11:47.900251 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a35812c5-ffc7-4307-ab31-390c9ee39262-log-httpd\") pod \"ceilometer-0\" (UID: \"a35812c5-ffc7-4307-ab31-390c9ee39262\") " pod="openstack/ceilometer-0"
Mar 10 22:11:47 crc kubenswrapper[4919]: I0310 22:11:47.900298 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5384b251-66f7-451a-ab29-0b88b8207838-combined-ca-bundle\") pod \"neutron-db-sync-jbk56\" (UID: \"5384b251-66f7-451a-ab29-0b88b8207838\") " pod="openstack/neutron-db-sync-jbk56"
Mar 10 22:11:47 crc kubenswrapper[4919]: I0310 22:11:47.900443 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kmwzx\" (UniqueName: \"kubernetes.io/projected/5384b251-66f7-451a-ab29-0b88b8207838-kube-api-access-kmwzx\") pod \"neutron-db-sync-jbk56\" (UID: \"5384b251-66f7-451a-ab29-0b88b8207838\") " pod="openstack/neutron-db-sync-jbk56"
Mar 10 22:11:47 crc kubenswrapper[4919]: I0310 22:11:47.900552 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a35812c5-ffc7-4307-ab31-390c9ee39262-config-data\") pod \"ceilometer-0\" (UID: \"a35812c5-ffc7-4307-ab31-390c9ee39262\") " pod="openstack/ceilometer-0"
Mar 10 22:11:47 crc kubenswrapper[4919]: I0310 22:11:47.900579 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/5384b251-66f7-451a-ab29-0b88b8207838-config\") pod \"neutron-db-sync-jbk56\" (UID: \"5384b251-66f7-451a-ab29-0b88b8207838\") " pod="openstack/neutron-db-sync-jbk56"
Mar 10 22:11:47 crc kubenswrapper[4919]: I0310 22:11:47.900616 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a35812c5-ffc7-4307-ab31-390c9ee39262-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a35812c5-ffc7-4307-ab31-390c9ee39262\") " pod="openstack/ceilometer-0"
Mar 10 22:11:47 crc kubenswrapper[4919]: I0310 22:11:47.906596 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/5384b251-66f7-451a-ab29-0b88b8207838-config\") pod \"neutron-db-sync-jbk56\" (UID: \"5384b251-66f7-451a-ab29-0b88b8207838\") " pod="openstack/neutron-db-sync-jbk56"
Mar 10 22:11:47 crc kubenswrapper[4919]: I0310 22:11:47.907257 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5384b251-66f7-451a-ab29-0b88b8207838-combined-ca-bundle\") pod \"neutron-db-sync-jbk56\" (UID: \"5384b251-66f7-451a-ab29-0b88b8207838\") " pod="openstack/neutron-db-sync-jbk56"
Mar 10 22:11:47 crc kubenswrapper[4919]: I0310 22:11:47.910936 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-tbkwn"]
Mar 10 22:11:47 crc kubenswrapper[4919]: I0310 22:11:47.912276 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-tbkwn"
Mar 10 22:11:47 crc kubenswrapper[4919]: I0310 22:11:47.914216 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data"
Mar 10 22:11:47 crc kubenswrapper[4919]: I0310 22:11:47.914987 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-jntrm"
Mar 10 22:11:47 crc kubenswrapper[4919]: I0310 22:11:47.933929 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kmwzx\" (UniqueName: \"kubernetes.io/projected/5384b251-66f7-451a-ab29-0b88b8207838-kube-api-access-kmwzx\") pod \"neutron-db-sync-jbk56\" (UID: \"5384b251-66f7-451a-ab29-0b88b8207838\") " pod="openstack/neutron-db-sync-jbk56"
Mar 10 22:11:47 crc kubenswrapper[4919]: I0310 22:11:47.950150 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-tbkwn"]
Mar 10 22:11:47 crc kubenswrapper[4919]: I0310 22:11:47.970042 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666fdfc96f-2jjfq"]
Mar 10 22:11:47 crc kubenswrapper[4919]: I0310 22:11:47.995783 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-844b6c58c7-qwqvv"]
Mar 10 22:11:47 crc kubenswrapper[4919]: I0310 22:11:47.997135 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-844b6c58c7-qwqvv"
Mar 10 22:11:48 crc kubenswrapper[4919]: I0310 22:11:48.002202 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15160303-4913-49a5-8cd3-e8255ba657f6-combined-ca-bundle\") pod \"barbican-db-sync-tbkwn\" (UID: \"15160303-4913-49a5-8cd3-e8255ba657f6\") " pod="openstack/barbican-db-sync-tbkwn"
Mar 10 22:11:48 crc kubenswrapper[4919]: I0310 22:11:48.002256 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gkckt\" (UniqueName: \"kubernetes.io/projected/0376622a-15ed-42d8-98b9-ffa1138134ee-kube-api-access-gkckt\") pod \"cinder-db-sync-7xmvf\" (UID: \"0376622a-15ed-42d8-98b9-ffa1138134ee\") " pod="openstack/cinder-db-sync-7xmvf"
Mar 10 22:11:48 crc kubenswrapper[4919]: I0310 22:11:48.002293 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/15160303-4913-49a5-8cd3-e8255ba657f6-db-sync-config-data\") pod \"barbican-db-sync-tbkwn\" (UID: \"15160303-4913-49a5-8cd3-e8255ba657f6\") " pod="openstack/barbican-db-sync-tbkwn"
Mar 10 22:11:48 crc kubenswrapper[4919]: I0310 22:11:48.002312 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0376622a-15ed-42d8-98b9-ffa1138134ee-db-sync-config-data\") pod \"cinder-db-sync-7xmvf\" (UID: \"0376622a-15ed-42d8-98b9-ffa1138134ee\") " pod="openstack/cinder-db-sync-7xmvf"
Mar 10 22:11:48 crc kubenswrapper[4919]: I0310 22:11:48.002329 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wwvfm\" (UniqueName: \"kubernetes.io/projected/15160303-4913-49a5-8cd3-e8255ba657f6-kube-api-access-wwvfm\") pod \"barbican-db-sync-tbkwn\" (UID: \"15160303-4913-49a5-8cd3-e8255ba657f6\") " pod="openstack/barbican-db-sync-tbkwn"
Mar 10 22:11:48 crc kubenswrapper[4919]: I0310 22:11:48.002358 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0376622a-15ed-42d8-98b9-ffa1138134ee-scripts\") pod \"cinder-db-sync-7xmvf\" (UID: \"0376622a-15ed-42d8-98b9-ffa1138134ee\") " pod="openstack/cinder-db-sync-7xmvf"
Mar 10 22:11:48 crc kubenswrapper[4919]: I0310 22:11:48.002400 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a35812c5-ffc7-4307-ab31-390c9ee39262-config-data\") pod \"ceilometer-0\" (UID: \"a35812c5-ffc7-4307-ab31-390c9ee39262\") " pod="openstack/ceilometer-0"
Mar 10 22:11:48 crc kubenswrapper[4919]: I0310 22:11:48.002424 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0376622a-15ed-42d8-98b9-ffa1138134ee-config-data\") pod \"cinder-db-sync-7xmvf\" (UID: \"0376622a-15ed-42d8-98b9-ffa1138134ee\") " pod="openstack/cinder-db-sync-7xmvf"
Mar 10 22:11:48 crc kubenswrapper[4919]: I0310 22:11:48.002442 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a35812c5-ffc7-4307-ab31-390c9ee39262-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a35812c5-ffc7-4307-ab31-390c9ee39262\") " pod="openstack/ceilometer-0"
Mar 10 22:11:48 crc kubenswrapper[4919]: I0310 22:11:48.002473 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a35812c5-ffc7-4307-ab31-390c9ee39262-run-httpd\") pod \"ceilometer-0\" (UID: \"a35812c5-ffc7-4307-ab31-390c9ee39262\") " pod="openstack/ceilometer-0"
Mar 10 22:11:48 crc kubenswrapper[4919]: I0310 22:11:48.002507 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a35812c5-ffc7-4307-ab31-390c9ee39262-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a35812c5-ffc7-4307-ab31-390c9ee39262\") " pod="openstack/ceilometer-0"
Mar 10 22:11:48 crc kubenswrapper[4919]: I0310 22:11:48.002522 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0376622a-15ed-42d8-98b9-ffa1138134ee-etc-machine-id\") pod \"cinder-db-sync-7xmvf\" (UID: \"0376622a-15ed-42d8-98b9-ffa1138134ee\") " pod="openstack/cinder-db-sync-7xmvf"
Mar 10 22:11:48 crc kubenswrapper[4919]: I0310 22:11:48.002542 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4h9lm\" (UniqueName: \"kubernetes.io/projected/a35812c5-ffc7-4307-ab31-390c9ee39262-kube-api-access-4h9lm\") pod \"ceilometer-0\" (UID: \"a35812c5-ffc7-4307-ab31-390c9ee39262\") " pod="openstack/ceilometer-0"
Mar 10 22:11:48 crc kubenswrapper[4919]: I0310 22:11:48.002570 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0376622a-15ed-42d8-98b9-ffa1138134ee-combined-ca-bundle\") pod \"cinder-db-sync-7xmvf\" (UID: \"0376622a-15ed-42d8-98b9-ffa1138134ee\") " pod="openstack/cinder-db-sync-7xmvf"
Mar 10 22:11:48 crc kubenswrapper[4919]: I0310 22:11:48.002596 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a35812c5-ffc7-4307-ab31-390c9ee39262-scripts\") pod \"ceilometer-0\" (UID: \"a35812c5-ffc7-4307-ab31-390c9ee39262\") " pod="openstack/ceilometer-0"
Mar 10 22:11:48 crc kubenswrapper[4919]: I0310 22:11:48.002613 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a35812c5-ffc7-4307-ab31-390c9ee39262-log-httpd\") pod \"ceilometer-0\" (UID: \"a35812c5-ffc7-4307-ab31-390c9ee39262\") " pod="openstack/ceilometer-0"
Mar 10 22:11:48 crc kubenswrapper[4919]: I0310 22:11:48.003027 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a35812c5-ffc7-4307-ab31-390c9ee39262-log-httpd\") pod \"ceilometer-0\" (UID: \"a35812c5-ffc7-4307-ab31-390c9ee39262\") " pod="openstack/ceilometer-0"
Mar 10 22:11:48 crc kubenswrapper[4919]: I0310 22:11:48.005619 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-844b6c58c7-qwqvv"]
Mar 10 22:11:48 crc kubenswrapper[4919]: I0310 22:11:48.005989 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a35812c5-ffc7-4307-ab31-390c9ee39262-run-httpd\") pod \"ceilometer-0\" (UID: \"a35812c5-ffc7-4307-ab31-390c9ee39262\") " pod="openstack/ceilometer-0"
Mar 10 22:11:48 crc kubenswrapper[4919]: I0310 22:11:48.016169 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a35812c5-ffc7-4307-ab31-390c9ee39262-scripts\") pod \"ceilometer-0\" (UID: \"a35812c5-ffc7-4307-ab31-390c9ee39262\") " pod="openstack/ceilometer-0"
Mar 10 22:11:48 crc kubenswrapper[4919]: I0310 22:11:48.017016 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a35812c5-ffc7-4307-ab31-390c9ee39262-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a35812c5-ffc7-4307-ab31-390c9ee39262\") " pod="openstack/ceilometer-0"
Mar 10 22:11:48 crc kubenswrapper[4919]: I0310 22:11:48.017665 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a35812c5-ffc7-4307-ab31-390c9ee39262-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a35812c5-ffc7-4307-ab31-390c9ee39262\") " pod="openstack/ceilometer-0"
Mar 10 22:11:48 crc kubenswrapper[4919]: I0310 22:11:48.027665 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-q9dwb"]
Mar 10 22:11:48 crc kubenswrapper[4919]: I0310 22:11:48.030196 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4h9lm\" (UniqueName: \"kubernetes.io/projected/a35812c5-ffc7-4307-ab31-390c9ee39262-kube-api-access-4h9lm\") pod \"ceilometer-0\" (UID: \"a35812c5-ffc7-4307-ab31-390c9ee39262\") " pod="openstack/ceilometer-0"
Mar 10 22:11:48 crc kubenswrapper[4919]: I0310 22:11:48.031354 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a35812c5-ffc7-4307-ab31-390c9ee39262-config-data\") pod \"ceilometer-0\" (UID: \"a35812c5-ffc7-4307-ab31-390c9ee39262\") " pod="openstack/ceilometer-0"
Mar 10 22:11:48 crc kubenswrapper[4919]: I0310 22:11:48.032063 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-q9dwb"
Mar 10 22:11:48 crc kubenswrapper[4919]: I0310 22:11:48.038073 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data"
Mar 10 22:11:48 crc kubenswrapper[4919]: I0310 22:11:48.038480 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts"
Mar 10 22:11:48 crc kubenswrapper[4919]: I0310 22:11:48.038666 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-f2pl6"
Mar 10 22:11:48 crc kubenswrapper[4919]: I0310 22:11:48.038866 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-q9dwb"]
Mar 10 22:11:48 crc kubenswrapper[4919]: I0310 22:11:48.103630 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0376622a-15ed-42d8-98b9-ffa1138134ee-config-data\") pod \"cinder-db-sync-7xmvf\" (UID: \"0376622a-15ed-42d8-98b9-ffa1138134ee\") " pod="openstack/cinder-db-sync-7xmvf"
Mar 10 22:11:48 crc kubenswrapper[4919]: I0310 22:11:48.103893 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h87hg\" (UniqueName: \"kubernetes.io/projected/e8b579d7-131a-4166-a67f-19294e5f652b-kube-api-access-h87hg\") pod \"dnsmasq-dns-844b6c58c7-qwqvv\" (UID: \"e8b579d7-131a-4166-a67f-19294e5f652b\") " pod="openstack/dnsmasq-dns-844b6c58c7-qwqvv"
Mar 10 22:11:48 crc kubenswrapper[4919]: I0310 22:11:48.103917 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02d206af-330a-4526-8a3e-7826a1acb153-combined-ca-bundle\") pod \"placement-db-sync-q9dwb\" (UID: \"02d206af-330a-4526-8a3e-7826a1acb153\") " pod="openstack/placement-db-sync-q9dwb"
Mar 10 22:11:48 crc kubenswrapper[4919]: I0310 22:11:48.103952 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e8b579d7-131a-4166-a67f-19294e5f652b-ovsdbserver-sb\") pod \"dnsmasq-dns-844b6c58c7-qwqvv\" (UID: \"e8b579d7-131a-4166-a67f-19294e5f652b\") " pod="openstack/dnsmasq-dns-844b6c58c7-qwqvv"
Mar 10 22:11:48 crc kubenswrapper[4919]: I0310 22:11:48.103972 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8b579d7-131a-4166-a67f-19294e5f652b-config\") pod \"dnsmasq-dns-844b6c58c7-qwqvv\" (UID: \"e8b579d7-131a-4166-a67f-19294e5f652b\") " pod="openstack/dnsmasq-dns-844b6c58c7-qwqvv"
Mar 10 22:11:48 crc kubenswrapper[4919]: I0310 22:11:48.103998 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0376622a-15ed-42d8-98b9-ffa1138134ee-etc-machine-id\") pod \"cinder-db-sync-7xmvf\" (UID: \"0376622a-15ed-42d8-98b9-ffa1138134ee\") " pod="openstack/cinder-db-sync-7xmvf"
Mar 10 22:11:48 crc kubenswrapper[4919]: I0310 22:11:48.104014 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02d206af-330a-4526-8a3e-7826a1acb153-config-data\") pod \"placement-db-sync-q9dwb\" (UID: \"02d206af-330a-4526-8a3e-7826a1acb153\") " pod="openstack/placement-db-sync-q9dwb"
Mar 10 22:11:48 crc kubenswrapper[4919]: I0310 22:11:48.104054 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0376622a-15ed-42d8-98b9-ffa1138134ee-etc-machine-id\") pod \"cinder-db-sync-7xmvf\" (UID: \"0376622a-15ed-42d8-98b9-ffa1138134ee\") " pod="openstack/cinder-db-sync-7xmvf"
Mar 10 22:11:48 crc kubenswrapper[4919]: I0310 22:11:48.104106 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0376622a-15ed-42d8-98b9-ffa1138134ee-combined-ca-bundle\") pod \"cinder-db-sync-7xmvf\" (UID: \"0376622a-15ed-42d8-98b9-ffa1138134ee\") " pod="openstack/cinder-db-sync-7xmvf"
Mar 10 22:11:48 crc kubenswrapper[4919]: I0310 22:11:48.104136 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e8b579d7-131a-4166-a67f-19294e5f652b-ovsdbserver-nb\") pod \"dnsmasq-dns-844b6c58c7-qwqvv\" (UID: \"e8b579d7-131a-4166-a67f-19294e5f652b\") " pod="openstack/dnsmasq-dns-844b6c58c7-qwqvv"
Mar 10 22:11:48 crc kubenswrapper[4919]: I0310 22:11:48.104166 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e8b579d7-131a-4166-a67f-19294e5f652b-dns-svc\") pod \"dnsmasq-dns-844b6c58c7-qwqvv\" (UID: \"e8b579d7-131a-4166-a67f-19294e5f652b\") " pod="openstack/dnsmasq-dns-844b6c58c7-qwqvv"
Mar 10 22:11:48 crc kubenswrapper[4919]: I0310 22:11:48.104188 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e8b579d7-131a-4166-a67f-19294e5f652b-dns-swift-storage-0\") pod \"dnsmasq-dns-844b6c58c7-qwqvv\" (UID: \"e8b579d7-131a-4166-a67f-19294e5f652b\") " pod="openstack/dnsmasq-dns-844b6c58c7-qwqvv"
Mar 10 22:11:48 crc kubenswrapper[4919]: I0310 22:11:48.104207 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/02d206af-330a-4526-8a3e-7826a1acb153-scripts\") pod \"placement-db-sync-q9dwb\" (UID: \"02d206af-330a-4526-8a3e-7826a1acb153\") " pod="openstack/placement-db-sync-q9dwb"
Mar 10 22:11:48 crc kubenswrapper[4919]: I0310 22:11:48.104225 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15160303-4913-49a5-8cd3-e8255ba657f6-combined-ca-bundle\") pod \"barbican-db-sync-tbkwn\" (UID: \"15160303-4913-49a5-8cd3-e8255ba657f6\") " pod="openstack/barbican-db-sync-tbkwn"
Mar 10 22:11:48 crc kubenswrapper[4919]: I0310 22:11:48.104323 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gkckt\" (UniqueName: \"kubernetes.io/projected/0376622a-15ed-42d8-98b9-ffa1138134ee-kube-api-access-gkckt\") pod \"cinder-db-sync-7xmvf\" (UID: \"0376622a-15ed-42d8-98b9-ffa1138134ee\") " pod="openstack/cinder-db-sync-7xmvf"
Mar 10 22:11:48 crc kubenswrapper[4919]: I0310 22:11:48.104348 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qhqf\" (UniqueName: \"kubernetes.io/projected/02d206af-330a-4526-8a3e-7826a1acb153-kube-api-access-6qhqf\") pod \"placement-db-sync-q9dwb\" (UID: \"02d206af-330a-4526-8a3e-7826a1acb153\") " pod="openstack/placement-db-sync-q9dwb"
Mar 10 22:11:48 crc kubenswrapper[4919]: I0310 22:11:48.104414 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/15160303-4913-49a5-8cd3-e8255ba657f6-db-sync-config-data\") pod \"barbican-db-sync-tbkwn\" (UID: \"15160303-4913-49a5-8cd3-e8255ba657f6\") " pod="openstack/barbican-db-sync-tbkwn"
Mar 10 22:11:48 crc kubenswrapper[4919]: I0310 22:11:48.104434 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0376622a-15ed-42d8-98b9-ffa1138134ee-db-sync-config-data\") pod \"cinder-db-sync-7xmvf\" (UID: \"0376622a-15ed-42d8-98b9-ffa1138134ee\") " pod="openstack/cinder-db-sync-7xmvf"
Mar 10 22:11:48 crc kubenswrapper[4919]: I0310 22:11:48.104449 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wwvfm\" (UniqueName: 
\"kubernetes.io/projected/15160303-4913-49a5-8cd3-e8255ba657f6-kube-api-access-wwvfm\") pod \"barbican-db-sync-tbkwn\" (UID: \"15160303-4913-49a5-8cd3-e8255ba657f6\") " pod="openstack/barbican-db-sync-tbkwn" Mar 10 22:11:48 crc kubenswrapper[4919]: I0310 22:11:48.104476 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0376622a-15ed-42d8-98b9-ffa1138134ee-scripts\") pod \"cinder-db-sync-7xmvf\" (UID: \"0376622a-15ed-42d8-98b9-ffa1138134ee\") " pod="openstack/cinder-db-sync-7xmvf" Mar 10 22:11:48 crc kubenswrapper[4919]: I0310 22:11:48.104495 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/02d206af-330a-4526-8a3e-7826a1acb153-logs\") pod \"placement-db-sync-q9dwb\" (UID: \"02d206af-330a-4526-8a3e-7826a1acb153\") " pod="openstack/placement-db-sync-q9dwb" Mar 10 22:11:48 crc kubenswrapper[4919]: I0310 22:11:48.107910 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0376622a-15ed-42d8-98b9-ffa1138134ee-scripts\") pod \"cinder-db-sync-7xmvf\" (UID: \"0376622a-15ed-42d8-98b9-ffa1138134ee\") " pod="openstack/cinder-db-sync-7xmvf" Mar 10 22:11:48 crc kubenswrapper[4919]: I0310 22:11:48.108066 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0376622a-15ed-42d8-98b9-ffa1138134ee-config-data\") pod \"cinder-db-sync-7xmvf\" (UID: \"0376622a-15ed-42d8-98b9-ffa1138134ee\") " pod="openstack/cinder-db-sync-7xmvf" Mar 10 22:11:48 crc kubenswrapper[4919]: I0310 22:11:48.108092 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15160303-4913-49a5-8cd3-e8255ba657f6-combined-ca-bundle\") pod \"barbican-db-sync-tbkwn\" (UID: \"15160303-4913-49a5-8cd3-e8255ba657f6\") " 
pod="openstack/barbican-db-sync-tbkwn" Mar 10 22:11:48 crc kubenswrapper[4919]: I0310 22:11:48.108952 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0376622a-15ed-42d8-98b9-ffa1138134ee-combined-ca-bundle\") pod \"cinder-db-sync-7xmvf\" (UID: \"0376622a-15ed-42d8-98b9-ffa1138134ee\") " pod="openstack/cinder-db-sync-7xmvf" Mar 10 22:11:48 crc kubenswrapper[4919]: I0310 22:11:48.109280 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0376622a-15ed-42d8-98b9-ffa1138134ee-db-sync-config-data\") pod \"cinder-db-sync-7xmvf\" (UID: \"0376622a-15ed-42d8-98b9-ffa1138134ee\") " pod="openstack/cinder-db-sync-7xmvf" Mar 10 22:11:48 crc kubenswrapper[4919]: I0310 22:11:48.110922 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/15160303-4913-49a5-8cd3-e8255ba657f6-db-sync-config-data\") pod \"barbican-db-sync-tbkwn\" (UID: \"15160303-4913-49a5-8cd3-e8255ba657f6\") " pod="openstack/barbican-db-sync-tbkwn" Mar 10 22:11:48 crc kubenswrapper[4919]: I0310 22:11:48.119439 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-jbk56" Mar 10 22:11:48 crc kubenswrapper[4919]: I0310 22:11:48.120129 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gkckt\" (UniqueName: \"kubernetes.io/projected/0376622a-15ed-42d8-98b9-ffa1138134ee-kube-api-access-gkckt\") pod \"cinder-db-sync-7xmvf\" (UID: \"0376622a-15ed-42d8-98b9-ffa1138134ee\") " pod="openstack/cinder-db-sync-7xmvf" Mar 10 22:11:48 crc kubenswrapper[4919]: I0310 22:11:48.121749 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wwvfm\" (UniqueName: \"kubernetes.io/projected/15160303-4913-49a5-8cd3-e8255ba657f6-kube-api-access-wwvfm\") pod \"barbican-db-sync-tbkwn\" (UID: \"15160303-4913-49a5-8cd3-e8255ba657f6\") " pod="openstack/barbican-db-sync-tbkwn" Mar 10 22:11:48 crc kubenswrapper[4919]: I0310 22:11:48.138843 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 10 22:11:48 crc kubenswrapper[4919]: I0310 22:11:48.205591 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e8b579d7-131a-4166-a67f-19294e5f652b-dns-svc\") pod \"dnsmasq-dns-844b6c58c7-qwqvv\" (UID: \"e8b579d7-131a-4166-a67f-19294e5f652b\") " pod="openstack/dnsmasq-dns-844b6c58c7-qwqvv" Mar 10 22:11:48 crc kubenswrapper[4919]: I0310 22:11:48.205637 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e8b579d7-131a-4166-a67f-19294e5f652b-dns-swift-storage-0\") pod \"dnsmasq-dns-844b6c58c7-qwqvv\" (UID: \"e8b579d7-131a-4166-a67f-19294e5f652b\") " pod="openstack/dnsmasq-dns-844b6c58c7-qwqvv" Mar 10 22:11:48 crc kubenswrapper[4919]: I0310 22:11:48.205656 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/02d206af-330a-4526-8a3e-7826a1acb153-scripts\") pod \"placement-db-sync-q9dwb\" (UID: \"02d206af-330a-4526-8a3e-7826a1acb153\") " pod="openstack/placement-db-sync-q9dwb" Mar 10 22:11:48 crc kubenswrapper[4919]: I0310 22:11:48.205682 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6qhqf\" (UniqueName: \"kubernetes.io/projected/02d206af-330a-4526-8a3e-7826a1acb153-kube-api-access-6qhqf\") pod \"placement-db-sync-q9dwb\" (UID: \"02d206af-330a-4526-8a3e-7826a1acb153\") " pod="openstack/placement-db-sync-q9dwb" Mar 10 22:11:48 crc kubenswrapper[4919]: I0310 22:11:48.205748 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/02d206af-330a-4526-8a3e-7826a1acb153-logs\") pod \"placement-db-sync-q9dwb\" (UID: \"02d206af-330a-4526-8a3e-7826a1acb153\") " pod="openstack/placement-db-sync-q9dwb" Mar 10 22:11:48 crc kubenswrapper[4919]: I0310 22:11:48.205778 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h87hg\" (UniqueName: \"kubernetes.io/projected/e8b579d7-131a-4166-a67f-19294e5f652b-kube-api-access-h87hg\") pod \"dnsmasq-dns-844b6c58c7-qwqvv\" (UID: \"e8b579d7-131a-4166-a67f-19294e5f652b\") " pod="openstack/dnsmasq-dns-844b6c58c7-qwqvv" Mar 10 22:11:48 crc kubenswrapper[4919]: I0310 22:11:48.205799 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02d206af-330a-4526-8a3e-7826a1acb153-combined-ca-bundle\") pod \"placement-db-sync-q9dwb\" (UID: \"02d206af-330a-4526-8a3e-7826a1acb153\") " pod="openstack/placement-db-sync-q9dwb" Mar 10 22:11:48 crc kubenswrapper[4919]: I0310 22:11:48.205831 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e8b579d7-131a-4166-a67f-19294e5f652b-ovsdbserver-sb\") pod 
\"dnsmasq-dns-844b6c58c7-qwqvv\" (UID: \"e8b579d7-131a-4166-a67f-19294e5f652b\") " pod="openstack/dnsmasq-dns-844b6c58c7-qwqvv" Mar 10 22:11:48 crc kubenswrapper[4919]: I0310 22:11:48.205850 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8b579d7-131a-4166-a67f-19294e5f652b-config\") pod \"dnsmasq-dns-844b6c58c7-qwqvv\" (UID: \"e8b579d7-131a-4166-a67f-19294e5f652b\") " pod="openstack/dnsmasq-dns-844b6c58c7-qwqvv" Mar 10 22:11:48 crc kubenswrapper[4919]: I0310 22:11:48.205873 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02d206af-330a-4526-8a3e-7826a1acb153-config-data\") pod \"placement-db-sync-q9dwb\" (UID: \"02d206af-330a-4526-8a3e-7826a1acb153\") " pod="openstack/placement-db-sync-q9dwb" Mar 10 22:11:48 crc kubenswrapper[4919]: I0310 22:11:48.205921 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e8b579d7-131a-4166-a67f-19294e5f652b-ovsdbserver-nb\") pod \"dnsmasq-dns-844b6c58c7-qwqvv\" (UID: \"e8b579d7-131a-4166-a67f-19294e5f652b\") " pod="openstack/dnsmasq-dns-844b6c58c7-qwqvv" Mar 10 22:11:48 crc kubenswrapper[4919]: I0310 22:11:48.206844 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e8b579d7-131a-4166-a67f-19294e5f652b-ovsdbserver-nb\") pod \"dnsmasq-dns-844b6c58c7-qwqvv\" (UID: \"e8b579d7-131a-4166-a67f-19294e5f652b\") " pod="openstack/dnsmasq-dns-844b6c58c7-qwqvv" Mar 10 22:11:48 crc kubenswrapper[4919]: I0310 22:11:48.207021 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e8b579d7-131a-4166-a67f-19294e5f652b-dns-svc\") pod \"dnsmasq-dns-844b6c58c7-qwqvv\" (UID: \"e8b579d7-131a-4166-a67f-19294e5f652b\") " 
pod="openstack/dnsmasq-dns-844b6c58c7-qwqvv" Mar 10 22:11:48 crc kubenswrapper[4919]: I0310 22:11:48.207733 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/02d206af-330a-4526-8a3e-7826a1acb153-logs\") pod \"placement-db-sync-q9dwb\" (UID: \"02d206af-330a-4526-8a3e-7826a1acb153\") " pod="openstack/placement-db-sync-q9dwb" Mar 10 22:11:48 crc kubenswrapper[4919]: I0310 22:11:48.207765 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e8b579d7-131a-4166-a67f-19294e5f652b-ovsdbserver-sb\") pod \"dnsmasq-dns-844b6c58c7-qwqvv\" (UID: \"e8b579d7-131a-4166-a67f-19294e5f652b\") " pod="openstack/dnsmasq-dns-844b6c58c7-qwqvv" Mar 10 22:11:48 crc kubenswrapper[4919]: I0310 22:11:48.207827 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8b579d7-131a-4166-a67f-19294e5f652b-config\") pod \"dnsmasq-dns-844b6c58c7-qwqvv\" (UID: \"e8b579d7-131a-4166-a67f-19294e5f652b\") " pod="openstack/dnsmasq-dns-844b6c58c7-qwqvv" Mar 10 22:11:48 crc kubenswrapper[4919]: I0310 22:11:48.208150 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e8b579d7-131a-4166-a67f-19294e5f652b-dns-swift-storage-0\") pod \"dnsmasq-dns-844b6c58c7-qwqvv\" (UID: \"e8b579d7-131a-4166-a67f-19294e5f652b\") " pod="openstack/dnsmasq-dns-844b6c58c7-qwqvv" Mar 10 22:11:48 crc kubenswrapper[4919]: I0310 22:11:48.214019 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/02d206af-330a-4526-8a3e-7826a1acb153-scripts\") pod \"placement-db-sync-q9dwb\" (UID: \"02d206af-330a-4526-8a3e-7826a1acb153\") " pod="openstack/placement-db-sync-q9dwb" Mar 10 22:11:48 crc kubenswrapper[4919]: I0310 22:11:48.219048 4919 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02d206af-330a-4526-8a3e-7826a1acb153-combined-ca-bundle\") pod \"placement-db-sync-q9dwb\" (UID: \"02d206af-330a-4526-8a3e-7826a1acb153\") " pod="openstack/placement-db-sync-q9dwb" Mar 10 22:11:48 crc kubenswrapper[4919]: I0310 22:11:48.220024 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02d206af-330a-4526-8a3e-7826a1acb153-config-data\") pod \"placement-db-sync-q9dwb\" (UID: \"02d206af-330a-4526-8a3e-7826a1acb153\") " pod="openstack/placement-db-sync-q9dwb" Mar 10 22:11:48 crc kubenswrapper[4919]: I0310 22:11:48.224974 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qhqf\" (UniqueName: \"kubernetes.io/projected/02d206af-330a-4526-8a3e-7826a1acb153-kube-api-access-6qhqf\") pod \"placement-db-sync-q9dwb\" (UID: \"02d206af-330a-4526-8a3e-7826a1acb153\") " pod="openstack/placement-db-sync-q9dwb" Mar 10 22:11:48 crc kubenswrapper[4919]: I0310 22:11:48.236155 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h87hg\" (UniqueName: \"kubernetes.io/projected/e8b579d7-131a-4166-a67f-19294e5f652b-kube-api-access-h87hg\") pod \"dnsmasq-dns-844b6c58c7-qwqvv\" (UID: \"e8b579d7-131a-4166-a67f-19294e5f652b\") " pod="openstack/dnsmasq-dns-844b6c58c7-qwqvv" Mar 10 22:11:48 crc kubenswrapper[4919]: I0310 22:11:48.250900 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-7xmvf" Mar 10 22:11:48 crc kubenswrapper[4919]: I0310 22:11:48.279086 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-tbkwn" Mar 10 22:11:48 crc kubenswrapper[4919]: I0310 22:11:48.330979 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-844b6c58c7-qwqvv" Mar 10 22:11:48 crc kubenswrapper[4919]: I0310 22:11:48.360118 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-q9dwb" Mar 10 22:11:48 crc kubenswrapper[4919]: I0310 22:11:48.441570 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-m7z2j"] Mar 10 22:11:48 crc kubenswrapper[4919]: I0310 22:11:48.544877 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666fdfc96f-2jjfq"] Mar 10 22:11:48 crc kubenswrapper[4919]: I0310 22:11:48.664413 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-jbk56"] Mar 10 22:11:48 crc kubenswrapper[4919]: W0310 22:11:48.667912 4919 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5384b251_66f7_451a_ab29_0b88b8207838.slice/crio-bfb4baa4ea386306dc6014e09ef3053d816ba9f67c4e4dc4c35a2d61acf1019d WatchSource:0}: Error finding container bfb4baa4ea386306dc6014e09ef3053d816ba9f67c4e4dc4c35a2d61acf1019d: Status 404 returned error can't find the container with id bfb4baa4ea386306dc6014e09ef3053d816ba9f67c4e4dc4c35a2d61acf1019d Mar 10 22:11:48 crc kubenswrapper[4919]: I0310 22:11:48.841621 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 10 22:11:48 crc kubenswrapper[4919]: I0310 22:11:48.859794 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-tbkwn"] Mar 10 22:11:48 crc kubenswrapper[4919]: I0310 22:11:48.868997 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-7xmvf"] Mar 10 22:11:49 crc kubenswrapper[4919]: I0310 22:11:49.058892 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-q9dwb"] Mar 10 22:11:49 crc kubenswrapper[4919]: I0310 22:11:49.114679 4919 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/dnsmasq-dns-844b6c58c7-qwqvv"] Mar 10 22:11:49 crc kubenswrapper[4919]: I0310 22:11:49.249977 4919 generic.go:334] "Generic (PLEG): container finished" podID="14a85d0b-0c54-4da3-8646-b3e0ca6fe5d5" containerID="df3255a2d0759e4981bfd7e03b09952206e74469af65e2de0fe714948a7f652b" exitCode=0 Mar 10 22:11:49 crc kubenswrapper[4919]: I0310 22:11:49.250097 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-rcb6h" event={"ID":"14a85d0b-0c54-4da3-8646-b3e0ca6fe5d5","Type":"ContainerDied","Data":"df3255a2d0759e4981bfd7e03b09952206e74469af65e2de0fe714948a7f652b"} Mar 10 22:11:49 crc kubenswrapper[4919]: I0310 22:11:49.265590 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-jbk56" event={"ID":"5384b251-66f7-451a-ab29-0b88b8207838","Type":"ContainerStarted","Data":"236e9edb8142b4785375f2f9d21591aeca381142891f25109c78d197b1c4208e"} Mar 10 22:11:49 crc kubenswrapper[4919]: I0310 22:11:49.265634 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-jbk56" event={"ID":"5384b251-66f7-451a-ab29-0b88b8207838","Type":"ContainerStarted","Data":"bfb4baa4ea386306dc6014e09ef3053d816ba9f67c4e4dc4c35a2d61acf1019d"} Mar 10 22:11:49 crc kubenswrapper[4919]: I0310 22:11:49.272289 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-q9dwb" event={"ID":"02d206af-330a-4526-8a3e-7826a1acb153","Type":"ContainerStarted","Data":"eb052bc847f69c3de26e1e830915dc4812075b65e200888bd4dbf9d33ac585b3"} Mar 10 22:11:49 crc kubenswrapper[4919]: I0310 22:11:49.274075 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-844b6c58c7-qwqvv" event={"ID":"e8b579d7-131a-4166-a67f-19294e5f652b","Type":"ContainerStarted","Data":"ffe3cf7a86c6a0dbe148f65434f789866d9cf96005b6dab82178a1f00c0bdaba"} Mar 10 22:11:49 crc kubenswrapper[4919]: I0310 22:11:49.276020 4919 generic.go:334] "Generic (PLEG): container finished" 
podID="1d67c30e-12d3-4a7d-ae8a-eb62d08b4175" containerID="6559cf79bad4a312de9e816a29bbf4570fea7d4640ecebe3e3d4be9e603b66dd" exitCode=0 Mar 10 22:11:49 crc kubenswrapper[4919]: I0310 22:11:49.276056 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666fdfc96f-2jjfq" event={"ID":"1d67c30e-12d3-4a7d-ae8a-eb62d08b4175","Type":"ContainerDied","Data":"6559cf79bad4a312de9e816a29bbf4570fea7d4640ecebe3e3d4be9e603b66dd"} Mar 10 22:11:49 crc kubenswrapper[4919]: I0310 22:11:49.276088 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666fdfc96f-2jjfq" event={"ID":"1d67c30e-12d3-4a7d-ae8a-eb62d08b4175","Type":"ContainerStarted","Data":"836856f48f4347e7a676029f9d72799db658e421ce62c8347f6db39b00991c9a"} Mar 10 22:11:49 crc kubenswrapper[4919]: I0310 22:11:49.277383 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a35812c5-ffc7-4307-ab31-390c9ee39262","Type":"ContainerStarted","Data":"24b773789dcd5611dbb3954ea8be750ba760bcc5a05ebaf48d75bd37a179eb15"} Mar 10 22:11:49 crc kubenswrapper[4919]: I0310 22:11:49.278757 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-7xmvf" event={"ID":"0376622a-15ed-42d8-98b9-ffa1138134ee","Type":"ContainerStarted","Data":"9d1e28e3d49d196291959241d948da3cfbdce4b0e3d699a85ba9a2bb6ef25c93"} Mar 10 22:11:49 crc kubenswrapper[4919]: I0310 22:11:49.295233 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-m7z2j" event={"ID":"689c6775-e11f-4644-bc89-dbef2174b343","Type":"ContainerStarted","Data":"aff95dc2fe60966146b7d53443620fb83fd82b6ea8fa70ab79dfd44bdb8d6acd"} Mar 10 22:11:49 crc kubenswrapper[4919]: I0310 22:11:49.295273 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-m7z2j" event={"ID":"689c6775-e11f-4644-bc89-dbef2174b343","Type":"ContainerStarted","Data":"a35a298fb74e90712fbc599ff843e18ed03e676b098cf21bd5e659d1d92afcb8"} Mar 10 
22:11:49 crc kubenswrapper[4919]: I0310 22:11:49.297329 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-tbkwn" event={"ID":"15160303-4913-49a5-8cd3-e8255ba657f6","Type":"ContainerStarted","Data":"bd4430a632a18966644e9e9420a2112f3db08d4877a829c963e20f4026763cbe"} Mar 10 22:11:49 crc kubenswrapper[4919]: I0310 22:11:49.308367 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-jbk56" podStartSLOduration=2.308334663 podStartE2EDuration="2.308334663s" podCreationTimestamp="2026-03-10 22:11:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 22:11:49.283704556 +0000 UTC m=+1296.525585164" watchObservedRunningTime="2026-03-10 22:11:49.308334663 +0000 UTC m=+1296.550215261" Mar 10 22:11:49 crc kubenswrapper[4919]: I0310 22:11:49.330189 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-m7z2j" podStartSLOduration=2.330166375 podStartE2EDuration="2.330166375s" podCreationTimestamp="2026-03-10 22:11:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 22:11:49.324980444 +0000 UTC m=+1296.566861052" watchObservedRunningTime="2026-03-10 22:11:49.330166375 +0000 UTC m=+1296.572046993" Mar 10 22:11:49 crc kubenswrapper[4919]: I0310 22:11:49.550083 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666fdfc96f-2jjfq" Mar 10 22:11:49 crc kubenswrapper[4919]: I0310 22:11:49.646665 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1d67c30e-12d3-4a7d-ae8a-eb62d08b4175-ovsdbserver-nb\") pod \"1d67c30e-12d3-4a7d-ae8a-eb62d08b4175\" (UID: \"1d67c30e-12d3-4a7d-ae8a-eb62d08b4175\") " Mar 10 22:11:49 crc kubenswrapper[4919]: I0310 22:11:49.646735 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1d67c30e-12d3-4a7d-ae8a-eb62d08b4175-dns-swift-storage-0\") pod \"1d67c30e-12d3-4a7d-ae8a-eb62d08b4175\" (UID: \"1d67c30e-12d3-4a7d-ae8a-eb62d08b4175\") " Mar 10 22:11:49 crc kubenswrapper[4919]: I0310 22:11:49.646804 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1d67c30e-12d3-4a7d-ae8a-eb62d08b4175-ovsdbserver-sb\") pod \"1d67c30e-12d3-4a7d-ae8a-eb62d08b4175\" (UID: \"1d67c30e-12d3-4a7d-ae8a-eb62d08b4175\") " Mar 10 22:11:49 crc kubenswrapper[4919]: I0310 22:11:49.646862 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d67c30e-12d3-4a7d-ae8a-eb62d08b4175-config\") pod \"1d67c30e-12d3-4a7d-ae8a-eb62d08b4175\" (UID: \"1d67c30e-12d3-4a7d-ae8a-eb62d08b4175\") " Mar 10 22:11:49 crc kubenswrapper[4919]: I0310 22:11:49.646907 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7ttpk\" (UniqueName: \"kubernetes.io/projected/1d67c30e-12d3-4a7d-ae8a-eb62d08b4175-kube-api-access-7ttpk\") pod \"1d67c30e-12d3-4a7d-ae8a-eb62d08b4175\" (UID: \"1d67c30e-12d3-4a7d-ae8a-eb62d08b4175\") " Mar 10 22:11:49 crc kubenswrapper[4919]: I0310 22:11:49.646942 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1d67c30e-12d3-4a7d-ae8a-eb62d08b4175-dns-svc\") pod \"1d67c30e-12d3-4a7d-ae8a-eb62d08b4175\" (UID: \"1d67c30e-12d3-4a7d-ae8a-eb62d08b4175\") " Mar 10 22:11:49 crc kubenswrapper[4919]: I0310 22:11:49.659203 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d67c30e-12d3-4a7d-ae8a-eb62d08b4175-kube-api-access-7ttpk" (OuterVolumeSpecName: "kube-api-access-7ttpk") pod "1d67c30e-12d3-4a7d-ae8a-eb62d08b4175" (UID: "1d67c30e-12d3-4a7d-ae8a-eb62d08b4175"). InnerVolumeSpecName "kube-api-access-7ttpk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 22:11:49 crc kubenswrapper[4919]: I0310 22:11:49.666156 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1d67c30e-12d3-4a7d-ae8a-eb62d08b4175-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "1d67c30e-12d3-4a7d-ae8a-eb62d08b4175" (UID: "1d67c30e-12d3-4a7d-ae8a-eb62d08b4175"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 22:11:49 crc kubenswrapper[4919]: I0310 22:11:49.666424 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1d67c30e-12d3-4a7d-ae8a-eb62d08b4175-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "1d67c30e-12d3-4a7d-ae8a-eb62d08b4175" (UID: "1d67c30e-12d3-4a7d-ae8a-eb62d08b4175"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 22:11:49 crc kubenswrapper[4919]: I0310 22:11:49.670207 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1d67c30e-12d3-4a7d-ae8a-eb62d08b4175-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "1d67c30e-12d3-4a7d-ae8a-eb62d08b4175" (UID: "1d67c30e-12d3-4a7d-ae8a-eb62d08b4175"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 22:11:49 crc kubenswrapper[4919]: I0310 22:11:49.678281 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1d67c30e-12d3-4a7d-ae8a-eb62d08b4175-config" (OuterVolumeSpecName: "config") pod "1d67c30e-12d3-4a7d-ae8a-eb62d08b4175" (UID: "1d67c30e-12d3-4a7d-ae8a-eb62d08b4175"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 22:11:49 crc kubenswrapper[4919]: I0310 22:11:49.683488 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1d67c30e-12d3-4a7d-ae8a-eb62d08b4175-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1d67c30e-12d3-4a7d-ae8a-eb62d08b4175" (UID: "1d67c30e-12d3-4a7d-ae8a-eb62d08b4175"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 22:11:49 crc kubenswrapper[4919]: I0310 22:11:49.748435 4919 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1d67c30e-12d3-4a7d-ae8a-eb62d08b4175-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 10 22:11:49 crc kubenswrapper[4919]: I0310 22:11:49.748474 4919 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1d67c30e-12d3-4a7d-ae8a-eb62d08b4175-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 10 22:11:49 crc kubenswrapper[4919]: I0310 22:11:49.748490 4919 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1d67c30e-12d3-4a7d-ae8a-eb62d08b4175-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 10 22:11:49 crc kubenswrapper[4919]: I0310 22:11:49.748502 4919 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d67c30e-12d3-4a7d-ae8a-eb62d08b4175-config\") on node \"crc\" DevicePath \"\"" Mar 10 22:11:49 crc 
kubenswrapper[4919]: I0310 22:11:49.748514 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7ttpk\" (UniqueName: \"kubernetes.io/projected/1d67c30e-12d3-4a7d-ae8a-eb62d08b4175-kube-api-access-7ttpk\") on node \"crc\" DevicePath \"\"" Mar 10 22:11:49 crc kubenswrapper[4919]: I0310 22:11:49.748527 4919 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1d67c30e-12d3-4a7d-ae8a-eb62d08b4175-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 10 22:11:50 crc kubenswrapper[4919]: I0310 22:11:50.143106 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 10 22:11:50 crc kubenswrapper[4919]: I0310 22:11:50.307311 4919 generic.go:334] "Generic (PLEG): container finished" podID="e8b579d7-131a-4166-a67f-19294e5f652b" containerID="297bc3d72ba87fdb7afd0822f6f4977d2c29fb3174578f44bb1daa871418cc79" exitCode=0 Mar 10 22:11:50 crc kubenswrapper[4919]: I0310 22:11:50.307359 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-844b6c58c7-qwqvv" event={"ID":"e8b579d7-131a-4166-a67f-19294e5f652b","Type":"ContainerDied","Data":"297bc3d72ba87fdb7afd0822f6f4977d2c29fb3174578f44bb1daa871418cc79"} Mar 10 22:11:50 crc kubenswrapper[4919]: I0310 22:11:50.316902 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666fdfc96f-2jjfq" event={"ID":"1d67c30e-12d3-4a7d-ae8a-eb62d08b4175","Type":"ContainerDied","Data":"836856f48f4347e7a676029f9d72799db658e421ce62c8347f6db39b00991c9a"} Mar 10 22:11:50 crc kubenswrapper[4919]: I0310 22:11:50.316978 4919 scope.go:117] "RemoveContainer" containerID="6559cf79bad4a312de9e816a29bbf4570fea7d4640ecebe3e3d4be9e603b66dd" Mar 10 22:11:50 crc kubenswrapper[4919]: I0310 22:11:50.317214 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666fdfc96f-2jjfq" Mar 10 22:11:50 crc kubenswrapper[4919]: I0310 22:11:50.547717 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666fdfc96f-2jjfq"] Mar 10 22:11:50 crc kubenswrapper[4919]: I0310 22:11:50.552297 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-666fdfc96f-2jjfq"] Mar 10 22:11:50 crc kubenswrapper[4919]: I0310 22:11:50.805928 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-rcb6h" Mar 10 22:11:50 crc kubenswrapper[4919]: I0310 22:11:50.886505 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14a85d0b-0c54-4da3-8646-b3e0ca6fe5d5-combined-ca-bundle\") pod \"14a85d0b-0c54-4da3-8646-b3e0ca6fe5d5\" (UID: \"14a85d0b-0c54-4da3-8646-b3e0ca6fe5d5\") " Mar 10 22:11:50 crc kubenswrapper[4919]: I0310 22:11:50.886740 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14a85d0b-0c54-4da3-8646-b3e0ca6fe5d5-config-data\") pod \"14a85d0b-0c54-4da3-8646-b3e0ca6fe5d5\" (UID: \"14a85d0b-0c54-4da3-8646-b3e0ca6fe5d5\") " Mar 10 22:11:50 crc kubenswrapper[4919]: I0310 22:11:50.886788 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5fsrk\" (UniqueName: \"kubernetes.io/projected/14a85d0b-0c54-4da3-8646-b3e0ca6fe5d5-kube-api-access-5fsrk\") pod \"14a85d0b-0c54-4da3-8646-b3e0ca6fe5d5\" (UID: \"14a85d0b-0c54-4da3-8646-b3e0ca6fe5d5\") " Mar 10 22:11:50 crc kubenswrapper[4919]: I0310 22:11:50.886855 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/14a85d0b-0c54-4da3-8646-b3e0ca6fe5d5-db-sync-config-data\") pod \"14a85d0b-0c54-4da3-8646-b3e0ca6fe5d5\" (UID: 
\"14a85d0b-0c54-4da3-8646-b3e0ca6fe5d5\") " Mar 10 22:11:50 crc kubenswrapper[4919]: I0310 22:11:50.900729 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14a85d0b-0c54-4da3-8646-b3e0ca6fe5d5-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "14a85d0b-0c54-4da3-8646-b3e0ca6fe5d5" (UID: "14a85d0b-0c54-4da3-8646-b3e0ca6fe5d5"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:11:50 crc kubenswrapper[4919]: I0310 22:11:50.907143 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14a85d0b-0c54-4da3-8646-b3e0ca6fe5d5-kube-api-access-5fsrk" (OuterVolumeSpecName: "kube-api-access-5fsrk") pod "14a85d0b-0c54-4da3-8646-b3e0ca6fe5d5" (UID: "14a85d0b-0c54-4da3-8646-b3e0ca6fe5d5"). InnerVolumeSpecName "kube-api-access-5fsrk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 22:11:50 crc kubenswrapper[4919]: I0310 22:11:50.953610 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14a85d0b-0c54-4da3-8646-b3e0ca6fe5d5-config-data" (OuterVolumeSpecName: "config-data") pod "14a85d0b-0c54-4da3-8646-b3e0ca6fe5d5" (UID: "14a85d0b-0c54-4da3-8646-b3e0ca6fe5d5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:11:50 crc kubenswrapper[4919]: I0310 22:11:50.954479 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14a85d0b-0c54-4da3-8646-b3e0ca6fe5d5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "14a85d0b-0c54-4da3-8646-b3e0ca6fe5d5" (UID: "14a85d0b-0c54-4da3-8646-b3e0ca6fe5d5"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:11:50 crc kubenswrapper[4919]: I0310 22:11:50.988602 4919 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/14a85d0b-0c54-4da3-8646-b3e0ca6fe5d5-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 22:11:50 crc kubenswrapper[4919]: I0310 22:11:50.988648 4919 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14a85d0b-0c54-4da3-8646-b3e0ca6fe5d5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 22:11:50 crc kubenswrapper[4919]: I0310 22:11:50.988662 4919 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14a85d0b-0c54-4da3-8646-b3e0ca6fe5d5-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 22:11:50 crc kubenswrapper[4919]: I0310 22:11:50.988673 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5fsrk\" (UniqueName: \"kubernetes.io/projected/14a85d0b-0c54-4da3-8646-b3e0ca6fe5d5-kube-api-access-5fsrk\") on node \"crc\" DevicePath \"\"" Mar 10 22:11:51 crc kubenswrapper[4919]: I0310 22:11:51.379622 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-844b6c58c7-qwqvv" event={"ID":"e8b579d7-131a-4166-a67f-19294e5f652b","Type":"ContainerStarted","Data":"7a12622b612003ee98c91af7c0261cc3b551a6486bf75cdf0be94bc4effb04c2"} Mar 10 22:11:51 crc kubenswrapper[4919]: I0310 22:11:51.379789 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-844b6c58c7-qwqvv" Mar 10 22:11:51 crc kubenswrapper[4919]: I0310 22:11:51.382301 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-rcb6h" event={"ID":"14a85d0b-0c54-4da3-8646-b3e0ca6fe5d5","Type":"ContainerDied","Data":"e983971d2011f0819a459aa42e888c5f455b8dc884665ad05e34206351b9fe20"} Mar 10 22:11:51 crc kubenswrapper[4919]: I0310 
22:11:51.382354 4919 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e983971d2011f0819a459aa42e888c5f455b8dc884665ad05e34206351b9fe20" Mar 10 22:11:51 crc kubenswrapper[4919]: I0310 22:11:51.382497 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-rcb6h" Mar 10 22:11:51 crc kubenswrapper[4919]: I0310 22:11:51.415512 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-844b6c58c7-qwqvv" podStartSLOduration=4.415488664 podStartE2EDuration="4.415488664s" podCreationTimestamp="2026-03-10 22:11:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 22:11:51.410290303 +0000 UTC m=+1298.652170911" watchObservedRunningTime="2026-03-10 22:11:51.415488664 +0000 UTC m=+1298.657369272" Mar 10 22:11:51 crc kubenswrapper[4919]: I0310 22:11:51.523243 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d67c30e-12d3-4a7d-ae8a-eb62d08b4175" path="/var/lib/kubelet/pods/1d67c30e-12d3-4a7d-ae8a-eb62d08b4175/volumes" Mar 10 22:11:51 crc kubenswrapper[4919]: I0310 22:11:51.675585 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-844b6c58c7-qwqvv"] Mar 10 22:11:51 crc kubenswrapper[4919]: I0310 22:11:51.704471 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-754b99d75-xsfht"] Mar 10 22:11:51 crc kubenswrapper[4919]: E0310 22:11:51.704984 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d67c30e-12d3-4a7d-ae8a-eb62d08b4175" containerName="init" Mar 10 22:11:51 crc kubenswrapper[4919]: I0310 22:11:51.705005 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d67c30e-12d3-4a7d-ae8a-eb62d08b4175" containerName="init" Mar 10 22:11:51 crc kubenswrapper[4919]: E0310 22:11:51.705031 4919 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="14a85d0b-0c54-4da3-8646-b3e0ca6fe5d5" containerName="glance-db-sync" Mar 10 22:11:51 crc kubenswrapper[4919]: I0310 22:11:51.705040 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="14a85d0b-0c54-4da3-8646-b3e0ca6fe5d5" containerName="glance-db-sync" Mar 10 22:11:51 crc kubenswrapper[4919]: I0310 22:11:51.705229 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d67c30e-12d3-4a7d-ae8a-eb62d08b4175" containerName="init" Mar 10 22:11:51 crc kubenswrapper[4919]: I0310 22:11:51.705252 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="14a85d0b-0c54-4da3-8646-b3e0ca6fe5d5" containerName="glance-db-sync" Mar 10 22:11:51 crc kubenswrapper[4919]: I0310 22:11:51.706429 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-754b99d75-xsfht" Mar 10 22:11:51 crc kubenswrapper[4919]: I0310 22:11:51.712850 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-754b99d75-xsfht"] Mar 10 22:11:51 crc kubenswrapper[4919]: I0310 22:11:51.811525 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/48f4d802-20e1-4c18-ae02-0b82adf97457-ovsdbserver-sb\") pod \"dnsmasq-dns-754b99d75-xsfht\" (UID: \"48f4d802-20e1-4c18-ae02-0b82adf97457\") " pod="openstack/dnsmasq-dns-754b99d75-xsfht" Mar 10 22:11:51 crc kubenswrapper[4919]: I0310 22:11:51.811582 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/48f4d802-20e1-4c18-ae02-0b82adf97457-config\") pod \"dnsmasq-dns-754b99d75-xsfht\" (UID: \"48f4d802-20e1-4c18-ae02-0b82adf97457\") " pod="openstack/dnsmasq-dns-754b99d75-xsfht" Mar 10 22:11:51 crc kubenswrapper[4919]: I0310 22:11:51.811642 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qs7k2\" 
(UniqueName: \"kubernetes.io/projected/48f4d802-20e1-4c18-ae02-0b82adf97457-kube-api-access-qs7k2\") pod \"dnsmasq-dns-754b99d75-xsfht\" (UID: \"48f4d802-20e1-4c18-ae02-0b82adf97457\") " pod="openstack/dnsmasq-dns-754b99d75-xsfht" Mar 10 22:11:51 crc kubenswrapper[4919]: I0310 22:11:51.811667 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/48f4d802-20e1-4c18-ae02-0b82adf97457-dns-svc\") pod \"dnsmasq-dns-754b99d75-xsfht\" (UID: \"48f4d802-20e1-4c18-ae02-0b82adf97457\") " pod="openstack/dnsmasq-dns-754b99d75-xsfht" Mar 10 22:11:51 crc kubenswrapper[4919]: I0310 22:11:51.811856 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/48f4d802-20e1-4c18-ae02-0b82adf97457-dns-swift-storage-0\") pod \"dnsmasq-dns-754b99d75-xsfht\" (UID: \"48f4d802-20e1-4c18-ae02-0b82adf97457\") " pod="openstack/dnsmasq-dns-754b99d75-xsfht" Mar 10 22:11:51 crc kubenswrapper[4919]: I0310 22:11:51.812032 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/48f4d802-20e1-4c18-ae02-0b82adf97457-ovsdbserver-nb\") pod \"dnsmasq-dns-754b99d75-xsfht\" (UID: \"48f4d802-20e1-4c18-ae02-0b82adf97457\") " pod="openstack/dnsmasq-dns-754b99d75-xsfht" Mar 10 22:11:51 crc kubenswrapper[4919]: I0310 22:11:51.914080 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/48f4d802-20e1-4c18-ae02-0b82adf97457-ovsdbserver-sb\") pod \"dnsmasq-dns-754b99d75-xsfht\" (UID: \"48f4d802-20e1-4c18-ae02-0b82adf97457\") " pod="openstack/dnsmasq-dns-754b99d75-xsfht" Mar 10 22:11:51 crc kubenswrapper[4919]: I0310 22:11:51.914145 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/48f4d802-20e1-4c18-ae02-0b82adf97457-config\") pod \"dnsmasq-dns-754b99d75-xsfht\" (UID: \"48f4d802-20e1-4c18-ae02-0b82adf97457\") " pod="openstack/dnsmasq-dns-754b99d75-xsfht" Mar 10 22:11:51 crc kubenswrapper[4919]: I0310 22:11:51.914192 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qs7k2\" (UniqueName: \"kubernetes.io/projected/48f4d802-20e1-4c18-ae02-0b82adf97457-kube-api-access-qs7k2\") pod \"dnsmasq-dns-754b99d75-xsfht\" (UID: \"48f4d802-20e1-4c18-ae02-0b82adf97457\") " pod="openstack/dnsmasq-dns-754b99d75-xsfht" Mar 10 22:11:51 crc kubenswrapper[4919]: I0310 22:11:51.914217 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/48f4d802-20e1-4c18-ae02-0b82adf97457-dns-svc\") pod \"dnsmasq-dns-754b99d75-xsfht\" (UID: \"48f4d802-20e1-4c18-ae02-0b82adf97457\") " pod="openstack/dnsmasq-dns-754b99d75-xsfht" Mar 10 22:11:51 crc kubenswrapper[4919]: I0310 22:11:51.914242 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/48f4d802-20e1-4c18-ae02-0b82adf97457-dns-swift-storage-0\") pod \"dnsmasq-dns-754b99d75-xsfht\" (UID: \"48f4d802-20e1-4c18-ae02-0b82adf97457\") " pod="openstack/dnsmasq-dns-754b99d75-xsfht" Mar 10 22:11:51 crc kubenswrapper[4919]: I0310 22:11:51.914278 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/48f4d802-20e1-4c18-ae02-0b82adf97457-ovsdbserver-nb\") pod \"dnsmasq-dns-754b99d75-xsfht\" (UID: \"48f4d802-20e1-4c18-ae02-0b82adf97457\") " pod="openstack/dnsmasq-dns-754b99d75-xsfht" Mar 10 22:11:51 crc kubenswrapper[4919]: I0310 22:11:51.915177 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/48f4d802-20e1-4c18-ae02-0b82adf97457-dns-svc\") pod 
\"dnsmasq-dns-754b99d75-xsfht\" (UID: \"48f4d802-20e1-4c18-ae02-0b82adf97457\") " pod="openstack/dnsmasq-dns-754b99d75-xsfht" Mar 10 22:11:51 crc kubenswrapper[4919]: I0310 22:11:51.915227 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/48f4d802-20e1-4c18-ae02-0b82adf97457-ovsdbserver-nb\") pod \"dnsmasq-dns-754b99d75-xsfht\" (UID: \"48f4d802-20e1-4c18-ae02-0b82adf97457\") " pod="openstack/dnsmasq-dns-754b99d75-xsfht" Mar 10 22:11:51 crc kubenswrapper[4919]: I0310 22:11:51.915417 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/48f4d802-20e1-4c18-ae02-0b82adf97457-dns-swift-storage-0\") pod \"dnsmasq-dns-754b99d75-xsfht\" (UID: \"48f4d802-20e1-4c18-ae02-0b82adf97457\") " pod="openstack/dnsmasq-dns-754b99d75-xsfht" Mar 10 22:11:51 crc kubenswrapper[4919]: I0310 22:11:51.915489 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/48f4d802-20e1-4c18-ae02-0b82adf97457-config\") pod \"dnsmasq-dns-754b99d75-xsfht\" (UID: \"48f4d802-20e1-4c18-ae02-0b82adf97457\") " pod="openstack/dnsmasq-dns-754b99d75-xsfht" Mar 10 22:11:51 crc kubenswrapper[4919]: I0310 22:11:51.915900 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/48f4d802-20e1-4c18-ae02-0b82adf97457-ovsdbserver-sb\") pod \"dnsmasq-dns-754b99d75-xsfht\" (UID: \"48f4d802-20e1-4c18-ae02-0b82adf97457\") " pod="openstack/dnsmasq-dns-754b99d75-xsfht" Mar 10 22:11:51 crc kubenswrapper[4919]: I0310 22:11:51.932447 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qs7k2\" (UniqueName: \"kubernetes.io/projected/48f4d802-20e1-4c18-ae02-0b82adf97457-kube-api-access-qs7k2\") pod \"dnsmasq-dns-754b99d75-xsfht\" (UID: \"48f4d802-20e1-4c18-ae02-0b82adf97457\") " 
pod="openstack/dnsmasq-dns-754b99d75-xsfht" Mar 10 22:11:52 crc kubenswrapper[4919]: I0310 22:11:52.027971 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-754b99d75-xsfht" Mar 10 22:11:52 crc kubenswrapper[4919]: I0310 22:11:52.586456 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 10 22:11:52 crc kubenswrapper[4919]: I0310 22:11:52.588141 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 10 22:11:52 crc kubenswrapper[4919]: I0310 22:11:52.592253 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 10 22:11:52 crc kubenswrapper[4919]: I0310 22:11:52.595333 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-xjqfl" Mar 10 22:11:52 crc kubenswrapper[4919]: I0310 22:11:52.595814 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Mar 10 22:11:52 crc kubenswrapper[4919]: I0310 22:11:52.597871 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 10 22:11:52 crc kubenswrapper[4919]: I0310 22:11:52.728153 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h7sqk\" (UniqueName: \"kubernetes.io/projected/588d2da8-29e3-48dd-a0ca-7a04d72e2d96-kube-api-access-h7sqk\") pod \"glance-default-external-api-0\" (UID: \"588d2da8-29e3-48dd-a0ca-7a04d72e2d96\") " pod="openstack/glance-default-external-api-0" Mar 10 22:11:52 crc kubenswrapper[4919]: I0310 22:11:52.728209 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/588d2da8-29e3-48dd-a0ca-7a04d72e2d96-combined-ca-bundle\") pod \"glance-default-external-api-0\" 
(UID: \"588d2da8-29e3-48dd-a0ca-7a04d72e2d96\") " pod="openstack/glance-default-external-api-0" Mar 10 22:11:52 crc kubenswrapper[4919]: I0310 22:11:52.728228 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/588d2da8-29e3-48dd-a0ca-7a04d72e2d96-scripts\") pod \"glance-default-external-api-0\" (UID: \"588d2da8-29e3-48dd-a0ca-7a04d72e2d96\") " pod="openstack/glance-default-external-api-0" Mar 10 22:11:52 crc kubenswrapper[4919]: I0310 22:11:52.728688 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/588d2da8-29e3-48dd-a0ca-7a04d72e2d96-logs\") pod \"glance-default-external-api-0\" (UID: \"588d2da8-29e3-48dd-a0ca-7a04d72e2d96\") " pod="openstack/glance-default-external-api-0" Mar 10 22:11:52 crc kubenswrapper[4919]: I0310 22:11:52.728721 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/588d2da8-29e3-48dd-a0ca-7a04d72e2d96-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"588d2da8-29e3-48dd-a0ca-7a04d72e2d96\") " pod="openstack/glance-default-external-api-0" Mar 10 22:11:52 crc kubenswrapper[4919]: I0310 22:11:52.728838 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/588d2da8-29e3-48dd-a0ca-7a04d72e2d96-config-data\") pod \"glance-default-external-api-0\" (UID: \"588d2da8-29e3-48dd-a0ca-7a04d72e2d96\") " pod="openstack/glance-default-external-api-0" Mar 10 22:11:52 crc kubenswrapper[4919]: I0310 22:11:52.728874 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: 
\"588d2da8-29e3-48dd-a0ca-7a04d72e2d96\") " pod="openstack/glance-default-external-api-0" Mar 10 22:11:52 crc kubenswrapper[4919]: I0310 22:11:52.830051 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/588d2da8-29e3-48dd-a0ca-7a04d72e2d96-logs\") pod \"glance-default-external-api-0\" (UID: \"588d2da8-29e3-48dd-a0ca-7a04d72e2d96\") " pod="openstack/glance-default-external-api-0" Mar 10 22:11:52 crc kubenswrapper[4919]: I0310 22:11:52.830375 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/588d2da8-29e3-48dd-a0ca-7a04d72e2d96-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"588d2da8-29e3-48dd-a0ca-7a04d72e2d96\") " pod="openstack/glance-default-external-api-0" Mar 10 22:11:52 crc kubenswrapper[4919]: I0310 22:11:52.830430 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/588d2da8-29e3-48dd-a0ca-7a04d72e2d96-config-data\") pod \"glance-default-external-api-0\" (UID: \"588d2da8-29e3-48dd-a0ca-7a04d72e2d96\") " pod="openstack/glance-default-external-api-0" Mar 10 22:11:52 crc kubenswrapper[4919]: I0310 22:11:52.830450 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"588d2da8-29e3-48dd-a0ca-7a04d72e2d96\") " pod="openstack/glance-default-external-api-0" Mar 10 22:11:52 crc kubenswrapper[4919]: I0310 22:11:52.830717 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/588d2da8-29e3-48dd-a0ca-7a04d72e2d96-logs\") pod \"glance-default-external-api-0\" (UID: \"588d2da8-29e3-48dd-a0ca-7a04d72e2d96\") " pod="openstack/glance-default-external-api-0" Mar 10 22:11:52 crc kubenswrapper[4919]: 
I0310 22:11:52.830734 4919 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"588d2da8-29e3-48dd-a0ca-7a04d72e2d96\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/glance-default-external-api-0" Mar 10 22:11:52 crc kubenswrapper[4919]: I0310 22:11:52.831679 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/588d2da8-29e3-48dd-a0ca-7a04d72e2d96-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"588d2da8-29e3-48dd-a0ca-7a04d72e2d96\") " pod="openstack/glance-default-external-api-0" Mar 10 22:11:52 crc kubenswrapper[4919]: I0310 22:11:52.833310 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 10 22:11:52 crc kubenswrapper[4919]: I0310 22:11:52.838921 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 10 22:11:52 crc kubenswrapper[4919]: I0310 22:11:52.843680 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/588d2da8-29e3-48dd-a0ca-7a04d72e2d96-config-data\") pod \"glance-default-external-api-0\" (UID: \"588d2da8-29e3-48dd-a0ca-7a04d72e2d96\") " pod="openstack/glance-default-external-api-0" Mar 10 22:11:52 crc kubenswrapper[4919]: I0310 22:11:52.845485 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h7sqk\" (UniqueName: \"kubernetes.io/projected/588d2da8-29e3-48dd-a0ca-7a04d72e2d96-kube-api-access-h7sqk\") pod \"glance-default-external-api-0\" (UID: \"588d2da8-29e3-48dd-a0ca-7a04d72e2d96\") " pod="openstack/glance-default-external-api-0" Mar 10 22:11:52 crc kubenswrapper[4919]: I0310 22:11:52.845607 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/588d2da8-29e3-48dd-a0ca-7a04d72e2d96-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"588d2da8-29e3-48dd-a0ca-7a04d72e2d96\") " pod="openstack/glance-default-external-api-0" Mar 10 22:11:52 crc kubenswrapper[4919]: I0310 22:11:52.845632 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/588d2da8-29e3-48dd-a0ca-7a04d72e2d96-scripts\") pod \"glance-default-external-api-0\" (UID: \"588d2da8-29e3-48dd-a0ca-7a04d72e2d96\") " pod="openstack/glance-default-external-api-0" Mar 10 22:11:52 crc kubenswrapper[4919]: I0310 22:11:52.846638 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 10 22:11:52 crc kubenswrapper[4919]: I0310 22:11:52.852332 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/588d2da8-29e3-48dd-a0ca-7a04d72e2d96-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"588d2da8-29e3-48dd-a0ca-7a04d72e2d96\") " pod="openstack/glance-default-external-api-0" Mar 10 22:11:52 crc kubenswrapper[4919]: I0310 22:11:52.854127 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 10 22:11:52 crc kubenswrapper[4919]: I0310 22:11:52.857510 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"588d2da8-29e3-48dd-a0ca-7a04d72e2d96\") " pod="openstack/glance-default-external-api-0" Mar 10 22:11:52 crc kubenswrapper[4919]: I0310 22:11:52.871170 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/588d2da8-29e3-48dd-a0ca-7a04d72e2d96-scripts\") pod \"glance-default-external-api-0\" (UID: \"588d2da8-29e3-48dd-a0ca-7a04d72e2d96\") " pod="openstack/glance-default-external-api-0" Mar 10 22:11:52 crc kubenswrapper[4919]: I0310 22:11:52.874961 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h7sqk\" (UniqueName: \"kubernetes.io/projected/588d2da8-29e3-48dd-a0ca-7a04d72e2d96-kube-api-access-h7sqk\") pod \"glance-default-external-api-0\" (UID: \"588d2da8-29e3-48dd-a0ca-7a04d72e2d96\") " pod="openstack/glance-default-external-api-0" Mar 10 22:11:52 crc kubenswrapper[4919]: I0310 22:11:52.907476 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 10 22:11:52 crc kubenswrapper[4919]: I0310 22:11:52.947579 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qsfkp\" (UniqueName: \"kubernetes.io/projected/5f89ffa1-3600-4fff-965c-4b754167fed1-kube-api-access-qsfkp\") pod \"glance-default-internal-api-0\" (UID: \"5f89ffa1-3600-4fff-965c-4b754167fed1\") " pod="openstack/glance-default-internal-api-0" Mar 10 22:11:52 crc kubenswrapper[4919]: I0310 22:11:52.947626 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f89ffa1-3600-4fff-965c-4b754167fed1-config-data\") pod \"glance-default-internal-api-0\" (UID: \"5f89ffa1-3600-4fff-965c-4b754167fed1\") " pod="openstack/glance-default-internal-api-0" Mar 10 22:11:52 crc kubenswrapper[4919]: I0310 22:11:52.947648 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5f89ffa1-3600-4fff-965c-4b754167fed1-scripts\") pod \"glance-default-internal-api-0\" (UID: \"5f89ffa1-3600-4fff-965c-4b754167fed1\") " pod="openstack/glance-default-internal-api-0" Mar 10 22:11:52 crc kubenswrapper[4919]: I0310 22:11:52.947685 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5f89ffa1-3600-4fff-965c-4b754167fed1-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"5f89ffa1-3600-4fff-965c-4b754167fed1\") " pod="openstack/glance-default-internal-api-0" Mar 10 22:11:52 crc kubenswrapper[4919]: I0310 22:11:52.947714 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: 
\"5f89ffa1-3600-4fff-965c-4b754167fed1\") " pod="openstack/glance-default-internal-api-0" Mar 10 22:11:52 crc kubenswrapper[4919]: I0310 22:11:52.947755 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f89ffa1-3600-4fff-965c-4b754167fed1-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"5f89ffa1-3600-4fff-965c-4b754167fed1\") " pod="openstack/glance-default-internal-api-0" Mar 10 22:11:52 crc kubenswrapper[4919]: I0310 22:11:52.947777 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5f89ffa1-3600-4fff-965c-4b754167fed1-logs\") pod \"glance-default-internal-api-0\" (UID: \"5f89ffa1-3600-4fff-965c-4b754167fed1\") " pod="openstack/glance-default-internal-api-0" Mar 10 22:11:53 crc kubenswrapper[4919]: I0310 22:11:53.050006 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f89ffa1-3600-4fff-965c-4b754167fed1-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"5f89ffa1-3600-4fff-965c-4b754167fed1\") " pod="openstack/glance-default-internal-api-0" Mar 10 22:11:53 crc kubenswrapper[4919]: I0310 22:11:53.050060 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5f89ffa1-3600-4fff-965c-4b754167fed1-logs\") pod \"glance-default-internal-api-0\" (UID: \"5f89ffa1-3600-4fff-965c-4b754167fed1\") " pod="openstack/glance-default-internal-api-0" Mar 10 22:11:53 crc kubenswrapper[4919]: I0310 22:11:53.050162 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qsfkp\" (UniqueName: \"kubernetes.io/projected/5f89ffa1-3600-4fff-965c-4b754167fed1-kube-api-access-qsfkp\") pod \"glance-default-internal-api-0\" (UID: 
\"5f89ffa1-3600-4fff-965c-4b754167fed1\") " pod="openstack/glance-default-internal-api-0"
Mar 10 22:11:53 crc kubenswrapper[4919]: I0310 22:11:53.050199 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f89ffa1-3600-4fff-965c-4b754167fed1-config-data\") pod \"glance-default-internal-api-0\" (UID: \"5f89ffa1-3600-4fff-965c-4b754167fed1\") " pod="openstack/glance-default-internal-api-0"
Mar 10 22:11:53 crc kubenswrapper[4919]: I0310 22:11:53.050228 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5f89ffa1-3600-4fff-965c-4b754167fed1-scripts\") pod \"glance-default-internal-api-0\" (UID: \"5f89ffa1-3600-4fff-965c-4b754167fed1\") " pod="openstack/glance-default-internal-api-0"
Mar 10 22:11:53 crc kubenswrapper[4919]: I0310 22:11:53.050276 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5f89ffa1-3600-4fff-965c-4b754167fed1-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"5f89ffa1-3600-4fff-965c-4b754167fed1\") " pod="openstack/glance-default-internal-api-0"
Mar 10 22:11:53 crc kubenswrapper[4919]: I0310 22:11:53.050316 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"5f89ffa1-3600-4fff-965c-4b754167fed1\") " pod="openstack/glance-default-internal-api-0"
Mar 10 22:11:53 crc kubenswrapper[4919]: I0310 22:11:53.050520 4919 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"5f89ffa1-3600-4fff-965c-4b754167fed1\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/glance-default-internal-api-0"
Mar 10 22:11:53 crc kubenswrapper[4919]: I0310 22:11:53.052075 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5f89ffa1-3600-4fff-965c-4b754167fed1-logs\") pod \"glance-default-internal-api-0\" (UID: \"5f89ffa1-3600-4fff-965c-4b754167fed1\") " pod="openstack/glance-default-internal-api-0"
Mar 10 22:11:53 crc kubenswrapper[4919]: I0310 22:11:53.053354 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5f89ffa1-3600-4fff-965c-4b754167fed1-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"5f89ffa1-3600-4fff-965c-4b754167fed1\") " pod="openstack/glance-default-internal-api-0"
Mar 10 22:11:53 crc kubenswrapper[4919]: I0310 22:11:53.054793 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f89ffa1-3600-4fff-965c-4b754167fed1-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"5f89ffa1-3600-4fff-965c-4b754167fed1\") " pod="openstack/glance-default-internal-api-0"
Mar 10 22:11:53 crc kubenswrapper[4919]: I0310 22:11:53.055661 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5f89ffa1-3600-4fff-965c-4b754167fed1-scripts\") pod \"glance-default-internal-api-0\" (UID: \"5f89ffa1-3600-4fff-965c-4b754167fed1\") " pod="openstack/glance-default-internal-api-0"
Mar 10 22:11:53 crc kubenswrapper[4919]: I0310 22:11:53.061472 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f89ffa1-3600-4fff-965c-4b754167fed1-config-data\") pod \"glance-default-internal-api-0\" (UID: \"5f89ffa1-3600-4fff-965c-4b754167fed1\") " pod="openstack/glance-default-internal-api-0"
Mar 10 22:11:53 crc kubenswrapper[4919]: I0310 22:11:53.067909 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qsfkp\" (UniqueName: \"kubernetes.io/projected/5f89ffa1-3600-4fff-965c-4b754167fed1-kube-api-access-qsfkp\") pod \"glance-default-internal-api-0\" (UID: \"5f89ffa1-3600-4fff-965c-4b754167fed1\") " pod="openstack/glance-default-internal-api-0"
Mar 10 22:11:53 crc kubenswrapper[4919]: I0310 22:11:53.076921 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"5f89ffa1-3600-4fff-965c-4b754167fed1\") " pod="openstack/glance-default-internal-api-0"
Mar 10 22:11:53 crc kubenswrapper[4919]: I0310 22:11:53.237207 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Mar 10 22:11:53 crc kubenswrapper[4919]: I0310 22:11:53.407372 4919 generic.go:334] "Generic (PLEG): container finished" podID="689c6775-e11f-4644-bc89-dbef2174b343" containerID="aff95dc2fe60966146b7d53443620fb83fd82b6ea8fa70ab79dfd44bdb8d6acd" exitCode=0
Mar 10 22:11:53 crc kubenswrapper[4919]: I0310 22:11:53.407475 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-m7z2j" event={"ID":"689c6775-e11f-4644-bc89-dbef2174b343","Type":"ContainerDied","Data":"aff95dc2fe60966146b7d53443620fb83fd82b6ea8fa70ab79dfd44bdb8d6acd"}
Mar 10 22:11:53 crc kubenswrapper[4919]: I0310 22:11:53.407603 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-844b6c58c7-qwqvv" podUID="e8b579d7-131a-4166-a67f-19294e5f652b" containerName="dnsmasq-dns" containerID="cri-o://7a12622b612003ee98c91af7c0261cc3b551a6486bf75cdf0be94bc4effb04c2" gracePeriod=10
Mar 10 22:11:54 crc kubenswrapper[4919]: I0310 22:11:54.421856 4919 generic.go:334] "Generic (PLEG): container finished" podID="e8b579d7-131a-4166-a67f-19294e5f652b" containerID="7a12622b612003ee98c91af7c0261cc3b551a6486bf75cdf0be94bc4effb04c2" exitCode=0
Mar 10 22:11:54 crc kubenswrapper[4919]: I0310 22:11:54.421935 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-844b6c58c7-qwqvv" event={"ID":"e8b579d7-131a-4166-a67f-19294e5f652b","Type":"ContainerDied","Data":"7a12622b612003ee98c91af7c0261cc3b551a6486bf75cdf0be94bc4effb04c2"}
Mar 10 22:11:58 crc kubenswrapper[4919]: I0310 22:11:58.130594 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 10 22:11:58 crc kubenswrapper[4919]: I0310 22:11:58.137120 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-m7z2j"
Mar 10 22:11:58 crc kubenswrapper[4919]: I0310 22:11:58.239839 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 10 22:11:58 crc kubenswrapper[4919]: I0310 22:11:58.247123 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/689c6775-e11f-4644-bc89-dbef2174b343-fernet-keys\") pod \"689c6775-e11f-4644-bc89-dbef2174b343\" (UID: \"689c6775-e11f-4644-bc89-dbef2174b343\") "
Mar 10 22:11:58 crc kubenswrapper[4919]: I0310 22:11:58.247249 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/689c6775-e11f-4644-bc89-dbef2174b343-config-data\") pod \"689c6775-e11f-4644-bc89-dbef2174b343\" (UID: \"689c6775-e11f-4644-bc89-dbef2174b343\") "
Mar 10 22:11:58 crc kubenswrapper[4919]: I0310 22:11:58.247345 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/689c6775-e11f-4644-bc89-dbef2174b343-combined-ca-bundle\") pod \"689c6775-e11f-4644-bc89-dbef2174b343\" (UID: \"689c6775-e11f-4644-bc89-dbef2174b343\") "
Mar 10 22:11:58 crc kubenswrapper[4919]: I0310 22:11:58.247516 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/689c6775-e11f-4644-bc89-dbef2174b343-credential-keys\") pod \"689c6775-e11f-4644-bc89-dbef2174b343\" (UID: \"689c6775-e11f-4644-bc89-dbef2174b343\") "
Mar 10 22:11:58 crc kubenswrapper[4919]: I0310 22:11:58.247737 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/689c6775-e11f-4644-bc89-dbef2174b343-scripts\") pod \"689c6775-e11f-4644-bc89-dbef2174b343\" (UID: \"689c6775-e11f-4644-bc89-dbef2174b343\") "
Mar 10 22:11:58 crc kubenswrapper[4919]: I0310 22:11:58.247795 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pnl6c\" (UniqueName: \"kubernetes.io/projected/689c6775-e11f-4644-bc89-dbef2174b343-kube-api-access-pnl6c\") pod \"689c6775-e11f-4644-bc89-dbef2174b343\" (UID: \"689c6775-e11f-4644-bc89-dbef2174b343\") "
Mar 10 22:11:58 crc kubenswrapper[4919]: I0310 22:11:58.257007 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/689c6775-e11f-4644-bc89-dbef2174b343-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "689c6775-e11f-4644-bc89-dbef2174b343" (UID: "689c6775-e11f-4644-bc89-dbef2174b343"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 22:11:58 crc kubenswrapper[4919]: I0310 22:11:58.257334 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/689c6775-e11f-4644-bc89-dbef2174b343-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "689c6775-e11f-4644-bc89-dbef2174b343" (UID: "689c6775-e11f-4644-bc89-dbef2174b343"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 22:11:58 crc kubenswrapper[4919]: I0310 22:11:58.271780 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/689c6775-e11f-4644-bc89-dbef2174b343-scripts" (OuterVolumeSpecName: "scripts") pod "689c6775-e11f-4644-bc89-dbef2174b343" (UID: "689c6775-e11f-4644-bc89-dbef2174b343"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 22:11:58 crc kubenswrapper[4919]: I0310 22:11:58.283509 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/689c6775-e11f-4644-bc89-dbef2174b343-kube-api-access-pnl6c" (OuterVolumeSpecName: "kube-api-access-pnl6c") pod "689c6775-e11f-4644-bc89-dbef2174b343" (UID: "689c6775-e11f-4644-bc89-dbef2174b343"). InnerVolumeSpecName "kube-api-access-pnl6c". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 22:11:58 crc kubenswrapper[4919]: I0310 22:11:58.290377 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/689c6775-e11f-4644-bc89-dbef2174b343-config-data" (OuterVolumeSpecName: "config-data") pod "689c6775-e11f-4644-bc89-dbef2174b343" (UID: "689c6775-e11f-4644-bc89-dbef2174b343"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 22:11:58 crc kubenswrapper[4919]: I0310 22:11:58.299640 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/689c6775-e11f-4644-bc89-dbef2174b343-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "689c6775-e11f-4644-bc89-dbef2174b343" (UID: "689c6775-e11f-4644-bc89-dbef2174b343"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 22:11:58 crc kubenswrapper[4919]: I0310 22:11:58.356437 4919 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/689c6775-e11f-4644-bc89-dbef2174b343-fernet-keys\") on node \"crc\" DevicePath \"\""
Mar 10 22:11:58 crc kubenswrapper[4919]: I0310 22:11:58.356478 4919 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/689c6775-e11f-4644-bc89-dbef2174b343-config-data\") on node \"crc\" DevicePath \"\""
Mar 10 22:11:58 crc kubenswrapper[4919]: I0310 22:11:58.356491 4919 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/689c6775-e11f-4644-bc89-dbef2174b343-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 10 22:11:58 crc kubenswrapper[4919]: I0310 22:11:58.356506 4919 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/689c6775-e11f-4644-bc89-dbef2174b343-credential-keys\") on node \"crc\" DevicePath \"\""
Mar 10 22:11:58 crc kubenswrapper[4919]: I0310 22:11:58.356517 4919 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/689c6775-e11f-4644-bc89-dbef2174b343-scripts\") on node \"crc\" DevicePath \"\""
Mar 10 22:11:58 crc kubenswrapper[4919]: I0310 22:11:58.356528 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pnl6c\" (UniqueName: \"kubernetes.io/projected/689c6775-e11f-4644-bc89-dbef2174b343-kube-api-access-pnl6c\") on node \"crc\" DevicePath \"\""
Mar 10 22:11:58 crc kubenswrapper[4919]: I0310 22:11:58.457131 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-m7z2j" event={"ID":"689c6775-e11f-4644-bc89-dbef2174b343","Type":"ContainerDied","Data":"a35a298fb74e90712fbc599ff843e18ed03e676b098cf21bd5e659d1d92afcb8"}
Mar 10 22:11:58 crc kubenswrapper[4919]: I0310 22:11:58.457184 4919 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a35a298fb74e90712fbc599ff843e18ed03e676b098cf21bd5e659d1d92afcb8"
Mar 10 22:11:58 crc kubenswrapper[4919]: I0310 22:11:58.457223 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-m7z2j"
Mar 10 22:11:59 crc kubenswrapper[4919]: I0310 22:11:59.175700 4919 patch_prober.go:28] interesting pod/machine-config-daemon-z7v4t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 10 22:11:59 crc kubenswrapper[4919]: I0310 22:11:59.175744 4919 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 10 22:11:59 crc kubenswrapper[4919]: I0310 22:11:59.237572 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-m7z2j"]
Mar 10 22:11:59 crc kubenswrapper[4919]: I0310 22:11:59.244823 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-m7z2j"]
Mar 10 22:11:59 crc kubenswrapper[4919]: I0310 22:11:59.327686 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-k96hb"]
Mar 10 22:11:59 crc kubenswrapper[4919]: E0310 22:11:59.328080 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="689c6775-e11f-4644-bc89-dbef2174b343" containerName="keystone-bootstrap"
Mar 10 22:11:59 crc kubenswrapper[4919]: I0310 22:11:59.328100 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="689c6775-e11f-4644-bc89-dbef2174b343" containerName="keystone-bootstrap"
Mar 10 22:11:59 crc kubenswrapper[4919]: I0310 22:11:59.330872 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="689c6775-e11f-4644-bc89-dbef2174b343" containerName="keystone-bootstrap"
Mar 10 22:11:59 crc kubenswrapper[4919]: I0310 22:11:59.331662 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-k96hb"
Mar 10 22:11:59 crc kubenswrapper[4919]: I0310 22:11:59.334119 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Mar 10 22:11:59 crc kubenswrapper[4919]: I0310 22:11:59.334318 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret"
Mar 10 22:11:59 crc kubenswrapper[4919]: I0310 22:11:59.334563 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Mar 10 22:11:59 crc kubenswrapper[4919]: I0310 22:11:59.334747 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-nbmmg"
Mar 10 22:11:59 crc kubenswrapper[4919]: I0310 22:11:59.334886 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Mar 10 22:11:59 crc kubenswrapper[4919]: I0310 22:11:59.349035 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-k96hb"]
Mar 10 22:11:59 crc kubenswrapper[4919]: I0310 22:11:59.477267 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b20682db-f5f4-4102-b0f9-662aeab1bd2a-scripts\") pod \"keystone-bootstrap-k96hb\" (UID: \"b20682db-f5f4-4102-b0f9-662aeab1bd2a\") " pod="openstack/keystone-bootstrap-k96hb"
Mar 10 22:11:59 crc kubenswrapper[4919]: I0310 22:11:59.477341 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b20682db-f5f4-4102-b0f9-662aeab1bd2a-combined-ca-bundle\") pod \"keystone-bootstrap-k96hb\" (UID: \"b20682db-f5f4-4102-b0f9-662aeab1bd2a\") " pod="openstack/keystone-bootstrap-k96hb"
Mar 10 22:11:59 crc kubenswrapper[4919]: I0310 22:11:59.477469 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b20682db-f5f4-4102-b0f9-662aeab1bd2a-fernet-keys\") pod \"keystone-bootstrap-k96hb\" (UID: \"b20682db-f5f4-4102-b0f9-662aeab1bd2a\") " pod="openstack/keystone-bootstrap-k96hb"
Mar 10 22:11:59 crc kubenswrapper[4919]: I0310 22:11:59.477513 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b20682db-f5f4-4102-b0f9-662aeab1bd2a-config-data\") pod \"keystone-bootstrap-k96hb\" (UID: \"b20682db-f5f4-4102-b0f9-662aeab1bd2a\") " pod="openstack/keystone-bootstrap-k96hb"
Mar 10 22:11:59 crc kubenswrapper[4919]: I0310 22:11:59.477540 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ncn7m\" (UniqueName: \"kubernetes.io/projected/b20682db-f5f4-4102-b0f9-662aeab1bd2a-kube-api-access-ncn7m\") pod \"keystone-bootstrap-k96hb\" (UID: \"b20682db-f5f4-4102-b0f9-662aeab1bd2a\") " pod="openstack/keystone-bootstrap-k96hb"
Mar 10 22:11:59 crc kubenswrapper[4919]: I0310 22:11:59.477612 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b20682db-f5f4-4102-b0f9-662aeab1bd2a-credential-keys\") pod \"keystone-bootstrap-k96hb\" (UID: \"b20682db-f5f4-4102-b0f9-662aeab1bd2a\") " pod="openstack/keystone-bootstrap-k96hb"
Mar 10 22:11:59 crc kubenswrapper[4919]: I0310 22:11:59.490466 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="689c6775-e11f-4644-bc89-dbef2174b343" path="/var/lib/kubelet/pods/689c6775-e11f-4644-bc89-dbef2174b343/volumes"
Mar 10 22:11:59 crc kubenswrapper[4919]: I0310 22:11:59.578940 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b20682db-f5f4-4102-b0f9-662aeab1bd2a-fernet-keys\") pod \"keystone-bootstrap-k96hb\" (UID: \"b20682db-f5f4-4102-b0f9-662aeab1bd2a\") " pod="openstack/keystone-bootstrap-k96hb"
Mar 10 22:11:59 crc kubenswrapper[4919]: I0310 22:11:59.579006 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ncn7m\" (UniqueName: \"kubernetes.io/projected/b20682db-f5f4-4102-b0f9-662aeab1bd2a-kube-api-access-ncn7m\") pod \"keystone-bootstrap-k96hb\" (UID: \"b20682db-f5f4-4102-b0f9-662aeab1bd2a\") " pod="openstack/keystone-bootstrap-k96hb"
Mar 10 22:11:59 crc kubenswrapper[4919]: I0310 22:11:59.579028 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b20682db-f5f4-4102-b0f9-662aeab1bd2a-config-data\") pod \"keystone-bootstrap-k96hb\" (UID: \"b20682db-f5f4-4102-b0f9-662aeab1bd2a\") " pod="openstack/keystone-bootstrap-k96hb"
Mar 10 22:11:59 crc kubenswrapper[4919]: I0310 22:11:59.579088 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b20682db-f5f4-4102-b0f9-662aeab1bd2a-credential-keys\") pod \"keystone-bootstrap-k96hb\" (UID: \"b20682db-f5f4-4102-b0f9-662aeab1bd2a\") " pod="openstack/keystone-bootstrap-k96hb"
Mar 10 22:11:59 crc kubenswrapper[4919]: I0310 22:11:59.579150 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b20682db-f5f4-4102-b0f9-662aeab1bd2a-scripts\") pod \"keystone-bootstrap-k96hb\" (UID: \"b20682db-f5f4-4102-b0f9-662aeab1bd2a\") " pod="openstack/keystone-bootstrap-k96hb"
Mar 10 22:11:59 crc kubenswrapper[4919]: I0310 22:11:59.579191 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b20682db-f5f4-4102-b0f9-662aeab1bd2a-combined-ca-bundle\") pod \"keystone-bootstrap-k96hb\" (UID: \"b20682db-f5f4-4102-b0f9-662aeab1bd2a\") " pod="openstack/keystone-bootstrap-k96hb"
Mar 10 22:11:59 crc kubenswrapper[4919]: I0310 22:11:59.584263 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b20682db-f5f4-4102-b0f9-662aeab1bd2a-combined-ca-bundle\") pod \"keystone-bootstrap-k96hb\" (UID: \"b20682db-f5f4-4102-b0f9-662aeab1bd2a\") " pod="openstack/keystone-bootstrap-k96hb"
Mar 10 22:11:59 crc kubenswrapper[4919]: I0310 22:11:59.585438 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b20682db-f5f4-4102-b0f9-662aeab1bd2a-fernet-keys\") pod \"keystone-bootstrap-k96hb\" (UID: \"b20682db-f5f4-4102-b0f9-662aeab1bd2a\") " pod="openstack/keystone-bootstrap-k96hb"
Mar 10 22:11:59 crc kubenswrapper[4919]: I0310 22:11:59.589870 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b20682db-f5f4-4102-b0f9-662aeab1bd2a-scripts\") pod \"keystone-bootstrap-k96hb\" (UID: \"b20682db-f5f4-4102-b0f9-662aeab1bd2a\") " pod="openstack/keystone-bootstrap-k96hb"
Mar 10 22:11:59 crc kubenswrapper[4919]: I0310 22:11:59.595965 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b20682db-f5f4-4102-b0f9-662aeab1bd2a-config-data\") pod \"keystone-bootstrap-k96hb\" (UID: \"b20682db-f5f4-4102-b0f9-662aeab1bd2a\") " pod="openstack/keystone-bootstrap-k96hb"
Mar 10 22:11:59 crc kubenswrapper[4919]: I0310 22:11:59.601677 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ncn7m\" (UniqueName: \"kubernetes.io/projected/b20682db-f5f4-4102-b0f9-662aeab1bd2a-kube-api-access-ncn7m\") pod \"keystone-bootstrap-k96hb\" (UID: \"b20682db-f5f4-4102-b0f9-662aeab1bd2a\") " pod="openstack/keystone-bootstrap-k96hb"
Mar 10 22:11:59 crc kubenswrapper[4919]: I0310 22:11:59.603148 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b20682db-f5f4-4102-b0f9-662aeab1bd2a-credential-keys\") pod \"keystone-bootstrap-k96hb\" (UID: \"b20682db-f5f4-4102-b0f9-662aeab1bd2a\") " pod="openstack/keystone-bootstrap-k96hb"
Mar 10 22:11:59 crc kubenswrapper[4919]: I0310 22:11:59.650677 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-k96hb"
Mar 10 22:12:00 crc kubenswrapper[4919]: I0310 22:12:00.129862 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553012-5ftd6"]
Mar 10 22:12:00 crc kubenswrapper[4919]: I0310 22:12:00.131822 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553012-5ftd6"
Mar 10 22:12:00 crc kubenswrapper[4919]: I0310 22:12:00.144225 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 10 22:12:00 crc kubenswrapper[4919]: I0310 22:12:00.144318 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 10 22:12:00 crc kubenswrapper[4919]: I0310 22:12:00.145486 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wbvtv"
Mar 10 22:12:00 crc kubenswrapper[4919]: I0310 22:12:00.145647 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553012-5ftd6"]
Mar 10 22:12:00 crc kubenswrapper[4919]: I0310 22:12:00.291140 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nnq8k\" (UniqueName: \"kubernetes.io/projected/76635374-85f5-4577-866f-5f561c5223df-kube-api-access-nnq8k\") pod \"auto-csr-approver-29553012-5ftd6\" (UID: \"76635374-85f5-4577-866f-5f561c5223df\") " pod="openshift-infra/auto-csr-approver-29553012-5ftd6"
Mar 10 22:12:00 crc kubenswrapper[4919]: I0310 22:12:00.392978 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nnq8k\" (UniqueName: \"kubernetes.io/projected/76635374-85f5-4577-866f-5f561c5223df-kube-api-access-nnq8k\") pod \"auto-csr-approver-29553012-5ftd6\" (UID: \"76635374-85f5-4577-866f-5f561c5223df\") " pod="openshift-infra/auto-csr-approver-29553012-5ftd6"
Mar 10 22:12:00 crc kubenswrapper[4919]: I0310 22:12:00.415453 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nnq8k\" (UniqueName: \"kubernetes.io/projected/76635374-85f5-4577-866f-5f561c5223df-kube-api-access-nnq8k\") pod \"auto-csr-approver-29553012-5ftd6\" (UID: \"76635374-85f5-4577-866f-5f561c5223df\") " pod="openshift-infra/auto-csr-approver-29553012-5ftd6"
Mar 10 22:12:00 crc kubenswrapper[4919]: I0310 22:12:00.472092 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553012-5ftd6"
Mar 10 22:12:02 crc kubenswrapper[4919]: E0310 22:12:02.750670 4919 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ceilometer-central@sha256:57dfeeb1cb430ed73e6db471592cfb1a5f25d3d5c083f82d4a676f936978be81"
Mar 10 22:12:02 crc kubenswrapper[4919]: E0310 22:12:02.750927 4919 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-central-agent,Image:quay.io/podified-antelope-centos9/openstack-ceilometer-central@sha256:57dfeeb1cb430ed73e6db471592cfb1a5f25d3d5c083f82d4a676f936978be81,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n669hd8h5bbh5f4h5f7hbfh674h696h56hb7h64dhb6h66fhf9h5fch99h5d5h675h54bh666h5b9h5f8h576h9fh8bh59bh86hb4h654h647h595h557q,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4h9lm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 /var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(a35812c5-ffc7-4307-ab31-390c9ee39262): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Mar 10 22:12:03 crc kubenswrapper[4919]: I0310 22:12:03.332320 4919 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-844b6c58c7-qwqvv" podUID="e8b579d7-131a-4166-a67f-19294e5f652b" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.145:5353: i/o timeout"
Mar 10 22:12:08 crc kubenswrapper[4919]: I0310 22:12:08.333575 4919 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-844b6c58c7-qwqvv" podUID="e8b579d7-131a-4166-a67f-19294e5f652b" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.145:5353: i/o timeout"
Mar 10 22:12:10 crc kubenswrapper[4919]: I0310 22:12:10.553137 4919 generic.go:334] "Generic (PLEG): container finished" podID="5384b251-66f7-451a-ab29-0b88b8207838" containerID="236e9edb8142b4785375f2f9d21591aeca381142891f25109c78d197b1c4208e" exitCode=0
Mar 10 22:12:10 crc kubenswrapper[4919]: I0310 22:12:10.553240 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-jbk56" event={"ID":"5384b251-66f7-451a-ab29-0b88b8207838","Type":"ContainerDied","Data":"236e9edb8142b4785375f2f9d21591aeca381142891f25109c78d197b1c4208e"}
Mar 10 22:12:10 crc kubenswrapper[4919]: I0310 22:12:10.673532 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-844b6c58c7-qwqvv"
Mar 10 22:12:10 crc kubenswrapper[4919]: I0310 22:12:10.799611 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e8b579d7-131a-4166-a67f-19294e5f652b-ovsdbserver-nb\") pod \"e8b579d7-131a-4166-a67f-19294e5f652b\" (UID: \"e8b579d7-131a-4166-a67f-19294e5f652b\") "
Mar 10 22:12:10 crc kubenswrapper[4919]: I0310 22:12:10.799703 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h87hg\" (UniqueName: \"kubernetes.io/projected/e8b579d7-131a-4166-a67f-19294e5f652b-kube-api-access-h87hg\") pod \"e8b579d7-131a-4166-a67f-19294e5f652b\" (UID: \"e8b579d7-131a-4166-a67f-19294e5f652b\") "
Mar 10 22:12:10 crc kubenswrapper[4919]: I0310 22:12:10.799737 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e8b579d7-131a-4166-a67f-19294e5f652b-dns-swift-storage-0\") pod \"e8b579d7-131a-4166-a67f-19294e5f652b\" (UID: \"e8b579d7-131a-4166-a67f-19294e5f652b\") "
Mar 10 22:12:10 crc kubenswrapper[4919]: I0310 22:12:10.799764 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e8b579d7-131a-4166-a67f-19294e5f652b-dns-svc\") pod \"e8b579d7-131a-4166-a67f-19294e5f652b\" (UID: \"e8b579d7-131a-4166-a67f-19294e5f652b\") "
Mar 10 22:12:10 crc kubenswrapper[4919]: I0310 22:12:10.799872 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8b579d7-131a-4166-a67f-19294e5f652b-config\") pod \"e8b579d7-131a-4166-a67f-19294e5f652b\" (UID: \"e8b579d7-131a-4166-a67f-19294e5f652b\") "
Mar 10 22:12:10 crc kubenswrapper[4919]: I0310 22:12:10.800014 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e8b579d7-131a-4166-a67f-19294e5f652b-ovsdbserver-sb\") pod \"e8b579d7-131a-4166-a67f-19294e5f652b\" (UID: \"e8b579d7-131a-4166-a67f-19294e5f652b\") "
Mar 10 22:12:10 crc kubenswrapper[4919]: I0310 22:12:10.814592 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8b579d7-131a-4166-a67f-19294e5f652b-kube-api-access-h87hg" (OuterVolumeSpecName: "kube-api-access-h87hg") pod "e8b579d7-131a-4166-a67f-19294e5f652b" (UID: "e8b579d7-131a-4166-a67f-19294e5f652b"). InnerVolumeSpecName "kube-api-access-h87hg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 22:12:10 crc kubenswrapper[4919]: I0310 22:12:10.843132 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e8b579d7-131a-4166-a67f-19294e5f652b-config" (OuterVolumeSpecName: "config") pod "e8b579d7-131a-4166-a67f-19294e5f652b" (UID: "e8b579d7-131a-4166-a67f-19294e5f652b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 22:12:10 crc kubenswrapper[4919]: I0310 22:12:10.848595 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e8b579d7-131a-4166-a67f-19294e5f652b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e8b579d7-131a-4166-a67f-19294e5f652b" (UID: "e8b579d7-131a-4166-a67f-19294e5f652b"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 22:12:10 crc kubenswrapper[4919]: I0310 22:12:10.851227 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e8b579d7-131a-4166-a67f-19294e5f652b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e8b579d7-131a-4166-a67f-19294e5f652b" (UID: "e8b579d7-131a-4166-a67f-19294e5f652b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 22:12:10 crc kubenswrapper[4919]: I0310 22:12:10.853463 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e8b579d7-131a-4166-a67f-19294e5f652b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e8b579d7-131a-4166-a67f-19294e5f652b" (UID: "e8b579d7-131a-4166-a67f-19294e5f652b"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 22:12:10 crc kubenswrapper[4919]: I0310 22:12:10.859183 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e8b579d7-131a-4166-a67f-19294e5f652b-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "e8b579d7-131a-4166-a67f-19294e5f652b" (UID: "e8b579d7-131a-4166-a67f-19294e5f652b"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 22:12:10 crc kubenswrapper[4919]: I0310 22:12:10.902250 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h87hg\" (UniqueName: \"kubernetes.io/projected/e8b579d7-131a-4166-a67f-19294e5f652b-kube-api-access-h87hg\") on node \"crc\" DevicePath \"\""
Mar 10 22:12:10 crc kubenswrapper[4919]: I0310 22:12:10.902286 4919 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e8b579d7-131a-4166-a67f-19294e5f652b-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Mar 10 22:12:10 crc kubenswrapper[4919]: I0310 22:12:10.902299 4919 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e8b579d7-131a-4166-a67f-19294e5f652b-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 10 22:12:10 crc kubenswrapper[4919]: I0310 22:12:10.902310 4919 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8b579d7-131a-4166-a67f-19294e5f652b-config\") on node \"crc\" DevicePath \"\""
Mar 10 22:12:10 crc kubenswrapper[4919]: I0310 22:12:10.902321 4919 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e8b579d7-131a-4166-a67f-19294e5f652b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Mar 10 22:12:10 crc kubenswrapper[4919]: I0310 22:12:10.902331 4919 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e8b579d7-131a-4166-a67f-19294e5f652b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Mar 10 22:12:11 crc kubenswrapper[4919]: I0310 22:12:11.568797 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-844b6c58c7-qwqvv" event={"ID":"e8b579d7-131a-4166-a67f-19294e5f652b","Type":"ContainerDied","Data":"ffe3cf7a86c6a0dbe148f65434f789866d9cf96005b6dab82178a1f00c0bdaba"}
Mar 10 22:12:11 crc
kubenswrapper[4919]: I0310 22:12:11.568815 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-844b6c58c7-qwqvv" Mar 10 22:12:11 crc kubenswrapper[4919]: I0310 22:12:11.568857 4919 scope.go:117] "RemoveContainer" containerID="7a12622b612003ee98c91af7c0261cc3b551a6486bf75cdf0be94bc4effb04c2" Mar 10 22:12:11 crc kubenswrapper[4919]: I0310 22:12:11.597038 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-844b6c58c7-qwqvv"] Mar 10 22:12:11 crc kubenswrapper[4919]: I0310 22:12:11.603745 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-844b6c58c7-qwqvv"] Mar 10 22:12:12 crc kubenswrapper[4919]: E0310 22:12:12.046967 4919 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:44ed1ca84e17bd0f004cfbdc3c0827d767daba52abb8e83e076bfd0e6c02f838" Mar 10 22:12:12 crc kubenswrapper[4919]: E0310 22:12:12.047558 4919 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:44ed1ca84e17bd0f004cfbdc3c0827d767daba52abb8e83e076bfd0e6c02f838,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gkckt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin
:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-7xmvf_openstack(0376622a-15ed-42d8-98b9-ffa1138134ee): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 10 22:12:12 crc kubenswrapper[4919]: E0310 22:12:12.048890 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-7xmvf" podUID="0376622a-15ed-42d8-98b9-ffa1138134ee" Mar 10 22:12:12 crc kubenswrapper[4919]: I0310 22:12:12.181332 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-jbk56" Mar 10 22:12:12 crc kubenswrapper[4919]: I0310 22:12:12.327638 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kmwzx\" (UniqueName: \"kubernetes.io/projected/5384b251-66f7-451a-ab29-0b88b8207838-kube-api-access-kmwzx\") pod \"5384b251-66f7-451a-ab29-0b88b8207838\" (UID: \"5384b251-66f7-451a-ab29-0b88b8207838\") " Mar 10 22:12:12 crc kubenswrapper[4919]: I0310 22:12:12.327712 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/5384b251-66f7-451a-ab29-0b88b8207838-config\") pod \"5384b251-66f7-451a-ab29-0b88b8207838\" (UID: \"5384b251-66f7-451a-ab29-0b88b8207838\") " Mar 10 22:12:12 crc kubenswrapper[4919]: I0310 22:12:12.327770 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5384b251-66f7-451a-ab29-0b88b8207838-combined-ca-bundle\") pod \"5384b251-66f7-451a-ab29-0b88b8207838\" (UID: \"5384b251-66f7-451a-ab29-0b88b8207838\") " Mar 10 
22:12:12 crc kubenswrapper[4919]: I0310 22:12:12.333648 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5384b251-66f7-451a-ab29-0b88b8207838-kube-api-access-kmwzx" (OuterVolumeSpecName: "kube-api-access-kmwzx") pod "5384b251-66f7-451a-ab29-0b88b8207838" (UID: "5384b251-66f7-451a-ab29-0b88b8207838"). InnerVolumeSpecName "kube-api-access-kmwzx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 22:12:12 crc kubenswrapper[4919]: I0310 22:12:12.364252 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5384b251-66f7-451a-ab29-0b88b8207838-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5384b251-66f7-451a-ab29-0b88b8207838" (UID: "5384b251-66f7-451a-ab29-0b88b8207838"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:12:12 crc kubenswrapper[4919]: I0310 22:12:12.365278 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5384b251-66f7-451a-ab29-0b88b8207838-config" (OuterVolumeSpecName: "config") pod "5384b251-66f7-451a-ab29-0b88b8207838" (UID: "5384b251-66f7-451a-ab29-0b88b8207838"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:12:12 crc kubenswrapper[4919]: I0310 22:12:12.429343 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kmwzx\" (UniqueName: \"kubernetes.io/projected/5384b251-66f7-451a-ab29-0b88b8207838-kube-api-access-kmwzx\") on node \"crc\" DevicePath \"\"" Mar 10 22:12:12 crc kubenswrapper[4919]: I0310 22:12:12.429405 4919 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/5384b251-66f7-451a-ab29-0b88b8207838-config\") on node \"crc\" DevicePath \"\"" Mar 10 22:12:12 crc kubenswrapper[4919]: I0310 22:12:12.429421 4919 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5384b251-66f7-451a-ab29-0b88b8207838-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 22:12:12 crc kubenswrapper[4919]: I0310 22:12:12.533322 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-754b99d75-xsfht"] Mar 10 22:12:12 crc kubenswrapper[4919]: I0310 22:12:12.546506 4919 scope.go:117] "RemoveContainer" containerID="297bc3d72ba87fdb7afd0822f6f4977d2c29fb3174578f44bb1daa871418cc79" Mar 10 22:12:12 crc kubenswrapper[4919]: I0310 22:12:12.580999 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-jbk56" event={"ID":"5384b251-66f7-451a-ab29-0b88b8207838","Type":"ContainerDied","Data":"bfb4baa4ea386306dc6014e09ef3053d816ba9f67c4e4dc4c35a2d61acf1019d"} Mar 10 22:12:12 crc kubenswrapper[4919]: I0310 22:12:12.581045 4919 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bfb4baa4ea386306dc6014e09ef3053d816ba9f67c4e4dc4c35a2d61acf1019d" Mar 10 22:12:12 crc kubenswrapper[4919]: I0310 22:12:12.581017 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-jbk56" Mar 10 22:12:12 crc kubenswrapper[4919]: I0310 22:12:12.583357 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-754b99d75-xsfht" event={"ID":"48f4d802-20e1-4c18-ae02-0b82adf97457","Type":"ContainerStarted","Data":"3a318003939a78b629a3311f943e30a06e1eed3705305c5e340d38141c3645bd"} Mar 10 22:12:12 crc kubenswrapper[4919]: E0310 22:12:12.598305 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:44ed1ca84e17bd0f004cfbdc3c0827d767daba52abb8e83e076bfd0e6c02f838\\\"\"" pod="openstack/cinder-db-sync-7xmvf" podUID="0376622a-15ed-42d8-98b9-ffa1138134ee" Mar 10 22:12:12 crc kubenswrapper[4919]: I0310 22:12:12.839748 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-754b99d75-xsfht"] Mar 10 22:12:12 crc kubenswrapper[4919]: I0310 22:12:12.906886 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-66ff44db99-v77sd"] Mar 10 22:12:12 crc kubenswrapper[4919]: E0310 22:12:12.907607 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8b579d7-131a-4166-a67f-19294e5f652b" containerName="init" Mar 10 22:12:12 crc kubenswrapper[4919]: I0310 22:12:12.907623 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8b579d7-131a-4166-a67f-19294e5f652b" containerName="init" Mar 10 22:12:12 crc kubenswrapper[4919]: E0310 22:12:12.907655 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5384b251-66f7-451a-ab29-0b88b8207838" containerName="neutron-db-sync" Mar 10 22:12:12 crc kubenswrapper[4919]: I0310 22:12:12.907663 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="5384b251-66f7-451a-ab29-0b88b8207838" containerName="neutron-db-sync" Mar 10 22:12:12 crc kubenswrapper[4919]: E0310 22:12:12.907706 4919 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8b579d7-131a-4166-a67f-19294e5f652b" containerName="dnsmasq-dns" Mar 10 22:12:12 crc kubenswrapper[4919]: I0310 22:12:12.907718 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8b579d7-131a-4166-a67f-19294e5f652b" containerName="dnsmasq-dns" Mar 10 22:12:12 crc kubenswrapper[4919]: I0310 22:12:12.908106 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8b579d7-131a-4166-a67f-19294e5f652b" containerName="dnsmasq-dns" Mar 10 22:12:12 crc kubenswrapper[4919]: I0310 22:12:12.908143 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="5384b251-66f7-451a-ab29-0b88b8207838" containerName="neutron-db-sync" Mar 10 22:12:12 crc kubenswrapper[4919]: I0310 22:12:12.910104 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-66ff44db99-v77sd" Mar 10 22:12:12 crc kubenswrapper[4919]: I0310 22:12:12.929777 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-66ff44db99-v77sd"] Mar 10 22:12:12 crc kubenswrapper[4919]: I0310 22:12:12.969475 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-79589c5bbb-z9p5z"] Mar 10 22:12:12 crc kubenswrapper[4919]: I0310 22:12:12.972539 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-79589c5bbb-z9p5z" Mar 10 22:12:12 crc kubenswrapper[4919]: I0310 22:12:12.977745 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-phplg" Mar 10 22:12:12 crc kubenswrapper[4919]: I0310 22:12:12.978077 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 10 22:12:12 crc kubenswrapper[4919]: I0310 22:12:12.978358 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 10 22:12:12 crc kubenswrapper[4919]: I0310 22:12:12.978662 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Mar 10 22:12:13 crc kubenswrapper[4919]: I0310 22:12:13.000034 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-79589c5bbb-z9p5z"] Mar 10 22:12:13 crc kubenswrapper[4919]: I0310 22:12:13.044745 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/43b80b86-6652-4a1a-8be6-7a5643e0bb45-config\") pod \"dnsmasq-dns-66ff44db99-v77sd\" (UID: \"43b80b86-6652-4a1a-8be6-7a5643e0bb45\") " pod="openstack/dnsmasq-dns-66ff44db99-v77sd" Mar 10 22:12:13 crc kubenswrapper[4919]: I0310 22:12:13.044802 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/43b80b86-6652-4a1a-8be6-7a5643e0bb45-dns-svc\") pod \"dnsmasq-dns-66ff44db99-v77sd\" (UID: \"43b80b86-6652-4a1a-8be6-7a5643e0bb45\") " pod="openstack/dnsmasq-dns-66ff44db99-v77sd" Mar 10 22:12:13 crc kubenswrapper[4919]: I0310 22:12:13.044897 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/43b80b86-6652-4a1a-8be6-7a5643e0bb45-ovsdbserver-sb\") pod \"dnsmasq-dns-66ff44db99-v77sd\" (UID: 
\"43b80b86-6652-4a1a-8be6-7a5643e0bb45\") " pod="openstack/dnsmasq-dns-66ff44db99-v77sd" Mar 10 22:12:13 crc kubenswrapper[4919]: I0310 22:12:13.044922 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7r9zh\" (UniqueName: \"kubernetes.io/projected/43b80b86-6652-4a1a-8be6-7a5643e0bb45-kube-api-access-7r9zh\") pod \"dnsmasq-dns-66ff44db99-v77sd\" (UID: \"43b80b86-6652-4a1a-8be6-7a5643e0bb45\") " pod="openstack/dnsmasq-dns-66ff44db99-v77sd" Mar 10 22:12:13 crc kubenswrapper[4919]: I0310 22:12:13.044961 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/43b80b86-6652-4a1a-8be6-7a5643e0bb45-dns-swift-storage-0\") pod \"dnsmasq-dns-66ff44db99-v77sd\" (UID: \"43b80b86-6652-4a1a-8be6-7a5643e0bb45\") " pod="openstack/dnsmasq-dns-66ff44db99-v77sd" Mar 10 22:12:13 crc kubenswrapper[4919]: I0310 22:12:13.044986 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/43b80b86-6652-4a1a-8be6-7a5643e0bb45-ovsdbserver-nb\") pod \"dnsmasq-dns-66ff44db99-v77sd\" (UID: \"43b80b86-6652-4a1a-8be6-7a5643e0bb45\") " pod="openstack/dnsmasq-dns-66ff44db99-v77sd" Mar 10 22:12:13 crc kubenswrapper[4919]: I0310 22:12:13.126922 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-k96hb"] Mar 10 22:12:13 crc kubenswrapper[4919]: I0310 22:12:13.138360 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553012-5ftd6"] Mar 10 22:12:13 crc kubenswrapper[4919]: I0310 22:12:13.147543 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7wx84\" (UniqueName: \"kubernetes.io/projected/4fe9eaba-0336-4655-b2d9-9bd67261da54-kube-api-access-7wx84\") pod \"neutron-79589c5bbb-z9p5z\" (UID: 
\"4fe9eaba-0336-4655-b2d9-9bd67261da54\") " pod="openstack/neutron-79589c5bbb-z9p5z" Mar 10 22:12:13 crc kubenswrapper[4919]: I0310 22:12:13.147584 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4fe9eaba-0336-4655-b2d9-9bd67261da54-combined-ca-bundle\") pod \"neutron-79589c5bbb-z9p5z\" (UID: \"4fe9eaba-0336-4655-b2d9-9bd67261da54\") " pod="openstack/neutron-79589c5bbb-z9p5z" Mar 10 22:12:13 crc kubenswrapper[4919]: I0310 22:12:13.147642 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4fe9eaba-0336-4655-b2d9-9bd67261da54-ovndb-tls-certs\") pod \"neutron-79589c5bbb-z9p5z\" (UID: \"4fe9eaba-0336-4655-b2d9-9bd67261da54\") " pod="openstack/neutron-79589c5bbb-z9p5z" Mar 10 22:12:13 crc kubenswrapper[4919]: I0310 22:12:13.147676 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/43b80b86-6652-4a1a-8be6-7a5643e0bb45-config\") pod \"dnsmasq-dns-66ff44db99-v77sd\" (UID: \"43b80b86-6652-4a1a-8be6-7a5643e0bb45\") " pod="openstack/dnsmasq-dns-66ff44db99-v77sd" Mar 10 22:12:13 crc kubenswrapper[4919]: I0310 22:12:13.147697 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/43b80b86-6652-4a1a-8be6-7a5643e0bb45-dns-svc\") pod \"dnsmasq-dns-66ff44db99-v77sd\" (UID: \"43b80b86-6652-4a1a-8be6-7a5643e0bb45\") " pod="openstack/dnsmasq-dns-66ff44db99-v77sd" Mar 10 22:12:13 crc kubenswrapper[4919]: I0310 22:12:13.147727 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4fe9eaba-0336-4655-b2d9-9bd67261da54-config\") pod \"neutron-79589c5bbb-z9p5z\" (UID: \"4fe9eaba-0336-4655-b2d9-9bd67261da54\") " 
pod="openstack/neutron-79589c5bbb-z9p5z" Mar 10 22:12:13 crc kubenswrapper[4919]: I0310 22:12:13.147778 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/43b80b86-6652-4a1a-8be6-7a5643e0bb45-ovsdbserver-sb\") pod \"dnsmasq-dns-66ff44db99-v77sd\" (UID: \"43b80b86-6652-4a1a-8be6-7a5643e0bb45\") " pod="openstack/dnsmasq-dns-66ff44db99-v77sd" Mar 10 22:12:13 crc kubenswrapper[4919]: I0310 22:12:13.147793 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7r9zh\" (UniqueName: \"kubernetes.io/projected/43b80b86-6652-4a1a-8be6-7a5643e0bb45-kube-api-access-7r9zh\") pod \"dnsmasq-dns-66ff44db99-v77sd\" (UID: \"43b80b86-6652-4a1a-8be6-7a5643e0bb45\") " pod="openstack/dnsmasq-dns-66ff44db99-v77sd" Mar 10 22:12:13 crc kubenswrapper[4919]: I0310 22:12:13.147821 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/43b80b86-6652-4a1a-8be6-7a5643e0bb45-dns-swift-storage-0\") pod \"dnsmasq-dns-66ff44db99-v77sd\" (UID: \"43b80b86-6652-4a1a-8be6-7a5643e0bb45\") " pod="openstack/dnsmasq-dns-66ff44db99-v77sd" Mar 10 22:12:13 crc kubenswrapper[4919]: I0310 22:12:13.147836 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/43b80b86-6652-4a1a-8be6-7a5643e0bb45-ovsdbserver-nb\") pod \"dnsmasq-dns-66ff44db99-v77sd\" (UID: \"43b80b86-6652-4a1a-8be6-7a5643e0bb45\") " pod="openstack/dnsmasq-dns-66ff44db99-v77sd" Mar 10 22:12:13 crc kubenswrapper[4919]: I0310 22:12:13.147875 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/4fe9eaba-0336-4655-b2d9-9bd67261da54-httpd-config\") pod \"neutron-79589c5bbb-z9p5z\" (UID: \"4fe9eaba-0336-4655-b2d9-9bd67261da54\") " 
pod="openstack/neutron-79589c5bbb-z9p5z" Mar 10 22:12:13 crc kubenswrapper[4919]: I0310 22:12:13.149713 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/43b80b86-6652-4a1a-8be6-7a5643e0bb45-config\") pod \"dnsmasq-dns-66ff44db99-v77sd\" (UID: \"43b80b86-6652-4a1a-8be6-7a5643e0bb45\") " pod="openstack/dnsmasq-dns-66ff44db99-v77sd" Mar 10 22:12:13 crc kubenswrapper[4919]: I0310 22:12:13.150268 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/43b80b86-6652-4a1a-8be6-7a5643e0bb45-dns-svc\") pod \"dnsmasq-dns-66ff44db99-v77sd\" (UID: \"43b80b86-6652-4a1a-8be6-7a5643e0bb45\") " pod="openstack/dnsmasq-dns-66ff44db99-v77sd" Mar 10 22:12:13 crc kubenswrapper[4919]: I0310 22:12:13.151354 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/43b80b86-6652-4a1a-8be6-7a5643e0bb45-ovsdbserver-nb\") pod \"dnsmasq-dns-66ff44db99-v77sd\" (UID: \"43b80b86-6652-4a1a-8be6-7a5643e0bb45\") " pod="openstack/dnsmasq-dns-66ff44db99-v77sd" Mar 10 22:12:13 crc kubenswrapper[4919]: I0310 22:12:13.151914 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/43b80b86-6652-4a1a-8be6-7a5643e0bb45-ovsdbserver-sb\") pod \"dnsmasq-dns-66ff44db99-v77sd\" (UID: \"43b80b86-6652-4a1a-8be6-7a5643e0bb45\") " pod="openstack/dnsmasq-dns-66ff44db99-v77sd" Mar 10 22:12:13 crc kubenswrapper[4919]: I0310 22:12:13.164636 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/43b80b86-6652-4a1a-8be6-7a5643e0bb45-dns-swift-storage-0\") pod \"dnsmasq-dns-66ff44db99-v77sd\" (UID: \"43b80b86-6652-4a1a-8be6-7a5643e0bb45\") " pod="openstack/dnsmasq-dns-66ff44db99-v77sd" Mar 10 22:12:13 crc kubenswrapper[4919]: I0310 22:12:13.187014 4919 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 10 22:12:13 crc kubenswrapper[4919]: I0310 22:12:13.187271 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7r9zh\" (UniqueName: \"kubernetes.io/projected/43b80b86-6652-4a1a-8be6-7a5643e0bb45-kube-api-access-7r9zh\") pod \"dnsmasq-dns-66ff44db99-v77sd\" (UID: \"43b80b86-6652-4a1a-8be6-7a5643e0bb45\") " pod="openstack/dnsmasq-dns-66ff44db99-v77sd" Mar 10 22:12:13 crc kubenswrapper[4919]: I0310 22:12:13.249317 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4fe9eaba-0336-4655-b2d9-9bd67261da54-ovndb-tls-certs\") pod \"neutron-79589c5bbb-z9p5z\" (UID: \"4fe9eaba-0336-4655-b2d9-9bd67261da54\") " pod="openstack/neutron-79589c5bbb-z9p5z" Mar 10 22:12:13 crc kubenswrapper[4919]: I0310 22:12:13.249397 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4fe9eaba-0336-4655-b2d9-9bd67261da54-config\") pod \"neutron-79589c5bbb-z9p5z\" (UID: \"4fe9eaba-0336-4655-b2d9-9bd67261da54\") " pod="openstack/neutron-79589c5bbb-z9p5z" Mar 10 22:12:13 crc kubenswrapper[4919]: I0310 22:12:13.249479 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/4fe9eaba-0336-4655-b2d9-9bd67261da54-httpd-config\") pod \"neutron-79589c5bbb-z9p5z\" (UID: \"4fe9eaba-0336-4655-b2d9-9bd67261da54\") " pod="openstack/neutron-79589c5bbb-z9p5z" Mar 10 22:12:13 crc kubenswrapper[4919]: I0310 22:12:13.249510 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7wx84\" (UniqueName: \"kubernetes.io/projected/4fe9eaba-0336-4655-b2d9-9bd67261da54-kube-api-access-7wx84\") pod \"neutron-79589c5bbb-z9p5z\" (UID: \"4fe9eaba-0336-4655-b2d9-9bd67261da54\") " 
pod="openstack/neutron-79589c5bbb-z9p5z" Mar 10 22:12:13 crc kubenswrapper[4919]: I0310 22:12:13.249529 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4fe9eaba-0336-4655-b2d9-9bd67261da54-combined-ca-bundle\") pod \"neutron-79589c5bbb-z9p5z\" (UID: \"4fe9eaba-0336-4655-b2d9-9bd67261da54\") " pod="openstack/neutron-79589c5bbb-z9p5z" Mar 10 22:12:13 crc kubenswrapper[4919]: I0310 22:12:13.267953 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/4fe9eaba-0336-4655-b2d9-9bd67261da54-httpd-config\") pod \"neutron-79589c5bbb-z9p5z\" (UID: \"4fe9eaba-0336-4655-b2d9-9bd67261da54\") " pod="openstack/neutron-79589c5bbb-z9p5z" Mar 10 22:12:13 crc kubenswrapper[4919]: I0310 22:12:13.269909 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4fe9eaba-0336-4655-b2d9-9bd67261da54-ovndb-tls-certs\") pod \"neutron-79589c5bbb-z9p5z\" (UID: \"4fe9eaba-0336-4655-b2d9-9bd67261da54\") " pod="openstack/neutron-79589c5bbb-z9p5z" Mar 10 22:12:13 crc kubenswrapper[4919]: I0310 22:12:13.275300 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-66ff44db99-v77sd" Mar 10 22:12:13 crc kubenswrapper[4919]: I0310 22:12:13.286603 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4fe9eaba-0336-4655-b2d9-9bd67261da54-combined-ca-bundle\") pod \"neutron-79589c5bbb-z9p5z\" (UID: \"4fe9eaba-0336-4655-b2d9-9bd67261da54\") " pod="openstack/neutron-79589c5bbb-z9p5z" Mar 10 22:12:13 crc kubenswrapper[4919]: I0310 22:12:13.293235 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/4fe9eaba-0336-4655-b2d9-9bd67261da54-config\") pod \"neutron-79589c5bbb-z9p5z\" (UID: \"4fe9eaba-0336-4655-b2d9-9bd67261da54\") " pod="openstack/neutron-79589c5bbb-z9p5z" Mar 10 22:12:13 crc kubenswrapper[4919]: I0310 22:12:13.309868 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7wx84\" (UniqueName: \"kubernetes.io/projected/4fe9eaba-0336-4655-b2d9-9bd67261da54-kube-api-access-7wx84\") pod \"neutron-79589c5bbb-z9p5z\" (UID: \"4fe9eaba-0336-4655-b2d9-9bd67261da54\") " pod="openstack/neutron-79589c5bbb-z9p5z" Mar 10 22:12:13 crc kubenswrapper[4919]: I0310 22:12:13.334532 4919 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-844b6c58c7-qwqvv" podUID="e8b579d7-131a-4166-a67f-19294e5f652b" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.145:5353: i/o timeout" Mar 10 22:12:13 crc kubenswrapper[4919]: I0310 22:12:13.509512 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e8b579d7-131a-4166-a67f-19294e5f652b" path="/var/lib/kubelet/pods/e8b579d7-131a-4166-a67f-19294e5f652b/volumes" Mar 10 22:12:13 crc kubenswrapper[4919]: I0310 22:12:13.609574 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-phplg" Mar 10 22:12:13 crc kubenswrapper[4919]: I0310 22:12:13.618227 4919 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-79589c5bbb-z9p5z" Mar 10 22:12:13 crc kubenswrapper[4919]: I0310 22:12:13.648818 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-k96hb" event={"ID":"b20682db-f5f4-4102-b0f9-662aeab1bd2a","Type":"ContainerStarted","Data":"e41c9011e8878fcf4a11e3a413f329408bd2c3dead8cb0c09dc6db523f0244ab"} Mar 10 22:12:13 crc kubenswrapper[4919]: I0310 22:12:13.648854 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-k96hb" event={"ID":"b20682db-f5f4-4102-b0f9-662aeab1bd2a","Type":"ContainerStarted","Data":"7c01b8685cd9ef00fa103abdc707256996d330405c05e1b8f72c7f20dc2e6f6e"} Mar 10 22:12:13 crc kubenswrapper[4919]: I0310 22:12:13.650458 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a35812c5-ffc7-4307-ab31-390c9ee39262","Type":"ContainerStarted","Data":"fcb531d663ebad33754a1845e9e59fef061180c2f9bcd7c60cf8aaa1dcde51fc"} Mar 10 22:12:13 crc kubenswrapper[4919]: I0310 22:12:13.653833 4919 generic.go:334] "Generic (PLEG): container finished" podID="48f4d802-20e1-4c18-ae02-0b82adf97457" containerID="ee9009c08c5b8b648bc9784099a256de368a0592f078299d695ddce11f3c5604" exitCode=0 Mar 10 22:12:13 crc kubenswrapper[4919]: I0310 22:12:13.653883 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-754b99d75-xsfht" event={"ID":"48f4d802-20e1-4c18-ae02-0b82adf97457","Type":"ContainerDied","Data":"ee9009c08c5b8b648bc9784099a256de368a0592f078299d695ddce11f3c5604"} Mar 10 22:12:13 crc kubenswrapper[4919]: I0310 22:12:13.668732 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553012-5ftd6" event={"ID":"76635374-85f5-4577-866f-5f561c5223df","Type":"ContainerStarted","Data":"e1ce6268e221eb58f0ddb4abaf2f69797859a5b6d1868b20748fcb47388dabd7"} Mar 10 22:12:13 crc kubenswrapper[4919]: I0310 22:12:13.672860 4919 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"588d2da8-29e3-48dd-a0ca-7a04d72e2d96","Type":"ContainerStarted","Data":"772666ca763a7f2ee69c173ae3b7a7ca3e0cf73bd38abc45f9dc5db2b95fc1a7"} Mar 10 22:12:13 crc kubenswrapper[4919]: I0310 22:12:13.674600 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-q9dwb" event={"ID":"02d206af-330a-4526-8a3e-7826a1acb153","Type":"ContainerStarted","Data":"8115aff6a5ec07bd6d211cbd9c4a87d9e5ee80d167242486efee63b7a58a33d9"} Mar 10 22:12:13 crc kubenswrapper[4919]: I0310 22:12:13.678937 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-tbkwn" event={"ID":"15160303-4913-49a5-8cd3-e8255ba657f6","Type":"ContainerStarted","Data":"bb3bb3d24b551528222b912f6c8926a0f15b9dfebc9af7fbb028f9ffe7e9157b"} Mar 10 22:12:13 crc kubenswrapper[4919]: I0310 22:12:13.702309 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-q9dwb" podStartSLOduration=3.808996173 podStartE2EDuration="26.702286547s" podCreationTimestamp="2026-03-10 22:11:47 +0000 UTC" firstStartedPulling="2026-03-10 22:11:49.10888215 +0000 UTC m=+1296.350762758" lastFinishedPulling="2026-03-10 22:12:12.002172514 +0000 UTC m=+1319.244053132" observedRunningTime="2026-03-10 22:12:13.700379055 +0000 UTC m=+1320.942259653" watchObservedRunningTime="2026-03-10 22:12:13.702286547 +0000 UTC m=+1320.944167155" Mar 10 22:12:13 crc kubenswrapper[4919]: I0310 22:12:13.734944 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-k96hb" podStartSLOduration=14.734925571 podStartE2EDuration="14.734925571s" podCreationTimestamp="2026-03-10 22:11:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 22:12:13.731822647 +0000 UTC m=+1320.973703275" 
watchObservedRunningTime="2026-03-10 22:12:13.734925571 +0000 UTC m=+1320.976806199" Mar 10 22:12:13 crc kubenswrapper[4919]: I0310 22:12:13.785174 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-tbkwn" podStartSLOduration=3.583444371 podStartE2EDuration="26.785156992s" podCreationTimestamp="2026-03-10 22:11:47 +0000 UTC" firstStartedPulling="2026-03-10 22:11:48.845592386 +0000 UTC m=+1296.087472994" lastFinishedPulling="2026-03-10 22:12:12.047305007 +0000 UTC m=+1319.289185615" observedRunningTime="2026-03-10 22:12:13.780428343 +0000 UTC m=+1321.022308961" watchObservedRunningTime="2026-03-10 22:12:13.785156992 +0000 UTC m=+1321.027037600" Mar 10 22:12:13 crc kubenswrapper[4919]: I0310 22:12:13.903047 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 10 22:12:14 crc kubenswrapper[4919]: I0310 22:12:14.079163 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-66ff44db99-v77sd"] Mar 10 22:12:14 crc kubenswrapper[4919]: I0310 22:12:14.180564 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-754b99d75-xsfht" Mar 10 22:12:14 crc kubenswrapper[4919]: I0310 22:12:14.280215 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/48f4d802-20e1-4c18-ae02-0b82adf97457-ovsdbserver-nb\") pod \"48f4d802-20e1-4c18-ae02-0b82adf97457\" (UID: \"48f4d802-20e1-4c18-ae02-0b82adf97457\") " Mar 10 22:12:14 crc kubenswrapper[4919]: I0310 22:12:14.280624 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/48f4d802-20e1-4c18-ae02-0b82adf97457-ovsdbserver-sb\") pod \"48f4d802-20e1-4c18-ae02-0b82adf97457\" (UID: \"48f4d802-20e1-4c18-ae02-0b82adf97457\") " Mar 10 22:12:14 crc kubenswrapper[4919]: I0310 22:12:14.280727 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/48f4d802-20e1-4c18-ae02-0b82adf97457-dns-svc\") pod \"48f4d802-20e1-4c18-ae02-0b82adf97457\" (UID: \"48f4d802-20e1-4c18-ae02-0b82adf97457\") " Mar 10 22:12:14 crc kubenswrapper[4919]: I0310 22:12:14.280751 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/48f4d802-20e1-4c18-ae02-0b82adf97457-config\") pod \"48f4d802-20e1-4c18-ae02-0b82adf97457\" (UID: \"48f4d802-20e1-4c18-ae02-0b82adf97457\") " Mar 10 22:12:14 crc kubenswrapper[4919]: I0310 22:12:14.280841 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/48f4d802-20e1-4c18-ae02-0b82adf97457-dns-swift-storage-0\") pod \"48f4d802-20e1-4c18-ae02-0b82adf97457\" (UID: \"48f4d802-20e1-4c18-ae02-0b82adf97457\") " Mar 10 22:12:14 crc kubenswrapper[4919]: I0310 22:12:14.280869 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs7k2\" 
(UniqueName: \"kubernetes.io/projected/48f4d802-20e1-4c18-ae02-0b82adf97457-kube-api-access-qs7k2\") pod \"48f4d802-20e1-4c18-ae02-0b82adf97457\" (UID: \"48f4d802-20e1-4c18-ae02-0b82adf97457\") " Mar 10 22:12:14 crc kubenswrapper[4919]: I0310 22:12:14.301133 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48f4d802-20e1-4c18-ae02-0b82adf97457-kube-api-access-qs7k2" (OuterVolumeSpecName: "kube-api-access-qs7k2") pod "48f4d802-20e1-4c18-ae02-0b82adf97457" (UID: "48f4d802-20e1-4c18-ae02-0b82adf97457"). InnerVolumeSpecName "kube-api-access-qs7k2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 22:12:14 crc kubenswrapper[4919]: I0310 22:12:14.331920 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/48f4d802-20e1-4c18-ae02-0b82adf97457-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "48f4d802-20e1-4c18-ae02-0b82adf97457" (UID: "48f4d802-20e1-4c18-ae02-0b82adf97457"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 22:12:14 crc kubenswrapper[4919]: I0310 22:12:14.365567 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/48f4d802-20e1-4c18-ae02-0b82adf97457-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "48f4d802-20e1-4c18-ae02-0b82adf97457" (UID: "48f4d802-20e1-4c18-ae02-0b82adf97457"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 22:12:14 crc kubenswrapper[4919]: I0310 22:12:14.384859 4919 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/48f4d802-20e1-4c18-ae02-0b82adf97457-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 10 22:12:14 crc kubenswrapper[4919]: I0310 22:12:14.384894 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs7k2\" (UniqueName: \"kubernetes.io/projected/48f4d802-20e1-4c18-ae02-0b82adf97457-kube-api-access-qs7k2\") on node \"crc\" DevicePath \"\"" Mar 10 22:12:14 crc kubenswrapper[4919]: I0310 22:12:14.384910 4919 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/48f4d802-20e1-4c18-ae02-0b82adf97457-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 10 22:12:14 crc kubenswrapper[4919]: I0310 22:12:14.387808 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-79589c5bbb-z9p5z"] Mar 10 22:12:14 crc kubenswrapper[4919]: I0310 22:12:14.396351 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/48f4d802-20e1-4c18-ae02-0b82adf97457-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "48f4d802-20e1-4c18-ae02-0b82adf97457" (UID: "48f4d802-20e1-4c18-ae02-0b82adf97457"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 22:12:14 crc kubenswrapper[4919]: I0310 22:12:14.397203 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/48f4d802-20e1-4c18-ae02-0b82adf97457-config" (OuterVolumeSpecName: "config") pod "48f4d802-20e1-4c18-ae02-0b82adf97457" (UID: "48f4d802-20e1-4c18-ae02-0b82adf97457"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 22:12:14 crc kubenswrapper[4919]: I0310 22:12:14.408274 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/48f4d802-20e1-4c18-ae02-0b82adf97457-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "48f4d802-20e1-4c18-ae02-0b82adf97457" (UID: "48f4d802-20e1-4c18-ae02-0b82adf97457"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 22:12:14 crc kubenswrapper[4919]: I0310 22:12:14.488661 4919 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/48f4d802-20e1-4c18-ae02-0b82adf97457-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 10 22:12:14 crc kubenswrapper[4919]: I0310 22:12:14.489023 4919 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/48f4d802-20e1-4c18-ae02-0b82adf97457-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 10 22:12:14 crc kubenswrapper[4919]: I0310 22:12:14.489037 4919 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/48f4d802-20e1-4c18-ae02-0b82adf97457-config\") on node \"crc\" DevicePath \"\"" Mar 10 22:12:14 crc kubenswrapper[4919]: I0310 22:12:14.705130 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"5f89ffa1-3600-4fff-965c-4b754167fed1","Type":"ContainerStarted","Data":"57025bd7e4039a91a22d0a6b4e525bdd19d365b1111edca5af54e5e8881b5929"} Mar 10 22:12:14 crc kubenswrapper[4919]: I0310 22:12:14.711801 4919 generic.go:334] "Generic (PLEG): container finished" podID="43b80b86-6652-4a1a-8be6-7a5643e0bb45" containerID="c48acd8ef511580ead42ad57aaab08a3dce73adeaff3b152fe955a3be9712daa" exitCode=0 Mar 10 22:12:14 crc kubenswrapper[4919]: I0310 22:12:14.711892 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-66ff44db99-v77sd" event={"ID":"43b80b86-6652-4a1a-8be6-7a5643e0bb45","Type":"ContainerDied","Data":"c48acd8ef511580ead42ad57aaab08a3dce73adeaff3b152fe955a3be9712daa"} Mar 10 22:12:14 crc kubenswrapper[4919]: I0310 22:12:14.711950 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66ff44db99-v77sd" event={"ID":"43b80b86-6652-4a1a-8be6-7a5643e0bb45","Type":"ContainerStarted","Data":"ec6a91fad2a866d726aca47b0a0357c857e2d0a7473cc5914cbb437bd5306b6e"} Mar 10 22:12:14 crc kubenswrapper[4919]: I0310 22:12:14.731285 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-754b99d75-xsfht" event={"ID":"48f4d802-20e1-4c18-ae02-0b82adf97457","Type":"ContainerDied","Data":"3a318003939a78b629a3311f943e30a06e1eed3705305c5e340d38141c3645bd"} Mar 10 22:12:14 crc kubenswrapper[4919]: I0310 22:12:14.731337 4919 scope.go:117] "RemoveContainer" containerID="ee9009c08c5b8b648bc9784099a256de368a0592f078299d695ddce11f3c5604" Mar 10 22:12:14 crc kubenswrapper[4919]: I0310 22:12:14.731528 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-754b99d75-xsfht" Mar 10 22:12:14 crc kubenswrapper[4919]: I0310 22:12:14.734537 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"588d2da8-29e3-48dd-a0ca-7a04d72e2d96","Type":"ContainerStarted","Data":"76bd780f6dc6fd78531401a4d82de65779263ac247bdf7a17e1815a0546c8bfd"} Mar 10 22:12:14 crc kubenswrapper[4919]: I0310 22:12:14.739337 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-79589c5bbb-z9p5z" event={"ID":"4fe9eaba-0336-4655-b2d9-9bd67261da54","Type":"ContainerStarted","Data":"52a09b1c3a7d8d81ab31e1af2550644d2817b5fffcc27dee15398dc75d21fb39"} Mar 10 22:12:14 crc kubenswrapper[4919]: I0310 22:12:14.879954 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-754b99d75-xsfht"] Mar 10 22:12:14 crc kubenswrapper[4919]: I0310 22:12:14.889561 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-754b99d75-xsfht"] Mar 10 22:12:15 crc kubenswrapper[4919]: I0310 22:12:15.511581 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48f4d802-20e1-4c18-ae02-0b82adf97457" path="/var/lib/kubelet/pods/48f4d802-20e1-4c18-ae02-0b82adf97457/volumes" Mar 10 22:12:15 crc kubenswrapper[4919]: I0310 22:12:15.748515 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-59bc595969-d7r9w"] Mar 10 22:12:15 crc kubenswrapper[4919]: E0310 22:12:15.748953 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48f4d802-20e1-4c18-ae02-0b82adf97457" containerName="init" Mar 10 22:12:15 crc kubenswrapper[4919]: I0310 22:12:15.748967 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="48f4d802-20e1-4c18-ae02-0b82adf97457" containerName="init" Mar 10 22:12:15 crc kubenswrapper[4919]: I0310 22:12:15.749147 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="48f4d802-20e1-4c18-ae02-0b82adf97457" containerName="init" Mar 
10 22:12:15 crc kubenswrapper[4919]: I0310 22:12:15.751212 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-59bc595969-d7r9w" Mar 10 22:12:15 crc kubenswrapper[4919]: I0310 22:12:15.753813 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Mar 10 22:12:15 crc kubenswrapper[4919]: I0310 22:12:15.757094 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Mar 10 22:12:15 crc kubenswrapper[4919]: I0310 22:12:15.761154 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553012-5ftd6" event={"ID":"76635374-85f5-4577-866f-5f561c5223df","Type":"ContainerStarted","Data":"77dbfb1aefcdb836ebf686fec318f0978773a19f038987b02ed6672bc1025d92"} Mar 10 22:12:15 crc kubenswrapper[4919]: I0310 22:12:15.779432 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-59bc595969-d7r9w"] Mar 10 22:12:15 crc kubenswrapper[4919]: I0310 22:12:15.785429 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"588d2da8-29e3-48dd-a0ca-7a04d72e2d96","Type":"ContainerStarted","Data":"be30389e0a07083126ec1e21b757a969028f95133bcb209764ca764d9f70d566"} Mar 10 22:12:15 crc kubenswrapper[4919]: I0310 22:12:15.785616 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="588d2da8-29e3-48dd-a0ca-7a04d72e2d96" containerName="glance-log" containerID="cri-o://76bd780f6dc6fd78531401a4d82de65779263ac247bdf7a17e1815a0546c8bfd" gracePeriod=30 Mar 10 22:12:15 crc kubenswrapper[4919]: I0310 22:12:15.785716 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="588d2da8-29e3-48dd-a0ca-7a04d72e2d96" containerName="glance-httpd" 
containerID="cri-o://be30389e0a07083126ec1e21b757a969028f95133bcb209764ca764d9f70d566" gracePeriod=30 Mar 10 22:12:15 crc kubenswrapper[4919]: I0310 22:12:15.850705 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=24.850688045 podStartE2EDuration="24.850688045s" podCreationTimestamp="2026-03-10 22:11:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 22:12:15.843732236 +0000 UTC m=+1323.085612844" watchObservedRunningTime="2026-03-10 22:12:15.850688045 +0000 UTC m=+1323.092568643" Mar 10 22:12:15 crc kubenswrapper[4919]: I0310 22:12:15.857772 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d4216f40-ccfe-4c2e-8bd7-944a4413bc43-ovndb-tls-certs\") pod \"neutron-59bc595969-d7r9w\" (UID: \"d4216f40-ccfe-4c2e-8bd7-944a4413bc43\") " pod="openstack/neutron-59bc595969-d7r9w" Mar 10 22:12:15 crc kubenswrapper[4919]: I0310 22:12:15.857883 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d4216f40-ccfe-4c2e-8bd7-944a4413bc43-internal-tls-certs\") pod \"neutron-59bc595969-d7r9w\" (UID: \"d4216f40-ccfe-4c2e-8bd7-944a4413bc43\") " pod="openstack/neutron-59bc595969-d7r9w" Mar 10 22:12:15 crc kubenswrapper[4919]: I0310 22:12:15.857935 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d4216f40-ccfe-4c2e-8bd7-944a4413bc43-config\") pod \"neutron-59bc595969-d7r9w\" (UID: \"d4216f40-ccfe-4c2e-8bd7-944a4413bc43\") " pod="openstack/neutron-59bc595969-d7r9w" Mar 10 22:12:15 crc kubenswrapper[4919]: I0310 22:12:15.857954 4919 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-622vg\" (UniqueName: \"kubernetes.io/projected/d4216f40-ccfe-4c2e-8bd7-944a4413bc43-kube-api-access-622vg\") pod \"neutron-59bc595969-d7r9w\" (UID: \"d4216f40-ccfe-4c2e-8bd7-944a4413bc43\") " pod="openstack/neutron-59bc595969-d7r9w" Mar 10 22:12:15 crc kubenswrapper[4919]: I0310 22:12:15.857973 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d4216f40-ccfe-4c2e-8bd7-944a4413bc43-public-tls-certs\") pod \"neutron-59bc595969-d7r9w\" (UID: \"d4216f40-ccfe-4c2e-8bd7-944a4413bc43\") " pod="openstack/neutron-59bc595969-d7r9w" Mar 10 22:12:15 crc kubenswrapper[4919]: I0310 22:12:15.858006 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4216f40-ccfe-4c2e-8bd7-944a4413bc43-combined-ca-bundle\") pod \"neutron-59bc595969-d7r9w\" (UID: \"d4216f40-ccfe-4c2e-8bd7-944a4413bc43\") " pod="openstack/neutron-59bc595969-d7r9w" Mar 10 22:12:15 crc kubenswrapper[4919]: I0310 22:12:15.858045 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d4216f40-ccfe-4c2e-8bd7-944a4413bc43-httpd-config\") pod \"neutron-59bc595969-d7r9w\" (UID: \"d4216f40-ccfe-4c2e-8bd7-944a4413bc43\") " pod="openstack/neutron-59bc595969-d7r9w" Mar 10 22:12:15 crc kubenswrapper[4919]: I0310 22:12:15.858059 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-79589c5bbb-z9p5z" event={"ID":"4fe9eaba-0336-4655-b2d9-9bd67261da54","Type":"ContainerStarted","Data":"fc5f8917f23eea3338b4fb144ea30d5b228b7a0c8dc4c7d6918e3ba4cebe024f"} Mar 10 22:12:15 crc kubenswrapper[4919]: I0310 22:12:15.858089 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-79589c5bbb-z9p5z" 
event={"ID":"4fe9eaba-0336-4655-b2d9-9bd67261da54","Type":"ContainerStarted","Data":"c41bc0e2002aec40f136ba1e45796303ef87c5f297502cf12266931ccf978aff"} Mar 10 22:12:15 crc kubenswrapper[4919]: I0310 22:12:15.859026 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-79589c5bbb-z9p5z" Mar 10 22:12:15 crc kubenswrapper[4919]: I0310 22:12:15.889223 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"5f89ffa1-3600-4fff-965c-4b754167fed1","Type":"ContainerStarted","Data":"81293fa87d7c57bcd80aa368e3c25f4e1e098dcbe3b2a497640f256ff1e89387"} Mar 10 22:12:15 crc kubenswrapper[4919]: I0310 22:12:15.903870 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66ff44db99-v77sd" event={"ID":"43b80b86-6652-4a1a-8be6-7a5643e0bb45","Type":"ContainerStarted","Data":"1ded035287266e127df1c58824ea62be42b420eea53e61da848af410bacfa67d"} Mar 10 22:12:15 crc kubenswrapper[4919]: I0310 22:12:15.910122 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29553012-5ftd6" podStartSLOduration=14.435170203 podStartE2EDuration="15.910102334s" podCreationTimestamp="2026-03-10 22:12:00 +0000 UTC" firstStartedPulling="2026-03-10 22:12:13.166597603 +0000 UTC m=+1320.408478211" lastFinishedPulling="2026-03-10 22:12:14.641529724 +0000 UTC m=+1321.883410342" observedRunningTime="2026-03-10 22:12:15.885875908 +0000 UTC m=+1323.127756516" watchObservedRunningTime="2026-03-10 22:12:15.910102334 +0000 UTC m=+1323.151982942" Mar 10 22:12:15 crc kubenswrapper[4919]: I0310 22:12:15.912648 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-66ff44db99-v77sd" Mar 10 22:12:15 crc kubenswrapper[4919]: I0310 22:12:15.913725 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-79589c5bbb-z9p5z" podStartSLOduration=3.913714953 
podStartE2EDuration="3.913714953s" podCreationTimestamp="2026-03-10 22:12:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 22:12:15.911227905 +0000 UTC m=+1323.153108513" watchObservedRunningTime="2026-03-10 22:12:15.913714953 +0000 UTC m=+1323.155595561" Mar 10 22:12:15 crc kubenswrapper[4919]: I0310 22:12:15.944586 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-66ff44db99-v77sd" podStartSLOduration=3.944561578 podStartE2EDuration="3.944561578s" podCreationTimestamp="2026-03-10 22:12:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 22:12:15.929668144 +0000 UTC m=+1323.171548752" watchObservedRunningTime="2026-03-10 22:12:15.944561578 +0000 UTC m=+1323.186442186" Mar 10 22:12:15 crc kubenswrapper[4919]: I0310 22:12:15.959165 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4216f40-ccfe-4c2e-8bd7-944a4413bc43-combined-ca-bundle\") pod \"neutron-59bc595969-d7r9w\" (UID: \"d4216f40-ccfe-4c2e-8bd7-944a4413bc43\") " pod="openstack/neutron-59bc595969-d7r9w" Mar 10 22:12:15 crc kubenswrapper[4919]: I0310 22:12:15.959286 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d4216f40-ccfe-4c2e-8bd7-944a4413bc43-httpd-config\") pod \"neutron-59bc595969-d7r9w\" (UID: \"d4216f40-ccfe-4c2e-8bd7-944a4413bc43\") " pod="openstack/neutron-59bc595969-d7r9w" Mar 10 22:12:15 crc kubenswrapper[4919]: I0310 22:12:15.959366 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d4216f40-ccfe-4c2e-8bd7-944a4413bc43-ovndb-tls-certs\") pod \"neutron-59bc595969-d7r9w\" (UID: 
\"d4216f40-ccfe-4c2e-8bd7-944a4413bc43\") " pod="openstack/neutron-59bc595969-d7r9w" Mar 10 22:12:15 crc kubenswrapper[4919]: I0310 22:12:15.959529 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d4216f40-ccfe-4c2e-8bd7-944a4413bc43-internal-tls-certs\") pod \"neutron-59bc595969-d7r9w\" (UID: \"d4216f40-ccfe-4c2e-8bd7-944a4413bc43\") " pod="openstack/neutron-59bc595969-d7r9w" Mar 10 22:12:15 crc kubenswrapper[4919]: I0310 22:12:15.959607 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d4216f40-ccfe-4c2e-8bd7-944a4413bc43-config\") pod \"neutron-59bc595969-d7r9w\" (UID: \"d4216f40-ccfe-4c2e-8bd7-944a4413bc43\") " pod="openstack/neutron-59bc595969-d7r9w" Mar 10 22:12:15 crc kubenswrapper[4919]: I0310 22:12:15.959633 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-622vg\" (UniqueName: \"kubernetes.io/projected/d4216f40-ccfe-4c2e-8bd7-944a4413bc43-kube-api-access-622vg\") pod \"neutron-59bc595969-d7r9w\" (UID: \"d4216f40-ccfe-4c2e-8bd7-944a4413bc43\") " pod="openstack/neutron-59bc595969-d7r9w" Mar 10 22:12:15 crc kubenswrapper[4919]: I0310 22:12:15.959654 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d4216f40-ccfe-4c2e-8bd7-944a4413bc43-public-tls-certs\") pod \"neutron-59bc595969-d7r9w\" (UID: \"d4216f40-ccfe-4c2e-8bd7-944a4413bc43\") " pod="openstack/neutron-59bc595969-d7r9w" Mar 10 22:12:15 crc kubenswrapper[4919]: I0310 22:12:15.965009 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4216f40-ccfe-4c2e-8bd7-944a4413bc43-combined-ca-bundle\") pod \"neutron-59bc595969-d7r9w\" (UID: \"d4216f40-ccfe-4c2e-8bd7-944a4413bc43\") " pod="openstack/neutron-59bc595969-d7r9w" Mar 10 
22:12:15 crc kubenswrapper[4919]: I0310 22:12:15.965171 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d4216f40-ccfe-4c2e-8bd7-944a4413bc43-httpd-config\") pod \"neutron-59bc595969-d7r9w\" (UID: \"d4216f40-ccfe-4c2e-8bd7-944a4413bc43\") " pod="openstack/neutron-59bc595969-d7r9w" Mar 10 22:12:15 crc kubenswrapper[4919]: I0310 22:12:15.966560 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d4216f40-ccfe-4c2e-8bd7-944a4413bc43-ovndb-tls-certs\") pod \"neutron-59bc595969-d7r9w\" (UID: \"d4216f40-ccfe-4c2e-8bd7-944a4413bc43\") " pod="openstack/neutron-59bc595969-d7r9w" Mar 10 22:12:15 crc kubenswrapper[4919]: I0310 22:12:15.967228 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d4216f40-ccfe-4c2e-8bd7-944a4413bc43-public-tls-certs\") pod \"neutron-59bc595969-d7r9w\" (UID: \"d4216f40-ccfe-4c2e-8bd7-944a4413bc43\") " pod="openstack/neutron-59bc595969-d7r9w" Mar 10 22:12:15 crc kubenswrapper[4919]: I0310 22:12:15.968638 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/d4216f40-ccfe-4c2e-8bd7-944a4413bc43-config\") pod \"neutron-59bc595969-d7r9w\" (UID: \"d4216f40-ccfe-4c2e-8bd7-944a4413bc43\") " pod="openstack/neutron-59bc595969-d7r9w" Mar 10 22:12:15 crc kubenswrapper[4919]: I0310 22:12:15.972127 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d4216f40-ccfe-4c2e-8bd7-944a4413bc43-internal-tls-certs\") pod \"neutron-59bc595969-d7r9w\" (UID: \"d4216f40-ccfe-4c2e-8bd7-944a4413bc43\") " pod="openstack/neutron-59bc595969-d7r9w" Mar 10 22:12:15 crc kubenswrapper[4919]: I0310 22:12:15.983821 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-622vg\" 
(UniqueName: \"kubernetes.io/projected/d4216f40-ccfe-4c2e-8bd7-944a4413bc43-kube-api-access-622vg\") pod \"neutron-59bc595969-d7r9w\" (UID: \"d4216f40-ccfe-4c2e-8bd7-944a4413bc43\") " pod="openstack/neutron-59bc595969-d7r9w" Mar 10 22:12:16 crc kubenswrapper[4919]: I0310 22:12:16.111968 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-59bc595969-d7r9w" Mar 10 22:12:16 crc kubenswrapper[4919]: I0310 22:12:16.919097 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"5f89ffa1-3600-4fff-965c-4b754167fed1","Type":"ContainerStarted","Data":"892070d8ec4ce9779a33eb958144d6ebea2a2cc7cff88761e4a733ddc15283de"} Mar 10 22:12:16 crc kubenswrapper[4919]: I0310 22:12:16.919307 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="5f89ffa1-3600-4fff-965c-4b754167fed1" containerName="glance-log" containerID="cri-o://81293fa87d7c57bcd80aa368e3c25f4e1e098dcbe3b2a497640f256ff1e89387" gracePeriod=30 Mar 10 22:12:16 crc kubenswrapper[4919]: I0310 22:12:16.919943 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="5f89ffa1-3600-4fff-965c-4b754167fed1" containerName="glance-httpd" containerID="cri-o://892070d8ec4ce9779a33eb958144d6ebea2a2cc7cff88761e4a733ddc15283de" gracePeriod=30 Mar 10 22:12:16 crc kubenswrapper[4919]: I0310 22:12:16.924976 4919 generic.go:334] "Generic (PLEG): container finished" podID="76635374-85f5-4577-866f-5f561c5223df" containerID="77dbfb1aefcdb836ebf686fec318f0978773a19f038987b02ed6672bc1025d92" exitCode=0 Mar 10 22:12:16 crc kubenswrapper[4919]: I0310 22:12:16.925463 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553012-5ftd6" 
event={"ID":"76635374-85f5-4577-866f-5f561c5223df","Type":"ContainerDied","Data":"77dbfb1aefcdb836ebf686fec318f0978773a19f038987b02ed6672bc1025d92"}
Mar 10 22:12:16 crc kubenswrapper[4919]: I0310 22:12:16.934701 4919 generic.go:334] "Generic (PLEG): container finished" podID="588d2da8-29e3-48dd-a0ca-7a04d72e2d96" containerID="be30389e0a07083126ec1e21b757a969028f95133bcb209764ca764d9f70d566" exitCode=0
Mar 10 22:12:16 crc kubenswrapper[4919]: I0310 22:12:16.934731 4919 generic.go:334] "Generic (PLEG): container finished" podID="588d2da8-29e3-48dd-a0ca-7a04d72e2d96" containerID="76bd780f6dc6fd78531401a4d82de65779263ac247bdf7a17e1815a0546c8bfd" exitCode=143
Mar 10 22:12:16 crc kubenswrapper[4919]: I0310 22:12:16.937807 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"588d2da8-29e3-48dd-a0ca-7a04d72e2d96","Type":"ContainerDied","Data":"be30389e0a07083126ec1e21b757a969028f95133bcb209764ca764d9f70d566"}
Mar 10 22:12:16 crc kubenswrapper[4919]: I0310 22:12:16.937890 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"588d2da8-29e3-48dd-a0ca-7a04d72e2d96","Type":"ContainerDied","Data":"76bd780f6dc6fd78531401a4d82de65779263ac247bdf7a17e1815a0546c8bfd"}
Mar 10 22:12:16 crc kubenswrapper[4919]: I0310 22:12:16.953081 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=25.953059242 podStartE2EDuration="25.953059242s" podCreationTimestamp="2026-03-10 22:11:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 22:12:16.945873977 +0000 UTC m=+1324.187754585" watchObservedRunningTime="2026-03-10 22:12:16.953059242 +0000 UTC m=+1324.194939850"
Mar 10 22:12:17 crc kubenswrapper[4919]: I0310 22:12:17.174748 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-59bc595969-d7r9w"]
Mar 10 22:12:17 crc kubenswrapper[4919]: W0310 22:12:17.197771 4919 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd4216f40_ccfe_4c2e_8bd7_944a4413bc43.slice/crio-3a1810b5fc027a3dcb5f48d47380c1138bdf296bfc9b70fd8d05ce2643ba8249 WatchSource:0}: Error finding container 3a1810b5fc027a3dcb5f48d47380c1138bdf296bfc9b70fd8d05ce2643ba8249: Status 404 returned error can't find the container with id 3a1810b5fc027a3dcb5f48d47380c1138bdf296bfc9b70fd8d05ce2643ba8249
Mar 10 22:12:17 crc kubenswrapper[4919]: I0310 22:12:17.296201 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 10 22:12:17 crc kubenswrapper[4919]: I0310 22:12:17.393063 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/588d2da8-29e3-48dd-a0ca-7a04d72e2d96-httpd-run\") pod \"588d2da8-29e3-48dd-a0ca-7a04d72e2d96\" (UID: \"588d2da8-29e3-48dd-a0ca-7a04d72e2d96\") "
Mar 10 22:12:17 crc kubenswrapper[4919]: I0310 22:12:17.393104 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"588d2da8-29e3-48dd-a0ca-7a04d72e2d96\" (UID: \"588d2da8-29e3-48dd-a0ca-7a04d72e2d96\") "
Mar 10 22:12:17 crc kubenswrapper[4919]: I0310 22:12:17.393174 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/588d2da8-29e3-48dd-a0ca-7a04d72e2d96-config-data\") pod \"588d2da8-29e3-48dd-a0ca-7a04d72e2d96\" (UID: \"588d2da8-29e3-48dd-a0ca-7a04d72e2d96\") "
Mar 10 22:12:17 crc kubenswrapper[4919]: I0310 22:12:17.393206 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/588d2da8-29e3-48dd-a0ca-7a04d72e2d96-combined-ca-bundle\") pod \"588d2da8-29e3-48dd-a0ca-7a04d72e2d96\" (UID: \"588d2da8-29e3-48dd-a0ca-7a04d72e2d96\") "
Mar 10 22:12:17 crc kubenswrapper[4919]: I0310 22:12:17.393228 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/588d2da8-29e3-48dd-a0ca-7a04d72e2d96-scripts\") pod \"588d2da8-29e3-48dd-a0ca-7a04d72e2d96\" (UID: \"588d2da8-29e3-48dd-a0ca-7a04d72e2d96\") "
Mar 10 22:12:17 crc kubenswrapper[4919]: I0310 22:12:17.393362 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/588d2da8-29e3-48dd-a0ca-7a04d72e2d96-logs\") pod \"588d2da8-29e3-48dd-a0ca-7a04d72e2d96\" (UID: \"588d2da8-29e3-48dd-a0ca-7a04d72e2d96\") "
Mar 10 22:12:17 crc kubenswrapper[4919]: I0310 22:12:17.393481 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h7sqk\" (UniqueName: \"kubernetes.io/projected/588d2da8-29e3-48dd-a0ca-7a04d72e2d96-kube-api-access-h7sqk\") pod \"588d2da8-29e3-48dd-a0ca-7a04d72e2d96\" (UID: \"588d2da8-29e3-48dd-a0ca-7a04d72e2d96\") "
Mar 10 22:12:17 crc kubenswrapper[4919]: I0310 22:12:17.394464 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/588d2da8-29e3-48dd-a0ca-7a04d72e2d96-logs" (OuterVolumeSpecName: "logs") pod "588d2da8-29e3-48dd-a0ca-7a04d72e2d96" (UID: "588d2da8-29e3-48dd-a0ca-7a04d72e2d96"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 22:12:17 crc kubenswrapper[4919]: I0310 22:12:17.398575 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/588d2da8-29e3-48dd-a0ca-7a04d72e2d96-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "588d2da8-29e3-48dd-a0ca-7a04d72e2d96" (UID: "588d2da8-29e3-48dd-a0ca-7a04d72e2d96"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 22:12:17 crc kubenswrapper[4919]: I0310 22:12:17.400280 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/588d2da8-29e3-48dd-a0ca-7a04d72e2d96-kube-api-access-h7sqk" (OuterVolumeSpecName: "kube-api-access-h7sqk") pod "588d2da8-29e3-48dd-a0ca-7a04d72e2d96" (UID: "588d2da8-29e3-48dd-a0ca-7a04d72e2d96"). InnerVolumeSpecName "kube-api-access-h7sqk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 22:12:17 crc kubenswrapper[4919]: I0310 22:12:17.400328 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/588d2da8-29e3-48dd-a0ca-7a04d72e2d96-scripts" (OuterVolumeSpecName: "scripts") pod "588d2da8-29e3-48dd-a0ca-7a04d72e2d96" (UID: "588d2da8-29e3-48dd-a0ca-7a04d72e2d96"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 22:12:17 crc kubenswrapper[4919]: I0310 22:12:17.402940 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "glance") pod "588d2da8-29e3-48dd-a0ca-7a04d72e2d96" (UID: "588d2da8-29e3-48dd-a0ca-7a04d72e2d96"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Mar 10 22:12:17 crc kubenswrapper[4919]: I0310 22:12:17.441917 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/588d2da8-29e3-48dd-a0ca-7a04d72e2d96-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "588d2da8-29e3-48dd-a0ca-7a04d72e2d96" (UID: "588d2da8-29e3-48dd-a0ca-7a04d72e2d96"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 22:12:17 crc kubenswrapper[4919]: I0310 22:12:17.474886 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/588d2da8-29e3-48dd-a0ca-7a04d72e2d96-config-data" (OuterVolumeSpecName: "config-data") pod "588d2da8-29e3-48dd-a0ca-7a04d72e2d96" (UID: "588d2da8-29e3-48dd-a0ca-7a04d72e2d96"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 22:12:17 crc kubenswrapper[4919]: I0310 22:12:17.495659 4919 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/588d2da8-29e3-48dd-a0ca-7a04d72e2d96-config-data\") on node \"crc\" DevicePath \"\""
Mar 10 22:12:17 crc kubenswrapper[4919]: I0310 22:12:17.495687 4919 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/588d2da8-29e3-48dd-a0ca-7a04d72e2d96-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 10 22:12:17 crc kubenswrapper[4919]: I0310 22:12:17.495697 4919 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/588d2da8-29e3-48dd-a0ca-7a04d72e2d96-scripts\") on node \"crc\" DevicePath \"\""
Mar 10 22:12:17 crc kubenswrapper[4919]: I0310 22:12:17.495706 4919 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/588d2da8-29e3-48dd-a0ca-7a04d72e2d96-logs\") on node \"crc\" DevicePath \"\""
Mar 10 22:12:17 crc kubenswrapper[4919]: I0310 22:12:17.495716 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h7sqk\" (UniqueName: \"kubernetes.io/projected/588d2da8-29e3-48dd-a0ca-7a04d72e2d96-kube-api-access-h7sqk\") on node \"crc\" DevicePath \"\""
Mar 10 22:12:17 crc kubenswrapper[4919]: I0310 22:12:17.495736 4919 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/588d2da8-29e3-48dd-a0ca-7a04d72e2d96-httpd-run\") on node \"crc\" DevicePath \"\""
Mar 10 22:12:17 crc kubenswrapper[4919]: I0310 22:12:17.495760 4919 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" "
Mar 10 22:12:17 crc kubenswrapper[4919]: I0310 22:12:17.535610 4919 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc"
Mar 10 22:12:17 crc kubenswrapper[4919]: I0310 22:12:17.600253 4919 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\""
Mar 10 22:12:17 crc kubenswrapper[4919]: I0310 22:12:17.824741 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Mar 10 22:12:17 crc kubenswrapper[4919]: I0310 22:12:17.905289 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5f89ffa1-3600-4fff-965c-4b754167fed1-httpd-run\") pod \"5f89ffa1-3600-4fff-965c-4b754167fed1\" (UID: \"5f89ffa1-3600-4fff-965c-4b754167fed1\") "
Mar 10 22:12:17 crc kubenswrapper[4919]: I0310 22:12:17.905330 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5f89ffa1-3600-4fff-965c-4b754167fed1-logs\") pod \"5f89ffa1-3600-4fff-965c-4b754167fed1\" (UID: \"5f89ffa1-3600-4fff-965c-4b754167fed1\") "
Mar 10 22:12:17 crc kubenswrapper[4919]: I0310 22:12:17.905438 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"5f89ffa1-3600-4fff-965c-4b754167fed1\" (UID: \"5f89ffa1-3600-4fff-965c-4b754167fed1\") "
Mar 10 22:12:17 crc kubenswrapper[4919]: I0310 22:12:17.905468 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5f89ffa1-3600-4fff-965c-4b754167fed1-scripts\") pod \"5f89ffa1-3600-4fff-965c-4b754167fed1\" (UID: \"5f89ffa1-3600-4fff-965c-4b754167fed1\") "
Mar 10 22:12:17 crc kubenswrapper[4919]: I0310 22:12:17.905539 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qsfkp\" (UniqueName: \"kubernetes.io/projected/5f89ffa1-3600-4fff-965c-4b754167fed1-kube-api-access-qsfkp\") pod \"5f89ffa1-3600-4fff-965c-4b754167fed1\" (UID: \"5f89ffa1-3600-4fff-965c-4b754167fed1\") "
Mar 10 22:12:17 crc kubenswrapper[4919]: I0310 22:12:17.905559 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f89ffa1-3600-4fff-965c-4b754167fed1-combined-ca-bundle\") pod \"5f89ffa1-3600-4fff-965c-4b754167fed1\" (UID: \"5f89ffa1-3600-4fff-965c-4b754167fed1\") "
Mar 10 22:12:17 crc kubenswrapper[4919]: I0310 22:12:17.905589 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f89ffa1-3600-4fff-965c-4b754167fed1-config-data\") pod \"5f89ffa1-3600-4fff-965c-4b754167fed1\" (UID: \"5f89ffa1-3600-4fff-965c-4b754167fed1\") "
Mar 10 22:12:17 crc kubenswrapper[4919]: I0310 22:12:17.905887 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5f89ffa1-3600-4fff-965c-4b754167fed1-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "5f89ffa1-3600-4fff-965c-4b754167fed1" (UID: "5f89ffa1-3600-4fff-965c-4b754167fed1"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 22:12:17 crc kubenswrapper[4919]: I0310 22:12:17.905968 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5f89ffa1-3600-4fff-965c-4b754167fed1-logs" (OuterVolumeSpecName: "logs") pod "5f89ffa1-3600-4fff-965c-4b754167fed1" (UID: "5f89ffa1-3600-4fff-965c-4b754167fed1"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 22:12:17 crc kubenswrapper[4919]: I0310 22:12:17.909607 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "glance") pod "5f89ffa1-3600-4fff-965c-4b754167fed1" (UID: "5f89ffa1-3600-4fff-965c-4b754167fed1"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Mar 10 22:12:17 crc kubenswrapper[4919]: I0310 22:12:17.910165 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f89ffa1-3600-4fff-965c-4b754167fed1-scripts" (OuterVolumeSpecName: "scripts") pod "5f89ffa1-3600-4fff-965c-4b754167fed1" (UID: "5f89ffa1-3600-4fff-965c-4b754167fed1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 22:12:17 crc kubenswrapper[4919]: I0310 22:12:17.930262 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f89ffa1-3600-4fff-965c-4b754167fed1-kube-api-access-qsfkp" (OuterVolumeSpecName: "kube-api-access-qsfkp") pod "5f89ffa1-3600-4fff-965c-4b754167fed1" (UID: "5f89ffa1-3600-4fff-965c-4b754167fed1"). InnerVolumeSpecName "kube-api-access-qsfkp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 22:12:17 crc kubenswrapper[4919]: I0310 22:12:17.943880 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f89ffa1-3600-4fff-965c-4b754167fed1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5f89ffa1-3600-4fff-965c-4b754167fed1" (UID: "5f89ffa1-3600-4fff-965c-4b754167fed1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 22:12:17 crc kubenswrapper[4919]: I0310 22:12:17.976099 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"588d2da8-29e3-48dd-a0ca-7a04d72e2d96","Type":"ContainerDied","Data":"772666ca763a7f2ee69c173ae3b7a7ca3e0cf73bd38abc45f9dc5db2b95fc1a7"}
Mar 10 22:12:17 crc kubenswrapper[4919]: I0310 22:12:17.976164 4919 scope.go:117] "RemoveContainer" containerID="be30389e0a07083126ec1e21b757a969028f95133bcb209764ca764d9f70d566"
Mar 10 22:12:17 crc kubenswrapper[4919]: I0310 22:12:17.976322 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 10 22:12:17 crc kubenswrapper[4919]: I0310 22:12:17.979184 4919 generic.go:334] "Generic (PLEG): container finished" podID="02d206af-330a-4526-8a3e-7826a1acb153" containerID="8115aff6a5ec07bd6d211cbd9c4a87d9e5ee80d167242486efee63b7a58a33d9" exitCode=0
Mar 10 22:12:17 crc kubenswrapper[4919]: I0310 22:12:17.979261 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-q9dwb" event={"ID":"02d206af-330a-4526-8a3e-7826a1acb153","Type":"ContainerDied","Data":"8115aff6a5ec07bd6d211cbd9c4a87d9e5ee80d167242486efee63b7a58a33d9"}
Mar 10 22:12:17 crc kubenswrapper[4919]: I0310 22:12:17.982717 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f89ffa1-3600-4fff-965c-4b754167fed1-config-data" (OuterVolumeSpecName: "config-data") pod "5f89ffa1-3600-4fff-965c-4b754167fed1" (UID: "5f89ffa1-3600-4fff-965c-4b754167fed1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 22:12:17 crc kubenswrapper[4919]: I0310 22:12:17.983553 4919 generic.go:334] "Generic (PLEG): container finished" podID="5f89ffa1-3600-4fff-965c-4b754167fed1" containerID="892070d8ec4ce9779a33eb958144d6ebea2a2cc7cff88761e4a733ddc15283de" exitCode=0
Mar 10 22:12:17 crc kubenswrapper[4919]: I0310 22:12:17.983579 4919 generic.go:334] "Generic (PLEG): container finished" podID="5f89ffa1-3600-4fff-965c-4b754167fed1" containerID="81293fa87d7c57bcd80aa368e3c25f4e1e098dcbe3b2a497640f256ff1e89387" exitCode=143
Mar 10 22:12:17 crc kubenswrapper[4919]: I0310 22:12:17.983631 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"5f89ffa1-3600-4fff-965c-4b754167fed1","Type":"ContainerDied","Data":"892070d8ec4ce9779a33eb958144d6ebea2a2cc7cff88761e4a733ddc15283de"}
Mar 10 22:12:17 crc kubenswrapper[4919]: I0310 22:12:17.983653 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"5f89ffa1-3600-4fff-965c-4b754167fed1","Type":"ContainerDied","Data":"81293fa87d7c57bcd80aa368e3c25f4e1e098dcbe3b2a497640f256ff1e89387"}
Mar 10 22:12:17 crc kubenswrapper[4919]: I0310 22:12:17.983664 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"5f89ffa1-3600-4fff-965c-4b754167fed1","Type":"ContainerDied","Data":"57025bd7e4039a91a22d0a6b4e525bdd19d365b1111edca5af54e5e8881b5929"}
Mar 10 22:12:17 crc kubenswrapper[4919]: I0310 22:12:17.983730 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Mar 10 22:12:17 crc kubenswrapper[4919]: I0310 22:12:17.995648 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-59bc595969-d7r9w" event={"ID":"d4216f40-ccfe-4c2e-8bd7-944a4413bc43","Type":"ContainerStarted","Data":"2888487e5f0936f95c545464ab474073718039ae8ed2a613fd2c6a1ac23b7e58"}
Mar 10 22:12:17 crc kubenswrapper[4919]: I0310 22:12:17.995693 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-59bc595969-d7r9w" event={"ID":"d4216f40-ccfe-4c2e-8bd7-944a4413bc43","Type":"ContainerStarted","Data":"157348cf321f1547509c6d04d11d59fea346018cf02810116f3231b189156e80"}
Mar 10 22:12:17 crc kubenswrapper[4919]: I0310 22:12:17.995704 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-59bc595969-d7r9w" event={"ID":"d4216f40-ccfe-4c2e-8bd7-944a4413bc43","Type":"ContainerStarted","Data":"3a1810b5fc027a3dcb5f48d47380c1138bdf296bfc9b70fd8d05ce2643ba8249"}
Mar 10 22:12:17 crc kubenswrapper[4919]: I0310 22:12:17.995757 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-59bc595969-d7r9w"
Mar 10 22:12:18 crc kubenswrapper[4919]: I0310 22:12:18.000931 4919 generic.go:334] "Generic (PLEG): container finished" podID="b20682db-f5f4-4102-b0f9-662aeab1bd2a" containerID="e41c9011e8878fcf4a11e3a413f329408bd2c3dead8cb0c09dc6db523f0244ab" exitCode=0
Mar 10 22:12:18 crc kubenswrapper[4919]: I0310 22:12:18.001091 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-k96hb" event={"ID":"b20682db-f5f4-4102-b0f9-662aeab1bd2a","Type":"ContainerDied","Data":"e41c9011e8878fcf4a11e3a413f329408bd2c3dead8cb0c09dc6db523f0244ab"}
Mar 10 22:12:18 crc kubenswrapper[4919]: I0310 22:12:18.007205 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qsfkp\" (UniqueName: \"kubernetes.io/projected/5f89ffa1-3600-4fff-965c-4b754167fed1-kube-api-access-qsfkp\") on node \"crc\" DevicePath \"\""
Mar 10 22:12:18 crc kubenswrapper[4919]: I0310 22:12:18.007250 4919 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f89ffa1-3600-4fff-965c-4b754167fed1-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 10 22:12:18 crc kubenswrapper[4919]: I0310 22:12:18.007264 4919 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f89ffa1-3600-4fff-965c-4b754167fed1-config-data\") on node \"crc\" DevicePath \"\""
Mar 10 22:12:18 crc kubenswrapper[4919]: I0310 22:12:18.007275 4919 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5f89ffa1-3600-4fff-965c-4b754167fed1-httpd-run\") on node \"crc\" DevicePath \"\""
Mar 10 22:12:18 crc kubenswrapper[4919]: I0310 22:12:18.007289 4919 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5f89ffa1-3600-4fff-965c-4b754167fed1-logs\") on node \"crc\" DevicePath \"\""
Mar 10 22:12:18 crc kubenswrapper[4919]: I0310 22:12:18.007322 4919 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" "
Mar 10 22:12:18 crc kubenswrapper[4919]: I0310 22:12:18.007333 4919 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5f89ffa1-3600-4fff-965c-4b754167fed1-scripts\") on node \"crc\" DevicePath \"\""
Mar 10 22:12:18 crc kubenswrapper[4919]: I0310 22:12:18.026526 4919 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc"
Mar 10 22:12:18 crc kubenswrapper[4919]: I0310 22:12:18.055620 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-59bc595969-d7r9w" podStartSLOduration=3.055601554 podStartE2EDuration="3.055601554s" podCreationTimestamp="2026-03-10 22:12:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 22:12:18.049227381 +0000 UTC m=+1325.291108009" watchObservedRunningTime="2026-03-10 22:12:18.055601554 +0000 UTC m=+1325.297482162"
Mar 10 22:12:18 crc kubenswrapper[4919]: I0310 22:12:18.082721 4919 scope.go:117] "RemoveContainer" containerID="76bd780f6dc6fd78531401a4d82de65779263ac247bdf7a17e1815a0546c8bfd"
Mar 10 22:12:18 crc kubenswrapper[4919]: I0310 22:12:18.106075 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 10 22:12:18 crc kubenswrapper[4919]: I0310 22:12:18.114802 4919 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\""
Mar 10 22:12:18 crc kubenswrapper[4919]: I0310 22:12:18.133882 4919 scope.go:117] "RemoveContainer" containerID="892070d8ec4ce9779a33eb958144d6ebea2a2cc7cff88761e4a733ddc15283de"
Mar 10 22:12:18 crc kubenswrapper[4919]: I0310 22:12:18.145507 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 10 22:12:18 crc kubenswrapper[4919]: I0310 22:12:18.159622 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 10 22:12:18 crc kubenswrapper[4919]: I0310 22:12:18.178300 4919 scope.go:117] "RemoveContainer" containerID="81293fa87d7c57bcd80aa368e3c25f4e1e098dcbe3b2a497640f256ff1e89387"
Mar 10 22:12:18 crc kubenswrapper[4919]: I0310 22:12:18.178479 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 10 22:12:18 crc kubenswrapper[4919]: I0310 22:12:18.200463 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 10 22:12:18 crc kubenswrapper[4919]: E0310 22:12:18.201060 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f89ffa1-3600-4fff-965c-4b754167fed1" containerName="glance-log"
Mar 10 22:12:18 crc kubenswrapper[4919]: I0310 22:12:18.201078 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f89ffa1-3600-4fff-965c-4b754167fed1" containerName="glance-log"
Mar 10 22:12:18 crc kubenswrapper[4919]: E0310 22:12:18.201105 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="588d2da8-29e3-48dd-a0ca-7a04d72e2d96" containerName="glance-log"
Mar 10 22:12:18 crc kubenswrapper[4919]: I0310 22:12:18.201112 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="588d2da8-29e3-48dd-a0ca-7a04d72e2d96" containerName="glance-log"
Mar 10 22:12:18 crc kubenswrapper[4919]: E0310 22:12:18.201139 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f89ffa1-3600-4fff-965c-4b754167fed1" containerName="glance-httpd"
Mar 10 22:12:18 crc kubenswrapper[4919]: I0310 22:12:18.201148 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f89ffa1-3600-4fff-965c-4b754167fed1" containerName="glance-httpd"
Mar 10 22:12:18 crc kubenswrapper[4919]: E0310 22:12:18.201177 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="588d2da8-29e3-48dd-a0ca-7a04d72e2d96" containerName="glance-httpd"
Mar 10 22:12:18 crc kubenswrapper[4919]: I0310 22:12:18.201183 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="588d2da8-29e3-48dd-a0ca-7a04d72e2d96" containerName="glance-httpd"
Mar 10 22:12:18 crc kubenswrapper[4919]: I0310 22:12:18.201505 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f89ffa1-3600-4fff-965c-4b754167fed1" containerName="glance-log"
Mar 10 22:12:18 crc kubenswrapper[4919]: I0310 22:12:18.201551 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f89ffa1-3600-4fff-965c-4b754167fed1" containerName="glance-httpd"
Mar 10 22:12:18 crc kubenswrapper[4919]: I0310 22:12:18.201565 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="588d2da8-29e3-48dd-a0ca-7a04d72e2d96" containerName="glance-httpd"
Mar 10 22:12:18 crc kubenswrapper[4919]: I0310 22:12:18.201592 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="588d2da8-29e3-48dd-a0ca-7a04d72e2d96" containerName="glance-log"
Mar 10 22:12:18 crc kubenswrapper[4919]: I0310 22:12:18.203013 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 10 22:12:18 crc kubenswrapper[4919]: I0310 22:12:18.207200 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-xjqfl"
Mar 10 22:12:18 crc kubenswrapper[4919]: I0310 22:12:18.207455 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc"
Mar 10 22:12:18 crc kubenswrapper[4919]: I0310 22:12:18.207655 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data"
Mar 10 22:12:18 crc kubenswrapper[4919]: I0310 22:12:18.207815 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts"
Mar 10 22:12:18 crc kubenswrapper[4919]: I0310 22:12:18.215279 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 10 22:12:18 crc kubenswrapper[4919]: I0310 22:12:18.234050 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Mar 10 22:12:18 crc kubenswrapper[4919]: I0310 22:12:18.234145 4919 scope.go:117] "RemoveContainer" containerID="892070d8ec4ce9779a33eb958144d6ebea2a2cc7cff88761e4a733ddc15283de"
Mar 10 22:12:18 crc kubenswrapper[4919]: I0310 22:12:18.259004 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc"
Mar 10 22:12:18 crc kubenswrapper[4919]: E0310 22:12:18.259095 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"892070d8ec4ce9779a33eb958144d6ebea2a2cc7cff88761e4a733ddc15283de\": container with ID starting with 892070d8ec4ce9779a33eb958144d6ebea2a2cc7cff88761e4a733ddc15283de not found: ID does not exist" containerID="892070d8ec4ce9779a33eb958144d6ebea2a2cc7cff88761e4a733ddc15283de"
Mar 10 22:12:18 crc kubenswrapper[4919]: I0310 22:12:18.259145 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"892070d8ec4ce9779a33eb958144d6ebea2a2cc7cff88761e4a733ddc15283de"} err="failed to get container status \"892070d8ec4ce9779a33eb958144d6ebea2a2cc7cff88761e4a733ddc15283de\": rpc error: code = NotFound desc = could not find container \"892070d8ec4ce9779a33eb958144d6ebea2a2cc7cff88761e4a733ddc15283de\": container with ID starting with 892070d8ec4ce9779a33eb958144d6ebea2a2cc7cff88761e4a733ddc15283de not found: ID does not exist"
Mar 10 22:12:18 crc kubenswrapper[4919]: I0310 22:12:18.259187 4919 scope.go:117] "RemoveContainer" containerID="81293fa87d7c57bcd80aa368e3c25f4e1e098dcbe3b2a497640f256ff1e89387"
Mar 10 22:12:18 crc kubenswrapper[4919]: I0310 22:12:18.259174 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data"
Mar 10 22:12:18 crc kubenswrapper[4919]: E0310 22:12:18.259985 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"81293fa87d7c57bcd80aa368e3c25f4e1e098dcbe3b2a497640f256ff1e89387\": container with ID starting with 81293fa87d7c57bcd80aa368e3c25f4e1e098dcbe3b2a497640f256ff1e89387 not found: ID does not exist" containerID="81293fa87d7c57bcd80aa368e3c25f4e1e098dcbe3b2a497640f256ff1e89387"
Mar 10 22:12:18 crc kubenswrapper[4919]: I0310 22:12:18.260019 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81293fa87d7c57bcd80aa368e3c25f4e1e098dcbe3b2a497640f256ff1e89387"} err="failed to get container status \"81293fa87d7c57bcd80aa368e3c25f4e1e098dcbe3b2a497640f256ff1e89387\": rpc error: code = NotFound desc = could not find container \"81293fa87d7c57bcd80aa368e3c25f4e1e098dcbe3b2a497640f256ff1e89387\": container with ID starting with 81293fa87d7c57bcd80aa368e3c25f4e1e098dcbe3b2a497640f256ff1e89387 not found: ID does not exist"
Mar 10 22:12:18 crc kubenswrapper[4919]: I0310 22:12:18.260037 4919 scope.go:117] "RemoveContainer" containerID="892070d8ec4ce9779a33eb958144d6ebea2a2cc7cff88761e4a733ddc15283de"
Mar 10 22:12:18 crc kubenswrapper[4919]: I0310 22:12:18.261244 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"892070d8ec4ce9779a33eb958144d6ebea2a2cc7cff88761e4a733ddc15283de"} err="failed to get container status \"892070d8ec4ce9779a33eb958144d6ebea2a2cc7cff88761e4a733ddc15283de\": rpc error: code = NotFound desc = could not find container \"892070d8ec4ce9779a33eb958144d6ebea2a2cc7cff88761e4a733ddc15283de\": container with ID starting with 892070d8ec4ce9779a33eb958144d6ebea2a2cc7cff88761e4a733ddc15283de not found: ID does not exist"
Mar 10 22:12:18 crc kubenswrapper[4919]: I0310 22:12:18.261268 4919 scope.go:117] "RemoveContainer" containerID="81293fa87d7c57bcd80aa368e3c25f4e1e098dcbe3b2a497640f256ff1e89387"
Mar 10 22:12:18 crc kubenswrapper[4919]: I0310 22:12:18.261507 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81293fa87d7c57bcd80aa368e3c25f4e1e098dcbe3b2a497640f256ff1e89387"} err="failed to get container status \"81293fa87d7c57bcd80aa368e3c25f4e1e098dcbe3b2a497640f256ff1e89387\": rpc error: code = NotFound desc = could not find container \"81293fa87d7c57bcd80aa368e3c25f4e1e098dcbe3b2a497640f256ff1e89387\": container with ID starting with 81293fa87d7c57bcd80aa368e3c25f4e1e098dcbe3b2a497640f256ff1e89387 not found: ID does not exist"
Mar 10 22:12:18 crc kubenswrapper[4919]: I0310 22:12:18.264039 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 10 22:12:18 crc kubenswrapper[4919]: I0310 22:12:18.273861 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 10 22:12:18 crc kubenswrapper[4919]: I0310 22:12:18.324552 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7050d40c-b959-48a8-b21f-b9f5e308c920-scripts\") pod \"glance-default-internal-api-0\" (UID: \"7050d40c-b959-48a8-b21f-b9f5e308c920\") " pod="openstack/glance-default-internal-api-0"
Mar 10 22:12:18 crc kubenswrapper[4919]: I0310 22:12:18.324751 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pg8ck\" (UniqueName: \"kubernetes.io/projected/09d4bc6e-4f9e-4375-a816-2aad9cf376b2-kube-api-access-pg8ck\") pod \"glance-default-external-api-0\" (UID: \"09d4bc6e-4f9e-4375-a816-2aad9cf376b2\") " pod="openstack/glance-default-external-api-0"
Mar 10 22:12:18 crc kubenswrapper[4919]: I0310 22:12:18.324918 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7050d40c-b959-48a8-b21f-b9f5e308c920-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"7050d40c-b959-48a8-b21f-b9f5e308c920\") " pod="openstack/glance-default-internal-api-0"
Mar 10 22:12:18 crc kubenswrapper[4919]: I0310 22:12:18.325027 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/09d4bc6e-4f9e-4375-a816-2aad9cf376b2-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"09d4bc6e-4f9e-4375-a816-2aad9cf376b2\") " pod="openstack/glance-default-external-api-0"
Mar 10 22:12:18 crc kubenswrapper[4919]: I0310 22:12:18.325119 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"7050d40c-b959-48a8-b21f-b9f5e308c920\") " pod="openstack/glance-default-internal-api-0"
Mar 10 22:12:18 crc kubenswrapper[4919]: I0310 22:12:18.325190 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7050d40c-b959-48a8-b21f-b9f5e308c920-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"7050d40c-b959-48a8-b21f-b9f5e308c920\") " pod="openstack/glance-default-internal-api-0"
Mar 10 22:12:18 crc kubenswrapper[4919]: I0310 22:12:18.325271 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/09d4bc6e-4f9e-4375-a816-2aad9cf376b2-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"09d4bc6e-4f9e-4375-a816-2aad9cf376b2\") " pod="openstack/glance-default-external-api-0"
Mar 10 22:12:18 crc kubenswrapper[4919]: I0310 22:12:18.325330 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/09d4bc6e-4f9e-4375-a816-2aad9cf376b2-scripts\") pod \"glance-default-external-api-0\" (UID: \"09d4bc6e-4f9e-4375-a816-2aad9cf376b2\") " pod="openstack/glance-default-external-api-0"
Mar 10 22:12:18 crc kubenswrapper[4919]: I0310 22:12:18.325374 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"09d4bc6e-4f9e-4375-a816-2aad9cf376b2\") " pod="openstack/glance-default-external-api-0"
Mar 10 22:12:18 crc kubenswrapper[4919]: I0310 22:12:18.325673 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09d4bc6e-4f9e-4375-a816-2aad9cf376b2-config-data\") pod \"glance-default-external-api-0\" (UID: \"09d4bc6e-4f9e-4375-a816-2aad9cf376b2\") " pod="openstack/glance-default-external-api-0"
Mar 10 22:12:18 crc kubenswrapper[4919]: I0310 22:12:18.325739 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7050d40c-b959-48a8-b21f-b9f5e308c920-logs\") pod \"glance-default-internal-api-0\" (UID: \"7050d40c-b959-48a8-b21f-b9f5e308c920\") " pod="openstack/glance-default-internal-api-0"
Mar 10 22:12:18 crc kubenswrapper[4919]: I0310 22:12:18.325787 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7050d40c-b959-48a8-b21f-b9f5e308c920-config-data\") pod \"glance-default-internal-api-0\" (UID: \"7050d40c-b959-48a8-b21f-b9f5e308c920\") " pod="openstack/glance-default-internal-api-0"
Mar 10 22:12:18 crc kubenswrapper[4919]: I0310 22:12:18.325848 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2qhgn\" (UniqueName: \"kubernetes.io/projected/7050d40c-b959-48a8-b21f-b9f5e308c920-kube-api-access-2qhgn\") pod \"glance-default-internal-api-0\" (UID: \"7050d40c-b959-48a8-b21f-b9f5e308c920\") " pod="openstack/glance-default-internal-api-0"
Mar 10 22:12:18 crc kubenswrapper[4919]: I0310 22:12:18.325873 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/09d4bc6e-4f9e-4375-a816-2aad9cf376b2-logs\") pod \"glance-default-external-api-0\" (UID: \"09d4bc6e-4f9e-4375-a816-2aad9cf376b2\") " pod="openstack/glance-default-external-api-0"
Mar 10 22:12:18 crc kubenswrapper[4919]: I0310 22:12:18.328717 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09d4bc6e-4f9e-4375-a816-2aad9cf376b2-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"09d4bc6e-4f9e-4375-a816-2aad9cf376b2\") " pod="openstack/glance-default-external-api-0"
Mar 10 22:12:18 crc kubenswrapper[4919]: I0310 22:12:18.328784 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7050d40c-b959-48a8-b21f-b9f5e308c920-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"7050d40c-b959-48a8-b21f-b9f5e308c920\") " pod="openstack/glance-default-internal-api-0"
Mar 10 22:12:18 crc kubenswrapper[4919]: I0310 22:12:18.430471 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/09d4bc6e-4f9e-4375-a816-2aad9cf376b2-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"09d4bc6e-4f9e-4375-a816-2aad9cf376b2\") " pod="openstack/glance-default-external-api-0"
Mar 10 22:12:18 crc kubenswrapper[4919]: I0310 22:12:18.430508 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7050d40c-b959-48a8-b21f-b9f5e308c920-internal-tls-certs\") pod
\"glance-default-internal-api-0\" (UID: \"7050d40c-b959-48a8-b21f-b9f5e308c920\") " pod="openstack/glance-default-internal-api-0" Mar 10 22:12:18 crc kubenswrapper[4919]: I0310 22:12:18.430531 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/09d4bc6e-4f9e-4375-a816-2aad9cf376b2-scripts\") pod \"glance-default-external-api-0\" (UID: \"09d4bc6e-4f9e-4375-a816-2aad9cf376b2\") " pod="openstack/glance-default-external-api-0" Mar 10 22:12:18 crc kubenswrapper[4919]: I0310 22:12:18.430556 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"09d4bc6e-4f9e-4375-a816-2aad9cf376b2\") " pod="openstack/glance-default-external-api-0" Mar 10 22:12:18 crc kubenswrapper[4919]: I0310 22:12:18.430597 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09d4bc6e-4f9e-4375-a816-2aad9cf376b2-config-data\") pod \"glance-default-external-api-0\" (UID: \"09d4bc6e-4f9e-4375-a816-2aad9cf376b2\") " pod="openstack/glance-default-external-api-0" Mar 10 22:12:18 crc kubenswrapper[4919]: I0310 22:12:18.430617 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7050d40c-b959-48a8-b21f-b9f5e308c920-logs\") pod \"glance-default-internal-api-0\" (UID: \"7050d40c-b959-48a8-b21f-b9f5e308c920\") " pod="openstack/glance-default-internal-api-0" Mar 10 22:12:18 crc kubenswrapper[4919]: I0310 22:12:18.430642 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7050d40c-b959-48a8-b21f-b9f5e308c920-config-data\") pod \"glance-default-internal-api-0\" (UID: \"7050d40c-b959-48a8-b21f-b9f5e308c920\") " 
pod="openstack/glance-default-internal-api-0" Mar 10 22:12:18 crc kubenswrapper[4919]: I0310 22:12:18.430661 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2qhgn\" (UniqueName: \"kubernetes.io/projected/7050d40c-b959-48a8-b21f-b9f5e308c920-kube-api-access-2qhgn\") pod \"glance-default-internal-api-0\" (UID: \"7050d40c-b959-48a8-b21f-b9f5e308c920\") " pod="openstack/glance-default-internal-api-0" Mar 10 22:12:18 crc kubenswrapper[4919]: I0310 22:12:18.430676 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/09d4bc6e-4f9e-4375-a816-2aad9cf376b2-logs\") pod \"glance-default-external-api-0\" (UID: \"09d4bc6e-4f9e-4375-a816-2aad9cf376b2\") " pod="openstack/glance-default-external-api-0" Mar 10 22:12:18 crc kubenswrapper[4919]: I0310 22:12:18.430712 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09d4bc6e-4f9e-4375-a816-2aad9cf376b2-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"09d4bc6e-4f9e-4375-a816-2aad9cf376b2\") " pod="openstack/glance-default-external-api-0" Mar 10 22:12:18 crc kubenswrapper[4919]: I0310 22:12:18.430728 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7050d40c-b959-48a8-b21f-b9f5e308c920-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"7050d40c-b959-48a8-b21f-b9f5e308c920\") " pod="openstack/glance-default-internal-api-0" Mar 10 22:12:18 crc kubenswrapper[4919]: I0310 22:12:18.430755 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7050d40c-b959-48a8-b21f-b9f5e308c920-scripts\") pod \"glance-default-internal-api-0\" (UID: \"7050d40c-b959-48a8-b21f-b9f5e308c920\") " pod="openstack/glance-default-internal-api-0" Mar 10 
22:12:18 crc kubenswrapper[4919]: I0310 22:12:18.430780 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pg8ck\" (UniqueName: \"kubernetes.io/projected/09d4bc6e-4f9e-4375-a816-2aad9cf376b2-kube-api-access-pg8ck\") pod \"glance-default-external-api-0\" (UID: \"09d4bc6e-4f9e-4375-a816-2aad9cf376b2\") " pod="openstack/glance-default-external-api-0" Mar 10 22:12:18 crc kubenswrapper[4919]: I0310 22:12:18.430914 4919 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"09d4bc6e-4f9e-4375-a816-2aad9cf376b2\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/glance-default-external-api-0" Mar 10 22:12:18 crc kubenswrapper[4919]: I0310 22:12:18.431181 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7050d40c-b959-48a8-b21f-b9f5e308c920-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"7050d40c-b959-48a8-b21f-b9f5e308c920\") " pod="openstack/glance-default-internal-api-0" Mar 10 22:12:18 crc kubenswrapper[4919]: I0310 22:12:18.431215 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/09d4bc6e-4f9e-4375-a816-2aad9cf376b2-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"09d4bc6e-4f9e-4375-a816-2aad9cf376b2\") " pod="openstack/glance-default-external-api-0" Mar 10 22:12:18 crc kubenswrapper[4919]: I0310 22:12:18.431235 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"7050d40c-b959-48a8-b21f-b9f5e308c920\") " pod="openstack/glance-default-internal-api-0" Mar 10 22:12:18 crc kubenswrapper[4919]: I0310 22:12:18.431343 
4919 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"7050d40c-b959-48a8-b21f-b9f5e308c920\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/glance-default-internal-api-0" Mar 10 22:12:18 crc kubenswrapper[4919]: I0310 22:12:18.432618 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7050d40c-b959-48a8-b21f-b9f5e308c920-logs\") pod \"glance-default-internal-api-0\" (UID: \"7050d40c-b959-48a8-b21f-b9f5e308c920\") " pod="openstack/glance-default-internal-api-0" Mar 10 22:12:18 crc kubenswrapper[4919]: I0310 22:12:18.433709 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/09d4bc6e-4f9e-4375-a816-2aad9cf376b2-logs\") pod \"glance-default-external-api-0\" (UID: \"09d4bc6e-4f9e-4375-a816-2aad9cf376b2\") " pod="openstack/glance-default-external-api-0" Mar 10 22:12:18 crc kubenswrapper[4919]: I0310 22:12:18.433754 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/09d4bc6e-4f9e-4375-a816-2aad9cf376b2-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"09d4bc6e-4f9e-4375-a816-2aad9cf376b2\") " pod="openstack/glance-default-external-api-0" Mar 10 22:12:18 crc kubenswrapper[4919]: I0310 22:12:18.433759 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7050d40c-b959-48a8-b21f-b9f5e308c920-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"7050d40c-b959-48a8-b21f-b9f5e308c920\") " pod="openstack/glance-default-internal-api-0" Mar 10 22:12:18 crc kubenswrapper[4919]: I0310 22:12:18.443732 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/7050d40c-b959-48a8-b21f-b9f5e308c920-scripts\") pod \"glance-default-internal-api-0\" (UID: \"7050d40c-b959-48a8-b21f-b9f5e308c920\") " pod="openstack/glance-default-internal-api-0" Mar 10 22:12:18 crc kubenswrapper[4919]: I0310 22:12:18.444244 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/09d4bc6e-4f9e-4375-a816-2aad9cf376b2-scripts\") pod \"glance-default-external-api-0\" (UID: \"09d4bc6e-4f9e-4375-a816-2aad9cf376b2\") " pod="openstack/glance-default-external-api-0" Mar 10 22:12:18 crc kubenswrapper[4919]: I0310 22:12:18.444512 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/09d4bc6e-4f9e-4375-a816-2aad9cf376b2-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"09d4bc6e-4f9e-4375-a816-2aad9cf376b2\") " pod="openstack/glance-default-external-api-0" Mar 10 22:12:18 crc kubenswrapper[4919]: I0310 22:12:18.446692 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09d4bc6e-4f9e-4375-a816-2aad9cf376b2-config-data\") pod \"glance-default-external-api-0\" (UID: \"09d4bc6e-4f9e-4375-a816-2aad9cf376b2\") " pod="openstack/glance-default-external-api-0" Mar 10 22:12:18 crc kubenswrapper[4919]: I0310 22:12:18.447734 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09d4bc6e-4f9e-4375-a816-2aad9cf376b2-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"09d4bc6e-4f9e-4375-a816-2aad9cf376b2\") " pod="openstack/glance-default-external-api-0" Mar 10 22:12:18 crc kubenswrapper[4919]: I0310 22:12:18.448004 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7050d40c-b959-48a8-b21f-b9f5e308c920-config-data\") pod \"glance-default-internal-api-0\" 
(UID: \"7050d40c-b959-48a8-b21f-b9f5e308c920\") " pod="openstack/glance-default-internal-api-0" Mar 10 22:12:18 crc kubenswrapper[4919]: I0310 22:12:18.452458 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2qhgn\" (UniqueName: \"kubernetes.io/projected/7050d40c-b959-48a8-b21f-b9f5e308c920-kube-api-access-2qhgn\") pod \"glance-default-internal-api-0\" (UID: \"7050d40c-b959-48a8-b21f-b9f5e308c920\") " pod="openstack/glance-default-internal-api-0" Mar 10 22:12:18 crc kubenswrapper[4919]: I0310 22:12:18.452694 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7050d40c-b959-48a8-b21f-b9f5e308c920-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"7050d40c-b959-48a8-b21f-b9f5e308c920\") " pod="openstack/glance-default-internal-api-0" Mar 10 22:12:18 crc kubenswrapper[4919]: I0310 22:12:18.454897 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7050d40c-b959-48a8-b21f-b9f5e308c920-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"7050d40c-b959-48a8-b21f-b9f5e308c920\") " pod="openstack/glance-default-internal-api-0" Mar 10 22:12:18 crc kubenswrapper[4919]: I0310 22:12:18.455312 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pg8ck\" (UniqueName: \"kubernetes.io/projected/09d4bc6e-4f9e-4375-a816-2aad9cf376b2-kube-api-access-pg8ck\") pod \"glance-default-external-api-0\" (UID: \"09d4bc6e-4f9e-4375-a816-2aad9cf376b2\") " pod="openstack/glance-default-external-api-0" Mar 10 22:12:18 crc kubenswrapper[4919]: I0310 22:12:18.471162 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"7050d40c-b959-48a8-b21f-b9f5e308c920\") " 
pod="openstack/glance-default-internal-api-0" Mar 10 22:12:18 crc kubenswrapper[4919]: I0310 22:12:18.470099 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"09d4bc6e-4f9e-4375-a816-2aad9cf376b2\") " pod="openstack/glance-default-external-api-0" Mar 10 22:12:18 crc kubenswrapper[4919]: I0310 22:12:18.535040 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 10 22:12:18 crc kubenswrapper[4919]: I0310 22:12:18.588489 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 10 22:12:19 crc kubenswrapper[4919]: I0310 22:12:19.027415 4919 generic.go:334] "Generic (PLEG): container finished" podID="15160303-4913-49a5-8cd3-e8255ba657f6" containerID="bb3bb3d24b551528222b912f6c8926a0f15b9dfebc9af7fbb028f9ffe7e9157b" exitCode=0 Mar 10 22:12:19 crc kubenswrapper[4919]: I0310 22:12:19.027624 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-tbkwn" event={"ID":"15160303-4913-49a5-8cd3-e8255ba657f6","Type":"ContainerDied","Data":"bb3bb3d24b551528222b912f6c8926a0f15b9dfebc9af7fbb028f9ffe7e9157b"} Mar 10 22:12:19 crc kubenswrapper[4919]: I0310 22:12:19.494941 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="588d2da8-29e3-48dd-a0ca-7a04d72e2d96" path="/var/lib/kubelet/pods/588d2da8-29e3-48dd-a0ca-7a04d72e2d96/volumes" Mar 10 22:12:19 crc kubenswrapper[4919]: I0310 22:12:19.496511 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f89ffa1-3600-4fff-965c-4b754167fed1" path="/var/lib/kubelet/pods/5f89ffa1-3600-4fff-965c-4b754167fed1/volumes" Mar 10 22:12:20 crc kubenswrapper[4919]: I0310 22:12:20.904419 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-q9dwb" Mar 10 22:12:20 crc kubenswrapper[4919]: I0310 22:12:20.945756 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-tbkwn" Mar 10 22:12:20 crc kubenswrapper[4919]: I0310 22:12:20.957696 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-k96hb" Mar 10 22:12:20 crc kubenswrapper[4919]: I0310 22:12:20.967193 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553012-5ftd6" Mar 10 22:12:20 crc kubenswrapper[4919]: I0310 22:12:20.973661 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b20682db-f5f4-4102-b0f9-662aeab1bd2a-credential-keys\") pod \"b20682db-f5f4-4102-b0f9-662aeab1bd2a\" (UID: \"b20682db-f5f4-4102-b0f9-662aeab1bd2a\") " Mar 10 22:12:20 crc kubenswrapper[4919]: I0310 22:12:20.974703 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b20682db-f5f4-4102-b0f9-662aeab1bd2a-combined-ca-bundle\") pod \"b20682db-f5f4-4102-b0f9-662aeab1bd2a\" (UID: \"b20682db-f5f4-4102-b0f9-662aeab1bd2a\") " Mar 10 22:12:20 crc kubenswrapper[4919]: I0310 22:12:20.974748 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b20682db-f5f4-4102-b0f9-662aeab1bd2a-config-data\") pod \"b20682db-f5f4-4102-b0f9-662aeab1bd2a\" (UID: \"b20682db-f5f4-4102-b0f9-662aeab1bd2a\") " Mar 10 22:12:20 crc kubenswrapper[4919]: I0310 22:12:20.974773 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02d206af-330a-4526-8a3e-7826a1acb153-combined-ca-bundle\") pod 
\"02d206af-330a-4526-8a3e-7826a1acb153\" (UID: \"02d206af-330a-4526-8a3e-7826a1acb153\") " Mar 10 22:12:20 crc kubenswrapper[4919]: I0310 22:12:20.974793 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ncn7m\" (UniqueName: \"kubernetes.io/projected/b20682db-f5f4-4102-b0f9-662aeab1bd2a-kube-api-access-ncn7m\") pod \"b20682db-f5f4-4102-b0f9-662aeab1bd2a\" (UID: \"b20682db-f5f4-4102-b0f9-662aeab1bd2a\") " Mar 10 22:12:20 crc kubenswrapper[4919]: I0310 22:12:20.974846 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b20682db-f5f4-4102-b0f9-662aeab1bd2a-fernet-keys\") pod \"b20682db-f5f4-4102-b0f9-662aeab1bd2a\" (UID: \"b20682db-f5f4-4102-b0f9-662aeab1bd2a\") " Mar 10 22:12:20 crc kubenswrapper[4919]: I0310 22:12:20.974889 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/02d206af-330a-4526-8a3e-7826a1acb153-logs\") pod \"02d206af-330a-4526-8a3e-7826a1acb153\" (UID: \"02d206af-330a-4526-8a3e-7826a1acb153\") " Mar 10 22:12:20 crc kubenswrapper[4919]: I0310 22:12:20.974915 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/15160303-4913-49a5-8cd3-e8255ba657f6-db-sync-config-data\") pod \"15160303-4913-49a5-8cd3-e8255ba657f6\" (UID: \"15160303-4913-49a5-8cd3-e8255ba657f6\") " Mar 10 22:12:20 crc kubenswrapper[4919]: I0310 22:12:20.974932 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15160303-4913-49a5-8cd3-e8255ba657f6-combined-ca-bundle\") pod \"15160303-4913-49a5-8cd3-e8255ba657f6\" (UID: \"15160303-4913-49a5-8cd3-e8255ba657f6\") " Mar 10 22:12:20 crc kubenswrapper[4919]: I0310 22:12:20.974947 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"kube-api-access-6qhqf\" (UniqueName: \"kubernetes.io/projected/02d206af-330a-4526-8a3e-7826a1acb153-kube-api-access-6qhqf\") pod \"02d206af-330a-4526-8a3e-7826a1acb153\" (UID: \"02d206af-330a-4526-8a3e-7826a1acb153\") " Mar 10 22:12:20 crc kubenswrapper[4919]: I0310 22:12:20.974980 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b20682db-f5f4-4102-b0f9-662aeab1bd2a-scripts\") pod \"b20682db-f5f4-4102-b0f9-662aeab1bd2a\" (UID: \"b20682db-f5f4-4102-b0f9-662aeab1bd2a\") " Mar 10 22:12:20 crc kubenswrapper[4919]: I0310 22:12:20.974999 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/02d206af-330a-4526-8a3e-7826a1acb153-scripts\") pod \"02d206af-330a-4526-8a3e-7826a1acb153\" (UID: \"02d206af-330a-4526-8a3e-7826a1acb153\") " Mar 10 22:12:20 crc kubenswrapper[4919]: I0310 22:12:20.975021 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wwvfm\" (UniqueName: \"kubernetes.io/projected/15160303-4913-49a5-8cd3-e8255ba657f6-kube-api-access-wwvfm\") pod \"15160303-4913-49a5-8cd3-e8255ba657f6\" (UID: \"15160303-4913-49a5-8cd3-e8255ba657f6\") " Mar 10 22:12:20 crc kubenswrapper[4919]: I0310 22:12:20.975054 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nnq8k\" (UniqueName: \"kubernetes.io/projected/76635374-85f5-4577-866f-5f561c5223df-kube-api-access-nnq8k\") pod \"76635374-85f5-4577-866f-5f561c5223df\" (UID: \"76635374-85f5-4577-866f-5f561c5223df\") " Mar 10 22:12:20 crc kubenswrapper[4919]: I0310 22:12:20.975088 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02d206af-330a-4526-8a3e-7826a1acb153-config-data\") pod \"02d206af-330a-4526-8a3e-7826a1acb153\" (UID: \"02d206af-330a-4526-8a3e-7826a1acb153\") " Mar 10 22:12:20 
crc kubenswrapper[4919]: I0310 22:12:20.975376 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/02d206af-330a-4526-8a3e-7826a1acb153-logs" (OuterVolumeSpecName: "logs") pod "02d206af-330a-4526-8a3e-7826a1acb153" (UID: "02d206af-330a-4526-8a3e-7826a1acb153"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 22:12:20 crc kubenswrapper[4919]: I0310 22:12:20.975801 4919 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/02d206af-330a-4526-8a3e-7826a1acb153-logs\") on node \"crc\" DevicePath \"\"" Mar 10 22:12:20 crc kubenswrapper[4919]: I0310 22:12:20.984729 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15160303-4913-49a5-8cd3-e8255ba657f6-kube-api-access-wwvfm" (OuterVolumeSpecName: "kube-api-access-wwvfm") pod "15160303-4913-49a5-8cd3-e8255ba657f6" (UID: "15160303-4913-49a5-8cd3-e8255ba657f6"). InnerVolumeSpecName "kube-api-access-wwvfm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 22:12:20 crc kubenswrapper[4919]: I0310 22:12:20.985784 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b20682db-f5f4-4102-b0f9-662aeab1bd2a-scripts" (OuterVolumeSpecName: "scripts") pod "b20682db-f5f4-4102-b0f9-662aeab1bd2a" (UID: "b20682db-f5f4-4102-b0f9-662aeab1bd2a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:12:20 crc kubenswrapper[4919]: I0310 22:12:20.986021 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76635374-85f5-4577-866f-5f561c5223df-kube-api-access-nnq8k" (OuterVolumeSpecName: "kube-api-access-nnq8k") pod "76635374-85f5-4577-866f-5f561c5223df" (UID: "76635374-85f5-4577-866f-5f561c5223df"). InnerVolumeSpecName "kube-api-access-nnq8k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 22:12:20 crc kubenswrapper[4919]: I0310 22:12:20.987067 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b20682db-f5f4-4102-b0f9-662aeab1bd2a-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "b20682db-f5f4-4102-b0f9-662aeab1bd2a" (UID: "b20682db-f5f4-4102-b0f9-662aeab1bd2a"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:12:20 crc kubenswrapper[4919]: I0310 22:12:20.993440 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b20682db-f5f4-4102-b0f9-662aeab1bd2a-kube-api-access-ncn7m" (OuterVolumeSpecName: "kube-api-access-ncn7m") pod "b20682db-f5f4-4102-b0f9-662aeab1bd2a" (UID: "b20682db-f5f4-4102-b0f9-662aeab1bd2a"). InnerVolumeSpecName "kube-api-access-ncn7m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 22:12:20 crc kubenswrapper[4919]: I0310 22:12:20.993512 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02d206af-330a-4526-8a3e-7826a1acb153-scripts" (OuterVolumeSpecName: "scripts") pod "02d206af-330a-4526-8a3e-7826a1acb153" (UID: "02d206af-330a-4526-8a3e-7826a1acb153"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:12:20 crc kubenswrapper[4919]: I0310 22:12:20.994281 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15160303-4913-49a5-8cd3-e8255ba657f6-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "15160303-4913-49a5-8cd3-e8255ba657f6" (UID: "15160303-4913-49a5-8cd3-e8255ba657f6"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:12:20 crc kubenswrapper[4919]: I0310 22:12:20.995174 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02d206af-330a-4526-8a3e-7826a1acb153-kube-api-access-6qhqf" (OuterVolumeSpecName: "kube-api-access-6qhqf") pod "02d206af-330a-4526-8a3e-7826a1acb153" (UID: "02d206af-330a-4526-8a3e-7826a1acb153"). InnerVolumeSpecName "kube-api-access-6qhqf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 22:12:20 crc kubenswrapper[4919]: I0310 22:12:20.995512 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b20682db-f5f4-4102-b0f9-662aeab1bd2a-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "b20682db-f5f4-4102-b0f9-662aeab1bd2a" (UID: "b20682db-f5f4-4102-b0f9-662aeab1bd2a"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:12:21 crc kubenswrapper[4919]: I0310 22:12:21.036503 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15160303-4913-49a5-8cd3-e8255ba657f6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "15160303-4913-49a5-8cd3-e8255ba657f6" (UID: "15160303-4913-49a5-8cd3-e8255ba657f6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:12:21 crc kubenswrapper[4919]: I0310 22:12:21.047543 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b20682db-f5f4-4102-b0f9-662aeab1bd2a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b20682db-f5f4-4102-b0f9-662aeab1bd2a" (UID: "b20682db-f5f4-4102-b0f9-662aeab1bd2a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:12:21 crc kubenswrapper[4919]: I0310 22:12:21.048204 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553012-5ftd6" event={"ID":"76635374-85f5-4577-866f-5f561c5223df","Type":"ContainerDied","Data":"e1ce6268e221eb58f0ddb4abaf2f69797859a5b6d1868b20748fcb47388dabd7"} Mar 10 22:12:21 crc kubenswrapper[4919]: I0310 22:12:21.048243 4919 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e1ce6268e221eb58f0ddb4abaf2f69797859a5b6d1868b20748fcb47388dabd7" Mar 10 22:12:21 crc kubenswrapper[4919]: I0310 22:12:21.048306 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553012-5ftd6" Mar 10 22:12:21 crc kubenswrapper[4919]: I0310 22:12:21.054922 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02d206af-330a-4526-8a3e-7826a1acb153-config-data" (OuterVolumeSpecName: "config-data") pod "02d206af-330a-4526-8a3e-7826a1acb153" (UID: "02d206af-330a-4526-8a3e-7826a1acb153"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:12:21 crc kubenswrapper[4919]: I0310 22:12:21.055487 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-q9dwb" event={"ID":"02d206af-330a-4526-8a3e-7826a1acb153","Type":"ContainerDied","Data":"eb052bc847f69c3de26e1e830915dc4812075b65e200888bd4dbf9d33ac585b3"} Mar 10 22:12:21 crc kubenswrapper[4919]: I0310 22:12:21.055524 4919 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eb052bc847f69c3de26e1e830915dc4812075b65e200888bd4dbf9d33ac585b3" Mar 10 22:12:21 crc kubenswrapper[4919]: I0310 22:12:21.055583 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-q9dwb" Mar 10 22:12:21 crc kubenswrapper[4919]: I0310 22:12:21.057825 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-tbkwn" event={"ID":"15160303-4913-49a5-8cd3-e8255ba657f6","Type":"ContainerDied","Data":"bd4430a632a18966644e9e9420a2112f3db08d4877a829c963e20f4026763cbe"} Mar 10 22:12:21 crc kubenswrapper[4919]: I0310 22:12:21.057855 4919 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bd4430a632a18966644e9e9420a2112f3db08d4877a829c963e20f4026763cbe" Mar 10 22:12:21 crc kubenswrapper[4919]: I0310 22:12:21.057899 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-tbkwn" Mar 10 22:12:21 crc kubenswrapper[4919]: I0310 22:12:21.066656 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-k96hb" event={"ID":"b20682db-f5f4-4102-b0f9-662aeab1bd2a","Type":"ContainerDied","Data":"7c01b8685cd9ef00fa103abdc707256996d330405c05e1b8f72c7f20dc2e6f6e"} Mar 10 22:12:21 crc kubenswrapper[4919]: I0310 22:12:21.066692 4919 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7c01b8685cd9ef00fa103abdc707256996d330405c05e1b8f72c7f20dc2e6f6e" Mar 10 22:12:21 crc kubenswrapper[4919]: I0310 22:12:21.066760 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-k96hb" Mar 10 22:12:21 crc kubenswrapper[4919]: I0310 22:12:21.077048 4919 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b20682db-f5f4-4102-b0f9-662aeab1bd2a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 22:12:21 crc kubenswrapper[4919]: I0310 22:12:21.077072 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ncn7m\" (UniqueName: \"kubernetes.io/projected/b20682db-f5f4-4102-b0f9-662aeab1bd2a-kube-api-access-ncn7m\") on node \"crc\" DevicePath \"\"" Mar 10 22:12:21 crc kubenswrapper[4919]: I0310 22:12:21.077086 4919 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b20682db-f5f4-4102-b0f9-662aeab1bd2a-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 10 22:12:21 crc kubenswrapper[4919]: I0310 22:12:21.077097 4919 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/15160303-4913-49a5-8cd3-e8255ba657f6-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 22:12:21 crc kubenswrapper[4919]: I0310 22:12:21.077134 4919 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15160303-4913-49a5-8cd3-e8255ba657f6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 22:12:21 crc kubenswrapper[4919]: I0310 22:12:21.077149 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6qhqf\" (UniqueName: \"kubernetes.io/projected/02d206af-330a-4526-8a3e-7826a1acb153-kube-api-access-6qhqf\") on node \"crc\" DevicePath \"\"" Mar 10 22:12:21 crc kubenswrapper[4919]: I0310 22:12:21.077160 4919 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b20682db-f5f4-4102-b0f9-662aeab1bd2a-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 22:12:21 crc 
kubenswrapper[4919]: I0310 22:12:21.077172 4919 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/02d206af-330a-4526-8a3e-7826a1acb153-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 22:12:21 crc kubenswrapper[4919]: I0310 22:12:21.077183 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wwvfm\" (UniqueName: \"kubernetes.io/projected/15160303-4913-49a5-8cd3-e8255ba657f6-kube-api-access-wwvfm\") on node \"crc\" DevicePath \"\"" Mar 10 22:12:21 crc kubenswrapper[4919]: I0310 22:12:21.077194 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nnq8k\" (UniqueName: \"kubernetes.io/projected/76635374-85f5-4577-866f-5f561c5223df-kube-api-access-nnq8k\") on node \"crc\" DevicePath \"\"" Mar 10 22:12:21 crc kubenswrapper[4919]: I0310 22:12:21.077204 4919 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02d206af-330a-4526-8a3e-7826a1acb153-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 22:12:21 crc kubenswrapper[4919]: I0310 22:12:21.077215 4919 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b20682db-f5f4-4102-b0f9-662aeab1bd2a-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 10 22:12:21 crc kubenswrapper[4919]: I0310 22:12:21.085608 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b20682db-f5f4-4102-b0f9-662aeab1bd2a-config-data" (OuterVolumeSpecName: "config-data") pod "b20682db-f5f4-4102-b0f9-662aeab1bd2a" (UID: "b20682db-f5f4-4102-b0f9-662aeab1bd2a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:12:21 crc kubenswrapper[4919]: I0310 22:12:21.116833 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02d206af-330a-4526-8a3e-7826a1acb153-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "02d206af-330a-4526-8a3e-7826a1acb153" (UID: "02d206af-330a-4526-8a3e-7826a1acb153"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:12:21 crc kubenswrapper[4919]: I0310 22:12:21.178680 4919 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b20682db-f5f4-4102-b0f9-662aeab1bd2a-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 22:12:21 crc kubenswrapper[4919]: I0310 22:12:21.178720 4919 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02d206af-330a-4526-8a3e-7826a1acb153-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 22:12:21 crc kubenswrapper[4919]: I0310 22:12:21.347039 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-7d5f6c4bb6-46xtp"] Mar 10 22:12:21 crc kubenswrapper[4919]: E0310 22:12:21.347441 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02d206af-330a-4526-8a3e-7826a1acb153" containerName="placement-db-sync" Mar 10 22:12:21 crc kubenswrapper[4919]: I0310 22:12:21.347463 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="02d206af-330a-4526-8a3e-7826a1acb153" containerName="placement-db-sync" Mar 10 22:12:21 crc kubenswrapper[4919]: E0310 22:12:21.347490 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b20682db-f5f4-4102-b0f9-662aeab1bd2a" containerName="keystone-bootstrap" Mar 10 22:12:21 crc kubenswrapper[4919]: I0310 22:12:21.347498 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="b20682db-f5f4-4102-b0f9-662aeab1bd2a" 
containerName="keystone-bootstrap" Mar 10 22:12:21 crc kubenswrapper[4919]: E0310 22:12:21.347518 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15160303-4913-49a5-8cd3-e8255ba657f6" containerName="barbican-db-sync" Mar 10 22:12:21 crc kubenswrapper[4919]: I0310 22:12:21.347529 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="15160303-4913-49a5-8cd3-e8255ba657f6" containerName="barbican-db-sync" Mar 10 22:12:21 crc kubenswrapper[4919]: E0310 22:12:21.347541 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76635374-85f5-4577-866f-5f561c5223df" containerName="oc" Mar 10 22:12:21 crc kubenswrapper[4919]: I0310 22:12:21.347551 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="76635374-85f5-4577-866f-5f561c5223df" containerName="oc" Mar 10 22:12:21 crc kubenswrapper[4919]: I0310 22:12:21.347778 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="b20682db-f5f4-4102-b0f9-662aeab1bd2a" containerName="keystone-bootstrap" Mar 10 22:12:21 crc kubenswrapper[4919]: I0310 22:12:21.347810 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="76635374-85f5-4577-866f-5f561c5223df" containerName="oc" Mar 10 22:12:21 crc kubenswrapper[4919]: I0310 22:12:21.347831 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="15160303-4913-49a5-8cd3-e8255ba657f6" containerName="barbican-db-sync" Mar 10 22:12:21 crc kubenswrapper[4919]: I0310 22:12:21.347849 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="02d206af-330a-4526-8a3e-7826a1acb153" containerName="placement-db-sync" Mar 10 22:12:21 crc kubenswrapper[4919]: I0310 22:12:21.349030 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-7d5f6c4bb6-46xtp" Mar 10 22:12:21 crc kubenswrapper[4919]: I0310 22:12:21.357906 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Mar 10 22:12:21 crc kubenswrapper[4919]: I0310 22:12:21.358122 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-jntrm" Mar 10 22:12:21 crc kubenswrapper[4919]: I0310 22:12:21.358624 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Mar 10 22:12:21 crc kubenswrapper[4919]: I0310 22:12:21.396488 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-7d5f6c4bb6-46xtp"] Mar 10 22:12:21 crc kubenswrapper[4919]: I0310 22:12:21.429534 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-84fbf8d4df-qnkcp"] Mar 10 22:12:21 crc kubenswrapper[4919]: I0310 22:12:21.432559 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-84fbf8d4df-qnkcp" Mar 10 22:12:21 crc kubenswrapper[4919]: I0310 22:12:21.439655 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Mar 10 22:12:21 crc kubenswrapper[4919]: I0310 22:12:21.443513 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-84fbf8d4df-qnkcp"] Mar 10 22:12:21 crc kubenswrapper[4919]: I0310 22:12:21.487225 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/98b65b94-9415-4a0b-acb7-760a536d250a-config-data-custom\") pod \"barbican-keystone-listener-7d5f6c4bb6-46xtp\" (UID: \"98b65b94-9415-4a0b-acb7-760a536d250a\") " pod="openstack/barbican-keystone-listener-7d5f6c4bb6-46xtp" Mar 10 22:12:21 crc kubenswrapper[4919]: I0310 22:12:21.487266 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qbwbz\" (UniqueName: \"kubernetes.io/projected/98b65b94-9415-4a0b-acb7-760a536d250a-kube-api-access-qbwbz\") pod \"barbican-keystone-listener-7d5f6c4bb6-46xtp\" (UID: \"98b65b94-9415-4a0b-acb7-760a536d250a\") " pod="openstack/barbican-keystone-listener-7d5f6c4bb6-46xtp" Mar 10 22:12:21 crc kubenswrapper[4919]: I0310 22:12:21.487969 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98b65b94-9415-4a0b-acb7-760a536d250a-combined-ca-bundle\") pod \"barbican-keystone-listener-7d5f6c4bb6-46xtp\" (UID: \"98b65b94-9415-4a0b-acb7-760a536d250a\") " pod="openstack/barbican-keystone-listener-7d5f6c4bb6-46xtp" Mar 10 22:12:21 crc kubenswrapper[4919]: I0310 22:12:21.488002 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/98b65b94-9415-4a0b-acb7-760a536d250a-config-data\") pod \"barbican-keystone-listener-7d5f6c4bb6-46xtp\" (UID: \"98b65b94-9415-4a0b-acb7-760a536d250a\") " pod="openstack/barbican-keystone-listener-7d5f6c4bb6-46xtp" Mar 10 22:12:21 crc kubenswrapper[4919]: I0310 22:12:21.488038 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/98b65b94-9415-4a0b-acb7-760a536d250a-logs\") pod \"barbican-keystone-listener-7d5f6c4bb6-46xtp\" (UID: \"98b65b94-9415-4a0b-acb7-760a536d250a\") " pod="openstack/barbican-keystone-listener-7d5f6c4bb6-46xtp" Mar 10 22:12:21 crc kubenswrapper[4919]: I0310 22:12:21.535147 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-66ff44db99-v77sd"] Mar 10 22:12:21 crc kubenswrapper[4919]: I0310 22:12:21.535459 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-66ff44db99-v77sd" podUID="43b80b86-6652-4a1a-8be6-7a5643e0bb45" containerName="dnsmasq-dns" containerID="cri-o://1ded035287266e127df1c58824ea62be42b420eea53e61da848af410bacfa67d" gracePeriod=10 Mar 10 22:12:21 crc kubenswrapper[4919]: I0310 22:12:21.541555 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-66ff44db99-v77sd" Mar 10 22:12:21 crc kubenswrapper[4919]: I0310 22:12:21.567240 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 10 22:12:21 crc kubenswrapper[4919]: I0310 22:12:21.584338 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-98cfc95fc-fjth4"] Mar 10 22:12:21 crc kubenswrapper[4919]: I0310 22:12:21.587181 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-98cfc95fc-fjth4" Mar 10 22:12:21 crc kubenswrapper[4919]: I0310 22:12:21.592857 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/98b65b94-9415-4a0b-acb7-760a536d250a-config-data-custom\") pod \"barbican-keystone-listener-7d5f6c4bb6-46xtp\" (UID: \"98b65b94-9415-4a0b-acb7-760a536d250a\") " pod="openstack/barbican-keystone-listener-7d5f6c4bb6-46xtp" Mar 10 22:12:21 crc kubenswrapper[4919]: I0310 22:12:21.592897 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qbwbz\" (UniqueName: \"kubernetes.io/projected/98b65b94-9415-4a0b-acb7-760a536d250a-kube-api-access-qbwbz\") pod \"barbican-keystone-listener-7d5f6c4bb6-46xtp\" (UID: \"98b65b94-9415-4a0b-acb7-760a536d250a\") " pod="openstack/barbican-keystone-listener-7d5f6c4bb6-46xtp" Mar 10 22:12:21 crc kubenswrapper[4919]: I0310 22:12:21.592952 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c3d8de2-078f-4fb9-a9b7-9c8b43fb5758-combined-ca-bundle\") pod \"barbican-worker-84fbf8d4df-qnkcp\" (UID: \"8c3d8de2-078f-4fb9-a9b7-9c8b43fb5758\") " pod="openstack/barbican-worker-84fbf8d4df-qnkcp" Mar 10 22:12:21 crc kubenswrapper[4919]: I0310 22:12:21.593008 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8c3d8de2-078f-4fb9-a9b7-9c8b43fb5758-config-data-custom\") pod \"barbican-worker-84fbf8d4df-qnkcp\" (UID: \"8c3d8de2-078f-4fb9-a9b7-9c8b43fb5758\") " pod="openstack/barbican-worker-84fbf8d4df-qnkcp" Mar 10 22:12:21 crc kubenswrapper[4919]: I0310 22:12:21.593037 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/98b65b94-9415-4a0b-acb7-760a536d250a-combined-ca-bundle\") pod \"barbican-keystone-listener-7d5f6c4bb6-46xtp\" (UID: \"98b65b94-9415-4a0b-acb7-760a536d250a\") " pod="openstack/barbican-keystone-listener-7d5f6c4bb6-46xtp" Mar 10 22:12:21 crc kubenswrapper[4919]: I0310 22:12:21.605735 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/98b65b94-9415-4a0b-acb7-760a536d250a-config-data\") pod \"barbican-keystone-listener-7d5f6c4bb6-46xtp\" (UID: \"98b65b94-9415-4a0b-acb7-760a536d250a\") " pod="openstack/barbican-keystone-listener-7d5f6c4bb6-46xtp" Mar 10 22:12:21 crc kubenswrapper[4919]: I0310 22:12:21.605793 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c3d8de2-078f-4fb9-a9b7-9c8b43fb5758-logs\") pod \"barbican-worker-84fbf8d4df-qnkcp\" (UID: \"8c3d8de2-078f-4fb9-a9b7-9c8b43fb5758\") " pod="openstack/barbican-worker-84fbf8d4df-qnkcp" Mar 10 22:12:21 crc kubenswrapper[4919]: I0310 22:12:21.612870 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c3d8de2-078f-4fb9-a9b7-9c8b43fb5758-config-data\") pod \"barbican-worker-84fbf8d4df-qnkcp\" (UID: \"8c3d8de2-078f-4fb9-a9b7-9c8b43fb5758\") " pod="openstack/barbican-worker-84fbf8d4df-qnkcp" Mar 10 22:12:21 crc kubenswrapper[4919]: I0310 22:12:21.612920 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/98b65b94-9415-4a0b-acb7-760a536d250a-logs\") pod \"barbican-keystone-listener-7d5f6c4bb6-46xtp\" (UID: \"98b65b94-9415-4a0b-acb7-760a536d250a\") " pod="openstack/barbican-keystone-listener-7d5f6c4bb6-46xtp" Mar 10 22:12:21 crc kubenswrapper[4919]: I0310 22:12:21.612945 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-wjdrz\" (UniqueName: \"kubernetes.io/projected/8c3d8de2-078f-4fb9-a9b7-9c8b43fb5758-kube-api-access-wjdrz\") pod \"barbican-worker-84fbf8d4df-qnkcp\" (UID: \"8c3d8de2-078f-4fb9-a9b7-9c8b43fb5758\") " pod="openstack/barbican-worker-84fbf8d4df-qnkcp" Mar 10 22:12:21 crc kubenswrapper[4919]: I0310 22:12:21.613204 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98b65b94-9415-4a0b-acb7-760a536d250a-combined-ca-bundle\") pod \"barbican-keystone-listener-7d5f6c4bb6-46xtp\" (UID: \"98b65b94-9415-4a0b-acb7-760a536d250a\") " pod="openstack/barbican-keystone-listener-7d5f6c4bb6-46xtp" Mar 10 22:12:21 crc kubenswrapper[4919]: I0310 22:12:21.613277 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/98b65b94-9415-4a0b-acb7-760a536d250a-logs\") pod \"barbican-keystone-listener-7d5f6c4bb6-46xtp\" (UID: \"98b65b94-9415-4a0b-acb7-760a536d250a\") " pod="openstack/barbican-keystone-listener-7d5f6c4bb6-46xtp" Mar 10 22:12:21 crc kubenswrapper[4919]: I0310 22:12:21.616632 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/98b65b94-9415-4a0b-acb7-760a536d250a-config-data-custom\") pod \"barbican-keystone-listener-7d5f6c4bb6-46xtp\" (UID: \"98b65b94-9415-4a0b-acb7-760a536d250a\") " pod="openstack/barbican-keystone-listener-7d5f6c4bb6-46xtp" Mar 10 22:12:21 crc kubenswrapper[4919]: I0310 22:12:21.631585 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/98b65b94-9415-4a0b-acb7-760a536d250a-config-data\") pod \"barbican-keystone-listener-7d5f6c4bb6-46xtp\" (UID: \"98b65b94-9415-4a0b-acb7-760a536d250a\") " pod="openstack/barbican-keystone-listener-7d5f6c4bb6-46xtp" Mar 10 22:12:21 crc kubenswrapper[4919]: I0310 22:12:21.632022 4919 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-98cfc95fc-fjth4"] Mar 10 22:12:21 crc kubenswrapper[4919]: I0310 22:12:21.645729 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qbwbz\" (UniqueName: \"kubernetes.io/projected/98b65b94-9415-4a0b-acb7-760a536d250a-kube-api-access-qbwbz\") pod \"barbican-keystone-listener-7d5f6c4bb6-46xtp\" (UID: \"98b65b94-9415-4a0b-acb7-760a536d250a\") " pod="openstack/barbican-keystone-listener-7d5f6c4bb6-46xtp" Mar 10 22:12:21 crc kubenswrapper[4919]: I0310 22:12:21.660288 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-5bdd68856d-p865v"] Mar 10 22:12:21 crc kubenswrapper[4919]: I0310 22:12:21.664918 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5bdd68856d-p865v" Mar 10 22:12:21 crc kubenswrapper[4919]: I0310 22:12:21.669569 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Mar 10 22:12:21 crc kubenswrapper[4919]: I0310 22:12:21.698049 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5bdd68856d-p865v"] Mar 10 22:12:21 crc kubenswrapper[4919]: I0310 22:12:21.711361 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 10 22:12:21 crc kubenswrapper[4919]: I0310 22:12:21.711479 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-7d5f6c4bb6-46xtp" Mar 10 22:12:21 crc kubenswrapper[4919]: I0310 22:12:21.716638 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0c6e8e16-fa38-44f1-8a47-c6130972b034-ovsdbserver-sb\") pod \"dnsmasq-dns-98cfc95fc-fjth4\" (UID: \"0c6e8e16-fa38-44f1-8a47-c6130972b034\") " pod="openstack/dnsmasq-dns-98cfc95fc-fjth4" Mar 10 22:12:21 crc kubenswrapper[4919]: I0310 22:12:21.716691 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0c6e8e16-fa38-44f1-8a47-c6130972b034-dns-swift-storage-0\") pod \"dnsmasq-dns-98cfc95fc-fjth4\" (UID: \"0c6e8e16-fa38-44f1-8a47-c6130972b034\") " pod="openstack/dnsmasq-dns-98cfc95fc-fjth4" Mar 10 22:12:21 crc kubenswrapper[4919]: I0310 22:12:21.716785 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c3d8de2-078f-4fb9-a9b7-9c8b43fb5758-combined-ca-bundle\") pod \"barbican-worker-84fbf8d4df-qnkcp\" (UID: \"8c3d8de2-078f-4fb9-a9b7-9c8b43fb5758\") " pod="openstack/barbican-worker-84fbf8d4df-qnkcp" Mar 10 22:12:21 crc kubenswrapper[4919]: I0310 22:12:21.716832 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0c6e8e16-fa38-44f1-8a47-c6130972b034-config\") pod \"dnsmasq-dns-98cfc95fc-fjth4\" (UID: \"0c6e8e16-fa38-44f1-8a47-c6130972b034\") " pod="openstack/dnsmasq-dns-98cfc95fc-fjth4" Mar 10 22:12:21 crc kubenswrapper[4919]: I0310 22:12:21.716870 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8c3d8de2-078f-4fb9-a9b7-9c8b43fb5758-config-data-custom\") pod 
\"barbican-worker-84fbf8d4df-qnkcp\" (UID: \"8c3d8de2-078f-4fb9-a9b7-9c8b43fb5758\") " pod="openstack/barbican-worker-84fbf8d4df-qnkcp" Mar 10 22:12:21 crc kubenswrapper[4919]: I0310 22:12:21.716923 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0c6e8e16-fa38-44f1-8a47-c6130972b034-dns-svc\") pod \"dnsmasq-dns-98cfc95fc-fjth4\" (UID: \"0c6e8e16-fa38-44f1-8a47-c6130972b034\") " pod="openstack/dnsmasq-dns-98cfc95fc-fjth4" Mar 10 22:12:21 crc kubenswrapper[4919]: I0310 22:12:21.716960 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c3d8de2-078f-4fb9-a9b7-9c8b43fb5758-logs\") pod \"barbican-worker-84fbf8d4df-qnkcp\" (UID: \"8c3d8de2-078f-4fb9-a9b7-9c8b43fb5758\") " pod="openstack/barbican-worker-84fbf8d4df-qnkcp" Mar 10 22:12:21 crc kubenswrapper[4919]: I0310 22:12:21.716988 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0c6e8e16-fa38-44f1-8a47-c6130972b034-ovsdbserver-nb\") pod \"dnsmasq-dns-98cfc95fc-fjth4\" (UID: \"0c6e8e16-fa38-44f1-8a47-c6130972b034\") " pod="openstack/dnsmasq-dns-98cfc95fc-fjth4" Mar 10 22:12:21 crc kubenswrapper[4919]: I0310 22:12:21.717044 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kp8gq\" (UniqueName: \"kubernetes.io/projected/0c6e8e16-fa38-44f1-8a47-c6130972b034-kube-api-access-kp8gq\") pod \"dnsmasq-dns-98cfc95fc-fjth4\" (UID: \"0c6e8e16-fa38-44f1-8a47-c6130972b034\") " pod="openstack/dnsmasq-dns-98cfc95fc-fjth4" Mar 10 22:12:21 crc kubenswrapper[4919]: I0310 22:12:21.717069 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c3d8de2-078f-4fb9-a9b7-9c8b43fb5758-config-data\") pod 
\"barbican-worker-84fbf8d4df-qnkcp\" (UID: \"8c3d8de2-078f-4fb9-a9b7-9c8b43fb5758\") " pod="openstack/barbican-worker-84fbf8d4df-qnkcp" Mar 10 22:12:21 crc kubenswrapper[4919]: I0310 22:12:21.717107 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wjdrz\" (UniqueName: \"kubernetes.io/projected/8c3d8de2-078f-4fb9-a9b7-9c8b43fb5758-kube-api-access-wjdrz\") pod \"barbican-worker-84fbf8d4df-qnkcp\" (UID: \"8c3d8de2-078f-4fb9-a9b7-9c8b43fb5758\") " pod="openstack/barbican-worker-84fbf8d4df-qnkcp" Mar 10 22:12:21 crc kubenswrapper[4919]: I0310 22:12:21.718302 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c3d8de2-078f-4fb9-a9b7-9c8b43fb5758-logs\") pod \"barbican-worker-84fbf8d4df-qnkcp\" (UID: \"8c3d8de2-078f-4fb9-a9b7-9c8b43fb5758\") " pod="openstack/barbican-worker-84fbf8d4df-qnkcp" Mar 10 22:12:21 crc kubenswrapper[4919]: I0310 22:12:21.732765 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c3d8de2-078f-4fb9-a9b7-9c8b43fb5758-config-data\") pod \"barbican-worker-84fbf8d4df-qnkcp\" (UID: \"8c3d8de2-078f-4fb9-a9b7-9c8b43fb5758\") " pod="openstack/barbican-worker-84fbf8d4df-qnkcp" Mar 10 22:12:21 crc kubenswrapper[4919]: I0310 22:12:21.743156 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c3d8de2-078f-4fb9-a9b7-9c8b43fb5758-combined-ca-bundle\") pod \"barbican-worker-84fbf8d4df-qnkcp\" (UID: \"8c3d8de2-078f-4fb9-a9b7-9c8b43fb5758\") " pod="openstack/barbican-worker-84fbf8d4df-qnkcp" Mar 10 22:12:21 crc kubenswrapper[4919]: I0310 22:12:21.758673 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8c3d8de2-078f-4fb9-a9b7-9c8b43fb5758-config-data-custom\") pod \"barbican-worker-84fbf8d4df-qnkcp\" (UID: 
\"8c3d8de2-078f-4fb9-a9b7-9c8b43fb5758\") " pod="openstack/barbican-worker-84fbf8d4df-qnkcp" Mar 10 22:12:21 crc kubenswrapper[4919]: I0310 22:12:21.763732 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wjdrz\" (UniqueName: \"kubernetes.io/projected/8c3d8de2-078f-4fb9-a9b7-9c8b43fb5758-kube-api-access-wjdrz\") pod \"barbican-worker-84fbf8d4df-qnkcp\" (UID: \"8c3d8de2-078f-4fb9-a9b7-9c8b43fb5758\") " pod="openstack/barbican-worker-84fbf8d4df-qnkcp" Mar 10 22:12:21 crc kubenswrapper[4919]: I0310 22:12:21.784953 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-84fbf8d4df-qnkcp" Mar 10 22:12:21 crc kubenswrapper[4919]: I0310 22:12:21.821146 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1372c543-234e-41a0-936c-d17cdf422557-config-data\") pod \"barbican-api-5bdd68856d-p865v\" (UID: \"1372c543-234e-41a0-936c-d17cdf422557\") " pod="openstack/barbican-api-5bdd68856d-p865v" Mar 10 22:12:21 crc kubenswrapper[4919]: I0310 22:12:21.821202 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kp8gq\" (UniqueName: \"kubernetes.io/projected/0c6e8e16-fa38-44f1-8a47-c6130972b034-kube-api-access-kp8gq\") pod \"dnsmasq-dns-98cfc95fc-fjth4\" (UID: \"0c6e8e16-fa38-44f1-8a47-c6130972b034\") " pod="openstack/dnsmasq-dns-98cfc95fc-fjth4" Mar 10 22:12:21 crc kubenswrapper[4919]: I0310 22:12:21.821261 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djwq4\" (UniqueName: \"kubernetes.io/projected/1372c543-234e-41a0-936c-d17cdf422557-kube-api-access-djwq4\") pod \"barbican-api-5bdd68856d-p865v\" (UID: \"1372c543-234e-41a0-936c-d17cdf422557\") " pod="openstack/barbican-api-5bdd68856d-p865v" Mar 10 22:12:21 crc kubenswrapper[4919]: I0310 22:12:21.821339 4919 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1372c543-234e-41a0-936c-d17cdf422557-combined-ca-bundle\") pod \"barbican-api-5bdd68856d-p865v\" (UID: \"1372c543-234e-41a0-936c-d17cdf422557\") " pod="openstack/barbican-api-5bdd68856d-p865v" Mar 10 22:12:21 crc kubenswrapper[4919]: I0310 22:12:21.821406 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1372c543-234e-41a0-936c-d17cdf422557-config-data-custom\") pod \"barbican-api-5bdd68856d-p865v\" (UID: \"1372c543-234e-41a0-936c-d17cdf422557\") " pod="openstack/barbican-api-5bdd68856d-p865v" Mar 10 22:12:21 crc kubenswrapper[4919]: I0310 22:12:21.821438 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0c6e8e16-fa38-44f1-8a47-c6130972b034-ovsdbserver-sb\") pod \"dnsmasq-dns-98cfc95fc-fjth4\" (UID: \"0c6e8e16-fa38-44f1-8a47-c6130972b034\") " pod="openstack/dnsmasq-dns-98cfc95fc-fjth4" Mar 10 22:12:21 crc kubenswrapper[4919]: I0310 22:12:21.821459 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0c6e8e16-fa38-44f1-8a47-c6130972b034-dns-swift-storage-0\") pod \"dnsmasq-dns-98cfc95fc-fjth4\" (UID: \"0c6e8e16-fa38-44f1-8a47-c6130972b034\") " pod="openstack/dnsmasq-dns-98cfc95fc-fjth4" Mar 10 22:12:21 crc kubenswrapper[4919]: I0310 22:12:21.821485 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1372c543-234e-41a0-936c-d17cdf422557-logs\") pod \"barbican-api-5bdd68856d-p865v\" (UID: \"1372c543-234e-41a0-936c-d17cdf422557\") " pod="openstack/barbican-api-5bdd68856d-p865v" Mar 10 22:12:21 crc kubenswrapper[4919]: I0310 22:12:21.821513 
4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0c6e8e16-fa38-44f1-8a47-c6130972b034-config\") pod \"dnsmasq-dns-98cfc95fc-fjth4\" (UID: \"0c6e8e16-fa38-44f1-8a47-c6130972b034\") " pod="openstack/dnsmasq-dns-98cfc95fc-fjth4" Mar 10 22:12:21 crc kubenswrapper[4919]: I0310 22:12:21.821547 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0c6e8e16-fa38-44f1-8a47-c6130972b034-dns-svc\") pod \"dnsmasq-dns-98cfc95fc-fjth4\" (UID: \"0c6e8e16-fa38-44f1-8a47-c6130972b034\") " pod="openstack/dnsmasq-dns-98cfc95fc-fjth4" Mar 10 22:12:21 crc kubenswrapper[4919]: I0310 22:12:21.821569 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0c6e8e16-fa38-44f1-8a47-c6130972b034-ovsdbserver-nb\") pod \"dnsmasq-dns-98cfc95fc-fjth4\" (UID: \"0c6e8e16-fa38-44f1-8a47-c6130972b034\") " pod="openstack/dnsmasq-dns-98cfc95fc-fjth4" Mar 10 22:12:21 crc kubenswrapper[4919]: I0310 22:12:21.822382 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0c6e8e16-fa38-44f1-8a47-c6130972b034-ovsdbserver-nb\") pod \"dnsmasq-dns-98cfc95fc-fjth4\" (UID: \"0c6e8e16-fa38-44f1-8a47-c6130972b034\") " pod="openstack/dnsmasq-dns-98cfc95fc-fjth4" Mar 10 22:12:21 crc kubenswrapper[4919]: I0310 22:12:21.823327 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0c6e8e16-fa38-44f1-8a47-c6130972b034-ovsdbserver-sb\") pod \"dnsmasq-dns-98cfc95fc-fjth4\" (UID: \"0c6e8e16-fa38-44f1-8a47-c6130972b034\") " pod="openstack/dnsmasq-dns-98cfc95fc-fjth4" Mar 10 22:12:21 crc kubenswrapper[4919]: I0310 22:12:21.823948 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/0c6e8e16-fa38-44f1-8a47-c6130972b034-config\") pod \"dnsmasq-dns-98cfc95fc-fjth4\" (UID: \"0c6e8e16-fa38-44f1-8a47-c6130972b034\") " pod="openstack/dnsmasq-dns-98cfc95fc-fjth4" Mar 10 22:12:21 crc kubenswrapper[4919]: I0310 22:12:21.824146 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0c6e8e16-fa38-44f1-8a47-c6130972b034-dns-swift-storage-0\") pod \"dnsmasq-dns-98cfc95fc-fjth4\" (UID: \"0c6e8e16-fa38-44f1-8a47-c6130972b034\") " pod="openstack/dnsmasq-dns-98cfc95fc-fjth4" Mar 10 22:12:21 crc kubenswrapper[4919]: I0310 22:12:21.824464 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0c6e8e16-fa38-44f1-8a47-c6130972b034-dns-svc\") pod \"dnsmasq-dns-98cfc95fc-fjth4\" (UID: \"0c6e8e16-fa38-44f1-8a47-c6130972b034\") " pod="openstack/dnsmasq-dns-98cfc95fc-fjth4" Mar 10 22:12:21 crc kubenswrapper[4919]: I0310 22:12:21.903785 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kp8gq\" (UniqueName: \"kubernetes.io/projected/0c6e8e16-fa38-44f1-8a47-c6130972b034-kube-api-access-kp8gq\") pod \"dnsmasq-dns-98cfc95fc-fjth4\" (UID: \"0c6e8e16-fa38-44f1-8a47-c6130972b034\") " pod="openstack/dnsmasq-dns-98cfc95fc-fjth4" Mar 10 22:12:21 crc kubenswrapper[4919]: I0310 22:12:21.929704 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1372c543-234e-41a0-936c-d17cdf422557-config-data-custom\") pod \"barbican-api-5bdd68856d-p865v\" (UID: \"1372c543-234e-41a0-936c-d17cdf422557\") " pod="openstack/barbican-api-5bdd68856d-p865v" Mar 10 22:12:21 crc kubenswrapper[4919]: I0310 22:12:21.929911 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1372c543-234e-41a0-936c-d17cdf422557-logs\") pod 
\"barbican-api-5bdd68856d-p865v\" (UID: \"1372c543-234e-41a0-936c-d17cdf422557\") " pod="openstack/barbican-api-5bdd68856d-p865v" Mar 10 22:12:21 crc kubenswrapper[4919]: I0310 22:12:21.930094 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1372c543-234e-41a0-936c-d17cdf422557-config-data\") pod \"barbican-api-5bdd68856d-p865v\" (UID: \"1372c543-234e-41a0-936c-d17cdf422557\") " pod="openstack/barbican-api-5bdd68856d-p865v" Mar 10 22:12:21 crc kubenswrapper[4919]: I0310 22:12:21.930157 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-djwq4\" (UniqueName: \"kubernetes.io/projected/1372c543-234e-41a0-936c-d17cdf422557-kube-api-access-djwq4\") pod \"barbican-api-5bdd68856d-p865v\" (UID: \"1372c543-234e-41a0-936c-d17cdf422557\") " pod="openstack/barbican-api-5bdd68856d-p865v" Mar 10 22:12:21 crc kubenswrapper[4919]: I0310 22:12:21.930213 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1372c543-234e-41a0-936c-d17cdf422557-combined-ca-bundle\") pod \"barbican-api-5bdd68856d-p865v\" (UID: \"1372c543-234e-41a0-936c-d17cdf422557\") " pod="openstack/barbican-api-5bdd68856d-p865v" Mar 10 22:12:21 crc kubenswrapper[4919]: I0310 22:12:21.937863 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1372c543-234e-41a0-936c-d17cdf422557-logs\") pod \"barbican-api-5bdd68856d-p865v\" (UID: \"1372c543-234e-41a0-936c-d17cdf422557\") " pod="openstack/barbican-api-5bdd68856d-p865v" Mar 10 22:12:21 crc kubenswrapper[4919]: I0310 22:12:21.950621 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-98cfc95fc-fjth4" Mar 10 22:12:21 crc kubenswrapper[4919]: I0310 22:12:21.951033 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1372c543-234e-41a0-936c-d17cdf422557-config-data\") pod \"barbican-api-5bdd68856d-p865v\" (UID: \"1372c543-234e-41a0-936c-d17cdf422557\") " pod="openstack/barbican-api-5bdd68856d-p865v" Mar 10 22:12:21 crc kubenswrapper[4919]: I0310 22:12:21.993497 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1372c543-234e-41a0-936c-d17cdf422557-combined-ca-bundle\") pod \"barbican-api-5bdd68856d-p865v\" (UID: \"1372c543-234e-41a0-936c-d17cdf422557\") " pod="openstack/barbican-api-5bdd68856d-p865v" Mar 10 22:12:21 crc kubenswrapper[4919]: I0310 22:12:21.996827 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-djwq4\" (UniqueName: \"kubernetes.io/projected/1372c543-234e-41a0-936c-d17cdf422557-kube-api-access-djwq4\") pod \"barbican-api-5bdd68856d-p865v\" (UID: \"1372c543-234e-41a0-936c-d17cdf422557\") " pod="openstack/barbican-api-5bdd68856d-p865v" Mar 10 22:12:22 crc kubenswrapper[4919]: I0310 22:12:22.001582 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1372c543-234e-41a0-936c-d17cdf422557-config-data-custom\") pod \"barbican-api-5bdd68856d-p865v\" (UID: \"1372c543-234e-41a0-936c-d17cdf422557\") " pod="openstack/barbican-api-5bdd68856d-p865v" Mar 10 22:12:22 crc kubenswrapper[4919]: I0310 22:12:22.134170 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a35812c5-ffc7-4307-ab31-390c9ee39262","Type":"ContainerStarted","Data":"145a769aad9313c4f561fc922e478b28bdd5bfb8840087519d8e93fadeb600ec"} Mar 10 22:12:22 crc kubenswrapper[4919]: I0310 22:12:22.137054 4919 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7050d40c-b959-48a8-b21f-b9f5e308c920","Type":"ContainerStarted","Data":"7afed62878906dabbbce9d8f9449b8bf93815c65c8d53a0cece7e61de2b5794b"} Mar 10 22:12:22 crc kubenswrapper[4919]: I0310 22:12:22.162824 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553006-kvjk2"] Mar 10 22:12:22 crc kubenswrapper[4919]: I0310 22:12:22.170666 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"09d4bc6e-4f9e-4375-a816-2aad9cf376b2","Type":"ContainerStarted","Data":"d0d0d91e4f048967e856ec9ab78e4ae0091d82ddfdeceab13d97dd1e341fe97b"} Mar 10 22:12:22 crc kubenswrapper[4919]: I0310 22:12:22.178594 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553006-kvjk2"] Mar 10 22:12:22 crc kubenswrapper[4919]: I0310 22:12:22.191886 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-599f4d795-pgnpd"] Mar 10 22:12:22 crc kubenswrapper[4919]: I0310 22:12:22.193186 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-599f4d795-pgnpd" Mar 10 22:12:22 crc kubenswrapper[4919]: I0310 22:12:22.201907 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Mar 10 22:12:22 crc kubenswrapper[4919]: I0310 22:12:22.202086 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Mar 10 22:12:22 crc kubenswrapper[4919]: I0310 22:12:22.202194 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 10 22:12:22 crc kubenswrapper[4919]: I0310 22:12:22.202411 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-599f4d795-pgnpd"] Mar 10 22:12:22 crc kubenswrapper[4919]: I0310 22:12:22.202850 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 10 22:12:22 crc kubenswrapper[4919]: I0310 22:12:22.202924 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-nbmmg" Mar 10 22:12:22 crc kubenswrapper[4919]: I0310 22:12:22.220272 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 10 22:12:22 crc kubenswrapper[4919]: I0310 22:12:22.229439 4919 generic.go:334] "Generic (PLEG): container finished" podID="43b80b86-6652-4a1a-8be6-7a5643e0bb45" containerID="1ded035287266e127df1c58824ea62be42b420eea53e61da848af410bacfa67d" exitCode=0 Mar 10 22:12:22 crc kubenswrapper[4919]: I0310 22:12:22.229755 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66ff44db99-v77sd" event={"ID":"43b80b86-6652-4a1a-8be6-7a5643e0bb45","Type":"ContainerDied","Data":"1ded035287266e127df1c58824ea62be42b420eea53e61da848af410bacfa67d"} Mar 10 22:12:22 crc kubenswrapper[4919]: I0310 22:12:22.286796 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-5bdd68856d-p865v" Mar 10 22:12:22 crc kubenswrapper[4919]: I0310 22:12:22.300605 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-76f79c8d94-r2gfk"] Mar 10 22:12:22 crc kubenswrapper[4919]: I0310 22:12:22.302061 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-76f79c8d94-r2gfk" Mar 10 22:12:22 crc kubenswrapper[4919]: I0310 22:12:22.304569 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Mar 10 22:12:22 crc kubenswrapper[4919]: I0310 22:12:22.304765 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Mar 10 22:12:22 crc kubenswrapper[4919]: I0310 22:12:22.304931 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Mar 10 22:12:22 crc kubenswrapper[4919]: I0310 22:12:22.305107 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Mar 10 22:12:22 crc kubenswrapper[4919]: I0310 22:12:22.305286 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-f2pl6" Mar 10 22:12:22 crc kubenswrapper[4919]: I0310 22:12:22.354814 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/408722a8-2c8a-4bda-82d5-1d2f58bda7d7-public-tls-certs\") pod \"keystone-599f4d795-pgnpd\" (UID: \"408722a8-2c8a-4bda-82d5-1d2f58bda7d7\") " pod="openstack/keystone-599f4d795-pgnpd" Mar 10 22:12:22 crc kubenswrapper[4919]: I0310 22:12:22.354951 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/408722a8-2c8a-4bda-82d5-1d2f58bda7d7-scripts\") pod \"keystone-599f4d795-pgnpd\" (UID: 
\"408722a8-2c8a-4bda-82d5-1d2f58bda7d7\") " pod="openstack/keystone-599f4d795-pgnpd" Mar 10 22:12:22 crc kubenswrapper[4919]: I0310 22:12:22.355018 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g4p55\" (UniqueName: \"kubernetes.io/projected/408722a8-2c8a-4bda-82d5-1d2f58bda7d7-kube-api-access-g4p55\") pod \"keystone-599f4d795-pgnpd\" (UID: \"408722a8-2c8a-4bda-82d5-1d2f58bda7d7\") " pod="openstack/keystone-599f4d795-pgnpd" Mar 10 22:12:22 crc kubenswrapper[4919]: I0310 22:12:22.355059 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/408722a8-2c8a-4bda-82d5-1d2f58bda7d7-internal-tls-certs\") pod \"keystone-599f4d795-pgnpd\" (UID: \"408722a8-2c8a-4bda-82d5-1d2f58bda7d7\") " pod="openstack/keystone-599f4d795-pgnpd" Mar 10 22:12:22 crc kubenswrapper[4919]: I0310 22:12:22.355101 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/408722a8-2c8a-4bda-82d5-1d2f58bda7d7-fernet-keys\") pod \"keystone-599f4d795-pgnpd\" (UID: \"408722a8-2c8a-4bda-82d5-1d2f58bda7d7\") " pod="openstack/keystone-599f4d795-pgnpd" Mar 10 22:12:22 crc kubenswrapper[4919]: I0310 22:12:22.355176 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/408722a8-2c8a-4bda-82d5-1d2f58bda7d7-config-data\") pod \"keystone-599f4d795-pgnpd\" (UID: \"408722a8-2c8a-4bda-82d5-1d2f58bda7d7\") " pod="openstack/keystone-599f4d795-pgnpd" Mar 10 22:12:22 crc kubenswrapper[4919]: I0310 22:12:22.355215 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/408722a8-2c8a-4bda-82d5-1d2f58bda7d7-combined-ca-bundle\") pod \"keystone-599f4d795-pgnpd\" 
(UID: \"408722a8-2c8a-4bda-82d5-1d2f58bda7d7\") " pod="openstack/keystone-599f4d795-pgnpd" Mar 10 22:12:22 crc kubenswrapper[4919]: I0310 22:12:22.355236 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/408722a8-2c8a-4bda-82d5-1d2f58bda7d7-credential-keys\") pod \"keystone-599f4d795-pgnpd\" (UID: \"408722a8-2c8a-4bda-82d5-1d2f58bda7d7\") " pod="openstack/keystone-599f4d795-pgnpd" Mar 10 22:12:22 crc kubenswrapper[4919]: I0310 22:12:22.357357 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-76f79c8d94-r2gfk"] Mar 10 22:12:22 crc kubenswrapper[4919]: I0310 22:12:22.378864 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-66ff44db99-v77sd" Mar 10 22:12:22 crc kubenswrapper[4919]: I0310 22:12:22.456470 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7r9zh\" (UniqueName: \"kubernetes.io/projected/43b80b86-6652-4a1a-8be6-7a5643e0bb45-kube-api-access-7r9zh\") pod \"43b80b86-6652-4a1a-8be6-7a5643e0bb45\" (UID: \"43b80b86-6652-4a1a-8be6-7a5643e0bb45\") " Mar 10 22:12:22 crc kubenswrapper[4919]: I0310 22:12:22.456518 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/43b80b86-6652-4a1a-8be6-7a5643e0bb45-ovsdbserver-nb\") pod \"43b80b86-6652-4a1a-8be6-7a5643e0bb45\" (UID: \"43b80b86-6652-4a1a-8be6-7a5643e0bb45\") " Mar 10 22:12:22 crc kubenswrapper[4919]: I0310 22:12:22.456576 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/43b80b86-6652-4a1a-8be6-7a5643e0bb45-config\") pod \"43b80b86-6652-4a1a-8be6-7a5643e0bb45\" (UID: \"43b80b86-6652-4a1a-8be6-7a5643e0bb45\") " Mar 10 22:12:22 crc kubenswrapper[4919]: I0310 22:12:22.456591 4919 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/43b80b86-6652-4a1a-8be6-7a5643e0bb45-dns-swift-storage-0\") pod \"43b80b86-6652-4a1a-8be6-7a5643e0bb45\" (UID: \"43b80b86-6652-4a1a-8be6-7a5643e0bb45\") " Mar 10 22:12:22 crc kubenswrapper[4919]: I0310 22:12:22.456621 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/43b80b86-6652-4a1a-8be6-7a5643e0bb45-dns-svc\") pod \"43b80b86-6652-4a1a-8be6-7a5643e0bb45\" (UID: \"43b80b86-6652-4a1a-8be6-7a5643e0bb45\") " Mar 10 22:12:22 crc kubenswrapper[4919]: I0310 22:12:22.456648 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/43b80b86-6652-4a1a-8be6-7a5643e0bb45-ovsdbserver-sb\") pod \"43b80b86-6652-4a1a-8be6-7a5643e0bb45\" (UID: \"43b80b86-6652-4a1a-8be6-7a5643e0bb45\") " Mar 10 22:12:22 crc kubenswrapper[4919]: I0310 22:12:22.456864 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/408722a8-2c8a-4bda-82d5-1d2f58bda7d7-credential-keys\") pod \"keystone-599f4d795-pgnpd\" (UID: \"408722a8-2c8a-4bda-82d5-1d2f58bda7d7\") " pod="openstack/keystone-599f4d795-pgnpd" Mar 10 22:12:22 crc kubenswrapper[4919]: I0310 22:12:22.456897 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/408722a8-2c8a-4bda-82d5-1d2f58bda7d7-public-tls-certs\") pod \"keystone-599f4d795-pgnpd\" (UID: \"408722a8-2c8a-4bda-82d5-1d2f58bda7d7\") " pod="openstack/keystone-599f4d795-pgnpd" Mar 10 22:12:22 crc kubenswrapper[4919]: I0310 22:12:22.456943 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/38fe889d-dbc7-448a-ade6-7847b16f85d2-combined-ca-bundle\") pod \"placement-76f79c8d94-r2gfk\" (UID: \"38fe889d-dbc7-448a-ade6-7847b16f85d2\") " pod="openstack/placement-76f79c8d94-r2gfk" Mar 10 22:12:22 crc kubenswrapper[4919]: I0310 22:12:22.456969 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/38fe889d-dbc7-448a-ade6-7847b16f85d2-public-tls-certs\") pod \"placement-76f79c8d94-r2gfk\" (UID: \"38fe889d-dbc7-448a-ade6-7847b16f85d2\") " pod="openstack/placement-76f79c8d94-r2gfk" Mar 10 22:12:22 crc kubenswrapper[4919]: I0310 22:12:22.456993 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/408722a8-2c8a-4bda-82d5-1d2f58bda7d7-scripts\") pod \"keystone-599f4d795-pgnpd\" (UID: \"408722a8-2c8a-4bda-82d5-1d2f58bda7d7\") " pod="openstack/keystone-599f4d795-pgnpd" Mar 10 22:12:22 crc kubenswrapper[4919]: I0310 22:12:22.457030 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/38fe889d-dbc7-448a-ade6-7847b16f85d2-internal-tls-certs\") pod \"placement-76f79c8d94-r2gfk\" (UID: \"38fe889d-dbc7-448a-ade6-7847b16f85d2\") " pod="openstack/placement-76f79c8d94-r2gfk" Mar 10 22:12:22 crc kubenswrapper[4919]: I0310 22:12:22.457048 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g4p55\" (UniqueName: \"kubernetes.io/projected/408722a8-2c8a-4bda-82d5-1d2f58bda7d7-kube-api-access-g4p55\") pod \"keystone-599f4d795-pgnpd\" (UID: \"408722a8-2c8a-4bda-82d5-1d2f58bda7d7\") " pod="openstack/keystone-599f4d795-pgnpd" Mar 10 22:12:22 crc kubenswrapper[4919]: I0310 22:12:22.457069 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/38fe889d-dbc7-448a-ade6-7847b16f85d2-logs\") pod \"placement-76f79c8d94-r2gfk\" (UID: \"38fe889d-dbc7-448a-ade6-7847b16f85d2\") " pod="openstack/placement-76f79c8d94-r2gfk" Mar 10 22:12:22 crc kubenswrapper[4919]: I0310 22:12:22.457096 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/408722a8-2c8a-4bda-82d5-1d2f58bda7d7-internal-tls-certs\") pod \"keystone-599f4d795-pgnpd\" (UID: \"408722a8-2c8a-4bda-82d5-1d2f58bda7d7\") " pod="openstack/keystone-599f4d795-pgnpd" Mar 10 22:12:22 crc kubenswrapper[4919]: I0310 22:12:22.457124 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38fe889d-dbc7-448a-ade6-7847b16f85d2-config-data\") pod \"placement-76f79c8d94-r2gfk\" (UID: \"38fe889d-dbc7-448a-ade6-7847b16f85d2\") " pod="openstack/placement-76f79c8d94-r2gfk" Mar 10 22:12:22 crc kubenswrapper[4919]: I0310 22:12:22.457142 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/408722a8-2c8a-4bda-82d5-1d2f58bda7d7-fernet-keys\") pod \"keystone-599f4d795-pgnpd\" (UID: \"408722a8-2c8a-4bda-82d5-1d2f58bda7d7\") " pod="openstack/keystone-599f4d795-pgnpd" Mar 10 22:12:22 crc kubenswrapper[4919]: I0310 22:12:22.457172 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/38fe889d-dbc7-448a-ade6-7847b16f85d2-scripts\") pod \"placement-76f79c8d94-r2gfk\" (UID: \"38fe889d-dbc7-448a-ade6-7847b16f85d2\") " pod="openstack/placement-76f79c8d94-r2gfk" Mar 10 22:12:22 crc kubenswrapper[4919]: I0310 22:12:22.457208 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/408722a8-2c8a-4bda-82d5-1d2f58bda7d7-config-data\") pod 
\"keystone-599f4d795-pgnpd\" (UID: \"408722a8-2c8a-4bda-82d5-1d2f58bda7d7\") " pod="openstack/keystone-599f4d795-pgnpd" Mar 10 22:12:22 crc kubenswrapper[4919]: I0310 22:12:22.457230 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sx85f\" (UniqueName: \"kubernetes.io/projected/38fe889d-dbc7-448a-ade6-7847b16f85d2-kube-api-access-sx85f\") pod \"placement-76f79c8d94-r2gfk\" (UID: \"38fe889d-dbc7-448a-ade6-7847b16f85d2\") " pod="openstack/placement-76f79c8d94-r2gfk" Mar 10 22:12:22 crc kubenswrapper[4919]: I0310 22:12:22.457253 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/408722a8-2c8a-4bda-82d5-1d2f58bda7d7-combined-ca-bundle\") pod \"keystone-599f4d795-pgnpd\" (UID: \"408722a8-2c8a-4bda-82d5-1d2f58bda7d7\") " pod="openstack/keystone-599f4d795-pgnpd" Mar 10 22:12:22 crc kubenswrapper[4919]: I0310 22:12:22.463113 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/408722a8-2c8a-4bda-82d5-1d2f58bda7d7-combined-ca-bundle\") pod \"keystone-599f4d795-pgnpd\" (UID: \"408722a8-2c8a-4bda-82d5-1d2f58bda7d7\") " pod="openstack/keystone-599f4d795-pgnpd" Mar 10 22:12:22 crc kubenswrapper[4919]: I0310 22:12:22.468039 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/408722a8-2c8a-4bda-82d5-1d2f58bda7d7-scripts\") pod \"keystone-599f4d795-pgnpd\" (UID: \"408722a8-2c8a-4bda-82d5-1d2f58bda7d7\") " pod="openstack/keystone-599f4d795-pgnpd" Mar 10 22:12:22 crc kubenswrapper[4919]: I0310 22:12:22.468170 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43b80b86-6652-4a1a-8be6-7a5643e0bb45-kube-api-access-7r9zh" (OuterVolumeSpecName: "kube-api-access-7r9zh") pod "43b80b86-6652-4a1a-8be6-7a5643e0bb45" (UID: 
"43b80b86-6652-4a1a-8be6-7a5643e0bb45"). InnerVolumeSpecName "kube-api-access-7r9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 22:12:22 crc kubenswrapper[4919]: I0310 22:12:22.469824 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/408722a8-2c8a-4bda-82d5-1d2f58bda7d7-internal-tls-certs\") pod \"keystone-599f4d795-pgnpd\" (UID: \"408722a8-2c8a-4bda-82d5-1d2f58bda7d7\") " pod="openstack/keystone-599f4d795-pgnpd" Mar 10 22:12:22 crc kubenswrapper[4919]: I0310 22:12:22.478055 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/408722a8-2c8a-4bda-82d5-1d2f58bda7d7-credential-keys\") pod \"keystone-599f4d795-pgnpd\" (UID: \"408722a8-2c8a-4bda-82d5-1d2f58bda7d7\") " pod="openstack/keystone-599f4d795-pgnpd" Mar 10 22:12:22 crc kubenswrapper[4919]: I0310 22:12:22.497199 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/408722a8-2c8a-4bda-82d5-1d2f58bda7d7-fernet-keys\") pod \"keystone-599f4d795-pgnpd\" (UID: \"408722a8-2c8a-4bda-82d5-1d2f58bda7d7\") " pod="openstack/keystone-599f4d795-pgnpd" Mar 10 22:12:22 crc kubenswrapper[4919]: I0310 22:12:22.503062 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/408722a8-2c8a-4bda-82d5-1d2f58bda7d7-config-data\") pod \"keystone-599f4d795-pgnpd\" (UID: \"408722a8-2c8a-4bda-82d5-1d2f58bda7d7\") " pod="openstack/keystone-599f4d795-pgnpd" Mar 10 22:12:22 crc kubenswrapper[4919]: I0310 22:12:22.503188 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/408722a8-2c8a-4bda-82d5-1d2f58bda7d7-public-tls-certs\") pod \"keystone-599f4d795-pgnpd\" (UID: \"408722a8-2c8a-4bda-82d5-1d2f58bda7d7\") " pod="openstack/keystone-599f4d795-pgnpd" Mar 
10 22:12:22 crc kubenswrapper[4919]: I0310 22:12:22.552018 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g4p55\" (UniqueName: \"kubernetes.io/projected/408722a8-2c8a-4bda-82d5-1d2f58bda7d7-kube-api-access-g4p55\") pod \"keystone-599f4d795-pgnpd\" (UID: \"408722a8-2c8a-4bda-82d5-1d2f58bda7d7\") " pod="openstack/keystone-599f4d795-pgnpd" Mar 10 22:12:22 crc kubenswrapper[4919]: I0310 22:12:22.565587 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/38fe889d-dbc7-448a-ade6-7847b16f85d2-scripts\") pod \"placement-76f79c8d94-r2gfk\" (UID: \"38fe889d-dbc7-448a-ade6-7847b16f85d2\") " pod="openstack/placement-76f79c8d94-r2gfk" Mar 10 22:12:22 crc kubenswrapper[4919]: I0310 22:12:22.565725 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sx85f\" (UniqueName: \"kubernetes.io/projected/38fe889d-dbc7-448a-ade6-7847b16f85d2-kube-api-access-sx85f\") pod \"placement-76f79c8d94-r2gfk\" (UID: \"38fe889d-dbc7-448a-ade6-7847b16f85d2\") " pod="openstack/placement-76f79c8d94-r2gfk" Mar 10 22:12:22 crc kubenswrapper[4919]: I0310 22:12:22.565875 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38fe889d-dbc7-448a-ade6-7847b16f85d2-combined-ca-bundle\") pod \"placement-76f79c8d94-r2gfk\" (UID: \"38fe889d-dbc7-448a-ade6-7847b16f85d2\") " pod="openstack/placement-76f79c8d94-r2gfk" Mar 10 22:12:22 crc kubenswrapper[4919]: I0310 22:12:22.565914 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/38fe889d-dbc7-448a-ade6-7847b16f85d2-public-tls-certs\") pod \"placement-76f79c8d94-r2gfk\" (UID: \"38fe889d-dbc7-448a-ade6-7847b16f85d2\") " pod="openstack/placement-76f79c8d94-r2gfk" Mar 10 22:12:22 crc kubenswrapper[4919]: I0310 22:12:22.566028 4919 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/38fe889d-dbc7-448a-ade6-7847b16f85d2-internal-tls-certs\") pod \"placement-76f79c8d94-r2gfk\" (UID: \"38fe889d-dbc7-448a-ade6-7847b16f85d2\") " pod="openstack/placement-76f79c8d94-r2gfk" Mar 10 22:12:22 crc kubenswrapper[4919]: I0310 22:12:22.566055 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/38fe889d-dbc7-448a-ade6-7847b16f85d2-logs\") pod \"placement-76f79c8d94-r2gfk\" (UID: \"38fe889d-dbc7-448a-ade6-7847b16f85d2\") " pod="openstack/placement-76f79c8d94-r2gfk" Mar 10 22:12:22 crc kubenswrapper[4919]: I0310 22:12:22.566128 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38fe889d-dbc7-448a-ade6-7847b16f85d2-config-data\") pod \"placement-76f79c8d94-r2gfk\" (UID: \"38fe889d-dbc7-448a-ade6-7847b16f85d2\") " pod="openstack/placement-76f79c8d94-r2gfk" Mar 10 22:12:22 crc kubenswrapper[4919]: I0310 22:12:22.566202 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7r9zh\" (UniqueName: \"kubernetes.io/projected/43b80b86-6652-4a1a-8be6-7a5643e0bb45-kube-api-access-7r9zh\") on node \"crc\" DevicePath \"\"" Mar 10 22:12:22 crc kubenswrapper[4919]: I0310 22:12:22.578951 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/38fe889d-dbc7-448a-ade6-7847b16f85d2-logs\") pod \"placement-76f79c8d94-r2gfk\" (UID: \"38fe889d-dbc7-448a-ade6-7847b16f85d2\") " pod="openstack/placement-76f79c8d94-r2gfk" Mar 10 22:12:22 crc kubenswrapper[4919]: I0310 22:12:22.607451 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-854d8d6bf4-kknjq"] Mar 10 22:12:22 crc kubenswrapper[4919]: E0310 22:12:22.614858 4919 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="43b80b86-6652-4a1a-8be6-7a5643e0bb45" containerName="init" Mar 10 22:12:22 crc kubenswrapper[4919]: I0310 22:12:22.614907 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="43b80b86-6652-4a1a-8be6-7a5643e0bb45" containerName="init" Mar 10 22:12:22 crc kubenswrapper[4919]: E0310 22:12:22.614939 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43b80b86-6652-4a1a-8be6-7a5643e0bb45" containerName="dnsmasq-dns" Mar 10 22:12:22 crc kubenswrapper[4919]: I0310 22:12:22.614947 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="43b80b86-6652-4a1a-8be6-7a5643e0bb45" containerName="dnsmasq-dns" Mar 10 22:12:22 crc kubenswrapper[4919]: I0310 22:12:22.615374 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="43b80b86-6652-4a1a-8be6-7a5643e0bb45" containerName="dnsmasq-dns" Mar 10 22:12:22 crc kubenswrapper[4919]: I0310 22:12:22.616555 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-854d8d6bf4-kknjq" Mar 10 22:12:22 crc kubenswrapper[4919]: I0310 22:12:22.608455 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38fe889d-dbc7-448a-ade6-7847b16f85d2-config-data\") pod \"placement-76f79c8d94-r2gfk\" (UID: \"38fe889d-dbc7-448a-ade6-7847b16f85d2\") " pod="openstack/placement-76f79c8d94-r2gfk" Mar 10 22:12:22 crc kubenswrapper[4919]: I0310 22:12:22.635240 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/38fe889d-dbc7-448a-ade6-7847b16f85d2-scripts\") pod \"placement-76f79c8d94-r2gfk\" (UID: \"38fe889d-dbc7-448a-ade6-7847b16f85d2\") " pod="openstack/placement-76f79c8d94-r2gfk" Mar 10 22:12:22 crc kubenswrapper[4919]: I0310 22:12:22.635513 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/38fe889d-dbc7-448a-ade6-7847b16f85d2-public-tls-certs\") pod 
\"placement-76f79c8d94-r2gfk\" (UID: \"38fe889d-dbc7-448a-ade6-7847b16f85d2\") " pod="openstack/placement-76f79c8d94-r2gfk" Mar 10 22:12:22 crc kubenswrapper[4919]: I0310 22:12:22.651989 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-854d8d6bf4-kknjq"] Mar 10 22:12:22 crc kubenswrapper[4919]: I0310 22:12:22.657007 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sx85f\" (UniqueName: \"kubernetes.io/projected/38fe889d-dbc7-448a-ade6-7847b16f85d2-kube-api-access-sx85f\") pod \"placement-76f79c8d94-r2gfk\" (UID: \"38fe889d-dbc7-448a-ade6-7847b16f85d2\") " pod="openstack/placement-76f79c8d94-r2gfk" Mar 10 22:12:22 crc kubenswrapper[4919]: I0310 22:12:22.663700 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38fe889d-dbc7-448a-ade6-7847b16f85d2-combined-ca-bundle\") pod \"placement-76f79c8d94-r2gfk\" (UID: \"38fe889d-dbc7-448a-ade6-7847b16f85d2\") " pod="openstack/placement-76f79c8d94-r2gfk" Mar 10 22:12:22 crc kubenswrapper[4919]: I0310 22:12:22.664325 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/38fe889d-dbc7-448a-ade6-7847b16f85d2-internal-tls-certs\") pod \"placement-76f79c8d94-r2gfk\" (UID: \"38fe889d-dbc7-448a-ade6-7847b16f85d2\") " pod="openstack/placement-76f79c8d94-r2gfk" Mar 10 22:12:22 crc kubenswrapper[4919]: W0310 22:12:22.688551 4919 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod98b65b94_9415_4a0b_acb7_760a536d250a.slice/crio-24977a6a6c7357fdad82fb65e7391073dfce84fa5961a206b3d149b7fa19023c WatchSource:0}: Error finding container 24977a6a6c7357fdad82fb65e7391073dfce84fa5961a206b3d149b7fa19023c: Status 404 returned error can't find the container with id 24977a6a6c7357fdad82fb65e7391073dfce84fa5961a206b3d149b7fa19023c Mar 10 22:12:22 
crc kubenswrapper[4919]: I0310 22:12:22.688693 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-7d5f6c4bb6-46xtp"] Mar 10 22:12:22 crc kubenswrapper[4919]: I0310 22:12:22.708768 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43b80b86-6652-4a1a-8be6-7a5643e0bb45-config" (OuterVolumeSpecName: "config") pod "43b80b86-6652-4a1a-8be6-7a5643e0bb45" (UID: "43b80b86-6652-4a1a-8be6-7a5643e0bb45"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 22:12:22 crc kubenswrapper[4919]: I0310 22:12:22.722528 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43b80b86-6652-4a1a-8be6-7a5643e0bb45-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "43b80b86-6652-4a1a-8be6-7a5643e0bb45" (UID: "43b80b86-6652-4a1a-8be6-7a5643e0bb45"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 22:12:22 crc kubenswrapper[4919]: I0310 22:12:22.731500 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43b80b86-6652-4a1a-8be6-7a5643e0bb45-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "43b80b86-6652-4a1a-8be6-7a5643e0bb45" (UID: "43b80b86-6652-4a1a-8be6-7a5643e0bb45"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 22:12:22 crc kubenswrapper[4919]: I0310 22:12:22.733937 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43b80b86-6652-4a1a-8be6-7a5643e0bb45-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "43b80b86-6652-4a1a-8be6-7a5643e0bb45" (UID: "43b80b86-6652-4a1a-8be6-7a5643e0bb45"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 22:12:22 crc kubenswrapper[4919]: I0310 22:12:22.756095 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43b80b86-6652-4a1a-8be6-7a5643e0bb45-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "43b80b86-6652-4a1a-8be6-7a5643e0bb45" (UID: "43b80b86-6652-4a1a-8be6-7a5643e0bb45"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 22:12:22 crc kubenswrapper[4919]: I0310 22:12:22.779774 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/981bb03c-23be-4bf8-a9f6-cb8a552f66a5-config-data\") pod \"placement-854d8d6bf4-kknjq\" (UID: \"981bb03c-23be-4bf8-a9f6-cb8a552f66a5\") " pod="openstack/placement-854d8d6bf4-kknjq" Mar 10 22:12:22 crc kubenswrapper[4919]: I0310 22:12:22.780819 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/981bb03c-23be-4bf8-a9f6-cb8a552f66a5-public-tls-certs\") pod \"placement-854d8d6bf4-kknjq\" (UID: \"981bb03c-23be-4bf8-a9f6-cb8a552f66a5\") " pod="openstack/placement-854d8d6bf4-kknjq" Mar 10 22:12:22 crc kubenswrapper[4919]: I0310 22:12:22.780899 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/981bb03c-23be-4bf8-a9f6-cb8a552f66a5-logs\") pod \"placement-854d8d6bf4-kknjq\" (UID: \"981bb03c-23be-4bf8-a9f6-cb8a552f66a5\") " pod="openstack/placement-854d8d6bf4-kknjq" Mar 10 22:12:22 crc kubenswrapper[4919]: I0310 22:12:22.780982 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/981bb03c-23be-4bf8-a9f6-cb8a552f66a5-internal-tls-certs\") pod \"placement-854d8d6bf4-kknjq\" 
(UID: \"981bb03c-23be-4bf8-a9f6-cb8a552f66a5\") " pod="openstack/placement-854d8d6bf4-kknjq" Mar 10 22:12:22 crc kubenswrapper[4919]: I0310 22:12:22.781079 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/981bb03c-23be-4bf8-a9f6-cb8a552f66a5-combined-ca-bundle\") pod \"placement-854d8d6bf4-kknjq\" (UID: \"981bb03c-23be-4bf8-a9f6-cb8a552f66a5\") " pod="openstack/placement-854d8d6bf4-kknjq" Mar 10 22:12:22 crc kubenswrapper[4919]: I0310 22:12:22.781293 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-md4gv\" (UniqueName: \"kubernetes.io/projected/981bb03c-23be-4bf8-a9f6-cb8a552f66a5-kube-api-access-md4gv\") pod \"placement-854d8d6bf4-kknjq\" (UID: \"981bb03c-23be-4bf8-a9f6-cb8a552f66a5\") " pod="openstack/placement-854d8d6bf4-kknjq" Mar 10 22:12:22 crc kubenswrapper[4919]: I0310 22:12:22.781448 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/981bb03c-23be-4bf8-a9f6-cb8a552f66a5-scripts\") pod \"placement-854d8d6bf4-kknjq\" (UID: \"981bb03c-23be-4bf8-a9f6-cb8a552f66a5\") " pod="openstack/placement-854d8d6bf4-kknjq" Mar 10 22:12:22 crc kubenswrapper[4919]: I0310 22:12:22.781574 4919 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/43b80b86-6652-4a1a-8be6-7a5643e0bb45-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 10 22:12:22 crc kubenswrapper[4919]: I0310 22:12:22.781643 4919 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/43b80b86-6652-4a1a-8be6-7a5643e0bb45-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 10 22:12:22 crc kubenswrapper[4919]: I0310 22:12:22.781710 4919 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/43b80b86-6652-4a1a-8be6-7a5643e0bb45-config\") on node \"crc\" DevicePath \"\"" Mar 10 22:12:22 crc kubenswrapper[4919]: I0310 22:12:22.781765 4919 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/43b80b86-6652-4a1a-8be6-7a5643e0bb45-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 10 22:12:22 crc kubenswrapper[4919]: I0310 22:12:22.781883 4919 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/43b80b86-6652-4a1a-8be6-7a5643e0bb45-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 10 22:12:22 crc kubenswrapper[4919]: I0310 22:12:22.832861 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-599f4d795-pgnpd" Mar 10 22:12:22 crc kubenswrapper[4919]: I0310 22:12:22.878520 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-84fbf8d4df-qnkcp"] Mar 10 22:12:22 crc kubenswrapper[4919]: I0310 22:12:22.884900 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/981bb03c-23be-4bf8-a9f6-cb8a552f66a5-combined-ca-bundle\") pod \"placement-854d8d6bf4-kknjq\" (UID: \"981bb03c-23be-4bf8-a9f6-cb8a552f66a5\") " pod="openstack/placement-854d8d6bf4-kknjq" Mar 10 22:12:22 crc kubenswrapper[4919]: I0310 22:12:22.884957 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-md4gv\" (UniqueName: \"kubernetes.io/projected/981bb03c-23be-4bf8-a9f6-cb8a552f66a5-kube-api-access-md4gv\") pod \"placement-854d8d6bf4-kknjq\" (UID: \"981bb03c-23be-4bf8-a9f6-cb8a552f66a5\") " pod="openstack/placement-854d8d6bf4-kknjq" Mar 10 22:12:22 crc kubenswrapper[4919]: I0310 22:12:22.885037 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/981bb03c-23be-4bf8-a9f6-cb8a552f66a5-scripts\") pod \"placement-854d8d6bf4-kknjq\" (UID: \"981bb03c-23be-4bf8-a9f6-cb8a552f66a5\") " pod="openstack/placement-854d8d6bf4-kknjq" Mar 10 22:12:22 crc kubenswrapper[4919]: I0310 22:12:22.885073 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/981bb03c-23be-4bf8-a9f6-cb8a552f66a5-config-data\") pod \"placement-854d8d6bf4-kknjq\" (UID: \"981bb03c-23be-4bf8-a9f6-cb8a552f66a5\") " pod="openstack/placement-854d8d6bf4-kknjq" Mar 10 22:12:22 crc kubenswrapper[4919]: I0310 22:12:22.885120 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/981bb03c-23be-4bf8-a9f6-cb8a552f66a5-public-tls-certs\") pod \"placement-854d8d6bf4-kknjq\" (UID: \"981bb03c-23be-4bf8-a9f6-cb8a552f66a5\") " pod="openstack/placement-854d8d6bf4-kknjq" Mar 10 22:12:22 crc kubenswrapper[4919]: I0310 22:12:22.885138 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/981bb03c-23be-4bf8-a9f6-cb8a552f66a5-logs\") pod \"placement-854d8d6bf4-kknjq\" (UID: \"981bb03c-23be-4bf8-a9f6-cb8a552f66a5\") " pod="openstack/placement-854d8d6bf4-kknjq" Mar 10 22:12:22 crc kubenswrapper[4919]: I0310 22:12:22.885165 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/981bb03c-23be-4bf8-a9f6-cb8a552f66a5-internal-tls-certs\") pod \"placement-854d8d6bf4-kknjq\" (UID: \"981bb03c-23be-4bf8-a9f6-cb8a552f66a5\") " pod="openstack/placement-854d8d6bf4-kknjq" Mar 10 22:12:22 crc kubenswrapper[4919]: I0310 22:12:22.890772 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/981bb03c-23be-4bf8-a9f6-cb8a552f66a5-logs\") pod \"placement-854d8d6bf4-kknjq\" (UID: 
\"981bb03c-23be-4bf8-a9f6-cb8a552f66a5\") " pod="openstack/placement-854d8d6bf4-kknjq" Mar 10 22:12:22 crc kubenswrapper[4919]: I0310 22:12:22.898333 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/981bb03c-23be-4bf8-a9f6-cb8a552f66a5-scripts\") pod \"placement-854d8d6bf4-kknjq\" (UID: \"981bb03c-23be-4bf8-a9f6-cb8a552f66a5\") " pod="openstack/placement-854d8d6bf4-kknjq" Mar 10 22:12:22 crc kubenswrapper[4919]: I0310 22:12:22.899052 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/981bb03c-23be-4bf8-a9f6-cb8a552f66a5-combined-ca-bundle\") pod \"placement-854d8d6bf4-kknjq\" (UID: \"981bb03c-23be-4bf8-a9f6-cb8a552f66a5\") " pod="openstack/placement-854d8d6bf4-kknjq" Mar 10 22:12:22 crc kubenswrapper[4919]: I0310 22:12:22.900832 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/981bb03c-23be-4bf8-a9f6-cb8a552f66a5-internal-tls-certs\") pod \"placement-854d8d6bf4-kknjq\" (UID: \"981bb03c-23be-4bf8-a9f6-cb8a552f66a5\") " pod="openstack/placement-854d8d6bf4-kknjq" Mar 10 22:12:22 crc kubenswrapper[4919]: I0310 22:12:22.901148 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/981bb03c-23be-4bf8-a9f6-cb8a552f66a5-public-tls-certs\") pod \"placement-854d8d6bf4-kknjq\" (UID: \"981bb03c-23be-4bf8-a9f6-cb8a552f66a5\") " pod="openstack/placement-854d8d6bf4-kknjq" Mar 10 22:12:22 crc kubenswrapper[4919]: I0310 22:12:22.911821 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/981bb03c-23be-4bf8-a9f6-cb8a552f66a5-config-data\") pod \"placement-854d8d6bf4-kknjq\" (UID: \"981bb03c-23be-4bf8-a9f6-cb8a552f66a5\") " pod="openstack/placement-854d8d6bf4-kknjq" Mar 10 22:12:22 crc kubenswrapper[4919]: 
I0310 22:12:22.916433 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-md4gv\" (UniqueName: \"kubernetes.io/projected/981bb03c-23be-4bf8-a9f6-cb8a552f66a5-kube-api-access-md4gv\") pod \"placement-854d8d6bf4-kknjq\" (UID: \"981bb03c-23be-4bf8-a9f6-cb8a552f66a5\") " pod="openstack/placement-854d8d6bf4-kknjq" Mar 10 22:12:22 crc kubenswrapper[4919]: I0310 22:12:22.945117 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-76f79c8d94-r2gfk" Mar 10 22:12:23 crc kubenswrapper[4919]: I0310 22:12:23.029907 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-854d8d6bf4-kknjq" Mar 10 22:12:23 crc kubenswrapper[4919]: I0310 22:12:23.042069 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5bdd68856d-p865v"] Mar 10 22:12:23 crc kubenswrapper[4919]: W0310 22:12:23.163239 4919 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0c6e8e16_fa38_44f1_8a47_c6130972b034.slice/crio-45f0b320c64ae1c5779b69ab4ab418b0bbf5f4d5d3cd6e519adaaafacf60de8d WatchSource:0}: Error finding container 45f0b320c64ae1c5779b69ab4ab418b0bbf5f4d5d3cd6e519adaaafacf60de8d: Status 404 returned error can't find the container with id 45f0b320c64ae1c5779b69ab4ab418b0bbf5f4d5d3cd6e519adaaafacf60de8d Mar 10 22:12:23 crc kubenswrapper[4919]: I0310 22:12:23.166960 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-98cfc95fc-fjth4"] Mar 10 22:12:23 crc kubenswrapper[4919]: I0310 22:12:23.259801 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7d5f6c4bb6-46xtp" event={"ID":"98b65b94-9415-4a0b-acb7-760a536d250a","Type":"ContainerStarted","Data":"24977a6a6c7357fdad82fb65e7391073dfce84fa5961a206b3d149b7fa19023c"} Mar 10 22:12:23 crc kubenswrapper[4919]: I0310 22:12:23.272824 4919 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-98cfc95fc-fjth4" event={"ID":"0c6e8e16-fa38-44f1-8a47-c6130972b034","Type":"ContainerStarted","Data":"45f0b320c64ae1c5779b69ab4ab418b0bbf5f4d5d3cd6e519adaaafacf60de8d"} Mar 10 22:12:23 crc kubenswrapper[4919]: I0310 22:12:23.288759 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66ff44db99-v77sd" event={"ID":"43b80b86-6652-4a1a-8be6-7a5643e0bb45","Type":"ContainerDied","Data":"ec6a91fad2a866d726aca47b0a0357c857e2d0a7473cc5914cbb437bd5306b6e"} Mar 10 22:12:23 crc kubenswrapper[4919]: I0310 22:12:23.288836 4919 scope.go:117] "RemoveContainer" containerID="1ded035287266e127df1c58824ea62be42b420eea53e61da848af410bacfa67d" Mar 10 22:12:23 crc kubenswrapper[4919]: I0310 22:12:23.288767 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-66ff44db99-v77sd" Mar 10 22:12:23 crc kubenswrapper[4919]: I0310 22:12:23.292655 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-84fbf8d4df-qnkcp" event={"ID":"8c3d8de2-078f-4fb9-a9b7-9c8b43fb5758","Type":"ContainerStarted","Data":"b0b21bd0053b8a93a7c1d7438be124b6f235afca536b61172ee196ec3cb303b9"} Mar 10 22:12:23 crc kubenswrapper[4919]: I0310 22:12:23.303764 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5bdd68856d-p865v" event={"ID":"1372c543-234e-41a0-936c-d17cdf422557","Type":"ContainerStarted","Data":"c28d27639234ea92cc3e7f545511a7d388c747c9d02aa121c5d7dfc048ff89d5"} Mar 10 22:12:23 crc kubenswrapper[4919]: I0310 22:12:23.361803 4919 scope.go:117] "RemoveContainer" containerID="c48acd8ef511580ead42ad57aaab08a3dce73adeaff3b152fe955a3be9712daa" Mar 10 22:12:23 crc kubenswrapper[4919]: I0310 22:12:23.415011 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-66ff44db99-v77sd"] Mar 10 22:12:23 crc kubenswrapper[4919]: I0310 22:12:23.455120 4919 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openstack/dnsmasq-dns-66ff44db99-v77sd"] Mar 10 22:12:23 crc kubenswrapper[4919]: I0310 22:12:23.531124 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1705da29-6c23-4b70-881e-7e8268197e07" path="/var/lib/kubelet/pods/1705da29-6c23-4b70-881e-7e8268197e07/volumes" Mar 10 22:12:23 crc kubenswrapper[4919]: I0310 22:12:23.531806 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43b80b86-6652-4a1a-8be6-7a5643e0bb45" path="/var/lib/kubelet/pods/43b80b86-6652-4a1a-8be6-7a5643e0bb45/volumes" Mar 10 22:12:23 crc kubenswrapper[4919]: I0310 22:12:23.532367 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-68449cb44c-wmmzf"] Mar 10 22:12:23 crc kubenswrapper[4919]: I0310 22:12:23.534089 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-68449cb44c-wmmzf"] Mar 10 22:12:23 crc kubenswrapper[4919]: I0310 22:12:23.534114 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-fd8f54c58-gtj5m"] Mar 10 22:12:23 crc kubenswrapper[4919]: I0310 22:12:23.535217 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-fd8f54c58-gtj5m" Mar 10 22:12:23 crc kubenswrapper[4919]: I0310 22:12:23.536033 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-68449cb44c-wmmzf" Mar 10 22:12:23 crc kubenswrapper[4919]: I0310 22:12:23.563481 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-fd8f54c58-gtj5m"] Mar 10 22:12:23 crc kubenswrapper[4919]: I0310 22:12:23.586990 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-599f4d795-pgnpd"] Mar 10 22:12:23 crc kubenswrapper[4919]: I0310 22:12:23.608648 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/31690f34-6b68-4470-a13e-e16121ec25d2-config-data-custom\") pod \"barbican-keystone-listener-fd8f54c58-gtj5m\" (UID: \"31690f34-6b68-4470-a13e-e16121ec25d2\") " pod="openstack/barbican-keystone-listener-fd8f54c58-gtj5m" Mar 10 22:12:23 crc kubenswrapper[4919]: I0310 22:12:23.608683 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31690f34-6b68-4470-a13e-e16121ec25d2-config-data\") pod \"barbican-keystone-listener-fd8f54c58-gtj5m\" (UID: \"31690f34-6b68-4470-a13e-e16121ec25d2\") " pod="openstack/barbican-keystone-listener-fd8f54c58-gtj5m" Mar 10 22:12:23 crc kubenswrapper[4919]: I0310 22:12:23.608724 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84sz2\" (UniqueName: \"kubernetes.io/projected/7c9bf9de-7c1a-436e-a8a0-6c987b34a5e0-kube-api-access-84sz2\") pod \"barbican-worker-68449cb44c-wmmzf\" (UID: \"7c9bf9de-7c1a-436e-a8a0-6c987b34a5e0\") " pod="openstack/barbican-worker-68449cb44c-wmmzf" Mar 10 22:12:23 crc kubenswrapper[4919]: I0310 22:12:23.608751 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7c9bf9de-7c1a-436e-a8a0-6c987b34a5e0-config-data-custom\") pod 
\"barbican-worker-68449cb44c-wmmzf\" (UID: \"7c9bf9de-7c1a-436e-a8a0-6c987b34a5e0\") " pod="openstack/barbican-worker-68449cb44c-wmmzf" Mar 10 22:12:23 crc kubenswrapper[4919]: I0310 22:12:23.608801 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c9bf9de-7c1a-436e-a8a0-6c987b34a5e0-combined-ca-bundle\") pod \"barbican-worker-68449cb44c-wmmzf\" (UID: \"7c9bf9de-7c1a-436e-a8a0-6c987b34a5e0\") " pod="openstack/barbican-worker-68449cb44c-wmmzf" Mar 10 22:12:23 crc kubenswrapper[4919]: I0310 22:12:23.608882 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/31690f34-6b68-4470-a13e-e16121ec25d2-logs\") pod \"barbican-keystone-listener-fd8f54c58-gtj5m\" (UID: \"31690f34-6b68-4470-a13e-e16121ec25d2\") " pod="openstack/barbican-keystone-listener-fd8f54c58-gtj5m" Mar 10 22:12:23 crc kubenswrapper[4919]: I0310 22:12:23.608955 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7c9bf9de-7c1a-436e-a8a0-6c987b34a5e0-logs\") pod \"barbican-worker-68449cb44c-wmmzf\" (UID: \"7c9bf9de-7c1a-436e-a8a0-6c987b34a5e0\") " pod="openstack/barbican-worker-68449cb44c-wmmzf" Mar 10 22:12:23 crc kubenswrapper[4919]: I0310 22:12:23.608994 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c9bf9de-7c1a-436e-a8a0-6c987b34a5e0-config-data\") pod \"barbican-worker-68449cb44c-wmmzf\" (UID: \"7c9bf9de-7c1a-436e-a8a0-6c987b34a5e0\") " pod="openstack/barbican-worker-68449cb44c-wmmzf" Mar 10 22:12:23 crc kubenswrapper[4919]: I0310 22:12:23.609036 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/31690f34-6b68-4470-a13e-e16121ec25d2-combined-ca-bundle\") pod \"barbican-keystone-listener-fd8f54c58-gtj5m\" (UID: \"31690f34-6b68-4470-a13e-e16121ec25d2\") " pod="openstack/barbican-keystone-listener-fd8f54c58-gtj5m" Mar 10 22:12:23 crc kubenswrapper[4919]: I0310 22:12:23.609137 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-25lf7\" (UniqueName: \"kubernetes.io/projected/31690f34-6b68-4470-a13e-e16121ec25d2-kube-api-access-25lf7\") pod \"barbican-keystone-listener-fd8f54c58-gtj5m\" (UID: \"31690f34-6b68-4470-a13e-e16121ec25d2\") " pod="openstack/barbican-keystone-listener-fd8f54c58-gtj5m" Mar 10 22:12:23 crc kubenswrapper[4919]: I0310 22:12:23.623154 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-f7b6c97c6-bwscr"] Mar 10 22:12:23 crc kubenswrapper[4919]: I0310 22:12:23.627030 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-f7b6c97c6-bwscr" Mar 10 22:12:23 crc kubenswrapper[4919]: I0310 22:12:23.662693 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-f7b6c97c6-bwscr"] Mar 10 22:12:23 crc kubenswrapper[4919]: I0310 22:12:23.710377 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-25lf7\" (UniqueName: \"kubernetes.io/projected/31690f34-6b68-4470-a13e-e16121ec25d2-kube-api-access-25lf7\") pod \"barbican-keystone-listener-fd8f54c58-gtj5m\" (UID: \"31690f34-6b68-4470-a13e-e16121ec25d2\") " pod="openstack/barbican-keystone-listener-fd8f54c58-gtj5m" Mar 10 22:12:23 crc kubenswrapper[4919]: I0310 22:12:23.710446 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q24nr\" (UniqueName: \"kubernetes.io/projected/9fae91e8-30c8-4b1d-a243-c2cc58100766-kube-api-access-q24nr\") pod \"barbican-api-f7b6c97c6-bwscr\" (UID: 
\"9fae91e8-30c8-4b1d-a243-c2cc58100766\") " pod="openstack/barbican-api-f7b6c97c6-bwscr" Mar 10 22:12:23 crc kubenswrapper[4919]: I0310 22:12:23.710484 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/31690f34-6b68-4470-a13e-e16121ec25d2-config-data-custom\") pod \"barbican-keystone-listener-fd8f54c58-gtj5m\" (UID: \"31690f34-6b68-4470-a13e-e16121ec25d2\") " pod="openstack/barbican-keystone-listener-fd8f54c58-gtj5m" Mar 10 22:12:23 crc kubenswrapper[4919]: I0310 22:12:23.710505 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31690f34-6b68-4470-a13e-e16121ec25d2-config-data\") pod \"barbican-keystone-listener-fd8f54c58-gtj5m\" (UID: \"31690f34-6b68-4470-a13e-e16121ec25d2\") " pod="openstack/barbican-keystone-listener-fd8f54c58-gtj5m" Mar 10 22:12:23 crc kubenswrapper[4919]: I0310 22:12:23.710526 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-84sz2\" (UniqueName: \"kubernetes.io/projected/7c9bf9de-7c1a-436e-a8a0-6c987b34a5e0-kube-api-access-84sz2\") pod \"barbican-worker-68449cb44c-wmmzf\" (UID: \"7c9bf9de-7c1a-436e-a8a0-6c987b34a5e0\") " pod="openstack/barbican-worker-68449cb44c-wmmzf" Mar 10 22:12:23 crc kubenswrapper[4919]: I0310 22:12:23.710547 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7c9bf9de-7c1a-436e-a8a0-6c987b34a5e0-config-data-custom\") pod \"barbican-worker-68449cb44c-wmmzf\" (UID: \"7c9bf9de-7c1a-436e-a8a0-6c987b34a5e0\") " pod="openstack/barbican-worker-68449cb44c-wmmzf" Mar 10 22:12:23 crc kubenswrapper[4919]: I0310 22:12:23.711801 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/9fae91e8-30c8-4b1d-a243-c2cc58100766-config-data-custom\") pod \"barbican-api-f7b6c97c6-bwscr\" (UID: \"9fae91e8-30c8-4b1d-a243-c2cc58100766\") " pod="openstack/barbican-api-f7b6c97c6-bwscr" Mar 10 22:12:23 crc kubenswrapper[4919]: I0310 22:12:23.711840 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c9bf9de-7c1a-436e-a8a0-6c987b34a5e0-combined-ca-bundle\") pod \"barbican-worker-68449cb44c-wmmzf\" (UID: \"7c9bf9de-7c1a-436e-a8a0-6c987b34a5e0\") " pod="openstack/barbican-worker-68449cb44c-wmmzf" Mar 10 22:12:23 crc kubenswrapper[4919]: I0310 22:12:23.711902 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/31690f34-6b68-4470-a13e-e16121ec25d2-logs\") pod \"barbican-keystone-listener-fd8f54c58-gtj5m\" (UID: \"31690f34-6b68-4470-a13e-e16121ec25d2\") " pod="openstack/barbican-keystone-listener-fd8f54c58-gtj5m" Mar 10 22:12:23 crc kubenswrapper[4919]: I0310 22:12:23.711923 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9fae91e8-30c8-4b1d-a243-c2cc58100766-combined-ca-bundle\") pod \"barbican-api-f7b6c97c6-bwscr\" (UID: \"9fae91e8-30c8-4b1d-a243-c2cc58100766\") " pod="openstack/barbican-api-f7b6c97c6-bwscr" Mar 10 22:12:23 crc kubenswrapper[4919]: I0310 22:12:23.711971 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9fae91e8-30c8-4b1d-a243-c2cc58100766-logs\") pod \"barbican-api-f7b6c97c6-bwscr\" (UID: \"9fae91e8-30c8-4b1d-a243-c2cc58100766\") " pod="openstack/barbican-api-f7b6c97c6-bwscr" Mar 10 22:12:23 crc kubenswrapper[4919]: I0310 22:12:23.711997 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/7c9bf9de-7c1a-436e-a8a0-6c987b34a5e0-logs\") pod \"barbican-worker-68449cb44c-wmmzf\" (UID: \"7c9bf9de-7c1a-436e-a8a0-6c987b34a5e0\") " pod="openstack/barbican-worker-68449cb44c-wmmzf" Mar 10 22:12:23 crc kubenswrapper[4919]: I0310 22:12:23.712036 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c9bf9de-7c1a-436e-a8a0-6c987b34a5e0-config-data\") pod \"barbican-worker-68449cb44c-wmmzf\" (UID: \"7c9bf9de-7c1a-436e-a8a0-6c987b34a5e0\") " pod="openstack/barbican-worker-68449cb44c-wmmzf" Mar 10 22:12:23 crc kubenswrapper[4919]: I0310 22:12:23.712063 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31690f34-6b68-4470-a13e-e16121ec25d2-combined-ca-bundle\") pod \"barbican-keystone-listener-fd8f54c58-gtj5m\" (UID: \"31690f34-6b68-4470-a13e-e16121ec25d2\") " pod="openstack/barbican-keystone-listener-fd8f54c58-gtj5m" Mar 10 22:12:23 crc kubenswrapper[4919]: I0310 22:12:23.712108 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9fae91e8-30c8-4b1d-a243-c2cc58100766-config-data\") pod \"barbican-api-f7b6c97c6-bwscr\" (UID: \"9fae91e8-30c8-4b1d-a243-c2cc58100766\") " pod="openstack/barbican-api-f7b6c97c6-bwscr" Mar 10 22:12:23 crc kubenswrapper[4919]: I0310 22:12:23.713067 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/31690f34-6b68-4470-a13e-e16121ec25d2-logs\") pod \"barbican-keystone-listener-fd8f54c58-gtj5m\" (UID: \"31690f34-6b68-4470-a13e-e16121ec25d2\") " pod="openstack/barbican-keystone-listener-fd8f54c58-gtj5m" Mar 10 22:12:23 crc kubenswrapper[4919]: I0310 22:12:23.716734 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/7c9bf9de-7c1a-436e-a8a0-6c987b34a5e0-logs\") pod \"barbican-worker-68449cb44c-wmmzf\" (UID: \"7c9bf9de-7c1a-436e-a8a0-6c987b34a5e0\") " pod="openstack/barbican-worker-68449cb44c-wmmzf" Mar 10 22:12:23 crc kubenswrapper[4919]: I0310 22:12:23.717494 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c9bf9de-7c1a-436e-a8a0-6c987b34a5e0-config-data\") pod \"barbican-worker-68449cb44c-wmmzf\" (UID: \"7c9bf9de-7c1a-436e-a8a0-6c987b34a5e0\") " pod="openstack/barbican-worker-68449cb44c-wmmzf" Mar 10 22:12:23 crc kubenswrapper[4919]: I0310 22:12:23.717538 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/31690f34-6b68-4470-a13e-e16121ec25d2-config-data-custom\") pod \"barbican-keystone-listener-fd8f54c58-gtj5m\" (UID: \"31690f34-6b68-4470-a13e-e16121ec25d2\") " pod="openstack/barbican-keystone-listener-fd8f54c58-gtj5m" Mar 10 22:12:23 crc kubenswrapper[4919]: I0310 22:12:23.717712 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31690f34-6b68-4470-a13e-e16121ec25d2-config-data\") pod \"barbican-keystone-listener-fd8f54c58-gtj5m\" (UID: \"31690f34-6b68-4470-a13e-e16121ec25d2\") " pod="openstack/barbican-keystone-listener-fd8f54c58-gtj5m" Mar 10 22:12:23 crc kubenswrapper[4919]: I0310 22:12:23.721968 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7c9bf9de-7c1a-436e-a8a0-6c987b34a5e0-config-data-custom\") pod \"barbican-worker-68449cb44c-wmmzf\" (UID: \"7c9bf9de-7c1a-436e-a8a0-6c987b34a5e0\") " pod="openstack/barbican-worker-68449cb44c-wmmzf" Mar 10 22:12:23 crc kubenswrapper[4919]: I0310 22:12:23.731600 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/31690f34-6b68-4470-a13e-e16121ec25d2-combined-ca-bundle\") pod \"barbican-keystone-listener-fd8f54c58-gtj5m\" (UID: \"31690f34-6b68-4470-a13e-e16121ec25d2\") " pod="openstack/barbican-keystone-listener-fd8f54c58-gtj5m" Mar 10 22:12:23 crc kubenswrapper[4919]: I0310 22:12:23.740939 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-25lf7\" (UniqueName: \"kubernetes.io/projected/31690f34-6b68-4470-a13e-e16121ec25d2-kube-api-access-25lf7\") pod \"barbican-keystone-listener-fd8f54c58-gtj5m\" (UID: \"31690f34-6b68-4470-a13e-e16121ec25d2\") " pod="openstack/barbican-keystone-listener-fd8f54c58-gtj5m" Mar 10 22:12:23 crc kubenswrapper[4919]: I0310 22:12:23.741012 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-76f79c8d94-r2gfk"] Mar 10 22:12:23 crc kubenswrapper[4919]: I0310 22:12:23.742211 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c9bf9de-7c1a-436e-a8a0-6c987b34a5e0-combined-ca-bundle\") pod \"barbican-worker-68449cb44c-wmmzf\" (UID: \"7c9bf9de-7c1a-436e-a8a0-6c987b34a5e0\") " pod="openstack/barbican-worker-68449cb44c-wmmzf" Mar 10 22:12:23 crc kubenswrapper[4919]: I0310 22:12:23.744334 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-84sz2\" (UniqueName: \"kubernetes.io/projected/7c9bf9de-7c1a-436e-a8a0-6c987b34a5e0-kube-api-access-84sz2\") pod \"barbican-worker-68449cb44c-wmmzf\" (UID: \"7c9bf9de-7c1a-436e-a8a0-6c987b34a5e0\") " pod="openstack/barbican-worker-68449cb44c-wmmzf" Mar 10 22:12:23 crc kubenswrapper[4919]: I0310 22:12:23.821411 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9fae91e8-30c8-4b1d-a243-c2cc58100766-combined-ca-bundle\") pod \"barbican-api-f7b6c97c6-bwscr\" (UID: \"9fae91e8-30c8-4b1d-a243-c2cc58100766\") " 
pod="openstack/barbican-api-f7b6c97c6-bwscr" Mar 10 22:12:23 crc kubenswrapper[4919]: I0310 22:12:23.821528 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9fae91e8-30c8-4b1d-a243-c2cc58100766-logs\") pod \"barbican-api-f7b6c97c6-bwscr\" (UID: \"9fae91e8-30c8-4b1d-a243-c2cc58100766\") " pod="openstack/barbican-api-f7b6c97c6-bwscr" Mar 10 22:12:23 crc kubenswrapper[4919]: I0310 22:12:23.821987 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9fae91e8-30c8-4b1d-a243-c2cc58100766-config-data\") pod \"barbican-api-f7b6c97c6-bwscr\" (UID: \"9fae91e8-30c8-4b1d-a243-c2cc58100766\") " pod="openstack/barbican-api-f7b6c97c6-bwscr" Mar 10 22:12:23 crc kubenswrapper[4919]: I0310 22:12:23.822107 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q24nr\" (UniqueName: \"kubernetes.io/projected/9fae91e8-30c8-4b1d-a243-c2cc58100766-kube-api-access-q24nr\") pod \"barbican-api-f7b6c97c6-bwscr\" (UID: \"9fae91e8-30c8-4b1d-a243-c2cc58100766\") " pod="openstack/barbican-api-f7b6c97c6-bwscr" Mar 10 22:12:23 crc kubenswrapper[4919]: I0310 22:12:23.822262 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9fae91e8-30c8-4b1d-a243-c2cc58100766-config-data-custom\") pod \"barbican-api-f7b6c97c6-bwscr\" (UID: \"9fae91e8-30c8-4b1d-a243-c2cc58100766\") " pod="openstack/barbican-api-f7b6c97c6-bwscr" Mar 10 22:12:23 crc kubenswrapper[4919]: I0310 22:12:23.860313 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-854d8d6bf4-kknjq"] Mar 10 22:12:23 crc kubenswrapper[4919]: I0310 22:12:23.891958 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-68449cb44c-wmmzf" Mar 10 22:12:23 crc kubenswrapper[4919]: I0310 22:12:23.915717 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9fae91e8-30c8-4b1d-a243-c2cc58100766-config-data-custom\") pod \"barbican-api-f7b6c97c6-bwscr\" (UID: \"9fae91e8-30c8-4b1d-a243-c2cc58100766\") " pod="openstack/barbican-api-f7b6c97c6-bwscr" Mar 10 22:12:23 crc kubenswrapper[4919]: I0310 22:12:23.915778 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q24nr\" (UniqueName: \"kubernetes.io/projected/9fae91e8-30c8-4b1d-a243-c2cc58100766-kube-api-access-q24nr\") pod \"barbican-api-f7b6c97c6-bwscr\" (UID: \"9fae91e8-30c8-4b1d-a243-c2cc58100766\") " pod="openstack/barbican-api-f7b6c97c6-bwscr" Mar 10 22:12:23 crc kubenswrapper[4919]: I0310 22:12:23.915898 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9fae91e8-30c8-4b1d-a243-c2cc58100766-combined-ca-bundle\") pod \"barbican-api-f7b6c97c6-bwscr\" (UID: \"9fae91e8-30c8-4b1d-a243-c2cc58100766\") " pod="openstack/barbican-api-f7b6c97c6-bwscr" Mar 10 22:12:23 crc kubenswrapper[4919]: I0310 22:12:23.917234 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9fae91e8-30c8-4b1d-a243-c2cc58100766-config-data\") pod \"barbican-api-f7b6c97c6-bwscr\" (UID: \"9fae91e8-30c8-4b1d-a243-c2cc58100766\") " pod="openstack/barbican-api-f7b6c97c6-bwscr" Mar 10 22:12:23 crc kubenswrapper[4919]: I0310 22:12:23.917747 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-fd8f54c58-gtj5m" Mar 10 22:12:23 crc kubenswrapper[4919]: W0310 22:12:23.957265 4919 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod981bb03c_23be_4bf8_a9f6_cb8a552f66a5.slice/crio-cf163b74a00b3fc9c5f8e802c09f733ed4f45ba0e95aadd586f4fd561306b85d WatchSource:0}: Error finding container cf163b74a00b3fc9c5f8e802c09f733ed4f45ba0e95aadd586f4fd561306b85d: Status 404 returned error can't find the container with id cf163b74a00b3fc9c5f8e802c09f733ed4f45ba0e95aadd586f4fd561306b85d Mar 10 22:12:23 crc kubenswrapper[4919]: I0310 22:12:23.962821 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9fae91e8-30c8-4b1d-a243-c2cc58100766-logs\") pod \"barbican-api-f7b6c97c6-bwscr\" (UID: \"9fae91e8-30c8-4b1d-a243-c2cc58100766\") " pod="openstack/barbican-api-f7b6c97c6-bwscr" Mar 10 22:12:23 crc kubenswrapper[4919]: I0310 22:12:23.973724 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-f7b6c97c6-bwscr" Mar 10 22:12:24 crc kubenswrapper[4919]: I0310 22:12:24.338447 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-599f4d795-pgnpd" event={"ID":"408722a8-2c8a-4bda-82d5-1d2f58bda7d7","Type":"ContainerStarted","Data":"5ff781be5fb816909a57c89be66125e402a38480598bb68dfb13293aa870a6de"} Mar 10 22:12:24 crc kubenswrapper[4919]: I0310 22:12:24.352850 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7050d40c-b959-48a8-b21f-b9f5e308c920","Type":"ContainerStarted","Data":"b4663f7d7e0ab4279572096b43ee1b65d11f4de19b52fe14f7d2d0fbaf38a65d"} Mar 10 22:12:24 crc kubenswrapper[4919]: I0310 22:12:24.363706 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-76f79c8d94-r2gfk" event={"ID":"38fe889d-dbc7-448a-ade6-7847b16f85d2","Type":"ContainerStarted","Data":"940b6f73c8166172de358abfcb1a48685b9ca01570f07afeae481190ca086271"} Mar 10 22:12:24 crc kubenswrapper[4919]: I0310 22:12:24.374034 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"09d4bc6e-4f9e-4375-a816-2aad9cf376b2","Type":"ContainerStarted","Data":"bb22b6b86c9aa2a98d6e5696308d32888d151cf860adf39c4cb0d518c228adb6"} Mar 10 22:12:24 crc kubenswrapper[4919]: I0310 22:12:24.391623 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5bdd68856d-p865v" event={"ID":"1372c543-234e-41a0-936c-d17cdf422557","Type":"ContainerStarted","Data":"28ea421799fd93915a79b0dfc701c88abb9b8c143935355b79166bae18f4b953"} Mar 10 22:12:24 crc kubenswrapper[4919]: I0310 22:12:24.397665 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-854d8d6bf4-kknjq" event={"ID":"981bb03c-23be-4bf8-a9f6-cb8a552f66a5","Type":"ContainerStarted","Data":"cf163b74a00b3fc9c5f8e802c09f733ed4f45ba0e95aadd586f4fd561306b85d"} Mar 10 22:12:24 crc kubenswrapper[4919]: 
I0310 22:12:24.399837 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-98cfc95fc-fjth4" event={"ID":"0c6e8e16-fa38-44f1-8a47-c6130972b034","Type":"ContainerStarted","Data":"8a2d912a5fd8f12859d8831e07c924875a58f0d1849dd7977578ff8fe68896de"} Mar 10 22:12:24 crc kubenswrapper[4919]: I0310 22:12:24.578507 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-68449cb44c-wmmzf"] Mar 10 22:12:24 crc kubenswrapper[4919]: I0310 22:12:24.697964 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-fd8f54c58-gtj5m"] Mar 10 22:12:24 crc kubenswrapper[4919]: W0310 22:12:24.751059 4919 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9fae91e8_30c8_4b1d_a243_c2cc58100766.slice/crio-6a1a058aa8226641521e2d52d97591183471aeebed180d6054b2206eb2cae43d WatchSource:0}: Error finding container 6a1a058aa8226641521e2d52d97591183471aeebed180d6054b2206eb2cae43d: Status 404 returned error can't find the container with id 6a1a058aa8226641521e2d52d97591183471aeebed180d6054b2206eb2cae43d Mar 10 22:12:24 crc kubenswrapper[4919]: I0310 22:12:24.755222 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-f7b6c97c6-bwscr"] Mar 10 22:12:25 crc kubenswrapper[4919]: I0310 22:12:25.298860 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-5bdd68856d-p865v"] Mar 10 22:12:25 crc kubenswrapper[4919]: I0310 22:12:25.329218 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-75f54b97c6-fj5s7"] Mar 10 22:12:25 crc kubenswrapper[4919]: I0310 22:12:25.330624 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-75f54b97c6-fj5s7" Mar 10 22:12:25 crc kubenswrapper[4919]: I0310 22:12:25.334851 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Mar 10 22:12:25 crc kubenswrapper[4919]: I0310 22:12:25.335062 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Mar 10 22:12:25 crc kubenswrapper[4919]: I0310 22:12:25.369165 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28d81dfb-640f-4748-ab70-e0b393e1e595-config-data\") pod \"barbican-api-75f54b97c6-fj5s7\" (UID: \"28d81dfb-640f-4748-ab70-e0b393e1e595\") " pod="openstack/barbican-api-75f54b97c6-fj5s7" Mar 10 22:12:25 crc kubenswrapper[4919]: I0310 22:12:25.369326 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/28d81dfb-640f-4748-ab70-e0b393e1e595-internal-tls-certs\") pod \"barbican-api-75f54b97c6-fj5s7\" (UID: \"28d81dfb-640f-4748-ab70-e0b393e1e595\") " pod="openstack/barbican-api-75f54b97c6-fj5s7" Mar 10 22:12:25 crc kubenswrapper[4919]: I0310 22:12:25.369463 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/28d81dfb-640f-4748-ab70-e0b393e1e595-public-tls-certs\") pod \"barbican-api-75f54b97c6-fj5s7\" (UID: \"28d81dfb-640f-4748-ab70-e0b393e1e595\") " pod="openstack/barbican-api-75f54b97c6-fj5s7" Mar 10 22:12:25 crc kubenswrapper[4919]: I0310 22:12:25.369604 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28d81dfb-640f-4748-ab70-e0b393e1e595-combined-ca-bundle\") pod \"barbican-api-75f54b97c6-fj5s7\" (UID: 
\"28d81dfb-640f-4748-ab70-e0b393e1e595\") " pod="openstack/barbican-api-75f54b97c6-fj5s7" Mar 10 22:12:25 crc kubenswrapper[4919]: I0310 22:12:25.369632 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7f8bc\" (UniqueName: \"kubernetes.io/projected/28d81dfb-640f-4748-ab70-e0b393e1e595-kube-api-access-7f8bc\") pod \"barbican-api-75f54b97c6-fj5s7\" (UID: \"28d81dfb-640f-4748-ab70-e0b393e1e595\") " pod="openstack/barbican-api-75f54b97c6-fj5s7" Mar 10 22:12:25 crc kubenswrapper[4919]: I0310 22:12:25.369667 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/28d81dfb-640f-4748-ab70-e0b393e1e595-config-data-custom\") pod \"barbican-api-75f54b97c6-fj5s7\" (UID: \"28d81dfb-640f-4748-ab70-e0b393e1e595\") " pod="openstack/barbican-api-75f54b97c6-fj5s7" Mar 10 22:12:25 crc kubenswrapper[4919]: I0310 22:12:25.369721 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/28d81dfb-640f-4748-ab70-e0b393e1e595-logs\") pod \"barbican-api-75f54b97c6-fj5s7\" (UID: \"28d81dfb-640f-4748-ab70-e0b393e1e595\") " pod="openstack/barbican-api-75f54b97c6-fj5s7" Mar 10 22:12:25 crc kubenswrapper[4919]: I0310 22:12:25.372798 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-75f54b97c6-fj5s7"] Mar 10 22:12:25 crc kubenswrapper[4919]: I0310 22:12:25.418621 4919 generic.go:334] "Generic (PLEG): container finished" podID="0c6e8e16-fa38-44f1-8a47-c6130972b034" containerID="8a2d912a5fd8f12859d8831e07c924875a58f0d1849dd7977578ff8fe68896de" exitCode=0 Mar 10 22:12:25 crc kubenswrapper[4919]: I0310 22:12:25.418688 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-98cfc95fc-fjth4" 
event={"ID":"0c6e8e16-fa38-44f1-8a47-c6130972b034","Type":"ContainerDied","Data":"8a2d912a5fd8f12859d8831e07c924875a58f0d1849dd7977578ff8fe68896de"} Mar 10 22:12:25 crc kubenswrapper[4919]: I0310 22:12:25.423689 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-f7b6c97c6-bwscr" event={"ID":"9fae91e8-30c8-4b1d-a243-c2cc58100766","Type":"ContainerStarted","Data":"6a1a058aa8226641521e2d52d97591183471aeebed180d6054b2206eb2cae43d"} Mar 10 22:12:25 crc kubenswrapper[4919]: I0310 22:12:25.432683 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-599f4d795-pgnpd" event={"ID":"408722a8-2c8a-4bda-82d5-1d2f58bda7d7","Type":"ContainerStarted","Data":"d441cb2cbe08ef1a248f7014e5b13a5f2346dcbbe5baae3176348c48f4842be7"} Mar 10 22:12:25 crc kubenswrapper[4919]: I0310 22:12:25.433348 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-599f4d795-pgnpd" Mar 10 22:12:25 crc kubenswrapper[4919]: I0310 22:12:25.437845 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-76f79c8d94-r2gfk" event={"ID":"38fe889d-dbc7-448a-ade6-7847b16f85d2","Type":"ContainerStarted","Data":"b41b9dc117a4ea3985d0420d76edc4c3f9d048760b07300843dce013e6bbe4cc"} Mar 10 22:12:25 crc kubenswrapper[4919]: I0310 22:12:25.439138 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-854d8d6bf4-kknjq" event={"ID":"981bb03c-23be-4bf8-a9f6-cb8a552f66a5","Type":"ContainerStarted","Data":"e4d70f78ccff4f0649cd6b2f0b66c626faab3e22bb0695b32e2d8f790b8b831d"} Mar 10 22:12:25 crc kubenswrapper[4919]: I0310 22:12:25.439895 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-68449cb44c-wmmzf" event={"ID":"7c9bf9de-7c1a-436e-a8a0-6c987b34a5e0","Type":"ContainerStarted","Data":"86bd79f50ff0d5fef2fbc0b8bd95d50c28ad34e052fd53736b71a374fa81dc64"} Mar 10 22:12:25 crc kubenswrapper[4919]: I0310 22:12:25.443758 4919 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/barbican-keystone-listener-fd8f54c58-gtj5m" event={"ID":"31690f34-6b68-4470-a13e-e16121ec25d2","Type":"ContainerStarted","Data":"3cafad43b499322396f941900bf49a47d1ab90bc7b553d3f2f80bfff3b36dfe5"} Mar 10 22:12:25 crc kubenswrapper[4919]: I0310 22:12:25.461646 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-599f4d795-pgnpd" podStartSLOduration=3.46162783 podStartE2EDuration="3.46162783s" podCreationTimestamp="2026-03-10 22:12:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 22:12:25.456296976 +0000 UTC m=+1332.698177584" watchObservedRunningTime="2026-03-10 22:12:25.46162783 +0000 UTC m=+1332.703508438" Mar 10 22:12:25 crc kubenswrapper[4919]: I0310 22:12:25.470911 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/28d81dfb-640f-4748-ab70-e0b393e1e595-public-tls-certs\") pod \"barbican-api-75f54b97c6-fj5s7\" (UID: \"28d81dfb-640f-4748-ab70-e0b393e1e595\") " pod="openstack/barbican-api-75f54b97c6-fj5s7" Mar 10 22:12:25 crc kubenswrapper[4919]: I0310 22:12:25.471352 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28d81dfb-640f-4748-ab70-e0b393e1e595-combined-ca-bundle\") pod \"barbican-api-75f54b97c6-fj5s7\" (UID: \"28d81dfb-640f-4748-ab70-e0b393e1e595\") " pod="openstack/barbican-api-75f54b97c6-fj5s7" Mar 10 22:12:25 crc kubenswrapper[4919]: I0310 22:12:25.471383 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7f8bc\" (UniqueName: \"kubernetes.io/projected/28d81dfb-640f-4748-ab70-e0b393e1e595-kube-api-access-7f8bc\") pod \"barbican-api-75f54b97c6-fj5s7\" (UID: \"28d81dfb-640f-4748-ab70-e0b393e1e595\") " pod="openstack/barbican-api-75f54b97c6-fj5s7" Mar 10 22:12:25 
crc kubenswrapper[4919]: I0310 22:12:25.471441 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/28d81dfb-640f-4748-ab70-e0b393e1e595-config-data-custom\") pod \"barbican-api-75f54b97c6-fj5s7\" (UID: \"28d81dfb-640f-4748-ab70-e0b393e1e595\") " pod="openstack/barbican-api-75f54b97c6-fj5s7" Mar 10 22:12:25 crc kubenswrapper[4919]: I0310 22:12:25.471482 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/28d81dfb-640f-4748-ab70-e0b393e1e595-logs\") pod \"barbican-api-75f54b97c6-fj5s7\" (UID: \"28d81dfb-640f-4748-ab70-e0b393e1e595\") " pod="openstack/barbican-api-75f54b97c6-fj5s7" Mar 10 22:12:25 crc kubenswrapper[4919]: I0310 22:12:25.471679 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28d81dfb-640f-4748-ab70-e0b393e1e595-config-data\") pod \"barbican-api-75f54b97c6-fj5s7\" (UID: \"28d81dfb-640f-4748-ab70-e0b393e1e595\") " pod="openstack/barbican-api-75f54b97c6-fj5s7" Mar 10 22:12:25 crc kubenswrapper[4919]: I0310 22:12:25.471736 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/28d81dfb-640f-4748-ab70-e0b393e1e595-internal-tls-certs\") pod \"barbican-api-75f54b97c6-fj5s7\" (UID: \"28d81dfb-640f-4748-ab70-e0b393e1e595\") " pod="openstack/barbican-api-75f54b97c6-fj5s7" Mar 10 22:12:25 crc kubenswrapper[4919]: I0310 22:12:25.472829 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/28d81dfb-640f-4748-ab70-e0b393e1e595-logs\") pod \"barbican-api-75f54b97c6-fj5s7\" (UID: \"28d81dfb-640f-4748-ab70-e0b393e1e595\") " pod="openstack/barbican-api-75f54b97c6-fj5s7" Mar 10 22:12:25 crc kubenswrapper[4919]: I0310 22:12:25.475674 4919 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/28d81dfb-640f-4748-ab70-e0b393e1e595-internal-tls-certs\") pod \"barbican-api-75f54b97c6-fj5s7\" (UID: \"28d81dfb-640f-4748-ab70-e0b393e1e595\") " pod="openstack/barbican-api-75f54b97c6-fj5s7" Mar 10 22:12:25 crc kubenswrapper[4919]: I0310 22:12:25.475811 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28d81dfb-640f-4748-ab70-e0b393e1e595-combined-ca-bundle\") pod \"barbican-api-75f54b97c6-fj5s7\" (UID: \"28d81dfb-640f-4748-ab70-e0b393e1e595\") " pod="openstack/barbican-api-75f54b97c6-fj5s7" Mar 10 22:12:25 crc kubenswrapper[4919]: I0310 22:12:25.476281 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28d81dfb-640f-4748-ab70-e0b393e1e595-config-data\") pod \"barbican-api-75f54b97c6-fj5s7\" (UID: \"28d81dfb-640f-4748-ab70-e0b393e1e595\") " pod="openstack/barbican-api-75f54b97c6-fj5s7" Mar 10 22:12:25 crc kubenswrapper[4919]: I0310 22:12:25.480090 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/28d81dfb-640f-4748-ab70-e0b393e1e595-config-data-custom\") pod \"barbican-api-75f54b97c6-fj5s7\" (UID: \"28d81dfb-640f-4748-ab70-e0b393e1e595\") " pod="openstack/barbican-api-75f54b97c6-fj5s7" Mar 10 22:12:25 crc kubenswrapper[4919]: I0310 22:12:25.480915 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/28d81dfb-640f-4748-ab70-e0b393e1e595-public-tls-certs\") pod \"barbican-api-75f54b97c6-fj5s7\" (UID: \"28d81dfb-640f-4748-ab70-e0b393e1e595\") " pod="openstack/barbican-api-75f54b97c6-fj5s7" Mar 10 22:12:25 crc kubenswrapper[4919]: I0310 22:12:25.491510 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7f8bc\" 
(UniqueName: \"kubernetes.io/projected/28d81dfb-640f-4748-ab70-e0b393e1e595-kube-api-access-7f8bc\") pod \"barbican-api-75f54b97c6-fj5s7\" (UID: \"28d81dfb-640f-4748-ab70-e0b393e1e595\") " pod="openstack/barbican-api-75f54b97c6-fj5s7" Mar 10 22:12:25 crc kubenswrapper[4919]: I0310 22:12:25.654004 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-75f54b97c6-fj5s7" Mar 10 22:12:26 crc kubenswrapper[4919]: I0310 22:12:26.115135 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-75f54b97c6-fj5s7"] Mar 10 22:12:26 crc kubenswrapper[4919]: W0310 22:12:26.132195 4919 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod28d81dfb_640f_4748_ab70_e0b393e1e595.slice/crio-31a2f38b1aa47f4dfe221b42d83a3e6d2c6d06a0c2690f09cd1a5afb08ce4465 WatchSource:0}: Error finding container 31a2f38b1aa47f4dfe221b42d83a3e6d2c6d06a0c2690f09cd1a5afb08ce4465: Status 404 returned error can't find the container with id 31a2f38b1aa47f4dfe221b42d83a3e6d2c6d06a0c2690f09cd1a5afb08ce4465 Mar 10 22:12:26 crc kubenswrapper[4919]: I0310 22:12:26.454562 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-f7b6c97c6-bwscr" event={"ID":"9fae91e8-30c8-4b1d-a243-c2cc58100766","Type":"ContainerStarted","Data":"2b30e15d87c6fddad7f6c0a030db8b5f1462206ab3c95932ffc9a7cef934c3ef"} Mar 10 22:12:26 crc kubenswrapper[4919]: I0310 22:12:26.456185 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"09d4bc6e-4f9e-4375-a816-2aad9cf376b2","Type":"ContainerStarted","Data":"c24d19b94dafe98f4ece853e6e12aa2d60a40b12a18aa5a70a19ec640535ba24"} Mar 10 22:12:26 crc kubenswrapper[4919]: I0310 22:12:26.457767 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5bdd68856d-p865v" 
event={"ID":"1372c543-234e-41a0-936c-d17cdf422557","Type":"ContainerStarted","Data":"2a1dde30ee60f04c9661f8f26a86bd29b51925be1332eaadab99a23ba68de6c7"} Mar 10 22:12:26 crc kubenswrapper[4919]: I0310 22:12:26.458767 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-75f54b97c6-fj5s7" event={"ID":"28d81dfb-640f-4748-ab70-e0b393e1e595","Type":"ContainerStarted","Data":"31a2f38b1aa47f4dfe221b42d83a3e6d2c6d06a0c2690f09cd1a5afb08ce4465"} Mar 10 22:12:27 crc kubenswrapper[4919]: I0310 22:12:27.475453 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-854d8d6bf4-kknjq" event={"ID":"981bb03c-23be-4bf8-a9f6-cb8a552f66a5","Type":"ContainerStarted","Data":"d88c0958bf40600808f9977f231a5fa1e34419ab9909560a692746f53c31f4f0"} Mar 10 22:12:27 crc kubenswrapper[4919]: I0310 22:12:27.484222 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-5bdd68856d-p865v" podUID="1372c543-234e-41a0-936c-d17cdf422557" containerName="barbican-api-log" containerID="cri-o://28ea421799fd93915a79b0dfc701c88abb9b8c143935355b79166bae18f4b953" gracePeriod=30 Mar 10 22:12:27 crc kubenswrapper[4919]: I0310 22:12:27.484604 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-5bdd68856d-p865v" podUID="1372c543-234e-41a0-936c-d17cdf422557" containerName="barbican-api" containerID="cri-o://2a1dde30ee60f04c9661f8f26a86bd29b51925be1332eaadab99a23ba68de6c7" gracePeriod=30 Mar 10 22:12:27 crc kubenswrapper[4919]: I0310 22:12:27.495048 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5bdd68856d-p865v" Mar 10 22:12:27 crc kubenswrapper[4919]: I0310 22:12:27.495091 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5bdd68856d-p865v" Mar 10 22:12:27 crc kubenswrapper[4919]: I0310 22:12:27.495102 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-98cfc95fc-fjth4" event={"ID":"0c6e8e16-fa38-44f1-8a47-c6130972b034","Type":"ContainerStarted","Data":"066e5c6c728a8a11a1ef807f6807f233d6540526942d3d96b951887a7e7d7b9b"} Mar 10 22:12:27 crc kubenswrapper[4919]: I0310 22:12:27.495123 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7050d40c-b959-48a8-b21f-b9f5e308c920","Type":"ContainerStarted","Data":"2734d9e2168cf611b701bf7332456b621e4b8962de1b13492fc28d84fb7815b5"} Mar 10 22:12:27 crc kubenswrapper[4919]: I0310 22:12:27.495137 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-76f79c8d94-r2gfk" event={"ID":"38fe889d-dbc7-448a-ade6-7847b16f85d2","Type":"ContainerStarted","Data":"22f47a828dbe40df452a7740f55740269c4d5812ffec016b63b10dce5fbf14e6"} Mar 10 22:12:27 crc kubenswrapper[4919]: I0310 22:12:27.524469 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-5bdd68856d-p865v" podStartSLOduration=6.524454232 podStartE2EDuration="6.524454232s" podCreationTimestamp="2026-03-10 22:12:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 22:12:27.519798576 +0000 UTC m=+1334.761679184" watchObservedRunningTime="2026-03-10 22:12:27.524454232 +0000 UTC m=+1334.766334840" Mar 10 22:12:28 crc kubenswrapper[4919]: I0310 22:12:28.504778 4919 generic.go:334] "Generic (PLEG): container finished" podID="1372c543-234e-41a0-936c-d17cdf422557" containerID="2a1dde30ee60f04c9661f8f26a86bd29b51925be1332eaadab99a23ba68de6c7" exitCode=0 Mar 10 22:12:28 crc kubenswrapper[4919]: I0310 22:12:28.505115 4919 generic.go:334] "Generic (PLEG): container finished" podID="1372c543-234e-41a0-936c-d17cdf422557" containerID="28ea421799fd93915a79b0dfc701c88abb9b8c143935355b79166bae18f4b953" exitCode=143 Mar 10 22:12:28 crc kubenswrapper[4919]: I0310 22:12:28.504879 4919 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5bdd68856d-p865v" event={"ID":"1372c543-234e-41a0-936c-d17cdf422557","Type":"ContainerDied","Data":"2a1dde30ee60f04c9661f8f26a86bd29b51925be1332eaadab99a23ba68de6c7"} Mar 10 22:12:28 crc kubenswrapper[4919]: I0310 22:12:28.505179 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5bdd68856d-p865v" event={"ID":"1372c543-234e-41a0-936c-d17cdf422557","Type":"ContainerDied","Data":"28ea421799fd93915a79b0dfc701c88abb9b8c143935355b79166bae18f4b953"} Mar 10 22:12:28 crc kubenswrapper[4919]: I0310 22:12:28.509690 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-75f54b97c6-fj5s7" event={"ID":"28d81dfb-640f-4748-ab70-e0b393e1e595","Type":"ContainerStarted","Data":"4f419ebd70f99390116647c66037cbdcde78f060d7d9ba4c1e4bafbc7b53452c"} Mar 10 22:12:28 crc kubenswrapper[4919]: I0310 22:12:28.510066 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-76f79c8d94-r2gfk" Mar 10 22:12:28 crc kubenswrapper[4919]: I0310 22:12:28.510099 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-76f79c8d94-r2gfk" Mar 10 22:12:28 crc kubenswrapper[4919]: I0310 22:12:28.535954 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 10 22:12:28 crc kubenswrapper[4919]: I0310 22:12:28.535994 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 10 22:12:28 crc kubenswrapper[4919]: I0310 22:12:28.538820 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-76f79c8d94-r2gfk" podStartSLOduration=6.538803054 podStartE2EDuration="6.538803054s" podCreationTimestamp="2026-03-10 22:12:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-10 22:12:28.526615964 +0000 UTC m=+1335.768496582" watchObservedRunningTime="2026-03-10 22:12:28.538803054 +0000 UTC m=+1335.780683662" Mar 10 22:12:28 crc kubenswrapper[4919]: I0310 22:12:28.560895 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=10.560859802 podStartE2EDuration="10.560859802s" podCreationTimestamp="2026-03-10 22:12:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 22:12:28.547609692 +0000 UTC m=+1335.789490300" watchObservedRunningTime="2026-03-10 22:12:28.560859802 +0000 UTC m=+1335.802740430" Mar 10 22:12:28 crc kubenswrapper[4919]: I0310 22:12:28.580698 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-98cfc95fc-fjth4" podStartSLOduration=7.580681449 podStartE2EDuration="7.580681449s" podCreationTimestamp="2026-03-10 22:12:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 22:12:28.578973503 +0000 UTC m=+1335.820854111" watchObservedRunningTime="2026-03-10 22:12:28.580681449 +0000 UTC m=+1335.822562057" Mar 10 22:12:28 crc kubenswrapper[4919]: I0310 22:12:28.582807 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 10 22:12:28 crc kubenswrapper[4919]: I0310 22:12:28.588449 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 10 22:12:28 crc kubenswrapper[4919]: I0310 22:12:28.589545 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 10 22:12:28 crc kubenswrapper[4919]: I0310 22:12:28.589609 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack/glance-default-internal-api-0" Mar 10 22:12:28 crc kubenswrapper[4919]: I0310 22:12:28.603870 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=10.603849886 podStartE2EDuration="10.603849886s" podCreationTimestamp="2026-03-10 22:12:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 22:12:28.602433708 +0000 UTC m=+1335.844314326" watchObservedRunningTime="2026-03-10 22:12:28.603849886 +0000 UTC m=+1335.845730494" Mar 10 22:12:28 crc kubenswrapper[4919]: I0310 22:12:28.627565 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 10 22:12:28 crc kubenswrapper[4919]: I0310 22:12:28.651925 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 10 22:12:29 crc kubenswrapper[4919]: I0310 22:12:29.192375 4919 patch_prober.go:28] interesting pod/machine-config-daemon-z7v4t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 22:12:29 crc kubenswrapper[4919]: I0310 22:12:29.194994 4919 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 22:12:29 crc kubenswrapper[4919]: I0310 22:12:29.572964 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-854d8d6bf4-kknjq" podStartSLOduration=7.572930843 podStartE2EDuration="7.572930843s" 
podCreationTimestamp="2026-03-10 22:12:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 22:12:29.564778452 +0000 UTC m=+1336.806659100" watchObservedRunningTime="2026-03-10 22:12:29.572930843 +0000 UTC m=+1336.814811461" Mar 10 22:12:29 crc kubenswrapper[4919]: I0310 22:12:29.587825 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-f7b6c97c6-bwscr" podStartSLOduration=6.587805445 podStartE2EDuration="6.587805445s" podCreationTimestamp="2026-03-10 22:12:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 22:12:29.586666924 +0000 UTC m=+1336.828547532" watchObservedRunningTime="2026-03-10 22:12:29.587805445 +0000 UTC m=+1336.829686053" Mar 10 22:12:29 crc kubenswrapper[4919]: I0310 22:12:29.743200 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-f7b6c97c6-bwscr" event={"ID":"9fae91e8-30c8-4b1d-a243-c2cc58100766","Type":"ContainerStarted","Data":"aebd460c1524d22d0ce665b0b841469a1c52d0bdede4cadc1c9e4e9f42ec6354"} Mar 10 22:12:29 crc kubenswrapper[4919]: I0310 22:12:29.743289 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 10 22:12:29 crc kubenswrapper[4919]: I0310 22:12:29.743308 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 10 22:12:29 crc kubenswrapper[4919]: I0310 22:12:29.743320 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 10 22:12:29 crc kubenswrapper[4919]: I0310 22:12:29.743331 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 10 22:12:29 crc kubenswrapper[4919]: I0310 22:12:29.788916 4919 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5bdd68856d-p865v" Mar 10 22:12:29 crc kubenswrapper[4919]: I0310 22:12:29.971149 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1372c543-234e-41a0-936c-d17cdf422557-logs\") pod \"1372c543-234e-41a0-936c-d17cdf422557\" (UID: \"1372c543-234e-41a0-936c-d17cdf422557\") " Mar 10 22:12:29 crc kubenswrapper[4919]: I0310 22:12:29.971562 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1372c543-234e-41a0-936c-d17cdf422557-config-data\") pod \"1372c543-234e-41a0-936c-d17cdf422557\" (UID: \"1372c543-234e-41a0-936c-d17cdf422557\") " Mar 10 22:12:29 crc kubenswrapper[4919]: I0310 22:12:29.971700 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-djwq4\" (UniqueName: \"kubernetes.io/projected/1372c543-234e-41a0-936c-d17cdf422557-kube-api-access-djwq4\") pod \"1372c543-234e-41a0-936c-d17cdf422557\" (UID: \"1372c543-234e-41a0-936c-d17cdf422557\") " Mar 10 22:12:29 crc kubenswrapper[4919]: I0310 22:12:29.971959 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1372c543-234e-41a0-936c-d17cdf422557-combined-ca-bundle\") pod \"1372c543-234e-41a0-936c-d17cdf422557\" (UID: \"1372c543-234e-41a0-936c-d17cdf422557\") " Mar 10 22:12:29 crc kubenswrapper[4919]: I0310 22:12:29.972114 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1372c543-234e-41a0-936c-d17cdf422557-config-data-custom\") pod \"1372c543-234e-41a0-936c-d17cdf422557\" (UID: \"1372c543-234e-41a0-936c-d17cdf422557\") " Mar 10 22:12:30 crc kubenswrapper[4919]: I0310 22:12:30.152753 4919 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/empty-dir/1372c543-234e-41a0-936c-d17cdf422557-logs" (OuterVolumeSpecName: "logs") pod "1372c543-234e-41a0-936c-d17cdf422557" (UID: "1372c543-234e-41a0-936c-d17cdf422557"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 22:12:30 crc kubenswrapper[4919]: I0310 22:12:30.156544 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1372c543-234e-41a0-936c-d17cdf422557-config-data" (OuterVolumeSpecName: "config-data") pod "1372c543-234e-41a0-936c-d17cdf422557" (UID: "1372c543-234e-41a0-936c-d17cdf422557"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:12:30 crc kubenswrapper[4919]: I0310 22:12:30.156577 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1372c543-234e-41a0-936c-d17cdf422557-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1372c543-234e-41a0-936c-d17cdf422557" (UID: "1372c543-234e-41a0-936c-d17cdf422557"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:12:30 crc kubenswrapper[4919]: I0310 22:12:30.156727 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1372c543-234e-41a0-936c-d17cdf422557-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "1372c543-234e-41a0-936c-d17cdf422557" (UID: "1372c543-234e-41a0-936c-d17cdf422557"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:12:30 crc kubenswrapper[4919]: I0310 22:12:30.156582 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1372c543-234e-41a0-936c-d17cdf422557-kube-api-access-djwq4" (OuterVolumeSpecName: "kube-api-access-djwq4") pod "1372c543-234e-41a0-936c-d17cdf422557" (UID: "1372c543-234e-41a0-936c-d17cdf422557"). 
InnerVolumeSpecName "kube-api-access-djwq4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 22:12:30 crc kubenswrapper[4919]: I0310 22:12:30.177637 4919 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1372c543-234e-41a0-936c-d17cdf422557-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 22:12:30 crc kubenswrapper[4919]: I0310 22:12:30.177741 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-djwq4\" (UniqueName: \"kubernetes.io/projected/1372c543-234e-41a0-936c-d17cdf422557-kube-api-access-djwq4\") on node \"crc\" DevicePath \"\"" Mar 10 22:12:30 crc kubenswrapper[4919]: I0310 22:12:30.177759 4919 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1372c543-234e-41a0-936c-d17cdf422557-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 22:12:30 crc kubenswrapper[4919]: I0310 22:12:30.177771 4919 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1372c543-234e-41a0-936c-d17cdf422557-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 10 22:12:30 crc kubenswrapper[4919]: I0310 22:12:30.177782 4919 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1372c543-234e-41a0-936c-d17cdf422557-logs\") on node \"crc\" DevicePath \"\"" Mar 10 22:12:30 crc kubenswrapper[4919]: I0310 22:12:30.576423 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5bdd68856d-p865v" event={"ID":"1372c543-234e-41a0-936c-d17cdf422557","Type":"ContainerDied","Data":"c28d27639234ea92cc3e7f545511a7d388c747c9d02aa121c5d7dfc048ff89d5"} Mar 10 22:12:30 crc kubenswrapper[4919]: I0310 22:12:30.576794 4919 scope.go:117] "RemoveContainer" containerID="2a1dde30ee60f04c9661f8f26a86bd29b51925be1332eaadab99a23ba68de6c7" Mar 10 22:12:30 crc kubenswrapper[4919]: I0310 22:12:30.576671 4919 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5bdd68856d-p865v" Mar 10 22:12:30 crc kubenswrapper[4919]: I0310 22:12:30.586640 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-75f54b97c6-fj5s7" event={"ID":"28d81dfb-640f-4748-ab70-e0b393e1e595","Type":"ContainerStarted","Data":"fca95d527891107e2d3047ae093c871e26b3b61e0a45e89186497f024bbb6624"} Mar 10 22:12:30 crc kubenswrapper[4919]: I0310 22:12:30.587829 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-f7b6c97c6-bwscr" Mar 10 22:12:30 crc kubenswrapper[4919]: I0310 22:12:30.587899 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-75f54b97c6-fj5s7" Mar 10 22:12:30 crc kubenswrapper[4919]: I0310 22:12:30.587922 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-75f54b97c6-fj5s7" Mar 10 22:12:30 crc kubenswrapper[4919]: I0310 22:12:30.587938 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-f7b6c97c6-bwscr" Mar 10 22:12:30 crc kubenswrapper[4919]: I0310 22:12:30.633840 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-75f54b97c6-fj5s7" podStartSLOduration=5.633797305 podStartE2EDuration="5.633797305s" podCreationTimestamp="2026-03-10 22:12:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 22:12:30.615308034 +0000 UTC m=+1337.857188642" watchObservedRunningTime="2026-03-10 22:12:30.633797305 +0000 UTC m=+1337.875677913" Mar 10 22:12:30 crc kubenswrapper[4919]: I0310 22:12:30.649152 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-5bdd68856d-p865v"] Mar 10 22:12:30 crc kubenswrapper[4919]: I0310 22:12:30.662078 4919 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openstack/barbican-api-5bdd68856d-p865v"] Mar 10 22:12:30 crc kubenswrapper[4919]: I0310 22:12:30.972621 4919 scope.go:117] "RemoveContainer" containerID="28ea421799fd93915a79b0dfc701c88abb9b8c143935355b79166bae18f4b953" Mar 10 22:12:31 crc kubenswrapper[4919]: I0310 22:12:31.006092 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-76f79c8d94-r2gfk" Mar 10 22:12:31 crc kubenswrapper[4919]: I0310 22:12:31.501813 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1372c543-234e-41a0-936c-d17cdf422557" path="/var/lib/kubelet/pods/1372c543-234e-41a0-936c-d17cdf422557/volumes" Mar 10 22:12:31 crc kubenswrapper[4919]: I0310 22:12:31.510196 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 10 22:12:31 crc kubenswrapper[4919]: I0310 22:12:31.597113 4919 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 10 22:12:31 crc kubenswrapper[4919]: I0310 22:12:31.950931 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-98cfc95fc-fjth4" Mar 10 22:12:31 crc kubenswrapper[4919]: I0310 22:12:31.951624 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-98cfc95fc-fjth4" Mar 10 22:12:32 crc kubenswrapper[4919]: I0310 22:12:32.021341 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d74f8fb89-bxf6w"] Mar 10 22:12:32 crc kubenswrapper[4919]: I0310 22:12:32.021646 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6d74f8fb89-bxf6w" podUID="d0851c6f-1817-40b4-a9b1-b0d3b95e9b5b" containerName="dnsmasq-dns" containerID="cri-o://a1c98cb48521f20303a44e0f87321846803fe8c851e0a4d4b5f139c570f1c0b6" gracePeriod=10 Mar 10 22:12:32 crc kubenswrapper[4919]: I0310 22:12:32.262221 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack/glance-default-external-api-0" Mar 10 22:12:32 crc kubenswrapper[4919]: I0310 22:12:32.489318 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-76f79c8d94-r2gfk" Mar 10 22:12:32 crc kubenswrapper[4919]: I0310 22:12:32.608616 4919 generic.go:334] "Generic (PLEG): container finished" podID="d0851c6f-1817-40b4-a9b1-b0d3b95e9b5b" containerID="a1c98cb48521f20303a44e0f87321846803fe8c851e0a4d4b5f139c570f1c0b6" exitCode=0 Mar 10 22:12:32 crc kubenswrapper[4919]: I0310 22:12:32.610071 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d74f8fb89-bxf6w" event={"ID":"d0851c6f-1817-40b4-a9b1-b0d3b95e9b5b","Type":"ContainerDied","Data":"a1c98cb48521f20303a44e0f87321846803fe8c851e0a4d4b5f139c570f1c0b6"} Mar 10 22:12:33 crc kubenswrapper[4919]: I0310 22:12:33.982454 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 10 22:12:34 crc kubenswrapper[4919]: I0310 22:12:34.111464 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-f7b6c97c6-bwscr" Mar 10 22:12:34 crc kubenswrapper[4919]: I0310 22:12:34.265545 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 10 22:12:34 crc kubenswrapper[4919]: I0310 22:12:34.526015 4919 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-6d74f8fb89-bxf6w" podUID="d0851c6f-1817-40b4-a9b1-b0d3b95e9b5b" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.131:5353: connect: connection refused" Mar 10 22:12:35 crc kubenswrapper[4919]: I0310 22:12:35.032867 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-75f54b97c6-fj5s7" Mar 10 22:12:35 crc kubenswrapper[4919]: I0310 22:12:35.765270 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack/barbican-api-f7b6c97c6-bwscr" Mar 10 22:12:36 crc kubenswrapper[4919]: I0310 22:12:36.489132 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-75f54b97c6-fj5s7" Mar 10 22:12:36 crc kubenswrapper[4919]: I0310 22:12:36.534962 4919 scope.go:117] "RemoveContainer" containerID="e96f1bee3c9c99f93c624b934359fe8d0bc3b475634cab74960b17882e799afb" Mar 10 22:12:36 crc kubenswrapper[4919]: I0310 22:12:36.571613 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-f7b6c97c6-bwscr"] Mar 10 22:12:36 crc kubenswrapper[4919]: I0310 22:12:36.671540 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-f7b6c97c6-bwscr" podUID="9fae91e8-30c8-4b1d-a243-c2cc58100766" containerName="barbican-api-log" containerID="cri-o://2b30e15d87c6fddad7f6c0a030db8b5f1462206ab3c95932ffc9a7cef934c3ef" gracePeriod=30 Mar 10 22:12:36 crc kubenswrapper[4919]: I0310 22:12:36.671966 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-f7b6c97c6-bwscr" podUID="9fae91e8-30c8-4b1d-a243-c2cc58100766" containerName="barbican-api" containerID="cri-o://aebd460c1524d22d0ce665b0b841469a1c52d0bdede4cadc1c9e4e9f42ec6354" gracePeriod=30 Mar 10 22:12:36 crc kubenswrapper[4919]: I0310 22:12:36.692696 4919 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-f7b6c97c6-bwscr" podUID="9fae91e8-30c8-4b1d-a243-c2cc58100766" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.166:9311/healthcheck\": EOF" Mar 10 22:12:36 crc kubenswrapper[4919]: I0310 22:12:36.693066 4919 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-f7b6c97c6-bwscr" podUID="9fae91e8-30c8-4b1d-a243-c2cc58100766" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.166:9311/healthcheck\": EOF" Mar 10 22:12:36 crc kubenswrapper[4919]: 
I0310 22:12:36.692918 4919 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-f7b6c97c6-bwscr" podUID="9fae91e8-30c8-4b1d-a243-c2cc58100766" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.166:9311/healthcheck\": EOF" Mar 10 22:12:36 crc kubenswrapper[4919]: I0310 22:12:36.692790 4919 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-f7b6c97c6-bwscr" podUID="9fae91e8-30c8-4b1d-a243-c2cc58100766" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.166:9311/healthcheck\": EOF" Mar 10 22:12:37 crc kubenswrapper[4919]: I0310 22:12:37.689293 4919 generic.go:334] "Generic (PLEG): container finished" podID="9fae91e8-30c8-4b1d-a243-c2cc58100766" containerID="2b30e15d87c6fddad7f6c0a030db8b5f1462206ab3c95932ffc9a7cef934c3ef" exitCode=143 Mar 10 22:12:37 crc kubenswrapper[4919]: I0310 22:12:37.689345 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-f7b6c97c6-bwscr" event={"ID":"9fae91e8-30c8-4b1d-a243-c2cc58100766","Type":"ContainerDied","Data":"2b30e15d87c6fddad7f6c0a030db8b5f1462206ab3c95932ffc9a7cef934c3ef"} Mar 10 22:12:38 crc kubenswrapper[4919]: I0310 22:12:38.408179 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6d74f8fb89-bxf6w" Mar 10 22:12:38 crc kubenswrapper[4919]: I0310 22:12:38.514758 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d0851c6f-1817-40b4-a9b1-b0d3b95e9b5b-ovsdbserver-nb\") pod \"d0851c6f-1817-40b4-a9b1-b0d3b95e9b5b\" (UID: \"d0851c6f-1817-40b4-a9b1-b0d3b95e9b5b\") " Mar 10 22:12:38 crc kubenswrapper[4919]: I0310 22:12:38.514828 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-49nvv\" (UniqueName: \"kubernetes.io/projected/d0851c6f-1817-40b4-a9b1-b0d3b95e9b5b-kube-api-access-49nvv\") pod \"d0851c6f-1817-40b4-a9b1-b0d3b95e9b5b\" (UID: \"d0851c6f-1817-40b4-a9b1-b0d3b95e9b5b\") " Mar 10 22:12:38 crc kubenswrapper[4919]: I0310 22:12:38.515013 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0851c6f-1817-40b4-a9b1-b0d3b95e9b5b-config\") pod \"d0851c6f-1817-40b4-a9b1-b0d3b95e9b5b\" (UID: \"d0851c6f-1817-40b4-a9b1-b0d3b95e9b5b\") " Mar 10 22:12:38 crc kubenswrapper[4919]: I0310 22:12:38.515072 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d0851c6f-1817-40b4-a9b1-b0d3b95e9b5b-dns-swift-storage-0\") pod \"d0851c6f-1817-40b4-a9b1-b0d3b95e9b5b\" (UID: \"d0851c6f-1817-40b4-a9b1-b0d3b95e9b5b\") " Mar 10 22:12:38 crc kubenswrapper[4919]: I0310 22:12:38.515136 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d0851c6f-1817-40b4-a9b1-b0d3b95e9b5b-ovsdbserver-sb\") pod \"d0851c6f-1817-40b4-a9b1-b0d3b95e9b5b\" (UID: \"d0851c6f-1817-40b4-a9b1-b0d3b95e9b5b\") " Mar 10 22:12:38 crc kubenswrapper[4919]: I0310 22:12:38.515167 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d0851c6f-1817-40b4-a9b1-b0d3b95e9b5b-dns-svc\") pod \"d0851c6f-1817-40b4-a9b1-b0d3b95e9b5b\" (UID: \"d0851c6f-1817-40b4-a9b1-b0d3b95e9b5b\") " Mar 10 22:12:38 crc kubenswrapper[4919]: I0310 22:12:38.521300 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0851c6f-1817-40b4-a9b1-b0d3b95e9b5b-kube-api-access-49nvv" (OuterVolumeSpecName: "kube-api-access-49nvv") pod "d0851c6f-1817-40b4-a9b1-b0d3b95e9b5b" (UID: "d0851c6f-1817-40b4-a9b1-b0d3b95e9b5b"). InnerVolumeSpecName "kube-api-access-49nvv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 22:12:38 crc kubenswrapper[4919]: I0310 22:12:38.563686 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d0851c6f-1817-40b4-a9b1-b0d3b95e9b5b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d0851c6f-1817-40b4-a9b1-b0d3b95e9b5b" (UID: "d0851c6f-1817-40b4-a9b1-b0d3b95e9b5b"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 22:12:38 crc kubenswrapper[4919]: I0310 22:12:38.568745 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d0851c6f-1817-40b4-a9b1-b0d3b95e9b5b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d0851c6f-1817-40b4-a9b1-b0d3b95e9b5b" (UID: "d0851c6f-1817-40b4-a9b1-b0d3b95e9b5b"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 22:12:38 crc kubenswrapper[4919]: I0310 22:12:38.572979 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d0851c6f-1817-40b4-a9b1-b0d3b95e9b5b-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "d0851c6f-1817-40b4-a9b1-b0d3b95e9b5b" (UID: "d0851c6f-1817-40b4-a9b1-b0d3b95e9b5b"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 22:12:38 crc kubenswrapper[4919]: I0310 22:12:38.603328 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d0851c6f-1817-40b4-a9b1-b0d3b95e9b5b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d0851c6f-1817-40b4-a9b1-b0d3b95e9b5b" (UID: "d0851c6f-1817-40b4-a9b1-b0d3b95e9b5b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 22:12:38 crc kubenswrapper[4919]: I0310 22:12:38.604882 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d0851c6f-1817-40b4-a9b1-b0d3b95e9b5b-config" (OuterVolumeSpecName: "config") pod "d0851c6f-1817-40b4-a9b1-b0d3b95e9b5b" (UID: "d0851c6f-1817-40b4-a9b1-b0d3b95e9b5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 22:12:38 crc kubenswrapper[4919]: I0310 22:12:38.617218 4919 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d0851c6f-1817-40b4-a9b1-b0d3b95e9b5b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 10 22:12:38 crc kubenswrapper[4919]: I0310 22:12:38.617261 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-49nvv\" (UniqueName: \"kubernetes.io/projected/d0851c6f-1817-40b4-a9b1-b0d3b95e9b5b-kube-api-access-49nvv\") on node \"crc\" DevicePath \"\"" Mar 10 22:12:38 crc kubenswrapper[4919]: I0310 22:12:38.617279 4919 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0851c6f-1817-40b4-a9b1-b0d3b95e9b5b-config\") on node \"crc\" DevicePath \"\"" Mar 10 22:12:38 crc kubenswrapper[4919]: I0310 22:12:38.617294 4919 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d0851c6f-1817-40b4-a9b1-b0d3b95e9b5b-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 10 22:12:38 crc 
kubenswrapper[4919]: I0310 22:12:38.617312 4919 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d0851c6f-1817-40b4-a9b1-b0d3b95e9b5b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 10 22:12:38 crc kubenswrapper[4919]: I0310 22:12:38.617327 4919 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d0851c6f-1817-40b4-a9b1-b0d3b95e9b5b-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 10 22:12:38 crc kubenswrapper[4919]: I0310 22:12:38.699677 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d74f8fb89-bxf6w" event={"ID":"d0851c6f-1817-40b4-a9b1-b0d3b95e9b5b","Type":"ContainerDied","Data":"65db7f15ece34513ec2cf14efea17490829cd3b67811ba9ec02c293c7c36bc96"} Mar 10 22:12:38 crc kubenswrapper[4919]: I0310 22:12:38.699712 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d74f8fb89-bxf6w" Mar 10 22:12:38 crc kubenswrapper[4919]: I0310 22:12:38.738198 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d74f8fb89-bxf6w"] Mar 10 22:12:38 crc kubenswrapper[4919]: I0310 22:12:38.748501 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6d74f8fb89-bxf6w"] Mar 10 22:12:39 crc kubenswrapper[4919]: I0310 22:12:39.195525 4919 scope.go:117] "RemoveContainer" containerID="a1c98cb48521f20303a44e0f87321846803fe8c851e0a4d4b5f139c570f1c0b6" Mar 10 22:12:39 crc kubenswrapper[4919]: I0310 22:12:39.489723 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0851c6f-1817-40b4-a9b1-b0d3b95e9b5b" path="/var/lib/kubelet/pods/d0851c6f-1817-40b4-a9b1-b0d3b95e9b5b/volumes" Mar 10 22:12:39 crc kubenswrapper[4919]: I0310 22:12:39.766062 4919 scope.go:117] "RemoveContainer" containerID="9ff133bacb3b951d06d4a258f0292d4d3d314a4d3d137bbfe2ab96bb435b84c7" Mar 10 22:12:40 crc kubenswrapper[4919]: E0310 
22:12:40.286698 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ceilometer-0" podUID="a35812c5-ffc7-4307-ab31-390c9ee39262" Mar 10 22:12:40 crc kubenswrapper[4919]: I0310 22:12:40.723315 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a35812c5-ffc7-4307-ab31-390c9ee39262","Type":"ContainerStarted","Data":"1904f9af5bbea6c9fd56dafb57d72bab54c8655659d75fdeac0f7be7f92205df"} Mar 10 22:12:40 crc kubenswrapper[4919]: I0310 22:12:40.723778 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 10 22:12:40 crc kubenswrapper[4919]: I0310 22:12:40.723454 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a35812c5-ffc7-4307-ab31-390c9ee39262" containerName="ceilometer-notification-agent" containerID="cri-o://fcb531d663ebad33754a1845e9e59fef061180c2f9bcd7c60cf8aaa1dcde51fc" gracePeriod=30 Mar 10 22:12:40 crc kubenswrapper[4919]: I0310 22:12:40.723513 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a35812c5-ffc7-4307-ab31-390c9ee39262" containerName="proxy-httpd" containerID="cri-o://1904f9af5bbea6c9fd56dafb57d72bab54c8655659d75fdeac0f7be7f92205df" gracePeriod=30 Mar 10 22:12:40 crc kubenswrapper[4919]: I0310 22:12:40.723529 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a35812c5-ffc7-4307-ab31-390c9ee39262" containerName="sg-core" containerID="cri-o://145a769aad9313c4f561fc922e478b28bdd5bfb8840087519d8e93fadeb600ec" gracePeriod=30 Mar 10 22:12:40 crc kubenswrapper[4919]: I0310 22:12:40.726674 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-84fbf8d4df-qnkcp" 
event={"ID":"8c3d8de2-078f-4fb9-a9b7-9c8b43fb5758","Type":"ContainerStarted","Data":"d101e9f0fbfd844fd8b73dd71104ea0c4e1e50657946aabedfb17e88b7ff5ebd"} Mar 10 22:12:40 crc kubenswrapper[4919]: I0310 22:12:40.726722 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-84fbf8d4df-qnkcp" event={"ID":"8c3d8de2-078f-4fb9-a9b7-9c8b43fb5758","Type":"ContainerStarted","Data":"4a9f702eb992a48dcfe542ee169fd0605840cc5c16a3c5a904798af649730bd9"} Mar 10 22:12:40 crc kubenswrapper[4919]: I0310 22:12:40.735333 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7d5f6c4bb6-46xtp" event={"ID":"98b65b94-9415-4a0b-acb7-760a536d250a","Type":"ContainerStarted","Data":"699650532c86c22ecd43cf7cdf8a1bcd6d10a6251704d029dbd5b36c62d55be0"} Mar 10 22:12:40 crc kubenswrapper[4919]: I0310 22:12:40.735403 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7d5f6c4bb6-46xtp" event={"ID":"98b65b94-9415-4a0b-acb7-760a536d250a","Type":"ContainerStarted","Data":"2451a9341a4cf0ceece5b1ab52b5749f44f7a15c49e2383672cdc1d20d6ac13a"} Mar 10 22:12:40 crc kubenswrapper[4919]: I0310 22:12:40.737552 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-7xmvf" event={"ID":"0376622a-15ed-42d8-98b9-ffa1138134ee","Type":"ContainerStarted","Data":"db27c753dfd8df1b990a2d95eeec2891aa5193e1c72a29e062a7115cc6c131ca"} Mar 10 22:12:40 crc kubenswrapper[4919]: I0310 22:12:40.752599 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-68449cb44c-wmmzf" event={"ID":"7c9bf9de-7c1a-436e-a8a0-6c987b34a5e0","Type":"ContainerStarted","Data":"46d7c27293d48d5ebbae63a3af00729e46f409eb8eff14da8312c3010d4b3df5"} Mar 10 22:12:40 crc kubenswrapper[4919]: I0310 22:12:40.752639 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-68449cb44c-wmmzf" 
event={"ID":"7c9bf9de-7c1a-436e-a8a0-6c987b34a5e0","Type":"ContainerStarted","Data":"3a14995c1656c2836f0b87135f426135188e582e6956faccde8a4adfc21e5fcd"} Mar 10 22:12:40 crc kubenswrapper[4919]: I0310 22:12:40.754940 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-fd8f54c58-gtj5m" event={"ID":"31690f34-6b68-4470-a13e-e16121ec25d2","Type":"ContainerStarted","Data":"800bb678ea7e53aea235c1816dd3b24e1d5cc3ca3910d7d45b290926f9b56fcd"} Mar 10 22:12:40 crc kubenswrapper[4919]: I0310 22:12:40.754962 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-fd8f54c58-gtj5m" event={"ID":"31690f34-6b68-4470-a13e-e16121ec25d2","Type":"ContainerStarted","Data":"94166ce461bf0d2113bc5e17cabdb4512063e9d512e978c63a5719892a6a8251"} Mar 10 22:12:40 crc kubenswrapper[4919]: I0310 22:12:40.769126 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-7d5f6c4bb6-46xtp" podStartSLOduration=2.750071608 podStartE2EDuration="19.769106168s" podCreationTimestamp="2026-03-10 22:12:21 +0000 UTC" firstStartedPulling="2026-03-10 22:12:22.722185379 +0000 UTC m=+1329.964065987" lastFinishedPulling="2026-03-10 22:12:39.741219919 +0000 UTC m=+1346.983100547" observedRunningTime="2026-03-10 22:12:40.768130681 +0000 UTC m=+1348.010011289" watchObservedRunningTime="2026-03-10 22:12:40.769106168 +0000 UTC m=+1348.010986776" Mar 10 22:12:40 crc kubenswrapper[4919]: I0310 22:12:40.789348 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-7xmvf" podStartSLOduration=2.916678375 podStartE2EDuration="53.789334006s" podCreationTimestamp="2026-03-10 22:11:47 +0000 UTC" firstStartedPulling="2026-03-10 22:11:48.868565618 +0000 UTC m=+1296.110446236" lastFinishedPulling="2026-03-10 22:12:39.741221249 +0000 UTC m=+1346.983101867" observedRunningTime="2026-03-10 22:12:40.787663331 +0000 UTC m=+1348.029543939" 
watchObservedRunningTime="2026-03-10 22:12:40.789334006 +0000 UTC m=+1348.031214614"
Mar 10 22:12:40 crc kubenswrapper[4919]: I0310 22:12:40.806001 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-84fbf8d4df-qnkcp" podStartSLOduration=2.826748067 podStartE2EDuration="19.805981797s" podCreationTimestamp="2026-03-10 22:12:21 +0000 UTC" firstStartedPulling="2026-03-10 22:12:22.913861283 +0000 UTC m=+1330.155741891" lastFinishedPulling="2026-03-10 22:12:39.893095013 +0000 UTC m=+1347.134975621" observedRunningTime="2026-03-10 22:12:40.804848257 +0000 UTC m=+1348.046728865" watchObservedRunningTime="2026-03-10 22:12:40.805981797 +0000 UTC m=+1348.047862405"
Mar 10 22:12:40 crc kubenswrapper[4919]: I0310 22:12:40.824667 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-68449cb44c-wmmzf" podStartSLOduration=2.810866823 podStartE2EDuration="17.824652323s" podCreationTimestamp="2026-03-10 22:12:23 +0000 UTC" firstStartedPulling="2026-03-10 22:12:24.752380285 +0000 UTC m=+1331.994260883" lastFinishedPulling="2026-03-10 22:12:39.766165765 +0000 UTC m=+1347.008046383" observedRunningTime="2026-03-10 22:12:40.821774916 +0000 UTC m=+1348.063655514" watchObservedRunningTime="2026-03-10 22:12:40.824652323 +0000 UTC m=+1348.066532931"
Mar 10 22:12:40 crc kubenswrapper[4919]: I0310 22:12:40.848923 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-fd8f54c58-gtj5m" podStartSLOduration=2.74176429 podStartE2EDuration="17.84890442s" podCreationTimestamp="2026-03-10 22:12:23 +0000 UTC" firstStartedPulling="2026-03-10 22:12:24.752641652 +0000 UTC m=+1331.994522260" lastFinishedPulling="2026-03-10 22:12:39.859781772 +0000 UTC m=+1347.101662390" observedRunningTime="2026-03-10 22:12:40.841855239 +0000 UTC m=+1348.083735867" watchObservedRunningTime="2026-03-10 22:12:40.84890442 +0000 UTC m=+1348.090785028"
Mar 10 22:12:40 crc kubenswrapper[4919]: I0310 22:12:40.859692 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-84fbf8d4df-qnkcp"]
Mar 10 22:12:40 crc kubenswrapper[4919]: I0310 22:12:40.881932 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-7d5f6c4bb6-46xtp"]
Mar 10 22:12:41 crc kubenswrapper[4919]: I0310 22:12:41.158303 4919 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-f7b6c97c6-bwscr" podUID="9fae91e8-30c8-4b1d-a243-c2cc58100766" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.166:9311/healthcheck\": read tcp 10.217.0.2:51484->10.217.0.166:9311: read: connection reset by peer"
Mar 10 22:12:41 crc kubenswrapper[4919]: I0310 22:12:41.158345 4919 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-f7b6c97c6-bwscr" podUID="9fae91e8-30c8-4b1d-a243-c2cc58100766" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.166:9311/healthcheck\": read tcp 10.217.0.2:51486->10.217.0.166:9311: read: connection reset by peer"
Mar 10 22:12:41 crc kubenswrapper[4919]: E0310 22:12:41.205972 4919 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda35812c5_ffc7_4307_ab31_390c9ee39262.slice/crio-conmon-fcb531d663ebad33754a1845e9e59fef061180c2f9bcd7c60cf8aaa1dcde51fc.scope\": RecentStats: unable to find data in memory cache]"
Mar 10 22:12:41 crc kubenswrapper[4919]: I0310 22:12:41.769349 4919 generic.go:334] "Generic (PLEG): container finished" podID="9fae91e8-30c8-4b1d-a243-c2cc58100766" containerID="aebd460c1524d22d0ce665b0b841469a1c52d0bdede4cadc1c9e4e9f42ec6354" exitCode=0
Mar 10 22:12:41 crc kubenswrapper[4919]: I0310 22:12:41.769422 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-f7b6c97c6-bwscr" event={"ID":"9fae91e8-30c8-4b1d-a243-c2cc58100766","Type":"ContainerDied","Data":"aebd460c1524d22d0ce665b0b841469a1c52d0bdede4cadc1c9e4e9f42ec6354"}
Mar 10 22:12:41 crc kubenswrapper[4919]: I0310 22:12:41.769737 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-f7b6c97c6-bwscr" event={"ID":"9fae91e8-30c8-4b1d-a243-c2cc58100766","Type":"ContainerDied","Data":"6a1a058aa8226641521e2d52d97591183471aeebed180d6054b2206eb2cae43d"}
Mar 10 22:12:41 crc kubenswrapper[4919]: I0310 22:12:41.769752 4919 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6a1a058aa8226641521e2d52d97591183471aeebed180d6054b2206eb2cae43d"
Mar 10 22:12:41 crc kubenswrapper[4919]: I0310 22:12:41.773806 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 10 22:12:41 crc kubenswrapper[4919]: I0310 22:12:41.776297 4919 generic.go:334] "Generic (PLEG): container finished" podID="a35812c5-ffc7-4307-ab31-390c9ee39262" containerID="1904f9af5bbea6c9fd56dafb57d72bab54c8655659d75fdeac0f7be7f92205df" exitCode=0
Mar 10 22:12:41 crc kubenswrapper[4919]: I0310 22:12:41.776326 4919 generic.go:334] "Generic (PLEG): container finished" podID="a35812c5-ffc7-4307-ab31-390c9ee39262" containerID="145a769aad9313c4f561fc922e478b28bdd5bfb8840087519d8e93fadeb600ec" exitCode=2
Mar 10 22:12:41 crc kubenswrapper[4919]: I0310 22:12:41.776353 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a35812c5-ffc7-4307-ab31-390c9ee39262","Type":"ContainerDied","Data":"1904f9af5bbea6c9fd56dafb57d72bab54c8655659d75fdeac0f7be7f92205df"}
Mar 10 22:12:41 crc kubenswrapper[4919]: I0310 22:12:41.776384 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a35812c5-ffc7-4307-ab31-390c9ee39262","Type":"ContainerDied","Data":"145a769aad9313c4f561fc922e478b28bdd5bfb8840087519d8e93fadeb600ec"}
Mar 10 22:12:41 crc kubenswrapper[4919]: I0310 22:12:41.776419 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a35812c5-ffc7-4307-ab31-390c9ee39262","Type":"ContainerDied","Data":"fcb531d663ebad33754a1845e9e59fef061180c2f9bcd7c60cf8aaa1dcde51fc"}
Mar 10 22:12:41 crc kubenswrapper[4919]: I0310 22:12:41.776434 4919 scope.go:117] "RemoveContainer" containerID="1904f9af5bbea6c9fd56dafb57d72bab54c8655659d75fdeac0f7be7f92205df"
Mar 10 22:12:41 crc kubenswrapper[4919]: I0310 22:12:41.776364 4919 generic.go:334] "Generic (PLEG): container finished" podID="a35812c5-ffc7-4307-ab31-390c9ee39262" containerID="fcb531d663ebad33754a1845e9e59fef061180c2f9bcd7c60cf8aaa1dcde51fc" exitCode=0
Mar 10 22:12:41 crc kubenswrapper[4919]: I0310 22:12:41.776540 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a35812c5-ffc7-4307-ab31-390c9ee39262","Type":"ContainerDied","Data":"24b773789dcd5611dbb3954ea8be750ba760bcc5a05ebaf48d75bd37a179eb15"}
Mar 10 22:12:41 crc kubenswrapper[4919]: I0310 22:12:41.793298 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-f7b6c97c6-bwscr"
Mar 10 22:12:41 crc kubenswrapper[4919]: I0310 22:12:41.815557 4919 scope.go:117] "RemoveContainer" containerID="145a769aad9313c4f561fc922e478b28bdd5bfb8840087519d8e93fadeb600ec"
Mar 10 22:12:41 crc kubenswrapper[4919]: I0310 22:12:41.867581 4919 scope.go:117] "RemoveContainer" containerID="fcb531d663ebad33754a1845e9e59fef061180c2f9bcd7c60cf8aaa1dcde51fc"
Mar 10 22:12:41 crc kubenswrapper[4919]: I0310 22:12:41.897001 4919 scope.go:117] "RemoveContainer" containerID="1904f9af5bbea6c9fd56dafb57d72bab54c8655659d75fdeac0f7be7f92205df"
Mar 10 22:12:41 crc kubenswrapper[4919]: E0310 22:12:41.897359 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1904f9af5bbea6c9fd56dafb57d72bab54c8655659d75fdeac0f7be7f92205df\": container with ID starting with 1904f9af5bbea6c9fd56dafb57d72bab54c8655659d75fdeac0f7be7f92205df not found: ID does not exist" containerID="1904f9af5bbea6c9fd56dafb57d72bab54c8655659d75fdeac0f7be7f92205df"
Mar 10 22:12:41 crc kubenswrapper[4919]: I0310 22:12:41.897405 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1904f9af5bbea6c9fd56dafb57d72bab54c8655659d75fdeac0f7be7f92205df"} err="failed to get container status \"1904f9af5bbea6c9fd56dafb57d72bab54c8655659d75fdeac0f7be7f92205df\": rpc error: code = NotFound desc = could not find container \"1904f9af5bbea6c9fd56dafb57d72bab54c8655659d75fdeac0f7be7f92205df\": container with ID starting with 1904f9af5bbea6c9fd56dafb57d72bab54c8655659d75fdeac0f7be7f92205df not found: ID does not exist"
Mar 10 22:12:41 crc kubenswrapper[4919]: I0310 22:12:41.897426 4919 scope.go:117] "RemoveContainer" containerID="145a769aad9313c4f561fc922e478b28bdd5bfb8840087519d8e93fadeb600ec"
Mar 10 22:12:41 crc kubenswrapper[4919]: E0310 22:12:41.897622 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"145a769aad9313c4f561fc922e478b28bdd5bfb8840087519d8e93fadeb600ec\": container with ID starting with 145a769aad9313c4f561fc922e478b28bdd5bfb8840087519d8e93fadeb600ec not found: ID does not exist" containerID="145a769aad9313c4f561fc922e478b28bdd5bfb8840087519d8e93fadeb600ec"
Mar 10 22:12:41 crc kubenswrapper[4919]: I0310 22:12:41.897699 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"145a769aad9313c4f561fc922e478b28bdd5bfb8840087519d8e93fadeb600ec"} err="failed to get container status \"145a769aad9313c4f561fc922e478b28bdd5bfb8840087519d8e93fadeb600ec\": rpc error: code = NotFound desc = could not find container \"145a769aad9313c4f561fc922e478b28bdd5bfb8840087519d8e93fadeb600ec\": container with ID starting with 145a769aad9313c4f561fc922e478b28bdd5bfb8840087519d8e93fadeb600ec not found: ID does not exist"
Mar 10 22:12:41 crc kubenswrapper[4919]: I0310 22:12:41.897713 4919 scope.go:117] "RemoveContainer" containerID="fcb531d663ebad33754a1845e9e59fef061180c2f9bcd7c60cf8aaa1dcde51fc"
Mar 10 22:12:41 crc kubenswrapper[4919]: E0310 22:12:41.897877 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fcb531d663ebad33754a1845e9e59fef061180c2f9bcd7c60cf8aaa1dcde51fc\": container with ID starting with fcb531d663ebad33754a1845e9e59fef061180c2f9bcd7c60cf8aaa1dcde51fc not found: ID does not exist" containerID="fcb531d663ebad33754a1845e9e59fef061180c2f9bcd7c60cf8aaa1dcde51fc"
Mar 10 22:12:41 crc kubenswrapper[4919]: I0310 22:12:41.897915 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fcb531d663ebad33754a1845e9e59fef061180c2f9bcd7c60cf8aaa1dcde51fc"} err="failed to get container status \"fcb531d663ebad33754a1845e9e59fef061180c2f9bcd7c60cf8aaa1dcde51fc\": rpc error: code = NotFound desc = could not find container \"fcb531d663ebad33754a1845e9e59fef061180c2f9bcd7c60cf8aaa1dcde51fc\": container with ID starting with fcb531d663ebad33754a1845e9e59fef061180c2f9bcd7c60cf8aaa1dcde51fc not found: ID does not exist"
Mar 10 22:12:41 crc kubenswrapper[4919]: I0310 22:12:41.897927 4919 scope.go:117] "RemoveContainer" containerID="1904f9af5bbea6c9fd56dafb57d72bab54c8655659d75fdeac0f7be7f92205df"
Mar 10 22:12:41 crc kubenswrapper[4919]: I0310 22:12:41.898075 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1904f9af5bbea6c9fd56dafb57d72bab54c8655659d75fdeac0f7be7f92205df"} err="failed to get container status \"1904f9af5bbea6c9fd56dafb57d72bab54c8655659d75fdeac0f7be7f92205df\": rpc error: code = NotFound desc = could not find container \"1904f9af5bbea6c9fd56dafb57d72bab54c8655659d75fdeac0f7be7f92205df\": container with ID starting with 1904f9af5bbea6c9fd56dafb57d72bab54c8655659d75fdeac0f7be7f92205df not found: ID does not exist"
Mar 10 22:12:41 crc kubenswrapper[4919]: I0310 22:12:41.898096 4919 scope.go:117] "RemoveContainer" containerID="145a769aad9313c4f561fc922e478b28bdd5bfb8840087519d8e93fadeb600ec"
Mar 10 22:12:41 crc kubenswrapper[4919]: I0310 22:12:41.898238 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"145a769aad9313c4f561fc922e478b28bdd5bfb8840087519d8e93fadeb600ec"} err="failed to get container status \"145a769aad9313c4f561fc922e478b28bdd5bfb8840087519d8e93fadeb600ec\": rpc error: code = NotFound desc = could not find container \"145a769aad9313c4f561fc922e478b28bdd5bfb8840087519d8e93fadeb600ec\": container with ID starting with 145a769aad9313c4f561fc922e478b28bdd5bfb8840087519d8e93fadeb600ec not found: ID does not exist"
Mar 10 22:12:41 crc kubenswrapper[4919]: I0310 22:12:41.898258 4919 scope.go:117] "RemoveContainer" containerID="fcb531d663ebad33754a1845e9e59fef061180c2f9bcd7c60cf8aaa1dcde51fc"
Mar 10 22:12:41 crc kubenswrapper[4919]: I0310 22:12:41.898420 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fcb531d663ebad33754a1845e9e59fef061180c2f9bcd7c60cf8aaa1dcde51fc"} err="failed to get container status \"fcb531d663ebad33754a1845e9e59fef061180c2f9bcd7c60cf8aaa1dcde51fc\": rpc error: code = NotFound desc = could not find container \"fcb531d663ebad33754a1845e9e59fef061180c2f9bcd7c60cf8aaa1dcde51fc\": container with ID starting with fcb531d663ebad33754a1845e9e59fef061180c2f9bcd7c60cf8aaa1dcde51fc not found: ID does not exist"
Mar 10 22:12:41 crc kubenswrapper[4919]: I0310 22:12:41.898440 4919 scope.go:117] "RemoveContainer" containerID="1904f9af5bbea6c9fd56dafb57d72bab54c8655659d75fdeac0f7be7f92205df"
Mar 10 22:12:41 crc kubenswrapper[4919]: I0310 22:12:41.898581 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1904f9af5bbea6c9fd56dafb57d72bab54c8655659d75fdeac0f7be7f92205df"} err="failed to get container status \"1904f9af5bbea6c9fd56dafb57d72bab54c8655659d75fdeac0f7be7f92205df\": rpc error: code = NotFound desc = could not find container \"1904f9af5bbea6c9fd56dafb57d72bab54c8655659d75fdeac0f7be7f92205df\": container with ID starting with 1904f9af5bbea6c9fd56dafb57d72bab54c8655659d75fdeac0f7be7f92205df not found: ID does not exist"
Mar 10 22:12:41 crc kubenswrapper[4919]: I0310 22:12:41.898599 4919 scope.go:117] "RemoveContainer" containerID="145a769aad9313c4f561fc922e478b28bdd5bfb8840087519d8e93fadeb600ec"
Mar 10 22:12:41 crc kubenswrapper[4919]: I0310 22:12:41.898739 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"145a769aad9313c4f561fc922e478b28bdd5bfb8840087519d8e93fadeb600ec"} err="failed to get container status \"145a769aad9313c4f561fc922e478b28bdd5bfb8840087519d8e93fadeb600ec\": rpc error: code = NotFound desc = could not find container \"145a769aad9313c4f561fc922e478b28bdd5bfb8840087519d8e93fadeb600ec\": container with ID starting with 145a769aad9313c4f561fc922e478b28bdd5bfb8840087519d8e93fadeb600ec not found: ID does not exist"
Mar 10 22:12:41 crc kubenswrapper[4919]: I0310 22:12:41.898759 4919 scope.go:117] "RemoveContainer" containerID="fcb531d663ebad33754a1845e9e59fef061180c2f9bcd7c60cf8aaa1dcde51fc"
Mar 10 22:12:41 crc kubenswrapper[4919]: I0310 22:12:41.898899 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fcb531d663ebad33754a1845e9e59fef061180c2f9bcd7c60cf8aaa1dcde51fc"} err="failed to get container status \"fcb531d663ebad33754a1845e9e59fef061180c2f9bcd7c60cf8aaa1dcde51fc\": rpc error: code = NotFound desc = could not find container \"fcb531d663ebad33754a1845e9e59fef061180c2f9bcd7c60cf8aaa1dcde51fc\": container with ID starting with fcb531d663ebad33754a1845e9e59fef061180c2f9bcd7c60cf8aaa1dcde51fc not found: ID does not exist"
Mar 10 22:12:41 crc kubenswrapper[4919]: I0310 22:12:41.941494 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9fae91e8-30c8-4b1d-a243-c2cc58100766-logs\") pod \"9fae91e8-30c8-4b1d-a243-c2cc58100766\" (UID: \"9fae91e8-30c8-4b1d-a243-c2cc58100766\") "
Mar 10 22:12:41 crc kubenswrapper[4919]: I0310 22:12:41.941551 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a35812c5-ffc7-4307-ab31-390c9ee39262-combined-ca-bundle\") pod \"a35812c5-ffc7-4307-ab31-390c9ee39262\" (UID: \"a35812c5-ffc7-4307-ab31-390c9ee39262\") "
Mar 10 22:12:41 crc kubenswrapper[4919]: I0310 22:12:41.941630 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a35812c5-ffc7-4307-ab31-390c9ee39262-log-httpd\") pod \"a35812c5-ffc7-4307-ab31-390c9ee39262\" (UID: \"a35812c5-ffc7-4307-ab31-390c9ee39262\") "
Mar 10 22:12:41 crc kubenswrapper[4919]: I0310 22:12:41.941665 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a35812c5-ffc7-4307-ab31-390c9ee39262-scripts\") pod \"a35812c5-ffc7-4307-ab31-390c9ee39262\" (UID: \"a35812c5-ffc7-4307-ab31-390c9ee39262\") "
Mar 10 22:12:41 crc kubenswrapper[4919]: I0310 22:12:41.941717 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4h9lm\" (UniqueName: \"kubernetes.io/projected/a35812c5-ffc7-4307-ab31-390c9ee39262-kube-api-access-4h9lm\") pod \"a35812c5-ffc7-4307-ab31-390c9ee39262\" (UID: \"a35812c5-ffc7-4307-ab31-390c9ee39262\") "
Mar 10 22:12:41 crc kubenswrapper[4919]: I0310 22:12:41.941738 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9fae91e8-30c8-4b1d-a243-c2cc58100766-config-data\") pod \"9fae91e8-30c8-4b1d-a243-c2cc58100766\" (UID: \"9fae91e8-30c8-4b1d-a243-c2cc58100766\") "
Mar 10 22:12:41 crc kubenswrapper[4919]: I0310 22:12:41.941751 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9fae91e8-30c8-4b1d-a243-c2cc58100766-combined-ca-bundle\") pod \"9fae91e8-30c8-4b1d-a243-c2cc58100766\" (UID: \"9fae91e8-30c8-4b1d-a243-c2cc58100766\") "
Mar 10 22:12:41 crc kubenswrapper[4919]: I0310 22:12:41.941806 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9fae91e8-30c8-4b1d-a243-c2cc58100766-config-data-custom\") pod \"9fae91e8-30c8-4b1d-a243-c2cc58100766\" (UID: \"9fae91e8-30c8-4b1d-a243-c2cc58100766\") "
Mar 10 22:12:41 crc kubenswrapper[4919]: I0310 22:12:41.941851 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a35812c5-ffc7-4307-ab31-390c9ee39262-config-data\") pod \"a35812c5-ffc7-4307-ab31-390c9ee39262\" (UID: \"a35812c5-ffc7-4307-ab31-390c9ee39262\") "
Mar 10 22:12:41 crc kubenswrapper[4919]: I0310 22:12:41.941903 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q24nr\" (UniqueName: \"kubernetes.io/projected/9fae91e8-30c8-4b1d-a243-c2cc58100766-kube-api-access-q24nr\") pod \"9fae91e8-30c8-4b1d-a243-c2cc58100766\" (UID: \"9fae91e8-30c8-4b1d-a243-c2cc58100766\") "
Mar 10 22:12:41 crc kubenswrapper[4919]: I0310 22:12:41.941941 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a35812c5-ffc7-4307-ab31-390c9ee39262-sg-core-conf-yaml\") pod \"a35812c5-ffc7-4307-ab31-390c9ee39262\" (UID: \"a35812c5-ffc7-4307-ab31-390c9ee39262\") "
Mar 10 22:12:41 crc kubenswrapper[4919]: I0310 22:12:41.941971 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a35812c5-ffc7-4307-ab31-390c9ee39262-run-httpd\") pod \"a35812c5-ffc7-4307-ab31-390c9ee39262\" (UID: \"a35812c5-ffc7-4307-ab31-390c9ee39262\") "
Mar 10 22:12:41 crc kubenswrapper[4919]: I0310 22:12:41.943039 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9fae91e8-30c8-4b1d-a243-c2cc58100766-logs" (OuterVolumeSpecName: "logs") pod "9fae91e8-30c8-4b1d-a243-c2cc58100766" (UID: "9fae91e8-30c8-4b1d-a243-c2cc58100766"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 22:12:41 crc kubenswrapper[4919]: I0310 22:12:41.943113 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a35812c5-ffc7-4307-ab31-390c9ee39262-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "a35812c5-ffc7-4307-ab31-390c9ee39262" (UID: "a35812c5-ffc7-4307-ab31-390c9ee39262"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 22:12:41 crc kubenswrapper[4919]: I0310 22:12:41.944708 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a35812c5-ffc7-4307-ab31-390c9ee39262-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "a35812c5-ffc7-4307-ab31-390c9ee39262" (UID: "a35812c5-ffc7-4307-ab31-390c9ee39262"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 22:12:41 crc kubenswrapper[4919]: I0310 22:12:41.952055 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a35812c5-ffc7-4307-ab31-390c9ee39262-scripts" (OuterVolumeSpecName: "scripts") pod "a35812c5-ffc7-4307-ab31-390c9ee39262" (UID: "a35812c5-ffc7-4307-ab31-390c9ee39262"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 22:12:41 crc kubenswrapper[4919]: I0310 22:12:41.953650 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a35812c5-ffc7-4307-ab31-390c9ee39262-kube-api-access-4h9lm" (OuterVolumeSpecName: "kube-api-access-4h9lm") pod "a35812c5-ffc7-4307-ab31-390c9ee39262" (UID: "a35812c5-ffc7-4307-ab31-390c9ee39262"). InnerVolumeSpecName "kube-api-access-4h9lm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 22:12:41 crc kubenswrapper[4919]: I0310 22:12:41.963563 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9fae91e8-30c8-4b1d-a243-c2cc58100766-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "9fae91e8-30c8-4b1d-a243-c2cc58100766" (UID: "9fae91e8-30c8-4b1d-a243-c2cc58100766"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 22:12:41 crc kubenswrapper[4919]: I0310 22:12:41.965556 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9fae91e8-30c8-4b1d-a243-c2cc58100766-kube-api-access-q24nr" (OuterVolumeSpecName: "kube-api-access-q24nr") pod "9fae91e8-30c8-4b1d-a243-c2cc58100766" (UID: "9fae91e8-30c8-4b1d-a243-c2cc58100766"). InnerVolumeSpecName "kube-api-access-q24nr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 22:12:41 crc kubenswrapper[4919]: I0310 22:12:41.991024 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a35812c5-ffc7-4307-ab31-390c9ee39262-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "a35812c5-ffc7-4307-ab31-390c9ee39262" (UID: "a35812c5-ffc7-4307-ab31-390c9ee39262"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 22:12:42 crc kubenswrapper[4919]: I0310 22:12:42.017825 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9fae91e8-30c8-4b1d-a243-c2cc58100766-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9fae91e8-30c8-4b1d-a243-c2cc58100766" (UID: "9fae91e8-30c8-4b1d-a243-c2cc58100766"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 22:12:42 crc kubenswrapper[4919]: I0310 22:12:42.031520 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a35812c5-ffc7-4307-ab31-390c9ee39262-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a35812c5-ffc7-4307-ab31-390c9ee39262" (UID: "a35812c5-ffc7-4307-ab31-390c9ee39262"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 22:12:42 crc kubenswrapper[4919]: I0310 22:12:42.038171 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9fae91e8-30c8-4b1d-a243-c2cc58100766-config-data" (OuterVolumeSpecName: "config-data") pod "9fae91e8-30c8-4b1d-a243-c2cc58100766" (UID: "9fae91e8-30c8-4b1d-a243-c2cc58100766"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 22:12:42 crc kubenswrapper[4919]: I0310 22:12:42.044967 4919 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9fae91e8-30c8-4b1d-a243-c2cc58100766-config-data-custom\") on node \"crc\" DevicePath \"\""
Mar 10 22:12:42 crc kubenswrapper[4919]: I0310 22:12:42.044997 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q24nr\" (UniqueName: \"kubernetes.io/projected/9fae91e8-30c8-4b1d-a243-c2cc58100766-kube-api-access-q24nr\") on node \"crc\" DevicePath \"\""
Mar 10 22:12:42 crc kubenswrapper[4919]: I0310 22:12:42.045043 4919 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a35812c5-ffc7-4307-ab31-390c9ee39262-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Mar 10 22:12:42 crc kubenswrapper[4919]: I0310 22:12:42.045053 4919 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a35812c5-ffc7-4307-ab31-390c9ee39262-run-httpd\") on node \"crc\" DevicePath \"\""
Mar 10 22:12:42 crc kubenswrapper[4919]: I0310 22:12:42.045096 4919 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9fae91e8-30c8-4b1d-a243-c2cc58100766-logs\") on node \"crc\" DevicePath \"\""
Mar 10 22:12:42 crc kubenswrapper[4919]: I0310 22:12:42.045106 4919 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a35812c5-ffc7-4307-ab31-390c9ee39262-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 10 22:12:42 crc kubenswrapper[4919]: I0310 22:12:42.045115 4919 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a35812c5-ffc7-4307-ab31-390c9ee39262-log-httpd\") on node \"crc\" DevicePath \"\""
Mar 10 22:12:42 crc kubenswrapper[4919]: I0310 22:12:42.045124 4919 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a35812c5-ffc7-4307-ab31-390c9ee39262-scripts\") on node \"crc\" DevicePath \"\""
Mar 10 22:12:42 crc kubenswrapper[4919]: I0310 22:12:42.045133 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4h9lm\" (UniqueName: \"kubernetes.io/projected/a35812c5-ffc7-4307-ab31-390c9ee39262-kube-api-access-4h9lm\") on node \"crc\" DevicePath \"\""
Mar 10 22:12:42 crc kubenswrapper[4919]: I0310 22:12:42.045144 4919 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9fae91e8-30c8-4b1d-a243-c2cc58100766-config-data\") on node \"crc\" DevicePath \"\""
Mar 10 22:12:42 crc kubenswrapper[4919]: I0310 22:12:42.045152 4919 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9fae91e8-30c8-4b1d-a243-c2cc58100766-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 10 22:12:42 crc kubenswrapper[4919]: I0310 22:12:42.051343 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a35812c5-ffc7-4307-ab31-390c9ee39262-config-data" (OuterVolumeSpecName: "config-data") pod "a35812c5-ffc7-4307-ab31-390c9ee39262" (UID: "a35812c5-ffc7-4307-ab31-390c9ee39262"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 22:12:42 crc kubenswrapper[4919]: I0310 22:12:42.147600 4919 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a35812c5-ffc7-4307-ab31-390c9ee39262-config-data\") on node \"crc\" DevicePath \"\""
Mar 10 22:12:42 crc kubenswrapper[4919]: I0310 22:12:42.798790 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-f7b6c97c6-bwscr"
Mar 10 22:12:42 crc kubenswrapper[4919]: I0310 22:12:42.798792 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 10 22:12:42 crc kubenswrapper[4919]: I0310 22:12:42.798932 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-84fbf8d4df-qnkcp" podUID="8c3d8de2-078f-4fb9-a9b7-9c8b43fb5758" containerName="barbican-worker-log" containerID="cri-o://4a9f702eb992a48dcfe542ee169fd0605840cc5c16a3c5a904798af649730bd9" gracePeriod=30
Mar 10 22:12:42 crc kubenswrapper[4919]: I0310 22:12:42.799059 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-7d5f6c4bb6-46xtp" podUID="98b65b94-9415-4a0b-acb7-760a536d250a" containerName="barbican-keystone-listener-log" containerID="cri-o://2451a9341a4cf0ceece5b1ab52b5749f44f7a15c49e2383672cdc1d20d6ac13a" gracePeriod=30
Mar 10 22:12:42 crc kubenswrapper[4919]: I0310 22:12:42.799045 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-84fbf8d4df-qnkcp" podUID="8c3d8de2-078f-4fb9-a9b7-9c8b43fb5758" containerName="barbican-worker" containerID="cri-o://d101e9f0fbfd844fd8b73dd71104ea0c4e1e50657946aabedfb17e88b7ff5ebd" gracePeriod=30
Mar 10 22:12:42 crc kubenswrapper[4919]: I0310 22:12:42.800934 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-7d5f6c4bb6-46xtp" podUID="98b65b94-9415-4a0b-acb7-760a536d250a" containerName="barbican-keystone-listener" containerID="cri-o://699650532c86c22ecd43cf7cdf8a1bcd6d10a6251704d029dbd5b36c62d55be0" gracePeriod=30
Mar 10 22:12:42 crc kubenswrapper[4919]: I0310 22:12:42.884366 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 10 22:12:42 crc kubenswrapper[4919]: I0310 22:12:42.900092 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Mar 10 22:12:42 crc kubenswrapper[4919]: I0310 22:12:42.931296 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-f7b6c97c6-bwscr"]
Mar 10 22:12:42 crc kubenswrapper[4919]: I0310 22:12:42.958579 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-f7b6c97c6-bwscr"]
Mar 10 22:12:42 crc kubenswrapper[4919]: I0310 22:12:42.966522 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Mar 10 22:12:42 crc kubenswrapper[4919]: E0310 22:12:42.967059 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0851c6f-1817-40b4-a9b1-b0d3b95e9b5b" containerName="init"
Mar 10 22:12:42 crc kubenswrapper[4919]: I0310 22:12:42.967085 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0851c6f-1817-40b4-a9b1-b0d3b95e9b5b" containerName="init"
Mar 10 22:12:42 crc kubenswrapper[4919]: E0310 22:12:42.967111 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a35812c5-ffc7-4307-ab31-390c9ee39262" containerName="ceilometer-notification-agent"
Mar 10 22:12:42 crc kubenswrapper[4919]: I0310 22:12:42.967123 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="a35812c5-ffc7-4307-ab31-390c9ee39262" containerName="ceilometer-notification-agent"
Mar 10 22:12:42 crc kubenswrapper[4919]: E0310 22:12:42.967136 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9fae91e8-30c8-4b1d-a243-c2cc58100766" containerName="barbican-api-log"
Mar 10 22:12:42 crc kubenswrapper[4919]: I0310 22:12:42.967142 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fae91e8-30c8-4b1d-a243-c2cc58100766" containerName="barbican-api-log"
Mar 10 22:12:42 crc kubenswrapper[4919]: E0310 22:12:42.967153 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1372c543-234e-41a0-936c-d17cdf422557" containerName="barbican-api-log"
Mar 10 22:12:42 crc kubenswrapper[4919]: I0310 22:12:42.967159 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="1372c543-234e-41a0-936c-d17cdf422557" containerName="barbican-api-log"
Mar 10 22:12:42 crc kubenswrapper[4919]: E0310 22:12:42.967169 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9fae91e8-30c8-4b1d-a243-c2cc58100766" containerName="barbican-api"
Mar 10 22:12:42 crc kubenswrapper[4919]: I0310 22:12:42.967174 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fae91e8-30c8-4b1d-a243-c2cc58100766" containerName="barbican-api"
Mar 10 22:12:42 crc kubenswrapper[4919]: E0310 22:12:42.967184 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a35812c5-ffc7-4307-ab31-390c9ee39262" containerName="sg-core"
Mar 10 22:12:42 crc kubenswrapper[4919]: I0310 22:12:42.967190 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="a35812c5-ffc7-4307-ab31-390c9ee39262" containerName="sg-core"
Mar 10 22:12:42 crc kubenswrapper[4919]: E0310 22:12:42.967201 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1372c543-234e-41a0-936c-d17cdf422557" containerName="barbican-api"
Mar 10 22:12:42 crc kubenswrapper[4919]: I0310 22:12:42.967209 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="1372c543-234e-41a0-936c-d17cdf422557" containerName="barbican-api"
Mar 10 22:12:42 crc kubenswrapper[4919]: E0310 22:12:42.967227 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a35812c5-ffc7-4307-ab31-390c9ee39262" containerName="proxy-httpd"
Mar 10 22:12:42 crc kubenswrapper[4919]: I0310 22:12:42.967234 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="a35812c5-ffc7-4307-ab31-390c9ee39262" containerName="proxy-httpd"
Mar 10 22:12:42 crc kubenswrapper[4919]: E0310 22:12:42.967245 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0851c6f-1817-40b4-a9b1-b0d3b95e9b5b" containerName="dnsmasq-dns"
Mar 10 22:12:42 crc kubenswrapper[4919]: I0310 22:12:42.967252 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0851c6f-1817-40b4-a9b1-b0d3b95e9b5b" containerName="dnsmasq-dns"
Mar 10 22:12:42 crc kubenswrapper[4919]: I0310 22:12:42.967506 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0851c6f-1817-40b4-a9b1-b0d3b95e9b5b" containerName="dnsmasq-dns"
Mar 10 22:12:42 crc kubenswrapper[4919]: I0310 22:12:42.967521 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="a35812c5-ffc7-4307-ab31-390c9ee39262" containerName="sg-core"
Mar 10 22:12:42 crc kubenswrapper[4919]: I0310 22:12:42.967531 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="1372c543-234e-41a0-936c-d17cdf422557" containerName="barbican-api"
Mar 10 22:12:42 crc kubenswrapper[4919]: I0310 22:12:42.967546 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="1372c543-234e-41a0-936c-d17cdf422557" containerName="barbican-api-log"
Mar 10 22:12:42 crc kubenswrapper[4919]: I0310 22:12:42.967559 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="a35812c5-ffc7-4307-ab31-390c9ee39262" containerName="ceilometer-notification-agent"
Mar 10 22:12:42 crc kubenswrapper[4919]: I0310 22:12:42.967570 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="9fae91e8-30c8-4b1d-a243-c2cc58100766" containerName="barbican-api-log"
Mar 10 22:12:42 crc kubenswrapper[4919]: I0310 22:12:42.967579 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="9fae91e8-30c8-4b1d-a243-c2cc58100766" containerName="barbican-api"
Mar 10 22:12:42 crc kubenswrapper[4919]: I0310 22:12:42.967589 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="a35812c5-ffc7-4307-ab31-390c9ee39262" containerName="proxy-httpd"
Mar 10 22:12:42 crc kubenswrapper[4919]: I0310 22:12:42.969938 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 10 22:12:42 crc kubenswrapper[4919]: I0310 22:12:42.972304 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/481bd64a-9301-42fa-aa9a-7f6c378917df-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"481bd64a-9301-42fa-aa9a-7f6c378917df\") " pod="openstack/ceilometer-0"
Mar 10 22:12:42 crc kubenswrapper[4919]: I0310 22:12:42.973037 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/481bd64a-9301-42fa-aa9a-7f6c378917df-config-data\") pod \"ceilometer-0\" (UID: \"481bd64a-9301-42fa-aa9a-7f6c378917df\") " pod="openstack/ceilometer-0"
Mar 10 22:12:42 crc kubenswrapper[4919]: I0310 22:12:42.973279 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2vf7f\" (UniqueName: \"kubernetes.io/projected/481bd64a-9301-42fa-aa9a-7f6c378917df-kube-api-access-2vf7f\") pod \"ceilometer-0\" (UID: \"481bd64a-9301-42fa-aa9a-7f6c378917df\") " pod="openstack/ceilometer-0"
Mar 10 22:12:42 crc kubenswrapper[4919]: I0310 22:12:42.973155 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Mar 10 22:12:42 crc kubenswrapper[4919]: I0310 22:12:42.973203 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Mar 10 22:12:42 crc kubenswrapper[4919]: I0310 22:12:42.973731 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/481bd64a-9301-42fa-aa9a-7f6c378917df-scripts\") pod \"ceilometer-0\" (UID: \"481bd64a-9301-42fa-aa9a-7f6c378917df\") " pod="openstack/ceilometer-0"
Mar 10 22:12:42 crc kubenswrapper[4919]: I0310 22:12:42.973807 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/481bd64a-9301-42fa-aa9a-7f6c378917df-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"481bd64a-9301-42fa-aa9a-7f6c378917df\") " pod="openstack/ceilometer-0"
Mar 10 22:12:42 crc kubenswrapper[4919]: I0310 22:12:42.973873 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/481bd64a-9301-42fa-aa9a-7f6c378917df-run-httpd\") pod \"ceilometer-0\" (UID: \"481bd64a-9301-42fa-aa9a-7f6c378917df\") " pod="openstack/ceilometer-0"
Mar 10 22:12:42 crc kubenswrapper[4919]: I0310 22:12:42.973982 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/481bd64a-9301-42fa-aa9a-7f6c378917df-log-httpd\") pod \"ceilometer-0\" (UID: \"481bd64a-9301-42fa-aa9a-7f6c378917df\") " pod="openstack/ceilometer-0"
Mar 10 22:12:42 crc kubenswrapper[4919]: I0310 22:12:42.993672 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 10 22:12:43 crc kubenswrapper[4919]: I0310 22:12:43.074478 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/481bd64a-9301-42fa-aa9a-7f6c378917df-log-httpd\") pod \"ceilometer-0\" (UID: \"481bd64a-9301-42fa-aa9a-7f6c378917df\") " pod="openstack/ceilometer-0"
Mar 10 22:12:43 crc kubenswrapper[4919]: I0310 22:12:43.074531 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName:
\"kubernetes.io/secret/481bd64a-9301-42fa-aa9a-7f6c378917df-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"481bd64a-9301-42fa-aa9a-7f6c378917df\") " pod="openstack/ceilometer-0" Mar 10 22:12:43 crc kubenswrapper[4919]: I0310 22:12:43.074562 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/481bd64a-9301-42fa-aa9a-7f6c378917df-config-data\") pod \"ceilometer-0\" (UID: \"481bd64a-9301-42fa-aa9a-7f6c378917df\") " pod="openstack/ceilometer-0" Mar 10 22:12:43 crc kubenswrapper[4919]: I0310 22:12:43.074600 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2vf7f\" (UniqueName: \"kubernetes.io/projected/481bd64a-9301-42fa-aa9a-7f6c378917df-kube-api-access-2vf7f\") pod \"ceilometer-0\" (UID: \"481bd64a-9301-42fa-aa9a-7f6c378917df\") " pod="openstack/ceilometer-0" Mar 10 22:12:43 crc kubenswrapper[4919]: I0310 22:12:43.074956 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/481bd64a-9301-42fa-aa9a-7f6c378917df-log-httpd\") pod \"ceilometer-0\" (UID: \"481bd64a-9301-42fa-aa9a-7f6c378917df\") " pod="openstack/ceilometer-0" Mar 10 22:12:43 crc kubenswrapper[4919]: I0310 22:12:43.075513 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/481bd64a-9301-42fa-aa9a-7f6c378917df-scripts\") pod \"ceilometer-0\" (UID: \"481bd64a-9301-42fa-aa9a-7f6c378917df\") " pod="openstack/ceilometer-0" Mar 10 22:12:43 crc kubenswrapper[4919]: I0310 22:12:43.075546 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/481bd64a-9301-42fa-aa9a-7f6c378917df-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"481bd64a-9301-42fa-aa9a-7f6c378917df\") " pod="openstack/ceilometer-0" Mar 10 22:12:43 crc kubenswrapper[4919]: I0310 
22:12:43.075570 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/481bd64a-9301-42fa-aa9a-7f6c378917df-run-httpd\") pod \"ceilometer-0\" (UID: \"481bd64a-9301-42fa-aa9a-7f6c378917df\") " pod="openstack/ceilometer-0" Mar 10 22:12:43 crc kubenswrapper[4919]: I0310 22:12:43.075822 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/481bd64a-9301-42fa-aa9a-7f6c378917df-run-httpd\") pod \"ceilometer-0\" (UID: \"481bd64a-9301-42fa-aa9a-7f6c378917df\") " pod="openstack/ceilometer-0" Mar 10 22:12:43 crc kubenswrapper[4919]: I0310 22:12:43.080342 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/481bd64a-9301-42fa-aa9a-7f6c378917df-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"481bd64a-9301-42fa-aa9a-7f6c378917df\") " pod="openstack/ceilometer-0" Mar 10 22:12:43 crc kubenswrapper[4919]: I0310 22:12:43.080896 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/481bd64a-9301-42fa-aa9a-7f6c378917df-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"481bd64a-9301-42fa-aa9a-7f6c378917df\") " pod="openstack/ceilometer-0" Mar 10 22:12:43 crc kubenswrapper[4919]: I0310 22:12:43.081233 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/481bd64a-9301-42fa-aa9a-7f6c378917df-scripts\") pod \"ceilometer-0\" (UID: \"481bd64a-9301-42fa-aa9a-7f6c378917df\") " pod="openstack/ceilometer-0" Mar 10 22:12:43 crc kubenswrapper[4919]: I0310 22:12:43.084509 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/481bd64a-9301-42fa-aa9a-7f6c378917df-config-data\") pod \"ceilometer-0\" (UID: \"481bd64a-9301-42fa-aa9a-7f6c378917df\") " 
pod="openstack/ceilometer-0" Mar 10 22:12:43 crc kubenswrapper[4919]: I0310 22:12:43.093265 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2vf7f\" (UniqueName: \"kubernetes.io/projected/481bd64a-9301-42fa-aa9a-7f6c378917df-kube-api-access-2vf7f\") pod \"ceilometer-0\" (UID: \"481bd64a-9301-42fa-aa9a-7f6c378917df\") " pod="openstack/ceilometer-0" Mar 10 22:12:43 crc kubenswrapper[4919]: I0310 22:12:43.301710 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 10 22:12:43 crc kubenswrapper[4919]: I0310 22:12:43.469779 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-84fbf8d4df-qnkcp" Mar 10 22:12:43 crc kubenswrapper[4919]: I0310 22:12:43.481689 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wjdrz\" (UniqueName: \"kubernetes.io/projected/8c3d8de2-078f-4fb9-a9b7-9c8b43fb5758-kube-api-access-wjdrz\") pod \"8c3d8de2-078f-4fb9-a9b7-9c8b43fb5758\" (UID: \"8c3d8de2-078f-4fb9-a9b7-9c8b43fb5758\") " Mar 10 22:12:43 crc kubenswrapper[4919]: I0310 22:12:43.481798 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c3d8de2-078f-4fb9-a9b7-9c8b43fb5758-combined-ca-bundle\") pod \"8c3d8de2-078f-4fb9-a9b7-9c8b43fb5758\" (UID: \"8c3d8de2-078f-4fb9-a9b7-9c8b43fb5758\") " Mar 10 22:12:43 crc kubenswrapper[4919]: I0310 22:12:43.481905 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c3d8de2-078f-4fb9-a9b7-9c8b43fb5758-logs\") pod \"8c3d8de2-078f-4fb9-a9b7-9c8b43fb5758\" (UID: \"8c3d8de2-078f-4fb9-a9b7-9c8b43fb5758\") " Mar 10 22:12:43 crc kubenswrapper[4919]: I0310 22:12:43.482082 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/8c3d8de2-078f-4fb9-a9b7-9c8b43fb5758-config-data-custom\") pod \"8c3d8de2-078f-4fb9-a9b7-9c8b43fb5758\" (UID: \"8c3d8de2-078f-4fb9-a9b7-9c8b43fb5758\") " Mar 10 22:12:43 crc kubenswrapper[4919]: I0310 22:12:43.482252 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c3d8de2-078f-4fb9-a9b7-9c8b43fb5758-config-data\") pod \"8c3d8de2-078f-4fb9-a9b7-9c8b43fb5758\" (UID: \"8c3d8de2-078f-4fb9-a9b7-9c8b43fb5758\") " Mar 10 22:12:43 crc kubenswrapper[4919]: I0310 22:12:43.482487 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8c3d8de2-078f-4fb9-a9b7-9c8b43fb5758-logs" (OuterVolumeSpecName: "logs") pod "8c3d8de2-078f-4fb9-a9b7-9c8b43fb5758" (UID: "8c3d8de2-078f-4fb9-a9b7-9c8b43fb5758"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 22:12:43 crc kubenswrapper[4919]: I0310 22:12:43.482919 4919 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c3d8de2-078f-4fb9-a9b7-9c8b43fb5758-logs\") on node \"crc\" DevicePath \"\"" Mar 10 22:12:43 crc kubenswrapper[4919]: I0310 22:12:43.486971 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c3d8de2-078f-4fb9-a9b7-9c8b43fb5758-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "8c3d8de2-078f-4fb9-a9b7-9c8b43fb5758" (UID: "8c3d8de2-078f-4fb9-a9b7-9c8b43fb5758"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:12:43 crc kubenswrapper[4919]: I0310 22:12:43.487635 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c3d8de2-078f-4fb9-a9b7-9c8b43fb5758-kube-api-access-wjdrz" (OuterVolumeSpecName: "kube-api-access-wjdrz") pod "8c3d8de2-078f-4fb9-a9b7-9c8b43fb5758" (UID: "8c3d8de2-078f-4fb9-a9b7-9c8b43fb5758"). 
InnerVolumeSpecName "kube-api-access-wjdrz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 22:12:43 crc kubenswrapper[4919]: I0310 22:12:43.505430 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9fae91e8-30c8-4b1d-a243-c2cc58100766" path="/var/lib/kubelet/pods/9fae91e8-30c8-4b1d-a243-c2cc58100766/volumes" Mar 10 22:12:43 crc kubenswrapper[4919]: I0310 22:12:43.507149 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a35812c5-ffc7-4307-ab31-390c9ee39262" path="/var/lib/kubelet/pods/a35812c5-ffc7-4307-ab31-390c9ee39262/volumes" Mar 10 22:12:43 crc kubenswrapper[4919]: I0310 22:12:43.514940 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c3d8de2-078f-4fb9-a9b7-9c8b43fb5758-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8c3d8de2-078f-4fb9-a9b7-9c8b43fb5758" (UID: "8c3d8de2-078f-4fb9-a9b7-9c8b43fb5758"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:12:43 crc kubenswrapper[4919]: I0310 22:12:43.543621 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-7d5f6c4bb6-46xtp" Mar 10 22:12:43 crc kubenswrapper[4919]: I0310 22:12:43.550966 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c3d8de2-078f-4fb9-a9b7-9c8b43fb5758-config-data" (OuterVolumeSpecName: "config-data") pod "8c3d8de2-078f-4fb9-a9b7-9c8b43fb5758" (UID: "8c3d8de2-078f-4fb9-a9b7-9c8b43fb5758"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:12:43 crc kubenswrapper[4919]: I0310 22:12:43.584752 4919 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c3d8de2-078f-4fb9-a9b7-9c8b43fb5758-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 22:12:43 crc kubenswrapper[4919]: I0310 22:12:43.584784 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wjdrz\" (UniqueName: \"kubernetes.io/projected/8c3d8de2-078f-4fb9-a9b7-9c8b43fb5758-kube-api-access-wjdrz\") on node \"crc\" DevicePath \"\"" Mar 10 22:12:43 crc kubenswrapper[4919]: I0310 22:12:43.584795 4919 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c3d8de2-078f-4fb9-a9b7-9c8b43fb5758-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 22:12:43 crc kubenswrapper[4919]: I0310 22:12:43.584805 4919 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8c3d8de2-078f-4fb9-a9b7-9c8b43fb5758-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 10 22:12:43 crc kubenswrapper[4919]: I0310 22:12:43.628438 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-79589c5bbb-z9p5z" Mar 10 22:12:43 crc kubenswrapper[4919]: I0310 22:12:43.689853 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/98b65b94-9415-4a0b-acb7-760a536d250a-logs\") pod \"98b65b94-9415-4a0b-acb7-760a536d250a\" (UID: \"98b65b94-9415-4a0b-acb7-760a536d250a\") " Mar 10 22:12:43 crc kubenswrapper[4919]: I0310 22:12:43.689905 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qbwbz\" (UniqueName: \"kubernetes.io/projected/98b65b94-9415-4a0b-acb7-760a536d250a-kube-api-access-qbwbz\") pod \"98b65b94-9415-4a0b-acb7-760a536d250a\" (UID: 
\"98b65b94-9415-4a0b-acb7-760a536d250a\") " Mar 10 22:12:43 crc kubenswrapper[4919]: I0310 22:12:43.689951 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98b65b94-9415-4a0b-acb7-760a536d250a-combined-ca-bundle\") pod \"98b65b94-9415-4a0b-acb7-760a536d250a\" (UID: \"98b65b94-9415-4a0b-acb7-760a536d250a\") " Mar 10 22:12:43 crc kubenswrapper[4919]: I0310 22:12:43.690458 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/98b65b94-9415-4a0b-acb7-760a536d250a-config-data-custom\") pod \"98b65b94-9415-4a0b-acb7-760a536d250a\" (UID: \"98b65b94-9415-4a0b-acb7-760a536d250a\") " Mar 10 22:12:43 crc kubenswrapper[4919]: I0310 22:12:43.690503 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/98b65b94-9415-4a0b-acb7-760a536d250a-config-data\") pod \"98b65b94-9415-4a0b-acb7-760a536d250a\" (UID: \"98b65b94-9415-4a0b-acb7-760a536d250a\") " Mar 10 22:12:43 crc kubenswrapper[4919]: I0310 22:12:43.693333 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/98b65b94-9415-4a0b-acb7-760a536d250a-logs" (OuterVolumeSpecName: "logs") pod "98b65b94-9415-4a0b-acb7-760a536d250a" (UID: "98b65b94-9415-4a0b-acb7-760a536d250a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 22:12:43 crc kubenswrapper[4919]: I0310 22:12:43.695528 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98b65b94-9415-4a0b-acb7-760a536d250a-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "98b65b94-9415-4a0b-acb7-760a536d250a" (UID: "98b65b94-9415-4a0b-acb7-760a536d250a"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:12:43 crc kubenswrapper[4919]: I0310 22:12:43.698153 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98b65b94-9415-4a0b-acb7-760a536d250a-kube-api-access-qbwbz" (OuterVolumeSpecName: "kube-api-access-qbwbz") pod "98b65b94-9415-4a0b-acb7-760a536d250a" (UID: "98b65b94-9415-4a0b-acb7-760a536d250a"). InnerVolumeSpecName "kube-api-access-qbwbz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 22:12:43 crc kubenswrapper[4919]: I0310 22:12:43.714567 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98b65b94-9415-4a0b-acb7-760a536d250a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "98b65b94-9415-4a0b-acb7-760a536d250a" (UID: "98b65b94-9415-4a0b-acb7-760a536d250a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:12:43 crc kubenswrapper[4919]: I0310 22:12:43.747361 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98b65b94-9415-4a0b-acb7-760a536d250a-config-data" (OuterVolumeSpecName: "config-data") pod "98b65b94-9415-4a0b-acb7-760a536d250a" (UID: "98b65b94-9415-4a0b-acb7-760a536d250a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:12:43 crc kubenswrapper[4919]: I0310 22:12:43.792262 4919 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/98b65b94-9415-4a0b-acb7-760a536d250a-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 10 22:12:43 crc kubenswrapper[4919]: I0310 22:12:43.792298 4919 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/98b65b94-9415-4a0b-acb7-760a536d250a-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 22:12:43 crc kubenswrapper[4919]: I0310 22:12:43.792312 4919 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/98b65b94-9415-4a0b-acb7-760a536d250a-logs\") on node \"crc\" DevicePath \"\"" Mar 10 22:12:43 crc kubenswrapper[4919]: I0310 22:12:43.792321 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qbwbz\" (UniqueName: \"kubernetes.io/projected/98b65b94-9415-4a0b-acb7-760a536d250a-kube-api-access-qbwbz\") on node \"crc\" DevicePath \"\"" Mar 10 22:12:43 crc kubenswrapper[4919]: I0310 22:12:43.792332 4919 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98b65b94-9415-4a0b-acb7-760a536d250a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 22:12:43 crc kubenswrapper[4919]: I0310 22:12:43.813449 4919 generic.go:334] "Generic (PLEG): container finished" podID="8c3d8de2-078f-4fb9-a9b7-9c8b43fb5758" containerID="d101e9f0fbfd844fd8b73dd71104ea0c4e1e50657946aabedfb17e88b7ff5ebd" exitCode=0 Mar 10 22:12:43 crc kubenswrapper[4919]: I0310 22:12:43.813481 4919 generic.go:334] "Generic (PLEG): container finished" podID="8c3d8de2-078f-4fb9-a9b7-9c8b43fb5758" containerID="4a9f702eb992a48dcfe542ee169fd0605840cc5c16a3c5a904798af649730bd9" exitCode=143 Mar 10 22:12:43 crc kubenswrapper[4919]: I0310 22:12:43.813498 4919 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openstack/barbican-worker-84fbf8d4df-qnkcp" Mar 10 22:12:43 crc kubenswrapper[4919]: I0310 22:12:43.813530 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-84fbf8d4df-qnkcp" event={"ID":"8c3d8de2-078f-4fb9-a9b7-9c8b43fb5758","Type":"ContainerDied","Data":"d101e9f0fbfd844fd8b73dd71104ea0c4e1e50657946aabedfb17e88b7ff5ebd"} Mar 10 22:12:43 crc kubenswrapper[4919]: I0310 22:12:43.813555 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-84fbf8d4df-qnkcp" event={"ID":"8c3d8de2-078f-4fb9-a9b7-9c8b43fb5758","Type":"ContainerDied","Data":"4a9f702eb992a48dcfe542ee169fd0605840cc5c16a3c5a904798af649730bd9"} Mar 10 22:12:43 crc kubenswrapper[4919]: I0310 22:12:43.813565 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-84fbf8d4df-qnkcp" event={"ID":"8c3d8de2-078f-4fb9-a9b7-9c8b43fb5758","Type":"ContainerDied","Data":"b0b21bd0053b8a93a7c1d7438be124b6f235afca536b61172ee196ec3cb303b9"} Mar 10 22:12:43 crc kubenswrapper[4919]: I0310 22:12:43.813580 4919 scope.go:117] "RemoveContainer" containerID="d101e9f0fbfd844fd8b73dd71104ea0c4e1e50657946aabedfb17e88b7ff5ebd" Mar 10 22:12:43 crc kubenswrapper[4919]: I0310 22:12:43.817769 4919 generic.go:334] "Generic (PLEG): container finished" podID="98b65b94-9415-4a0b-acb7-760a536d250a" containerID="699650532c86c22ecd43cf7cdf8a1bcd6d10a6251704d029dbd5b36c62d55be0" exitCode=0 Mar 10 22:12:43 crc kubenswrapper[4919]: I0310 22:12:43.817796 4919 generic.go:334] "Generic (PLEG): container finished" podID="98b65b94-9415-4a0b-acb7-760a536d250a" containerID="2451a9341a4cf0ceece5b1ab52b5749f44f7a15c49e2383672cdc1d20d6ac13a" exitCode=143 Mar 10 22:12:43 crc kubenswrapper[4919]: I0310 22:12:43.817816 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7d5f6c4bb6-46xtp" 
event={"ID":"98b65b94-9415-4a0b-acb7-760a536d250a","Type":"ContainerDied","Data":"699650532c86c22ecd43cf7cdf8a1bcd6d10a6251704d029dbd5b36c62d55be0"} Mar 10 22:12:43 crc kubenswrapper[4919]: I0310 22:12:43.817838 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7d5f6c4bb6-46xtp" event={"ID":"98b65b94-9415-4a0b-acb7-760a536d250a","Type":"ContainerDied","Data":"2451a9341a4cf0ceece5b1ab52b5749f44f7a15c49e2383672cdc1d20d6ac13a"} Mar 10 22:12:43 crc kubenswrapper[4919]: I0310 22:12:43.817849 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7d5f6c4bb6-46xtp" event={"ID":"98b65b94-9415-4a0b-acb7-760a536d250a","Type":"ContainerDied","Data":"24977a6a6c7357fdad82fb65e7391073dfce84fa5961a206b3d149b7fa19023c"} Mar 10 22:12:43 crc kubenswrapper[4919]: I0310 22:12:43.817894 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-7d5f6c4bb6-46xtp" Mar 10 22:12:43 crc kubenswrapper[4919]: I0310 22:12:43.846901 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 10 22:12:43 crc kubenswrapper[4919]: I0310 22:12:43.848851 4919 scope.go:117] "RemoveContainer" containerID="4a9f702eb992a48dcfe542ee169fd0605840cc5c16a3c5a904798af649730bd9" Mar 10 22:12:43 crc kubenswrapper[4919]: I0310 22:12:43.867706 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-84fbf8d4df-qnkcp"] Mar 10 22:12:43 crc kubenswrapper[4919]: I0310 22:12:43.872583 4919 scope.go:117] "RemoveContainer" containerID="d101e9f0fbfd844fd8b73dd71104ea0c4e1e50657946aabedfb17e88b7ff5ebd" Mar 10 22:12:43 crc kubenswrapper[4919]: E0310 22:12:43.873008 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d101e9f0fbfd844fd8b73dd71104ea0c4e1e50657946aabedfb17e88b7ff5ebd\": container with ID starting with 
d101e9f0fbfd844fd8b73dd71104ea0c4e1e50657946aabedfb17e88b7ff5ebd not found: ID does not exist" containerID="d101e9f0fbfd844fd8b73dd71104ea0c4e1e50657946aabedfb17e88b7ff5ebd" Mar 10 22:12:43 crc kubenswrapper[4919]: I0310 22:12:43.873054 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d101e9f0fbfd844fd8b73dd71104ea0c4e1e50657946aabedfb17e88b7ff5ebd"} err="failed to get container status \"d101e9f0fbfd844fd8b73dd71104ea0c4e1e50657946aabedfb17e88b7ff5ebd\": rpc error: code = NotFound desc = could not find container \"d101e9f0fbfd844fd8b73dd71104ea0c4e1e50657946aabedfb17e88b7ff5ebd\": container with ID starting with d101e9f0fbfd844fd8b73dd71104ea0c4e1e50657946aabedfb17e88b7ff5ebd not found: ID does not exist" Mar 10 22:12:43 crc kubenswrapper[4919]: I0310 22:12:43.873075 4919 scope.go:117] "RemoveContainer" containerID="4a9f702eb992a48dcfe542ee169fd0605840cc5c16a3c5a904798af649730bd9" Mar 10 22:12:43 crc kubenswrapper[4919]: E0310 22:12:43.873378 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a9f702eb992a48dcfe542ee169fd0605840cc5c16a3c5a904798af649730bd9\": container with ID starting with 4a9f702eb992a48dcfe542ee169fd0605840cc5c16a3c5a904798af649730bd9 not found: ID does not exist" containerID="4a9f702eb992a48dcfe542ee169fd0605840cc5c16a3c5a904798af649730bd9" Mar 10 22:12:43 crc kubenswrapper[4919]: I0310 22:12:43.873417 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a9f702eb992a48dcfe542ee169fd0605840cc5c16a3c5a904798af649730bd9"} err="failed to get container status \"4a9f702eb992a48dcfe542ee169fd0605840cc5c16a3c5a904798af649730bd9\": rpc error: code = NotFound desc = could not find container \"4a9f702eb992a48dcfe542ee169fd0605840cc5c16a3c5a904798af649730bd9\": container with ID starting with 4a9f702eb992a48dcfe542ee169fd0605840cc5c16a3c5a904798af649730bd9 not found: ID does not 
exist" Mar 10 22:12:43 crc kubenswrapper[4919]: I0310 22:12:43.873438 4919 scope.go:117] "RemoveContainer" containerID="d101e9f0fbfd844fd8b73dd71104ea0c4e1e50657946aabedfb17e88b7ff5ebd" Mar 10 22:12:43 crc kubenswrapper[4919]: I0310 22:12:43.873635 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d101e9f0fbfd844fd8b73dd71104ea0c4e1e50657946aabedfb17e88b7ff5ebd"} err="failed to get container status \"d101e9f0fbfd844fd8b73dd71104ea0c4e1e50657946aabedfb17e88b7ff5ebd\": rpc error: code = NotFound desc = could not find container \"d101e9f0fbfd844fd8b73dd71104ea0c4e1e50657946aabedfb17e88b7ff5ebd\": container with ID starting with d101e9f0fbfd844fd8b73dd71104ea0c4e1e50657946aabedfb17e88b7ff5ebd not found: ID does not exist" Mar 10 22:12:43 crc kubenswrapper[4919]: I0310 22:12:43.873650 4919 scope.go:117] "RemoveContainer" containerID="4a9f702eb992a48dcfe542ee169fd0605840cc5c16a3c5a904798af649730bd9" Mar 10 22:12:43 crc kubenswrapper[4919]: I0310 22:12:43.873863 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a9f702eb992a48dcfe542ee169fd0605840cc5c16a3c5a904798af649730bd9"} err="failed to get container status \"4a9f702eb992a48dcfe542ee169fd0605840cc5c16a3c5a904798af649730bd9\": rpc error: code = NotFound desc = could not find container \"4a9f702eb992a48dcfe542ee169fd0605840cc5c16a3c5a904798af649730bd9\": container with ID starting with 4a9f702eb992a48dcfe542ee169fd0605840cc5c16a3c5a904798af649730bd9 not found: ID does not exist" Mar 10 22:12:43 crc kubenswrapper[4919]: I0310 22:12:43.873877 4919 scope.go:117] "RemoveContainer" containerID="699650532c86c22ecd43cf7cdf8a1bcd6d10a6251704d029dbd5b36c62d55be0" Mar 10 22:12:43 crc kubenswrapper[4919]: I0310 22:12:43.875756 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-worker-84fbf8d4df-qnkcp"] Mar 10 22:12:43 crc kubenswrapper[4919]: I0310 22:12:43.884104 4919 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openstack/barbican-keystone-listener-7d5f6c4bb6-46xtp"] Mar 10 22:12:43 crc kubenswrapper[4919]: I0310 22:12:43.893081 4919 scope.go:117] "RemoveContainer" containerID="2451a9341a4cf0ceece5b1ab52b5749f44f7a15c49e2383672cdc1d20d6ac13a" Mar 10 22:12:43 crc kubenswrapper[4919]: I0310 22:12:43.922288 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-keystone-listener-7d5f6c4bb6-46xtp"] Mar 10 22:12:43 crc kubenswrapper[4919]: I0310 22:12:43.946677 4919 scope.go:117] "RemoveContainer" containerID="699650532c86c22ecd43cf7cdf8a1bcd6d10a6251704d029dbd5b36c62d55be0" Mar 10 22:12:43 crc kubenswrapper[4919]: E0310 22:12:43.949079 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"699650532c86c22ecd43cf7cdf8a1bcd6d10a6251704d029dbd5b36c62d55be0\": container with ID starting with 699650532c86c22ecd43cf7cdf8a1bcd6d10a6251704d029dbd5b36c62d55be0 not found: ID does not exist" containerID="699650532c86c22ecd43cf7cdf8a1bcd6d10a6251704d029dbd5b36c62d55be0" Mar 10 22:12:43 crc kubenswrapper[4919]: I0310 22:12:43.949154 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"699650532c86c22ecd43cf7cdf8a1bcd6d10a6251704d029dbd5b36c62d55be0"} err="failed to get container status \"699650532c86c22ecd43cf7cdf8a1bcd6d10a6251704d029dbd5b36c62d55be0\": rpc error: code = NotFound desc = could not find container \"699650532c86c22ecd43cf7cdf8a1bcd6d10a6251704d029dbd5b36c62d55be0\": container with ID starting with 699650532c86c22ecd43cf7cdf8a1bcd6d10a6251704d029dbd5b36c62d55be0 not found: ID does not exist" Mar 10 22:12:43 crc kubenswrapper[4919]: I0310 22:12:43.949185 4919 scope.go:117] "RemoveContainer" containerID="2451a9341a4cf0ceece5b1ab52b5749f44f7a15c49e2383672cdc1d20d6ac13a" Mar 10 22:12:43 crc kubenswrapper[4919]: E0310 22:12:43.949639 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: 
code = NotFound desc = could not find container \"2451a9341a4cf0ceece5b1ab52b5749f44f7a15c49e2383672cdc1d20d6ac13a\": container with ID starting with 2451a9341a4cf0ceece5b1ab52b5749f44f7a15c49e2383672cdc1d20d6ac13a not found: ID does not exist" containerID="2451a9341a4cf0ceece5b1ab52b5749f44f7a15c49e2383672cdc1d20d6ac13a" Mar 10 22:12:43 crc kubenswrapper[4919]: I0310 22:12:43.949686 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2451a9341a4cf0ceece5b1ab52b5749f44f7a15c49e2383672cdc1d20d6ac13a"} err="failed to get container status \"2451a9341a4cf0ceece5b1ab52b5749f44f7a15c49e2383672cdc1d20d6ac13a\": rpc error: code = NotFound desc = could not find container \"2451a9341a4cf0ceece5b1ab52b5749f44f7a15c49e2383672cdc1d20d6ac13a\": container with ID starting with 2451a9341a4cf0ceece5b1ab52b5749f44f7a15c49e2383672cdc1d20d6ac13a not found: ID does not exist" Mar 10 22:12:43 crc kubenswrapper[4919]: I0310 22:12:43.949713 4919 scope.go:117] "RemoveContainer" containerID="699650532c86c22ecd43cf7cdf8a1bcd6d10a6251704d029dbd5b36c62d55be0" Mar 10 22:12:43 crc kubenswrapper[4919]: I0310 22:12:43.949952 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"699650532c86c22ecd43cf7cdf8a1bcd6d10a6251704d029dbd5b36c62d55be0"} err="failed to get container status \"699650532c86c22ecd43cf7cdf8a1bcd6d10a6251704d029dbd5b36c62d55be0\": rpc error: code = NotFound desc = could not find container \"699650532c86c22ecd43cf7cdf8a1bcd6d10a6251704d029dbd5b36c62d55be0\": container with ID starting with 699650532c86c22ecd43cf7cdf8a1bcd6d10a6251704d029dbd5b36c62d55be0 not found: ID does not exist" Mar 10 22:12:43 crc kubenswrapper[4919]: I0310 22:12:43.949967 4919 scope.go:117] "RemoveContainer" containerID="2451a9341a4cf0ceece5b1ab52b5749f44f7a15c49e2383672cdc1d20d6ac13a" Mar 10 22:12:43 crc kubenswrapper[4919]: I0310 22:12:43.950159 4919 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"2451a9341a4cf0ceece5b1ab52b5749f44f7a15c49e2383672cdc1d20d6ac13a"} err="failed to get container status \"2451a9341a4cf0ceece5b1ab52b5749f44f7a15c49e2383672cdc1d20d6ac13a\": rpc error: code = NotFound desc = could not find container \"2451a9341a4cf0ceece5b1ab52b5749f44f7a15c49e2383672cdc1d20d6ac13a\": container with ID starting with 2451a9341a4cf0ceece5b1ab52b5749f44f7a15c49e2383672cdc1d20d6ac13a not found: ID does not exist" Mar 10 22:12:43 crc kubenswrapper[4919]: I0310 22:12:43.955439 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-59bc595969-d7r9w"] Mar 10 22:12:43 crc kubenswrapper[4919]: I0310 22:12:43.956366 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-59bc595969-d7r9w" podUID="d4216f40-ccfe-4c2e-8bd7-944a4413bc43" containerName="neutron-api" containerID="cri-o://157348cf321f1547509c6d04d11d59fea346018cf02810116f3231b189156e80" gracePeriod=30 Mar 10 22:12:43 crc kubenswrapper[4919]: I0310 22:12:43.957727 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-59bc595969-d7r9w" podUID="d4216f40-ccfe-4c2e-8bd7-944a4413bc43" containerName="neutron-httpd" containerID="cri-o://2888487e5f0936f95c545464ab474073718039ae8ed2a613fd2c6a1ac23b7e58" gracePeriod=30 Mar 10 22:12:43 crc kubenswrapper[4919]: I0310 22:12:43.992834 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-846dbc6cd5-kg4kx"] Mar 10 22:12:43 crc kubenswrapper[4919]: E0310 22:12:43.993515 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98b65b94-9415-4a0b-acb7-760a536d250a" containerName="barbican-keystone-listener" Mar 10 22:12:43 crc kubenswrapper[4919]: I0310 22:12:43.993609 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="98b65b94-9415-4a0b-acb7-760a536d250a" containerName="barbican-keystone-listener" Mar 10 22:12:43 crc kubenswrapper[4919]: E0310 22:12:43.993692 4919 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="8c3d8de2-078f-4fb9-a9b7-9c8b43fb5758" containerName="barbican-worker" Mar 10 22:12:43 crc kubenswrapper[4919]: I0310 22:12:43.993765 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c3d8de2-078f-4fb9-a9b7-9c8b43fb5758" containerName="barbican-worker" Mar 10 22:12:43 crc kubenswrapper[4919]: E0310 22:12:43.993971 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98b65b94-9415-4a0b-acb7-760a536d250a" containerName="barbican-keystone-listener-log" Mar 10 22:12:43 crc kubenswrapper[4919]: I0310 22:12:43.994040 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="98b65b94-9415-4a0b-acb7-760a536d250a" containerName="barbican-keystone-listener-log" Mar 10 22:12:43 crc kubenswrapper[4919]: E0310 22:12:43.994124 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c3d8de2-078f-4fb9-a9b7-9c8b43fb5758" containerName="barbican-worker-log" Mar 10 22:12:43 crc kubenswrapper[4919]: I0310 22:12:43.994252 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c3d8de2-078f-4fb9-a9b7-9c8b43fb5758" containerName="barbican-worker-log" Mar 10 22:12:43 crc kubenswrapper[4919]: I0310 22:12:43.994524 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="98b65b94-9415-4a0b-acb7-760a536d250a" containerName="barbican-keystone-listener" Mar 10 22:12:43 crc kubenswrapper[4919]: I0310 22:12:43.994630 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="98b65b94-9415-4a0b-acb7-760a536d250a" containerName="barbican-keystone-listener-log" Mar 10 22:12:43 crc kubenswrapper[4919]: I0310 22:12:43.995047 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c3d8de2-078f-4fb9-a9b7-9c8b43fb5758" containerName="barbican-worker" Mar 10 22:12:43 crc kubenswrapper[4919]: I0310 22:12:43.995273 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c3d8de2-078f-4fb9-a9b7-9c8b43fb5758" containerName="barbican-worker-log" Mar 10 22:12:43 
crc kubenswrapper[4919]: I0310 22:12:43.997846 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-846dbc6cd5-kg4kx" Mar 10 22:12:44 crc kubenswrapper[4919]: I0310 22:12:44.007783 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-846dbc6cd5-kg4kx"] Mar 10 22:12:44 crc kubenswrapper[4919]: I0310 22:12:44.081620 4919 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-59bc595969-d7r9w" podUID="d4216f40-ccfe-4c2e-8bd7-944a4413bc43" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.154:9696/\": read tcp 10.217.0.2:43624->10.217.0.154:9696: read: connection reset by peer" Mar 10 22:12:44 crc kubenswrapper[4919]: I0310 22:12:44.097251 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0a44bcbb-6e2e-48bb-b7a7-16a4e916001d-config\") pod \"neutron-846dbc6cd5-kg4kx\" (UID: \"0a44bcbb-6e2e-48bb-b7a7-16a4e916001d\") " pod="openstack/neutron-846dbc6cd5-kg4kx" Mar 10 22:12:44 crc kubenswrapper[4919]: I0310 22:12:44.097441 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wl4jk\" (UniqueName: \"kubernetes.io/projected/0a44bcbb-6e2e-48bb-b7a7-16a4e916001d-kube-api-access-wl4jk\") pod \"neutron-846dbc6cd5-kg4kx\" (UID: \"0a44bcbb-6e2e-48bb-b7a7-16a4e916001d\") " pod="openstack/neutron-846dbc6cd5-kg4kx" Mar 10 22:12:44 crc kubenswrapper[4919]: I0310 22:12:44.097561 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0a44bcbb-6e2e-48bb-b7a7-16a4e916001d-public-tls-certs\") pod \"neutron-846dbc6cd5-kg4kx\" (UID: \"0a44bcbb-6e2e-48bb-b7a7-16a4e916001d\") " pod="openstack/neutron-846dbc6cd5-kg4kx" Mar 10 22:12:44 crc kubenswrapper[4919]: I0310 22:12:44.097693 4919 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0a44bcbb-6e2e-48bb-b7a7-16a4e916001d-ovndb-tls-certs\") pod \"neutron-846dbc6cd5-kg4kx\" (UID: \"0a44bcbb-6e2e-48bb-b7a7-16a4e916001d\") " pod="openstack/neutron-846dbc6cd5-kg4kx" Mar 10 22:12:44 crc kubenswrapper[4919]: I0310 22:12:44.097813 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0a44bcbb-6e2e-48bb-b7a7-16a4e916001d-internal-tls-certs\") pod \"neutron-846dbc6cd5-kg4kx\" (UID: \"0a44bcbb-6e2e-48bb-b7a7-16a4e916001d\") " pod="openstack/neutron-846dbc6cd5-kg4kx" Mar 10 22:12:44 crc kubenswrapper[4919]: I0310 22:12:44.097886 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a44bcbb-6e2e-48bb-b7a7-16a4e916001d-combined-ca-bundle\") pod \"neutron-846dbc6cd5-kg4kx\" (UID: \"0a44bcbb-6e2e-48bb-b7a7-16a4e916001d\") " pod="openstack/neutron-846dbc6cd5-kg4kx" Mar 10 22:12:44 crc kubenswrapper[4919]: I0310 22:12:44.097958 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0a44bcbb-6e2e-48bb-b7a7-16a4e916001d-httpd-config\") pod \"neutron-846dbc6cd5-kg4kx\" (UID: \"0a44bcbb-6e2e-48bb-b7a7-16a4e916001d\") " pod="openstack/neutron-846dbc6cd5-kg4kx" Mar 10 22:12:44 crc kubenswrapper[4919]: I0310 22:12:44.199607 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0a44bcbb-6e2e-48bb-b7a7-16a4e916001d-config\") pod \"neutron-846dbc6cd5-kg4kx\" (UID: \"0a44bcbb-6e2e-48bb-b7a7-16a4e916001d\") " pod="openstack/neutron-846dbc6cd5-kg4kx" Mar 10 22:12:44 crc kubenswrapper[4919]: I0310 22:12:44.199674 4919 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-wl4jk\" (UniqueName: \"kubernetes.io/projected/0a44bcbb-6e2e-48bb-b7a7-16a4e916001d-kube-api-access-wl4jk\") pod \"neutron-846dbc6cd5-kg4kx\" (UID: \"0a44bcbb-6e2e-48bb-b7a7-16a4e916001d\") " pod="openstack/neutron-846dbc6cd5-kg4kx" Mar 10 22:12:44 crc kubenswrapper[4919]: I0310 22:12:44.199735 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0a44bcbb-6e2e-48bb-b7a7-16a4e916001d-public-tls-certs\") pod \"neutron-846dbc6cd5-kg4kx\" (UID: \"0a44bcbb-6e2e-48bb-b7a7-16a4e916001d\") " pod="openstack/neutron-846dbc6cd5-kg4kx" Mar 10 22:12:44 crc kubenswrapper[4919]: I0310 22:12:44.199771 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0a44bcbb-6e2e-48bb-b7a7-16a4e916001d-ovndb-tls-certs\") pod \"neutron-846dbc6cd5-kg4kx\" (UID: \"0a44bcbb-6e2e-48bb-b7a7-16a4e916001d\") " pod="openstack/neutron-846dbc6cd5-kg4kx" Mar 10 22:12:44 crc kubenswrapper[4919]: I0310 22:12:44.199841 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0a44bcbb-6e2e-48bb-b7a7-16a4e916001d-internal-tls-certs\") pod \"neutron-846dbc6cd5-kg4kx\" (UID: \"0a44bcbb-6e2e-48bb-b7a7-16a4e916001d\") " pod="openstack/neutron-846dbc6cd5-kg4kx" Mar 10 22:12:44 crc kubenswrapper[4919]: I0310 22:12:44.199860 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a44bcbb-6e2e-48bb-b7a7-16a4e916001d-combined-ca-bundle\") pod \"neutron-846dbc6cd5-kg4kx\" (UID: \"0a44bcbb-6e2e-48bb-b7a7-16a4e916001d\") " pod="openstack/neutron-846dbc6cd5-kg4kx" Mar 10 22:12:44 crc kubenswrapper[4919]: I0310 22:12:44.199896 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: 
\"kubernetes.io/secret/0a44bcbb-6e2e-48bb-b7a7-16a4e916001d-httpd-config\") pod \"neutron-846dbc6cd5-kg4kx\" (UID: \"0a44bcbb-6e2e-48bb-b7a7-16a4e916001d\") " pod="openstack/neutron-846dbc6cd5-kg4kx" Mar 10 22:12:44 crc kubenswrapper[4919]: I0310 22:12:44.205266 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0a44bcbb-6e2e-48bb-b7a7-16a4e916001d-internal-tls-certs\") pod \"neutron-846dbc6cd5-kg4kx\" (UID: \"0a44bcbb-6e2e-48bb-b7a7-16a4e916001d\") " pod="openstack/neutron-846dbc6cd5-kg4kx" Mar 10 22:12:44 crc kubenswrapper[4919]: I0310 22:12:44.206212 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0a44bcbb-6e2e-48bb-b7a7-16a4e916001d-ovndb-tls-certs\") pod \"neutron-846dbc6cd5-kg4kx\" (UID: \"0a44bcbb-6e2e-48bb-b7a7-16a4e916001d\") " pod="openstack/neutron-846dbc6cd5-kg4kx" Mar 10 22:12:44 crc kubenswrapper[4919]: I0310 22:12:44.206705 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0a44bcbb-6e2e-48bb-b7a7-16a4e916001d-public-tls-certs\") pod \"neutron-846dbc6cd5-kg4kx\" (UID: \"0a44bcbb-6e2e-48bb-b7a7-16a4e916001d\") " pod="openstack/neutron-846dbc6cd5-kg4kx" Mar 10 22:12:44 crc kubenswrapper[4919]: I0310 22:12:44.206764 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0a44bcbb-6e2e-48bb-b7a7-16a4e916001d-httpd-config\") pod \"neutron-846dbc6cd5-kg4kx\" (UID: \"0a44bcbb-6e2e-48bb-b7a7-16a4e916001d\") " pod="openstack/neutron-846dbc6cd5-kg4kx" Mar 10 22:12:44 crc kubenswrapper[4919]: I0310 22:12:44.208545 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/0a44bcbb-6e2e-48bb-b7a7-16a4e916001d-config\") pod \"neutron-846dbc6cd5-kg4kx\" (UID: 
\"0a44bcbb-6e2e-48bb-b7a7-16a4e916001d\") " pod="openstack/neutron-846dbc6cd5-kg4kx" Mar 10 22:12:44 crc kubenswrapper[4919]: I0310 22:12:44.208789 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a44bcbb-6e2e-48bb-b7a7-16a4e916001d-combined-ca-bundle\") pod \"neutron-846dbc6cd5-kg4kx\" (UID: \"0a44bcbb-6e2e-48bb-b7a7-16a4e916001d\") " pod="openstack/neutron-846dbc6cd5-kg4kx" Mar 10 22:12:44 crc kubenswrapper[4919]: I0310 22:12:44.221915 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wl4jk\" (UniqueName: \"kubernetes.io/projected/0a44bcbb-6e2e-48bb-b7a7-16a4e916001d-kube-api-access-wl4jk\") pod \"neutron-846dbc6cd5-kg4kx\" (UID: \"0a44bcbb-6e2e-48bb-b7a7-16a4e916001d\") " pod="openstack/neutron-846dbc6cd5-kg4kx" Mar 10 22:12:44 crc kubenswrapper[4919]: I0310 22:12:44.360423 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-846dbc6cd5-kg4kx" Mar 10 22:12:44 crc kubenswrapper[4919]: I0310 22:12:44.782045 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-846dbc6cd5-kg4kx"] Mar 10 22:12:44 crc kubenswrapper[4919]: W0310 22:12:44.784130 4919 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0a44bcbb_6e2e_48bb_b7a7_16a4e916001d.slice/crio-badff415a7c7b021a38389a544f86ea1d866544cb7edbb98599811085fafe6ad WatchSource:0}: Error finding container badff415a7c7b021a38389a544f86ea1d866544cb7edbb98599811085fafe6ad: Status 404 returned error can't find the container with id badff415a7c7b021a38389a544f86ea1d866544cb7edbb98599811085fafe6ad Mar 10 22:12:44 crc kubenswrapper[4919]: I0310 22:12:44.826697 4919 generic.go:334] "Generic (PLEG): container finished" podID="d4216f40-ccfe-4c2e-8bd7-944a4413bc43" containerID="2888487e5f0936f95c545464ab474073718039ae8ed2a613fd2c6a1ac23b7e58" exitCode=0 Mar 10 
22:12:44 crc kubenswrapper[4919]: I0310 22:12:44.826780 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-59bc595969-d7r9w" event={"ID":"d4216f40-ccfe-4c2e-8bd7-944a4413bc43","Type":"ContainerDied","Data":"2888487e5f0936f95c545464ab474073718039ae8ed2a613fd2c6a1ac23b7e58"} Mar 10 22:12:44 crc kubenswrapper[4919]: I0310 22:12:44.828321 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-846dbc6cd5-kg4kx" event={"ID":"0a44bcbb-6e2e-48bb-b7a7-16a4e916001d","Type":"ContainerStarted","Data":"badff415a7c7b021a38389a544f86ea1d866544cb7edbb98599811085fafe6ad"} Mar 10 22:12:44 crc kubenswrapper[4919]: I0310 22:12:44.832511 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"481bd64a-9301-42fa-aa9a-7f6c378917df","Type":"ContainerStarted","Data":"9cca8ec01b4a87d8c5b49e33da4ee2ba25e5b65e499a461962fb5724bd260d31"} Mar 10 22:12:44 crc kubenswrapper[4919]: I0310 22:12:44.832534 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"481bd64a-9301-42fa-aa9a-7f6c378917df","Type":"ContainerStarted","Data":"634b78c28f4846a325a9aae464c2dd32d983187d7ff33ab00f3bf7afb4ee69b8"} Mar 10 22:12:45 crc kubenswrapper[4919]: I0310 22:12:45.504763 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c3d8de2-078f-4fb9-a9b7-9c8b43fb5758" path="/var/lib/kubelet/pods/8c3d8de2-078f-4fb9-a9b7-9c8b43fb5758/volumes" Mar 10 22:12:45 crc kubenswrapper[4919]: I0310 22:12:45.505678 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="98b65b94-9415-4a0b-acb7-760a536d250a" path="/var/lib/kubelet/pods/98b65b94-9415-4a0b-acb7-760a536d250a/volumes" Mar 10 22:12:45 crc kubenswrapper[4919]: I0310 22:12:45.846992 4919 generic.go:334] "Generic (PLEG): container finished" podID="0376622a-15ed-42d8-98b9-ffa1138134ee" containerID="db27c753dfd8df1b990a2d95eeec2891aa5193e1c72a29e062a7115cc6c131ca" exitCode=0 Mar 10 22:12:45 crc 
kubenswrapper[4919]: I0310 22:12:45.847072 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-7xmvf" event={"ID":"0376622a-15ed-42d8-98b9-ffa1138134ee","Type":"ContainerDied","Data":"db27c753dfd8df1b990a2d95eeec2891aa5193e1c72a29e062a7115cc6c131ca"} Mar 10 22:12:45 crc kubenswrapper[4919]: I0310 22:12:45.850353 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"481bd64a-9301-42fa-aa9a-7f6c378917df","Type":"ContainerStarted","Data":"9918333c1d69cb6037fa2cf1fc262d4f4996928a4b5efd219fd87dae6600bf76"} Mar 10 22:12:45 crc kubenswrapper[4919]: I0310 22:12:45.852606 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-846dbc6cd5-kg4kx" event={"ID":"0a44bcbb-6e2e-48bb-b7a7-16a4e916001d","Type":"ContainerStarted","Data":"9cdb7599c01cdc95ab93aaa9cd850cf9d1c5bc23e81939b0310b3ed9a9214bc6"} Mar 10 22:12:45 crc kubenswrapper[4919]: I0310 22:12:45.852629 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-846dbc6cd5-kg4kx" event={"ID":"0a44bcbb-6e2e-48bb-b7a7-16a4e916001d","Type":"ContainerStarted","Data":"18dd8dafec6aecc8efed736cca0a71f9d1628505bf0335dc2c01e5a11aba23af"} Mar 10 22:12:45 crc kubenswrapper[4919]: I0310 22:12:45.853149 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-846dbc6cd5-kg4kx" Mar 10 22:12:46 crc kubenswrapper[4919]: I0310 22:12:46.113701 4919 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-59bc595969-d7r9w" podUID="d4216f40-ccfe-4c2e-8bd7-944a4413bc43" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.154:9696/\": dial tcp 10.217.0.154:9696: connect: connection refused" Mar 10 22:12:46 crc kubenswrapper[4919]: I0310 22:12:46.864216 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"481bd64a-9301-42fa-aa9a-7f6c378917df","Type":"ContainerStarted","Data":"2fac8c319f0883b4859572a193aa3a3b06b6bb8f0591a4cb05d30377cb529cfc"} Mar 10 22:12:47 crc kubenswrapper[4919]: I0310 22:12:47.219828 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-7xmvf" Mar 10 22:12:47 crc kubenswrapper[4919]: I0310 22:12:47.248409 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-846dbc6cd5-kg4kx" podStartSLOduration=4.248379176 podStartE2EDuration="4.248379176s" podCreationTimestamp="2026-03-10 22:12:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 22:12:45.888600604 +0000 UTC m=+1353.130481242" watchObservedRunningTime="2026-03-10 22:12:47.248379176 +0000 UTC m=+1354.490259784" Mar 10 22:12:47 crc kubenswrapper[4919]: I0310 22:12:47.382601 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0376622a-15ed-42d8-98b9-ffa1138134ee-scripts\") pod \"0376622a-15ed-42d8-98b9-ffa1138134ee\" (UID: \"0376622a-15ed-42d8-98b9-ffa1138134ee\") " Mar 10 22:12:47 crc kubenswrapper[4919]: I0310 22:12:47.382678 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gkckt\" (UniqueName: \"kubernetes.io/projected/0376622a-15ed-42d8-98b9-ffa1138134ee-kube-api-access-gkckt\") pod \"0376622a-15ed-42d8-98b9-ffa1138134ee\" (UID: \"0376622a-15ed-42d8-98b9-ffa1138134ee\") " Mar 10 22:12:47 crc kubenswrapper[4919]: I0310 22:12:47.382735 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0376622a-15ed-42d8-98b9-ffa1138134ee-etc-machine-id\") pod \"0376622a-15ed-42d8-98b9-ffa1138134ee\" (UID: \"0376622a-15ed-42d8-98b9-ffa1138134ee\") " Mar 10 22:12:47 crc kubenswrapper[4919]: 
I0310 22:12:47.382798 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0376622a-15ed-42d8-98b9-ffa1138134ee-db-sync-config-data\") pod \"0376622a-15ed-42d8-98b9-ffa1138134ee\" (UID: \"0376622a-15ed-42d8-98b9-ffa1138134ee\") " Mar 10 22:12:47 crc kubenswrapper[4919]: I0310 22:12:47.382822 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0376622a-15ed-42d8-98b9-ffa1138134ee-config-data\") pod \"0376622a-15ed-42d8-98b9-ffa1138134ee\" (UID: \"0376622a-15ed-42d8-98b9-ffa1138134ee\") " Mar 10 22:12:47 crc kubenswrapper[4919]: I0310 22:12:47.382928 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0376622a-15ed-42d8-98b9-ffa1138134ee-combined-ca-bundle\") pod \"0376622a-15ed-42d8-98b9-ffa1138134ee\" (UID: \"0376622a-15ed-42d8-98b9-ffa1138134ee\") " Mar 10 22:12:47 crc kubenswrapper[4919]: I0310 22:12:47.383617 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0376622a-15ed-42d8-98b9-ffa1138134ee-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "0376622a-15ed-42d8-98b9-ffa1138134ee" (UID: "0376622a-15ed-42d8-98b9-ffa1138134ee"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 22:12:47 crc kubenswrapper[4919]: I0310 22:12:47.388267 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0376622a-15ed-42d8-98b9-ffa1138134ee-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "0376622a-15ed-42d8-98b9-ffa1138134ee" (UID: "0376622a-15ed-42d8-98b9-ffa1138134ee"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:12:47 crc kubenswrapper[4919]: I0310 22:12:47.389124 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0376622a-15ed-42d8-98b9-ffa1138134ee-kube-api-access-gkckt" (OuterVolumeSpecName: "kube-api-access-gkckt") pod "0376622a-15ed-42d8-98b9-ffa1138134ee" (UID: "0376622a-15ed-42d8-98b9-ffa1138134ee"). InnerVolumeSpecName "kube-api-access-gkckt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 22:12:47 crc kubenswrapper[4919]: I0310 22:12:47.389552 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0376622a-15ed-42d8-98b9-ffa1138134ee-scripts" (OuterVolumeSpecName: "scripts") pod "0376622a-15ed-42d8-98b9-ffa1138134ee" (UID: "0376622a-15ed-42d8-98b9-ffa1138134ee"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:12:47 crc kubenswrapper[4919]: I0310 22:12:47.412288 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0376622a-15ed-42d8-98b9-ffa1138134ee-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0376622a-15ed-42d8-98b9-ffa1138134ee" (UID: "0376622a-15ed-42d8-98b9-ffa1138134ee"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:12:47 crc kubenswrapper[4919]: I0310 22:12:47.444557 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0376622a-15ed-42d8-98b9-ffa1138134ee-config-data" (OuterVolumeSpecName: "config-data") pod "0376622a-15ed-42d8-98b9-ffa1138134ee" (UID: "0376622a-15ed-42d8-98b9-ffa1138134ee"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:12:47 crc kubenswrapper[4919]: I0310 22:12:47.484547 4919 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0376622a-15ed-42d8-98b9-ffa1138134ee-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 10 22:12:47 crc kubenswrapper[4919]: I0310 22:12:47.484585 4919 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0376622a-15ed-42d8-98b9-ffa1138134ee-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 22:12:47 crc kubenswrapper[4919]: I0310 22:12:47.484598 4919 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0376622a-15ed-42d8-98b9-ffa1138134ee-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 22:12:47 crc kubenswrapper[4919]: I0310 22:12:47.484611 4919 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0376622a-15ed-42d8-98b9-ffa1138134ee-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 22:12:47 crc kubenswrapper[4919]: I0310 22:12:47.484623 4919 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0376622a-15ed-42d8-98b9-ffa1138134ee-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 22:12:47 crc kubenswrapper[4919]: I0310 22:12:47.484636 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gkckt\" (UniqueName: \"kubernetes.io/projected/0376622a-15ed-42d8-98b9-ffa1138134ee-kube-api-access-gkckt\") on node \"crc\" DevicePath \"\"" Mar 10 22:12:47 crc kubenswrapper[4919]: I0310 22:12:47.873437 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-7xmvf" event={"ID":"0376622a-15ed-42d8-98b9-ffa1138134ee","Type":"ContainerDied","Data":"9d1e28e3d49d196291959241d948da3cfbdce4b0e3d699a85ba9a2bb6ef25c93"} Mar 10 22:12:47 crc 
kubenswrapper[4919]: I0310 22:12:47.873480 4919 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9d1e28e3d49d196291959241d948da3cfbdce4b0e3d699a85ba9a2bb6ef25c93" Mar 10 22:12:47 crc kubenswrapper[4919]: I0310 22:12:47.873514 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-7xmvf" Mar 10 22:12:48 crc kubenswrapper[4919]: I0310 22:12:48.276845 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5cbf7756bf-6whm5"] Mar 10 22:12:48 crc kubenswrapper[4919]: E0310 22:12:48.277375 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0376622a-15ed-42d8-98b9-ffa1138134ee" containerName="cinder-db-sync" Mar 10 22:12:48 crc kubenswrapper[4919]: I0310 22:12:48.277406 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="0376622a-15ed-42d8-98b9-ffa1138134ee" containerName="cinder-db-sync" Mar 10 22:12:48 crc kubenswrapper[4919]: I0310 22:12:48.277623 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="0376622a-15ed-42d8-98b9-ffa1138134ee" containerName="cinder-db-sync" Mar 10 22:12:48 crc kubenswrapper[4919]: I0310 22:12:48.278775 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5cbf7756bf-6whm5" Mar 10 22:12:48 crc kubenswrapper[4919]: I0310 22:12:48.305957 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1fa63050-4c25-4c14-a046-6899bf0de3a0-ovsdbserver-nb\") pod \"dnsmasq-dns-5cbf7756bf-6whm5\" (UID: \"1fa63050-4c25-4c14-a046-6899bf0de3a0\") " pod="openstack/dnsmasq-dns-5cbf7756bf-6whm5" Mar 10 22:12:48 crc kubenswrapper[4919]: I0310 22:12:48.306235 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1fa63050-4c25-4c14-a046-6899bf0de3a0-config\") pod \"dnsmasq-dns-5cbf7756bf-6whm5\" (UID: \"1fa63050-4c25-4c14-a046-6899bf0de3a0\") " pod="openstack/dnsmasq-dns-5cbf7756bf-6whm5" Mar 10 22:12:48 crc kubenswrapper[4919]: I0310 22:12:48.306387 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1fa63050-4c25-4c14-a046-6899bf0de3a0-ovsdbserver-sb\") pod \"dnsmasq-dns-5cbf7756bf-6whm5\" (UID: \"1fa63050-4c25-4c14-a046-6899bf0de3a0\") " pod="openstack/dnsmasq-dns-5cbf7756bf-6whm5" Mar 10 22:12:48 crc kubenswrapper[4919]: I0310 22:12:48.306632 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1fa63050-4c25-4c14-a046-6899bf0de3a0-dns-svc\") pod \"dnsmasq-dns-5cbf7756bf-6whm5\" (UID: \"1fa63050-4c25-4c14-a046-6899bf0de3a0\") " pod="openstack/dnsmasq-dns-5cbf7756bf-6whm5" Mar 10 22:12:48 crc kubenswrapper[4919]: I0310 22:12:48.306777 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1fa63050-4c25-4c14-a046-6899bf0de3a0-dns-swift-storage-0\") pod \"dnsmasq-dns-5cbf7756bf-6whm5\" 
(UID: \"1fa63050-4c25-4c14-a046-6899bf0de3a0\") " pod="openstack/dnsmasq-dns-5cbf7756bf-6whm5" Mar 10 22:12:48 crc kubenswrapper[4919]: I0310 22:12:48.306940 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f89ck\" (UniqueName: \"kubernetes.io/projected/1fa63050-4c25-4c14-a046-6899bf0de3a0-kube-api-access-f89ck\") pod \"dnsmasq-dns-5cbf7756bf-6whm5\" (UID: \"1fa63050-4c25-4c14-a046-6899bf0de3a0\") " pod="openstack/dnsmasq-dns-5cbf7756bf-6whm5" Mar 10 22:12:48 crc kubenswrapper[4919]: I0310 22:12:48.337731 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5cbf7756bf-6whm5"] Mar 10 22:12:48 crc kubenswrapper[4919]: I0310 22:12:48.350527 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Mar 10 22:12:48 crc kubenswrapper[4919]: I0310 22:12:48.353811 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 10 22:12:48 crc kubenswrapper[4919]: I0310 22:12:48.361263 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 10 22:12:48 crc kubenswrapper[4919]: I0310 22:12:48.362532 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-pxw4p" Mar 10 22:12:48 crc kubenswrapper[4919]: I0310 22:12:48.363020 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Mar 10 22:12:48 crc kubenswrapper[4919]: I0310 22:12:48.371720 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Mar 10 22:12:48 crc kubenswrapper[4919]: I0310 22:12:48.379646 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 10 22:12:48 crc kubenswrapper[4919]: I0310 22:12:48.411478 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3d0533ad-16e6-40b1-be09-46a0d9d9f342-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"3d0533ad-16e6-40b1-be09-46a0d9d9f342\") " pod="openstack/cinder-scheduler-0" Mar 10 22:12:48 crc kubenswrapper[4919]: I0310 22:12:48.411599 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d0533ad-16e6-40b1-be09-46a0d9d9f342-config-data\") pod \"cinder-scheduler-0\" (UID: \"3d0533ad-16e6-40b1-be09-46a0d9d9f342\") " pod="openstack/cinder-scheduler-0" Mar 10 22:12:48 crc kubenswrapper[4919]: I0310 22:12:48.411642 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1fa63050-4c25-4c14-a046-6899bf0de3a0-dns-svc\") pod \"dnsmasq-dns-5cbf7756bf-6whm5\" (UID: \"1fa63050-4c25-4c14-a046-6899bf0de3a0\") " pod="openstack/dnsmasq-dns-5cbf7756bf-6whm5" Mar 10 22:12:48 crc kubenswrapper[4919]: I0310 22:12:48.411681 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mwnfw\" (UniqueName: \"kubernetes.io/projected/3d0533ad-16e6-40b1-be09-46a0d9d9f342-kube-api-access-mwnfw\") pod \"cinder-scheduler-0\" (UID: \"3d0533ad-16e6-40b1-be09-46a0d9d9f342\") " pod="openstack/cinder-scheduler-0" Mar 10 22:12:48 crc kubenswrapper[4919]: I0310 22:12:48.411715 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1fa63050-4c25-4c14-a046-6899bf0de3a0-dns-swift-storage-0\") pod \"dnsmasq-dns-5cbf7756bf-6whm5\" (UID: \"1fa63050-4c25-4c14-a046-6899bf0de3a0\") " pod="openstack/dnsmasq-dns-5cbf7756bf-6whm5" Mar 10 22:12:48 crc kubenswrapper[4919]: I0310 22:12:48.411742 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/3d0533ad-16e6-40b1-be09-46a0d9d9f342-scripts\") pod \"cinder-scheduler-0\" (UID: \"3d0533ad-16e6-40b1-be09-46a0d9d9f342\") " pod="openstack/cinder-scheduler-0" Mar 10 22:12:48 crc kubenswrapper[4919]: I0310 22:12:48.411770 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f89ck\" (UniqueName: \"kubernetes.io/projected/1fa63050-4c25-4c14-a046-6899bf0de3a0-kube-api-access-f89ck\") pod \"dnsmasq-dns-5cbf7756bf-6whm5\" (UID: \"1fa63050-4c25-4c14-a046-6899bf0de3a0\") " pod="openstack/dnsmasq-dns-5cbf7756bf-6whm5" Mar 10 22:12:48 crc kubenswrapper[4919]: I0310 22:12:48.411830 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1fa63050-4c25-4c14-a046-6899bf0de3a0-ovsdbserver-nb\") pod \"dnsmasq-dns-5cbf7756bf-6whm5\" (UID: \"1fa63050-4c25-4c14-a046-6899bf0de3a0\") " pod="openstack/dnsmasq-dns-5cbf7756bf-6whm5" Mar 10 22:12:48 crc kubenswrapper[4919]: I0310 22:12:48.411859 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1fa63050-4c25-4c14-a046-6899bf0de3a0-config\") pod \"dnsmasq-dns-5cbf7756bf-6whm5\" (UID: \"1fa63050-4c25-4c14-a046-6899bf0de3a0\") " pod="openstack/dnsmasq-dns-5cbf7756bf-6whm5" Mar 10 22:12:48 crc kubenswrapper[4919]: I0310 22:12:48.411904 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d0533ad-16e6-40b1-be09-46a0d9d9f342-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"3d0533ad-16e6-40b1-be09-46a0d9d9f342\") " pod="openstack/cinder-scheduler-0" Mar 10 22:12:48 crc kubenswrapper[4919]: I0310 22:12:48.411931 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1fa63050-4c25-4c14-a046-6899bf0de3a0-ovsdbserver-sb\") 
pod \"dnsmasq-dns-5cbf7756bf-6whm5\" (UID: \"1fa63050-4c25-4c14-a046-6899bf0de3a0\") " pod="openstack/dnsmasq-dns-5cbf7756bf-6whm5" Mar 10 22:12:48 crc kubenswrapper[4919]: I0310 22:12:48.411976 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3d0533ad-16e6-40b1-be09-46a0d9d9f342-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"3d0533ad-16e6-40b1-be09-46a0d9d9f342\") " pod="openstack/cinder-scheduler-0" Mar 10 22:12:48 crc kubenswrapper[4919]: I0310 22:12:48.413502 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1fa63050-4c25-4c14-a046-6899bf0de3a0-ovsdbserver-nb\") pod \"dnsmasq-dns-5cbf7756bf-6whm5\" (UID: \"1fa63050-4c25-4c14-a046-6899bf0de3a0\") " pod="openstack/dnsmasq-dns-5cbf7756bf-6whm5" Mar 10 22:12:48 crc kubenswrapper[4919]: I0310 22:12:48.413875 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1fa63050-4c25-4c14-a046-6899bf0de3a0-dns-svc\") pod \"dnsmasq-dns-5cbf7756bf-6whm5\" (UID: \"1fa63050-4c25-4c14-a046-6899bf0de3a0\") " pod="openstack/dnsmasq-dns-5cbf7756bf-6whm5" Mar 10 22:12:48 crc kubenswrapper[4919]: I0310 22:12:48.414637 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1fa63050-4c25-4c14-a046-6899bf0de3a0-ovsdbserver-sb\") pod \"dnsmasq-dns-5cbf7756bf-6whm5\" (UID: \"1fa63050-4c25-4c14-a046-6899bf0de3a0\") " pod="openstack/dnsmasq-dns-5cbf7756bf-6whm5" Mar 10 22:12:48 crc kubenswrapper[4919]: I0310 22:12:48.415902 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1fa63050-4c25-4c14-a046-6899bf0de3a0-config\") pod \"dnsmasq-dns-5cbf7756bf-6whm5\" (UID: \"1fa63050-4c25-4c14-a046-6899bf0de3a0\") " 
pod="openstack/dnsmasq-dns-5cbf7756bf-6whm5" Mar 10 22:12:48 crc kubenswrapper[4919]: I0310 22:12:48.416936 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1fa63050-4c25-4c14-a046-6899bf0de3a0-dns-swift-storage-0\") pod \"dnsmasq-dns-5cbf7756bf-6whm5\" (UID: \"1fa63050-4c25-4c14-a046-6899bf0de3a0\") " pod="openstack/dnsmasq-dns-5cbf7756bf-6whm5" Mar 10 22:12:48 crc kubenswrapper[4919]: I0310 22:12:48.443148 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f89ck\" (UniqueName: \"kubernetes.io/projected/1fa63050-4c25-4c14-a046-6899bf0de3a0-kube-api-access-f89ck\") pod \"dnsmasq-dns-5cbf7756bf-6whm5\" (UID: \"1fa63050-4c25-4c14-a046-6899bf0de3a0\") " pod="openstack/dnsmasq-dns-5cbf7756bf-6whm5" Mar 10 22:12:48 crc kubenswrapper[4919]: I0310 22:12:48.496147 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 10 22:12:48 crc kubenswrapper[4919]: I0310 22:12:48.498744 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 10 22:12:48 crc kubenswrapper[4919]: I0310 22:12:48.504523 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 10 22:12:48 crc kubenswrapper[4919]: I0310 22:12:48.507247 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 10 22:12:48 crc kubenswrapper[4919]: I0310 22:12:48.513269 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d0533ad-16e6-40b1-be09-46a0d9d9f342-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"3d0533ad-16e6-40b1-be09-46a0d9d9f342\") " pod="openstack/cinder-scheduler-0" Mar 10 22:12:48 crc kubenswrapper[4919]: I0310 22:12:48.514037 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75d98b79-f3d7-4522-a83b-bcc8023fb097-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"75d98b79-f3d7-4522-a83b-bcc8023fb097\") " pod="openstack/cinder-api-0" Mar 10 22:12:48 crc kubenswrapper[4919]: I0310 22:12:48.514096 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3d0533ad-16e6-40b1-be09-46a0d9d9f342-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"3d0533ad-16e6-40b1-be09-46a0d9d9f342\") " pod="openstack/cinder-scheduler-0" Mar 10 22:12:48 crc kubenswrapper[4919]: I0310 22:12:48.514209 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/75d98b79-f3d7-4522-a83b-bcc8023fb097-etc-machine-id\") pod \"cinder-api-0\" (UID: \"75d98b79-f3d7-4522-a83b-bcc8023fb097\") " pod="openstack/cinder-api-0" Mar 10 22:12:48 crc kubenswrapper[4919]: I0310 22:12:48.514237 4919 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/75d98b79-f3d7-4522-a83b-bcc8023fb097-scripts\") pod \"cinder-api-0\" (UID: \"75d98b79-f3d7-4522-a83b-bcc8023fb097\") " pod="openstack/cinder-api-0" Mar 10 22:12:48 crc kubenswrapper[4919]: I0310 22:12:48.514269 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3d0533ad-16e6-40b1-be09-46a0d9d9f342-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"3d0533ad-16e6-40b1-be09-46a0d9d9f342\") " pod="openstack/cinder-scheduler-0" Mar 10 22:12:48 crc kubenswrapper[4919]: I0310 22:12:48.514291 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d0533ad-16e6-40b1-be09-46a0d9d9f342-config-data\") pod \"cinder-scheduler-0\" (UID: \"3d0533ad-16e6-40b1-be09-46a0d9d9f342\") " pod="openstack/cinder-scheduler-0" Mar 10 22:12:48 crc kubenswrapper[4919]: I0310 22:12:48.514418 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3d0533ad-16e6-40b1-be09-46a0d9d9f342-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"3d0533ad-16e6-40b1-be09-46a0d9d9f342\") " pod="openstack/cinder-scheduler-0" Mar 10 22:12:48 crc kubenswrapper[4919]: I0310 22:12:48.514371 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mwnfw\" (UniqueName: \"kubernetes.io/projected/3d0533ad-16e6-40b1-be09-46a0d9d9f342-kube-api-access-mwnfw\") pod \"cinder-scheduler-0\" (UID: \"3d0533ad-16e6-40b1-be09-46a0d9d9f342\") " pod="openstack/cinder-scheduler-0" Mar 10 22:12:48 crc kubenswrapper[4919]: I0310 22:12:48.514785 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/75d98b79-f3d7-4522-a83b-bcc8023fb097-logs\") pod 
\"cinder-api-0\" (UID: \"75d98b79-f3d7-4522-a83b-bcc8023fb097\") " pod="openstack/cinder-api-0" Mar 10 22:12:48 crc kubenswrapper[4919]: I0310 22:12:48.515064 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d0533ad-16e6-40b1-be09-46a0d9d9f342-scripts\") pod \"cinder-scheduler-0\" (UID: \"3d0533ad-16e6-40b1-be09-46a0d9d9f342\") " pod="openstack/cinder-scheduler-0" Mar 10 22:12:48 crc kubenswrapper[4919]: I0310 22:12:48.515102 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cg55n\" (UniqueName: \"kubernetes.io/projected/75d98b79-f3d7-4522-a83b-bcc8023fb097-kube-api-access-cg55n\") pod \"cinder-api-0\" (UID: \"75d98b79-f3d7-4522-a83b-bcc8023fb097\") " pod="openstack/cinder-api-0" Mar 10 22:12:48 crc kubenswrapper[4919]: I0310 22:12:48.515215 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/75d98b79-f3d7-4522-a83b-bcc8023fb097-config-data-custom\") pod \"cinder-api-0\" (UID: \"75d98b79-f3d7-4522-a83b-bcc8023fb097\") " pod="openstack/cinder-api-0" Mar 10 22:12:48 crc kubenswrapper[4919]: I0310 22:12:48.515313 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/75d98b79-f3d7-4522-a83b-bcc8023fb097-config-data\") pod \"cinder-api-0\" (UID: \"75d98b79-f3d7-4522-a83b-bcc8023fb097\") " pod="openstack/cinder-api-0" Mar 10 22:12:48 crc kubenswrapper[4919]: I0310 22:12:48.528023 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d0533ad-16e6-40b1-be09-46a0d9d9f342-scripts\") pod \"cinder-scheduler-0\" (UID: \"3d0533ad-16e6-40b1-be09-46a0d9d9f342\") " pod="openstack/cinder-scheduler-0" Mar 10 22:12:48 crc kubenswrapper[4919]: I0310 22:12:48.533045 4919 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d0533ad-16e6-40b1-be09-46a0d9d9f342-config-data\") pod \"cinder-scheduler-0\" (UID: \"3d0533ad-16e6-40b1-be09-46a0d9d9f342\") " pod="openstack/cinder-scheduler-0" Mar 10 22:12:48 crc kubenswrapper[4919]: I0310 22:12:48.546785 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3d0533ad-16e6-40b1-be09-46a0d9d9f342-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"3d0533ad-16e6-40b1-be09-46a0d9d9f342\") " pod="openstack/cinder-scheduler-0" Mar 10 22:12:48 crc kubenswrapper[4919]: I0310 22:12:48.555530 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mwnfw\" (UniqueName: \"kubernetes.io/projected/3d0533ad-16e6-40b1-be09-46a0d9d9f342-kube-api-access-mwnfw\") pod \"cinder-scheduler-0\" (UID: \"3d0533ad-16e6-40b1-be09-46a0d9d9f342\") " pod="openstack/cinder-scheduler-0" Mar 10 22:12:48 crc kubenswrapper[4919]: I0310 22:12:48.556143 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d0533ad-16e6-40b1-be09-46a0d9d9f342-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"3d0533ad-16e6-40b1-be09-46a0d9d9f342\") " pod="openstack/cinder-scheduler-0" Mar 10 22:12:48 crc kubenswrapper[4919]: I0310 22:12:48.619876 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75d98b79-f3d7-4522-a83b-bcc8023fb097-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"75d98b79-f3d7-4522-a83b-bcc8023fb097\") " pod="openstack/cinder-api-0" Mar 10 22:12:48 crc kubenswrapper[4919]: I0310 22:12:48.619962 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/75d98b79-f3d7-4522-a83b-bcc8023fb097-etc-machine-id\") pod \"cinder-api-0\" (UID: \"75d98b79-f3d7-4522-a83b-bcc8023fb097\") " pod="openstack/cinder-api-0" Mar 10 22:12:48 crc kubenswrapper[4919]: I0310 22:12:48.619978 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/75d98b79-f3d7-4522-a83b-bcc8023fb097-scripts\") pod \"cinder-api-0\" (UID: \"75d98b79-f3d7-4522-a83b-bcc8023fb097\") " pod="openstack/cinder-api-0" Mar 10 22:12:48 crc kubenswrapper[4919]: I0310 22:12:48.620035 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/75d98b79-f3d7-4522-a83b-bcc8023fb097-logs\") pod \"cinder-api-0\" (UID: \"75d98b79-f3d7-4522-a83b-bcc8023fb097\") " pod="openstack/cinder-api-0" Mar 10 22:12:48 crc kubenswrapper[4919]: I0310 22:12:48.620064 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cg55n\" (UniqueName: \"kubernetes.io/projected/75d98b79-f3d7-4522-a83b-bcc8023fb097-kube-api-access-cg55n\") pod \"cinder-api-0\" (UID: \"75d98b79-f3d7-4522-a83b-bcc8023fb097\") " pod="openstack/cinder-api-0" Mar 10 22:12:48 crc kubenswrapper[4919]: I0310 22:12:48.620118 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/75d98b79-f3d7-4522-a83b-bcc8023fb097-config-data-custom\") pod \"cinder-api-0\" (UID: \"75d98b79-f3d7-4522-a83b-bcc8023fb097\") " pod="openstack/cinder-api-0" Mar 10 22:12:48 crc kubenswrapper[4919]: I0310 22:12:48.620154 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/75d98b79-f3d7-4522-a83b-bcc8023fb097-config-data\") pod \"cinder-api-0\" (UID: \"75d98b79-f3d7-4522-a83b-bcc8023fb097\") " pod="openstack/cinder-api-0" Mar 10 22:12:48 crc kubenswrapper[4919]: I0310 
22:12:48.626906 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/75d98b79-f3d7-4522-a83b-bcc8023fb097-etc-machine-id\") pod \"cinder-api-0\" (UID: \"75d98b79-f3d7-4522-a83b-bcc8023fb097\") " pod="openstack/cinder-api-0" Mar 10 22:12:48 crc kubenswrapper[4919]: I0310 22:12:48.627558 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/75d98b79-f3d7-4522-a83b-bcc8023fb097-logs\") pod \"cinder-api-0\" (UID: \"75d98b79-f3d7-4522-a83b-bcc8023fb097\") " pod="openstack/cinder-api-0" Mar 10 22:12:48 crc kubenswrapper[4919]: I0310 22:12:48.628346 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/75d98b79-f3d7-4522-a83b-bcc8023fb097-scripts\") pod \"cinder-api-0\" (UID: \"75d98b79-f3d7-4522-a83b-bcc8023fb097\") " pod="openstack/cinder-api-0" Mar 10 22:12:48 crc kubenswrapper[4919]: I0310 22:12:48.631668 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75d98b79-f3d7-4522-a83b-bcc8023fb097-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"75d98b79-f3d7-4522-a83b-bcc8023fb097\") " pod="openstack/cinder-api-0" Mar 10 22:12:48 crc kubenswrapper[4919]: I0310 22:12:48.632000 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/75d98b79-f3d7-4522-a83b-bcc8023fb097-config-data-custom\") pod \"cinder-api-0\" (UID: \"75d98b79-f3d7-4522-a83b-bcc8023fb097\") " pod="openstack/cinder-api-0" Mar 10 22:12:48 crc kubenswrapper[4919]: I0310 22:12:48.635274 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/75d98b79-f3d7-4522-a83b-bcc8023fb097-config-data\") pod \"cinder-api-0\" (UID: \"75d98b79-f3d7-4522-a83b-bcc8023fb097\") " 
pod="openstack/cinder-api-0" Mar 10 22:12:48 crc kubenswrapper[4919]: I0310 22:12:48.662178 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cg55n\" (UniqueName: \"kubernetes.io/projected/75d98b79-f3d7-4522-a83b-bcc8023fb097-kube-api-access-cg55n\") pod \"cinder-api-0\" (UID: \"75d98b79-f3d7-4522-a83b-bcc8023fb097\") " pod="openstack/cinder-api-0" Mar 10 22:12:48 crc kubenswrapper[4919]: I0310 22:12:48.727300 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5cbf7756bf-6whm5" Mar 10 22:12:48 crc kubenswrapper[4919]: I0310 22:12:48.800568 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 10 22:12:48 crc kubenswrapper[4919]: I0310 22:12:48.887953 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 10 22:12:48 crc kubenswrapper[4919]: I0310 22:12:48.891471 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"481bd64a-9301-42fa-aa9a-7f6c378917df","Type":"ContainerStarted","Data":"334b53fdb906c5c53be43712e4a21afe588a10f2198bd1d297c8e69a8c02cc35"} Mar 10 22:12:48 crc kubenswrapper[4919]: I0310 22:12:48.892266 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 10 22:12:49 crc kubenswrapper[4919]: I0310 22:12:49.221913 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.853990902 podStartE2EDuration="7.221899306s" podCreationTimestamp="2026-03-10 22:12:42 +0000 UTC" firstStartedPulling="2026-03-10 22:12:43.855450398 +0000 UTC m=+1351.097331006" lastFinishedPulling="2026-03-10 22:12:48.223358792 +0000 UTC m=+1355.465239410" observedRunningTime="2026-03-10 22:12:48.914768595 +0000 UTC m=+1356.156649193" watchObservedRunningTime="2026-03-10 22:12:49.221899306 +0000 UTC m=+1356.463779914" Mar 10 
22:12:49 crc kubenswrapper[4919]: I0310 22:12:49.228082 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5cbf7756bf-6whm5"] Mar 10 22:12:49 crc kubenswrapper[4919]: I0310 22:12:49.412698 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 10 22:12:49 crc kubenswrapper[4919]: I0310 22:12:49.605463 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 10 22:12:49 crc kubenswrapper[4919]: W0310 22:12:49.675170 4919 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod75d98b79_f3d7_4522_a83b_bcc8023fb097.slice/crio-a763b24ea8db26e6a0f2f3f3f4c659cb23b96f3ecd2a8f308490f5397e725111 WatchSource:0}: Error finding container a763b24ea8db26e6a0f2f3f3f4c659cb23b96f3ecd2a8f308490f5397e725111: Status 404 returned error can't find the container with id a763b24ea8db26e6a0f2f3f3f4c659cb23b96f3ecd2a8f308490f5397e725111 Mar 10 22:12:49 crc kubenswrapper[4919]: I0310 22:12:49.913466 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"3d0533ad-16e6-40b1-be09-46a0d9d9f342","Type":"ContainerStarted","Data":"f82fb3596f35c7410c78a07021a98778bf2376d4d93dc7dacecc11fc54734dbb"} Mar 10 22:12:49 crc kubenswrapper[4919]: I0310 22:12:49.916062 4919 generic.go:334] "Generic (PLEG): container finished" podID="1fa63050-4c25-4c14-a046-6899bf0de3a0" containerID="d613139a230ae8aa02383da619842787d20f3871a917a1d083fc34928c5e3425" exitCode=0 Mar 10 22:12:49 crc kubenswrapper[4919]: I0310 22:12:49.916980 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5cbf7756bf-6whm5" event={"ID":"1fa63050-4c25-4c14-a046-6899bf0de3a0","Type":"ContainerDied","Data":"d613139a230ae8aa02383da619842787d20f3871a917a1d083fc34928c5e3425"} Mar 10 22:12:49 crc kubenswrapper[4919]: I0310 22:12:49.917080 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-5cbf7756bf-6whm5" event={"ID":"1fa63050-4c25-4c14-a046-6899bf0de3a0","Type":"ContainerStarted","Data":"54a5846f0762b119f28ccf760491f8a56eb41d52575a1e45f9339592a37c3c55"} Mar 10 22:12:49 crc kubenswrapper[4919]: I0310 22:12:49.921168 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"75d98b79-f3d7-4522-a83b-bcc8023fb097","Type":"ContainerStarted","Data":"a763b24ea8db26e6a0f2f3f3f4c659cb23b96f3ecd2a8f308490f5397e725111"} Mar 10 22:12:50 crc kubenswrapper[4919]: I0310 22:12:50.524502 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 10 22:12:50 crc kubenswrapper[4919]: I0310 22:12:50.943543 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5cbf7756bf-6whm5" event={"ID":"1fa63050-4c25-4c14-a046-6899bf0de3a0","Type":"ContainerStarted","Data":"105c15b58bc8544c686157743a78596884bc9236556a606b96aa40fe00b37ce9"} Mar 10 22:12:50 crc kubenswrapper[4919]: I0310 22:12:50.943815 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5cbf7756bf-6whm5" Mar 10 22:12:50 crc kubenswrapper[4919]: I0310 22:12:50.945528 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"75d98b79-f3d7-4522-a83b-bcc8023fb097","Type":"ContainerStarted","Data":"58833681c2f5c6623daa17cd3cfd9c80d0b03aef9146684d2a712d5ee6437090"} Mar 10 22:12:50 crc kubenswrapper[4919]: I0310 22:12:50.977843 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5cbf7756bf-6whm5" podStartSLOduration=2.97782329 podStartE2EDuration="2.97782329s" podCreationTimestamp="2026-03-10 22:12:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 22:12:50.965769064 +0000 UTC m=+1358.207649672" watchObservedRunningTime="2026-03-10 22:12:50.97782329 +0000 UTC 
m=+1358.219703898" Mar 10 22:12:51 crc kubenswrapper[4919]: I0310 22:12:51.956763 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"3d0533ad-16e6-40b1-be09-46a0d9d9f342","Type":"ContainerStarted","Data":"b6c27257dbc2731eaf36852f0613861b336ec7cc0b8cbeeda0b7a99e38bbba56"} Mar 10 22:12:51 crc kubenswrapper[4919]: I0310 22:12:51.957007 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"3d0533ad-16e6-40b1-be09-46a0d9d9f342","Type":"ContainerStarted","Data":"085c0960e485c0602523fb339f7fbfe3db7d3b23dd248dd8b169e37d44f16776"} Mar 10 22:12:51 crc kubenswrapper[4919]: I0310 22:12:51.958986 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"75d98b79-f3d7-4522-a83b-bcc8023fb097","Type":"ContainerStarted","Data":"b1949be2adfe1af52b1a2c4416fb1195c8c3be0b846d72c9ed9981d96ff0346c"} Mar 10 22:12:51 crc kubenswrapper[4919]: I0310 22:12:51.959180 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="75d98b79-f3d7-4522-a83b-bcc8023fb097" containerName="cinder-api-log" containerID="cri-o://58833681c2f5c6623daa17cd3cfd9c80d0b03aef9146684d2a712d5ee6437090" gracePeriod=30 Mar 10 22:12:51 crc kubenswrapper[4919]: I0310 22:12:51.959242 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="75d98b79-f3d7-4522-a83b-bcc8023fb097" containerName="cinder-api" containerID="cri-o://b1949be2adfe1af52b1a2c4416fb1195c8c3be0b846d72c9ed9981d96ff0346c" gracePeriod=30 Mar 10 22:12:51 crc kubenswrapper[4919]: I0310 22:12:51.988902 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.131692509 podStartE2EDuration="3.988883543s" podCreationTimestamp="2026-03-10 22:12:48 +0000 UTC" firstStartedPulling="2026-03-10 22:12:49.41470067 +0000 UTC m=+1356.656581278" 
lastFinishedPulling="2026-03-10 22:12:50.271891704 +0000 UTC m=+1357.513772312" observedRunningTime="2026-03-10 22:12:51.983402505 +0000 UTC m=+1359.225283113" watchObservedRunningTime="2026-03-10 22:12:51.988883543 +0000 UTC m=+1359.230764151" Mar 10 22:12:52 crc kubenswrapper[4919]: I0310 22:12:52.010377 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.010357496 podStartE2EDuration="4.010357496s" podCreationTimestamp="2026-03-10 22:12:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 22:12:52.009729058 +0000 UTC m=+1359.251609666" watchObservedRunningTime="2026-03-10 22:12:52.010357496 +0000 UTC m=+1359.252238124" Mar 10 22:12:52 crc kubenswrapper[4919]: I0310 22:12:52.632907 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 10 22:12:52 crc kubenswrapper[4919]: I0310 22:12:52.729853 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/75d98b79-f3d7-4522-a83b-bcc8023fb097-etc-machine-id\") pod \"75d98b79-f3d7-4522-a83b-bcc8023fb097\" (UID: \"75d98b79-f3d7-4522-a83b-bcc8023fb097\") " Mar 10 22:12:52 crc kubenswrapper[4919]: I0310 22:12:52.729921 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/75d98b79-f3d7-4522-a83b-bcc8023fb097-scripts\") pod \"75d98b79-f3d7-4522-a83b-bcc8023fb097\" (UID: \"75d98b79-f3d7-4522-a83b-bcc8023fb097\") " Mar 10 22:12:52 crc kubenswrapper[4919]: I0310 22:12:52.729984 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cg55n\" (UniqueName: \"kubernetes.io/projected/75d98b79-f3d7-4522-a83b-bcc8023fb097-kube-api-access-cg55n\") pod \"75d98b79-f3d7-4522-a83b-bcc8023fb097\" (UID: 
\"75d98b79-f3d7-4522-a83b-bcc8023fb097\") " Mar 10 22:12:52 crc kubenswrapper[4919]: I0310 22:12:52.730215 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75d98b79-f3d7-4522-a83b-bcc8023fb097-combined-ca-bundle\") pod \"75d98b79-f3d7-4522-a83b-bcc8023fb097\" (UID: \"75d98b79-f3d7-4522-a83b-bcc8023fb097\") " Mar 10 22:12:52 crc kubenswrapper[4919]: I0310 22:12:52.730243 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/75d98b79-f3d7-4522-a83b-bcc8023fb097-config-data\") pod \"75d98b79-f3d7-4522-a83b-bcc8023fb097\" (UID: \"75d98b79-f3d7-4522-a83b-bcc8023fb097\") " Mar 10 22:12:52 crc kubenswrapper[4919]: I0310 22:12:52.730275 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/75d98b79-f3d7-4522-a83b-bcc8023fb097-logs\") pod \"75d98b79-f3d7-4522-a83b-bcc8023fb097\" (UID: \"75d98b79-f3d7-4522-a83b-bcc8023fb097\") " Mar 10 22:12:52 crc kubenswrapper[4919]: I0310 22:12:52.730304 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/75d98b79-f3d7-4522-a83b-bcc8023fb097-config-data-custom\") pod \"75d98b79-f3d7-4522-a83b-bcc8023fb097\" (UID: \"75d98b79-f3d7-4522-a83b-bcc8023fb097\") " Mar 10 22:12:52 crc kubenswrapper[4919]: I0310 22:12:52.732059 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/75d98b79-f3d7-4522-a83b-bcc8023fb097-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "75d98b79-f3d7-4522-a83b-bcc8023fb097" (UID: "75d98b79-f3d7-4522-a83b-bcc8023fb097"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 10 22:12:52 crc kubenswrapper[4919]: I0310 22:12:52.732308 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/75d98b79-f3d7-4522-a83b-bcc8023fb097-logs" (OuterVolumeSpecName: "logs") pod "75d98b79-f3d7-4522-a83b-bcc8023fb097" (UID: "75d98b79-f3d7-4522-a83b-bcc8023fb097"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 22:12:52 crc kubenswrapper[4919]: I0310 22:12:52.737292 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75d98b79-f3d7-4522-a83b-bcc8023fb097-scripts" (OuterVolumeSpecName: "scripts") pod "75d98b79-f3d7-4522-a83b-bcc8023fb097" (UID: "75d98b79-f3d7-4522-a83b-bcc8023fb097"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 22:12:52 crc kubenswrapper[4919]: I0310 22:12:52.737539 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75d98b79-f3d7-4522-a83b-bcc8023fb097-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "75d98b79-f3d7-4522-a83b-bcc8023fb097" (UID: "75d98b79-f3d7-4522-a83b-bcc8023fb097"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 22:12:52 crc kubenswrapper[4919]: I0310 22:12:52.737677 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75d98b79-f3d7-4522-a83b-bcc8023fb097-kube-api-access-cg55n" (OuterVolumeSpecName: "kube-api-access-cg55n") pod "75d98b79-f3d7-4522-a83b-bcc8023fb097" (UID: "75d98b79-f3d7-4522-a83b-bcc8023fb097"). InnerVolumeSpecName "kube-api-access-cg55n". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 22:12:52 crc kubenswrapper[4919]: I0310 22:12:52.765761 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75d98b79-f3d7-4522-a83b-bcc8023fb097-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "75d98b79-f3d7-4522-a83b-bcc8023fb097" (UID: "75d98b79-f3d7-4522-a83b-bcc8023fb097"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 22:12:52 crc kubenswrapper[4919]: I0310 22:12:52.797171 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75d98b79-f3d7-4522-a83b-bcc8023fb097-config-data" (OuterVolumeSpecName: "config-data") pod "75d98b79-f3d7-4522-a83b-bcc8023fb097" (UID: "75d98b79-f3d7-4522-a83b-bcc8023fb097"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 22:12:52 crc kubenswrapper[4919]: I0310 22:12:52.836608 4919 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/75d98b79-f3d7-4522-a83b-bcc8023fb097-logs\") on node \"crc\" DevicePath \"\""
Mar 10 22:12:52 crc kubenswrapper[4919]: I0310 22:12:52.836680 4919 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/75d98b79-f3d7-4522-a83b-bcc8023fb097-config-data-custom\") on node \"crc\" DevicePath \"\""
Mar 10 22:12:52 crc kubenswrapper[4919]: I0310 22:12:52.836694 4919 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/75d98b79-f3d7-4522-a83b-bcc8023fb097-etc-machine-id\") on node \"crc\" DevicePath \"\""
Mar 10 22:12:52 crc kubenswrapper[4919]: I0310 22:12:52.836704 4919 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/75d98b79-f3d7-4522-a83b-bcc8023fb097-scripts\") on node \"crc\" DevicePath \"\""
Mar 10 22:12:52 crc kubenswrapper[4919]: I0310 22:12:52.836713 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cg55n\" (UniqueName: \"kubernetes.io/projected/75d98b79-f3d7-4522-a83b-bcc8023fb097-kube-api-access-cg55n\") on node \"crc\" DevicePath \"\""
Mar 10 22:12:52 crc kubenswrapper[4919]: I0310 22:12:52.836723 4919 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75d98b79-f3d7-4522-a83b-bcc8023fb097-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 10 22:12:52 crc kubenswrapper[4919]: I0310 22:12:52.836731 4919 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/75d98b79-f3d7-4522-a83b-bcc8023fb097-config-data\") on node \"crc\" DevicePath \"\""
Mar 10 22:12:52 crc kubenswrapper[4919]: I0310 22:12:52.976837 4919 generic.go:334] "Generic (PLEG): container finished" podID="75d98b79-f3d7-4522-a83b-bcc8023fb097" containerID="b1949be2adfe1af52b1a2c4416fb1195c8c3be0b846d72c9ed9981d96ff0346c" exitCode=0
Mar 10 22:12:52 crc kubenswrapper[4919]: I0310 22:12:52.976869 4919 generic.go:334] "Generic (PLEG): container finished" podID="75d98b79-f3d7-4522-a83b-bcc8023fb097" containerID="58833681c2f5c6623daa17cd3cfd9c80d0b03aef9146684d2a712d5ee6437090" exitCode=143
Mar 10 22:12:52 crc kubenswrapper[4919]: I0310 22:12:52.976891 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Mar 10 22:12:52 crc kubenswrapper[4919]: I0310 22:12:52.976943 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"75d98b79-f3d7-4522-a83b-bcc8023fb097","Type":"ContainerDied","Data":"b1949be2adfe1af52b1a2c4416fb1195c8c3be0b846d72c9ed9981d96ff0346c"}
Mar 10 22:12:52 crc kubenswrapper[4919]: I0310 22:12:52.976996 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"75d98b79-f3d7-4522-a83b-bcc8023fb097","Type":"ContainerDied","Data":"58833681c2f5c6623daa17cd3cfd9c80d0b03aef9146684d2a712d5ee6437090"}
Mar 10 22:12:52 crc kubenswrapper[4919]: I0310 22:12:52.977009 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"75d98b79-f3d7-4522-a83b-bcc8023fb097","Type":"ContainerDied","Data":"a763b24ea8db26e6a0f2f3f3f4c659cb23b96f3ecd2a8f308490f5397e725111"}
Mar 10 22:12:52 crc kubenswrapper[4919]: I0310 22:12:52.977026 4919 scope.go:117] "RemoveContainer" containerID="b1949be2adfe1af52b1a2c4416fb1195c8c3be0b846d72c9ed9981d96ff0346c"
Mar 10 22:12:53 crc kubenswrapper[4919]: I0310 22:12:53.008048 4919 scope.go:117] "RemoveContainer" containerID="58833681c2f5c6623daa17cd3cfd9c80d0b03aef9146684d2a712d5ee6437090"
Mar 10 22:12:53 crc kubenswrapper[4919]: I0310 22:12:53.029354 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"]
Mar 10 22:12:53 crc kubenswrapper[4919]: I0310 22:12:53.030532 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-854d8d6bf4-kknjq"
Mar 10 22:12:53 crc kubenswrapper[4919]: I0310 22:12:53.030692 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-854d8d6bf4-kknjq"
Mar 10 22:12:53 crc kubenswrapper[4919]: I0310 22:12:53.052670 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"]
Mar 10 22:12:53 crc kubenswrapper[4919]: I0310 22:12:53.060683 4919 scope.go:117] "RemoveContainer" containerID="b1949be2adfe1af52b1a2c4416fb1195c8c3be0b846d72c9ed9981d96ff0346c"
Mar 10 22:12:53 crc kubenswrapper[4919]: E0310 22:12:53.072940 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b1949be2adfe1af52b1a2c4416fb1195c8c3be0b846d72c9ed9981d96ff0346c\": container with ID starting with b1949be2adfe1af52b1a2c4416fb1195c8c3be0b846d72c9ed9981d96ff0346c not found: ID does not exist" containerID="b1949be2adfe1af52b1a2c4416fb1195c8c3be0b846d72c9ed9981d96ff0346c"
Mar 10 22:12:53 crc kubenswrapper[4919]: I0310 22:12:53.072981 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1949be2adfe1af52b1a2c4416fb1195c8c3be0b846d72c9ed9981d96ff0346c"} err="failed to get container status \"b1949be2adfe1af52b1a2c4416fb1195c8c3be0b846d72c9ed9981d96ff0346c\": rpc error: code = NotFound desc = could not find container \"b1949be2adfe1af52b1a2c4416fb1195c8c3be0b846d72c9ed9981d96ff0346c\": container with ID starting with b1949be2adfe1af52b1a2c4416fb1195c8c3be0b846d72c9ed9981d96ff0346c not found: ID does not exist"
Mar 10 22:12:53 crc kubenswrapper[4919]: I0310 22:12:53.073023 4919 scope.go:117] "RemoveContainer" containerID="58833681c2f5c6623daa17cd3cfd9c80d0b03aef9146684d2a712d5ee6437090"
Mar 10 22:12:53 crc kubenswrapper[4919]: E0310 22:12:53.074334 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"58833681c2f5c6623daa17cd3cfd9c80d0b03aef9146684d2a712d5ee6437090\": container with ID starting with 58833681c2f5c6623daa17cd3cfd9c80d0b03aef9146684d2a712d5ee6437090 not found: ID does not exist" containerID="58833681c2f5c6623daa17cd3cfd9c80d0b03aef9146684d2a712d5ee6437090"
Mar 10 22:12:53 crc kubenswrapper[4919]: I0310 22:12:53.074373 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58833681c2f5c6623daa17cd3cfd9c80d0b03aef9146684d2a712d5ee6437090"} err="failed to get container status \"58833681c2f5c6623daa17cd3cfd9c80d0b03aef9146684d2a712d5ee6437090\": rpc error: code = NotFound desc = could not find container \"58833681c2f5c6623daa17cd3cfd9c80d0b03aef9146684d2a712d5ee6437090\": container with ID starting with 58833681c2f5c6623daa17cd3cfd9c80d0b03aef9146684d2a712d5ee6437090 not found: ID does not exist"
Mar 10 22:12:53 crc kubenswrapper[4919]: I0310 22:12:53.074414 4919 scope.go:117] "RemoveContainer" containerID="b1949be2adfe1af52b1a2c4416fb1195c8c3be0b846d72c9ed9981d96ff0346c"
Mar 10 22:12:53 crc kubenswrapper[4919]: I0310 22:12:53.076865 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1949be2adfe1af52b1a2c4416fb1195c8c3be0b846d72c9ed9981d96ff0346c"} err="failed to get container status \"b1949be2adfe1af52b1a2c4416fb1195c8c3be0b846d72c9ed9981d96ff0346c\": rpc error: code = NotFound desc = could not find container \"b1949be2adfe1af52b1a2c4416fb1195c8c3be0b846d72c9ed9981d96ff0346c\": container with ID starting with b1949be2adfe1af52b1a2c4416fb1195c8c3be0b846d72c9ed9981d96ff0346c not found: ID does not exist"
Mar 10 22:12:53 crc kubenswrapper[4919]: I0310 22:12:53.076895 4919 scope.go:117] "RemoveContainer" containerID="58833681c2f5c6623daa17cd3cfd9c80d0b03aef9146684d2a712d5ee6437090"
Mar 10 22:12:53 crc kubenswrapper[4919]: I0310 22:12:53.081502 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58833681c2f5c6623daa17cd3cfd9c80d0b03aef9146684d2a712d5ee6437090"} err="failed to get container status \"58833681c2f5c6623daa17cd3cfd9c80d0b03aef9146684d2a712d5ee6437090\": rpc error: code = NotFound desc = could not find container \"58833681c2f5c6623daa17cd3cfd9c80d0b03aef9146684d2a712d5ee6437090\": container with ID starting with 58833681c2f5c6623daa17cd3cfd9c80d0b03aef9146684d2a712d5ee6437090 not found: ID does not exist"
Mar 10 22:12:53 crc kubenswrapper[4919]: I0310 22:12:53.081562 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"]
Mar 10 22:12:53 crc kubenswrapper[4919]: E0310 22:12:53.081882 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75d98b79-f3d7-4522-a83b-bcc8023fb097" containerName="cinder-api-log"
Mar 10 22:12:53 crc kubenswrapper[4919]: I0310 22:12:53.081894 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="75d98b79-f3d7-4522-a83b-bcc8023fb097" containerName="cinder-api-log"
Mar 10 22:12:53 crc kubenswrapper[4919]: E0310 22:12:53.081907 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75d98b79-f3d7-4522-a83b-bcc8023fb097" containerName="cinder-api"
Mar 10 22:12:53 crc kubenswrapper[4919]: I0310 22:12:53.081912 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="75d98b79-f3d7-4522-a83b-bcc8023fb097" containerName="cinder-api"
Mar 10 22:12:53 crc kubenswrapper[4919]: I0310 22:12:53.082094 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="75d98b79-f3d7-4522-a83b-bcc8023fb097" containerName="cinder-api-log"
Mar 10 22:12:53 crc kubenswrapper[4919]: I0310 22:12:53.082104 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="75d98b79-f3d7-4522-a83b-bcc8023fb097" containerName="cinder-api"
Mar 10 22:12:53 crc kubenswrapper[4919]: I0310 22:12:53.083008 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Mar 10 22:12:53 crc kubenswrapper[4919]: I0310 22:12:53.087697 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc"
Mar 10 22:12:53 crc kubenswrapper[4919]: I0310 22:12:53.087884 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc"
Mar 10 22:12:53 crc kubenswrapper[4919]: I0310 22:12:53.088466 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data"
Mar 10 22:12:53 crc kubenswrapper[4919]: I0310 22:12:53.110139 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Mar 10 22:12:53 crc kubenswrapper[4919]: I0310 22:12:53.142795 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4865c8ed-670d-41a0-b9fc-ba7697085e6b-logs\") pod \"cinder-api-0\" (UID: \"4865c8ed-670d-41a0-b9fc-ba7697085e6b\") " pod="openstack/cinder-api-0"
Mar 10 22:12:53 crc kubenswrapper[4919]: I0310 22:12:53.142883 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4865c8ed-670d-41a0-b9fc-ba7697085e6b-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"4865c8ed-670d-41a0-b9fc-ba7697085e6b\") " pod="openstack/cinder-api-0"
Mar 10 22:12:53 crc kubenswrapper[4919]: I0310 22:12:53.142931 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4865c8ed-670d-41a0-b9fc-ba7697085e6b-scripts\") pod \"cinder-api-0\" (UID: \"4865c8ed-670d-41a0-b9fc-ba7697085e6b\") " pod="openstack/cinder-api-0"
Mar 10 22:12:53 crc kubenswrapper[4919]: I0310 22:12:53.142955 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4865c8ed-670d-41a0-b9fc-ba7697085e6b-config-data\") pod \"cinder-api-0\" (UID: \"4865c8ed-670d-41a0-b9fc-ba7697085e6b\") " pod="openstack/cinder-api-0"
Mar 10 22:12:53 crc kubenswrapper[4919]: I0310 22:12:53.143009 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5j46h\" (UniqueName: \"kubernetes.io/projected/4865c8ed-670d-41a0-b9fc-ba7697085e6b-kube-api-access-5j46h\") pod \"cinder-api-0\" (UID: \"4865c8ed-670d-41a0-b9fc-ba7697085e6b\") " pod="openstack/cinder-api-0"
Mar 10 22:12:53 crc kubenswrapper[4919]: I0310 22:12:53.143053 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4865c8ed-670d-41a0-b9fc-ba7697085e6b-config-data-custom\") pod \"cinder-api-0\" (UID: \"4865c8ed-670d-41a0-b9fc-ba7697085e6b\") " pod="openstack/cinder-api-0"
Mar 10 22:12:53 crc kubenswrapper[4919]: I0310 22:12:53.143082 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4865c8ed-670d-41a0-b9fc-ba7697085e6b-etc-machine-id\") pod \"cinder-api-0\" (UID: \"4865c8ed-670d-41a0-b9fc-ba7697085e6b\") " pod="openstack/cinder-api-0"
Mar 10 22:12:53 crc kubenswrapper[4919]: I0310 22:12:53.143151 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4865c8ed-670d-41a0-b9fc-ba7697085e6b-public-tls-certs\") pod \"cinder-api-0\" (UID: \"4865c8ed-670d-41a0-b9fc-ba7697085e6b\") " pod="openstack/cinder-api-0"
Mar 10 22:12:53 crc kubenswrapper[4919]: I0310 22:12:53.143177 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4865c8ed-670d-41a0-b9fc-ba7697085e6b-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"4865c8ed-670d-41a0-b9fc-ba7697085e6b\") " pod="openstack/cinder-api-0"
Mar 10 22:12:53 crc kubenswrapper[4919]: I0310 22:12:53.244436 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4865c8ed-670d-41a0-b9fc-ba7697085e6b-logs\") pod \"cinder-api-0\" (UID: \"4865c8ed-670d-41a0-b9fc-ba7697085e6b\") " pod="openstack/cinder-api-0"
Mar 10 22:12:53 crc kubenswrapper[4919]: I0310 22:12:53.244502 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4865c8ed-670d-41a0-b9fc-ba7697085e6b-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"4865c8ed-670d-41a0-b9fc-ba7697085e6b\") " pod="openstack/cinder-api-0"
Mar 10 22:12:53 crc kubenswrapper[4919]: I0310 22:12:53.244534 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4865c8ed-670d-41a0-b9fc-ba7697085e6b-scripts\") pod \"cinder-api-0\" (UID: \"4865c8ed-670d-41a0-b9fc-ba7697085e6b\") " pod="openstack/cinder-api-0"
Mar 10 22:12:53 crc kubenswrapper[4919]: I0310 22:12:53.244549 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4865c8ed-670d-41a0-b9fc-ba7697085e6b-config-data\") pod \"cinder-api-0\" (UID: \"4865c8ed-670d-41a0-b9fc-ba7697085e6b\") " pod="openstack/cinder-api-0"
Mar 10 22:12:53 crc kubenswrapper[4919]: I0310 22:12:53.244586 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5j46h\" (UniqueName: \"kubernetes.io/projected/4865c8ed-670d-41a0-b9fc-ba7697085e6b-kube-api-access-5j46h\") pod \"cinder-api-0\" (UID: \"4865c8ed-670d-41a0-b9fc-ba7697085e6b\") " pod="openstack/cinder-api-0"
Mar 10 22:12:53 crc kubenswrapper[4919]: I0310 22:12:53.244614 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4865c8ed-670d-41a0-b9fc-ba7697085e6b-config-data-custom\") pod \"cinder-api-0\" (UID: \"4865c8ed-670d-41a0-b9fc-ba7697085e6b\") " pod="openstack/cinder-api-0"
Mar 10 22:12:53 crc kubenswrapper[4919]: I0310 22:12:53.244636 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4865c8ed-670d-41a0-b9fc-ba7697085e6b-etc-machine-id\") pod \"cinder-api-0\" (UID: \"4865c8ed-670d-41a0-b9fc-ba7697085e6b\") " pod="openstack/cinder-api-0"
Mar 10 22:12:53 crc kubenswrapper[4919]: I0310 22:12:53.244673 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4865c8ed-670d-41a0-b9fc-ba7697085e6b-public-tls-certs\") pod \"cinder-api-0\" (UID: \"4865c8ed-670d-41a0-b9fc-ba7697085e6b\") " pod="openstack/cinder-api-0"
Mar 10 22:12:53 crc kubenswrapper[4919]: I0310 22:12:53.244688 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4865c8ed-670d-41a0-b9fc-ba7697085e6b-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"4865c8ed-670d-41a0-b9fc-ba7697085e6b\") " pod="openstack/cinder-api-0"
Mar 10 22:12:53 crc kubenswrapper[4919]: I0310 22:12:53.246452 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4865c8ed-670d-41a0-b9fc-ba7697085e6b-etc-machine-id\") pod \"cinder-api-0\" (UID: \"4865c8ed-670d-41a0-b9fc-ba7697085e6b\") " pod="openstack/cinder-api-0"
Mar 10 22:12:53 crc kubenswrapper[4919]: I0310 22:12:53.246707 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4865c8ed-670d-41a0-b9fc-ba7697085e6b-logs\") pod \"cinder-api-0\" (UID: \"4865c8ed-670d-41a0-b9fc-ba7697085e6b\") " pod="openstack/cinder-api-0"
Mar 10 22:12:53 crc kubenswrapper[4919]: I0310 22:12:53.253854 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4865c8ed-670d-41a0-b9fc-ba7697085e6b-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"4865c8ed-670d-41a0-b9fc-ba7697085e6b\") " pod="openstack/cinder-api-0"
Mar 10 22:12:53 crc kubenswrapper[4919]: I0310 22:12:53.254645 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4865c8ed-670d-41a0-b9fc-ba7697085e6b-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"4865c8ed-670d-41a0-b9fc-ba7697085e6b\") " pod="openstack/cinder-api-0"
Mar 10 22:12:53 crc kubenswrapper[4919]: I0310 22:12:53.254926 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4865c8ed-670d-41a0-b9fc-ba7697085e6b-public-tls-certs\") pod \"cinder-api-0\" (UID: \"4865c8ed-670d-41a0-b9fc-ba7697085e6b\") " pod="openstack/cinder-api-0"
Mar 10 22:12:53 crc kubenswrapper[4919]: I0310 22:12:53.258042 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4865c8ed-670d-41a0-b9fc-ba7697085e6b-config-data-custom\") pod \"cinder-api-0\" (UID: \"4865c8ed-670d-41a0-b9fc-ba7697085e6b\") " pod="openstack/cinder-api-0"
Mar 10 22:12:53 crc kubenswrapper[4919]: I0310 22:12:53.266652 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4865c8ed-670d-41a0-b9fc-ba7697085e6b-scripts\") pod \"cinder-api-0\" (UID: \"4865c8ed-670d-41a0-b9fc-ba7697085e6b\") " pod="openstack/cinder-api-0"
Mar 10 22:12:53 crc kubenswrapper[4919]: I0310 22:12:53.267966 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4865c8ed-670d-41a0-b9fc-ba7697085e6b-config-data\") pod \"cinder-api-0\" (UID: \"4865c8ed-670d-41a0-b9fc-ba7697085e6b\") " pod="openstack/cinder-api-0"
Mar 10 22:12:53 crc kubenswrapper[4919]: I0310 22:12:53.272572 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5j46h\" (UniqueName: \"kubernetes.io/projected/4865c8ed-670d-41a0-b9fc-ba7697085e6b-kube-api-access-5j46h\") pod \"cinder-api-0\" (UID: \"4865c8ed-670d-41a0-b9fc-ba7697085e6b\") " pod="openstack/cinder-api-0"
Mar 10 22:12:53 crc kubenswrapper[4919]: I0310 22:12:53.408337 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Mar 10 22:12:53 crc kubenswrapper[4919]: I0310 22:12:53.498470 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75d98b79-f3d7-4522-a83b-bcc8023fb097" path="/var/lib/kubelet/pods/75d98b79-f3d7-4522-a83b-bcc8023fb097/volumes"
Mar 10 22:12:53 crc kubenswrapper[4919]: I0310 22:12:53.801146 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0"
Mar 10 22:12:53 crc kubenswrapper[4919]: I0310 22:12:53.922747 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Mar 10 22:12:53 crc kubenswrapper[4919]: I0310 22:12:53.999709 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"4865c8ed-670d-41a0-b9fc-ba7697085e6b","Type":"ContainerStarted","Data":"1d98b5abcd7706e0887327cf5b568d96c7e0525ca9b5bfec7812cdf1a518c16e"}
Mar 10 22:12:54 crc kubenswrapper[4919]: I0310 22:12:53.999773 4919 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 10 22:12:54 crc kubenswrapper[4919]: I0310 22:12:53.999799 4919 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 10 22:12:54 crc kubenswrapper[4919]: I0310 22:12:54.194149 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-854d8d6bf4-kknjq"
Mar 10 22:12:54 crc kubenswrapper[4919]: I0310 22:12:54.498224 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-854d8d6bf4-kknjq"
Mar 10 22:12:54 crc kubenswrapper[4919]: I0310 22:12:54.617317 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-76f79c8d94-r2gfk"]
Mar 10 22:12:54 crc kubenswrapper[4919]: I0310 22:12:54.619081 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-76f79c8d94-r2gfk" podUID="38fe889d-dbc7-448a-ade6-7847b16f85d2" containerName="placement-log" containerID="cri-o://b41b9dc117a4ea3985d0420d76edc4c3f9d048760b07300843dce013e6bbe4cc" gracePeriod=30
Mar 10 22:12:54 crc kubenswrapper[4919]: I0310 22:12:54.619537 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-76f79c8d94-r2gfk" podUID="38fe889d-dbc7-448a-ade6-7847b16f85d2" containerName="placement-api" containerID="cri-o://22f47a828dbe40df452a7740f55740269c4d5812ffec016b63b10dce5fbf14e6" gracePeriod=30
Mar 10 22:12:54 crc kubenswrapper[4919]: I0310 22:12:54.914928 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-599f4d795-pgnpd"
Mar 10 22:12:55 crc kubenswrapper[4919]: I0310 22:12:55.012244 4919 generic.go:334] "Generic (PLEG): container finished" podID="38fe889d-dbc7-448a-ade6-7847b16f85d2" containerID="b41b9dc117a4ea3985d0420d76edc4c3f9d048760b07300843dce013e6bbe4cc" exitCode=143
Mar 10 22:12:55 crc kubenswrapper[4919]: I0310 22:12:55.012310 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-76f79c8d94-r2gfk" event={"ID":"38fe889d-dbc7-448a-ade6-7847b16f85d2","Type":"ContainerDied","Data":"b41b9dc117a4ea3985d0420d76edc4c3f9d048760b07300843dce013e6bbe4cc"}
Mar 10 22:12:55 crc kubenswrapper[4919]: I0310 22:12:55.013963 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"4865c8ed-670d-41a0-b9fc-ba7697085e6b","Type":"ContainerStarted","Data":"160abe754665fdfbb180f5bc071ae95530a567d07d30547bc0bedcf6ffce0c0d"}
Mar 10 22:12:55 crc kubenswrapper[4919]: I0310 22:12:55.015149 4919 generic.go:334] "Generic (PLEG): container finished" podID="d4216f40-ccfe-4c2e-8bd7-944a4413bc43" containerID="157348cf321f1547509c6d04d11d59fea346018cf02810116f3231b189156e80" exitCode=0
Mar 10 22:12:55 crc kubenswrapper[4919]: I0310 22:12:55.015359 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-59bc595969-d7r9w" event={"ID":"d4216f40-ccfe-4c2e-8bd7-944a4413bc43","Type":"ContainerDied","Data":"157348cf321f1547509c6d04d11d59fea346018cf02810116f3231b189156e80"}
Mar 10 22:12:55 crc kubenswrapper[4919]: I0310 22:12:55.396798 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-59bc595969-d7r9w"
Mar 10 22:12:55 crc kubenswrapper[4919]: I0310 22:12:55.497850 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d4216f40-ccfe-4c2e-8bd7-944a4413bc43-httpd-config\") pod \"d4216f40-ccfe-4c2e-8bd7-944a4413bc43\" (UID: \"d4216f40-ccfe-4c2e-8bd7-944a4413bc43\") "
Mar 10 22:12:55 crc kubenswrapper[4919]: I0310 22:12:55.497939 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d4216f40-ccfe-4c2e-8bd7-944a4413bc43-config\") pod \"d4216f40-ccfe-4c2e-8bd7-944a4413bc43\" (UID: \"d4216f40-ccfe-4c2e-8bd7-944a4413bc43\") "
Mar 10 22:12:55 crc kubenswrapper[4919]: I0310 22:12:55.497983 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d4216f40-ccfe-4c2e-8bd7-944a4413bc43-internal-tls-certs\") pod \"d4216f40-ccfe-4c2e-8bd7-944a4413bc43\" (UID: \"d4216f40-ccfe-4c2e-8bd7-944a4413bc43\") "
Mar 10 22:12:55 crc kubenswrapper[4919]: I0310 22:12:55.498031 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4216f40-ccfe-4c2e-8bd7-944a4413bc43-combined-ca-bundle\") pod \"d4216f40-ccfe-4c2e-8bd7-944a4413bc43\" (UID: \"d4216f40-ccfe-4c2e-8bd7-944a4413bc43\") "
Mar 10 22:12:55 crc kubenswrapper[4919]: I0310 22:12:55.498134 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d4216f40-ccfe-4c2e-8bd7-944a4413bc43-ovndb-tls-certs\") pod \"d4216f40-ccfe-4c2e-8bd7-944a4413bc43\" (UID: \"d4216f40-ccfe-4c2e-8bd7-944a4413bc43\") "
Mar 10 22:12:55 crc kubenswrapper[4919]: I0310 22:12:55.498175 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d4216f40-ccfe-4c2e-8bd7-944a4413bc43-public-tls-certs\") pod \"d4216f40-ccfe-4c2e-8bd7-944a4413bc43\" (UID: \"d4216f40-ccfe-4c2e-8bd7-944a4413bc43\") "
Mar 10 22:12:55 crc kubenswrapper[4919]: I0310 22:12:55.498205 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-622vg\" (UniqueName: \"kubernetes.io/projected/d4216f40-ccfe-4c2e-8bd7-944a4413bc43-kube-api-access-622vg\") pod \"d4216f40-ccfe-4c2e-8bd7-944a4413bc43\" (UID: \"d4216f40-ccfe-4c2e-8bd7-944a4413bc43\") "
Mar 10 22:12:55 crc kubenswrapper[4919]: I0310 22:12:55.504900 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4216f40-ccfe-4c2e-8bd7-944a4413bc43-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "d4216f40-ccfe-4c2e-8bd7-944a4413bc43" (UID: "d4216f40-ccfe-4c2e-8bd7-944a4413bc43"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 22:12:55 crc kubenswrapper[4919]: I0310 22:12:55.508590 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4216f40-ccfe-4c2e-8bd7-944a4413bc43-kube-api-access-622vg" (OuterVolumeSpecName: "kube-api-access-622vg") pod "d4216f40-ccfe-4c2e-8bd7-944a4413bc43" (UID: "d4216f40-ccfe-4c2e-8bd7-944a4413bc43"). InnerVolumeSpecName "kube-api-access-622vg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 22:12:55 crc kubenswrapper[4919]: I0310 22:12:55.551759 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4216f40-ccfe-4c2e-8bd7-944a4413bc43-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d4216f40-ccfe-4c2e-8bd7-944a4413bc43" (UID: "d4216f40-ccfe-4c2e-8bd7-944a4413bc43"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 22:12:55 crc kubenswrapper[4919]: I0310 22:12:55.554656 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4216f40-ccfe-4c2e-8bd7-944a4413bc43-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "d4216f40-ccfe-4c2e-8bd7-944a4413bc43" (UID: "d4216f40-ccfe-4c2e-8bd7-944a4413bc43"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 22:12:55 crc kubenswrapper[4919]: I0310 22:12:55.554717 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4216f40-ccfe-4c2e-8bd7-944a4413bc43-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "d4216f40-ccfe-4c2e-8bd7-944a4413bc43" (UID: "d4216f40-ccfe-4c2e-8bd7-944a4413bc43"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 22:12:55 crc kubenswrapper[4919]: I0310 22:12:55.555042 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4216f40-ccfe-4c2e-8bd7-944a4413bc43-config" (OuterVolumeSpecName: "config") pod "d4216f40-ccfe-4c2e-8bd7-944a4413bc43" (UID: "d4216f40-ccfe-4c2e-8bd7-944a4413bc43"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 22:12:55 crc kubenswrapper[4919]: I0310 22:12:55.571114 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4216f40-ccfe-4c2e-8bd7-944a4413bc43-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "d4216f40-ccfe-4c2e-8bd7-944a4413bc43" (UID: "d4216f40-ccfe-4c2e-8bd7-944a4413bc43"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 22:12:55 crc kubenswrapper[4919]: I0310 22:12:55.600798 4919 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/d4216f40-ccfe-4c2e-8bd7-944a4413bc43-config\") on node \"crc\" DevicePath \"\""
Mar 10 22:12:55 crc kubenswrapper[4919]: I0310 22:12:55.600830 4919 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d4216f40-ccfe-4c2e-8bd7-944a4413bc43-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 10 22:12:55 crc kubenswrapper[4919]: I0310 22:12:55.600840 4919 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4216f40-ccfe-4c2e-8bd7-944a4413bc43-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 10 22:12:55 crc kubenswrapper[4919]: I0310 22:12:55.600850 4919 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d4216f40-ccfe-4c2e-8bd7-944a4413bc43-ovndb-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 10 22:12:55 crc kubenswrapper[4919]: I0310 22:12:55.600859 4919 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d4216f40-ccfe-4c2e-8bd7-944a4413bc43-public-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 10 22:12:55 crc kubenswrapper[4919]: I0310 22:12:55.600868 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-622vg\" (UniqueName: \"kubernetes.io/projected/d4216f40-ccfe-4c2e-8bd7-944a4413bc43-kube-api-access-622vg\") on node \"crc\" DevicePath \"\""
Mar 10 22:12:55 crc kubenswrapper[4919]: I0310 22:12:55.600877 4919 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d4216f40-ccfe-4c2e-8bd7-944a4413bc43-httpd-config\") on node \"crc\" DevicePath \"\""
Mar 10 22:12:56 crc kubenswrapper[4919]: I0310 22:12:56.025776 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"4865c8ed-670d-41a0-b9fc-ba7697085e6b","Type":"ContainerStarted","Data":"df8f79c23e11b14d9212f9cd7c7b374f297dc4c6a1b8f62a2988cc7af5ea3b27"}
Mar 10 22:12:56 crc kubenswrapper[4919]: I0310 22:12:56.027932 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-59bc595969-d7r9w"
Mar 10 22:12:56 crc kubenswrapper[4919]: I0310 22:12:56.031533 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-59bc595969-d7r9w" event={"ID":"d4216f40-ccfe-4c2e-8bd7-944a4413bc43","Type":"ContainerDied","Data":"3a1810b5fc027a3dcb5f48d47380c1138bdf296bfc9b70fd8d05ce2643ba8249"}
Mar 10 22:12:56 crc kubenswrapper[4919]: I0310 22:12:56.031594 4919 scope.go:117] "RemoveContainer" containerID="2888487e5f0936f95c545464ab474073718039ae8ed2a613fd2c6a1ac23b7e58"
Mar 10 22:12:56 crc kubenswrapper[4919]: I0310 22:12:56.054420 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.054376413 podStartE2EDuration="3.054376413s" podCreationTimestamp="2026-03-10 22:12:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 22:12:56.044713651 +0000 UTC m=+1363.286594259" watchObservedRunningTime="2026-03-10 22:12:56.054376413 +0000 UTC m=+1363.296257021"
Mar 10 22:12:56 crc kubenswrapper[4919]: I0310 22:12:56.065131 4919 scope.go:117] "RemoveContainer" containerID="157348cf321f1547509c6d04d11d59fea346018cf02810116f3231b189156e80"
Mar 10 22:12:56 crc kubenswrapper[4919]: I0310 22:12:56.071309 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-59bc595969-d7r9w"]
Mar 10 22:12:56 crc kubenswrapper[4919]: I0310 22:12:56.078741 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-59bc595969-d7r9w"]
Mar 10 22:12:56 crc kubenswrapper[4919]: I0310 22:12:56.446136 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"]
Mar 10 22:12:56 crc kubenswrapper[4919]: E0310 22:12:56.446671 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4216f40-ccfe-4c2e-8bd7-944a4413bc43" containerName="neutron-api"
Mar 10 22:12:56 crc kubenswrapper[4919]: I0310 22:12:56.446696 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4216f40-ccfe-4c2e-8bd7-944a4413bc43" containerName="neutron-api"
Mar 10 22:12:56 crc kubenswrapper[4919]: E0310 22:12:56.446712 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4216f40-ccfe-4c2e-8bd7-944a4413bc43" containerName="neutron-httpd"
Mar 10 22:12:56 crc kubenswrapper[4919]: I0310 22:12:56.446721 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4216f40-ccfe-4c2e-8bd7-944a4413bc43" containerName="neutron-httpd"
Mar 10 22:12:56 crc kubenswrapper[4919]: I0310 22:12:56.446941 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4216f40-ccfe-4c2e-8bd7-944a4413bc43" containerName="neutron-api"
Mar 10 22:12:56 crc kubenswrapper[4919]: I0310 22:12:56.446980 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4216f40-ccfe-4c2e-8bd7-944a4413bc43" containerName="neutron-httpd"
Mar 10 22:12:56 crc kubenswrapper[4919]: I0310 22:12:56.447908 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Mar 10 22:12:56 crc kubenswrapper[4919]: I0310 22:12:56.451797 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret"
Mar 10 22:12:56 crc kubenswrapper[4919]: I0310 22:12:56.451825 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-vk2wx"
Mar 10 22:12:56 crc kubenswrapper[4919]: I0310 22:12:56.452355 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config"
Mar 10 22:12:56 crc kubenswrapper[4919]: I0310 22:12:56.455089 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"]
Mar 10 22:12:56 crc kubenswrapper[4919]: I0310 22:12:56.618352 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26d75179-8f10-4b19-8fba-e341dda4db56-combined-ca-bundle\") pod \"openstackclient\" (UID: \"26d75179-8f10-4b19-8fba-e341dda4db56\") " pod="openstack/openstackclient"
Mar 10 22:12:56 crc kubenswrapper[4919]: I0310 22:12:56.618521 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/26d75179-8f10-4b19-8fba-e341dda4db56-openstack-config\") pod \"openstackclient\" (UID: \"26d75179-8f10-4b19-8fba-e341dda4db56\") " pod="openstack/openstackclient"
Mar 10 22:12:56 crc kubenswrapper[4919]: I0310 22:12:56.618562 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sx2vn\" (UniqueName: \"kubernetes.io/projected/26d75179-8f10-4b19-8fba-e341dda4db56-kube-api-access-sx2vn\") pod \"openstackclient\" (UID: \"26d75179-8f10-4b19-8fba-e341dda4db56\") " pod="openstack/openstackclient"
Mar 10 22:12:56 crc kubenswrapper[4919]: I0310 22:12:56.618598 4919 reconciler_common.go:245]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/26d75179-8f10-4b19-8fba-e341dda4db56-openstack-config-secret\") pod \"openstackclient\" (UID: \"26d75179-8f10-4b19-8fba-e341dda4db56\") " pod="openstack/openstackclient" Mar 10 22:12:56 crc kubenswrapper[4919]: I0310 22:12:56.720323 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/26d75179-8f10-4b19-8fba-e341dda4db56-openstack-config-secret\") pod \"openstackclient\" (UID: \"26d75179-8f10-4b19-8fba-e341dda4db56\") " pod="openstack/openstackclient" Mar 10 22:12:56 crc kubenswrapper[4919]: I0310 22:12:56.720400 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26d75179-8f10-4b19-8fba-e341dda4db56-combined-ca-bundle\") pod \"openstackclient\" (UID: \"26d75179-8f10-4b19-8fba-e341dda4db56\") " pod="openstack/openstackclient" Mar 10 22:12:56 crc kubenswrapper[4919]: I0310 22:12:56.720553 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/26d75179-8f10-4b19-8fba-e341dda4db56-openstack-config\") pod \"openstackclient\" (UID: \"26d75179-8f10-4b19-8fba-e341dda4db56\") " pod="openstack/openstackclient" Mar 10 22:12:56 crc kubenswrapper[4919]: I0310 22:12:56.720588 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sx2vn\" (UniqueName: \"kubernetes.io/projected/26d75179-8f10-4b19-8fba-e341dda4db56-kube-api-access-sx2vn\") pod \"openstackclient\" (UID: \"26d75179-8f10-4b19-8fba-e341dda4db56\") " pod="openstack/openstackclient" Mar 10 22:12:56 crc kubenswrapper[4919]: I0310 22:12:56.721732 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: 
\"kubernetes.io/configmap/26d75179-8f10-4b19-8fba-e341dda4db56-openstack-config\") pod \"openstackclient\" (UID: \"26d75179-8f10-4b19-8fba-e341dda4db56\") " pod="openstack/openstackclient" Mar 10 22:12:56 crc kubenswrapper[4919]: I0310 22:12:56.724889 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/26d75179-8f10-4b19-8fba-e341dda4db56-openstack-config-secret\") pod \"openstackclient\" (UID: \"26d75179-8f10-4b19-8fba-e341dda4db56\") " pod="openstack/openstackclient" Mar 10 22:12:56 crc kubenswrapper[4919]: I0310 22:12:56.729974 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26d75179-8f10-4b19-8fba-e341dda4db56-combined-ca-bundle\") pod \"openstackclient\" (UID: \"26d75179-8f10-4b19-8fba-e341dda4db56\") " pod="openstack/openstackclient" Mar 10 22:12:56 crc kubenswrapper[4919]: I0310 22:12:56.744235 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sx2vn\" (UniqueName: \"kubernetes.io/projected/26d75179-8f10-4b19-8fba-e341dda4db56-kube-api-access-sx2vn\") pod \"openstackclient\" (UID: \"26d75179-8f10-4b19-8fba-e341dda4db56\") " pod="openstack/openstackclient" Mar 10 22:12:56 crc kubenswrapper[4919]: I0310 22:12:56.763144 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 10 22:12:56 crc kubenswrapper[4919]: I0310 22:12:56.913368 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Mar 10 22:12:56 crc kubenswrapper[4919]: I0310 22:12:56.924438 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Mar 10 22:12:56 crc kubenswrapper[4919]: I0310 22:12:56.943893 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Mar 10 22:12:56 crc kubenswrapper[4919]: I0310 22:12:56.945306 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 10 22:12:56 crc kubenswrapper[4919]: I0310 22:12:56.951238 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 10 22:12:57 crc kubenswrapper[4919]: I0310 22:12:57.039371 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Mar 10 22:12:57 crc kubenswrapper[4919]: I0310 22:12:57.127478 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pc9fw\" (UniqueName: \"kubernetes.io/projected/6bff1404-f9b1-48f8-b093-95c3bb206c6a-kube-api-access-pc9fw\") pod \"openstackclient\" (UID: \"6bff1404-f9b1-48f8-b093-95c3bb206c6a\") " pod="openstack/openstackclient" Mar 10 22:12:57 crc kubenswrapper[4919]: I0310 22:12:57.127781 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/6bff1404-f9b1-48f8-b093-95c3bb206c6a-openstack-config-secret\") pod \"openstackclient\" (UID: \"6bff1404-f9b1-48f8-b093-95c3bb206c6a\") " pod="openstack/openstackclient" Mar 10 22:12:57 crc kubenswrapper[4919]: I0310 22:12:57.127859 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/6bff1404-f9b1-48f8-b093-95c3bb206c6a-combined-ca-bundle\") pod \"openstackclient\" (UID: \"6bff1404-f9b1-48f8-b093-95c3bb206c6a\") " pod="openstack/openstackclient" Mar 10 22:12:57 crc kubenswrapper[4919]: I0310 22:12:57.127930 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/6bff1404-f9b1-48f8-b093-95c3bb206c6a-openstack-config\") pod \"openstackclient\" (UID: \"6bff1404-f9b1-48f8-b093-95c3bb206c6a\") " pod="openstack/openstackclient" Mar 10 22:12:57 crc kubenswrapper[4919]: I0310 22:12:57.229888 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pc9fw\" (UniqueName: \"kubernetes.io/projected/6bff1404-f9b1-48f8-b093-95c3bb206c6a-kube-api-access-pc9fw\") pod \"openstackclient\" (UID: \"6bff1404-f9b1-48f8-b093-95c3bb206c6a\") " pod="openstack/openstackclient" Mar 10 22:12:57 crc kubenswrapper[4919]: I0310 22:12:57.230328 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/6bff1404-f9b1-48f8-b093-95c3bb206c6a-openstack-config-secret\") pod \"openstackclient\" (UID: \"6bff1404-f9b1-48f8-b093-95c3bb206c6a\") " pod="openstack/openstackclient" Mar 10 22:12:57 crc kubenswrapper[4919]: I0310 22:12:57.231373 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6bff1404-f9b1-48f8-b093-95c3bb206c6a-combined-ca-bundle\") pod \"openstackclient\" (UID: \"6bff1404-f9b1-48f8-b093-95c3bb206c6a\") " pod="openstack/openstackclient" Mar 10 22:12:57 crc kubenswrapper[4919]: I0310 22:12:57.231548 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/6bff1404-f9b1-48f8-b093-95c3bb206c6a-openstack-config\") pod \"openstackclient\" (UID: 
\"6bff1404-f9b1-48f8-b093-95c3bb206c6a\") " pod="openstack/openstackclient" Mar 10 22:12:57 crc kubenswrapper[4919]: I0310 22:12:57.233514 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/6bff1404-f9b1-48f8-b093-95c3bb206c6a-openstack-config\") pod \"openstackclient\" (UID: \"6bff1404-f9b1-48f8-b093-95c3bb206c6a\") " pod="openstack/openstackclient" Mar 10 22:12:57 crc kubenswrapper[4919]: I0310 22:12:57.235349 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/6bff1404-f9b1-48f8-b093-95c3bb206c6a-openstack-config-secret\") pod \"openstackclient\" (UID: \"6bff1404-f9b1-48f8-b093-95c3bb206c6a\") " pod="openstack/openstackclient" Mar 10 22:12:57 crc kubenswrapper[4919]: I0310 22:12:57.236565 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6bff1404-f9b1-48f8-b093-95c3bb206c6a-combined-ca-bundle\") pod \"openstackclient\" (UID: \"6bff1404-f9b1-48f8-b093-95c3bb206c6a\") " pod="openstack/openstackclient" Mar 10 22:12:57 crc kubenswrapper[4919]: I0310 22:12:57.256328 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pc9fw\" (UniqueName: \"kubernetes.io/projected/6bff1404-f9b1-48f8-b093-95c3bb206c6a-kube-api-access-pc9fw\") pod \"openstackclient\" (UID: \"6bff1404-f9b1-48f8-b093-95c3bb206c6a\") " pod="openstack/openstackclient" Mar 10 22:12:57 crc kubenswrapper[4919]: I0310 22:12:57.272471 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 10 22:12:57 crc kubenswrapper[4919]: E0310 22:12:57.304773 4919 log.go:32] "RunPodSandbox from runtime service failed" err=< Mar 10 22:12:57 crc kubenswrapper[4919]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_openstackclient_openstack_26d75179-8f10-4b19-8fba-e341dda4db56_0(5653ab63f11614a9126f0f6e88668d58cb79ee6d87b72ffdb902cc53a27ecabd): error adding pod openstack_openstackclient to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"5653ab63f11614a9126f0f6e88668d58cb79ee6d87b72ffdb902cc53a27ecabd" Netns:"/var/run/netns/e6215548-528a-4e91-bae6-95460015bc72" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openstack;K8S_POD_NAME=openstackclient;K8S_POD_INFRA_CONTAINER_ID=5653ab63f11614a9126f0f6e88668d58cb79ee6d87b72ffdb902cc53a27ecabd;K8S_POD_UID=26d75179-8f10-4b19-8fba-e341dda4db56" Path:"" ERRORED: error configuring pod [openstack/openstackclient] networking: [openstack/openstackclient/26d75179-8f10-4b19-8fba-e341dda4db56:ovn-kubernetes]: error adding container to network "ovn-kubernetes": CNI request failed with status 400: '[openstack/openstackclient 5653ab63f11614a9126f0f6e88668d58cb79ee6d87b72ffdb902cc53a27ecabd network default NAD default] [openstack/openstackclient 5653ab63f11614a9126f0f6e88668d58cb79ee6d87b72ffdb902cc53a27ecabd network default NAD default] failed to configure pod interface: canceled old pod sandbox waiting for OVS port binding for 0a:58:0a:d9:00:ae [10.217.0.174/23] Mar 10 22:12:57 crc kubenswrapper[4919]: ' Mar 10 22:12:57 crc kubenswrapper[4919]: ': StdinData: 
{"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 10 22:12:57 crc kubenswrapper[4919]: > Mar 10 22:12:57 crc kubenswrapper[4919]: E0310 22:12:57.304839 4919 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Mar 10 22:12:57 crc kubenswrapper[4919]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_openstackclient_openstack_26d75179-8f10-4b19-8fba-e341dda4db56_0(5653ab63f11614a9126f0f6e88668d58cb79ee6d87b72ffdb902cc53a27ecabd): error adding pod openstack_openstackclient to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"5653ab63f11614a9126f0f6e88668d58cb79ee6d87b72ffdb902cc53a27ecabd" Netns:"/var/run/netns/e6215548-528a-4e91-bae6-95460015bc72" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openstack;K8S_POD_NAME=openstackclient;K8S_POD_INFRA_CONTAINER_ID=5653ab63f11614a9126f0f6e88668d58cb79ee6d87b72ffdb902cc53a27ecabd;K8S_POD_UID=26d75179-8f10-4b19-8fba-e341dda4db56" Path:"" ERRORED: error configuring pod [openstack/openstackclient] networking: [openstack/openstackclient/26d75179-8f10-4b19-8fba-e341dda4db56:ovn-kubernetes]: error adding container to network "ovn-kubernetes": CNI request failed with status 400: '[openstack/openstackclient 5653ab63f11614a9126f0f6e88668d58cb79ee6d87b72ffdb902cc53a27ecabd network default NAD default] [openstack/openstackclient 5653ab63f11614a9126f0f6e88668d58cb79ee6d87b72ffdb902cc53a27ecabd network default NAD default] failed to configure pod interface: canceled old pod sandbox waiting for OVS port binding for 0a:58:0a:d9:00:ae [10.217.0.174/23] Mar 10 
22:12:57 crc kubenswrapper[4919]: ' Mar 10 22:12:57 crc kubenswrapper[4919]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 10 22:12:57 crc kubenswrapper[4919]: > pod="openstack/openstackclient" Mar 10 22:12:57 crc kubenswrapper[4919]: I0310 22:12:57.494614 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4216f40-ccfe-4c2e-8bd7-944a4413bc43" path="/var/lib/kubelet/pods/d4216f40-ccfe-4c2e-8bd7-944a4413bc43/volumes" Mar 10 22:12:57 crc kubenswrapper[4919]: I0310 22:12:57.745097 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 10 22:12:58 crc kubenswrapper[4919]: I0310 22:12:58.052513 4919 generic.go:334] "Generic (PLEG): container finished" podID="38fe889d-dbc7-448a-ade6-7847b16f85d2" containerID="22f47a828dbe40df452a7740f55740269c4d5812ffec016b63b10dce5fbf14e6" exitCode=0 Mar 10 22:12:58 crc kubenswrapper[4919]: I0310 22:12:58.052585 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-76f79c8d94-r2gfk" event={"ID":"38fe889d-dbc7-448a-ade6-7847b16f85d2","Type":"ContainerDied","Data":"22f47a828dbe40df452a7740f55740269c4d5812ffec016b63b10dce5fbf14e6"} Mar 10 22:12:58 crc kubenswrapper[4919]: I0310 22:12:58.055110 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"6bff1404-f9b1-48f8-b093-95c3bb206c6a","Type":"ContainerStarted","Data":"d62a65b1fb2bd8e8bd60a908e5c5d547278c2538339919d18aefe331818ce77a"} Mar 10 22:12:58 crc kubenswrapper[4919]: I0310 22:12:58.055165 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 10 22:12:58 crc kubenswrapper[4919]: I0310 22:12:58.150327 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 10 22:12:58 crc kubenswrapper[4919]: I0310 22:12:58.154342 4919 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="26d75179-8f10-4b19-8fba-e341dda4db56" podUID="6bff1404-f9b1-48f8-b093-95c3bb206c6a" Mar 10 22:12:58 crc kubenswrapper[4919]: I0310 22:12:58.157696 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-76f79c8d94-r2gfk" Mar 10 22:12:58 crc kubenswrapper[4919]: I0310 22:12:58.246652 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sx2vn\" (UniqueName: \"kubernetes.io/projected/26d75179-8f10-4b19-8fba-e341dda4db56-kube-api-access-sx2vn\") pod \"26d75179-8f10-4b19-8fba-e341dda4db56\" (UID: \"26d75179-8f10-4b19-8fba-e341dda4db56\") " Mar 10 22:12:58 crc kubenswrapper[4919]: I0310 22:12:58.246705 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/38fe889d-dbc7-448a-ade6-7847b16f85d2-logs\") pod \"38fe889d-dbc7-448a-ade6-7847b16f85d2\" (UID: \"38fe889d-dbc7-448a-ade6-7847b16f85d2\") " Mar 10 22:12:58 crc kubenswrapper[4919]: I0310 22:12:58.246770 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38fe889d-dbc7-448a-ade6-7847b16f85d2-config-data\") pod \"38fe889d-dbc7-448a-ade6-7847b16f85d2\" (UID: \"38fe889d-dbc7-448a-ade6-7847b16f85d2\") " Mar 10 22:12:58 crc kubenswrapper[4919]: I0310 22:12:58.246831 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sx85f\" (UniqueName: 
\"kubernetes.io/projected/38fe889d-dbc7-448a-ade6-7847b16f85d2-kube-api-access-sx85f\") pod \"38fe889d-dbc7-448a-ade6-7847b16f85d2\" (UID: \"38fe889d-dbc7-448a-ade6-7847b16f85d2\") " Mar 10 22:12:58 crc kubenswrapper[4919]: I0310 22:12:58.246850 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/26d75179-8f10-4b19-8fba-e341dda4db56-openstack-config-secret\") pod \"26d75179-8f10-4b19-8fba-e341dda4db56\" (UID: \"26d75179-8f10-4b19-8fba-e341dda4db56\") " Mar 10 22:12:58 crc kubenswrapper[4919]: I0310 22:12:58.246873 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26d75179-8f10-4b19-8fba-e341dda4db56-combined-ca-bundle\") pod \"26d75179-8f10-4b19-8fba-e341dda4db56\" (UID: \"26d75179-8f10-4b19-8fba-e341dda4db56\") " Mar 10 22:12:58 crc kubenswrapper[4919]: I0310 22:12:58.246915 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/38fe889d-dbc7-448a-ade6-7847b16f85d2-internal-tls-certs\") pod \"38fe889d-dbc7-448a-ade6-7847b16f85d2\" (UID: \"38fe889d-dbc7-448a-ade6-7847b16f85d2\") " Mar 10 22:12:58 crc kubenswrapper[4919]: I0310 22:12:58.246982 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/26d75179-8f10-4b19-8fba-e341dda4db56-openstack-config\") pod \"26d75179-8f10-4b19-8fba-e341dda4db56\" (UID: \"26d75179-8f10-4b19-8fba-e341dda4db56\") " Mar 10 22:12:58 crc kubenswrapper[4919]: I0310 22:12:58.247002 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/38fe889d-dbc7-448a-ade6-7847b16f85d2-scripts\") pod \"38fe889d-dbc7-448a-ade6-7847b16f85d2\" (UID: \"38fe889d-dbc7-448a-ade6-7847b16f85d2\") " Mar 10 22:12:58 crc 
kubenswrapper[4919]: I0310 22:12:58.247025 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/38fe889d-dbc7-448a-ade6-7847b16f85d2-public-tls-certs\") pod \"38fe889d-dbc7-448a-ade6-7847b16f85d2\" (UID: \"38fe889d-dbc7-448a-ade6-7847b16f85d2\") " Mar 10 22:12:58 crc kubenswrapper[4919]: I0310 22:12:58.247099 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38fe889d-dbc7-448a-ade6-7847b16f85d2-combined-ca-bundle\") pod \"38fe889d-dbc7-448a-ade6-7847b16f85d2\" (UID: \"38fe889d-dbc7-448a-ade6-7847b16f85d2\") " Mar 10 22:12:58 crc kubenswrapper[4919]: I0310 22:12:58.247567 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/38fe889d-dbc7-448a-ade6-7847b16f85d2-logs" (OuterVolumeSpecName: "logs") pod "38fe889d-dbc7-448a-ade6-7847b16f85d2" (UID: "38fe889d-dbc7-448a-ade6-7847b16f85d2"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 22:12:58 crc kubenswrapper[4919]: I0310 22:12:58.252283 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/26d75179-8f10-4b19-8fba-e341dda4db56-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "26d75179-8f10-4b19-8fba-e341dda4db56" (UID: "26d75179-8f10-4b19-8fba-e341dda4db56"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 22:12:58 crc kubenswrapper[4919]: I0310 22:12:58.252529 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38fe889d-dbc7-448a-ade6-7847b16f85d2-kube-api-access-sx85f" (OuterVolumeSpecName: "kube-api-access-sx85f") pod "38fe889d-dbc7-448a-ade6-7847b16f85d2" (UID: "38fe889d-dbc7-448a-ade6-7847b16f85d2"). InnerVolumeSpecName "kube-api-access-sx85f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 22:12:58 crc kubenswrapper[4919]: I0310 22:12:58.253481 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26d75179-8f10-4b19-8fba-e341dda4db56-kube-api-access-sx2vn" (OuterVolumeSpecName: "kube-api-access-sx2vn") pod "26d75179-8f10-4b19-8fba-e341dda4db56" (UID: "26d75179-8f10-4b19-8fba-e341dda4db56"). InnerVolumeSpecName "kube-api-access-sx2vn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 22:12:58 crc kubenswrapper[4919]: I0310 22:12:58.253642 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26d75179-8f10-4b19-8fba-e341dda4db56-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "26d75179-8f10-4b19-8fba-e341dda4db56" (UID: "26d75179-8f10-4b19-8fba-e341dda4db56"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:12:58 crc kubenswrapper[4919]: I0310 22:12:58.256564 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26d75179-8f10-4b19-8fba-e341dda4db56-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "26d75179-8f10-4b19-8fba-e341dda4db56" (UID: "26d75179-8f10-4b19-8fba-e341dda4db56"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:12:58 crc kubenswrapper[4919]: I0310 22:12:58.257551 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38fe889d-dbc7-448a-ade6-7847b16f85d2-scripts" (OuterVolumeSpecName: "scripts") pod "38fe889d-dbc7-448a-ade6-7847b16f85d2" (UID: "38fe889d-dbc7-448a-ade6-7847b16f85d2"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:12:58 crc kubenswrapper[4919]: I0310 22:12:58.313650 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38fe889d-dbc7-448a-ade6-7847b16f85d2-config-data" (OuterVolumeSpecName: "config-data") pod "38fe889d-dbc7-448a-ade6-7847b16f85d2" (UID: "38fe889d-dbc7-448a-ade6-7847b16f85d2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:12:58 crc kubenswrapper[4919]: I0310 22:12:58.324638 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38fe889d-dbc7-448a-ade6-7847b16f85d2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "38fe889d-dbc7-448a-ade6-7847b16f85d2" (UID: "38fe889d-dbc7-448a-ade6-7847b16f85d2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:12:58 crc kubenswrapper[4919]: I0310 22:12:58.349648 4919 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26d75179-8f10-4b19-8fba-e341dda4db56-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 22:12:58 crc kubenswrapper[4919]: I0310 22:12:58.349684 4919 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/26d75179-8f10-4b19-8fba-e341dda4db56-openstack-config\") on node \"crc\" DevicePath \"\"" Mar 10 22:12:58 crc kubenswrapper[4919]: I0310 22:12:58.349696 4919 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/38fe889d-dbc7-448a-ade6-7847b16f85d2-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 22:12:58 crc kubenswrapper[4919]: I0310 22:12:58.349709 4919 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38fe889d-dbc7-448a-ade6-7847b16f85d2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" 
Mar 10 22:12:58 crc kubenswrapper[4919]: I0310 22:12:58.349719 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sx2vn\" (UniqueName: \"kubernetes.io/projected/26d75179-8f10-4b19-8fba-e341dda4db56-kube-api-access-sx2vn\") on node \"crc\" DevicePath \"\"" Mar 10 22:12:58 crc kubenswrapper[4919]: I0310 22:12:58.349733 4919 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/38fe889d-dbc7-448a-ade6-7847b16f85d2-logs\") on node \"crc\" DevicePath \"\"" Mar 10 22:12:58 crc kubenswrapper[4919]: I0310 22:12:58.349743 4919 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38fe889d-dbc7-448a-ade6-7847b16f85d2-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 22:12:58 crc kubenswrapper[4919]: I0310 22:12:58.349754 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sx85f\" (UniqueName: \"kubernetes.io/projected/38fe889d-dbc7-448a-ade6-7847b16f85d2-kube-api-access-sx85f\") on node \"crc\" DevicePath \"\"" Mar 10 22:12:58 crc kubenswrapper[4919]: I0310 22:12:58.349765 4919 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/26d75179-8f10-4b19-8fba-e341dda4db56-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Mar 10 22:12:58 crc kubenswrapper[4919]: I0310 22:12:58.353048 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38fe889d-dbc7-448a-ade6-7847b16f85d2-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "38fe889d-dbc7-448a-ade6-7847b16f85d2" (UID: "38fe889d-dbc7-448a-ade6-7847b16f85d2"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 22:12:58 crc kubenswrapper[4919]: I0310 22:12:58.364999 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38fe889d-dbc7-448a-ade6-7847b16f85d2-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "38fe889d-dbc7-448a-ade6-7847b16f85d2" (UID: "38fe889d-dbc7-448a-ade6-7847b16f85d2"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 22:12:58 crc kubenswrapper[4919]: I0310 22:12:58.454144 4919 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/38fe889d-dbc7-448a-ade6-7847b16f85d2-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 10 22:12:58 crc kubenswrapper[4919]: I0310 22:12:58.454241 4919 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/38fe889d-dbc7-448a-ade6-7847b16f85d2-public-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 10 22:12:58 crc kubenswrapper[4919]: I0310 22:12:58.729643 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5cbf7756bf-6whm5"
Mar 10 22:12:58 crc kubenswrapper[4919]: I0310 22:12:58.827407 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-98cfc95fc-fjth4"]
Mar 10 22:12:58 crc kubenswrapper[4919]: I0310 22:12:58.827612 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-98cfc95fc-fjth4" podUID="0c6e8e16-fa38-44f1-8a47-c6130972b034" containerName="dnsmasq-dns" containerID="cri-o://066e5c6c728a8a11a1ef807f6807f233d6540526942d3d96b951887a7e7d7b9b" gracePeriod=10
Mar 10 22:12:59 crc kubenswrapper[4919]: I0310 22:12:59.046731 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0"
Mar 10 22:12:59 crc kubenswrapper[4919]: I0310 22:12:59.076136 4919 generic.go:334] "Generic (PLEG): container finished" podID="0c6e8e16-fa38-44f1-8a47-c6130972b034" containerID="066e5c6c728a8a11a1ef807f6807f233d6540526942d3d96b951887a7e7d7b9b" exitCode=0
Mar 10 22:12:59 crc kubenswrapper[4919]: I0310 22:12:59.076238 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-98cfc95fc-fjth4" event={"ID":"0c6e8e16-fa38-44f1-8a47-c6130972b034","Type":"ContainerDied","Data":"066e5c6c728a8a11a1ef807f6807f233d6540526942d3d96b951887a7e7d7b9b"}
Mar 10 22:12:59 crc kubenswrapper[4919]: I0310 22:12:59.082864 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Mar 10 22:12:59 crc kubenswrapper[4919]: I0310 22:12:59.082876 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-76f79c8d94-r2gfk" event={"ID":"38fe889d-dbc7-448a-ade6-7847b16f85d2","Type":"ContainerDied","Data":"940b6f73c8166172de358abfcb1a48685b9ca01570f07afeae481190ca086271"}
Mar 10 22:12:59 crc kubenswrapper[4919]: I0310 22:12:59.083120 4919 scope.go:117] "RemoveContainer" containerID="22f47a828dbe40df452a7740f55740269c4d5812ffec016b63b10dce5fbf14e6"
Mar 10 22:12:59 crc kubenswrapper[4919]: I0310 22:12:59.083171 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-76f79c8d94-r2gfk"
Mar 10 22:12:59 crc kubenswrapper[4919]: I0310 22:12:59.106947 4919 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="26d75179-8f10-4b19-8fba-e341dda4db56" podUID="6bff1404-f9b1-48f8-b093-95c3bb206c6a"
Mar 10 22:12:59 crc kubenswrapper[4919]: I0310 22:12:59.117972 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"]
Mar 10 22:12:59 crc kubenswrapper[4919]: I0310 22:12:59.118237 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="3d0533ad-16e6-40b1-be09-46a0d9d9f342" containerName="cinder-scheduler" containerID="cri-o://085c0960e485c0602523fb339f7fbfe3db7d3b23dd248dd8b169e37d44f16776" gracePeriod=30
Mar 10 22:12:59 crc kubenswrapper[4919]: I0310 22:12:59.118416 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="3d0533ad-16e6-40b1-be09-46a0d9d9f342" containerName="probe" containerID="cri-o://b6c27257dbc2731eaf36852f0613861b336ec7cc0b8cbeeda0b7a99e38bbba56" gracePeriod=30
Mar 10 22:12:59 crc kubenswrapper[4919]: I0310 22:12:59.140914 4919 scope.go:117] "RemoveContainer" containerID="b41b9dc117a4ea3985d0420d76edc4c3f9d048760b07300843dce013e6bbe4cc"
Mar 10 22:12:59 crc kubenswrapper[4919]: I0310 22:12:59.143332 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-76f79c8d94-r2gfk"]
Mar 10 22:12:59 crc kubenswrapper[4919]: I0310 22:12:59.150693 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-76f79c8d94-r2gfk"]
Mar 10 22:12:59 crc kubenswrapper[4919]: I0310 22:12:59.175529 4919 patch_prober.go:28] interesting pod/machine-config-daemon-z7v4t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 10 22:12:59 crc kubenswrapper[4919]: I0310 22:12:59.175853 4919 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 10 22:12:59 crc kubenswrapper[4919]: I0310 22:12:59.175903 4919 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t"
Mar 10 22:12:59 crc kubenswrapper[4919]: I0310 22:12:59.176643 4919 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1dccae4c12e9eba18bc8d7756e50538a70d75c0bc02ce7c79c284d496783301e"} pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 10 22:12:59 crc kubenswrapper[4919]: I0310 22:12:59.176717 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" containerName="machine-config-daemon" containerID="cri-o://1dccae4c12e9eba18bc8d7756e50538a70d75c0bc02ce7c79c284d496783301e" gracePeriod=600
Mar 10 22:12:59 crc kubenswrapper[4919]: I0310 22:12:59.424577 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-98cfc95fc-fjth4"
Mar 10 22:12:59 crc kubenswrapper[4919]: I0310 22:12:59.429876 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kp8gq\" (UniqueName: \"kubernetes.io/projected/0c6e8e16-fa38-44f1-8a47-c6130972b034-kube-api-access-kp8gq\") pod \"0c6e8e16-fa38-44f1-8a47-c6130972b034\" (UID: \"0c6e8e16-fa38-44f1-8a47-c6130972b034\") "
Mar 10 22:12:59 crc kubenswrapper[4919]: I0310 22:12:59.429982 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0c6e8e16-fa38-44f1-8a47-c6130972b034-ovsdbserver-nb\") pod \"0c6e8e16-fa38-44f1-8a47-c6130972b034\" (UID: \"0c6e8e16-fa38-44f1-8a47-c6130972b034\") "
Mar 10 22:12:59 crc kubenswrapper[4919]: I0310 22:12:59.430080 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0c6e8e16-fa38-44f1-8a47-c6130972b034-ovsdbserver-sb\") pod \"0c6e8e16-fa38-44f1-8a47-c6130972b034\" (UID: \"0c6e8e16-fa38-44f1-8a47-c6130972b034\") "
Mar 10 22:12:59 crc kubenswrapper[4919]: I0310 22:12:59.430152 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0c6e8e16-fa38-44f1-8a47-c6130972b034-config\") pod \"0c6e8e16-fa38-44f1-8a47-c6130972b034\" (UID: \"0c6e8e16-fa38-44f1-8a47-c6130972b034\") "
Mar 10 22:12:59 crc kubenswrapper[4919]: I0310 22:12:59.430172 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0c6e8e16-fa38-44f1-8a47-c6130972b034-dns-svc\") pod \"0c6e8e16-fa38-44f1-8a47-c6130972b034\" (UID: \"0c6e8e16-fa38-44f1-8a47-c6130972b034\") "
Mar 10 22:12:59 crc kubenswrapper[4919]: I0310 22:12:59.430195 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0c6e8e16-fa38-44f1-8a47-c6130972b034-dns-swift-storage-0\") pod \"0c6e8e16-fa38-44f1-8a47-c6130972b034\" (UID: \"0c6e8e16-fa38-44f1-8a47-c6130972b034\") "
Mar 10 22:12:59 crc kubenswrapper[4919]: I0310 22:12:59.442488 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c6e8e16-fa38-44f1-8a47-c6130972b034-kube-api-access-kp8gq" (OuterVolumeSpecName: "kube-api-access-kp8gq") pod "0c6e8e16-fa38-44f1-8a47-c6130972b034" (UID: "0c6e8e16-fa38-44f1-8a47-c6130972b034"). InnerVolumeSpecName "kube-api-access-kp8gq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 22:12:59 crc kubenswrapper[4919]: I0310 22:12:59.494976 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="26d75179-8f10-4b19-8fba-e341dda4db56" path="/var/lib/kubelet/pods/26d75179-8f10-4b19-8fba-e341dda4db56/volumes"
Mar 10 22:12:59 crc kubenswrapper[4919]: I0310 22:12:59.496472 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38fe889d-dbc7-448a-ade6-7847b16f85d2" path="/var/lib/kubelet/pods/38fe889d-dbc7-448a-ade6-7847b16f85d2/volumes"
Mar 10 22:12:59 crc kubenswrapper[4919]: I0310 22:12:59.532766 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kp8gq\" (UniqueName: \"kubernetes.io/projected/0c6e8e16-fa38-44f1-8a47-c6130972b034-kube-api-access-kp8gq\") on node \"crc\" DevicePath \"\""
Mar 10 22:12:59 crc kubenswrapper[4919]: I0310 22:12:59.536353 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0c6e8e16-fa38-44f1-8a47-c6130972b034-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "0c6e8e16-fa38-44f1-8a47-c6130972b034" (UID: "0c6e8e16-fa38-44f1-8a47-c6130972b034"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 22:12:59 crc kubenswrapper[4919]: I0310 22:12:59.548202 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0c6e8e16-fa38-44f1-8a47-c6130972b034-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "0c6e8e16-fa38-44f1-8a47-c6130972b034" (UID: "0c6e8e16-fa38-44f1-8a47-c6130972b034"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 22:12:59 crc kubenswrapper[4919]: I0310 22:12:59.551174 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0c6e8e16-fa38-44f1-8a47-c6130972b034-config" (OuterVolumeSpecName: "config") pod "0c6e8e16-fa38-44f1-8a47-c6130972b034" (UID: "0c6e8e16-fa38-44f1-8a47-c6130972b034"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 22:12:59 crc kubenswrapper[4919]: I0310 22:12:59.554932 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0c6e8e16-fa38-44f1-8a47-c6130972b034-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "0c6e8e16-fa38-44f1-8a47-c6130972b034" (UID: "0c6e8e16-fa38-44f1-8a47-c6130972b034"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 22:12:59 crc kubenswrapper[4919]: I0310 22:12:59.560587 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0c6e8e16-fa38-44f1-8a47-c6130972b034-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0c6e8e16-fa38-44f1-8a47-c6130972b034" (UID: "0c6e8e16-fa38-44f1-8a47-c6130972b034"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 22:12:59 crc kubenswrapper[4919]: I0310 22:12:59.636328 4919 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0c6e8e16-fa38-44f1-8a47-c6130972b034-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Mar 10 22:12:59 crc kubenswrapper[4919]: I0310 22:12:59.636361 4919 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0c6e8e16-fa38-44f1-8a47-c6130972b034-config\") on node \"crc\" DevicePath \"\""
Mar 10 22:12:59 crc kubenswrapper[4919]: I0310 22:12:59.636370 4919 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0c6e8e16-fa38-44f1-8a47-c6130972b034-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 10 22:12:59 crc kubenswrapper[4919]: I0310 22:12:59.636381 4919 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0c6e8e16-fa38-44f1-8a47-c6130972b034-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Mar 10 22:12:59 crc kubenswrapper[4919]: I0310 22:12:59.636404 4919 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0c6e8e16-fa38-44f1-8a47-c6130972b034-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Mar 10 22:12:59 crc kubenswrapper[4919]: I0310 22:12:59.839332 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-fbf4c94d9-4mg9b"]
Mar 10 22:12:59 crc kubenswrapper[4919]: E0310 22:12:59.839694 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38fe889d-dbc7-448a-ade6-7847b16f85d2" containerName="placement-log"
Mar 10 22:12:59 crc kubenswrapper[4919]: I0310 22:12:59.839711 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="38fe889d-dbc7-448a-ade6-7847b16f85d2" containerName="placement-log"
Mar 10 22:12:59 crc kubenswrapper[4919]: E0310 22:12:59.839733 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c6e8e16-fa38-44f1-8a47-c6130972b034" containerName="dnsmasq-dns"
Mar 10 22:12:59 crc kubenswrapper[4919]: I0310 22:12:59.839740 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c6e8e16-fa38-44f1-8a47-c6130972b034" containerName="dnsmasq-dns"
Mar 10 22:12:59 crc kubenswrapper[4919]: E0310 22:12:59.839755 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38fe889d-dbc7-448a-ade6-7847b16f85d2" containerName="placement-api"
Mar 10 22:12:59 crc kubenswrapper[4919]: I0310 22:12:59.839761 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="38fe889d-dbc7-448a-ade6-7847b16f85d2" containerName="placement-api"
Mar 10 22:12:59 crc kubenswrapper[4919]: E0310 22:12:59.839774 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c6e8e16-fa38-44f1-8a47-c6130972b034" containerName="init"
Mar 10 22:12:59 crc kubenswrapper[4919]: I0310 22:12:59.839781 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c6e8e16-fa38-44f1-8a47-c6130972b034" containerName="init"
Mar 10 22:12:59 crc kubenswrapper[4919]: I0310 22:12:59.839937 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="38fe889d-dbc7-448a-ade6-7847b16f85d2" containerName="placement-log"
Mar 10 22:12:59 crc kubenswrapper[4919]: I0310 22:12:59.839949 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c6e8e16-fa38-44f1-8a47-c6130972b034" containerName="dnsmasq-dns"
Mar 10 22:12:59 crc kubenswrapper[4919]: I0310 22:12:59.839960 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="38fe889d-dbc7-448a-ade6-7847b16f85d2" containerName="placement-api"
Mar 10 22:12:59 crc kubenswrapper[4919]: I0310 22:12:59.840900 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-fbf4c94d9-4mg9b"
Mar 10 22:12:59 crc kubenswrapper[4919]: I0310 22:12:59.843135 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc"
Mar 10 22:12:59 crc kubenswrapper[4919]: I0310 22:12:59.843561 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data"
Mar 10 22:12:59 crc kubenswrapper[4919]: I0310 22:12:59.843771 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc"
Mar 10 22:12:59 crc kubenswrapper[4919]: I0310 22:12:59.864865 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-fbf4c94d9-4mg9b"]
Mar 10 22:12:59 crc kubenswrapper[4919]: I0310 22:12:59.942564 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d0f89c3b-5242-409b-a318-5b69410e9680-run-httpd\") pod \"swift-proxy-fbf4c94d9-4mg9b\" (UID: \"d0f89c3b-5242-409b-a318-5b69410e9680\") " pod="openstack/swift-proxy-fbf4c94d9-4mg9b"
Mar 10 22:12:59 crc kubenswrapper[4919]: I0310 22:12:59.942737 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d0f89c3b-5242-409b-a318-5b69410e9680-etc-swift\") pod \"swift-proxy-fbf4c94d9-4mg9b\" (UID: \"d0f89c3b-5242-409b-a318-5b69410e9680\") " pod="openstack/swift-proxy-fbf4c94d9-4mg9b"
Mar 10 22:12:59 crc kubenswrapper[4919]: I0310 22:12:59.942836 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0f89c3b-5242-409b-a318-5b69410e9680-config-data\") pod \"swift-proxy-fbf4c94d9-4mg9b\" (UID: \"d0f89c3b-5242-409b-a318-5b69410e9680\") " pod="openstack/swift-proxy-fbf4c94d9-4mg9b"
Mar 10 22:12:59 crc kubenswrapper[4919]: I0310 22:12:59.943078 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d0f89c3b-5242-409b-a318-5b69410e9680-public-tls-certs\") pod \"swift-proxy-fbf4c94d9-4mg9b\" (UID: \"d0f89c3b-5242-409b-a318-5b69410e9680\") " pod="openstack/swift-proxy-fbf4c94d9-4mg9b"
Mar 10 22:12:59 crc kubenswrapper[4919]: I0310 22:12:59.943108 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d0f89c3b-5242-409b-a318-5b69410e9680-log-httpd\") pod \"swift-proxy-fbf4c94d9-4mg9b\" (UID: \"d0f89c3b-5242-409b-a318-5b69410e9680\") " pod="openstack/swift-proxy-fbf4c94d9-4mg9b"
Mar 10 22:12:59 crc kubenswrapper[4919]: I0310 22:12:59.943129 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d0f89c3b-5242-409b-a318-5b69410e9680-internal-tls-certs\") pod \"swift-proxy-fbf4c94d9-4mg9b\" (UID: \"d0f89c3b-5242-409b-a318-5b69410e9680\") " pod="openstack/swift-proxy-fbf4c94d9-4mg9b"
Mar 10 22:12:59 crc kubenswrapper[4919]: I0310 22:12:59.943186 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0f89c3b-5242-409b-a318-5b69410e9680-combined-ca-bundle\") pod \"swift-proxy-fbf4c94d9-4mg9b\" (UID: \"d0f89c3b-5242-409b-a318-5b69410e9680\") " pod="openstack/swift-proxy-fbf4c94d9-4mg9b"
Mar 10 22:12:59 crc kubenswrapper[4919]: I0310 22:12:59.943231 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b4mrb\" (UniqueName: \"kubernetes.io/projected/d0f89c3b-5242-409b-a318-5b69410e9680-kube-api-access-b4mrb\") pod \"swift-proxy-fbf4c94d9-4mg9b\" (UID: \"d0f89c3b-5242-409b-a318-5b69410e9680\") " pod="openstack/swift-proxy-fbf4c94d9-4mg9b"
Mar 10 22:13:00 crc kubenswrapper[4919]: I0310 22:13:00.048081 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d0f89c3b-5242-409b-a318-5b69410e9680-public-tls-certs\") pod \"swift-proxy-fbf4c94d9-4mg9b\" (UID: \"d0f89c3b-5242-409b-a318-5b69410e9680\") " pod="openstack/swift-proxy-fbf4c94d9-4mg9b"
Mar 10 22:13:00 crc kubenswrapper[4919]: I0310 22:13:00.048609 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d0f89c3b-5242-409b-a318-5b69410e9680-log-httpd\") pod \"swift-proxy-fbf4c94d9-4mg9b\" (UID: \"d0f89c3b-5242-409b-a318-5b69410e9680\") " pod="openstack/swift-proxy-fbf4c94d9-4mg9b"
Mar 10 22:13:00 crc kubenswrapper[4919]: I0310 22:13:00.048663 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d0f89c3b-5242-409b-a318-5b69410e9680-internal-tls-certs\") pod \"swift-proxy-fbf4c94d9-4mg9b\" (UID: \"d0f89c3b-5242-409b-a318-5b69410e9680\") " pod="openstack/swift-proxy-fbf4c94d9-4mg9b"
Mar 10 22:13:00 crc kubenswrapper[4919]: I0310 22:13:00.048741 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0f89c3b-5242-409b-a318-5b69410e9680-combined-ca-bundle\") pod \"swift-proxy-fbf4c94d9-4mg9b\" (UID: \"d0f89c3b-5242-409b-a318-5b69410e9680\") " pod="openstack/swift-proxy-fbf4c94d9-4mg9b"
Mar 10 22:13:00 crc kubenswrapper[4919]: I0310 22:13:00.048804 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b4mrb\" (UniqueName: \"kubernetes.io/projected/d0f89c3b-5242-409b-a318-5b69410e9680-kube-api-access-b4mrb\") pod \"swift-proxy-fbf4c94d9-4mg9b\" (UID: \"d0f89c3b-5242-409b-a318-5b69410e9680\") " pod="openstack/swift-proxy-fbf4c94d9-4mg9b"
Mar 10 22:13:00 crc kubenswrapper[4919]: I0310 22:13:00.048916 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d0f89c3b-5242-409b-a318-5b69410e9680-run-httpd\") pod \"swift-proxy-fbf4c94d9-4mg9b\" (UID: \"d0f89c3b-5242-409b-a318-5b69410e9680\") " pod="openstack/swift-proxy-fbf4c94d9-4mg9b"
Mar 10 22:13:00 crc kubenswrapper[4919]: I0310 22:13:00.049141 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d0f89c3b-5242-409b-a318-5b69410e9680-etc-swift\") pod \"swift-proxy-fbf4c94d9-4mg9b\" (UID: \"d0f89c3b-5242-409b-a318-5b69410e9680\") " pod="openstack/swift-proxy-fbf4c94d9-4mg9b"
Mar 10 22:13:00 crc kubenswrapper[4919]: I0310 22:13:00.049193 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0f89c3b-5242-409b-a318-5b69410e9680-config-data\") pod \"swift-proxy-fbf4c94d9-4mg9b\" (UID: \"d0f89c3b-5242-409b-a318-5b69410e9680\") " pod="openstack/swift-proxy-fbf4c94d9-4mg9b"
Mar 10 22:13:00 crc kubenswrapper[4919]: I0310 22:13:00.053807 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d0f89c3b-5242-409b-a318-5b69410e9680-log-httpd\") pod \"swift-proxy-fbf4c94d9-4mg9b\" (UID: \"d0f89c3b-5242-409b-a318-5b69410e9680\") " pod="openstack/swift-proxy-fbf4c94d9-4mg9b"
Mar 10 22:13:00 crc kubenswrapper[4919]: I0310 22:13:00.054120 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d0f89c3b-5242-409b-a318-5b69410e9680-run-httpd\") pod \"swift-proxy-fbf4c94d9-4mg9b\" (UID: \"d0f89c3b-5242-409b-a318-5b69410e9680\") " pod="openstack/swift-proxy-fbf4c94d9-4mg9b"
Mar 10 22:13:00 crc kubenswrapper[4919]: I0310 22:13:00.055482 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d0f89c3b-5242-409b-a318-5b69410e9680-public-tls-certs\") pod \"swift-proxy-fbf4c94d9-4mg9b\" (UID: \"d0f89c3b-5242-409b-a318-5b69410e9680\") " pod="openstack/swift-proxy-fbf4c94d9-4mg9b"
Mar 10 22:13:00 crc kubenswrapper[4919]: I0310 22:13:00.058968 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d0f89c3b-5242-409b-a318-5b69410e9680-etc-swift\") pod \"swift-proxy-fbf4c94d9-4mg9b\" (UID: \"d0f89c3b-5242-409b-a318-5b69410e9680\") " pod="openstack/swift-proxy-fbf4c94d9-4mg9b"
Mar 10 22:13:00 crc kubenswrapper[4919]: I0310 22:13:00.059129 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0f89c3b-5242-409b-a318-5b69410e9680-config-data\") pod \"swift-proxy-fbf4c94d9-4mg9b\" (UID: \"d0f89c3b-5242-409b-a318-5b69410e9680\") " pod="openstack/swift-proxy-fbf4c94d9-4mg9b"
Mar 10 22:13:00 crc kubenswrapper[4919]: I0310 22:13:00.060460 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d0f89c3b-5242-409b-a318-5b69410e9680-internal-tls-certs\") pod \"swift-proxy-fbf4c94d9-4mg9b\" (UID: \"d0f89c3b-5242-409b-a318-5b69410e9680\") " pod="openstack/swift-proxy-fbf4c94d9-4mg9b"
Mar 10 22:13:00 crc kubenswrapper[4919]: I0310 22:13:00.063817 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0f89c3b-5242-409b-a318-5b69410e9680-combined-ca-bundle\") pod \"swift-proxy-fbf4c94d9-4mg9b\" (UID: \"d0f89c3b-5242-409b-a318-5b69410e9680\") " pod="openstack/swift-proxy-fbf4c94d9-4mg9b"
Mar 10 22:13:00 crc kubenswrapper[4919]: I0310 22:13:00.074296 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b4mrb\" (UniqueName: \"kubernetes.io/projected/d0f89c3b-5242-409b-a318-5b69410e9680-kube-api-access-b4mrb\") pod \"swift-proxy-fbf4c94d9-4mg9b\" (UID: \"d0f89c3b-5242-409b-a318-5b69410e9680\") " pod="openstack/swift-proxy-fbf4c94d9-4mg9b"
Mar 10 22:13:00 crc kubenswrapper[4919]: I0310 22:13:00.107093 4919 generic.go:334] "Generic (PLEG): container finished" podID="566678d1-f416-4116-ab20-b30dceb86cdc" containerID="1dccae4c12e9eba18bc8d7756e50538a70d75c0bc02ce7c79c284d496783301e" exitCode=0
Mar 10 22:13:00 crc kubenswrapper[4919]: I0310 22:13:00.107188 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" event={"ID":"566678d1-f416-4116-ab20-b30dceb86cdc","Type":"ContainerDied","Data":"1dccae4c12e9eba18bc8d7756e50538a70d75c0bc02ce7c79c284d496783301e"}
Mar 10 22:13:00 crc kubenswrapper[4919]: I0310 22:13:00.107221 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" event={"ID":"566678d1-f416-4116-ab20-b30dceb86cdc","Type":"ContainerStarted","Data":"fce2ab31f6ae341422fcdee59d32194b41ef9122dd92f9a1264d329e9e490637"}
Mar 10 22:13:00 crc kubenswrapper[4919]: I0310 22:13:00.107239 4919 scope.go:117] "RemoveContainer" containerID="fe6790b4b646495ea90afaa8908c36e512ca4c07ed60f10561e041c0f1b0c857"
Mar 10 22:13:00 crc kubenswrapper[4919]: I0310 22:13:00.115707 4919 generic.go:334] "Generic (PLEG): container finished" podID="3d0533ad-16e6-40b1-be09-46a0d9d9f342" containerID="b6c27257dbc2731eaf36852f0613861b336ec7cc0b8cbeeda0b7a99e38bbba56" exitCode=0
Mar 10 22:13:00 crc kubenswrapper[4919]: I0310 22:13:00.115818 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"3d0533ad-16e6-40b1-be09-46a0d9d9f342","Type":"ContainerDied","Data":"b6c27257dbc2731eaf36852f0613861b336ec7cc0b8cbeeda0b7a99e38bbba56"}
Mar 10 22:13:00 crc kubenswrapper[4919]: I0310 22:13:00.124746 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-98cfc95fc-fjth4" event={"ID":"0c6e8e16-fa38-44f1-8a47-c6130972b034","Type":"ContainerDied","Data":"45f0b320c64ae1c5779b69ab4ab418b0bbf5f4d5d3cd6e519adaaafacf60de8d"}
Mar 10 22:13:00 crc kubenswrapper[4919]: I0310 22:13:00.124808 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-98cfc95fc-fjth4"
Mar 10 22:13:00 crc kubenswrapper[4919]: I0310 22:13:00.171707 4919 scope.go:117] "RemoveContainer" containerID="066e5c6c728a8a11a1ef807f6807f233d6540526942d3d96b951887a7e7d7b9b"
Mar 10 22:13:00 crc kubenswrapper[4919]: I0310 22:13:00.175586 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-98cfc95fc-fjth4"]
Mar 10 22:13:00 crc kubenswrapper[4919]: I0310 22:13:00.189146 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-98cfc95fc-fjth4"]
Mar 10 22:13:00 crc kubenswrapper[4919]: I0310 22:13:00.196175 4919 scope.go:117] "RemoveContainer" containerID="8a2d912a5fd8f12859d8831e07c924875a58f0d1849dd7977578ff8fe68896de"
Mar 10 22:13:00 crc kubenswrapper[4919]: I0310 22:13:00.202600 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-fbf4c94d9-4mg9b"
Mar 10 22:13:00 crc kubenswrapper[4919]: I0310 22:13:00.820735 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-fbf4c94d9-4mg9b"]
Mar 10 22:13:00 crc kubenswrapper[4919]: W0310 22:13:00.830759 4919 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd0f89c3b_5242_409b_a318_5b69410e9680.slice/crio-e153a03c272aa21dca94c411ddadd23c66e624d1242061994242ef8efc636065 WatchSource:0}: Error finding container e153a03c272aa21dca94c411ddadd23c66e624d1242061994242ef8efc636065: Status 404 returned error can't find the container with id e153a03c272aa21dca94c411ddadd23c66e624d1242061994242ef8efc636065
Mar 10 22:13:00 crc kubenswrapper[4919]: I0310 22:13:00.931866 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 10 22:13:00 crc kubenswrapper[4919]: I0310 22:13:00.932317 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="481bd64a-9301-42fa-aa9a-7f6c378917df" containerName="ceilometer-central-agent" containerID="cri-o://9cca8ec01b4a87d8c5b49e33da4ee2ba25e5b65e499a461962fb5724bd260d31" gracePeriod=30
Mar 10 22:13:00 crc kubenswrapper[4919]: I0310 22:13:00.933018 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="481bd64a-9301-42fa-aa9a-7f6c378917df" containerName="sg-core" containerID="cri-o://2fac8c319f0883b4859572a193aa3a3b06b6bb8f0591a4cb05d30377cb529cfc" gracePeriod=30
Mar 10 22:13:00 crc kubenswrapper[4919]: I0310 22:13:00.933165 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="481bd64a-9301-42fa-aa9a-7f6c378917df" containerName="proxy-httpd" containerID="cri-o://334b53fdb906c5c53be43712e4a21afe588a10f2198bd1d297c8e69a8c02cc35" gracePeriod=30
Mar 10 22:13:00 crc kubenswrapper[4919]: I0310 22:13:00.933236 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="481bd64a-9301-42fa-aa9a-7f6c378917df" containerName="ceilometer-notification-agent" containerID="cri-o://9918333c1d69cb6037fa2cf1fc262d4f4996928a4b5efd219fd87dae6600bf76" gracePeriod=30
Mar 10 22:13:00 crc kubenswrapper[4919]: I0310 22:13:00.957340 4919 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="481bd64a-9301-42fa-aa9a-7f6c378917df" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.168:3000/\": EOF"
Mar 10 22:13:01 crc kubenswrapper[4919]: I0310 22:13:01.148915 4919 generic.go:334] "Generic (PLEG): container finished" podID="481bd64a-9301-42fa-aa9a-7f6c378917df" containerID="334b53fdb906c5c53be43712e4a21afe588a10f2198bd1d297c8e69a8c02cc35" exitCode=0
Mar 10 22:13:01 crc kubenswrapper[4919]: I0310 22:13:01.149217 4919 generic.go:334] "Generic (PLEG): container finished" podID="481bd64a-9301-42fa-aa9a-7f6c378917df" containerID="2fac8c319f0883b4859572a193aa3a3b06b6bb8f0591a4cb05d30377cb529cfc" exitCode=2
Mar 10 22:13:01 crc kubenswrapper[4919]: I0310 22:13:01.149253 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"481bd64a-9301-42fa-aa9a-7f6c378917df","Type":"ContainerDied","Data":"334b53fdb906c5c53be43712e4a21afe588a10f2198bd1d297c8e69a8c02cc35"}
Mar 10 22:13:01 crc kubenswrapper[4919]: I0310 22:13:01.149278 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"481bd64a-9301-42fa-aa9a-7f6c378917df","Type":"ContainerDied","Data":"2fac8c319f0883b4859572a193aa3a3b06b6bb8f0591a4cb05d30377cb529cfc"}
Mar 10 22:13:01 crc kubenswrapper[4919]: I0310 22:13:01.151046 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-fbf4c94d9-4mg9b" event={"ID":"d0f89c3b-5242-409b-a318-5b69410e9680","Type":"ContainerStarted","Data":"e153a03c272aa21dca94c411ddadd23c66e624d1242061994242ef8efc636065"}
Mar 10 22:13:01 crc kubenswrapper[4919]: I0310 22:13:01.502453 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c6e8e16-fa38-44f1-8a47-c6130972b034" path="/var/lib/kubelet/pods/0c6e8e16-fa38-44f1-8a47-c6130972b034/volumes"
Mar 10 22:13:01 crc kubenswrapper[4919]: I0310 22:13:01.696944 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 10 22:13:01 crc kubenswrapper[4919]: I0310 22:13:01.896745 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/481bd64a-9301-42fa-aa9a-7f6c378917df-config-data\") pod \"481bd64a-9301-42fa-aa9a-7f6c378917df\" (UID: \"481bd64a-9301-42fa-aa9a-7f6c378917df\") "
Mar 10 22:13:01 crc kubenswrapper[4919]: I0310 22:13:01.897130 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2vf7f\" (UniqueName: \"kubernetes.io/projected/481bd64a-9301-42fa-aa9a-7f6c378917df-kube-api-access-2vf7f\") pod \"481bd64a-9301-42fa-aa9a-7f6c378917df\" (UID: \"481bd64a-9301-42fa-aa9a-7f6c378917df\") "
Mar 10 22:13:01 crc kubenswrapper[4919]: I0310 22:13:01.897171 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/481bd64a-9301-42fa-aa9a-7f6c378917df-sg-core-conf-yaml\") pod \"481bd64a-9301-42fa-aa9a-7f6c378917df\" (UID: \"481bd64a-9301-42fa-aa9a-7f6c378917df\") "
Mar 10 22:13:01 crc kubenswrapper[4919]: I0310 22:13:01.897227 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/481bd64a-9301-42fa-aa9a-7f6c378917df-combined-ca-bundle\") pod \"481bd64a-9301-42fa-aa9a-7f6c378917df\" (UID: \"481bd64a-9301-42fa-aa9a-7f6c378917df\") "
Mar 10 22:13:01 crc kubenswrapper[4919]: I0310 22:13:01.897249 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/481bd64a-9301-42fa-aa9a-7f6c378917df-run-httpd\") pod \"481bd64a-9301-42fa-aa9a-7f6c378917df\" (UID: \"481bd64a-9301-42fa-aa9a-7f6c378917df\") "
Mar 10 22:13:01 crc kubenswrapper[4919]: I0310 22:13:01.897307 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/481bd64a-9301-42fa-aa9a-7f6c378917df-log-httpd\") pod \"481bd64a-9301-42fa-aa9a-7f6c378917df\" (UID: \"481bd64a-9301-42fa-aa9a-7f6c378917df\") "
Mar 10 22:13:01 crc kubenswrapper[4919]: I0310 22:13:01.897331 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/481bd64a-9301-42fa-aa9a-7f6c378917df-scripts\") pod \"481bd64a-9301-42fa-aa9a-7f6c378917df\" (UID: \"481bd64a-9301-42fa-aa9a-7f6c378917df\") "
Mar 10 22:13:01 crc kubenswrapper[4919]: I0310 22:13:01.897923 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/481bd64a-9301-42fa-aa9a-7f6c378917df-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "481bd64a-9301-42fa-aa9a-7f6c378917df" (UID: "481bd64a-9301-42fa-aa9a-7f6c378917df"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 22:13:01 crc kubenswrapper[4919]: I0310 22:13:01.898362 4919 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/481bd64a-9301-42fa-aa9a-7f6c378917df-run-httpd\") on node \"crc\" DevicePath \"\""
Mar 10 22:13:01 crc kubenswrapper[4919]: I0310 22:13:01.899103 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/481bd64a-9301-42fa-aa9a-7f6c378917df-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "481bd64a-9301-42fa-aa9a-7f6c378917df" (UID: "481bd64a-9301-42fa-aa9a-7f6c378917df"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 22:13:01 crc kubenswrapper[4919]: I0310 22:13:01.906301 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/481bd64a-9301-42fa-aa9a-7f6c378917df-scripts" (OuterVolumeSpecName: "scripts") pod "481bd64a-9301-42fa-aa9a-7f6c378917df" (UID: "481bd64a-9301-42fa-aa9a-7f6c378917df"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 22:13:01 crc kubenswrapper[4919]: I0310 22:13:01.910641 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/481bd64a-9301-42fa-aa9a-7f6c378917df-kube-api-access-2vf7f" (OuterVolumeSpecName: "kube-api-access-2vf7f") pod "481bd64a-9301-42fa-aa9a-7f6c378917df" (UID: "481bd64a-9301-42fa-aa9a-7f6c378917df"). InnerVolumeSpecName "kube-api-access-2vf7f". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 22:13:01 crc kubenswrapper[4919]: I0310 22:13:01.949565 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/481bd64a-9301-42fa-aa9a-7f6c378917df-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "481bd64a-9301-42fa-aa9a-7f6c378917df" (UID: "481bd64a-9301-42fa-aa9a-7f6c378917df"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 22:13:02 crc kubenswrapper[4919]: I0310 22:13:02.000435 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2vf7f\" (UniqueName: \"kubernetes.io/projected/481bd64a-9301-42fa-aa9a-7f6c378917df-kube-api-access-2vf7f\") on node \"crc\" DevicePath \"\""
Mar 10 22:13:02 crc kubenswrapper[4919]: I0310 22:13:02.000478 4919 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/481bd64a-9301-42fa-aa9a-7f6c378917df-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Mar 10 22:13:02 crc kubenswrapper[4919]: I0310 22:13:02.000494 4919 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/481bd64a-9301-42fa-aa9a-7f6c378917df-log-httpd\") on node \"crc\" DevicePath \"\""
Mar 10 22:13:02 crc kubenswrapper[4919]: I0310 22:13:02.000506 4919 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/481bd64a-9301-42fa-aa9a-7f6c378917df-scripts\") on node \"crc\" DevicePath \"\""
Mar 10 22:13:02 crc kubenswrapper[4919]: I0310 22:13:02.026187 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/481bd64a-9301-42fa-aa9a-7f6c378917df-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "481bd64a-9301-42fa-aa9a-7f6c378917df" (UID: "481bd64a-9301-42fa-aa9a-7f6c378917df"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 22:13:02 crc kubenswrapper[4919]: I0310 22:13:02.043612 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/481bd64a-9301-42fa-aa9a-7f6c378917df-config-data" (OuterVolumeSpecName: "config-data") pod "481bd64a-9301-42fa-aa9a-7f6c378917df" (UID: "481bd64a-9301-42fa-aa9a-7f6c378917df"). InnerVolumeSpecName "config-data".
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:13:02 crc kubenswrapper[4919]: I0310 22:13:02.101428 4919 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/481bd64a-9301-42fa-aa9a-7f6c378917df-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 22:13:02 crc kubenswrapper[4919]: I0310 22:13:02.101684 4919 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/481bd64a-9301-42fa-aa9a-7f6c378917df-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 22:13:02 crc kubenswrapper[4919]: I0310 22:13:02.118169 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-pp7db"] Mar 10 22:13:02 crc kubenswrapper[4919]: E0310 22:13:02.118529 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="481bd64a-9301-42fa-aa9a-7f6c378917df" containerName="proxy-httpd" Mar 10 22:13:02 crc kubenswrapper[4919]: I0310 22:13:02.118541 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="481bd64a-9301-42fa-aa9a-7f6c378917df" containerName="proxy-httpd" Mar 10 22:13:02 crc kubenswrapper[4919]: E0310 22:13:02.118554 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="481bd64a-9301-42fa-aa9a-7f6c378917df" containerName="ceilometer-central-agent" Mar 10 22:13:02 crc kubenswrapper[4919]: I0310 22:13:02.118560 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="481bd64a-9301-42fa-aa9a-7f6c378917df" containerName="ceilometer-central-agent" Mar 10 22:13:02 crc kubenswrapper[4919]: E0310 22:13:02.118572 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="481bd64a-9301-42fa-aa9a-7f6c378917df" containerName="ceilometer-notification-agent" Mar 10 22:13:02 crc kubenswrapper[4919]: I0310 22:13:02.118578 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="481bd64a-9301-42fa-aa9a-7f6c378917df" containerName="ceilometer-notification-agent" Mar 10 22:13:02 crc 
kubenswrapper[4919]: E0310 22:13:02.118599 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="481bd64a-9301-42fa-aa9a-7f6c378917df" containerName="sg-core" Mar 10 22:13:02 crc kubenswrapper[4919]: I0310 22:13:02.118604 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="481bd64a-9301-42fa-aa9a-7f6c378917df" containerName="sg-core" Mar 10 22:13:02 crc kubenswrapper[4919]: I0310 22:13:02.118781 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="481bd64a-9301-42fa-aa9a-7f6c378917df" containerName="ceilometer-notification-agent" Mar 10 22:13:02 crc kubenswrapper[4919]: I0310 22:13:02.118799 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="481bd64a-9301-42fa-aa9a-7f6c378917df" containerName="sg-core" Mar 10 22:13:02 crc kubenswrapper[4919]: I0310 22:13:02.118812 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="481bd64a-9301-42fa-aa9a-7f6c378917df" containerName="ceilometer-central-agent" Mar 10 22:13:02 crc kubenswrapper[4919]: I0310 22:13:02.118821 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="481bd64a-9301-42fa-aa9a-7f6c378917df" containerName="proxy-httpd" Mar 10 22:13:02 crc kubenswrapper[4919]: I0310 22:13:02.119309 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-pp7db" Mar 10 22:13:02 crc kubenswrapper[4919]: I0310 22:13:02.137441 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-pp7db"] Mar 10 22:13:02 crc kubenswrapper[4919]: I0310 22:13:02.175508 4919 generic.go:334] "Generic (PLEG): container finished" podID="481bd64a-9301-42fa-aa9a-7f6c378917df" containerID="9918333c1d69cb6037fa2cf1fc262d4f4996928a4b5efd219fd87dae6600bf76" exitCode=0 Mar 10 22:13:02 crc kubenswrapper[4919]: I0310 22:13:02.176482 4919 generic.go:334] "Generic (PLEG): container finished" podID="481bd64a-9301-42fa-aa9a-7f6c378917df" containerID="9cca8ec01b4a87d8c5b49e33da4ee2ba25e5b65e499a461962fb5724bd260d31" exitCode=0 Mar 10 22:13:02 crc kubenswrapper[4919]: I0310 22:13:02.175945 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 10 22:13:02 crc kubenswrapper[4919]: I0310 22:13:02.175924 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"481bd64a-9301-42fa-aa9a-7f6c378917df","Type":"ContainerDied","Data":"9918333c1d69cb6037fa2cf1fc262d4f4996928a4b5efd219fd87dae6600bf76"} Mar 10 22:13:02 crc kubenswrapper[4919]: I0310 22:13:02.176852 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"481bd64a-9301-42fa-aa9a-7f6c378917df","Type":"ContainerDied","Data":"9cca8ec01b4a87d8c5b49e33da4ee2ba25e5b65e499a461962fb5724bd260d31"} Mar 10 22:13:02 crc kubenswrapper[4919]: I0310 22:13:02.176868 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"481bd64a-9301-42fa-aa9a-7f6c378917df","Type":"ContainerDied","Data":"634b78c28f4846a325a9aae464c2dd32d983187d7ff33ab00f3bf7afb4ee69b8"} Mar 10 22:13:02 crc kubenswrapper[4919]: I0310 22:13:02.176887 4919 scope.go:117] "RemoveContainer" containerID="334b53fdb906c5c53be43712e4a21afe588a10f2198bd1d297c8e69a8c02cc35" Mar 10 22:13:02 crc 
kubenswrapper[4919]: I0310 22:13:02.191480 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-fbf4c94d9-4mg9b" event={"ID":"d0f89c3b-5242-409b-a318-5b69410e9680","Type":"ContainerStarted","Data":"64708e20bac95ef817be6f849912905726f0a35138271eb315056e779dec07f3"} Mar 10 22:13:02 crc kubenswrapper[4919]: I0310 22:13:02.191533 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-fbf4c94d9-4mg9b" event={"ID":"d0f89c3b-5242-409b-a318-5b69410e9680","Type":"ContainerStarted","Data":"704f926a26eb1f83943af82f73395b8827735ef4003ab0f0d955ebb8116c59f5"} Mar 10 22:13:02 crc kubenswrapper[4919]: I0310 22:13:02.192813 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-fbf4c94d9-4mg9b" Mar 10 22:13:02 crc kubenswrapper[4919]: I0310 22:13:02.192849 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-fbf4c94d9-4mg9b" Mar 10 22:13:02 crc kubenswrapper[4919]: I0310 22:13:02.203250 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d891cb6e-7d23-40d0-9fd4-28ab980f207c-operator-scripts\") pod \"nova-api-db-create-pp7db\" (UID: \"d891cb6e-7d23-40d0-9fd4-28ab980f207c\") " pod="openstack/nova-api-db-create-pp7db" Mar 10 22:13:02 crc kubenswrapper[4919]: I0310 22:13:02.203324 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dqlsm\" (UniqueName: \"kubernetes.io/projected/d891cb6e-7d23-40d0-9fd4-28ab980f207c-kube-api-access-dqlsm\") pod \"nova-api-db-create-pp7db\" (UID: \"d891cb6e-7d23-40d0-9fd4-28ab980f207c\") " pod="openstack/nova-api-db-create-pp7db" Mar 10 22:13:02 crc kubenswrapper[4919]: I0310 22:13:02.222913 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-8585d"] Mar 10 22:13:02 crc kubenswrapper[4919]: I0310 22:13:02.224527 
4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-8585d" Mar 10 22:13:02 crc kubenswrapper[4919]: I0310 22:13:02.234590 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-fbf4c94d9-4mg9b" podStartSLOduration=3.234570747 podStartE2EDuration="3.234570747s" podCreationTimestamp="2026-03-10 22:12:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 22:13:02.216109177 +0000 UTC m=+1369.457989805" watchObservedRunningTime="2026-03-10 22:13:02.234570747 +0000 UTC m=+1369.476451345" Mar 10 22:13:02 crc kubenswrapper[4919]: I0310 22:13:02.245763 4919 scope.go:117] "RemoveContainer" containerID="2fac8c319f0883b4859572a193aa3a3b06b6bb8f0591a4cb05d30377cb529cfc" Mar 10 22:13:02 crc kubenswrapper[4919]: I0310 22:13:02.251516 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-cdcf-account-create-update-t7ns5"] Mar 10 22:13:02 crc kubenswrapper[4919]: I0310 22:13:02.253198 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-cdcf-account-create-update-t7ns5" Mar 10 22:13:02 crc kubenswrapper[4919]: I0310 22:13:02.261482 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-8585d"] Mar 10 22:13:02 crc kubenswrapper[4919]: I0310 22:13:02.261871 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Mar 10 22:13:02 crc kubenswrapper[4919]: I0310 22:13:02.264905 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 10 22:13:02 crc kubenswrapper[4919]: I0310 22:13:02.300805 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 10 22:13:02 crc kubenswrapper[4919]: I0310 22:13:02.305526 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d891cb6e-7d23-40d0-9fd4-28ab980f207c-operator-scripts\") pod \"nova-api-db-create-pp7db\" (UID: \"d891cb6e-7d23-40d0-9fd4-28ab980f207c\") " pod="openstack/nova-api-db-create-pp7db" Mar 10 22:13:02 crc kubenswrapper[4919]: I0310 22:13:02.305562 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7fdec3e1-893d-44ec-bd70-90c66e304ba7-operator-scripts\") pod \"nova-api-cdcf-account-create-update-t7ns5\" (UID: \"7fdec3e1-893d-44ec-bd70-90c66e304ba7\") " pod="openstack/nova-api-cdcf-account-create-update-t7ns5" Mar 10 22:13:02 crc kubenswrapper[4919]: I0310 22:13:02.305598 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dqlsm\" (UniqueName: \"kubernetes.io/projected/d891cb6e-7d23-40d0-9fd4-28ab980f207c-kube-api-access-dqlsm\") pod \"nova-api-db-create-pp7db\" (UID: \"d891cb6e-7d23-40d0-9fd4-28ab980f207c\") " pod="openstack/nova-api-db-create-pp7db" Mar 10 22:13:02 crc kubenswrapper[4919]: I0310 22:13:02.305689 4919 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjfw8\" (UniqueName: \"kubernetes.io/projected/7fdec3e1-893d-44ec-bd70-90c66e304ba7-kube-api-access-vjfw8\") pod \"nova-api-cdcf-account-create-update-t7ns5\" (UID: \"7fdec3e1-893d-44ec-bd70-90c66e304ba7\") " pod="openstack/nova-api-cdcf-account-create-update-t7ns5" Mar 10 22:13:02 crc kubenswrapper[4919]: I0310 22:13:02.307340 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-cdcf-account-create-update-t7ns5"] Mar 10 22:13:02 crc kubenswrapper[4919]: I0310 22:13:02.307978 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d891cb6e-7d23-40d0-9fd4-28ab980f207c-operator-scripts\") pod \"nova-api-db-create-pp7db\" (UID: \"d891cb6e-7d23-40d0-9fd4-28ab980f207c\") " pod="openstack/nova-api-db-create-pp7db" Mar 10 22:13:02 crc kubenswrapper[4919]: I0310 22:13:02.310720 4919 scope.go:117] "RemoveContainer" containerID="9918333c1d69cb6037fa2cf1fc262d4f4996928a4b5efd219fd87dae6600bf76" Mar 10 22:13:02 crc kubenswrapper[4919]: I0310 22:13:02.325904 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 10 22:13:02 crc kubenswrapper[4919]: I0310 22:13:02.330652 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 10 22:13:02 crc kubenswrapper[4919]: I0310 22:13:02.331910 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dqlsm\" (UniqueName: \"kubernetes.io/projected/d891cb6e-7d23-40d0-9fd4-28ab980f207c-kube-api-access-dqlsm\") pod \"nova-api-db-create-pp7db\" (UID: \"d891cb6e-7d23-40d0-9fd4-28ab980f207c\") " pod="openstack/nova-api-db-create-pp7db" Mar 10 22:13:02 crc kubenswrapper[4919]: I0310 22:13:02.342877 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 10 22:13:02 crc kubenswrapper[4919]: I0310 22:13:02.351354 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 10 22:13:02 crc kubenswrapper[4919]: I0310 22:13:02.351634 4919 scope.go:117] "RemoveContainer" containerID="9cca8ec01b4a87d8c5b49e33da4ee2ba25e5b65e499a461962fb5724bd260d31" Mar 10 22:13:02 crc kubenswrapper[4919]: I0310 22:13:02.354533 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 10 22:13:02 crc kubenswrapper[4919]: I0310 22:13:02.402098 4919 scope.go:117] "RemoveContainer" containerID="334b53fdb906c5c53be43712e4a21afe588a10f2198bd1d297c8e69a8c02cc35" Mar 10 22:13:02 crc kubenswrapper[4919]: E0310 22:13:02.402538 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"334b53fdb906c5c53be43712e4a21afe588a10f2198bd1d297c8e69a8c02cc35\": container with ID starting with 334b53fdb906c5c53be43712e4a21afe588a10f2198bd1d297c8e69a8c02cc35 not found: ID does not exist" containerID="334b53fdb906c5c53be43712e4a21afe588a10f2198bd1d297c8e69a8c02cc35" Mar 10 22:13:02 crc kubenswrapper[4919]: I0310 22:13:02.402563 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"334b53fdb906c5c53be43712e4a21afe588a10f2198bd1d297c8e69a8c02cc35"} err="failed to get 
container status \"334b53fdb906c5c53be43712e4a21afe588a10f2198bd1d297c8e69a8c02cc35\": rpc error: code = NotFound desc = could not find container \"334b53fdb906c5c53be43712e4a21afe588a10f2198bd1d297c8e69a8c02cc35\": container with ID starting with 334b53fdb906c5c53be43712e4a21afe588a10f2198bd1d297c8e69a8c02cc35 not found: ID does not exist" Mar 10 22:13:02 crc kubenswrapper[4919]: I0310 22:13:02.402583 4919 scope.go:117] "RemoveContainer" containerID="2fac8c319f0883b4859572a193aa3a3b06b6bb8f0591a4cb05d30377cb529cfc" Mar 10 22:13:02 crc kubenswrapper[4919]: E0310 22:13:02.402770 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2fac8c319f0883b4859572a193aa3a3b06b6bb8f0591a4cb05d30377cb529cfc\": container with ID starting with 2fac8c319f0883b4859572a193aa3a3b06b6bb8f0591a4cb05d30377cb529cfc not found: ID does not exist" containerID="2fac8c319f0883b4859572a193aa3a3b06b6bb8f0591a4cb05d30377cb529cfc" Mar 10 22:13:02 crc kubenswrapper[4919]: I0310 22:13:02.402794 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2fac8c319f0883b4859572a193aa3a3b06b6bb8f0591a4cb05d30377cb529cfc"} err="failed to get container status \"2fac8c319f0883b4859572a193aa3a3b06b6bb8f0591a4cb05d30377cb529cfc\": rpc error: code = NotFound desc = could not find container \"2fac8c319f0883b4859572a193aa3a3b06b6bb8f0591a4cb05d30377cb529cfc\": container with ID starting with 2fac8c319f0883b4859572a193aa3a3b06b6bb8f0591a4cb05d30377cb529cfc not found: ID does not exist" Mar 10 22:13:02 crc kubenswrapper[4919]: I0310 22:13:02.402806 4919 scope.go:117] "RemoveContainer" containerID="9918333c1d69cb6037fa2cf1fc262d4f4996928a4b5efd219fd87dae6600bf76" Mar 10 22:13:02 crc kubenswrapper[4919]: E0310 22:13:02.403406 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"9918333c1d69cb6037fa2cf1fc262d4f4996928a4b5efd219fd87dae6600bf76\": container with ID starting with 9918333c1d69cb6037fa2cf1fc262d4f4996928a4b5efd219fd87dae6600bf76 not found: ID does not exist" containerID="9918333c1d69cb6037fa2cf1fc262d4f4996928a4b5efd219fd87dae6600bf76" Mar 10 22:13:02 crc kubenswrapper[4919]: I0310 22:13:02.403430 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9918333c1d69cb6037fa2cf1fc262d4f4996928a4b5efd219fd87dae6600bf76"} err="failed to get container status \"9918333c1d69cb6037fa2cf1fc262d4f4996928a4b5efd219fd87dae6600bf76\": rpc error: code = NotFound desc = could not find container \"9918333c1d69cb6037fa2cf1fc262d4f4996928a4b5efd219fd87dae6600bf76\": container with ID starting with 9918333c1d69cb6037fa2cf1fc262d4f4996928a4b5efd219fd87dae6600bf76 not found: ID does not exist" Mar 10 22:13:02 crc kubenswrapper[4919]: I0310 22:13:02.403444 4919 scope.go:117] "RemoveContainer" containerID="9cca8ec01b4a87d8c5b49e33da4ee2ba25e5b65e499a461962fb5724bd260d31" Mar 10 22:13:02 crc kubenswrapper[4919]: E0310 22:13:02.406533 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9cca8ec01b4a87d8c5b49e33da4ee2ba25e5b65e499a461962fb5724bd260d31\": container with ID starting with 9cca8ec01b4a87d8c5b49e33da4ee2ba25e5b65e499a461962fb5724bd260d31 not found: ID does not exist" containerID="9cca8ec01b4a87d8c5b49e33da4ee2ba25e5b65e499a461962fb5724bd260d31" Mar 10 22:13:02 crc kubenswrapper[4919]: I0310 22:13:02.406567 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9cca8ec01b4a87d8c5b49e33da4ee2ba25e5b65e499a461962fb5724bd260d31"} err="failed to get container status \"9cca8ec01b4a87d8c5b49e33da4ee2ba25e5b65e499a461962fb5724bd260d31\": rpc error: code = NotFound desc = could not find container \"9cca8ec01b4a87d8c5b49e33da4ee2ba25e5b65e499a461962fb5724bd260d31\": container with ID 
starting with 9cca8ec01b4a87d8c5b49e33da4ee2ba25e5b65e499a461962fb5724bd260d31 not found: ID does not exist" Mar 10 22:13:02 crc kubenswrapper[4919]: I0310 22:13:02.406585 4919 scope.go:117] "RemoveContainer" containerID="334b53fdb906c5c53be43712e4a21afe588a10f2198bd1d297c8e69a8c02cc35" Mar 10 22:13:02 crc kubenswrapper[4919]: I0310 22:13:02.407593 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cz9pl\" (UniqueName: \"kubernetes.io/projected/765c20cf-cede-45c6-867e-c3fa0749238d-kube-api-access-cz9pl\") pod \"nova-cell0-db-create-8585d\" (UID: \"765c20cf-cede-45c6-867e-c3fa0749238d\") " pod="openstack/nova-cell0-db-create-8585d" Mar 10 22:13:02 crc kubenswrapper[4919]: I0310 22:13:02.407668 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vjfw8\" (UniqueName: \"kubernetes.io/projected/7fdec3e1-893d-44ec-bd70-90c66e304ba7-kube-api-access-vjfw8\") pod \"nova-api-cdcf-account-create-update-t7ns5\" (UID: \"7fdec3e1-893d-44ec-bd70-90c66e304ba7\") " pod="openstack/nova-api-cdcf-account-create-update-t7ns5" Mar 10 22:13:02 crc kubenswrapper[4919]: I0310 22:13:02.407719 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7fdec3e1-893d-44ec-bd70-90c66e304ba7-operator-scripts\") pod \"nova-api-cdcf-account-create-update-t7ns5\" (UID: \"7fdec3e1-893d-44ec-bd70-90c66e304ba7\") " pod="openstack/nova-api-cdcf-account-create-update-t7ns5" Mar 10 22:13:02 crc kubenswrapper[4919]: I0310 22:13:02.407745 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/765c20cf-cede-45c6-867e-c3fa0749238d-operator-scripts\") pod \"nova-cell0-db-create-8585d\" (UID: \"765c20cf-cede-45c6-867e-c3fa0749238d\") " pod="openstack/nova-cell0-db-create-8585d" Mar 10 22:13:02 crc 
kubenswrapper[4919]: I0310 22:13:02.409263 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7fdec3e1-893d-44ec-bd70-90c66e304ba7-operator-scripts\") pod \"nova-api-cdcf-account-create-update-t7ns5\" (UID: \"7fdec3e1-893d-44ec-bd70-90c66e304ba7\") " pod="openstack/nova-api-cdcf-account-create-update-t7ns5" Mar 10 22:13:02 crc kubenswrapper[4919]: I0310 22:13:02.409663 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"334b53fdb906c5c53be43712e4a21afe588a10f2198bd1d297c8e69a8c02cc35"} err="failed to get container status \"334b53fdb906c5c53be43712e4a21afe588a10f2198bd1d297c8e69a8c02cc35\": rpc error: code = NotFound desc = could not find container \"334b53fdb906c5c53be43712e4a21afe588a10f2198bd1d297c8e69a8c02cc35\": container with ID starting with 334b53fdb906c5c53be43712e4a21afe588a10f2198bd1d297c8e69a8c02cc35 not found: ID does not exist" Mar 10 22:13:02 crc kubenswrapper[4919]: I0310 22:13:02.409758 4919 scope.go:117] "RemoveContainer" containerID="2fac8c319f0883b4859572a193aa3a3b06b6bb8f0591a4cb05d30377cb529cfc" Mar 10 22:13:02 crc kubenswrapper[4919]: I0310 22:13:02.410025 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2fac8c319f0883b4859572a193aa3a3b06b6bb8f0591a4cb05d30377cb529cfc"} err="failed to get container status \"2fac8c319f0883b4859572a193aa3a3b06b6bb8f0591a4cb05d30377cb529cfc\": rpc error: code = NotFound desc = could not find container \"2fac8c319f0883b4859572a193aa3a3b06b6bb8f0591a4cb05d30377cb529cfc\": container with ID starting with 2fac8c319f0883b4859572a193aa3a3b06b6bb8f0591a4cb05d30377cb529cfc not found: ID does not exist" Mar 10 22:13:02 crc kubenswrapper[4919]: I0310 22:13:02.410136 4919 scope.go:117] "RemoveContainer" containerID="9918333c1d69cb6037fa2cf1fc262d4f4996928a4b5efd219fd87dae6600bf76" Mar 10 22:13:02 crc kubenswrapper[4919]: I0310 22:13:02.410383 
4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9918333c1d69cb6037fa2cf1fc262d4f4996928a4b5efd219fd87dae6600bf76"} err="failed to get container status \"9918333c1d69cb6037fa2cf1fc262d4f4996928a4b5efd219fd87dae6600bf76\": rpc error: code = NotFound desc = could not find container \"9918333c1d69cb6037fa2cf1fc262d4f4996928a4b5efd219fd87dae6600bf76\": container with ID starting with 9918333c1d69cb6037fa2cf1fc262d4f4996928a4b5efd219fd87dae6600bf76 not found: ID does not exist" Mar 10 22:13:02 crc kubenswrapper[4919]: I0310 22:13:02.410507 4919 scope.go:117] "RemoveContainer" containerID="9cca8ec01b4a87d8c5b49e33da4ee2ba25e5b65e499a461962fb5724bd260d31" Mar 10 22:13:02 crc kubenswrapper[4919]: I0310 22:13:02.410774 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9cca8ec01b4a87d8c5b49e33da4ee2ba25e5b65e499a461962fb5724bd260d31"} err="failed to get container status \"9cca8ec01b4a87d8c5b49e33da4ee2ba25e5b65e499a461962fb5724bd260d31\": rpc error: code = NotFound desc = could not find container \"9cca8ec01b4a87d8c5b49e33da4ee2ba25e5b65e499a461962fb5724bd260d31\": container with ID starting with 9cca8ec01b4a87d8c5b49e33da4ee2ba25e5b65e499a461962fb5724bd260d31 not found: ID does not exist" Mar 10 22:13:02 crc kubenswrapper[4919]: I0310 22:13:02.419251 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-nrfll"] Mar 10 22:13:02 crc kubenswrapper[4919]: I0310 22:13:02.420708 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-nrfll" Mar 10 22:13:02 crc kubenswrapper[4919]: I0310 22:13:02.464731 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-dff6-account-create-update-mnrpf"] Mar 10 22:13:02 crc kubenswrapper[4919]: I0310 22:13:02.465185 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-pp7db" Mar 10 22:13:02 crc kubenswrapper[4919]: I0310 22:13:02.465770 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjfw8\" (UniqueName: \"kubernetes.io/projected/7fdec3e1-893d-44ec-bd70-90c66e304ba7-kube-api-access-vjfw8\") pod \"nova-api-cdcf-account-create-update-t7ns5\" (UID: \"7fdec3e1-893d-44ec-bd70-90c66e304ba7\") " pod="openstack/nova-api-cdcf-account-create-update-t7ns5" Mar 10 22:13:02 crc kubenswrapper[4919]: I0310 22:13:02.466868 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-dff6-account-create-update-mnrpf" Mar 10 22:13:02 crc kubenswrapper[4919]: I0310 22:13:02.471736 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Mar 10 22:13:02 crc kubenswrapper[4919]: I0310 22:13:02.495140 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-nrfll"] Mar 10 22:13:02 crc kubenswrapper[4919]: I0310 22:13:02.510794 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b6dd5ef-e658-47d0-8131-a1718fa0dedb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4b6dd5ef-e658-47d0-8131-a1718fa0dedb\") " pod="openstack/ceilometer-0" Mar 10 22:13:02 crc kubenswrapper[4919]: I0310 22:13:02.512405 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w82sf\" (UniqueName: \"kubernetes.io/projected/4b6dd5ef-e658-47d0-8131-a1718fa0dedb-kube-api-access-w82sf\") pod \"ceilometer-0\" (UID: \"4b6dd5ef-e658-47d0-8131-a1718fa0dedb\") " pod="openstack/ceilometer-0" Mar 10 22:13:02 crc kubenswrapper[4919]: I0310 22:13:02.512568 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cz9pl\" (UniqueName: 
\"kubernetes.io/projected/765c20cf-cede-45c6-867e-c3fa0749238d-kube-api-access-cz9pl\") pod \"nova-cell0-db-create-8585d\" (UID: \"765c20cf-cede-45c6-867e-c3fa0749238d\") " pod="openstack/nova-cell0-db-create-8585d" Mar 10 22:13:02 crc kubenswrapper[4919]: I0310 22:13:02.512716 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b6dd5ef-e658-47d0-8131-a1718fa0dedb-config-data\") pod \"ceilometer-0\" (UID: \"4b6dd5ef-e658-47d0-8131-a1718fa0dedb\") " pod="openstack/ceilometer-0" Mar 10 22:13:02 crc kubenswrapper[4919]: I0310 22:13:02.512983 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4b6dd5ef-e658-47d0-8131-a1718fa0dedb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4b6dd5ef-e658-47d0-8131-a1718fa0dedb\") " pod="openstack/ceilometer-0" Mar 10 22:13:02 crc kubenswrapper[4919]: I0310 22:13:02.513221 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4b6dd5ef-e658-47d0-8131-a1718fa0dedb-run-httpd\") pod \"ceilometer-0\" (UID: \"4b6dd5ef-e658-47d0-8131-a1718fa0dedb\") " pod="openstack/ceilometer-0" Mar 10 22:13:02 crc kubenswrapper[4919]: I0310 22:13:02.513569 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4b6dd5ef-e658-47d0-8131-a1718fa0dedb-log-httpd\") pod \"ceilometer-0\" (UID: \"4b6dd5ef-e658-47d0-8131-a1718fa0dedb\") " pod="openstack/ceilometer-0" Mar 10 22:13:02 crc kubenswrapper[4919]: I0310 22:13:02.514170 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b6dd5ef-e658-47d0-8131-a1718fa0dedb-scripts\") pod \"ceilometer-0\" (UID: 
\"4b6dd5ef-e658-47d0-8131-a1718fa0dedb\") " pod="openstack/ceilometer-0" Mar 10 22:13:02 crc kubenswrapper[4919]: I0310 22:13:02.514286 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/765c20cf-cede-45c6-867e-c3fa0749238d-operator-scripts\") pod \"nova-cell0-db-create-8585d\" (UID: \"765c20cf-cede-45c6-867e-c3fa0749238d\") " pod="openstack/nova-cell0-db-create-8585d" Mar 10 22:13:02 crc kubenswrapper[4919]: I0310 22:13:02.515190 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/765c20cf-cede-45c6-867e-c3fa0749238d-operator-scripts\") pod \"nova-cell0-db-create-8585d\" (UID: \"765c20cf-cede-45c6-867e-c3fa0749238d\") " pod="openstack/nova-cell0-db-create-8585d" Mar 10 22:13:02 crc kubenswrapper[4919]: I0310 22:13:02.518493 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-dff6-account-create-update-mnrpf"] Mar 10 22:13:02 crc kubenswrapper[4919]: I0310 22:13:02.536329 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cz9pl\" (UniqueName: \"kubernetes.io/projected/765c20cf-cede-45c6-867e-c3fa0749238d-kube-api-access-cz9pl\") pod \"nova-cell0-db-create-8585d\" (UID: \"765c20cf-cede-45c6-867e-c3fa0749238d\") " pod="openstack/nova-cell0-db-create-8585d" Mar 10 22:13:02 crc kubenswrapper[4919]: I0310 22:13:02.553445 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-8585d" Mar 10 22:13:02 crc kubenswrapper[4919]: I0310 22:13:02.608027 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-cdcf-account-create-update-t7ns5" Mar 10 22:13:02 crc kubenswrapper[4919]: I0310 22:13:02.616455 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b6dd5ef-e658-47d0-8131-a1718fa0dedb-config-data\") pod \"ceilometer-0\" (UID: \"4b6dd5ef-e658-47d0-8131-a1718fa0dedb\") " pod="openstack/ceilometer-0" Mar 10 22:13:02 crc kubenswrapper[4919]: I0310 22:13:02.616538 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4b6dd5ef-e658-47d0-8131-a1718fa0dedb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4b6dd5ef-e658-47d0-8131-a1718fa0dedb\") " pod="openstack/ceilometer-0" Mar 10 22:13:02 crc kubenswrapper[4919]: I0310 22:13:02.616577 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4b6dd5ef-e658-47d0-8131-a1718fa0dedb-run-httpd\") pod \"ceilometer-0\" (UID: \"4b6dd5ef-e658-47d0-8131-a1718fa0dedb\") " pod="openstack/ceilometer-0" Mar 10 22:13:02 crc kubenswrapper[4919]: I0310 22:13:02.616606 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/48251540-9da9-4f40-b01f-27188fe69056-operator-scripts\") pod \"nova-cell0-dff6-account-create-update-mnrpf\" (UID: \"48251540-9da9-4f40-b01f-27188fe69056\") " pod="openstack/nova-cell0-dff6-account-create-update-mnrpf" Mar 10 22:13:02 crc kubenswrapper[4919]: I0310 22:13:02.616648 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4b6dd5ef-e658-47d0-8131-a1718fa0dedb-log-httpd\") pod \"ceilometer-0\" (UID: \"4b6dd5ef-e658-47d0-8131-a1718fa0dedb\") " pod="openstack/ceilometer-0" Mar 10 22:13:02 crc kubenswrapper[4919]: I0310 22:13:02.616674 4919 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b6dd5ef-e658-47d0-8131-a1718fa0dedb-scripts\") pod \"ceilometer-0\" (UID: \"4b6dd5ef-e658-47d0-8131-a1718fa0dedb\") " pod="openstack/ceilometer-0" Mar 10 22:13:02 crc kubenswrapper[4919]: I0310 22:13:02.616719 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b6dd5ef-e658-47d0-8131-a1718fa0dedb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4b6dd5ef-e658-47d0-8131-a1718fa0dedb\") " pod="openstack/ceilometer-0" Mar 10 22:13:02 crc kubenswrapper[4919]: I0310 22:13:02.616737 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7xfml\" (UniqueName: \"kubernetes.io/projected/4512eb0d-2445-4ab8-833d-80f0500243b6-kube-api-access-7xfml\") pod \"nova-cell1-db-create-nrfll\" (UID: \"4512eb0d-2445-4ab8-833d-80f0500243b6\") " pod="openstack/nova-cell1-db-create-nrfll" Mar 10 22:13:02 crc kubenswrapper[4919]: I0310 22:13:02.616767 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w82sf\" (UniqueName: \"kubernetes.io/projected/4b6dd5ef-e658-47d0-8131-a1718fa0dedb-kube-api-access-w82sf\") pod \"ceilometer-0\" (UID: \"4b6dd5ef-e658-47d0-8131-a1718fa0dedb\") " pod="openstack/ceilometer-0" Mar 10 22:13:02 crc kubenswrapper[4919]: I0310 22:13:02.616786 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8w2b\" (UniqueName: \"kubernetes.io/projected/48251540-9da9-4f40-b01f-27188fe69056-kube-api-access-q8w2b\") pod \"nova-cell0-dff6-account-create-update-mnrpf\" (UID: \"48251540-9da9-4f40-b01f-27188fe69056\") " pod="openstack/nova-cell0-dff6-account-create-update-mnrpf" Mar 10 22:13:02 crc kubenswrapper[4919]: I0310 22:13:02.616804 4919 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4512eb0d-2445-4ab8-833d-80f0500243b6-operator-scripts\") pod \"nova-cell1-db-create-nrfll\" (UID: \"4512eb0d-2445-4ab8-833d-80f0500243b6\") " pod="openstack/nova-cell1-db-create-nrfll" Mar 10 22:13:02 crc kubenswrapper[4919]: I0310 22:13:02.618430 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4b6dd5ef-e658-47d0-8131-a1718fa0dedb-run-httpd\") pod \"ceilometer-0\" (UID: \"4b6dd5ef-e658-47d0-8131-a1718fa0dedb\") " pod="openstack/ceilometer-0" Mar 10 22:13:02 crc kubenswrapper[4919]: I0310 22:13:02.618787 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4b6dd5ef-e658-47d0-8131-a1718fa0dedb-log-httpd\") pod \"ceilometer-0\" (UID: \"4b6dd5ef-e658-47d0-8131-a1718fa0dedb\") " pod="openstack/ceilometer-0" Mar 10 22:13:02 crc kubenswrapper[4919]: I0310 22:13:02.629025 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4b6dd5ef-e658-47d0-8131-a1718fa0dedb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4b6dd5ef-e658-47d0-8131-a1718fa0dedb\") " pod="openstack/ceilometer-0" Mar 10 22:13:02 crc kubenswrapper[4919]: I0310 22:13:02.631584 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b6dd5ef-e658-47d0-8131-a1718fa0dedb-scripts\") pod \"ceilometer-0\" (UID: \"4b6dd5ef-e658-47d0-8131-a1718fa0dedb\") " pod="openstack/ceilometer-0" Mar 10 22:13:02 crc kubenswrapper[4919]: I0310 22:13:02.636125 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b6dd5ef-e658-47d0-8131-a1718fa0dedb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4b6dd5ef-e658-47d0-8131-a1718fa0dedb\") 
" pod="openstack/ceilometer-0" Mar 10 22:13:02 crc kubenswrapper[4919]: I0310 22:13:02.637051 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b6dd5ef-e658-47d0-8131-a1718fa0dedb-config-data\") pod \"ceilometer-0\" (UID: \"4b6dd5ef-e658-47d0-8131-a1718fa0dedb\") " pod="openstack/ceilometer-0" Mar 10 22:13:02 crc kubenswrapper[4919]: I0310 22:13:02.661176 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w82sf\" (UniqueName: \"kubernetes.io/projected/4b6dd5ef-e658-47d0-8131-a1718fa0dedb-kube-api-access-w82sf\") pod \"ceilometer-0\" (UID: \"4b6dd5ef-e658-47d0-8131-a1718fa0dedb\") " pod="openstack/ceilometer-0" Mar 10 22:13:02 crc kubenswrapper[4919]: I0310 22:13:02.661587 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 10 22:13:02 crc kubenswrapper[4919]: I0310 22:13:02.663909 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-ea40-account-create-update-qnmrj"] Mar 10 22:13:02 crc kubenswrapper[4919]: I0310 22:13:02.665132 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-ea40-account-create-update-qnmrj" Mar 10 22:13:02 crc kubenswrapper[4919]: I0310 22:13:02.691877 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Mar 10 22:13:02 crc kubenswrapper[4919]: I0310 22:13:02.717807 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7xfml\" (UniqueName: \"kubernetes.io/projected/4512eb0d-2445-4ab8-833d-80f0500243b6-kube-api-access-7xfml\") pod \"nova-cell1-db-create-nrfll\" (UID: \"4512eb0d-2445-4ab8-833d-80f0500243b6\") " pod="openstack/nova-cell1-db-create-nrfll" Mar 10 22:13:02 crc kubenswrapper[4919]: I0310 22:13:02.719070 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4512eb0d-2445-4ab8-833d-80f0500243b6-operator-scripts\") pod \"nova-cell1-db-create-nrfll\" (UID: \"4512eb0d-2445-4ab8-833d-80f0500243b6\") " pod="openstack/nova-cell1-db-create-nrfll" Mar 10 22:13:02 crc kubenswrapper[4919]: I0310 22:13:02.719161 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q8w2b\" (UniqueName: \"kubernetes.io/projected/48251540-9da9-4f40-b01f-27188fe69056-kube-api-access-q8w2b\") pod \"nova-cell0-dff6-account-create-update-mnrpf\" (UID: \"48251540-9da9-4f40-b01f-27188fe69056\") " pod="openstack/nova-cell0-dff6-account-create-update-mnrpf" Mar 10 22:13:02 crc kubenswrapper[4919]: I0310 22:13:02.719312 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/48251540-9da9-4f40-b01f-27188fe69056-operator-scripts\") pod \"nova-cell0-dff6-account-create-update-mnrpf\" (UID: \"48251540-9da9-4f40-b01f-27188fe69056\") " pod="openstack/nova-cell0-dff6-account-create-update-mnrpf" Mar 10 22:13:02 crc kubenswrapper[4919]: I0310 22:13:02.720015 4919 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/48251540-9da9-4f40-b01f-27188fe69056-operator-scripts\") pod \"nova-cell0-dff6-account-create-update-mnrpf\" (UID: \"48251540-9da9-4f40-b01f-27188fe69056\") " pod="openstack/nova-cell0-dff6-account-create-update-mnrpf" Mar 10 22:13:02 crc kubenswrapper[4919]: I0310 22:13:02.720561 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4512eb0d-2445-4ab8-833d-80f0500243b6-operator-scripts\") pod \"nova-cell1-db-create-nrfll\" (UID: \"4512eb0d-2445-4ab8-833d-80f0500243b6\") " pod="openstack/nova-cell1-db-create-nrfll" Mar 10 22:13:02 crc kubenswrapper[4919]: I0310 22:13:02.727454 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-ea40-account-create-update-qnmrj"] Mar 10 22:13:02 crc kubenswrapper[4919]: I0310 22:13:02.742068 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7xfml\" (UniqueName: \"kubernetes.io/projected/4512eb0d-2445-4ab8-833d-80f0500243b6-kube-api-access-7xfml\") pod \"nova-cell1-db-create-nrfll\" (UID: \"4512eb0d-2445-4ab8-833d-80f0500243b6\") " pod="openstack/nova-cell1-db-create-nrfll" Mar 10 22:13:02 crc kubenswrapper[4919]: I0310 22:13:02.756826 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-nrfll" Mar 10 22:13:02 crc kubenswrapper[4919]: I0310 22:13:02.780568 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q8w2b\" (UniqueName: \"kubernetes.io/projected/48251540-9da9-4f40-b01f-27188fe69056-kube-api-access-q8w2b\") pod \"nova-cell0-dff6-account-create-update-mnrpf\" (UID: \"48251540-9da9-4f40-b01f-27188fe69056\") " pod="openstack/nova-cell0-dff6-account-create-update-mnrpf" Mar 10 22:13:02 crc kubenswrapper[4919]: I0310 22:13:02.819762 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-dff6-account-create-update-mnrpf" Mar 10 22:13:02 crc kubenswrapper[4919]: I0310 22:13:02.827556 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8f16822b-b7de-48ca-8d05-938c50f0837d-operator-scripts\") pod \"nova-cell1-ea40-account-create-update-qnmrj\" (UID: \"8f16822b-b7de-48ca-8d05-938c50f0837d\") " pod="openstack/nova-cell1-ea40-account-create-update-qnmrj" Mar 10 22:13:02 crc kubenswrapper[4919]: I0310 22:13:02.827683 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r6jnf\" (UniqueName: \"kubernetes.io/projected/8f16822b-b7de-48ca-8d05-938c50f0837d-kube-api-access-r6jnf\") pod \"nova-cell1-ea40-account-create-update-qnmrj\" (UID: \"8f16822b-b7de-48ca-8d05-938c50f0837d\") " pod="openstack/nova-cell1-ea40-account-create-update-qnmrj" Mar 10 22:13:02 crc kubenswrapper[4919]: I0310 22:13:02.951408 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8f16822b-b7de-48ca-8d05-938c50f0837d-operator-scripts\") pod \"nova-cell1-ea40-account-create-update-qnmrj\" (UID: \"8f16822b-b7de-48ca-8d05-938c50f0837d\") " pod="openstack/nova-cell1-ea40-account-create-update-qnmrj" Mar 10 22:13:02 crc kubenswrapper[4919]: I0310 22:13:02.951557 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r6jnf\" (UniqueName: \"kubernetes.io/projected/8f16822b-b7de-48ca-8d05-938c50f0837d-kube-api-access-r6jnf\") pod \"nova-cell1-ea40-account-create-update-qnmrj\" (UID: \"8f16822b-b7de-48ca-8d05-938c50f0837d\") " pod="openstack/nova-cell1-ea40-account-create-update-qnmrj" Mar 10 22:13:02 crc kubenswrapper[4919]: I0310 22:13:02.953050 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/8f16822b-b7de-48ca-8d05-938c50f0837d-operator-scripts\") pod \"nova-cell1-ea40-account-create-update-qnmrj\" (UID: \"8f16822b-b7de-48ca-8d05-938c50f0837d\") " pod="openstack/nova-cell1-ea40-account-create-update-qnmrj" Mar 10 22:13:02 crc kubenswrapper[4919]: I0310 22:13:02.984215 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r6jnf\" (UniqueName: \"kubernetes.io/projected/8f16822b-b7de-48ca-8d05-938c50f0837d-kube-api-access-r6jnf\") pod \"nova-cell1-ea40-account-create-update-qnmrj\" (UID: \"8f16822b-b7de-48ca-8d05-938c50f0837d\") " pod="openstack/nova-cell1-ea40-account-create-update-qnmrj" Mar 10 22:13:03 crc kubenswrapper[4919]: I0310 22:13:03.084113 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-pp7db"] Mar 10 22:13:03 crc kubenswrapper[4919]: I0310 22:13:03.155777 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-ea40-account-create-update-qnmrj" Mar 10 22:13:03 crc kubenswrapper[4919]: I0310 22:13:03.228650 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-pp7db" event={"ID":"d891cb6e-7d23-40d0-9fd4-28ab980f207c","Type":"ContainerStarted","Data":"f90e9acfcac50fbef71b88e77912162f6a42e653bbf148917e911a9b3de73595"} Mar 10 22:13:03 crc kubenswrapper[4919]: I0310 22:13:03.237208 4919 generic.go:334] "Generic (PLEG): container finished" podID="3d0533ad-16e6-40b1-be09-46a0d9d9f342" containerID="085c0960e485c0602523fb339f7fbfe3db7d3b23dd248dd8b169e37d44f16776" exitCode=0 Mar 10 22:13:03 crc kubenswrapper[4919]: I0310 22:13:03.237291 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"3d0533ad-16e6-40b1-be09-46a0d9d9f342","Type":"ContainerDied","Data":"085c0960e485c0602523fb339f7fbfe3db7d3b23dd248dd8b169e37d44f16776"} Mar 10 22:13:03 crc kubenswrapper[4919]: I0310 22:13:03.240212 4919 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 10 22:13:03 crc kubenswrapper[4919]: I0310 22:13:03.256920 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3d0533ad-16e6-40b1-be09-46a0d9d9f342-etc-machine-id\") pod \"3d0533ad-16e6-40b1-be09-46a0d9d9f342\" (UID: \"3d0533ad-16e6-40b1-be09-46a0d9d9f342\") " Mar 10 22:13:03 crc kubenswrapper[4919]: I0310 22:13:03.257585 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3d0533ad-16e6-40b1-be09-46a0d9d9f342-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "3d0533ad-16e6-40b1-be09-46a0d9d9f342" (UID: "3d0533ad-16e6-40b1-be09-46a0d9d9f342"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 22:13:03 crc kubenswrapper[4919]: I0310 22:13:03.359901 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d0533ad-16e6-40b1-be09-46a0d9d9f342-combined-ca-bundle\") pod \"3d0533ad-16e6-40b1-be09-46a0d9d9f342\" (UID: \"3d0533ad-16e6-40b1-be09-46a0d9d9f342\") " Mar 10 22:13:03 crc kubenswrapper[4919]: I0310 22:13:03.360001 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3d0533ad-16e6-40b1-be09-46a0d9d9f342-config-data-custom\") pod \"3d0533ad-16e6-40b1-be09-46a0d9d9f342\" (UID: \"3d0533ad-16e6-40b1-be09-46a0d9d9f342\") " Mar 10 22:13:03 crc kubenswrapper[4919]: I0310 22:13:03.360066 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mwnfw\" (UniqueName: \"kubernetes.io/projected/3d0533ad-16e6-40b1-be09-46a0d9d9f342-kube-api-access-mwnfw\") pod \"3d0533ad-16e6-40b1-be09-46a0d9d9f342\" (UID: \"3d0533ad-16e6-40b1-be09-46a0d9d9f342\") " Mar 10 22:13:03 crc 
kubenswrapper[4919]: I0310 22:13:03.360100 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d0533ad-16e6-40b1-be09-46a0d9d9f342-scripts\") pod \"3d0533ad-16e6-40b1-be09-46a0d9d9f342\" (UID: \"3d0533ad-16e6-40b1-be09-46a0d9d9f342\") " Mar 10 22:13:03 crc kubenswrapper[4919]: I0310 22:13:03.360176 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d0533ad-16e6-40b1-be09-46a0d9d9f342-config-data\") pod \"3d0533ad-16e6-40b1-be09-46a0d9d9f342\" (UID: \"3d0533ad-16e6-40b1-be09-46a0d9d9f342\") " Mar 10 22:13:03 crc kubenswrapper[4919]: I0310 22:13:03.360749 4919 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3d0533ad-16e6-40b1-be09-46a0d9d9f342-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 10 22:13:03 crc kubenswrapper[4919]: I0310 22:13:03.369507 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d0533ad-16e6-40b1-be09-46a0d9d9f342-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "3d0533ad-16e6-40b1-be09-46a0d9d9f342" (UID: "3d0533ad-16e6-40b1-be09-46a0d9d9f342"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:13:03 crc kubenswrapper[4919]: I0310 22:13:03.375243 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d0533ad-16e6-40b1-be09-46a0d9d9f342-scripts" (OuterVolumeSpecName: "scripts") pod "3d0533ad-16e6-40b1-be09-46a0d9d9f342" (UID: "3d0533ad-16e6-40b1-be09-46a0d9d9f342"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:13:03 crc kubenswrapper[4919]: I0310 22:13:03.383221 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d0533ad-16e6-40b1-be09-46a0d9d9f342-kube-api-access-mwnfw" (OuterVolumeSpecName: "kube-api-access-mwnfw") pod "3d0533ad-16e6-40b1-be09-46a0d9d9f342" (UID: "3d0533ad-16e6-40b1-be09-46a0d9d9f342"). InnerVolumeSpecName "kube-api-access-mwnfw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 22:13:03 crc kubenswrapper[4919]: I0310 22:13:03.465968 4919 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3d0533ad-16e6-40b1-be09-46a0d9d9f342-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 10 22:13:03 crc kubenswrapper[4919]: I0310 22:13:03.466005 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mwnfw\" (UniqueName: \"kubernetes.io/projected/3d0533ad-16e6-40b1-be09-46a0d9d9f342-kube-api-access-mwnfw\") on node \"crc\" DevicePath \"\"" Mar 10 22:13:03 crc kubenswrapper[4919]: I0310 22:13:03.466016 4919 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d0533ad-16e6-40b1-be09-46a0d9d9f342-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 22:13:03 crc kubenswrapper[4919]: I0310 22:13:03.472578 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d0533ad-16e6-40b1-be09-46a0d9d9f342-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3d0533ad-16e6-40b1-be09-46a0d9d9f342" (UID: "3d0533ad-16e6-40b1-be09-46a0d9d9f342"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:13:03 crc kubenswrapper[4919]: I0310 22:13:03.473109 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-8585d"] Mar 10 22:13:03 crc kubenswrapper[4919]: I0310 22:13:03.514260 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="481bd64a-9301-42fa-aa9a-7f6c378917df" path="/var/lib/kubelet/pods/481bd64a-9301-42fa-aa9a-7f6c378917df/volumes" Mar 10 22:13:03 crc kubenswrapper[4919]: I0310 22:13:03.519950 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d0533ad-16e6-40b1-be09-46a0d9d9f342-config-data" (OuterVolumeSpecName: "config-data") pod "3d0533ad-16e6-40b1-be09-46a0d9d9f342" (UID: "3d0533ad-16e6-40b1-be09-46a0d9d9f342"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:13:03 crc kubenswrapper[4919]: I0310 22:13:03.568970 4919 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d0533ad-16e6-40b1-be09-46a0d9d9f342-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 22:13:03 crc kubenswrapper[4919]: I0310 22:13:03.568992 4919 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d0533ad-16e6-40b1-be09-46a0d9d9f342-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 22:13:03 crc kubenswrapper[4919]: I0310 22:13:03.674210 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-dff6-account-create-update-mnrpf"] Mar 10 22:13:03 crc kubenswrapper[4919]: W0310 22:13:03.676883 4919 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4b6dd5ef_e658_47d0_8131_a1718fa0dedb.slice/crio-6f141623d55e74a510eba2c40be25124da4347cbcb74b44382a4fd50966c87c5 WatchSource:0}: Error finding container 
6f141623d55e74a510eba2c40be25124da4347cbcb74b44382a4fd50966c87c5: Status 404 returned error can't find the container with id 6f141623d55e74a510eba2c40be25124da4347cbcb74b44382a4fd50966c87c5 Mar 10 22:13:03 crc kubenswrapper[4919]: I0310 22:13:03.684202 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 10 22:13:03 crc kubenswrapper[4919]: W0310 22:13:03.889925 4919 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4512eb0d_2445_4ab8_833d_80f0500243b6.slice/crio-dd194983feb25af217bb773a75a88d16319463c9416b9736642ebc9aa712e876 WatchSource:0}: Error finding container dd194983feb25af217bb773a75a88d16319463c9416b9736642ebc9aa712e876: Status 404 returned error can't find the container with id dd194983feb25af217bb773a75a88d16319463c9416b9736642ebc9aa712e876 Mar 10 22:13:03 crc kubenswrapper[4919]: I0310 22:13:03.890212 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-nrfll"] Mar 10 22:13:03 crc kubenswrapper[4919]: I0310 22:13:03.930751 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-cdcf-account-create-update-t7ns5"] Mar 10 22:13:04 crc kubenswrapper[4919]: I0310 22:13:04.035683 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-ea40-account-create-update-qnmrj"] Mar 10 22:13:04 crc kubenswrapper[4919]: I0310 22:13:04.275666 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"3d0533ad-16e6-40b1-be09-46a0d9d9f342","Type":"ContainerDied","Data":"f82fb3596f35c7410c78a07021a98778bf2376d4d93dc7dacecc11fc54734dbb"} Mar 10 22:13:04 crc kubenswrapper[4919]: I0310 22:13:04.275876 4919 scope.go:117] "RemoveContainer" containerID="b6c27257dbc2731eaf36852f0613861b336ec7cc0b8cbeeda0b7a99e38bbba56" Mar 10 22:13:04 crc kubenswrapper[4919]: I0310 22:13:04.275722 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 10 22:13:04 crc kubenswrapper[4919]: I0310 22:13:04.278290 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-dff6-account-create-update-mnrpf" event={"ID":"48251540-9da9-4f40-b01f-27188fe69056","Type":"ContainerStarted","Data":"e6abbc8c723ca7696038307a22953b5386b9ff8c8a9d68a9824b94a0392c584f"} Mar 10 22:13:04 crc kubenswrapper[4919]: I0310 22:13:04.278310 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-dff6-account-create-update-mnrpf" event={"ID":"48251540-9da9-4f40-b01f-27188fe69056","Type":"ContainerStarted","Data":"deac9cdb46a57be808d91a281395b66359c8ea56dd1da608b13f6113c7a50f1b"} Mar 10 22:13:04 crc kubenswrapper[4919]: I0310 22:13:04.287929 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-nrfll" event={"ID":"4512eb0d-2445-4ab8-833d-80f0500243b6","Type":"ContainerStarted","Data":"ed00eadc0e3031dc348aef4bf08402de8320f03c5ea490e9824db3c427f042fe"} Mar 10 22:13:04 crc kubenswrapper[4919]: I0310 22:13:04.287976 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-nrfll" event={"ID":"4512eb0d-2445-4ab8-833d-80f0500243b6","Type":"ContainerStarted","Data":"dd194983feb25af217bb773a75a88d16319463c9416b9736642ebc9aa712e876"} Mar 10 22:13:04 crc kubenswrapper[4919]: I0310 22:13:04.302398 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-dff6-account-create-update-mnrpf" podStartSLOduration=2.302361393 podStartE2EDuration="2.302361393s" podCreationTimestamp="2026-03-10 22:13:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 22:13:04.295883457 +0000 UTC m=+1371.537764065" watchObservedRunningTime="2026-03-10 22:13:04.302361393 +0000 UTC m=+1371.544242001" Mar 10 22:13:04 crc kubenswrapper[4919]: I0310 22:13:04.304289 4919 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4b6dd5ef-e658-47d0-8131-a1718fa0dedb","Type":"ContainerStarted","Data":"6f141623d55e74a510eba2c40be25124da4347cbcb74b44382a4fd50966c87c5"} Mar 10 22:13:04 crc kubenswrapper[4919]: I0310 22:13:04.314746 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-cdcf-account-create-update-t7ns5" event={"ID":"7fdec3e1-893d-44ec-bd70-90c66e304ba7","Type":"ContainerStarted","Data":"144ff57e69648189dc612fb89f4a4cc6778d18f6b1badec4db262b0b5f0688da"} Mar 10 22:13:04 crc kubenswrapper[4919]: I0310 22:13:04.314799 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-cdcf-account-create-update-t7ns5" event={"ID":"7fdec3e1-893d-44ec-bd70-90c66e304ba7","Type":"ContainerStarted","Data":"62052b087aaec42d03c93a0a8abb43a06d0764d0c4f04295ea438abdd9aa213a"} Mar 10 22:13:04 crc kubenswrapper[4919]: I0310 22:13:04.339360 4919 generic.go:334] "Generic (PLEG): container finished" podID="765c20cf-cede-45c6-867e-c3fa0749238d" containerID="28558d9c5a4011f4f1726c6720180e234d9f532065398309212635d2b44ca8dc" exitCode=0 Mar 10 22:13:04 crc kubenswrapper[4919]: I0310 22:13:04.339458 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-8585d" event={"ID":"765c20cf-cede-45c6-867e-c3fa0749238d","Type":"ContainerDied","Data":"28558d9c5a4011f4f1726c6720180e234d9f532065398309212635d2b44ca8dc"} Mar 10 22:13:04 crc kubenswrapper[4919]: I0310 22:13:04.339483 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-8585d" event={"ID":"765c20cf-cede-45c6-867e-c3fa0749238d","Type":"ContainerStarted","Data":"0a4d9e29e9cf3bd6f35029166c77b4b6b86355055fbee247011d3259227ba528"} Mar 10 22:13:04 crc kubenswrapper[4919]: I0310 22:13:04.368685 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 10 22:13:04 crc kubenswrapper[4919]: I0310 22:13:04.370541 4919 
scope.go:117] "RemoveContainer" containerID="085c0960e485c0602523fb339f7fbfe3db7d3b23dd248dd8b169e37d44f16776" Mar 10 22:13:04 crc kubenswrapper[4919]: I0310 22:13:04.376839 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-ea40-account-create-update-qnmrj" event={"ID":"8f16822b-b7de-48ca-8d05-938c50f0837d","Type":"ContainerStarted","Data":"d6a06432e7a87025091d7e31257f5fa7aa4053b2b47174460ef24368894a7ad6"} Mar 10 22:13:04 crc kubenswrapper[4919]: I0310 22:13:04.376905 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-ea40-account-create-update-qnmrj" event={"ID":"8f16822b-b7de-48ca-8d05-938c50f0837d","Type":"ContainerStarted","Data":"e13f0a5dc3b3de220d83ee5575ba79759dbdc553fd0406e54fb554d368f581c7"} Mar 10 22:13:04 crc kubenswrapper[4919]: I0310 22:13:04.386352 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-pp7db" event={"ID":"d891cb6e-7d23-40d0-9fd4-28ab980f207c","Type":"ContainerDied","Data":"942d637771ae9486432f6cce158ed7da90a899d905ff8b6f64d0376dfe1fb5a3"} Mar 10 22:13:04 crc kubenswrapper[4919]: I0310 22:13:04.383522 4919 generic.go:334] "Generic (PLEG): container finished" podID="d891cb6e-7d23-40d0-9fd4-28ab980f207c" containerID="942d637771ae9486432f6cce158ed7da90a899d905ff8b6f64d0376dfe1fb5a3" exitCode=0 Mar 10 22:13:04 crc kubenswrapper[4919]: I0310 22:13:04.393303 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 10 22:13:04 crc kubenswrapper[4919]: I0310 22:13:04.415247 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Mar 10 22:13:04 crc kubenswrapper[4919]: E0310 22:13:04.415647 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d0533ad-16e6-40b1-be09-46a0d9d9f342" containerName="probe" Mar 10 22:13:04 crc kubenswrapper[4919]: I0310 22:13:04.415664 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d0533ad-16e6-40b1-be09-46a0d9d9f342" 
containerName="probe" Mar 10 22:13:04 crc kubenswrapper[4919]: E0310 22:13:04.415683 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d0533ad-16e6-40b1-be09-46a0d9d9f342" containerName="cinder-scheduler" Mar 10 22:13:04 crc kubenswrapper[4919]: I0310 22:13:04.415689 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d0533ad-16e6-40b1-be09-46a0d9d9f342" containerName="cinder-scheduler" Mar 10 22:13:04 crc kubenswrapper[4919]: I0310 22:13:04.415863 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d0533ad-16e6-40b1-be09-46a0d9d9f342" containerName="probe" Mar 10 22:13:04 crc kubenswrapper[4919]: I0310 22:13:04.415915 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d0533ad-16e6-40b1-be09-46a0d9d9f342" containerName="cinder-scheduler" Mar 10 22:13:04 crc kubenswrapper[4919]: I0310 22:13:04.416916 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 10 22:13:04 crc kubenswrapper[4919]: I0310 22:13:04.423965 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 10 22:13:04 crc kubenswrapper[4919]: I0310 22:13:04.425026 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 10 22:13:04 crc kubenswrapper[4919]: I0310 22:13:04.438840 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-db-create-nrfll" podStartSLOduration=2.43882156 podStartE2EDuration="2.43882156s" podCreationTimestamp="2026-03-10 22:13:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 22:13:04.369721008 +0000 UTC m=+1371.611601616" watchObservedRunningTime="2026-03-10 22:13:04.43882156 +0000 UTC m=+1371.680702168" Mar 10 22:13:04 crc kubenswrapper[4919]: I0310 22:13:04.447111 4919 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openstack/nova-api-cdcf-account-create-update-t7ns5" podStartSLOduration=2.447094554 podStartE2EDuration="2.447094554s" podCreationTimestamp="2026-03-10 22:13:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 22:13:04.413042621 +0000 UTC m=+1371.654923229" watchObservedRunningTime="2026-03-10 22:13:04.447094554 +0000 UTC m=+1371.688975162" Mar 10 22:13:04 crc kubenswrapper[4919]: I0310 22:13:04.487039 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-ea40-account-create-update-qnmrj" podStartSLOduration=2.487020666 podStartE2EDuration="2.487020666s" podCreationTimestamp="2026-03-10 22:13:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 22:13:04.461413931 +0000 UTC m=+1371.703294539" watchObservedRunningTime="2026-03-10 22:13:04.487020666 +0000 UTC m=+1371.728901264" Mar 10 22:13:04 crc kubenswrapper[4919]: I0310 22:13:04.495274 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1f5a3b8-c9ca-403a-aecf-f6fbf286b145-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"b1f5a3b8-c9ca-403a-aecf-f6fbf286b145\") " pod="openstack/cinder-scheduler-0" Mar 10 22:13:04 crc kubenswrapper[4919]: I0310 22:13:04.495382 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b1f5a3b8-c9ca-403a-aecf-f6fbf286b145-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"b1f5a3b8-c9ca-403a-aecf-f6fbf286b145\") " pod="openstack/cinder-scheduler-0" Mar 10 22:13:04 crc kubenswrapper[4919]: I0310 22:13:04.495470 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b1f5a3b8-c9ca-403a-aecf-f6fbf286b145-scripts\") pod \"cinder-scheduler-0\" (UID: \"b1f5a3b8-c9ca-403a-aecf-f6fbf286b145\") " pod="openstack/cinder-scheduler-0" Mar 10 22:13:04 crc kubenswrapper[4919]: I0310 22:13:04.495689 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1f5a3b8-c9ca-403a-aecf-f6fbf286b145-config-data\") pod \"cinder-scheduler-0\" (UID: \"b1f5a3b8-c9ca-403a-aecf-f6fbf286b145\") " pod="openstack/cinder-scheduler-0" Mar 10 22:13:04 crc kubenswrapper[4919]: I0310 22:13:04.495727 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c282s\" (UniqueName: \"kubernetes.io/projected/b1f5a3b8-c9ca-403a-aecf-f6fbf286b145-kube-api-access-c282s\") pod \"cinder-scheduler-0\" (UID: \"b1f5a3b8-c9ca-403a-aecf-f6fbf286b145\") " pod="openstack/cinder-scheduler-0" Mar 10 22:13:04 crc kubenswrapper[4919]: I0310 22:13:04.495767 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b1f5a3b8-c9ca-403a-aecf-f6fbf286b145-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"b1f5a3b8-c9ca-403a-aecf-f6fbf286b145\") " pod="openstack/cinder-scheduler-0" Mar 10 22:13:04 crc kubenswrapper[4919]: I0310 22:13:04.596978 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c282s\" (UniqueName: \"kubernetes.io/projected/b1f5a3b8-c9ca-403a-aecf-f6fbf286b145-kube-api-access-c282s\") pod \"cinder-scheduler-0\" (UID: \"b1f5a3b8-c9ca-403a-aecf-f6fbf286b145\") " pod="openstack/cinder-scheduler-0" Mar 10 22:13:04 crc kubenswrapper[4919]: I0310 22:13:04.597049 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/b1f5a3b8-c9ca-403a-aecf-f6fbf286b145-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"b1f5a3b8-c9ca-403a-aecf-f6fbf286b145\") " pod="openstack/cinder-scheduler-0" Mar 10 22:13:04 crc kubenswrapper[4919]: I0310 22:13:04.597137 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1f5a3b8-c9ca-403a-aecf-f6fbf286b145-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"b1f5a3b8-c9ca-403a-aecf-f6fbf286b145\") " pod="openstack/cinder-scheduler-0" Mar 10 22:13:04 crc kubenswrapper[4919]: I0310 22:13:04.597174 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b1f5a3b8-c9ca-403a-aecf-f6fbf286b145-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"b1f5a3b8-c9ca-403a-aecf-f6fbf286b145\") " pod="openstack/cinder-scheduler-0" Mar 10 22:13:04 crc kubenswrapper[4919]: I0310 22:13:04.597197 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b1f5a3b8-c9ca-403a-aecf-f6fbf286b145-scripts\") pod \"cinder-scheduler-0\" (UID: \"b1f5a3b8-c9ca-403a-aecf-f6fbf286b145\") " pod="openstack/cinder-scheduler-0" Mar 10 22:13:04 crc kubenswrapper[4919]: I0310 22:13:04.597256 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1f5a3b8-c9ca-403a-aecf-f6fbf286b145-config-data\") pod \"cinder-scheduler-0\" (UID: \"b1f5a3b8-c9ca-403a-aecf-f6fbf286b145\") " pod="openstack/cinder-scheduler-0" Mar 10 22:13:04 crc kubenswrapper[4919]: I0310 22:13:04.598053 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b1f5a3b8-c9ca-403a-aecf-f6fbf286b145-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"b1f5a3b8-c9ca-403a-aecf-f6fbf286b145\") " 
pod="openstack/cinder-scheduler-0" Mar 10 22:13:04 crc kubenswrapper[4919]: I0310 22:13:04.603015 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b1f5a3b8-c9ca-403a-aecf-f6fbf286b145-scripts\") pod \"cinder-scheduler-0\" (UID: \"b1f5a3b8-c9ca-403a-aecf-f6fbf286b145\") " pod="openstack/cinder-scheduler-0" Mar 10 22:13:04 crc kubenswrapper[4919]: I0310 22:13:04.603159 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1f5a3b8-c9ca-403a-aecf-f6fbf286b145-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"b1f5a3b8-c9ca-403a-aecf-f6fbf286b145\") " pod="openstack/cinder-scheduler-0" Mar 10 22:13:04 crc kubenswrapper[4919]: I0310 22:13:04.604637 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1f5a3b8-c9ca-403a-aecf-f6fbf286b145-config-data\") pod \"cinder-scheduler-0\" (UID: \"b1f5a3b8-c9ca-403a-aecf-f6fbf286b145\") " pod="openstack/cinder-scheduler-0" Mar 10 22:13:04 crc kubenswrapper[4919]: I0310 22:13:04.606946 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b1f5a3b8-c9ca-403a-aecf-f6fbf286b145-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"b1f5a3b8-c9ca-403a-aecf-f6fbf286b145\") " pod="openstack/cinder-scheduler-0" Mar 10 22:13:04 crc kubenswrapper[4919]: I0310 22:13:04.616900 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c282s\" (UniqueName: \"kubernetes.io/projected/b1f5a3b8-c9ca-403a-aecf-f6fbf286b145-kube-api-access-c282s\") pod \"cinder-scheduler-0\" (UID: \"b1f5a3b8-c9ca-403a-aecf-f6fbf286b145\") " pod="openstack/cinder-scheduler-0" Mar 10 22:13:04 crc kubenswrapper[4919]: I0310 22:13:04.755178 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 10 22:13:05 crc kubenswrapper[4919]: I0310 22:13:05.246314 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 10 22:13:05 crc kubenswrapper[4919]: W0310 22:13:05.255023 4919 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb1f5a3b8_c9ca_403a_aecf_f6fbf286b145.slice/crio-6bf1a770ba875694b09781b4028dc4d2d064955bd543ca5b52b1039d941ab946 WatchSource:0}: Error finding container 6bf1a770ba875694b09781b4028dc4d2d064955bd543ca5b52b1039d941ab946: Status 404 returned error can't find the container with id 6bf1a770ba875694b09781b4028dc4d2d064955bd543ca5b52b1039d941ab946 Mar 10 22:13:05 crc kubenswrapper[4919]: I0310 22:13:05.399012 4919 generic.go:334] "Generic (PLEG): container finished" podID="48251540-9da9-4f40-b01f-27188fe69056" containerID="e6abbc8c723ca7696038307a22953b5386b9ff8c8a9d68a9824b94a0392c584f" exitCode=0 Mar 10 22:13:05 crc kubenswrapper[4919]: I0310 22:13:05.399074 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-dff6-account-create-update-mnrpf" event={"ID":"48251540-9da9-4f40-b01f-27188fe69056","Type":"ContainerDied","Data":"e6abbc8c723ca7696038307a22953b5386b9ff8c8a9d68a9824b94a0392c584f"} Mar 10 22:13:05 crc kubenswrapper[4919]: I0310 22:13:05.401921 4919 generic.go:334] "Generic (PLEG): container finished" podID="4512eb0d-2445-4ab8-833d-80f0500243b6" containerID="ed00eadc0e3031dc348aef4bf08402de8320f03c5ea490e9824db3c427f042fe" exitCode=0 Mar 10 22:13:05 crc kubenswrapper[4919]: I0310 22:13:05.401972 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-nrfll" event={"ID":"4512eb0d-2445-4ab8-833d-80f0500243b6","Type":"ContainerDied","Data":"ed00eadc0e3031dc348aef4bf08402de8320f03c5ea490e9824db3c427f042fe"} Mar 10 22:13:05 crc kubenswrapper[4919]: I0310 22:13:05.403886 4919 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4b6dd5ef-e658-47d0-8131-a1718fa0dedb","Type":"ContainerStarted","Data":"6936cc85189c34bd4bcadbb9d2d09132f0151ea4574cab23fecd0d8c0f6cd767"} Mar 10 22:13:05 crc kubenswrapper[4919]: I0310 22:13:05.403909 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4b6dd5ef-e658-47d0-8131-a1718fa0dedb","Type":"ContainerStarted","Data":"a2b00f335c6daadf19b24b97742cce28ff79e5ad2ab1639d6f0c8fdf580e84e2"} Mar 10 22:13:05 crc kubenswrapper[4919]: I0310 22:13:05.405294 4919 generic.go:334] "Generic (PLEG): container finished" podID="7fdec3e1-893d-44ec-bd70-90c66e304ba7" containerID="144ff57e69648189dc612fb89f4a4cc6778d18f6b1badec4db262b0b5f0688da" exitCode=0 Mar 10 22:13:05 crc kubenswrapper[4919]: I0310 22:13:05.405325 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-cdcf-account-create-update-t7ns5" event={"ID":"7fdec3e1-893d-44ec-bd70-90c66e304ba7","Type":"ContainerDied","Data":"144ff57e69648189dc612fb89f4a4cc6778d18f6b1badec4db262b0b5f0688da"} Mar 10 22:13:05 crc kubenswrapper[4919]: I0310 22:13:05.406856 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"b1f5a3b8-c9ca-403a-aecf-f6fbf286b145","Type":"ContainerStarted","Data":"6bf1a770ba875694b09781b4028dc4d2d064955bd543ca5b52b1039d941ab946"} Mar 10 22:13:05 crc kubenswrapper[4919]: I0310 22:13:05.433278 4919 generic.go:334] "Generic (PLEG): container finished" podID="8f16822b-b7de-48ca-8d05-938c50f0837d" containerID="d6a06432e7a87025091d7e31257f5fa7aa4053b2b47174460ef24368894a7ad6" exitCode=0 Mar 10 22:13:05 crc kubenswrapper[4919]: I0310 22:13:05.433382 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-ea40-account-create-update-qnmrj" event={"ID":"8f16822b-b7de-48ca-8d05-938c50f0837d","Type":"ContainerDied","Data":"d6a06432e7a87025091d7e31257f5fa7aa4053b2b47174460ef24368894a7ad6"} Mar 10 22:13:05 crc 
kubenswrapper[4919]: I0310 22:13:05.495278 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d0533ad-16e6-40b1-be09-46a0d9d9f342" path="/var/lib/kubelet/pods/3d0533ad-16e6-40b1-be09-46a0d9d9f342/volumes" Mar 10 22:13:05 crc kubenswrapper[4919]: I0310 22:13:05.987058 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-8585d" Mar 10 22:13:06 crc kubenswrapper[4919]: I0310 22:13:06.037006 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-pp7db" Mar 10 22:13:06 crc kubenswrapper[4919]: I0310 22:13:06.121384 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cz9pl\" (UniqueName: \"kubernetes.io/projected/765c20cf-cede-45c6-867e-c3fa0749238d-kube-api-access-cz9pl\") pod \"765c20cf-cede-45c6-867e-c3fa0749238d\" (UID: \"765c20cf-cede-45c6-867e-c3fa0749238d\") " Mar 10 22:13:06 crc kubenswrapper[4919]: I0310 22:13:06.121570 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/765c20cf-cede-45c6-867e-c3fa0749238d-operator-scripts\") pod \"765c20cf-cede-45c6-867e-c3fa0749238d\" (UID: \"765c20cf-cede-45c6-867e-c3fa0749238d\") " Mar 10 22:13:06 crc kubenswrapper[4919]: I0310 22:13:06.122608 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/765c20cf-cede-45c6-867e-c3fa0749238d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "765c20cf-cede-45c6-867e-c3fa0749238d" (UID: "765c20cf-cede-45c6-867e-c3fa0749238d"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 22:13:06 crc kubenswrapper[4919]: I0310 22:13:06.125496 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/765c20cf-cede-45c6-867e-c3fa0749238d-kube-api-access-cz9pl" (OuterVolumeSpecName: "kube-api-access-cz9pl") pod "765c20cf-cede-45c6-867e-c3fa0749238d" (UID: "765c20cf-cede-45c6-867e-c3fa0749238d"). InnerVolumeSpecName "kube-api-access-cz9pl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 22:13:06 crc kubenswrapper[4919]: I0310 22:13:06.219741 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Mar 10 22:13:06 crc kubenswrapper[4919]: I0310 22:13:06.225651 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dqlsm\" (UniqueName: \"kubernetes.io/projected/d891cb6e-7d23-40d0-9fd4-28ab980f207c-kube-api-access-dqlsm\") pod \"d891cb6e-7d23-40d0-9fd4-28ab980f207c\" (UID: \"d891cb6e-7d23-40d0-9fd4-28ab980f207c\") " Mar 10 22:13:06 crc kubenswrapper[4919]: I0310 22:13:06.225771 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d891cb6e-7d23-40d0-9fd4-28ab980f207c-operator-scripts\") pod \"d891cb6e-7d23-40d0-9fd4-28ab980f207c\" (UID: \"d891cb6e-7d23-40d0-9fd4-28ab980f207c\") " Mar 10 22:13:06 crc kubenswrapper[4919]: I0310 22:13:06.226301 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d891cb6e-7d23-40d0-9fd4-28ab980f207c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d891cb6e-7d23-40d0-9fd4-28ab980f207c" (UID: "d891cb6e-7d23-40d0-9fd4-28ab980f207c"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 22:13:06 crc kubenswrapper[4919]: I0310 22:13:06.227108 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cz9pl\" (UniqueName: \"kubernetes.io/projected/765c20cf-cede-45c6-867e-c3fa0749238d-kube-api-access-cz9pl\") on node \"crc\" DevicePath \"\"" Mar 10 22:13:06 crc kubenswrapper[4919]: I0310 22:13:06.227135 4919 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/765c20cf-cede-45c6-867e-c3fa0749238d-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 22:13:06 crc kubenswrapper[4919]: I0310 22:13:06.227145 4919 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d891cb6e-7d23-40d0-9fd4-28ab980f207c-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 22:13:06 crc kubenswrapper[4919]: I0310 22:13:06.230177 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d891cb6e-7d23-40d0-9fd4-28ab980f207c-kube-api-access-dqlsm" (OuterVolumeSpecName: "kube-api-access-dqlsm") pod "d891cb6e-7d23-40d0-9fd4-28ab980f207c" (UID: "d891cb6e-7d23-40d0-9fd4-28ab980f207c"). InnerVolumeSpecName "kube-api-access-dqlsm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 22:13:06 crc kubenswrapper[4919]: I0310 22:13:06.327728 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dqlsm\" (UniqueName: \"kubernetes.io/projected/d891cb6e-7d23-40d0-9fd4-28ab980f207c-kube-api-access-dqlsm\") on node \"crc\" DevicePath \"\"" Mar 10 22:13:06 crc kubenswrapper[4919]: I0310 22:13:06.453580 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-8585d" event={"ID":"765c20cf-cede-45c6-867e-c3fa0749238d","Type":"ContainerDied","Data":"0a4d9e29e9cf3bd6f35029166c77b4b6b86355055fbee247011d3259227ba528"} Mar 10 22:13:06 crc kubenswrapper[4919]: I0310 22:13:06.453619 4919 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0a4d9e29e9cf3bd6f35029166c77b4b6b86355055fbee247011d3259227ba528" Mar 10 22:13:06 crc kubenswrapper[4919]: I0310 22:13:06.453672 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-8585d" Mar 10 22:13:06 crc kubenswrapper[4919]: I0310 22:13:06.460688 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"b1f5a3b8-c9ca-403a-aecf-f6fbf286b145","Type":"ContainerStarted","Data":"8788e8f4a8fdff775edb373c25584b90721c4e93529ebfa7f6ee7f0858b36923"} Mar 10 22:13:06 crc kubenswrapper[4919]: I0310 22:13:06.470028 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-pp7db" Mar 10 22:13:06 crc kubenswrapper[4919]: I0310 22:13:06.470267 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-pp7db" event={"ID":"d891cb6e-7d23-40d0-9fd4-28ab980f207c","Type":"ContainerDied","Data":"f90e9acfcac50fbef71b88e77912162f6a42e653bbf148917e911a9b3de73595"} Mar 10 22:13:06 crc kubenswrapper[4919]: I0310 22:13:06.470324 4919 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f90e9acfcac50fbef71b88e77912162f6a42e653bbf148917e911a9b3de73595" Mar 10 22:13:10 crc kubenswrapper[4919]: I0310 22:13:10.211213 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-fbf4c94d9-4mg9b" Mar 10 22:13:10 crc kubenswrapper[4919]: I0310 22:13:10.214842 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-fbf4c94d9-4mg9b" Mar 10 22:13:12 crc kubenswrapper[4919]: I0310 22:13:12.234165 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 10 22:13:12 crc kubenswrapper[4919]: I0310 22:13:12.559483 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-ea40-account-create-update-qnmrj" event={"ID":"8f16822b-b7de-48ca-8d05-938c50f0837d","Type":"ContainerDied","Data":"e13f0a5dc3b3de220d83ee5575ba79759dbdc553fd0406e54fb554d368f581c7"} Mar 10 22:13:12 crc kubenswrapper[4919]: I0310 22:13:12.559762 4919 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e13f0a5dc3b3de220d83ee5575ba79759dbdc553fd0406e54fb554d368f581c7" Mar 10 22:13:12 crc kubenswrapper[4919]: I0310 22:13:12.561732 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-dff6-account-create-update-mnrpf" event={"ID":"48251540-9da9-4f40-b01f-27188fe69056","Type":"ContainerDied","Data":"deac9cdb46a57be808d91a281395b66359c8ea56dd1da608b13f6113c7a50f1b"} Mar 10 22:13:12 crc 
kubenswrapper[4919]: I0310 22:13:12.561756 4919 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="deac9cdb46a57be808d91a281395b66359c8ea56dd1da608b13f6113c7a50f1b" Mar 10 22:13:12 crc kubenswrapper[4919]: I0310 22:13:12.562996 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-nrfll" event={"ID":"4512eb0d-2445-4ab8-833d-80f0500243b6","Type":"ContainerDied","Data":"dd194983feb25af217bb773a75a88d16319463c9416b9736642ebc9aa712e876"} Mar 10 22:13:12 crc kubenswrapper[4919]: I0310 22:13:12.563022 4919 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dd194983feb25af217bb773a75a88d16319463c9416b9736642ebc9aa712e876" Mar 10 22:13:12 crc kubenswrapper[4919]: I0310 22:13:12.564325 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-cdcf-account-create-update-t7ns5" event={"ID":"7fdec3e1-893d-44ec-bd70-90c66e304ba7","Type":"ContainerDied","Data":"62052b087aaec42d03c93a0a8abb43a06d0764d0c4f04295ea438abdd9aa213a"} Mar 10 22:13:12 crc kubenswrapper[4919]: I0310 22:13:12.564347 4919 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="62052b087aaec42d03c93a0a8abb43a06d0764d0c4f04295ea438abdd9aa213a" Mar 10 22:13:12 crc kubenswrapper[4919]: I0310 22:13:12.608157 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-ea40-account-create-update-qnmrj" Mar 10 22:13:12 crc kubenswrapper[4919]: I0310 22:13:12.615004 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-dff6-account-create-update-mnrpf" Mar 10 22:13:12 crc kubenswrapper[4919]: I0310 22:13:12.618425 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-nrfll" Mar 10 22:13:12 crc kubenswrapper[4919]: I0310 22:13:12.632378 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-cdcf-account-create-update-t7ns5" Mar 10 22:13:12 crc kubenswrapper[4919]: I0310 22:13:12.669261 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r6jnf\" (UniqueName: \"kubernetes.io/projected/8f16822b-b7de-48ca-8d05-938c50f0837d-kube-api-access-r6jnf\") pod \"8f16822b-b7de-48ca-8d05-938c50f0837d\" (UID: \"8f16822b-b7de-48ca-8d05-938c50f0837d\") " Mar 10 22:13:12 crc kubenswrapper[4919]: I0310 22:13:12.669460 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8f16822b-b7de-48ca-8d05-938c50f0837d-operator-scripts\") pod \"8f16822b-b7de-48ca-8d05-938c50f0837d\" (UID: \"8f16822b-b7de-48ca-8d05-938c50f0837d\") " Mar 10 22:13:12 crc kubenswrapper[4919]: I0310 22:13:12.669494 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7xfml\" (UniqueName: \"kubernetes.io/projected/4512eb0d-2445-4ab8-833d-80f0500243b6-kube-api-access-7xfml\") pod \"4512eb0d-2445-4ab8-833d-80f0500243b6\" (UID: \"4512eb0d-2445-4ab8-833d-80f0500243b6\") " Mar 10 22:13:12 crc kubenswrapper[4919]: I0310 22:13:12.669602 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4512eb0d-2445-4ab8-833d-80f0500243b6-operator-scripts\") pod \"4512eb0d-2445-4ab8-833d-80f0500243b6\" (UID: \"4512eb0d-2445-4ab8-833d-80f0500243b6\") " Mar 10 22:13:12 crc kubenswrapper[4919]: I0310 22:13:12.670301 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q8w2b\" (UniqueName: \"kubernetes.io/projected/48251540-9da9-4f40-b01f-27188fe69056-kube-api-access-q8w2b\") pod \"48251540-9da9-4f40-b01f-27188fe69056\" (UID: \"48251540-9da9-4f40-b01f-27188fe69056\") " Mar 10 22:13:12 crc kubenswrapper[4919]: I0310 22:13:12.670340 4919 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-vjfw8\" (UniqueName: \"kubernetes.io/projected/7fdec3e1-893d-44ec-bd70-90c66e304ba7-kube-api-access-vjfw8\") pod \"7fdec3e1-893d-44ec-bd70-90c66e304ba7\" (UID: \"7fdec3e1-893d-44ec-bd70-90c66e304ba7\") " Mar 10 22:13:12 crc kubenswrapper[4919]: I0310 22:13:12.670425 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7fdec3e1-893d-44ec-bd70-90c66e304ba7-operator-scripts\") pod \"7fdec3e1-893d-44ec-bd70-90c66e304ba7\" (UID: \"7fdec3e1-893d-44ec-bd70-90c66e304ba7\") " Mar 10 22:13:12 crc kubenswrapper[4919]: I0310 22:13:12.670451 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/48251540-9da9-4f40-b01f-27188fe69056-operator-scripts\") pod \"48251540-9da9-4f40-b01f-27188fe69056\" (UID: \"48251540-9da9-4f40-b01f-27188fe69056\") " Mar 10 22:13:12 crc kubenswrapper[4919]: I0310 22:13:12.671577 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/48251540-9da9-4f40-b01f-27188fe69056-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "48251540-9da9-4f40-b01f-27188fe69056" (UID: "48251540-9da9-4f40-b01f-27188fe69056"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 22:13:12 crc kubenswrapper[4919]: I0310 22:13:12.672027 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7fdec3e1-893d-44ec-bd70-90c66e304ba7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7fdec3e1-893d-44ec-bd70-90c66e304ba7" (UID: "7fdec3e1-893d-44ec-bd70-90c66e304ba7"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 22:13:12 crc kubenswrapper[4919]: I0310 22:13:12.674966 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48251540-9da9-4f40-b01f-27188fe69056-kube-api-access-q8w2b" (OuterVolumeSpecName: "kube-api-access-q8w2b") pod "48251540-9da9-4f40-b01f-27188fe69056" (UID: "48251540-9da9-4f40-b01f-27188fe69056"). InnerVolumeSpecName "kube-api-access-q8w2b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 22:13:12 crc kubenswrapper[4919]: I0310 22:13:12.675097 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7fdec3e1-893d-44ec-bd70-90c66e304ba7-kube-api-access-vjfw8" (OuterVolumeSpecName: "kube-api-access-vjfw8") pod "7fdec3e1-893d-44ec-bd70-90c66e304ba7" (UID: "7fdec3e1-893d-44ec-bd70-90c66e304ba7"). InnerVolumeSpecName "kube-api-access-vjfw8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 22:13:12 crc kubenswrapper[4919]: I0310 22:13:12.675773 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4512eb0d-2445-4ab8-833d-80f0500243b6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4512eb0d-2445-4ab8-833d-80f0500243b6" (UID: "4512eb0d-2445-4ab8-833d-80f0500243b6"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 22:13:12 crc kubenswrapper[4919]: I0310 22:13:12.676833 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f16822b-b7de-48ca-8d05-938c50f0837d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8f16822b-b7de-48ca-8d05-938c50f0837d" (UID: "8f16822b-b7de-48ca-8d05-938c50f0837d"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 22:13:12 crc kubenswrapper[4919]: I0310 22:13:12.679036 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f16822b-b7de-48ca-8d05-938c50f0837d-kube-api-access-r6jnf" (OuterVolumeSpecName: "kube-api-access-r6jnf") pod "8f16822b-b7de-48ca-8d05-938c50f0837d" (UID: "8f16822b-b7de-48ca-8d05-938c50f0837d"). InnerVolumeSpecName "kube-api-access-r6jnf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 22:13:12 crc kubenswrapper[4919]: I0310 22:13:12.683960 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4512eb0d-2445-4ab8-833d-80f0500243b6-kube-api-access-7xfml" (OuterVolumeSpecName: "kube-api-access-7xfml") pod "4512eb0d-2445-4ab8-833d-80f0500243b6" (UID: "4512eb0d-2445-4ab8-833d-80f0500243b6"). InnerVolumeSpecName "kube-api-access-7xfml". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 22:13:12 crc kubenswrapper[4919]: I0310 22:13:12.776927 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r6jnf\" (UniqueName: \"kubernetes.io/projected/8f16822b-b7de-48ca-8d05-938c50f0837d-kube-api-access-r6jnf\") on node \"crc\" DevicePath \"\"" Mar 10 22:13:12 crc kubenswrapper[4919]: I0310 22:13:12.776969 4919 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8f16822b-b7de-48ca-8d05-938c50f0837d-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 22:13:12 crc kubenswrapper[4919]: I0310 22:13:12.776979 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7xfml\" (UniqueName: \"kubernetes.io/projected/4512eb0d-2445-4ab8-833d-80f0500243b6-kube-api-access-7xfml\") on node \"crc\" DevicePath \"\"" Mar 10 22:13:12 crc kubenswrapper[4919]: I0310 22:13:12.776989 4919 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/4512eb0d-2445-4ab8-833d-80f0500243b6-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 22:13:12 crc kubenswrapper[4919]: I0310 22:13:12.776997 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q8w2b\" (UniqueName: \"kubernetes.io/projected/48251540-9da9-4f40-b01f-27188fe69056-kube-api-access-q8w2b\") on node \"crc\" DevicePath \"\"" Mar 10 22:13:12 crc kubenswrapper[4919]: I0310 22:13:12.777007 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vjfw8\" (UniqueName: \"kubernetes.io/projected/7fdec3e1-893d-44ec-bd70-90c66e304ba7-kube-api-access-vjfw8\") on node \"crc\" DevicePath \"\"" Mar 10 22:13:12 crc kubenswrapper[4919]: I0310 22:13:12.777015 4919 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7fdec3e1-893d-44ec-bd70-90c66e304ba7-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 22:13:12 crc kubenswrapper[4919]: I0310 22:13:12.777024 4919 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/48251540-9da9-4f40-b01f-27188fe69056-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 22:13:13 crc kubenswrapper[4919]: I0310 22:13:13.582857 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"6bff1404-f9b1-48f8-b093-95c3bb206c6a","Type":"ContainerStarted","Data":"9356369f3992f46c3635f399cef0e5e3d0db351590dc5b463547b642e4ea9e30"} Mar 10 22:13:13 crc kubenswrapper[4919]: I0310 22:13:13.586291 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4b6dd5ef-e658-47d0-8131-a1718fa0dedb","Type":"ContainerStarted","Data":"21fef86566b660bfdb91a3b9b5826d07dd75c8c98ebd24af63cc8f06f1e88632"} Mar 10 22:13:13 crc kubenswrapper[4919]: I0310 22:13:13.589206 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-nrfll" Mar 10 22:13:13 crc kubenswrapper[4919]: I0310 22:13:13.589355 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"b1f5a3b8-c9ca-403a-aecf-f6fbf286b145","Type":"ContainerStarted","Data":"3b0a6033d190b7300fe815ac8e78922471798cec143251073885cde5a79cf846"} Mar 10 22:13:13 crc kubenswrapper[4919]: I0310 22:13:13.589723 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-ea40-account-create-update-qnmrj" Mar 10 22:13:13 crc kubenswrapper[4919]: I0310 22:13:13.590214 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-dff6-account-create-update-mnrpf" Mar 10 22:13:13 crc kubenswrapper[4919]: I0310 22:13:13.591950 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-cdcf-account-create-update-t7ns5" Mar 10 22:13:13 crc kubenswrapper[4919]: I0310 22:13:13.605901 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.63090949 podStartE2EDuration="17.605885479s" podCreationTimestamp="2026-03-10 22:12:56 +0000 UTC" firstStartedPulling="2026-03-10 22:12:57.749805948 +0000 UTC m=+1364.991686556" lastFinishedPulling="2026-03-10 22:13:12.724781937 +0000 UTC m=+1379.966662545" observedRunningTime="2026-03-10 22:13:13.605122089 +0000 UTC m=+1380.847002697" watchObservedRunningTime="2026-03-10 22:13:13.605885479 +0000 UTC m=+1380.847766087" Mar 10 22:13:13 crc kubenswrapper[4919]: I0310 22:13:13.658115 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=9.658096565 podStartE2EDuration="9.658096565s" podCreationTimestamp="2026-03-10 22:13:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-10 22:13:13.655625958 +0000 UTC m=+1380.897506566" watchObservedRunningTime="2026-03-10 22:13:13.658096565 +0000 UTC m=+1380.899977173" Mar 10 22:13:14 crc kubenswrapper[4919]: I0310 22:13:14.377147 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-846dbc6cd5-kg4kx" Mar 10 22:13:14 crc kubenswrapper[4919]: I0310 22:13:14.464856 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-79589c5bbb-z9p5z"] Mar 10 22:13:14 crc kubenswrapper[4919]: I0310 22:13:14.465152 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-79589c5bbb-z9p5z" podUID="4fe9eaba-0336-4655-b2d9-9bd67261da54" containerName="neutron-api" containerID="cri-o://c41bc0e2002aec40f136ba1e45796303ef87c5f297502cf12266931ccf978aff" gracePeriod=30 Mar 10 22:13:14 crc kubenswrapper[4919]: I0310 22:13:14.465205 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-79589c5bbb-z9p5z" podUID="4fe9eaba-0336-4655-b2d9-9bd67261da54" containerName="neutron-httpd" containerID="cri-o://fc5f8917f23eea3338b4fb144ea30d5b228b7a0c8dc4c7d6918e3ba4cebe024f" gracePeriod=30 Mar 10 22:13:14 crc kubenswrapper[4919]: I0310 22:13:14.602988 4919 generic.go:334] "Generic (PLEG): container finished" podID="4fe9eaba-0336-4655-b2d9-9bd67261da54" containerID="fc5f8917f23eea3338b4fb144ea30d5b228b7a0c8dc4c7d6918e3ba4cebe024f" exitCode=0 Mar 10 22:13:14 crc kubenswrapper[4919]: I0310 22:13:14.603815 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-79589c5bbb-z9p5z" event={"ID":"4fe9eaba-0336-4655-b2d9-9bd67261da54","Type":"ContainerDied","Data":"fc5f8917f23eea3338b4fb144ea30d5b228b7a0c8dc4c7d6918e3ba4cebe024f"} Mar 10 22:13:14 crc kubenswrapper[4919]: I0310 22:13:14.756298 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 10 22:13:14 crc kubenswrapper[4919]: I0310 
22:13:14.757572 4919 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/cinder-scheduler-0" podUID="b1f5a3b8-c9ca-403a-aecf-f6fbf286b145" containerName="cinder-scheduler" probeResult="failure" output="Get \"http://10.217.0.184:8080/\": dial tcp 10.217.0.184:8080: connect: connection refused" Mar 10 22:13:17 crc kubenswrapper[4919]: I0310 22:13:17.775917 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-zr9cc"] Mar 10 22:13:17 crc kubenswrapper[4919]: E0310 22:13:17.776813 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fdec3e1-893d-44ec-bd70-90c66e304ba7" containerName="mariadb-account-create-update" Mar 10 22:13:17 crc kubenswrapper[4919]: I0310 22:13:17.776828 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fdec3e1-893d-44ec-bd70-90c66e304ba7" containerName="mariadb-account-create-update" Mar 10 22:13:17 crc kubenswrapper[4919]: E0310 22:13:17.776844 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d891cb6e-7d23-40d0-9fd4-28ab980f207c" containerName="mariadb-database-create" Mar 10 22:13:17 crc kubenswrapper[4919]: I0310 22:13:17.776850 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="d891cb6e-7d23-40d0-9fd4-28ab980f207c" containerName="mariadb-database-create" Mar 10 22:13:17 crc kubenswrapper[4919]: E0310 22:13:17.776862 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48251540-9da9-4f40-b01f-27188fe69056" containerName="mariadb-account-create-update" Mar 10 22:13:17 crc kubenswrapper[4919]: I0310 22:13:17.776868 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="48251540-9da9-4f40-b01f-27188fe69056" containerName="mariadb-account-create-update" Mar 10 22:13:17 crc kubenswrapper[4919]: E0310 22:13:17.776888 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4512eb0d-2445-4ab8-833d-80f0500243b6" containerName="mariadb-database-create" Mar 10 22:13:17 crc kubenswrapper[4919]: I0310 
22:13:17.776894 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="4512eb0d-2445-4ab8-833d-80f0500243b6" containerName="mariadb-database-create" Mar 10 22:13:17 crc kubenswrapper[4919]: E0310 22:13:17.776910 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f16822b-b7de-48ca-8d05-938c50f0837d" containerName="mariadb-account-create-update" Mar 10 22:13:17 crc kubenswrapper[4919]: I0310 22:13:17.776916 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f16822b-b7de-48ca-8d05-938c50f0837d" containerName="mariadb-account-create-update" Mar 10 22:13:17 crc kubenswrapper[4919]: E0310 22:13:17.776930 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="765c20cf-cede-45c6-867e-c3fa0749238d" containerName="mariadb-database-create" Mar 10 22:13:17 crc kubenswrapper[4919]: I0310 22:13:17.776935 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="765c20cf-cede-45c6-867e-c3fa0749238d" containerName="mariadb-database-create" Mar 10 22:13:17 crc kubenswrapper[4919]: I0310 22:13:17.777096 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f16822b-b7de-48ca-8d05-938c50f0837d" containerName="mariadb-account-create-update" Mar 10 22:13:17 crc kubenswrapper[4919]: I0310 22:13:17.777109 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="4512eb0d-2445-4ab8-833d-80f0500243b6" containerName="mariadb-database-create" Mar 10 22:13:17 crc kubenswrapper[4919]: I0310 22:13:17.777121 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="7fdec3e1-893d-44ec-bd70-90c66e304ba7" containerName="mariadb-account-create-update" Mar 10 22:13:17 crc kubenswrapper[4919]: I0310 22:13:17.777132 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="d891cb6e-7d23-40d0-9fd4-28ab980f207c" containerName="mariadb-database-create" Mar 10 22:13:17 crc kubenswrapper[4919]: I0310 22:13:17.777140 4919 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="48251540-9da9-4f40-b01f-27188fe69056" containerName="mariadb-account-create-update" Mar 10 22:13:17 crc kubenswrapper[4919]: I0310 22:13:17.777150 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="765c20cf-cede-45c6-867e-c3fa0749238d" containerName="mariadb-database-create" Mar 10 22:13:17 crc kubenswrapper[4919]: I0310 22:13:17.777882 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-zr9cc" Mar 10 22:13:17 crc kubenswrapper[4919]: I0310 22:13:17.779807 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Mar 10 22:13:17 crc kubenswrapper[4919]: I0310 22:13:17.780152 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-m77rf" Mar 10 22:13:17 crc kubenswrapper[4919]: I0310 22:13:17.780323 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 10 22:13:17 crc kubenswrapper[4919]: I0310 22:13:17.785099 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-zr9cc"] Mar 10 22:13:17 crc kubenswrapper[4919]: I0310 22:13:17.972313 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a8d8a3d-169b-4fea-9848-b8998625b1d2-config-data\") pod \"nova-cell0-conductor-db-sync-zr9cc\" (UID: \"0a8d8a3d-169b-4fea-9848-b8998625b1d2\") " pod="openstack/nova-cell0-conductor-db-sync-zr9cc" Mar 10 22:13:17 crc kubenswrapper[4919]: I0310 22:13:17.972774 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0a8d8a3d-169b-4fea-9848-b8998625b1d2-scripts\") pod \"nova-cell0-conductor-db-sync-zr9cc\" (UID: \"0a8d8a3d-169b-4fea-9848-b8998625b1d2\") " pod="openstack/nova-cell0-conductor-db-sync-zr9cc" 
Mar 10 22:13:17 crc kubenswrapper[4919]: I0310 22:13:17.972860 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a8d8a3d-169b-4fea-9848-b8998625b1d2-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-zr9cc\" (UID: \"0a8d8a3d-169b-4fea-9848-b8998625b1d2\") " pod="openstack/nova-cell0-conductor-db-sync-zr9cc" Mar 10 22:13:17 crc kubenswrapper[4919]: I0310 22:13:17.972950 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2c4dk\" (UniqueName: \"kubernetes.io/projected/0a8d8a3d-169b-4fea-9848-b8998625b1d2-kube-api-access-2c4dk\") pod \"nova-cell0-conductor-db-sync-zr9cc\" (UID: \"0a8d8a3d-169b-4fea-9848-b8998625b1d2\") " pod="openstack/nova-cell0-conductor-db-sync-zr9cc" Mar 10 22:13:18 crc kubenswrapper[4919]: I0310 22:13:18.075144 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0a8d8a3d-169b-4fea-9848-b8998625b1d2-scripts\") pod \"nova-cell0-conductor-db-sync-zr9cc\" (UID: \"0a8d8a3d-169b-4fea-9848-b8998625b1d2\") " pod="openstack/nova-cell0-conductor-db-sync-zr9cc" Mar 10 22:13:18 crc kubenswrapper[4919]: I0310 22:13:18.075240 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a8d8a3d-169b-4fea-9848-b8998625b1d2-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-zr9cc\" (UID: \"0a8d8a3d-169b-4fea-9848-b8998625b1d2\") " pod="openstack/nova-cell0-conductor-db-sync-zr9cc" Mar 10 22:13:18 crc kubenswrapper[4919]: I0310 22:13:18.075317 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2c4dk\" (UniqueName: \"kubernetes.io/projected/0a8d8a3d-169b-4fea-9848-b8998625b1d2-kube-api-access-2c4dk\") pod \"nova-cell0-conductor-db-sync-zr9cc\" (UID: 
\"0a8d8a3d-169b-4fea-9848-b8998625b1d2\") " pod="openstack/nova-cell0-conductor-db-sync-zr9cc" Mar 10 22:13:18 crc kubenswrapper[4919]: I0310 22:13:18.075365 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a8d8a3d-169b-4fea-9848-b8998625b1d2-config-data\") pod \"nova-cell0-conductor-db-sync-zr9cc\" (UID: \"0a8d8a3d-169b-4fea-9848-b8998625b1d2\") " pod="openstack/nova-cell0-conductor-db-sync-zr9cc" Mar 10 22:13:18 crc kubenswrapper[4919]: I0310 22:13:18.082827 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0a8d8a3d-169b-4fea-9848-b8998625b1d2-scripts\") pod \"nova-cell0-conductor-db-sync-zr9cc\" (UID: \"0a8d8a3d-169b-4fea-9848-b8998625b1d2\") " pod="openstack/nova-cell0-conductor-db-sync-zr9cc" Mar 10 22:13:18 crc kubenswrapper[4919]: I0310 22:13:18.083490 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a8d8a3d-169b-4fea-9848-b8998625b1d2-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-zr9cc\" (UID: \"0a8d8a3d-169b-4fea-9848-b8998625b1d2\") " pod="openstack/nova-cell0-conductor-db-sync-zr9cc" Mar 10 22:13:18 crc kubenswrapper[4919]: I0310 22:13:18.095738 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a8d8a3d-169b-4fea-9848-b8998625b1d2-config-data\") pod \"nova-cell0-conductor-db-sync-zr9cc\" (UID: \"0a8d8a3d-169b-4fea-9848-b8998625b1d2\") " pod="openstack/nova-cell0-conductor-db-sync-zr9cc" Mar 10 22:13:18 crc kubenswrapper[4919]: I0310 22:13:18.109520 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2c4dk\" (UniqueName: \"kubernetes.io/projected/0a8d8a3d-169b-4fea-9848-b8998625b1d2-kube-api-access-2c4dk\") pod \"nova-cell0-conductor-db-sync-zr9cc\" (UID: \"0a8d8a3d-169b-4fea-9848-b8998625b1d2\") 
" pod="openstack/nova-cell0-conductor-db-sync-zr9cc" Mar 10 22:13:18 crc kubenswrapper[4919]: I0310 22:13:18.190421 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-zr9cc" Mar 10 22:13:18 crc kubenswrapper[4919]: I0310 22:13:18.640646 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-zr9cc"] Mar 10 22:13:18 crc kubenswrapper[4919]: I0310 22:13:18.665272 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4b6dd5ef-e658-47d0-8131-a1718fa0dedb","Type":"ContainerStarted","Data":"0535d625a894f13cb47c636cedb85bd7862741ec7644cfb127ddeee2cf653f45"} Mar 10 22:13:18 crc kubenswrapper[4919]: I0310 22:13:18.665473 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4b6dd5ef-e658-47d0-8131-a1718fa0dedb" containerName="ceilometer-central-agent" containerID="cri-o://a2b00f335c6daadf19b24b97742cce28ff79e5ad2ab1639d6f0c8fdf580e84e2" gracePeriod=30 Mar 10 22:13:18 crc kubenswrapper[4919]: I0310 22:13:18.665704 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 10 22:13:18 crc kubenswrapper[4919]: I0310 22:13:18.665951 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4b6dd5ef-e658-47d0-8131-a1718fa0dedb" containerName="proxy-httpd" containerID="cri-o://0535d625a894f13cb47c636cedb85bd7862741ec7644cfb127ddeee2cf653f45" gracePeriod=30 Mar 10 22:13:18 crc kubenswrapper[4919]: I0310 22:13:18.665999 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4b6dd5ef-e658-47d0-8131-a1718fa0dedb" containerName="sg-core" containerID="cri-o://21fef86566b660bfdb91a3b9b5826d07dd75c8c98ebd24af63cc8f06f1e88632" gracePeriod=30 Mar 10 22:13:18 crc kubenswrapper[4919]: I0310 22:13:18.666032 4919 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4b6dd5ef-e658-47d0-8131-a1718fa0dedb" containerName="ceilometer-notification-agent" containerID="cri-o://6936cc85189c34bd4bcadbb9d2d09132f0151ea4574cab23fecd0d8c0f6cd767" gracePeriod=30 Mar 10 22:13:18 crc kubenswrapper[4919]: I0310 22:13:18.699382 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.700469128 podStartE2EDuration="16.699362691s" podCreationTimestamp="2026-03-10 22:13:02 +0000 UTC" firstStartedPulling="2026-03-10 22:13:03.702635333 +0000 UTC m=+1370.944515941" lastFinishedPulling="2026-03-10 22:13:17.701528896 +0000 UTC m=+1384.943409504" observedRunningTime="2026-03-10 22:13:18.687464339 +0000 UTC m=+1385.929344957" watchObservedRunningTime="2026-03-10 22:13:18.699362691 +0000 UTC m=+1385.941243289" Mar 10 22:13:19 crc kubenswrapper[4919]: I0310 22:13:19.470383 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 10 22:13:19 crc kubenswrapper[4919]: I0310 22:13:19.599536 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b6dd5ef-e658-47d0-8131-a1718fa0dedb-config-data\") pod \"4b6dd5ef-e658-47d0-8131-a1718fa0dedb\" (UID: \"4b6dd5ef-e658-47d0-8131-a1718fa0dedb\") " Mar 10 22:13:19 crc kubenswrapper[4919]: I0310 22:13:19.599586 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4b6dd5ef-e658-47d0-8131-a1718fa0dedb-sg-core-conf-yaml\") pod \"4b6dd5ef-e658-47d0-8131-a1718fa0dedb\" (UID: \"4b6dd5ef-e658-47d0-8131-a1718fa0dedb\") " Mar 10 22:13:19 crc kubenswrapper[4919]: I0310 22:13:19.599616 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4b6dd5ef-e658-47d0-8131-a1718fa0dedb-run-httpd\") pod \"4b6dd5ef-e658-47d0-8131-a1718fa0dedb\" (UID: \"4b6dd5ef-e658-47d0-8131-a1718fa0dedb\") " Mar 10 22:13:19 crc kubenswrapper[4919]: I0310 22:13:19.599702 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w82sf\" (UniqueName: \"kubernetes.io/projected/4b6dd5ef-e658-47d0-8131-a1718fa0dedb-kube-api-access-w82sf\") pod \"4b6dd5ef-e658-47d0-8131-a1718fa0dedb\" (UID: \"4b6dd5ef-e658-47d0-8131-a1718fa0dedb\") " Mar 10 22:13:19 crc kubenswrapper[4919]: I0310 22:13:19.599732 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b6dd5ef-e658-47d0-8131-a1718fa0dedb-combined-ca-bundle\") pod \"4b6dd5ef-e658-47d0-8131-a1718fa0dedb\" (UID: \"4b6dd5ef-e658-47d0-8131-a1718fa0dedb\") " Mar 10 22:13:19 crc kubenswrapper[4919]: I0310 22:13:19.599760 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/4b6dd5ef-e658-47d0-8131-a1718fa0dedb-log-httpd\") pod \"4b6dd5ef-e658-47d0-8131-a1718fa0dedb\" (UID: \"4b6dd5ef-e658-47d0-8131-a1718fa0dedb\") " Mar 10 22:13:19 crc kubenswrapper[4919]: I0310 22:13:19.599814 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b6dd5ef-e658-47d0-8131-a1718fa0dedb-scripts\") pod \"4b6dd5ef-e658-47d0-8131-a1718fa0dedb\" (UID: \"4b6dd5ef-e658-47d0-8131-a1718fa0dedb\") " Mar 10 22:13:19 crc kubenswrapper[4919]: I0310 22:13:19.600940 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b6dd5ef-e658-47d0-8131-a1718fa0dedb-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "4b6dd5ef-e658-47d0-8131-a1718fa0dedb" (UID: "4b6dd5ef-e658-47d0-8131-a1718fa0dedb"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 22:13:19 crc kubenswrapper[4919]: I0310 22:13:19.601163 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b6dd5ef-e658-47d0-8131-a1718fa0dedb-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "4b6dd5ef-e658-47d0-8131-a1718fa0dedb" (UID: "4b6dd5ef-e658-47d0-8131-a1718fa0dedb"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 22:13:19 crc kubenswrapper[4919]: I0310 22:13:19.607730 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b6dd5ef-e658-47d0-8131-a1718fa0dedb-scripts" (OuterVolumeSpecName: "scripts") pod "4b6dd5ef-e658-47d0-8131-a1718fa0dedb" (UID: "4b6dd5ef-e658-47d0-8131-a1718fa0dedb"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:13:19 crc kubenswrapper[4919]: I0310 22:13:19.607786 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b6dd5ef-e658-47d0-8131-a1718fa0dedb-kube-api-access-w82sf" (OuterVolumeSpecName: "kube-api-access-w82sf") pod "4b6dd5ef-e658-47d0-8131-a1718fa0dedb" (UID: "4b6dd5ef-e658-47d0-8131-a1718fa0dedb"). InnerVolumeSpecName "kube-api-access-w82sf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 22:13:19 crc kubenswrapper[4919]: I0310 22:13:19.634267 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b6dd5ef-e658-47d0-8131-a1718fa0dedb-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "4b6dd5ef-e658-47d0-8131-a1718fa0dedb" (UID: "4b6dd5ef-e658-47d0-8131-a1718fa0dedb"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:13:19 crc kubenswrapper[4919]: I0310 22:13:19.699924 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b6dd5ef-e658-47d0-8131-a1718fa0dedb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4b6dd5ef-e658-47d0-8131-a1718fa0dedb" (UID: "4b6dd5ef-e658-47d0-8131-a1718fa0dedb"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:13:19 crc kubenswrapper[4919]: I0310 22:13:19.701597 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w82sf\" (UniqueName: \"kubernetes.io/projected/4b6dd5ef-e658-47d0-8131-a1718fa0dedb-kube-api-access-w82sf\") on node \"crc\" DevicePath \"\"" Mar 10 22:13:19 crc kubenswrapper[4919]: I0310 22:13:19.701625 4919 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b6dd5ef-e658-47d0-8131-a1718fa0dedb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 22:13:19 crc kubenswrapper[4919]: I0310 22:13:19.701634 4919 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4b6dd5ef-e658-47d0-8131-a1718fa0dedb-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 10 22:13:19 crc kubenswrapper[4919]: I0310 22:13:19.701645 4919 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b6dd5ef-e658-47d0-8131-a1718fa0dedb-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 22:13:19 crc kubenswrapper[4919]: I0310 22:13:19.701654 4919 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4b6dd5ef-e658-47d0-8131-a1718fa0dedb-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 10 22:13:19 crc kubenswrapper[4919]: I0310 22:13:19.701662 4919 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4b6dd5ef-e658-47d0-8131-a1718fa0dedb-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 10 22:13:19 crc kubenswrapper[4919]: I0310 22:13:19.734635 4919 generic.go:334] "Generic (PLEG): container finished" podID="4b6dd5ef-e658-47d0-8131-a1718fa0dedb" containerID="0535d625a894f13cb47c636cedb85bd7862741ec7644cfb127ddeee2cf653f45" exitCode=0 Mar 10 22:13:19 crc kubenswrapper[4919]: I0310 22:13:19.735193 4919 generic.go:334] "Generic 
(PLEG): container finished" podID="4b6dd5ef-e658-47d0-8131-a1718fa0dedb" containerID="21fef86566b660bfdb91a3b9b5826d07dd75c8c98ebd24af63cc8f06f1e88632" exitCode=2 Mar 10 22:13:19 crc kubenswrapper[4919]: I0310 22:13:19.735203 4919 generic.go:334] "Generic (PLEG): container finished" podID="4b6dd5ef-e658-47d0-8131-a1718fa0dedb" containerID="6936cc85189c34bd4bcadbb9d2d09132f0151ea4574cab23fecd0d8c0f6cd767" exitCode=0 Mar 10 22:13:19 crc kubenswrapper[4919]: I0310 22:13:19.735210 4919 generic.go:334] "Generic (PLEG): container finished" podID="4b6dd5ef-e658-47d0-8131-a1718fa0dedb" containerID="a2b00f335c6daadf19b24b97742cce28ff79e5ad2ab1639d6f0c8fdf580e84e2" exitCode=0 Mar 10 22:13:19 crc kubenswrapper[4919]: I0310 22:13:19.735254 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4b6dd5ef-e658-47d0-8131-a1718fa0dedb","Type":"ContainerDied","Data":"0535d625a894f13cb47c636cedb85bd7862741ec7644cfb127ddeee2cf653f45"} Mar 10 22:13:19 crc kubenswrapper[4919]: I0310 22:13:19.735282 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4b6dd5ef-e658-47d0-8131-a1718fa0dedb","Type":"ContainerDied","Data":"21fef86566b660bfdb91a3b9b5826d07dd75c8c98ebd24af63cc8f06f1e88632"} Mar 10 22:13:19 crc kubenswrapper[4919]: I0310 22:13:19.735294 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4b6dd5ef-e658-47d0-8131-a1718fa0dedb","Type":"ContainerDied","Data":"6936cc85189c34bd4bcadbb9d2d09132f0151ea4574cab23fecd0d8c0f6cd767"} Mar 10 22:13:19 crc kubenswrapper[4919]: I0310 22:13:19.735302 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4b6dd5ef-e658-47d0-8131-a1718fa0dedb","Type":"ContainerDied","Data":"a2b00f335c6daadf19b24b97742cce28ff79e5ad2ab1639d6f0c8fdf580e84e2"} Mar 10 22:13:19 crc kubenswrapper[4919]: I0310 22:13:19.735311 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"4b6dd5ef-e658-47d0-8131-a1718fa0dedb","Type":"ContainerDied","Data":"6f141623d55e74a510eba2c40be25124da4347cbcb74b44382a4fd50966c87c5"} Mar 10 22:13:19 crc kubenswrapper[4919]: I0310 22:13:19.735325 4919 scope.go:117] "RemoveContainer" containerID="0535d625a894f13cb47c636cedb85bd7862741ec7644cfb127ddeee2cf653f45" Mar 10 22:13:19 crc kubenswrapper[4919]: I0310 22:13:19.735462 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 10 22:13:19 crc kubenswrapper[4919]: I0310 22:13:19.737717 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b6dd5ef-e658-47d0-8131-a1718fa0dedb-config-data" (OuterVolumeSpecName: "config-data") pod "4b6dd5ef-e658-47d0-8131-a1718fa0dedb" (UID: "4b6dd5ef-e658-47d0-8131-a1718fa0dedb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:13:19 crc kubenswrapper[4919]: I0310 22:13:19.745079 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-zr9cc" event={"ID":"0a8d8a3d-169b-4fea-9848-b8998625b1d2","Type":"ContainerStarted","Data":"d893a4e6247512b560e8911e765b5fc93f7044209782fb5a2d0fd82b136a4547"} Mar 10 22:13:19 crc kubenswrapper[4919]: I0310 22:13:19.803411 4919 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b6dd5ef-e658-47d0-8131-a1718fa0dedb-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 22:13:19 crc kubenswrapper[4919]: I0310 22:13:19.835849 4919 scope.go:117] "RemoveContainer" containerID="21fef86566b660bfdb91a3b9b5826d07dd75c8c98ebd24af63cc8f06f1e88632" Mar 10 22:13:19 crc kubenswrapper[4919]: I0310 22:13:19.861801 4919 scope.go:117] "RemoveContainer" containerID="6936cc85189c34bd4bcadbb9d2d09132f0151ea4574cab23fecd0d8c0f6cd767" Mar 10 22:13:19 crc kubenswrapper[4919]: I0310 22:13:19.887225 4919 scope.go:117] "RemoveContainer" 
containerID="a2b00f335c6daadf19b24b97742cce28ff79e5ad2ab1639d6f0c8fdf580e84e2" Mar 10 22:13:19 crc kubenswrapper[4919]: I0310 22:13:19.913467 4919 scope.go:117] "RemoveContainer" containerID="0535d625a894f13cb47c636cedb85bd7862741ec7644cfb127ddeee2cf653f45" Mar 10 22:13:19 crc kubenswrapper[4919]: E0310 22:13:19.914860 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0535d625a894f13cb47c636cedb85bd7862741ec7644cfb127ddeee2cf653f45\": container with ID starting with 0535d625a894f13cb47c636cedb85bd7862741ec7644cfb127ddeee2cf653f45 not found: ID does not exist" containerID="0535d625a894f13cb47c636cedb85bd7862741ec7644cfb127ddeee2cf653f45" Mar 10 22:13:19 crc kubenswrapper[4919]: I0310 22:13:19.914915 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0535d625a894f13cb47c636cedb85bd7862741ec7644cfb127ddeee2cf653f45"} err="failed to get container status \"0535d625a894f13cb47c636cedb85bd7862741ec7644cfb127ddeee2cf653f45\": rpc error: code = NotFound desc = could not find container \"0535d625a894f13cb47c636cedb85bd7862741ec7644cfb127ddeee2cf653f45\": container with ID starting with 0535d625a894f13cb47c636cedb85bd7862741ec7644cfb127ddeee2cf653f45 not found: ID does not exist" Mar 10 22:13:19 crc kubenswrapper[4919]: I0310 22:13:19.914947 4919 scope.go:117] "RemoveContainer" containerID="21fef86566b660bfdb91a3b9b5826d07dd75c8c98ebd24af63cc8f06f1e88632" Mar 10 22:13:19 crc kubenswrapper[4919]: E0310 22:13:19.915461 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"21fef86566b660bfdb91a3b9b5826d07dd75c8c98ebd24af63cc8f06f1e88632\": container with ID starting with 21fef86566b660bfdb91a3b9b5826d07dd75c8c98ebd24af63cc8f06f1e88632 not found: ID does not exist" containerID="21fef86566b660bfdb91a3b9b5826d07dd75c8c98ebd24af63cc8f06f1e88632" Mar 10 22:13:19 crc 
kubenswrapper[4919]: I0310 22:13:19.915501 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21fef86566b660bfdb91a3b9b5826d07dd75c8c98ebd24af63cc8f06f1e88632"} err="failed to get container status \"21fef86566b660bfdb91a3b9b5826d07dd75c8c98ebd24af63cc8f06f1e88632\": rpc error: code = NotFound desc = could not find container \"21fef86566b660bfdb91a3b9b5826d07dd75c8c98ebd24af63cc8f06f1e88632\": container with ID starting with 21fef86566b660bfdb91a3b9b5826d07dd75c8c98ebd24af63cc8f06f1e88632 not found: ID does not exist" Mar 10 22:13:19 crc kubenswrapper[4919]: I0310 22:13:19.915527 4919 scope.go:117] "RemoveContainer" containerID="6936cc85189c34bd4bcadbb9d2d09132f0151ea4574cab23fecd0d8c0f6cd767" Mar 10 22:13:19 crc kubenswrapper[4919]: E0310 22:13:19.915858 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6936cc85189c34bd4bcadbb9d2d09132f0151ea4574cab23fecd0d8c0f6cd767\": container with ID starting with 6936cc85189c34bd4bcadbb9d2d09132f0151ea4574cab23fecd0d8c0f6cd767 not found: ID does not exist" containerID="6936cc85189c34bd4bcadbb9d2d09132f0151ea4574cab23fecd0d8c0f6cd767" Mar 10 22:13:19 crc kubenswrapper[4919]: I0310 22:13:19.915884 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6936cc85189c34bd4bcadbb9d2d09132f0151ea4574cab23fecd0d8c0f6cd767"} err="failed to get container status \"6936cc85189c34bd4bcadbb9d2d09132f0151ea4574cab23fecd0d8c0f6cd767\": rpc error: code = NotFound desc = could not find container \"6936cc85189c34bd4bcadbb9d2d09132f0151ea4574cab23fecd0d8c0f6cd767\": container with ID starting with 6936cc85189c34bd4bcadbb9d2d09132f0151ea4574cab23fecd0d8c0f6cd767 not found: ID does not exist" Mar 10 22:13:19 crc kubenswrapper[4919]: I0310 22:13:19.915901 4919 scope.go:117] "RemoveContainer" containerID="a2b00f335c6daadf19b24b97742cce28ff79e5ad2ab1639d6f0c8fdf580e84e2" Mar 10 
22:13:19 crc kubenswrapper[4919]: E0310 22:13:19.916158 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a2b00f335c6daadf19b24b97742cce28ff79e5ad2ab1639d6f0c8fdf580e84e2\": container with ID starting with a2b00f335c6daadf19b24b97742cce28ff79e5ad2ab1639d6f0c8fdf580e84e2 not found: ID does not exist" containerID="a2b00f335c6daadf19b24b97742cce28ff79e5ad2ab1639d6f0c8fdf580e84e2" Mar 10 22:13:19 crc kubenswrapper[4919]: I0310 22:13:19.916179 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2b00f335c6daadf19b24b97742cce28ff79e5ad2ab1639d6f0c8fdf580e84e2"} err="failed to get container status \"a2b00f335c6daadf19b24b97742cce28ff79e5ad2ab1639d6f0c8fdf580e84e2\": rpc error: code = NotFound desc = could not find container \"a2b00f335c6daadf19b24b97742cce28ff79e5ad2ab1639d6f0c8fdf580e84e2\": container with ID starting with a2b00f335c6daadf19b24b97742cce28ff79e5ad2ab1639d6f0c8fdf580e84e2 not found: ID does not exist" Mar 10 22:13:19 crc kubenswrapper[4919]: I0310 22:13:19.916192 4919 scope.go:117] "RemoveContainer" containerID="0535d625a894f13cb47c636cedb85bd7862741ec7644cfb127ddeee2cf653f45" Mar 10 22:13:19 crc kubenswrapper[4919]: I0310 22:13:19.916573 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0535d625a894f13cb47c636cedb85bd7862741ec7644cfb127ddeee2cf653f45"} err="failed to get container status \"0535d625a894f13cb47c636cedb85bd7862741ec7644cfb127ddeee2cf653f45\": rpc error: code = NotFound desc = could not find container \"0535d625a894f13cb47c636cedb85bd7862741ec7644cfb127ddeee2cf653f45\": container with ID starting with 0535d625a894f13cb47c636cedb85bd7862741ec7644cfb127ddeee2cf653f45 not found: ID does not exist" Mar 10 22:13:19 crc kubenswrapper[4919]: I0310 22:13:19.916595 4919 scope.go:117] "RemoveContainer" 
containerID="21fef86566b660bfdb91a3b9b5826d07dd75c8c98ebd24af63cc8f06f1e88632"
Mar 10 22:13:19 crc kubenswrapper[4919]: I0310 22:13:19.916983 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21fef86566b660bfdb91a3b9b5826d07dd75c8c98ebd24af63cc8f06f1e88632"} err="failed to get container status \"21fef86566b660bfdb91a3b9b5826d07dd75c8c98ebd24af63cc8f06f1e88632\": rpc error: code = NotFound desc = could not find container \"21fef86566b660bfdb91a3b9b5826d07dd75c8c98ebd24af63cc8f06f1e88632\": container with ID starting with 21fef86566b660bfdb91a3b9b5826d07dd75c8c98ebd24af63cc8f06f1e88632 not found: ID does not exist"
Mar 10 22:13:19 crc kubenswrapper[4919]: I0310 22:13:19.917015 4919 scope.go:117] "RemoveContainer" containerID="6936cc85189c34bd4bcadbb9d2d09132f0151ea4574cab23fecd0d8c0f6cd767"
Mar 10 22:13:19 crc kubenswrapper[4919]: I0310 22:13:19.917365 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6936cc85189c34bd4bcadbb9d2d09132f0151ea4574cab23fecd0d8c0f6cd767"} err="failed to get container status \"6936cc85189c34bd4bcadbb9d2d09132f0151ea4574cab23fecd0d8c0f6cd767\": rpc error: code = NotFound desc = could not find container \"6936cc85189c34bd4bcadbb9d2d09132f0151ea4574cab23fecd0d8c0f6cd767\": container with ID starting with 6936cc85189c34bd4bcadbb9d2d09132f0151ea4574cab23fecd0d8c0f6cd767 not found: ID does not exist"
Mar 10 22:13:19 crc kubenswrapper[4919]: I0310 22:13:19.917418 4919 scope.go:117] "RemoveContainer" containerID="a2b00f335c6daadf19b24b97742cce28ff79e5ad2ab1639d6f0c8fdf580e84e2"
Mar 10 22:13:19 crc kubenswrapper[4919]: I0310 22:13:19.917795 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2b00f335c6daadf19b24b97742cce28ff79e5ad2ab1639d6f0c8fdf580e84e2"} err="failed to get container status \"a2b00f335c6daadf19b24b97742cce28ff79e5ad2ab1639d6f0c8fdf580e84e2\": rpc error: code = NotFound desc = could not find container \"a2b00f335c6daadf19b24b97742cce28ff79e5ad2ab1639d6f0c8fdf580e84e2\": container with ID starting with a2b00f335c6daadf19b24b97742cce28ff79e5ad2ab1639d6f0c8fdf580e84e2 not found: ID does not exist"
Mar 10 22:13:19 crc kubenswrapper[4919]: I0310 22:13:19.917816 4919 scope.go:117] "RemoveContainer" containerID="0535d625a894f13cb47c636cedb85bd7862741ec7644cfb127ddeee2cf653f45"
Mar 10 22:13:19 crc kubenswrapper[4919]: I0310 22:13:19.918080 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0535d625a894f13cb47c636cedb85bd7862741ec7644cfb127ddeee2cf653f45"} err="failed to get container status \"0535d625a894f13cb47c636cedb85bd7862741ec7644cfb127ddeee2cf653f45\": rpc error: code = NotFound desc = could not find container \"0535d625a894f13cb47c636cedb85bd7862741ec7644cfb127ddeee2cf653f45\": container with ID starting with 0535d625a894f13cb47c636cedb85bd7862741ec7644cfb127ddeee2cf653f45 not found: ID does not exist"
Mar 10 22:13:19 crc kubenswrapper[4919]: I0310 22:13:19.918097 4919 scope.go:117] "RemoveContainer" containerID="21fef86566b660bfdb91a3b9b5826d07dd75c8c98ebd24af63cc8f06f1e88632"
Mar 10 22:13:19 crc kubenswrapper[4919]: I0310 22:13:19.918351 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21fef86566b660bfdb91a3b9b5826d07dd75c8c98ebd24af63cc8f06f1e88632"} err="failed to get container status \"21fef86566b660bfdb91a3b9b5826d07dd75c8c98ebd24af63cc8f06f1e88632\": rpc error: code = NotFound desc = could not find container \"21fef86566b660bfdb91a3b9b5826d07dd75c8c98ebd24af63cc8f06f1e88632\": container with ID starting with 21fef86566b660bfdb91a3b9b5826d07dd75c8c98ebd24af63cc8f06f1e88632 not found: ID does not exist"
Mar 10 22:13:19 crc kubenswrapper[4919]: I0310 22:13:19.918374 4919 scope.go:117] "RemoveContainer" containerID="6936cc85189c34bd4bcadbb9d2d09132f0151ea4574cab23fecd0d8c0f6cd767"
Mar 10 22:13:19 crc kubenswrapper[4919]: I0310 22:13:19.918934 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6936cc85189c34bd4bcadbb9d2d09132f0151ea4574cab23fecd0d8c0f6cd767"} err="failed to get container status \"6936cc85189c34bd4bcadbb9d2d09132f0151ea4574cab23fecd0d8c0f6cd767\": rpc error: code = NotFound desc = could not find container \"6936cc85189c34bd4bcadbb9d2d09132f0151ea4574cab23fecd0d8c0f6cd767\": container with ID starting with 6936cc85189c34bd4bcadbb9d2d09132f0151ea4574cab23fecd0d8c0f6cd767 not found: ID does not exist"
Mar 10 22:13:19 crc kubenswrapper[4919]: I0310 22:13:19.918954 4919 scope.go:117] "RemoveContainer" containerID="a2b00f335c6daadf19b24b97742cce28ff79e5ad2ab1639d6f0c8fdf580e84e2"
Mar 10 22:13:19 crc kubenswrapper[4919]: I0310 22:13:19.919280 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2b00f335c6daadf19b24b97742cce28ff79e5ad2ab1639d6f0c8fdf580e84e2"} err="failed to get container status \"a2b00f335c6daadf19b24b97742cce28ff79e5ad2ab1639d6f0c8fdf580e84e2\": rpc error: code = NotFound desc = could not find container \"a2b00f335c6daadf19b24b97742cce28ff79e5ad2ab1639d6f0c8fdf580e84e2\": container with ID starting with a2b00f335c6daadf19b24b97742cce28ff79e5ad2ab1639d6f0c8fdf580e84e2 not found: ID does not exist"
Mar 10 22:13:19 crc kubenswrapper[4919]: I0310 22:13:19.919310 4919 scope.go:117] "RemoveContainer" containerID="0535d625a894f13cb47c636cedb85bd7862741ec7644cfb127ddeee2cf653f45"
Mar 10 22:13:19 crc kubenswrapper[4919]: I0310 22:13:19.919818 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0535d625a894f13cb47c636cedb85bd7862741ec7644cfb127ddeee2cf653f45"} err="failed to get container status \"0535d625a894f13cb47c636cedb85bd7862741ec7644cfb127ddeee2cf653f45\": rpc error: code = NotFound desc = could not find container \"0535d625a894f13cb47c636cedb85bd7862741ec7644cfb127ddeee2cf653f45\": container with ID starting with 0535d625a894f13cb47c636cedb85bd7862741ec7644cfb127ddeee2cf653f45 not found: ID does not exist"
Mar 10 22:13:19 crc kubenswrapper[4919]: I0310 22:13:19.919837 4919 scope.go:117] "RemoveContainer" containerID="21fef86566b660bfdb91a3b9b5826d07dd75c8c98ebd24af63cc8f06f1e88632"
Mar 10 22:13:19 crc kubenswrapper[4919]: I0310 22:13:19.920163 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21fef86566b660bfdb91a3b9b5826d07dd75c8c98ebd24af63cc8f06f1e88632"} err="failed to get container status \"21fef86566b660bfdb91a3b9b5826d07dd75c8c98ebd24af63cc8f06f1e88632\": rpc error: code = NotFound desc = could not find container \"21fef86566b660bfdb91a3b9b5826d07dd75c8c98ebd24af63cc8f06f1e88632\": container with ID starting with 21fef86566b660bfdb91a3b9b5826d07dd75c8c98ebd24af63cc8f06f1e88632 not found: ID does not exist"
Mar 10 22:13:19 crc kubenswrapper[4919]: I0310 22:13:19.920194 4919 scope.go:117] "RemoveContainer" containerID="6936cc85189c34bd4bcadbb9d2d09132f0151ea4574cab23fecd0d8c0f6cd767"
Mar 10 22:13:19 crc kubenswrapper[4919]: I0310 22:13:19.920520 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6936cc85189c34bd4bcadbb9d2d09132f0151ea4574cab23fecd0d8c0f6cd767"} err="failed to get container status \"6936cc85189c34bd4bcadbb9d2d09132f0151ea4574cab23fecd0d8c0f6cd767\": rpc error: code = NotFound desc = could not find container \"6936cc85189c34bd4bcadbb9d2d09132f0151ea4574cab23fecd0d8c0f6cd767\": container with ID starting with 6936cc85189c34bd4bcadbb9d2d09132f0151ea4574cab23fecd0d8c0f6cd767 not found: ID does not exist"
Mar 10 22:13:19 crc kubenswrapper[4919]: I0310 22:13:19.920544 4919 scope.go:117] "RemoveContainer" containerID="a2b00f335c6daadf19b24b97742cce28ff79e5ad2ab1639d6f0c8fdf580e84e2"
Mar 10 22:13:19 crc kubenswrapper[4919]: I0310 22:13:19.920744 4919 pod_container_deletor.go:53] "DeleteContainer returned error"
containerID={"Type":"cri-o","ID":"a2b00f335c6daadf19b24b97742cce28ff79e5ad2ab1639d6f0c8fdf580e84e2"} err="failed to get container status \"a2b00f335c6daadf19b24b97742cce28ff79e5ad2ab1639d6f0c8fdf580e84e2\": rpc error: code = NotFound desc = could not find container \"a2b00f335c6daadf19b24b97742cce28ff79e5ad2ab1639d6f0c8fdf580e84e2\": container with ID starting with a2b00f335c6daadf19b24b97742cce28ff79e5ad2ab1639d6f0c8fdf580e84e2 not found: ID does not exist"
Mar 10 22:13:20 crc kubenswrapper[4919]: I0310 22:13:20.028472 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0"
Mar 10 22:13:20 crc kubenswrapper[4919]: I0310 22:13:20.078244 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 10 22:13:20 crc kubenswrapper[4919]: I0310 22:13:20.086126 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Mar 10 22:13:20 crc kubenswrapper[4919]: I0310 22:13:20.104650 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Mar 10 22:13:20 crc kubenswrapper[4919]: E0310 22:13:20.105078 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b6dd5ef-e658-47d0-8131-a1718fa0dedb" containerName="ceilometer-notification-agent"
Mar 10 22:13:20 crc kubenswrapper[4919]: I0310 22:13:20.105097 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b6dd5ef-e658-47d0-8131-a1718fa0dedb" containerName="ceilometer-notification-agent"
Mar 10 22:13:20 crc kubenswrapper[4919]: E0310 22:13:20.105111 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b6dd5ef-e658-47d0-8131-a1718fa0dedb" containerName="proxy-httpd"
Mar 10 22:13:20 crc kubenswrapper[4919]: I0310 22:13:20.105117 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b6dd5ef-e658-47d0-8131-a1718fa0dedb" containerName="proxy-httpd"
Mar 10 22:13:20 crc kubenswrapper[4919]: E0310 22:13:20.105130 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b6dd5ef-e658-47d0-8131-a1718fa0dedb" containerName="sg-core"
Mar 10 22:13:20 crc kubenswrapper[4919]: I0310 22:13:20.105136 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b6dd5ef-e658-47d0-8131-a1718fa0dedb" containerName="sg-core"
Mar 10 22:13:20 crc kubenswrapper[4919]: E0310 22:13:20.105164 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b6dd5ef-e658-47d0-8131-a1718fa0dedb" containerName="ceilometer-central-agent"
Mar 10 22:13:20 crc kubenswrapper[4919]: I0310 22:13:20.105169 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b6dd5ef-e658-47d0-8131-a1718fa0dedb" containerName="ceilometer-central-agent"
Mar 10 22:13:20 crc kubenswrapper[4919]: I0310 22:13:20.105322 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b6dd5ef-e658-47d0-8131-a1718fa0dedb" containerName="ceilometer-notification-agent"
Mar 10 22:13:20 crc kubenswrapper[4919]: I0310 22:13:20.105342 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b6dd5ef-e658-47d0-8131-a1718fa0dedb" containerName="sg-core"
Mar 10 22:13:20 crc kubenswrapper[4919]: I0310 22:13:20.105356 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b6dd5ef-e658-47d0-8131-a1718fa0dedb" containerName="proxy-httpd"
Mar 10 22:13:20 crc kubenswrapper[4919]: I0310 22:13:20.105364 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b6dd5ef-e658-47d0-8131-a1718fa0dedb" containerName="ceilometer-central-agent"
Mar 10 22:13:20 crc kubenswrapper[4919]: I0310 22:13:20.107219 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 10 22:13:20 crc kubenswrapper[4919]: I0310 22:13:20.109436 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Mar 10 22:13:20 crc kubenswrapper[4919]: I0310 22:13:20.109632 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Mar 10 22:13:20 crc kubenswrapper[4919]: I0310 22:13:20.115491 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 10 22:13:20 crc kubenswrapper[4919]: I0310 22:13:20.245216 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1414266b-7e10-49f9-8a97-3fe6238cd61c-config-data\") pod \"ceilometer-0\" (UID: \"1414266b-7e10-49f9-8a97-3fe6238cd61c\") " pod="openstack/ceilometer-0"
Mar 10 22:13:20 crc kubenswrapper[4919]: I0310 22:13:20.245313 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1414266b-7e10-49f9-8a97-3fe6238cd61c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1414266b-7e10-49f9-8a97-3fe6238cd61c\") " pod="openstack/ceilometer-0"
Mar 10 22:13:20 crc kubenswrapper[4919]: I0310 22:13:20.245340 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1414266b-7e10-49f9-8a97-3fe6238cd61c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1414266b-7e10-49f9-8a97-3fe6238cd61c\") " pod="openstack/ceilometer-0"
Mar 10 22:13:20 crc kubenswrapper[4919]: I0310 22:13:20.245381 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n4qpr\" (UniqueName: \"kubernetes.io/projected/1414266b-7e10-49f9-8a97-3fe6238cd61c-kube-api-access-n4qpr\") pod \"ceilometer-0\" (UID: \"1414266b-7e10-49f9-8a97-3fe6238cd61c\") " pod="openstack/ceilometer-0"
Mar 10 22:13:20 crc kubenswrapper[4919]: I0310 22:13:20.245423 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1414266b-7e10-49f9-8a97-3fe6238cd61c-scripts\") pod \"ceilometer-0\" (UID: \"1414266b-7e10-49f9-8a97-3fe6238cd61c\") " pod="openstack/ceilometer-0"
Mar 10 22:13:20 crc kubenswrapper[4919]: I0310 22:13:20.245442 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1414266b-7e10-49f9-8a97-3fe6238cd61c-log-httpd\") pod \"ceilometer-0\" (UID: \"1414266b-7e10-49f9-8a97-3fe6238cd61c\") " pod="openstack/ceilometer-0"
Mar 10 22:13:20 crc kubenswrapper[4919]: I0310 22:13:20.245467 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1414266b-7e10-49f9-8a97-3fe6238cd61c-run-httpd\") pod \"ceilometer-0\" (UID: \"1414266b-7e10-49f9-8a97-3fe6238cd61c\") " pod="openstack/ceilometer-0"
Mar 10 22:13:20 crc kubenswrapper[4919]: I0310 22:13:20.347812 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1414266b-7e10-49f9-8a97-3fe6238cd61c-log-httpd\") pod \"ceilometer-0\" (UID: \"1414266b-7e10-49f9-8a97-3fe6238cd61c\") " pod="openstack/ceilometer-0"
Mar 10 22:13:20 crc kubenswrapper[4919]: I0310 22:13:20.347888 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1414266b-7e10-49f9-8a97-3fe6238cd61c-run-httpd\") pod \"ceilometer-0\" (UID: \"1414266b-7e10-49f9-8a97-3fe6238cd61c\") " pod="openstack/ceilometer-0"
Mar 10 22:13:20 crc kubenswrapper[4919]: I0310 22:13:20.347964 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1414266b-7e10-49f9-8a97-3fe6238cd61c-config-data\") pod \"ceilometer-0\" (UID: \"1414266b-7e10-49f9-8a97-3fe6238cd61c\") " pod="openstack/ceilometer-0"
Mar 10 22:13:20 crc kubenswrapper[4919]: I0310 22:13:20.348064 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1414266b-7e10-49f9-8a97-3fe6238cd61c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1414266b-7e10-49f9-8a97-3fe6238cd61c\") " pod="openstack/ceilometer-0"
Mar 10 22:13:20 crc kubenswrapper[4919]: I0310 22:13:20.348101 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1414266b-7e10-49f9-8a97-3fe6238cd61c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1414266b-7e10-49f9-8a97-3fe6238cd61c\") " pod="openstack/ceilometer-0"
Mar 10 22:13:20 crc kubenswrapper[4919]: I0310 22:13:20.348170 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n4qpr\" (UniqueName: \"kubernetes.io/projected/1414266b-7e10-49f9-8a97-3fe6238cd61c-kube-api-access-n4qpr\") pod \"ceilometer-0\" (UID: \"1414266b-7e10-49f9-8a97-3fe6238cd61c\") " pod="openstack/ceilometer-0"
Mar 10 22:13:20 crc kubenswrapper[4919]: I0310 22:13:20.348222 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1414266b-7e10-49f9-8a97-3fe6238cd61c-scripts\") pod \"ceilometer-0\" (UID: \"1414266b-7e10-49f9-8a97-3fe6238cd61c\") " pod="openstack/ceilometer-0"
Mar 10 22:13:20 crc kubenswrapper[4919]: I0310 22:13:20.348808 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1414266b-7e10-49f9-8a97-3fe6238cd61c-run-httpd\") pod \"ceilometer-0\" (UID: \"1414266b-7e10-49f9-8a97-3fe6238cd61c\") " pod="openstack/ceilometer-0"
Mar 10 22:13:20 crc kubenswrapper[4919]: I0310 22:13:20.348874 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1414266b-7e10-49f9-8a97-3fe6238cd61c-log-httpd\") pod \"ceilometer-0\" (UID: \"1414266b-7e10-49f9-8a97-3fe6238cd61c\") " pod="openstack/ceilometer-0"
Mar 10 22:13:20 crc kubenswrapper[4919]: I0310 22:13:20.353579 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1414266b-7e10-49f9-8a97-3fe6238cd61c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1414266b-7e10-49f9-8a97-3fe6238cd61c\") " pod="openstack/ceilometer-0"
Mar 10 22:13:20 crc kubenswrapper[4919]: I0310 22:13:20.355456 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1414266b-7e10-49f9-8a97-3fe6238cd61c-scripts\") pod \"ceilometer-0\" (UID: \"1414266b-7e10-49f9-8a97-3fe6238cd61c\") " pod="openstack/ceilometer-0"
Mar 10 22:13:20 crc kubenswrapper[4919]: I0310 22:13:20.360176 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1414266b-7e10-49f9-8a97-3fe6238cd61c-config-data\") pod \"ceilometer-0\" (UID: \"1414266b-7e10-49f9-8a97-3fe6238cd61c\") " pod="openstack/ceilometer-0"
Mar 10 22:13:20 crc kubenswrapper[4919]: I0310 22:13:20.382821 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1414266b-7e10-49f9-8a97-3fe6238cd61c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1414266b-7e10-49f9-8a97-3fe6238cd61c\") " pod="openstack/ceilometer-0"
Mar 10 22:13:20 crc kubenswrapper[4919]: I0310 22:13:20.398942 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n4qpr\" (UniqueName: \"kubernetes.io/projected/1414266b-7e10-49f9-8a97-3fe6238cd61c-kube-api-access-n4qpr\") pod \"ceilometer-0\" (UID:
\"1414266b-7e10-49f9-8a97-3fe6238cd61c\") " pod="openstack/ceilometer-0"
Mar 10 22:13:20 crc kubenswrapper[4919]: I0310 22:13:20.480006 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 10 22:13:20 crc kubenswrapper[4919]: I0310 22:13:20.850951 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 10 22:13:20 crc kubenswrapper[4919]: I0310 22:13:20.950492 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 10 22:13:21 crc kubenswrapper[4919]: I0310 22:13:21.492159 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b6dd5ef-e658-47d0-8131-a1718fa0dedb" path="/var/lib/kubelet/pods/4b6dd5ef-e658-47d0-8131-a1718fa0dedb/volumes"
Mar 10 22:13:21 crc kubenswrapper[4919]: I0310 22:13:21.779100 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1414266b-7e10-49f9-8a97-3fe6238cd61c","Type":"ContainerStarted","Data":"a257ec7adcb6a4da68990b22f8e8a3e35d3c6e045d3438c7c2f16c5507d6673f"}
Mar 10 22:13:21 crc kubenswrapper[4919]: I0310 22:13:21.782371 4919 generic.go:334] "Generic (PLEG): container finished" podID="4fe9eaba-0336-4655-b2d9-9bd67261da54" containerID="c41bc0e2002aec40f136ba1e45796303ef87c5f297502cf12266931ccf978aff" exitCode=0
Mar 10 22:13:21 crc kubenswrapper[4919]: I0310 22:13:21.782485 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-79589c5bbb-z9p5z" event={"ID":"4fe9eaba-0336-4655-b2d9-9bd67261da54","Type":"ContainerDied","Data":"c41bc0e2002aec40f136ba1e45796303ef87c5f297502cf12266931ccf978aff"}
Mar 10 22:13:21 crc kubenswrapper[4919]: I0310 22:13:21.991721 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-79589c5bbb-z9p5z"
Mar 10 22:13:22 crc kubenswrapper[4919]: I0310 22:13:22.180169 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4fe9eaba-0336-4655-b2d9-9bd67261da54-config\") pod \"4fe9eaba-0336-4655-b2d9-9bd67261da54\" (UID: \"4fe9eaba-0336-4655-b2d9-9bd67261da54\") "
Mar 10 22:13:22 crc kubenswrapper[4919]: I0310 22:13:22.180561 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7wx84\" (UniqueName: \"kubernetes.io/projected/4fe9eaba-0336-4655-b2d9-9bd67261da54-kube-api-access-7wx84\") pod \"4fe9eaba-0336-4655-b2d9-9bd67261da54\" (UID: \"4fe9eaba-0336-4655-b2d9-9bd67261da54\") "
Mar 10 22:13:22 crc kubenswrapper[4919]: I0310 22:13:22.180780 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4fe9eaba-0336-4655-b2d9-9bd67261da54-ovndb-tls-certs\") pod \"4fe9eaba-0336-4655-b2d9-9bd67261da54\" (UID: \"4fe9eaba-0336-4655-b2d9-9bd67261da54\") "
Mar 10 22:13:22 crc kubenswrapper[4919]: I0310 22:13:22.180902 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/4fe9eaba-0336-4655-b2d9-9bd67261da54-httpd-config\") pod \"4fe9eaba-0336-4655-b2d9-9bd67261da54\" (UID: \"4fe9eaba-0336-4655-b2d9-9bd67261da54\") "
Mar 10 22:13:22 crc kubenswrapper[4919]: I0310 22:13:22.181079 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4fe9eaba-0336-4655-b2d9-9bd67261da54-combined-ca-bundle\") pod \"4fe9eaba-0336-4655-b2d9-9bd67261da54\" (UID: \"4fe9eaba-0336-4655-b2d9-9bd67261da54\") "
Mar 10 22:13:22 crc kubenswrapper[4919]: I0310 22:13:22.192257 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4fe9eaba-0336-4655-b2d9-9bd67261da54-kube-api-access-7wx84" (OuterVolumeSpecName: "kube-api-access-7wx84") pod "4fe9eaba-0336-4655-b2d9-9bd67261da54" (UID: "4fe9eaba-0336-4655-b2d9-9bd67261da54"). InnerVolumeSpecName "kube-api-access-7wx84". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 22:13:22 crc kubenswrapper[4919]: I0310 22:13:22.195589 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4fe9eaba-0336-4655-b2d9-9bd67261da54-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "4fe9eaba-0336-4655-b2d9-9bd67261da54" (UID: "4fe9eaba-0336-4655-b2d9-9bd67261da54"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 22:13:22 crc kubenswrapper[4919]: I0310 22:13:22.238627 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4fe9eaba-0336-4655-b2d9-9bd67261da54-config" (OuterVolumeSpecName: "config") pod "4fe9eaba-0336-4655-b2d9-9bd67261da54" (UID: "4fe9eaba-0336-4655-b2d9-9bd67261da54"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 22:13:22 crc kubenswrapper[4919]: I0310 22:13:22.244114 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4fe9eaba-0336-4655-b2d9-9bd67261da54-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4fe9eaba-0336-4655-b2d9-9bd67261da54" (UID: "4fe9eaba-0336-4655-b2d9-9bd67261da54"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 22:13:22 crc kubenswrapper[4919]: I0310 22:13:22.288850 4919 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/4fe9eaba-0336-4655-b2d9-9bd67261da54-config\") on node \"crc\" DevicePath \"\""
Mar 10 22:13:22 crc kubenswrapper[4919]: I0310 22:13:22.289095 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7wx84\" (UniqueName: \"kubernetes.io/projected/4fe9eaba-0336-4655-b2d9-9bd67261da54-kube-api-access-7wx84\") on node \"crc\" DevicePath \"\""
Mar 10 22:13:22 crc kubenswrapper[4919]: I0310 22:13:22.289156 4919 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/4fe9eaba-0336-4655-b2d9-9bd67261da54-httpd-config\") on node \"crc\" DevicePath \"\""
Mar 10 22:13:22 crc kubenswrapper[4919]: I0310 22:13:22.289210 4919 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4fe9eaba-0336-4655-b2d9-9bd67261da54-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 10 22:13:22 crc kubenswrapper[4919]: I0310 22:13:22.329530 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4fe9eaba-0336-4655-b2d9-9bd67261da54-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "4fe9eaba-0336-4655-b2d9-9bd67261da54" (UID: "4fe9eaba-0336-4655-b2d9-9bd67261da54"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 22:13:22 crc kubenswrapper[4919]: I0310 22:13:22.391716 4919 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4fe9eaba-0336-4655-b2d9-9bd67261da54-ovndb-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 10 22:13:22 crc kubenswrapper[4919]: I0310 22:13:22.800756 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-79589c5bbb-z9p5z" event={"ID":"4fe9eaba-0336-4655-b2d9-9bd67261da54","Type":"ContainerDied","Data":"52a09b1c3a7d8d81ab31e1af2550644d2817b5fffcc27dee15398dc75d21fb39"}
Mar 10 22:13:22 crc kubenswrapper[4919]: I0310 22:13:22.801066 4919 scope.go:117] "RemoveContainer" containerID="fc5f8917f23eea3338b4fb144ea30d5b228b7a0c8dc4c7d6918e3ba4cebe024f"
Mar 10 22:13:22 crc kubenswrapper[4919]: I0310 22:13:22.801211 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-79589c5bbb-z9p5z"
Mar 10 22:13:22 crc kubenswrapper[4919]: I0310 22:13:22.815805 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1414266b-7e10-49f9-8a97-3fe6238cd61c","Type":"ContainerStarted","Data":"1a23096ba1de1649229865bb2ad45b1f791571f42d8d38b1370e90773a9e66db"}
Mar 10 22:13:22 crc kubenswrapper[4919]: I0310 22:13:22.815856 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1414266b-7e10-49f9-8a97-3fe6238cd61c","Type":"ContainerStarted","Data":"d663d84dc97fb469b9aa16cb833c1c32c425dd5e6f6b94be4acbd8b521b493f3"}
Mar 10 22:13:22 crc kubenswrapper[4919]: I0310 22:13:22.835462 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-79589c5bbb-z9p5z"]
Mar 10 22:13:22 crc kubenswrapper[4919]: I0310 22:13:22.842951 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-79589c5bbb-z9p5z"]
Mar 10 22:13:23 crc kubenswrapper[4919]: I0310 22:13:23.495377 4919
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4fe9eaba-0336-4655-b2d9-9bd67261da54" path="/var/lib/kubelet/pods/4fe9eaba-0336-4655-b2d9-9bd67261da54/volumes"
Mar 10 22:13:27 crc kubenswrapper[4919]: I0310 22:13:27.345730 4919 scope.go:117] "RemoveContainer" containerID="c41bc0e2002aec40f136ba1e45796303ef87c5f297502cf12266931ccf978aff"
Mar 10 22:13:27 crc kubenswrapper[4919]: I0310 22:13:27.867745 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1414266b-7e10-49f9-8a97-3fe6238cd61c","Type":"ContainerStarted","Data":"3ff943dd9d7d072cf3d2d3c2ad0e040ce352e3ee1431449fadafb8f84cb980f6"}
Mar 10 22:13:27 crc kubenswrapper[4919]: I0310 22:13:27.871247 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-zr9cc" event={"ID":"0a8d8a3d-169b-4fea-9848-b8998625b1d2","Type":"ContainerStarted","Data":"046aa9ef267aea44a0077d8321ee3d8194793ad759fea7abc91c42878cf2fddd"}
Mar 10 22:13:30 crc kubenswrapper[4919]: I0310 22:13:30.068545 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-zr9cc" podStartSLOduration=4.314859104 podStartE2EDuration="13.068523333s" podCreationTimestamp="2026-03-10 22:13:17 +0000 UTC" firstStartedPulling="2026-03-10 22:13:18.658426572 +0000 UTC m=+1385.900307180" lastFinishedPulling="2026-03-10 22:13:27.412090801 +0000 UTC m=+1394.653971409" observedRunningTime="2026-03-10 22:13:27.890486073 +0000 UTC m=+1395.132366681" watchObservedRunningTime="2026-03-10 22:13:30.068523333 +0000 UTC m=+1397.310403941"
Mar 10 22:13:30 crc kubenswrapper[4919]: I0310 22:13:30.076869 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 10 22:13:30 crc kubenswrapper[4919]: I0310 22:13:30.077212 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="09d4bc6e-4f9e-4375-a816-2aad9cf376b2" containerName="glance-log" containerID="cri-o://bb22b6b86c9aa2a98d6e5696308d32888d151cf860adf39c4cb0d518c228adb6" gracePeriod=30
Mar 10 22:13:30 crc kubenswrapper[4919]: I0310 22:13:30.077538 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="09d4bc6e-4f9e-4375-a816-2aad9cf376b2" containerName="glance-httpd" containerID="cri-o://c24d19b94dafe98f4ece853e6e12aa2d60a40b12a18aa5a70a19ec640535ba24" gracePeriod=30
Mar 10 22:13:30 crc kubenswrapper[4919]: I0310 22:13:30.899017 4919 generic.go:334] "Generic (PLEG): container finished" podID="09d4bc6e-4f9e-4375-a816-2aad9cf376b2" containerID="bb22b6b86c9aa2a98d6e5696308d32888d151cf860adf39c4cb0d518c228adb6" exitCode=143
Mar 10 22:13:30 crc kubenswrapper[4919]: I0310 22:13:30.899123 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"09d4bc6e-4f9e-4375-a816-2aad9cf376b2","Type":"ContainerDied","Data":"bb22b6b86c9aa2a98d6e5696308d32888d151cf860adf39c4cb0d518c228adb6"}
Mar 10 22:13:31 crc kubenswrapper[4919]: I0310 22:13:31.910943 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1414266b-7e10-49f9-8a97-3fe6238cd61c","Type":"ContainerStarted","Data":"abd8241dbed51bcd7e985f544a549f95a3a38a90171c79ca5e36392c8e6978de"}
Mar 10 22:13:31 crc kubenswrapper[4919]: I0310 22:13:31.912214 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1414266b-7e10-49f9-8a97-3fe6238cd61c" containerName="ceilometer-central-agent" containerID="cri-o://d663d84dc97fb469b9aa16cb833c1c32c425dd5e6f6b94be4acbd8b521b493f3" gracePeriod=30
Mar 10 22:13:31 crc kubenswrapper[4919]: I0310 22:13:31.912405 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Mar 10 22:13:31 crc kubenswrapper[4919]: I0310 22:13:31.912856 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1414266b-7e10-49f9-8a97-3fe6238cd61c" containerName="proxy-httpd" containerID="cri-o://abd8241dbed51bcd7e985f544a549f95a3a38a90171c79ca5e36392c8e6978de" gracePeriod=30
Mar 10 22:13:31 crc kubenswrapper[4919]: I0310 22:13:31.912983 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1414266b-7e10-49f9-8a97-3fe6238cd61c" containerName="sg-core" containerID="cri-o://3ff943dd9d7d072cf3d2d3c2ad0e040ce352e3ee1431449fadafb8f84cb980f6" gracePeriod=30
Mar 10 22:13:31 crc kubenswrapper[4919]: I0310 22:13:31.913090 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1414266b-7e10-49f9-8a97-3fe6238cd61c" containerName="ceilometer-notification-agent" containerID="cri-o://1a23096ba1de1649229865bb2ad45b1f791571f42d8d38b1370e90773a9e66db" gracePeriod=30
Mar 10 22:13:31 crc kubenswrapper[4919]: I0310 22:13:31.944558 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.02631443 podStartE2EDuration="11.94338383s" podCreationTimestamp="2026-03-10 22:13:20 +0000 UTC" firstStartedPulling="2026-03-10 22:13:20.963657389 +0000 UTC m=+1388.205537997" lastFinishedPulling="2026-03-10 22:13:30.880726789 +0000 UTC m=+1398.122607397" observedRunningTime="2026-03-10 22:13:31.933122912 +0000 UTC m=+1399.175003540" watchObservedRunningTime="2026-03-10 22:13:31.94338383 +0000 UTC m=+1399.185264438"
Mar 10 22:13:32 crc kubenswrapper[4919]: E0310 22:13:32.475935 4919 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1414266b_7e10_49f9_8a97_3fe6238cd61c.slice/crio-d663d84dc97fb469b9aa16cb833c1c32c425dd5e6f6b94be4acbd8b521b493f3.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1414266b_7e10_49f9_8a97_3fe6238cd61c.slice/crio-1a23096ba1de1649229865bb2ad45b1f791571f42d8d38b1370e90773a9e66db.scope\": RecentStats: unable to find data in memory cache]"
Mar 10 22:13:32 crc kubenswrapper[4919]: I0310 22:13:32.577637 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 10 22:13:32 crc kubenswrapper[4919]: I0310 22:13:32.578180 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="7050d40c-b959-48a8-b21f-b9f5e308c920" containerName="glance-log" containerID="cri-o://b4663f7d7e0ab4279572096b43ee1b65d11f4de19b52fe14f7d2d0fbaf38a65d" gracePeriod=30
Mar 10 22:13:32 crc kubenswrapper[4919]: I0310 22:13:32.578350 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="7050d40c-b959-48a8-b21f-b9f5e308c920" containerName="glance-httpd" containerID="cri-o://2734d9e2168cf611b701bf7332456b621e4b8962de1b13492fc28d84fb7815b5" gracePeriod=30
Mar 10 22:13:32 crc kubenswrapper[4919]: I0310 22:13:32.922784 4919 generic.go:334] "Generic (PLEG): container finished" podID="1414266b-7e10-49f9-8a97-3fe6238cd61c" containerID="abd8241dbed51bcd7e985f544a549f95a3a38a90171c79ca5e36392c8e6978de" exitCode=0
Mar 10 22:13:32 crc kubenswrapper[4919]: I0310 22:13:32.922814 4919 generic.go:334] "Generic (PLEG): container finished" podID="1414266b-7e10-49f9-8a97-3fe6238cd61c" containerID="3ff943dd9d7d072cf3d2d3c2ad0e040ce352e3ee1431449fadafb8f84cb980f6" exitCode=2
Mar 10 22:13:32 crc kubenswrapper[4919]: I0310 22:13:32.922834 4919 generic.go:334] "Generic (PLEG): container finished" podID="1414266b-7e10-49f9-8a97-3fe6238cd61c" containerID="1a23096ba1de1649229865bb2ad45b1f791571f42d8d38b1370e90773a9e66db" exitCode=0
Mar 10 22:13:32 crc kubenswrapper[4919]: I0310 22:13:32.922841 4919 generic.go:334] "Generic (PLEG): container finished" podID="1414266b-7e10-49f9-8a97-3fe6238cd61c" containerID="d663d84dc97fb469b9aa16cb833c1c32c425dd5e6f6b94be4acbd8b521b493f3" exitCode=0
Mar 10 22:13:32 crc kubenswrapper[4919]: I0310 22:13:32.922877 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1414266b-7e10-49f9-8a97-3fe6238cd61c","Type":"ContainerDied","Data":"abd8241dbed51bcd7e985f544a549f95a3a38a90171c79ca5e36392c8e6978de"}
Mar 10 22:13:32 crc kubenswrapper[4919]: I0310 22:13:32.922902 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1414266b-7e10-49f9-8a97-3fe6238cd61c","Type":"ContainerDied","Data":"3ff943dd9d7d072cf3d2d3c2ad0e040ce352e3ee1431449fadafb8f84cb980f6"}
Mar 10 22:13:32 crc kubenswrapper[4919]: I0310 22:13:32.922911 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1414266b-7e10-49f9-8a97-3fe6238cd61c","Type":"ContainerDied","Data":"1a23096ba1de1649229865bb2ad45b1f791571f42d8d38b1370e90773a9e66db"}
Mar 10 22:13:32 crc kubenswrapper[4919]: I0310 22:13:32.922920 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1414266b-7e10-49f9-8a97-3fe6238cd61c","Type":"ContainerDied","Data":"d663d84dc97fb469b9aa16cb833c1c32c425dd5e6f6b94be4acbd8b521b493f3"}
Mar 10 22:13:32 crc kubenswrapper[4919]: I0310 22:13:32.925831 4919 generic.go:334] "Generic (PLEG): container finished" podID="7050d40c-b959-48a8-b21f-b9f5e308c920" containerID="b4663f7d7e0ab4279572096b43ee1b65d11f4de19b52fe14f7d2d0fbaf38a65d" exitCode=143
Mar 10 22:13:32 crc kubenswrapper[4919]: I0310 22:13:32.925857 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7050d40c-b959-48a8-b21f-b9f5e308c920","Type":"ContainerDied","Data":"b4663f7d7e0ab4279572096b43ee1b65d11f4de19b52fe14f7d2d0fbaf38a65d"}
Mar 10 22:13:33 crc kubenswrapper[4919]: I0310 22:13:33.390545 4919 util.go:48] "No ready
sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 10 22:13:33 crc kubenswrapper[4919]: I0310 22:13:33.403418 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1414266b-7e10-49f9-8a97-3fe6238cd61c-log-httpd\") pod \"1414266b-7e10-49f9-8a97-3fe6238cd61c\" (UID: \"1414266b-7e10-49f9-8a97-3fe6238cd61c\") " Mar 10 22:13:33 crc kubenswrapper[4919]: I0310 22:13:33.403535 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1414266b-7e10-49f9-8a97-3fe6238cd61c-sg-core-conf-yaml\") pod \"1414266b-7e10-49f9-8a97-3fe6238cd61c\" (UID: \"1414266b-7e10-49f9-8a97-3fe6238cd61c\") " Mar 10 22:13:33 crc kubenswrapper[4919]: I0310 22:13:33.403562 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1414266b-7e10-49f9-8a97-3fe6238cd61c-run-httpd\") pod \"1414266b-7e10-49f9-8a97-3fe6238cd61c\" (UID: \"1414266b-7e10-49f9-8a97-3fe6238cd61c\") " Mar 10 22:13:33 crc kubenswrapper[4919]: I0310 22:13:33.403624 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n4qpr\" (UniqueName: \"kubernetes.io/projected/1414266b-7e10-49f9-8a97-3fe6238cd61c-kube-api-access-n4qpr\") pod \"1414266b-7e10-49f9-8a97-3fe6238cd61c\" (UID: \"1414266b-7e10-49f9-8a97-3fe6238cd61c\") " Mar 10 22:13:33 crc kubenswrapper[4919]: I0310 22:13:33.403700 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1414266b-7e10-49f9-8a97-3fe6238cd61c-config-data\") pod \"1414266b-7e10-49f9-8a97-3fe6238cd61c\" (UID: \"1414266b-7e10-49f9-8a97-3fe6238cd61c\") " Mar 10 22:13:33 crc kubenswrapper[4919]: I0310 22:13:33.403779 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1414266b-7e10-49f9-8a97-3fe6238cd61c-combined-ca-bundle\") pod \"1414266b-7e10-49f9-8a97-3fe6238cd61c\" (UID: \"1414266b-7e10-49f9-8a97-3fe6238cd61c\") " Mar 10 22:13:33 crc kubenswrapper[4919]: I0310 22:13:33.403850 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1414266b-7e10-49f9-8a97-3fe6238cd61c-scripts\") pod \"1414266b-7e10-49f9-8a97-3fe6238cd61c\" (UID: \"1414266b-7e10-49f9-8a97-3fe6238cd61c\") " Mar 10 22:13:33 crc kubenswrapper[4919]: I0310 22:13:33.403862 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1414266b-7e10-49f9-8a97-3fe6238cd61c-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "1414266b-7e10-49f9-8a97-3fe6238cd61c" (UID: "1414266b-7e10-49f9-8a97-3fe6238cd61c"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 22:13:33 crc kubenswrapper[4919]: I0310 22:13:33.404226 4919 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1414266b-7e10-49f9-8a97-3fe6238cd61c-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 10 22:13:33 crc kubenswrapper[4919]: I0310 22:13:33.404283 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1414266b-7e10-49f9-8a97-3fe6238cd61c-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "1414266b-7e10-49f9-8a97-3fe6238cd61c" (UID: "1414266b-7e10-49f9-8a97-3fe6238cd61c"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 22:13:33 crc kubenswrapper[4919]: I0310 22:13:33.412558 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1414266b-7e10-49f9-8a97-3fe6238cd61c-scripts" (OuterVolumeSpecName: "scripts") pod "1414266b-7e10-49f9-8a97-3fe6238cd61c" (UID: "1414266b-7e10-49f9-8a97-3fe6238cd61c"). 
InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:13:33 crc kubenswrapper[4919]: I0310 22:13:33.427672 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1414266b-7e10-49f9-8a97-3fe6238cd61c-kube-api-access-n4qpr" (OuterVolumeSpecName: "kube-api-access-n4qpr") pod "1414266b-7e10-49f9-8a97-3fe6238cd61c" (UID: "1414266b-7e10-49f9-8a97-3fe6238cd61c"). InnerVolumeSpecName "kube-api-access-n4qpr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 22:13:33 crc kubenswrapper[4919]: I0310 22:13:33.451502 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1414266b-7e10-49f9-8a97-3fe6238cd61c-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "1414266b-7e10-49f9-8a97-3fe6238cd61c" (UID: "1414266b-7e10-49f9-8a97-3fe6238cd61c"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:13:33 crc kubenswrapper[4919]: I0310 22:13:33.506349 4919 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1414266b-7e10-49f9-8a97-3fe6238cd61c-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 22:13:33 crc kubenswrapper[4919]: I0310 22:13:33.506469 4919 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1414266b-7e10-49f9-8a97-3fe6238cd61c-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 10 22:13:33 crc kubenswrapper[4919]: I0310 22:13:33.506487 4919 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1414266b-7e10-49f9-8a97-3fe6238cd61c-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 10 22:13:33 crc kubenswrapper[4919]: I0310 22:13:33.506501 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n4qpr\" (UniqueName: 
\"kubernetes.io/projected/1414266b-7e10-49f9-8a97-3fe6238cd61c-kube-api-access-n4qpr\") on node \"crc\" DevicePath \"\"" Mar 10 22:13:33 crc kubenswrapper[4919]: I0310 22:13:33.542653 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1414266b-7e10-49f9-8a97-3fe6238cd61c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1414266b-7e10-49f9-8a97-3fe6238cd61c" (UID: "1414266b-7e10-49f9-8a97-3fe6238cd61c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:13:33 crc kubenswrapper[4919]: I0310 22:13:33.583093 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1414266b-7e10-49f9-8a97-3fe6238cd61c-config-data" (OuterVolumeSpecName: "config-data") pod "1414266b-7e10-49f9-8a97-3fe6238cd61c" (UID: "1414266b-7e10-49f9-8a97-3fe6238cd61c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:13:33 crc kubenswrapper[4919]: I0310 22:13:33.608285 4919 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1414266b-7e10-49f9-8a97-3fe6238cd61c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 22:13:33 crc kubenswrapper[4919]: I0310 22:13:33.608321 4919 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1414266b-7e10-49f9-8a97-3fe6238cd61c-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 22:13:33 crc kubenswrapper[4919]: I0310 22:13:33.725035 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 10 22:13:33 crc kubenswrapper[4919]: I0310 22:13:33.916865 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/09d4bc6e-4f9e-4375-a816-2aad9cf376b2-httpd-run\") pod \"09d4bc6e-4f9e-4375-a816-2aad9cf376b2\" (UID: \"09d4bc6e-4f9e-4375-a816-2aad9cf376b2\") " Mar 10 22:13:33 crc kubenswrapper[4919]: I0310 22:13:33.917818 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/09d4bc6e-4f9e-4375-a816-2aad9cf376b2-scripts\") pod \"09d4bc6e-4f9e-4375-a816-2aad9cf376b2\" (UID: \"09d4bc6e-4f9e-4375-a816-2aad9cf376b2\") " Mar 10 22:13:33 crc kubenswrapper[4919]: I0310 22:13:33.917903 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/09d4bc6e-4f9e-4375-a816-2aad9cf376b2-public-tls-certs\") pod \"09d4bc6e-4f9e-4375-a816-2aad9cf376b2\" (UID: \"09d4bc6e-4f9e-4375-a816-2aad9cf376b2\") " Mar 10 22:13:33 crc kubenswrapper[4919]: I0310 22:13:33.917458 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/09d4bc6e-4f9e-4375-a816-2aad9cf376b2-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "09d4bc6e-4f9e-4375-a816-2aad9cf376b2" (UID: "09d4bc6e-4f9e-4375-a816-2aad9cf376b2"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 22:13:33 crc kubenswrapper[4919]: I0310 22:13:33.918220 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/09d4bc6e-4f9e-4375-a816-2aad9cf376b2-logs\") pod \"09d4bc6e-4f9e-4375-a816-2aad9cf376b2\" (UID: \"09d4bc6e-4f9e-4375-a816-2aad9cf376b2\") " Mar 10 22:13:33 crc kubenswrapper[4919]: I0310 22:13:33.918321 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09d4bc6e-4f9e-4375-a816-2aad9cf376b2-config-data\") pod \"09d4bc6e-4f9e-4375-a816-2aad9cf376b2\" (UID: \"09d4bc6e-4f9e-4375-a816-2aad9cf376b2\") " Mar 10 22:13:33 crc kubenswrapper[4919]: I0310 22:13:33.918462 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pg8ck\" (UniqueName: \"kubernetes.io/projected/09d4bc6e-4f9e-4375-a816-2aad9cf376b2-kube-api-access-pg8ck\") pod \"09d4bc6e-4f9e-4375-a816-2aad9cf376b2\" (UID: \"09d4bc6e-4f9e-4375-a816-2aad9cf376b2\") " Mar 10 22:13:33 crc kubenswrapper[4919]: I0310 22:13:33.918600 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"09d4bc6e-4f9e-4375-a816-2aad9cf376b2\" (UID: \"09d4bc6e-4f9e-4375-a816-2aad9cf376b2\") " Mar 10 22:13:33 crc kubenswrapper[4919]: I0310 22:13:33.918753 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09d4bc6e-4f9e-4375-a816-2aad9cf376b2-combined-ca-bundle\") pod \"09d4bc6e-4f9e-4375-a816-2aad9cf376b2\" (UID: \"09d4bc6e-4f9e-4375-a816-2aad9cf376b2\") " Mar 10 22:13:33 crc kubenswrapper[4919]: I0310 22:13:33.919683 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/09d4bc6e-4f9e-4375-a816-2aad9cf376b2-logs" 
(OuterVolumeSpecName: "logs") pod "09d4bc6e-4f9e-4375-a816-2aad9cf376b2" (UID: "09d4bc6e-4f9e-4375-a816-2aad9cf376b2"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 22:13:33 crc kubenswrapper[4919]: I0310 22:13:33.921569 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09d4bc6e-4f9e-4375-a816-2aad9cf376b2-scripts" (OuterVolumeSpecName: "scripts") pod "09d4bc6e-4f9e-4375-a816-2aad9cf376b2" (UID: "09d4bc6e-4f9e-4375-a816-2aad9cf376b2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:13:33 crc kubenswrapper[4919]: I0310 22:13:33.921673 4919 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/09d4bc6e-4f9e-4375-a816-2aad9cf376b2-logs\") on node \"crc\" DevicePath \"\"" Mar 10 22:13:33 crc kubenswrapper[4919]: I0310 22:13:33.921694 4919 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/09d4bc6e-4f9e-4375-a816-2aad9cf376b2-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 10 22:13:33 crc kubenswrapper[4919]: I0310 22:13:33.922469 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09d4bc6e-4f9e-4375-a816-2aad9cf376b2-kube-api-access-pg8ck" (OuterVolumeSpecName: "kube-api-access-pg8ck") pod "09d4bc6e-4f9e-4375-a816-2aad9cf376b2" (UID: "09d4bc6e-4f9e-4375-a816-2aad9cf376b2"). InnerVolumeSpecName "kube-api-access-pg8ck". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 22:13:33 crc kubenswrapper[4919]: I0310 22:13:33.924763 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "glance") pod "09d4bc6e-4f9e-4375-a816-2aad9cf376b2" (UID: "09d4bc6e-4f9e-4375-a816-2aad9cf376b2"). InnerVolumeSpecName "local-storage10-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 10 22:13:33 crc kubenswrapper[4919]: I0310 22:13:33.940362 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1414266b-7e10-49f9-8a97-3fe6238cd61c","Type":"ContainerDied","Data":"a257ec7adcb6a4da68990b22f8e8a3e35d3c6e045d3438c7c2f16c5507d6673f"} Mar 10 22:13:33 crc kubenswrapper[4919]: I0310 22:13:33.940435 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 10 22:13:33 crc kubenswrapper[4919]: I0310 22:13:33.940467 4919 scope.go:117] "RemoveContainer" containerID="abd8241dbed51bcd7e985f544a549f95a3a38a90171c79ca5e36392c8e6978de" Mar 10 22:13:33 crc kubenswrapper[4919]: I0310 22:13:33.947246 4919 generic.go:334] "Generic (PLEG): container finished" podID="09d4bc6e-4f9e-4375-a816-2aad9cf376b2" containerID="c24d19b94dafe98f4ece853e6e12aa2d60a40b12a18aa5a70a19ec640535ba24" exitCode=0 Mar 10 22:13:33 crc kubenswrapper[4919]: I0310 22:13:33.947313 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"09d4bc6e-4f9e-4375-a816-2aad9cf376b2","Type":"ContainerDied","Data":"c24d19b94dafe98f4ece853e6e12aa2d60a40b12a18aa5a70a19ec640535ba24"} Mar 10 22:13:33 crc kubenswrapper[4919]: I0310 22:13:33.947353 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"09d4bc6e-4f9e-4375-a816-2aad9cf376b2","Type":"ContainerDied","Data":"d0d0d91e4f048967e856ec9ab78e4ae0091d82ddfdeceab13d97dd1e341fe97b"} Mar 10 22:13:33 crc kubenswrapper[4919]: I0310 22:13:33.947666 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 10 22:13:33 crc kubenswrapper[4919]: I0310 22:13:33.976924 4919 scope.go:117] "RemoveContainer" containerID="3ff943dd9d7d072cf3d2d3c2ad0e040ce352e3ee1431449fadafb8f84cb980f6" Mar 10 22:13:33 crc kubenswrapper[4919]: I0310 22:13:33.988949 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 10 22:13:34 crc kubenswrapper[4919]: I0310 22:13:34.004578 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 10 22:13:34 crc kubenswrapper[4919]: I0310 22:13:34.007252 4919 scope.go:117] "RemoveContainer" containerID="1a23096ba1de1649229865bb2ad45b1f791571f42d8d38b1370e90773a9e66db" Mar 10 22:13:34 crc kubenswrapper[4919]: I0310 22:13:34.015375 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 10 22:13:34 crc kubenswrapper[4919]: E0310 22:13:34.015894 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fe9eaba-0336-4655-b2d9-9bd67261da54" containerName="neutron-api" Mar 10 22:13:34 crc kubenswrapper[4919]: I0310 22:13:34.015919 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fe9eaba-0336-4655-b2d9-9bd67261da54" containerName="neutron-api" Mar 10 22:13:34 crc kubenswrapper[4919]: E0310 22:13:34.015944 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09d4bc6e-4f9e-4375-a816-2aad9cf376b2" containerName="glance-log" Mar 10 22:13:34 crc kubenswrapper[4919]: I0310 22:13:34.015953 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="09d4bc6e-4f9e-4375-a816-2aad9cf376b2" containerName="glance-log" Mar 10 22:13:34 crc kubenswrapper[4919]: E0310 22:13:34.015964 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1414266b-7e10-49f9-8a97-3fe6238cd61c" containerName="ceilometer-notification-agent" Mar 10 22:13:34 crc kubenswrapper[4919]: I0310 22:13:34.015971 4919 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="1414266b-7e10-49f9-8a97-3fe6238cd61c" containerName="ceilometer-notification-agent" Mar 10 22:13:34 crc kubenswrapper[4919]: E0310 22:13:34.015986 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09d4bc6e-4f9e-4375-a816-2aad9cf376b2" containerName="glance-httpd" Mar 10 22:13:34 crc kubenswrapper[4919]: I0310 22:13:34.015995 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="09d4bc6e-4f9e-4375-a816-2aad9cf376b2" containerName="glance-httpd" Mar 10 22:13:34 crc kubenswrapper[4919]: E0310 22:13:34.016013 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fe9eaba-0336-4655-b2d9-9bd67261da54" containerName="neutron-httpd" Mar 10 22:13:34 crc kubenswrapper[4919]: I0310 22:13:34.016019 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fe9eaba-0336-4655-b2d9-9bd67261da54" containerName="neutron-httpd" Mar 10 22:13:34 crc kubenswrapper[4919]: E0310 22:13:34.016033 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1414266b-7e10-49f9-8a97-3fe6238cd61c" containerName="proxy-httpd" Mar 10 22:13:34 crc kubenswrapper[4919]: I0310 22:13:34.016041 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="1414266b-7e10-49f9-8a97-3fe6238cd61c" containerName="proxy-httpd" Mar 10 22:13:34 crc kubenswrapper[4919]: E0310 22:13:34.016056 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1414266b-7e10-49f9-8a97-3fe6238cd61c" containerName="sg-core" Mar 10 22:13:34 crc kubenswrapper[4919]: I0310 22:13:34.016062 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="1414266b-7e10-49f9-8a97-3fe6238cd61c" containerName="sg-core" Mar 10 22:13:34 crc kubenswrapper[4919]: E0310 22:13:34.016080 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1414266b-7e10-49f9-8a97-3fe6238cd61c" containerName="ceilometer-central-agent" Mar 10 22:13:34 crc kubenswrapper[4919]: I0310 22:13:34.016086 4919 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="1414266b-7e10-49f9-8a97-3fe6238cd61c" containerName="ceilometer-central-agent" Mar 10 22:13:34 crc kubenswrapper[4919]: I0310 22:13:34.016272 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="1414266b-7e10-49f9-8a97-3fe6238cd61c" containerName="ceilometer-central-agent" Mar 10 22:13:34 crc kubenswrapper[4919]: I0310 22:13:34.016283 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="4fe9eaba-0336-4655-b2d9-9bd67261da54" containerName="neutron-api" Mar 10 22:13:34 crc kubenswrapper[4919]: I0310 22:13:34.016300 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="09d4bc6e-4f9e-4375-a816-2aad9cf376b2" containerName="glance-httpd" Mar 10 22:13:34 crc kubenswrapper[4919]: I0310 22:13:34.016313 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="4fe9eaba-0336-4655-b2d9-9bd67261da54" containerName="neutron-httpd" Mar 10 22:13:34 crc kubenswrapper[4919]: I0310 22:13:34.016329 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="1414266b-7e10-49f9-8a97-3fe6238cd61c" containerName="sg-core" Mar 10 22:13:34 crc kubenswrapper[4919]: I0310 22:13:34.016340 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="1414266b-7e10-49f9-8a97-3fe6238cd61c" containerName="ceilometer-notification-agent" Mar 10 22:13:34 crc kubenswrapper[4919]: I0310 22:13:34.016352 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="09d4bc6e-4f9e-4375-a816-2aad9cf376b2" containerName="glance-log" Mar 10 22:13:34 crc kubenswrapper[4919]: I0310 22:13:34.016364 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="1414266b-7e10-49f9-8a97-3fe6238cd61c" containerName="proxy-httpd" Mar 10 22:13:34 crc kubenswrapper[4919]: I0310 22:13:34.018319 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 10 22:13:34 crc kubenswrapper[4919]: I0310 22:13:34.022849 4919 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/09d4bc6e-4f9e-4375-a816-2aad9cf376b2-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 22:13:34 crc kubenswrapper[4919]: I0310 22:13:34.022896 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pg8ck\" (UniqueName: \"kubernetes.io/projected/09d4bc6e-4f9e-4375-a816-2aad9cf376b2-kube-api-access-pg8ck\") on node \"crc\" DevicePath \"\"" Mar 10 22:13:34 crc kubenswrapper[4919]: I0310 22:13:34.022922 4919 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Mar 10 22:13:34 crc kubenswrapper[4919]: I0310 22:13:34.023553 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 10 22:13:34 crc kubenswrapper[4919]: I0310 22:13:34.025031 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 10 22:13:34 crc kubenswrapper[4919]: I0310 22:13:34.025348 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 10 22:13:34 crc kubenswrapper[4919]: I0310 22:13:34.043063 4919 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Mar 10 22:13:34 crc kubenswrapper[4919]: I0310 22:13:34.124364 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/185a485d-24ea-44f8-bac9-0f4ddc4298ac-config-data\") pod \"ceilometer-0\" (UID: \"185a485d-24ea-44f8-bac9-0f4ddc4298ac\") " pod="openstack/ceilometer-0" Mar 10 22:13:34 crc kubenswrapper[4919]: I0310 22:13:34.124410 4919 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/185a485d-24ea-44f8-bac9-0f4ddc4298ac-scripts\") pod \"ceilometer-0\" (UID: \"185a485d-24ea-44f8-bac9-0f4ddc4298ac\") " pod="openstack/ceilometer-0" Mar 10 22:13:34 crc kubenswrapper[4919]: I0310 22:13:34.124438 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/185a485d-24ea-44f8-bac9-0f4ddc4298ac-run-httpd\") pod \"ceilometer-0\" (UID: \"185a485d-24ea-44f8-bac9-0f4ddc4298ac\") " pod="openstack/ceilometer-0" Mar 10 22:13:34 crc kubenswrapper[4919]: I0310 22:13:34.124454 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/185a485d-24ea-44f8-bac9-0f4ddc4298ac-log-httpd\") pod \"ceilometer-0\" (UID: \"185a485d-24ea-44f8-bac9-0f4ddc4298ac\") " pod="openstack/ceilometer-0" Mar 10 22:13:34 crc kubenswrapper[4919]: I0310 22:13:34.124500 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/185a485d-24ea-44f8-bac9-0f4ddc4298ac-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"185a485d-24ea-44f8-bac9-0f4ddc4298ac\") " pod="openstack/ceilometer-0" Mar 10 22:13:34 crc kubenswrapper[4919]: I0310 22:13:34.124587 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/185a485d-24ea-44f8-bac9-0f4ddc4298ac-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"185a485d-24ea-44f8-bac9-0f4ddc4298ac\") " pod="openstack/ceilometer-0" Mar 10 22:13:34 crc kubenswrapper[4919]: I0310 22:13:34.124603 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h9wjx\" (UniqueName: 
\"kubernetes.io/projected/185a485d-24ea-44f8-bac9-0f4ddc4298ac-kube-api-access-h9wjx\") pod \"ceilometer-0\" (UID: \"185a485d-24ea-44f8-bac9-0f4ddc4298ac\") " pod="openstack/ceilometer-0" Mar 10 22:13:34 crc kubenswrapper[4919]: I0310 22:13:34.124643 4919 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Mar 10 22:13:34 crc kubenswrapper[4919]: I0310 22:13:34.179179 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09d4bc6e-4f9e-4375-a816-2aad9cf376b2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "09d4bc6e-4f9e-4375-a816-2aad9cf376b2" (UID: "09d4bc6e-4f9e-4375-a816-2aad9cf376b2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:13:34 crc kubenswrapper[4919]: I0310 22:13:34.210551 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09d4bc6e-4f9e-4375-a816-2aad9cf376b2-config-data" (OuterVolumeSpecName: "config-data") pod "09d4bc6e-4f9e-4375-a816-2aad9cf376b2" (UID: "09d4bc6e-4f9e-4375-a816-2aad9cf376b2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:13:34 crc kubenswrapper[4919]: I0310 22:13:34.216823 4919 scope.go:117] "RemoveContainer" containerID="d663d84dc97fb469b9aa16cb833c1c32c425dd5e6f6b94be4acbd8b521b493f3" Mar 10 22:13:34 crc kubenswrapper[4919]: I0310 22:13:34.226558 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/185a485d-24ea-44f8-bac9-0f4ddc4298ac-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"185a485d-24ea-44f8-bac9-0f4ddc4298ac\") " pod="openstack/ceilometer-0" Mar 10 22:13:34 crc kubenswrapper[4919]: I0310 22:13:34.226745 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/185a485d-24ea-44f8-bac9-0f4ddc4298ac-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"185a485d-24ea-44f8-bac9-0f4ddc4298ac\") " pod="openstack/ceilometer-0" Mar 10 22:13:34 crc kubenswrapper[4919]: I0310 22:13:34.226790 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h9wjx\" (UniqueName: \"kubernetes.io/projected/185a485d-24ea-44f8-bac9-0f4ddc4298ac-kube-api-access-h9wjx\") pod \"ceilometer-0\" (UID: \"185a485d-24ea-44f8-bac9-0f4ddc4298ac\") " pod="openstack/ceilometer-0" Mar 10 22:13:34 crc kubenswrapper[4919]: I0310 22:13:34.226830 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/185a485d-24ea-44f8-bac9-0f4ddc4298ac-config-data\") pod \"ceilometer-0\" (UID: \"185a485d-24ea-44f8-bac9-0f4ddc4298ac\") " pod="openstack/ceilometer-0" Mar 10 22:13:34 crc kubenswrapper[4919]: I0310 22:13:34.226868 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/185a485d-24ea-44f8-bac9-0f4ddc4298ac-scripts\") pod \"ceilometer-0\" (UID: \"185a485d-24ea-44f8-bac9-0f4ddc4298ac\") " 
pod="openstack/ceilometer-0" Mar 10 22:13:34 crc kubenswrapper[4919]: I0310 22:13:34.226897 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/185a485d-24ea-44f8-bac9-0f4ddc4298ac-run-httpd\") pod \"ceilometer-0\" (UID: \"185a485d-24ea-44f8-bac9-0f4ddc4298ac\") " pod="openstack/ceilometer-0" Mar 10 22:13:34 crc kubenswrapper[4919]: I0310 22:13:34.226913 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/185a485d-24ea-44f8-bac9-0f4ddc4298ac-log-httpd\") pod \"ceilometer-0\" (UID: \"185a485d-24ea-44f8-bac9-0f4ddc4298ac\") " pod="openstack/ceilometer-0" Mar 10 22:13:34 crc kubenswrapper[4919]: I0310 22:13:34.226995 4919 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09d4bc6e-4f9e-4375-a816-2aad9cf376b2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 22:13:34 crc kubenswrapper[4919]: I0310 22:13:34.227008 4919 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09d4bc6e-4f9e-4375-a816-2aad9cf376b2-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 22:13:34 crc kubenswrapper[4919]: I0310 22:13:34.227629 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/185a485d-24ea-44f8-bac9-0f4ddc4298ac-log-httpd\") pod \"ceilometer-0\" (UID: \"185a485d-24ea-44f8-bac9-0f4ddc4298ac\") " pod="openstack/ceilometer-0" Mar 10 22:13:34 crc kubenswrapper[4919]: I0310 22:13:34.227948 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/185a485d-24ea-44f8-bac9-0f4ddc4298ac-run-httpd\") pod \"ceilometer-0\" (UID: \"185a485d-24ea-44f8-bac9-0f4ddc4298ac\") " pod="openstack/ceilometer-0" Mar 10 22:13:34 crc kubenswrapper[4919]: I0310 22:13:34.231125 
4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/185a485d-24ea-44f8-bac9-0f4ddc4298ac-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"185a485d-24ea-44f8-bac9-0f4ddc4298ac\") " pod="openstack/ceilometer-0" Mar 10 22:13:34 crc kubenswrapper[4919]: I0310 22:13:34.232282 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/185a485d-24ea-44f8-bac9-0f4ddc4298ac-config-data\") pod \"ceilometer-0\" (UID: \"185a485d-24ea-44f8-bac9-0f4ddc4298ac\") " pod="openstack/ceilometer-0" Mar 10 22:13:34 crc kubenswrapper[4919]: I0310 22:13:34.232923 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/185a485d-24ea-44f8-bac9-0f4ddc4298ac-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"185a485d-24ea-44f8-bac9-0f4ddc4298ac\") " pod="openstack/ceilometer-0" Mar 10 22:13:34 crc kubenswrapper[4919]: I0310 22:13:34.234024 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/185a485d-24ea-44f8-bac9-0f4ddc4298ac-scripts\") pod \"ceilometer-0\" (UID: \"185a485d-24ea-44f8-bac9-0f4ddc4298ac\") " pod="openstack/ceilometer-0" Mar 10 22:13:34 crc kubenswrapper[4919]: I0310 22:13:34.244905 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h9wjx\" (UniqueName: \"kubernetes.io/projected/185a485d-24ea-44f8-bac9-0f4ddc4298ac-kube-api-access-h9wjx\") pod \"ceilometer-0\" (UID: \"185a485d-24ea-44f8-bac9-0f4ddc4298ac\") " pod="openstack/ceilometer-0" Mar 10 22:13:34 crc kubenswrapper[4919]: I0310 22:13:34.254249 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09d4bc6e-4f9e-4375-a816-2aad9cf376b2-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "09d4bc6e-4f9e-4375-a816-2aad9cf376b2" (UID: 
"09d4bc6e-4f9e-4375-a816-2aad9cf376b2"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:13:34 crc kubenswrapper[4919]: I0310 22:13:34.319097 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 10 22:13:34 crc kubenswrapper[4919]: I0310 22:13:34.326173 4919 scope.go:117] "RemoveContainer" containerID="c24d19b94dafe98f4ece853e6e12aa2d60a40b12a18aa5a70a19ec640535ba24" Mar 10 22:13:34 crc kubenswrapper[4919]: I0310 22:13:34.328791 4919 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/09d4bc6e-4f9e-4375-a816-2aad9cf376b2-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 22:13:34 crc kubenswrapper[4919]: I0310 22:13:34.378562 4919 scope.go:117] "RemoveContainer" containerID="bb22b6b86c9aa2a98d6e5696308d32888d151cf860adf39c4cb0d518c228adb6" Mar 10 22:13:34 crc kubenswrapper[4919]: I0310 22:13:34.442945 4919 scope.go:117] "RemoveContainer" containerID="c24d19b94dafe98f4ece853e6e12aa2d60a40b12a18aa5a70a19ec640535ba24" Mar 10 22:13:34 crc kubenswrapper[4919]: E0310 22:13:34.445830 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c24d19b94dafe98f4ece853e6e12aa2d60a40b12a18aa5a70a19ec640535ba24\": container with ID starting with c24d19b94dafe98f4ece853e6e12aa2d60a40b12a18aa5a70a19ec640535ba24 not found: ID does not exist" containerID="c24d19b94dafe98f4ece853e6e12aa2d60a40b12a18aa5a70a19ec640535ba24" Mar 10 22:13:34 crc kubenswrapper[4919]: I0310 22:13:34.445865 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c24d19b94dafe98f4ece853e6e12aa2d60a40b12a18aa5a70a19ec640535ba24"} err="failed to get container status \"c24d19b94dafe98f4ece853e6e12aa2d60a40b12a18aa5a70a19ec640535ba24\": rpc error: code = NotFound desc = could not find container 
\"c24d19b94dafe98f4ece853e6e12aa2d60a40b12a18aa5a70a19ec640535ba24\": container with ID starting with c24d19b94dafe98f4ece853e6e12aa2d60a40b12a18aa5a70a19ec640535ba24 not found: ID does not exist" Mar 10 22:13:34 crc kubenswrapper[4919]: I0310 22:13:34.445920 4919 scope.go:117] "RemoveContainer" containerID="bb22b6b86c9aa2a98d6e5696308d32888d151cf860adf39c4cb0d518c228adb6" Mar 10 22:13:34 crc kubenswrapper[4919]: E0310 22:13:34.449911 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb22b6b86c9aa2a98d6e5696308d32888d151cf860adf39c4cb0d518c228adb6\": container with ID starting with bb22b6b86c9aa2a98d6e5696308d32888d151cf860adf39c4cb0d518c228adb6 not found: ID does not exist" containerID="bb22b6b86c9aa2a98d6e5696308d32888d151cf860adf39c4cb0d518c228adb6" Mar 10 22:13:34 crc kubenswrapper[4919]: I0310 22:13:34.449962 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb22b6b86c9aa2a98d6e5696308d32888d151cf860adf39c4cb0d518c228adb6"} err="failed to get container status \"bb22b6b86c9aa2a98d6e5696308d32888d151cf860adf39c4cb0d518c228adb6\": rpc error: code = NotFound desc = could not find container \"bb22b6b86c9aa2a98d6e5696308d32888d151cf860adf39c4cb0d518c228adb6\": container with ID starting with bb22b6b86c9aa2a98d6e5696308d32888d151cf860adf39c4cb0d518c228adb6 not found: ID does not exist" Mar 10 22:13:34 crc kubenswrapper[4919]: I0310 22:13:34.602784 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 10 22:13:34 crc kubenswrapper[4919]: I0310 22:13:34.621013 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 10 22:13:34 crc kubenswrapper[4919]: I0310 22:13:34.630531 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 10 22:13:34 crc kubenswrapper[4919]: I0310 22:13:34.632266 4919 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 10 22:13:34 crc kubenswrapper[4919]: I0310 22:13:34.638351 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 10 22:13:34 crc kubenswrapper[4919]: I0310 22:13:34.638740 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 10 22:13:34 crc kubenswrapper[4919]: I0310 22:13:34.640602 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 10 22:13:34 crc kubenswrapper[4919]: I0310 22:13:34.791726 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 10 22:13:34 crc kubenswrapper[4919]: I0310 22:13:34.835372 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/91a933f1-aa44-4375-8f5c-e5f3567e6c8e-logs\") pod \"glance-default-external-api-0\" (UID: \"91a933f1-aa44-4375-8f5c-e5f3567e6c8e\") " pod="openstack/glance-default-external-api-0" Mar 10 22:13:34 crc kubenswrapper[4919]: I0310 22:13:34.835726 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"91a933f1-aa44-4375-8f5c-e5f3567e6c8e\") " pod="openstack/glance-default-external-api-0" Mar 10 22:13:34 crc kubenswrapper[4919]: I0310 22:13:34.835759 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91a933f1-aa44-4375-8f5c-e5f3567e6c8e-config-data\") pod \"glance-default-external-api-0\" (UID: \"91a933f1-aa44-4375-8f5c-e5f3567e6c8e\") " pod="openstack/glance-default-external-api-0" Mar 10 22:13:34 crc 
kubenswrapper[4919]: I0310 22:13:34.835780 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91a933f1-aa44-4375-8f5c-e5f3567e6c8e-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"91a933f1-aa44-4375-8f5c-e5f3567e6c8e\") " pod="openstack/glance-default-external-api-0" Mar 10 22:13:34 crc kubenswrapper[4919]: I0310 22:13:34.835815 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/91a933f1-aa44-4375-8f5c-e5f3567e6c8e-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"91a933f1-aa44-4375-8f5c-e5f3567e6c8e\") " pod="openstack/glance-default-external-api-0" Mar 10 22:13:34 crc kubenswrapper[4919]: I0310 22:13:34.835852 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/91a933f1-aa44-4375-8f5c-e5f3567e6c8e-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"91a933f1-aa44-4375-8f5c-e5f3567e6c8e\") " pod="openstack/glance-default-external-api-0" Mar 10 22:13:34 crc kubenswrapper[4919]: I0310 22:13:34.835886 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/91a933f1-aa44-4375-8f5c-e5f3567e6c8e-scripts\") pod \"glance-default-external-api-0\" (UID: \"91a933f1-aa44-4375-8f5c-e5f3567e6c8e\") " pod="openstack/glance-default-external-api-0" Mar 10 22:13:34 crc kubenswrapper[4919]: I0310 22:13:34.836098 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4t2rj\" (UniqueName: \"kubernetes.io/projected/91a933f1-aa44-4375-8f5c-e5f3567e6c8e-kube-api-access-4t2rj\") pod \"glance-default-external-api-0\" (UID: \"91a933f1-aa44-4375-8f5c-e5f3567e6c8e\") " 
pod="openstack/glance-default-external-api-0" Mar 10 22:13:34 crc kubenswrapper[4919]: I0310 22:13:34.937530 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"91a933f1-aa44-4375-8f5c-e5f3567e6c8e\") " pod="openstack/glance-default-external-api-0" Mar 10 22:13:34 crc kubenswrapper[4919]: I0310 22:13:34.937580 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91a933f1-aa44-4375-8f5c-e5f3567e6c8e-config-data\") pod \"glance-default-external-api-0\" (UID: \"91a933f1-aa44-4375-8f5c-e5f3567e6c8e\") " pod="openstack/glance-default-external-api-0" Mar 10 22:13:34 crc kubenswrapper[4919]: I0310 22:13:34.937598 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91a933f1-aa44-4375-8f5c-e5f3567e6c8e-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"91a933f1-aa44-4375-8f5c-e5f3567e6c8e\") " pod="openstack/glance-default-external-api-0" Mar 10 22:13:34 crc kubenswrapper[4919]: I0310 22:13:34.937635 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/91a933f1-aa44-4375-8f5c-e5f3567e6c8e-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"91a933f1-aa44-4375-8f5c-e5f3567e6c8e\") " pod="openstack/glance-default-external-api-0" Mar 10 22:13:34 crc kubenswrapper[4919]: I0310 22:13:34.937679 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/91a933f1-aa44-4375-8f5c-e5f3567e6c8e-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"91a933f1-aa44-4375-8f5c-e5f3567e6c8e\") " pod="openstack/glance-default-external-api-0" Mar 10 22:13:34 crc 
kubenswrapper[4919]: I0310 22:13:34.937716 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/91a933f1-aa44-4375-8f5c-e5f3567e6c8e-scripts\") pod \"glance-default-external-api-0\" (UID: \"91a933f1-aa44-4375-8f5c-e5f3567e6c8e\") " pod="openstack/glance-default-external-api-0" Mar 10 22:13:34 crc kubenswrapper[4919]: I0310 22:13:34.937756 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4t2rj\" (UniqueName: \"kubernetes.io/projected/91a933f1-aa44-4375-8f5c-e5f3567e6c8e-kube-api-access-4t2rj\") pod \"glance-default-external-api-0\" (UID: \"91a933f1-aa44-4375-8f5c-e5f3567e6c8e\") " pod="openstack/glance-default-external-api-0" Mar 10 22:13:34 crc kubenswrapper[4919]: I0310 22:13:34.937833 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/91a933f1-aa44-4375-8f5c-e5f3567e6c8e-logs\") pod \"glance-default-external-api-0\" (UID: \"91a933f1-aa44-4375-8f5c-e5f3567e6c8e\") " pod="openstack/glance-default-external-api-0" Mar 10 22:13:34 crc kubenswrapper[4919]: I0310 22:13:34.938023 4919 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"91a933f1-aa44-4375-8f5c-e5f3567e6c8e\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/glance-default-external-api-0" Mar 10 22:13:34 crc kubenswrapper[4919]: I0310 22:13:34.938493 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/91a933f1-aa44-4375-8f5c-e5f3567e6c8e-logs\") pod \"glance-default-external-api-0\" (UID: \"91a933f1-aa44-4375-8f5c-e5f3567e6c8e\") " pod="openstack/glance-default-external-api-0" Mar 10 22:13:34 crc kubenswrapper[4919]: I0310 22:13:34.938829 4919 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/91a933f1-aa44-4375-8f5c-e5f3567e6c8e-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"91a933f1-aa44-4375-8f5c-e5f3567e6c8e\") " pod="openstack/glance-default-external-api-0" Mar 10 22:13:34 crc kubenswrapper[4919]: I0310 22:13:34.943342 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/91a933f1-aa44-4375-8f5c-e5f3567e6c8e-scripts\") pod \"glance-default-external-api-0\" (UID: \"91a933f1-aa44-4375-8f5c-e5f3567e6c8e\") " pod="openstack/glance-default-external-api-0" Mar 10 22:13:34 crc kubenswrapper[4919]: I0310 22:13:34.943797 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91a933f1-aa44-4375-8f5c-e5f3567e6c8e-config-data\") pod \"glance-default-external-api-0\" (UID: \"91a933f1-aa44-4375-8f5c-e5f3567e6c8e\") " pod="openstack/glance-default-external-api-0" Mar 10 22:13:34 crc kubenswrapper[4919]: I0310 22:13:34.943984 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/91a933f1-aa44-4375-8f5c-e5f3567e6c8e-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"91a933f1-aa44-4375-8f5c-e5f3567e6c8e\") " pod="openstack/glance-default-external-api-0" Mar 10 22:13:34 crc kubenswrapper[4919]: I0310 22:13:34.945039 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91a933f1-aa44-4375-8f5c-e5f3567e6c8e-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"91a933f1-aa44-4375-8f5c-e5f3567e6c8e\") " pod="openstack/glance-default-external-api-0" Mar 10 22:13:34 crc kubenswrapper[4919]: I0310 22:13:34.954886 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4t2rj\" (UniqueName: 
\"kubernetes.io/projected/91a933f1-aa44-4375-8f5c-e5f3567e6c8e-kube-api-access-4t2rj\") pod \"glance-default-external-api-0\" (UID: \"91a933f1-aa44-4375-8f5c-e5f3567e6c8e\") " pod="openstack/glance-default-external-api-0" Mar 10 22:13:34 crc kubenswrapper[4919]: I0310 22:13:34.960098 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"185a485d-24ea-44f8-bac9-0f4ddc4298ac","Type":"ContainerStarted","Data":"6287dc502f96de7ba8ee9642b48eb9b3fceaf41d386f0f2aada2a7d0d4b8ce1b"} Mar 10 22:13:34 crc kubenswrapper[4919]: I0310 22:13:34.965403 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"91a933f1-aa44-4375-8f5c-e5f3567e6c8e\") " pod="openstack/glance-default-external-api-0" Mar 10 22:13:35 crc kubenswrapper[4919]: I0310 22:13:35.258802 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 10 22:13:35 crc kubenswrapper[4919]: I0310 22:13:35.500699 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09d4bc6e-4f9e-4375-a816-2aad9cf376b2" path="/var/lib/kubelet/pods/09d4bc6e-4f9e-4375-a816-2aad9cf376b2/volumes" Mar 10 22:13:35 crc kubenswrapper[4919]: I0310 22:13:35.501745 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1414266b-7e10-49f9-8a97-3fe6238cd61c" path="/var/lib/kubelet/pods/1414266b-7e10-49f9-8a97-3fe6238cd61c/volumes" Mar 10 22:13:35 crc kubenswrapper[4919]: I0310 22:13:35.910612 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 10 22:13:36 crc kubenswrapper[4919]: I0310 22:13:36.060182 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"185a485d-24ea-44f8-bac9-0f4ddc4298ac","Type":"ContainerStarted","Data":"41cea86b72e1810ac1c93113c6c7092dda1306995f03af03db835ebbae38f192"} Mar 10 22:13:36 crc kubenswrapper[4919]: I0310 22:13:36.063233 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"91a933f1-aa44-4375-8f5c-e5f3567e6c8e","Type":"ContainerStarted","Data":"952015c9db412080f67ce57047b4d32bb65eacfc5d636344933b226c8847b438"} Mar 10 22:13:36 crc kubenswrapper[4919]: I0310 22:13:36.070886 4919 generic.go:334] "Generic (PLEG): container finished" podID="7050d40c-b959-48a8-b21f-b9f5e308c920" containerID="2734d9e2168cf611b701bf7332456b621e4b8962de1b13492fc28d84fb7815b5" exitCode=0 Mar 10 22:13:36 crc kubenswrapper[4919]: I0310 22:13:36.070937 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7050d40c-b959-48a8-b21f-b9f5e308c920","Type":"ContainerDied","Data":"2734d9e2168cf611b701bf7332456b621e4b8962de1b13492fc28d84fb7815b5"} Mar 10 22:13:36 crc kubenswrapper[4919]: I0310 22:13:36.473809 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 10 22:13:36 crc kubenswrapper[4919]: I0310 22:13:36.566926 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7050d40c-b959-48a8-b21f-b9f5e308c920-internal-tls-certs\") pod \"7050d40c-b959-48a8-b21f-b9f5e308c920\" (UID: \"7050d40c-b959-48a8-b21f-b9f5e308c920\") " Mar 10 22:13:36 crc kubenswrapper[4919]: I0310 22:13:36.567053 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7050d40c-b959-48a8-b21f-b9f5e308c920-config-data\") pod \"7050d40c-b959-48a8-b21f-b9f5e308c920\" (UID: \"7050d40c-b959-48a8-b21f-b9f5e308c920\") " Mar 10 22:13:36 crc kubenswrapper[4919]: I0310 22:13:36.567090 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2qhgn\" (UniqueName: \"kubernetes.io/projected/7050d40c-b959-48a8-b21f-b9f5e308c920-kube-api-access-2qhgn\") pod \"7050d40c-b959-48a8-b21f-b9f5e308c920\" (UID: \"7050d40c-b959-48a8-b21f-b9f5e308c920\") " Mar 10 22:13:36 crc kubenswrapper[4919]: I0310 22:13:36.567124 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7050d40c-b959-48a8-b21f-b9f5e308c920-combined-ca-bundle\") pod \"7050d40c-b959-48a8-b21f-b9f5e308c920\" (UID: \"7050d40c-b959-48a8-b21f-b9f5e308c920\") " Mar 10 22:13:36 crc kubenswrapper[4919]: I0310 22:13:36.575627 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7050d40c-b959-48a8-b21f-b9f5e308c920-kube-api-access-2qhgn" (OuterVolumeSpecName: "kube-api-access-2qhgn") pod "7050d40c-b959-48a8-b21f-b9f5e308c920" (UID: "7050d40c-b959-48a8-b21f-b9f5e308c920"). InnerVolumeSpecName "kube-api-access-2qhgn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 22:13:36 crc kubenswrapper[4919]: I0310 22:13:36.619188 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7050d40c-b959-48a8-b21f-b9f5e308c920-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7050d40c-b959-48a8-b21f-b9f5e308c920" (UID: "7050d40c-b959-48a8-b21f-b9f5e308c920"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:13:36 crc kubenswrapper[4919]: I0310 22:13:36.627859 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7050d40c-b959-48a8-b21f-b9f5e308c920-config-data" (OuterVolumeSpecName: "config-data") pod "7050d40c-b959-48a8-b21f-b9f5e308c920" (UID: "7050d40c-b959-48a8-b21f-b9f5e308c920"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:13:36 crc kubenswrapper[4919]: I0310 22:13:36.647521 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7050d40c-b959-48a8-b21f-b9f5e308c920-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "7050d40c-b959-48a8-b21f-b9f5e308c920" (UID: "7050d40c-b959-48a8-b21f-b9f5e308c920"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:13:36 crc kubenswrapper[4919]: I0310 22:13:36.669183 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7050d40c-b959-48a8-b21f-b9f5e308c920-httpd-run\") pod \"7050d40c-b959-48a8-b21f-b9f5e308c920\" (UID: \"7050d40c-b959-48a8-b21f-b9f5e308c920\") " Mar 10 22:13:36 crc kubenswrapper[4919]: I0310 22:13:36.669413 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"7050d40c-b959-48a8-b21f-b9f5e308c920\" (UID: \"7050d40c-b959-48a8-b21f-b9f5e308c920\") " Mar 10 22:13:36 crc kubenswrapper[4919]: I0310 22:13:36.669518 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7050d40c-b959-48a8-b21f-b9f5e308c920-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "7050d40c-b959-48a8-b21f-b9f5e308c920" (UID: "7050d40c-b959-48a8-b21f-b9f5e308c920"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 22:13:36 crc kubenswrapper[4919]: I0310 22:13:36.669626 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7050d40c-b959-48a8-b21f-b9f5e308c920-scripts\") pod \"7050d40c-b959-48a8-b21f-b9f5e308c920\" (UID: \"7050d40c-b959-48a8-b21f-b9f5e308c920\") " Mar 10 22:13:36 crc kubenswrapper[4919]: I0310 22:13:36.669742 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7050d40c-b959-48a8-b21f-b9f5e308c920-logs\") pod \"7050d40c-b959-48a8-b21f-b9f5e308c920\" (UID: \"7050d40c-b959-48a8-b21f-b9f5e308c920\") " Mar 10 22:13:36 crc kubenswrapper[4919]: I0310 22:13:36.670213 4919 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7050d40c-b959-48a8-b21f-b9f5e308c920-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 22:13:36 crc kubenswrapper[4919]: I0310 22:13:36.670488 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2qhgn\" (UniqueName: \"kubernetes.io/projected/7050d40c-b959-48a8-b21f-b9f5e308c920-kube-api-access-2qhgn\") on node \"crc\" DevicePath \"\"" Mar 10 22:13:36 crc kubenswrapper[4919]: I0310 22:13:36.670582 4919 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7050d40c-b959-48a8-b21f-b9f5e308c920-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 22:13:36 crc kubenswrapper[4919]: I0310 22:13:36.670964 4919 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7050d40c-b959-48a8-b21f-b9f5e308c920-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 10 22:13:36 crc kubenswrapper[4919]: I0310 22:13:36.671455 4919 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/7050d40c-b959-48a8-b21f-b9f5e308c920-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 22:13:36 crc kubenswrapper[4919]: I0310 22:13:36.672406 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7050d40c-b959-48a8-b21f-b9f5e308c920-logs" (OuterVolumeSpecName: "logs") pod "7050d40c-b959-48a8-b21f-b9f5e308c920" (UID: "7050d40c-b959-48a8-b21f-b9f5e308c920"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 22:13:36 crc kubenswrapper[4919]: I0310 22:13:36.672977 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "glance") pod "7050d40c-b959-48a8-b21f-b9f5e308c920" (UID: "7050d40c-b959-48a8-b21f-b9f5e308c920"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 10 22:13:36 crc kubenswrapper[4919]: I0310 22:13:36.678622 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7050d40c-b959-48a8-b21f-b9f5e308c920-scripts" (OuterVolumeSpecName: "scripts") pod "7050d40c-b959-48a8-b21f-b9f5e308c920" (UID: "7050d40c-b959-48a8-b21f-b9f5e308c920"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:13:36 crc kubenswrapper[4919]: I0310 22:13:36.775978 4919 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Mar 10 22:13:36 crc kubenswrapper[4919]: I0310 22:13:36.776566 4919 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7050d40c-b959-48a8-b21f-b9f5e308c920-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 22:13:36 crc kubenswrapper[4919]: I0310 22:13:36.776670 4919 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7050d40c-b959-48a8-b21f-b9f5e308c920-logs\") on node \"crc\" DevicePath \"\"" Mar 10 22:13:36 crc kubenswrapper[4919]: I0310 22:13:36.797457 4919 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Mar 10 22:13:36 crc kubenswrapper[4919]: I0310 22:13:36.878541 4919 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Mar 10 22:13:37 crc kubenswrapper[4919]: I0310 22:13:37.085331 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"91a933f1-aa44-4375-8f5c-e5f3567e6c8e","Type":"ContainerStarted","Data":"ceb6023d0f542d943ccfa4398a55aeeb75cf652b0f4a2b2be0237840184075d5"} Mar 10 22:13:37 crc kubenswrapper[4919]: I0310 22:13:37.088680 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7050d40c-b959-48a8-b21f-b9f5e308c920","Type":"ContainerDied","Data":"7afed62878906dabbbce9d8f9449b8bf93815c65c8d53a0cece7e61de2b5794b"} Mar 10 22:13:37 crc kubenswrapper[4919]: I0310 22:13:37.088841 4919 scope.go:117] 
"RemoveContainer" containerID="2734d9e2168cf611b701bf7332456b621e4b8962de1b13492fc28d84fb7815b5" Mar 10 22:13:37 crc kubenswrapper[4919]: I0310 22:13:37.088723 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 10 22:13:37 crc kubenswrapper[4919]: I0310 22:13:37.096954 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"185a485d-24ea-44f8-bac9-0f4ddc4298ac","Type":"ContainerStarted","Data":"05c3a5f2993798c334cb01686ebbd4d57cdfe8d58fe0a3ba8802b484208f0600"} Mar 10 22:13:37 crc kubenswrapper[4919]: I0310 22:13:37.139585 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 10 22:13:37 crc kubenswrapper[4919]: I0310 22:13:37.154838 4919 scope.go:117] "RemoveContainer" containerID="b4663f7d7e0ab4279572096b43ee1b65d11f4de19b52fe14f7d2d0fbaf38a65d" Mar 10 22:13:37 crc kubenswrapper[4919]: I0310 22:13:37.157227 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 10 22:13:37 crc kubenswrapper[4919]: I0310 22:13:37.175501 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 10 22:13:37 crc kubenswrapper[4919]: E0310 22:13:37.176052 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7050d40c-b959-48a8-b21f-b9f5e308c920" containerName="glance-httpd" Mar 10 22:13:37 crc kubenswrapper[4919]: I0310 22:13:37.176164 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="7050d40c-b959-48a8-b21f-b9f5e308c920" containerName="glance-httpd" Mar 10 22:13:37 crc kubenswrapper[4919]: E0310 22:13:37.176268 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7050d40c-b959-48a8-b21f-b9f5e308c920" containerName="glance-log" Mar 10 22:13:37 crc kubenswrapper[4919]: I0310 22:13:37.176339 4919 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="7050d40c-b959-48a8-b21f-b9f5e308c920" containerName="glance-log" Mar 10 22:13:37 crc kubenswrapper[4919]: I0310 22:13:37.176690 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="7050d40c-b959-48a8-b21f-b9f5e308c920" containerName="glance-log" Mar 10 22:13:37 crc kubenswrapper[4919]: I0310 22:13:37.176779 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="7050d40c-b959-48a8-b21f-b9f5e308c920" containerName="glance-httpd" Mar 10 22:13:37 crc kubenswrapper[4919]: I0310 22:13:37.178026 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 10 22:13:37 crc kubenswrapper[4919]: I0310 22:13:37.182120 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 10 22:13:37 crc kubenswrapper[4919]: I0310 22:13:37.183311 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 10 22:13:37 crc kubenswrapper[4919]: I0310 22:13:37.216912 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 10 22:13:37 crc kubenswrapper[4919]: I0310 22:13:37.290077 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ab479995-b87a-46b8-9a4e-d9e95d556775-logs\") pod \"glance-default-internal-api-0\" (UID: \"ab479995-b87a-46b8-9a4e-d9e95d556775\") " pod="openstack/glance-default-internal-api-0" Mar 10 22:13:37 crc kubenswrapper[4919]: I0310 22:13:37.290140 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ab479995-b87a-46b8-9a4e-d9e95d556775-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"ab479995-b87a-46b8-9a4e-d9e95d556775\") " pod="openstack/glance-default-internal-api-0" Mar 10 22:13:37 crc 
kubenswrapper[4919]: I0310 22:13:37.290195 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"ab479995-b87a-46b8-9a4e-d9e95d556775\") " pod="openstack/glance-default-internal-api-0" Mar 10 22:13:37 crc kubenswrapper[4919]: I0310 22:13:37.290227 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8msqp\" (UniqueName: \"kubernetes.io/projected/ab479995-b87a-46b8-9a4e-d9e95d556775-kube-api-access-8msqp\") pod \"glance-default-internal-api-0\" (UID: \"ab479995-b87a-46b8-9a4e-d9e95d556775\") " pod="openstack/glance-default-internal-api-0" Mar 10 22:13:37 crc kubenswrapper[4919]: I0310 22:13:37.290263 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ab479995-b87a-46b8-9a4e-d9e95d556775-scripts\") pod \"glance-default-internal-api-0\" (UID: \"ab479995-b87a-46b8-9a4e-d9e95d556775\") " pod="openstack/glance-default-internal-api-0" Mar 10 22:13:37 crc kubenswrapper[4919]: I0310 22:13:37.290303 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab479995-b87a-46b8-9a4e-d9e95d556775-config-data\") pod \"glance-default-internal-api-0\" (UID: \"ab479995-b87a-46b8-9a4e-d9e95d556775\") " pod="openstack/glance-default-internal-api-0" Mar 10 22:13:37 crc kubenswrapper[4919]: I0310 22:13:37.290345 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab479995-b87a-46b8-9a4e-d9e95d556775-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"ab479995-b87a-46b8-9a4e-d9e95d556775\") " pod="openstack/glance-default-internal-api-0" 
Mar 10 22:13:37 crc kubenswrapper[4919]: I0310 22:13:37.290377 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab479995-b87a-46b8-9a4e-d9e95d556775-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"ab479995-b87a-46b8-9a4e-d9e95d556775\") " pod="openstack/glance-default-internal-api-0" Mar 10 22:13:37 crc kubenswrapper[4919]: I0310 22:13:37.394522 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab479995-b87a-46b8-9a4e-d9e95d556775-config-data\") pod \"glance-default-internal-api-0\" (UID: \"ab479995-b87a-46b8-9a4e-d9e95d556775\") " pod="openstack/glance-default-internal-api-0" Mar 10 22:13:37 crc kubenswrapper[4919]: I0310 22:13:37.394871 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab479995-b87a-46b8-9a4e-d9e95d556775-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"ab479995-b87a-46b8-9a4e-d9e95d556775\") " pod="openstack/glance-default-internal-api-0" Mar 10 22:13:37 crc kubenswrapper[4919]: I0310 22:13:37.394903 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab479995-b87a-46b8-9a4e-d9e95d556775-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"ab479995-b87a-46b8-9a4e-d9e95d556775\") " pod="openstack/glance-default-internal-api-0" Mar 10 22:13:37 crc kubenswrapper[4919]: I0310 22:13:37.394948 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ab479995-b87a-46b8-9a4e-d9e95d556775-logs\") pod \"glance-default-internal-api-0\" (UID: \"ab479995-b87a-46b8-9a4e-d9e95d556775\") " pod="openstack/glance-default-internal-api-0" Mar 10 22:13:37 crc 
kubenswrapper[4919]: I0310 22:13:37.394982 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ab479995-b87a-46b8-9a4e-d9e95d556775-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"ab479995-b87a-46b8-9a4e-d9e95d556775\") " pod="openstack/glance-default-internal-api-0" Mar 10 22:13:37 crc kubenswrapper[4919]: I0310 22:13:37.395030 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"ab479995-b87a-46b8-9a4e-d9e95d556775\") " pod="openstack/glance-default-internal-api-0" Mar 10 22:13:37 crc kubenswrapper[4919]: I0310 22:13:37.395090 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8msqp\" (UniqueName: \"kubernetes.io/projected/ab479995-b87a-46b8-9a4e-d9e95d556775-kube-api-access-8msqp\") pod \"glance-default-internal-api-0\" (UID: \"ab479995-b87a-46b8-9a4e-d9e95d556775\") " pod="openstack/glance-default-internal-api-0" Mar 10 22:13:37 crc kubenswrapper[4919]: I0310 22:13:37.395493 4919 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"ab479995-b87a-46b8-9a4e-d9e95d556775\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/glance-default-internal-api-0" Mar 10 22:13:37 crc kubenswrapper[4919]: I0310 22:13:37.395489 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ab479995-b87a-46b8-9a4e-d9e95d556775-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"ab479995-b87a-46b8-9a4e-d9e95d556775\") " pod="openstack/glance-default-internal-api-0" Mar 10 22:13:37 crc kubenswrapper[4919]: I0310 22:13:37.395551 4919 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ab479995-b87a-46b8-9a4e-d9e95d556775-logs\") pod \"glance-default-internal-api-0\" (UID: \"ab479995-b87a-46b8-9a4e-d9e95d556775\") " pod="openstack/glance-default-internal-api-0" Mar 10 22:13:37 crc kubenswrapper[4919]: I0310 22:13:37.395649 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ab479995-b87a-46b8-9a4e-d9e95d556775-scripts\") pod \"glance-default-internal-api-0\" (UID: \"ab479995-b87a-46b8-9a4e-d9e95d556775\") " pod="openstack/glance-default-internal-api-0" Mar 10 22:13:37 crc kubenswrapper[4919]: I0310 22:13:37.398666 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab479995-b87a-46b8-9a4e-d9e95d556775-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"ab479995-b87a-46b8-9a4e-d9e95d556775\") " pod="openstack/glance-default-internal-api-0" Mar 10 22:13:37 crc kubenswrapper[4919]: I0310 22:13:37.399938 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab479995-b87a-46b8-9a4e-d9e95d556775-config-data\") pod \"glance-default-internal-api-0\" (UID: \"ab479995-b87a-46b8-9a4e-d9e95d556775\") " pod="openstack/glance-default-internal-api-0" Mar 10 22:13:37 crc kubenswrapper[4919]: I0310 22:13:37.406277 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab479995-b87a-46b8-9a4e-d9e95d556775-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"ab479995-b87a-46b8-9a4e-d9e95d556775\") " pod="openstack/glance-default-internal-api-0" Mar 10 22:13:37 crc kubenswrapper[4919]: I0310 22:13:37.406686 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/ab479995-b87a-46b8-9a4e-d9e95d556775-scripts\") pod \"glance-default-internal-api-0\" (UID: \"ab479995-b87a-46b8-9a4e-d9e95d556775\") " pod="openstack/glance-default-internal-api-0" Mar 10 22:13:37 crc kubenswrapper[4919]: I0310 22:13:37.420478 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8msqp\" (UniqueName: \"kubernetes.io/projected/ab479995-b87a-46b8-9a4e-d9e95d556775-kube-api-access-8msqp\") pod \"glance-default-internal-api-0\" (UID: \"ab479995-b87a-46b8-9a4e-d9e95d556775\") " pod="openstack/glance-default-internal-api-0" Mar 10 22:13:37 crc kubenswrapper[4919]: I0310 22:13:37.444200 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"ab479995-b87a-46b8-9a4e-d9e95d556775\") " pod="openstack/glance-default-internal-api-0" Mar 10 22:13:37 crc kubenswrapper[4919]: I0310 22:13:37.508269 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7050d40c-b959-48a8-b21f-b9f5e308c920" path="/var/lib/kubelet/pods/7050d40c-b959-48a8-b21f-b9f5e308c920/volumes" Mar 10 22:13:37 crc kubenswrapper[4919]: I0310 22:13:37.589528 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 10 22:13:38 crc kubenswrapper[4919]: W0310 22:13:38.115507 4919 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podab479995_b87a_46b8_9a4e_d9e95d556775.slice/crio-98e94965efcd8e804b4844271d3615e1dbc917b10cf7d2bc677998fc1d6a9654 WatchSource:0}: Error finding container 98e94965efcd8e804b4844271d3615e1dbc917b10cf7d2bc677998fc1d6a9654: Status 404 returned error can't find the container with id 98e94965efcd8e804b4844271d3615e1dbc917b10cf7d2bc677998fc1d6a9654 Mar 10 22:13:38 crc kubenswrapper[4919]: I0310 22:13:38.115811 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"91a933f1-aa44-4375-8f5c-e5f3567e6c8e","Type":"ContainerStarted","Data":"79e87bdd987eb81ea9f7ad47745afc59d0dd4ce7a69aa1af13ca054411b4739c"} Mar 10 22:13:38 crc kubenswrapper[4919]: I0310 22:13:38.120897 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 10 22:13:38 crc kubenswrapper[4919]: I0310 22:13:38.135274 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"185a485d-24ea-44f8-bac9-0f4ddc4298ac","Type":"ContainerStarted","Data":"371d3157e9bbdd52091031696e44edff636a803913a025a8a5308cf9bd2a47de"} Mar 10 22:13:38 crc kubenswrapper[4919]: I0310 22:13:38.140009 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.139976969 podStartE2EDuration="4.139976969s" podCreationTimestamp="2026-03-10 22:13:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 22:13:38.13633327 +0000 UTC m=+1405.378213888" watchObservedRunningTime="2026-03-10 22:13:38.139976969 +0000 UTC m=+1405.381857587" Mar 10 22:13:39 crc 
kubenswrapper[4919]: I0310 22:13:39.152664 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ab479995-b87a-46b8-9a4e-d9e95d556775","Type":"ContainerStarted","Data":"98e94965efcd8e804b4844271d3615e1dbc917b10cf7d2bc677998fc1d6a9654"} Mar 10 22:13:40 crc kubenswrapper[4919]: I0310 22:13:40.179035 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"185a485d-24ea-44f8-bac9-0f4ddc4298ac","Type":"ContainerStarted","Data":"a127a403f434a8ffe4fb7b043199f72b41de6f08174a50dae3e141aceebcc45f"} Mar 10 22:13:40 crc kubenswrapper[4919]: I0310 22:13:40.179562 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 10 22:13:40 crc kubenswrapper[4919]: I0310 22:13:40.181530 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ab479995-b87a-46b8-9a4e-d9e95d556775","Type":"ContainerStarted","Data":"0fa75078e11e8939f0a39526b2508ccb8e7f4c3ea23641588f1b3b509c8c8e82"} Mar 10 22:13:40 crc kubenswrapper[4919]: I0310 22:13:40.211244 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.318569457 podStartE2EDuration="7.211220336s" podCreationTimestamp="2026-03-10 22:13:33 +0000 UTC" firstStartedPulling="2026-03-10 22:13:34.79715083 +0000 UTC m=+1402.039031438" lastFinishedPulling="2026-03-10 22:13:39.689801709 +0000 UTC m=+1406.931682317" observedRunningTime="2026-03-10 22:13:40.201273057 +0000 UTC m=+1407.443153665" watchObservedRunningTime="2026-03-10 22:13:40.211220336 +0000 UTC m=+1407.453100964" Mar 10 22:13:41 crc kubenswrapper[4919]: I0310 22:13:41.195158 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ab479995-b87a-46b8-9a4e-d9e95d556775","Type":"ContainerStarted","Data":"22265554a653026f7008b3a597b22efe9ebe95b2013255f792f08efd3682fc62"} 
Mar 10 22:13:41 crc kubenswrapper[4919]: I0310 22:13:41.220669 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.220644485 podStartE2EDuration="4.220644485s" podCreationTimestamp="2026-03-10 22:13:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 22:13:41.217158161 +0000 UTC m=+1408.459038769" watchObservedRunningTime="2026-03-10 22:13:41.220644485 +0000 UTC m=+1408.462525093" Mar 10 22:13:45 crc kubenswrapper[4919]: I0310 22:13:45.259266 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 10 22:13:45 crc kubenswrapper[4919]: I0310 22:13:45.259842 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 10 22:13:45 crc kubenswrapper[4919]: I0310 22:13:45.306369 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 10 22:13:45 crc kubenswrapper[4919]: I0310 22:13:45.323413 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 10 22:13:46 crc kubenswrapper[4919]: I0310 22:13:46.239288 4919 generic.go:334] "Generic (PLEG): container finished" podID="0a8d8a3d-169b-4fea-9848-b8998625b1d2" containerID="046aa9ef267aea44a0077d8321ee3d8194793ad759fea7abc91c42878cf2fddd" exitCode=0 Mar 10 22:13:46 crc kubenswrapper[4919]: I0310 22:13:46.239401 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-zr9cc" event={"ID":"0a8d8a3d-169b-4fea-9848-b8998625b1d2","Type":"ContainerDied","Data":"046aa9ef267aea44a0077d8321ee3d8194793ad759fea7abc91c42878cf2fddd"} Mar 10 22:13:46 crc kubenswrapper[4919]: I0310 22:13:46.239652 4919 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 10 22:13:46 crc kubenswrapper[4919]: I0310 22:13:46.239707 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 10 22:13:47 crc kubenswrapper[4919]: I0310 22:13:47.590226 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 10 22:13:47 crc kubenswrapper[4919]: I0310 22:13:47.590626 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 10 22:13:47 crc kubenswrapper[4919]: I0310 22:13:47.624785 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 10 22:13:47 crc kubenswrapper[4919]: I0310 22:13:47.643531 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 10 22:13:47 crc kubenswrapper[4919]: I0310 22:13:47.687017 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-zr9cc" Mar 10 22:13:47 crc kubenswrapper[4919]: I0310 22:13:47.794903 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0a8d8a3d-169b-4fea-9848-b8998625b1d2-scripts\") pod \"0a8d8a3d-169b-4fea-9848-b8998625b1d2\" (UID: \"0a8d8a3d-169b-4fea-9848-b8998625b1d2\") " Mar 10 22:13:47 crc kubenswrapper[4919]: I0310 22:13:47.795191 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a8d8a3d-169b-4fea-9848-b8998625b1d2-combined-ca-bundle\") pod \"0a8d8a3d-169b-4fea-9848-b8998625b1d2\" (UID: \"0a8d8a3d-169b-4fea-9848-b8998625b1d2\") " Mar 10 22:13:47 crc kubenswrapper[4919]: I0310 22:13:47.795377 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a8d8a3d-169b-4fea-9848-b8998625b1d2-config-data\") pod \"0a8d8a3d-169b-4fea-9848-b8998625b1d2\" (UID: \"0a8d8a3d-169b-4fea-9848-b8998625b1d2\") " Mar 10 22:13:47 crc kubenswrapper[4919]: I0310 22:13:47.795578 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2c4dk\" (UniqueName: \"kubernetes.io/projected/0a8d8a3d-169b-4fea-9848-b8998625b1d2-kube-api-access-2c4dk\") pod \"0a8d8a3d-169b-4fea-9848-b8998625b1d2\" (UID: \"0a8d8a3d-169b-4fea-9848-b8998625b1d2\") " Mar 10 22:13:47 crc kubenswrapper[4919]: I0310 22:13:47.805750 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a8d8a3d-169b-4fea-9848-b8998625b1d2-scripts" (OuterVolumeSpecName: "scripts") pod "0a8d8a3d-169b-4fea-9848-b8998625b1d2" (UID: "0a8d8a3d-169b-4fea-9848-b8998625b1d2"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:13:47 crc kubenswrapper[4919]: I0310 22:13:47.812492 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a8d8a3d-169b-4fea-9848-b8998625b1d2-kube-api-access-2c4dk" (OuterVolumeSpecName: "kube-api-access-2c4dk") pod "0a8d8a3d-169b-4fea-9848-b8998625b1d2" (UID: "0a8d8a3d-169b-4fea-9848-b8998625b1d2"). InnerVolumeSpecName "kube-api-access-2c4dk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 22:13:47 crc kubenswrapper[4919]: I0310 22:13:47.822327 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a8d8a3d-169b-4fea-9848-b8998625b1d2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0a8d8a3d-169b-4fea-9848-b8998625b1d2" (UID: "0a8d8a3d-169b-4fea-9848-b8998625b1d2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:13:47 crc kubenswrapper[4919]: I0310 22:13:47.824211 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a8d8a3d-169b-4fea-9848-b8998625b1d2-config-data" (OuterVolumeSpecName: "config-data") pod "0a8d8a3d-169b-4fea-9848-b8998625b1d2" (UID: "0a8d8a3d-169b-4fea-9848-b8998625b1d2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:13:47 crc kubenswrapper[4919]: I0310 22:13:47.897493 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2c4dk\" (UniqueName: \"kubernetes.io/projected/0a8d8a3d-169b-4fea-9848-b8998625b1d2-kube-api-access-2c4dk\") on node \"crc\" DevicePath \"\"" Mar 10 22:13:47 crc kubenswrapper[4919]: I0310 22:13:47.897523 4919 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0a8d8a3d-169b-4fea-9848-b8998625b1d2-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 22:13:47 crc kubenswrapper[4919]: I0310 22:13:47.897536 4919 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a8d8a3d-169b-4fea-9848-b8998625b1d2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 22:13:47 crc kubenswrapper[4919]: I0310 22:13:47.897547 4919 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a8d8a3d-169b-4fea-9848-b8998625b1d2-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 22:13:48 crc kubenswrapper[4919]: I0310 22:13:48.090310 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 10 22:13:48 crc kubenswrapper[4919]: I0310 22:13:48.162025 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 10 22:13:48 crc kubenswrapper[4919]: I0310 22:13:48.277554 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-zr9cc" event={"ID":"0a8d8a3d-169b-4fea-9848-b8998625b1d2","Type":"ContainerDied","Data":"d893a4e6247512b560e8911e765b5fc93f7044209782fb5a2d0fd82b136a4547"} Mar 10 22:13:48 crc kubenswrapper[4919]: I0310 22:13:48.277626 4919 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="d893a4e6247512b560e8911e765b5fc93f7044209782fb5a2d0fd82b136a4547" Mar 10 22:13:48 crc kubenswrapper[4919]: I0310 22:13:48.278302 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 10 22:13:48 crc kubenswrapper[4919]: I0310 22:13:48.278340 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 10 22:13:48 crc kubenswrapper[4919]: I0310 22:13:48.278482 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-zr9cc" Mar 10 22:13:48 crc kubenswrapper[4919]: I0310 22:13:48.384352 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 10 22:13:48 crc kubenswrapper[4919]: E0310 22:13:48.384919 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a8d8a3d-169b-4fea-9848-b8998625b1d2" containerName="nova-cell0-conductor-db-sync" Mar 10 22:13:48 crc kubenswrapper[4919]: I0310 22:13:48.384942 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a8d8a3d-169b-4fea-9848-b8998625b1d2" containerName="nova-cell0-conductor-db-sync" Mar 10 22:13:48 crc kubenswrapper[4919]: I0310 22:13:48.385150 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a8d8a3d-169b-4fea-9848-b8998625b1d2" containerName="nova-cell0-conductor-db-sync" Mar 10 22:13:48 crc kubenswrapper[4919]: I0310 22:13:48.386234 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 10 22:13:48 crc kubenswrapper[4919]: I0310 22:13:48.397566 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 10 22:13:48 crc kubenswrapper[4919]: I0310 22:13:48.400369 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-m77rf" Mar 10 22:13:48 crc kubenswrapper[4919]: I0310 22:13:48.401136 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 10 22:13:48 crc kubenswrapper[4919]: I0310 22:13:48.506677 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sdfp5\" (UniqueName: \"kubernetes.io/projected/4a7ad3ed-9144-4a21-808c-23d613354a2f-kube-api-access-sdfp5\") pod \"nova-cell0-conductor-0\" (UID: \"4a7ad3ed-9144-4a21-808c-23d613354a2f\") " pod="openstack/nova-cell0-conductor-0" Mar 10 22:13:48 crc kubenswrapper[4919]: I0310 22:13:48.507038 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a7ad3ed-9144-4a21-808c-23d613354a2f-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"4a7ad3ed-9144-4a21-808c-23d613354a2f\") " pod="openstack/nova-cell0-conductor-0" Mar 10 22:13:48 crc kubenswrapper[4919]: I0310 22:13:48.507069 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a7ad3ed-9144-4a21-808c-23d613354a2f-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"4a7ad3ed-9144-4a21-808c-23d613354a2f\") " pod="openstack/nova-cell0-conductor-0" Mar 10 22:13:48 crc kubenswrapper[4919]: I0310 22:13:48.608930 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/4a7ad3ed-9144-4a21-808c-23d613354a2f-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"4a7ad3ed-9144-4a21-808c-23d613354a2f\") " pod="openstack/nova-cell0-conductor-0" Mar 10 22:13:48 crc kubenswrapper[4919]: I0310 22:13:48.609002 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a7ad3ed-9144-4a21-808c-23d613354a2f-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"4a7ad3ed-9144-4a21-808c-23d613354a2f\") " pod="openstack/nova-cell0-conductor-0" Mar 10 22:13:48 crc kubenswrapper[4919]: I0310 22:13:48.609191 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sdfp5\" (UniqueName: \"kubernetes.io/projected/4a7ad3ed-9144-4a21-808c-23d613354a2f-kube-api-access-sdfp5\") pod \"nova-cell0-conductor-0\" (UID: \"4a7ad3ed-9144-4a21-808c-23d613354a2f\") " pod="openstack/nova-cell0-conductor-0" Mar 10 22:13:48 crc kubenswrapper[4919]: I0310 22:13:48.620137 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a7ad3ed-9144-4a21-808c-23d613354a2f-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"4a7ad3ed-9144-4a21-808c-23d613354a2f\") " pod="openstack/nova-cell0-conductor-0" Mar 10 22:13:48 crc kubenswrapper[4919]: I0310 22:13:48.623052 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a7ad3ed-9144-4a21-808c-23d613354a2f-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"4a7ad3ed-9144-4a21-808c-23d613354a2f\") " pod="openstack/nova-cell0-conductor-0" Mar 10 22:13:48 crc kubenswrapper[4919]: I0310 22:13:48.641085 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sdfp5\" (UniqueName: \"kubernetes.io/projected/4a7ad3ed-9144-4a21-808c-23d613354a2f-kube-api-access-sdfp5\") pod \"nova-cell0-conductor-0\" 
(UID: \"4a7ad3ed-9144-4a21-808c-23d613354a2f\") " pod="openstack/nova-cell0-conductor-0" Mar 10 22:13:48 crc kubenswrapper[4919]: I0310 22:13:48.737528 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 10 22:13:49 crc kubenswrapper[4919]: I0310 22:13:49.228069 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 10 22:13:49 crc kubenswrapper[4919]: W0310 22:13:49.241100 4919 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4a7ad3ed_9144_4a21_808c_23d613354a2f.slice/crio-3c1cec38dad6b500fe1db67b824dc4b2fa037a697e7256d18f0d6bb402e4e332 WatchSource:0}: Error finding container 3c1cec38dad6b500fe1db67b824dc4b2fa037a697e7256d18f0d6bb402e4e332: Status 404 returned error can't find the container with id 3c1cec38dad6b500fe1db67b824dc4b2fa037a697e7256d18f0d6bb402e4e332 Mar 10 22:13:49 crc kubenswrapper[4919]: I0310 22:13:49.287161 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"4a7ad3ed-9144-4a21-808c-23d613354a2f","Type":"ContainerStarted","Data":"3c1cec38dad6b500fe1db67b824dc4b2fa037a697e7256d18f0d6bb402e4e332"} Mar 10 22:13:50 crc kubenswrapper[4919]: I0310 22:13:50.159205 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 10 22:13:50 crc kubenswrapper[4919]: I0310 22:13:50.215834 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 10 22:13:50 crc kubenswrapper[4919]: I0310 22:13:50.307702 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"4a7ad3ed-9144-4a21-808c-23d613354a2f","Type":"ContainerStarted","Data":"9a7f54f0ad1bc99653d56471ca107558d95729ca0a75e6040163ca4e8d3452b4"} Mar 10 22:13:50 crc kubenswrapper[4919]: I0310 
22:13:50.334607 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.334590477 podStartE2EDuration="2.334590477s" podCreationTimestamp="2026-03-10 22:13:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 22:13:50.328921442 +0000 UTC m=+1417.570802050" watchObservedRunningTime="2026-03-10 22:13:50.334590477 +0000 UTC m=+1417.576471085"
Mar 10 22:13:51 crc kubenswrapper[4919]: I0310 22:13:51.315447 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0"
Mar 10 22:13:58 crc kubenswrapper[4919]: I0310 22:13:58.772635 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0"
Mar 10 22:13:59 crc kubenswrapper[4919]: I0310 22:13:59.256062 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-tkhg8"]
Mar 10 22:13:59 crc kubenswrapper[4919]: I0310 22:13:59.257223 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-tkhg8"
Mar 10 22:13:59 crc kubenswrapper[4919]: I0310 22:13:59.264992 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-tkhg8"]
Mar 10 22:13:59 crc kubenswrapper[4919]: I0310 22:13:59.284946 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data"
Mar 10 22:13:59 crc kubenswrapper[4919]: I0310 22:13:59.285399 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts"
Mar 10 22:13:59 crc kubenswrapper[4919]: I0310 22:13:59.309304 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a9a0463-5f7a-4164-9fd7-a7bce608bf41-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-tkhg8\" (UID: \"3a9a0463-5f7a-4164-9fd7-a7bce608bf41\") " pod="openstack/nova-cell0-cell-mapping-tkhg8"
Mar 10 22:13:59 crc kubenswrapper[4919]: I0310 22:13:59.309367 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3a9a0463-5f7a-4164-9fd7-a7bce608bf41-scripts\") pod \"nova-cell0-cell-mapping-tkhg8\" (UID: \"3a9a0463-5f7a-4164-9fd7-a7bce608bf41\") " pod="openstack/nova-cell0-cell-mapping-tkhg8"
Mar 10 22:13:59 crc kubenswrapper[4919]: I0310 22:13:59.309485 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a9a0463-5f7a-4164-9fd7-a7bce608bf41-config-data\") pod \"nova-cell0-cell-mapping-tkhg8\" (UID: \"3a9a0463-5f7a-4164-9fd7-a7bce608bf41\") " pod="openstack/nova-cell0-cell-mapping-tkhg8"
Mar 10 22:13:59 crc kubenswrapper[4919]: I0310 22:13:59.309507 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vrrm9\" (UniqueName: \"kubernetes.io/projected/3a9a0463-5f7a-4164-9fd7-a7bce608bf41-kube-api-access-vrrm9\") pod \"nova-cell0-cell-mapping-tkhg8\" (UID: \"3a9a0463-5f7a-4164-9fd7-a7bce608bf41\") " pod="openstack/nova-cell0-cell-mapping-tkhg8"
Mar 10 22:13:59 crc kubenswrapper[4919]: I0310 22:13:59.410981 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vrrm9\" (UniqueName: \"kubernetes.io/projected/3a9a0463-5f7a-4164-9fd7-a7bce608bf41-kube-api-access-vrrm9\") pod \"nova-cell0-cell-mapping-tkhg8\" (UID: \"3a9a0463-5f7a-4164-9fd7-a7bce608bf41\") " pod="openstack/nova-cell0-cell-mapping-tkhg8"
Mar 10 22:13:59 crc kubenswrapper[4919]: I0310 22:13:59.411031 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a9a0463-5f7a-4164-9fd7-a7bce608bf41-config-data\") pod \"nova-cell0-cell-mapping-tkhg8\" (UID: \"3a9a0463-5f7a-4164-9fd7-a7bce608bf41\") " pod="openstack/nova-cell0-cell-mapping-tkhg8"
Mar 10 22:13:59 crc kubenswrapper[4919]: I0310 22:13:59.411153 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a9a0463-5f7a-4164-9fd7-a7bce608bf41-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-tkhg8\" (UID: \"3a9a0463-5f7a-4164-9fd7-a7bce608bf41\") " pod="openstack/nova-cell0-cell-mapping-tkhg8"
Mar 10 22:13:59 crc kubenswrapper[4919]: I0310 22:13:59.411188 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3a9a0463-5f7a-4164-9fd7-a7bce608bf41-scripts\") pod \"nova-cell0-cell-mapping-tkhg8\" (UID: \"3a9a0463-5f7a-4164-9fd7-a7bce608bf41\") " pod="openstack/nova-cell0-cell-mapping-tkhg8"
Mar 10 22:13:59 crc kubenswrapper[4919]: I0310 22:13:59.423205 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a9a0463-5f7a-4164-9fd7-a7bce608bf41-config-data\") pod \"nova-cell0-cell-mapping-tkhg8\" (UID: \"3a9a0463-5f7a-4164-9fd7-a7bce608bf41\") " pod="openstack/nova-cell0-cell-mapping-tkhg8"
Mar 10 22:13:59 crc kubenswrapper[4919]: I0310 22:13:59.424009 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a9a0463-5f7a-4164-9fd7-a7bce608bf41-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-tkhg8\" (UID: \"3a9a0463-5f7a-4164-9fd7-a7bce608bf41\") " pod="openstack/nova-cell0-cell-mapping-tkhg8"
Mar 10 22:13:59 crc kubenswrapper[4919]: I0310 22:13:59.433971 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3a9a0463-5f7a-4164-9fd7-a7bce608bf41-scripts\") pod \"nova-cell0-cell-mapping-tkhg8\" (UID: \"3a9a0463-5f7a-4164-9fd7-a7bce608bf41\") " pod="openstack/nova-cell0-cell-mapping-tkhg8"
Mar 10 22:13:59 crc kubenswrapper[4919]: I0310 22:13:59.459540 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vrrm9\" (UniqueName: \"kubernetes.io/projected/3a9a0463-5f7a-4164-9fd7-a7bce608bf41-kube-api-access-vrrm9\") pod \"nova-cell0-cell-mapping-tkhg8\" (UID: \"3a9a0463-5f7a-4164-9fd7-a7bce608bf41\") " pod="openstack/nova-cell0-cell-mapping-tkhg8"
Mar 10 22:13:59 crc kubenswrapper[4919]: I0310 22:13:59.520848 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Mar 10 22:13:59 crc kubenswrapper[4919]: I0310 22:13:59.522703 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 10 22:13:59 crc kubenswrapper[4919]: I0310 22:13:59.526652 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Mar 10 22:13:59 crc kubenswrapper[4919]: I0310 22:13:59.539956 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 10 22:13:59 crc kubenswrapper[4919]: I0310 22:13:59.595263 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Mar 10 22:13:59 crc kubenswrapper[4919]: I0310 22:13:59.602595 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 10 22:13:59 crc kubenswrapper[4919]: I0310 22:13:59.614781 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b9dac23-a973-4e49-91bc-0a1c0ca4b998-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"2b9dac23-a973-4e49-91bc-0a1c0ca4b998\") " pod="openstack/nova-scheduler-0"
Mar 10 22:13:59 crc kubenswrapper[4919]: I0310 22:13:59.616289 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b9dac23-a973-4e49-91bc-0a1c0ca4b998-config-data\") pod \"nova-scheduler-0\" (UID: \"2b9dac23-a973-4e49-91bc-0a1c0ca4b998\") " pod="openstack/nova-scheduler-0"
Mar 10 22:13:59 crc kubenswrapper[4919]: I0310 22:13:59.615288 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Mar 10 22:13:59 crc kubenswrapper[4919]: I0310 22:13:59.616614 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmtv4\" (UniqueName: \"kubernetes.io/projected/2b9dac23-a973-4e49-91bc-0a1c0ca4b998-kube-api-access-pmtv4\") pod \"nova-scheduler-0\" (UID: \"2b9dac23-a973-4e49-91bc-0a1c0ca4b998\") " pod="openstack/nova-scheduler-0"
Mar 10 22:13:59 crc kubenswrapper[4919]: I0310 22:13:59.615762 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-tkhg8"
Mar 10 22:13:59 crc kubenswrapper[4919]: I0310 22:13:59.637361 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Mar 10 22:13:59 crc kubenswrapper[4919]: I0310 22:13:59.713488 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Mar 10 22:13:59 crc kubenswrapper[4919]: I0310 22:13:59.715319 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 10 22:13:59 crc kubenswrapper[4919]: I0310 22:13:59.719574 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a194fe14-439b-4a8a-acb8-ea5852e0721b-logs\") pod \"nova-metadata-0\" (UID: \"a194fe14-439b-4a8a-acb8-ea5852e0721b\") " pod="openstack/nova-metadata-0"
Mar 10 22:13:59 crc kubenswrapper[4919]: I0310 22:13:59.719652 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a194fe14-439b-4a8a-acb8-ea5852e0721b-config-data\") pod \"nova-metadata-0\" (UID: \"a194fe14-439b-4a8a-acb8-ea5852e0721b\") " pod="openstack/nova-metadata-0"
Mar 10 22:13:59 crc kubenswrapper[4919]: I0310 22:13:59.719722 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b9dac23-a973-4e49-91bc-0a1c0ca4b998-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"2b9dac23-a973-4e49-91bc-0a1c0ca4b998\") " pod="openstack/nova-scheduler-0"
Mar 10 22:13:59 crc kubenswrapper[4919]: I0310 22:13:59.719746 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b9dac23-a973-4e49-91bc-0a1c0ca4b998-config-data\") pod \"nova-scheduler-0\" (UID: \"2b9dac23-a973-4e49-91bc-0a1c0ca4b998\") " pod="openstack/nova-scheduler-0"
Mar 10 22:13:59 crc kubenswrapper[4919]: I0310 22:13:59.719796 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a194fe14-439b-4a8a-acb8-ea5852e0721b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a194fe14-439b-4a8a-acb8-ea5852e0721b\") " pod="openstack/nova-metadata-0"
Mar 10 22:13:59 crc kubenswrapper[4919]: I0310 22:13:59.719846 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6fwlj\" (UniqueName: \"kubernetes.io/projected/a194fe14-439b-4a8a-acb8-ea5852e0721b-kube-api-access-6fwlj\") pod \"nova-metadata-0\" (UID: \"a194fe14-439b-4a8a-acb8-ea5852e0721b\") " pod="openstack/nova-metadata-0"
Mar 10 22:13:59 crc kubenswrapper[4919]: I0310 22:13:59.719871 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pmtv4\" (UniqueName: \"kubernetes.io/projected/2b9dac23-a973-4e49-91bc-0a1c0ca4b998-kube-api-access-pmtv4\") pod \"nova-scheduler-0\" (UID: \"2b9dac23-a973-4e49-91bc-0a1c0ca4b998\") " pod="openstack/nova-scheduler-0"
Mar 10 22:13:59 crc kubenswrapper[4919]: I0310 22:13:59.731666 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-97cdf8549-f9vxh"]
Mar 10 22:13:59 crc kubenswrapper[4919]: I0310 22:13:59.733942 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-97cdf8549-f9vxh"
Mar 10 22:13:59 crc kubenswrapper[4919]: I0310 22:13:59.749390 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b9dac23-a973-4e49-91bc-0a1c0ca4b998-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"2b9dac23-a973-4e49-91bc-0a1c0ca4b998\") " pod="openstack/nova-scheduler-0"
Mar 10 22:13:59 crc kubenswrapper[4919]: I0310 22:13:59.763503 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Mar 10 22:13:59 crc kubenswrapper[4919]: I0310 22:13:59.767042 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b9dac23-a973-4e49-91bc-0a1c0ca4b998-config-data\") pod \"nova-scheduler-0\" (UID: \"2b9dac23-a973-4e49-91bc-0a1c0ca4b998\") " pod="openstack/nova-scheduler-0"
Mar 10 22:13:59 crc kubenswrapper[4919]: I0310 22:13:59.769226 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Mar 10 22:13:59 crc kubenswrapper[4919]: I0310 22:13:59.790141 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmtv4\" (UniqueName: \"kubernetes.io/projected/2b9dac23-a973-4e49-91bc-0a1c0ca4b998-kube-api-access-pmtv4\") pod \"nova-scheduler-0\" (UID: \"2b9dac23-a973-4e49-91bc-0a1c0ca4b998\") " pod="openstack/nova-scheduler-0"
Mar 10 22:13:59 crc kubenswrapper[4919]: I0310 22:13:59.813569 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-97cdf8549-f9vxh"]
Mar 10 22:13:59 crc kubenswrapper[4919]: I0310 22:13:59.821147 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9233a6db-477d-49d8-b4cd-e1ff4dcfdf64-config-data\") pod \"nova-api-0\" (UID: \"9233a6db-477d-49d8-b4cd-e1ff4dcfdf64\") " pod="openstack/nova-api-0"
Mar 10 22:13:59 crc kubenswrapper[4919]: I0310 22:13:59.821217 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a194fe14-439b-4a8a-acb8-ea5852e0721b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a194fe14-439b-4a8a-acb8-ea5852e0721b\") " pod="openstack/nova-metadata-0"
Mar 10 22:13:59 crc kubenswrapper[4919]: I0310 22:13:59.821242 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9233a6db-477d-49d8-b4cd-e1ff4dcfdf64-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"9233a6db-477d-49d8-b4cd-e1ff4dcfdf64\") " pod="openstack/nova-api-0"
Mar 10 22:13:59 crc kubenswrapper[4919]: I0310 22:13:59.821302 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6fwlj\" (UniqueName: \"kubernetes.io/projected/a194fe14-439b-4a8a-acb8-ea5852e0721b-kube-api-access-6fwlj\") pod \"nova-metadata-0\" (UID: \"a194fe14-439b-4a8a-acb8-ea5852e0721b\") " pod="openstack/nova-metadata-0"
Mar 10 22:13:59 crc kubenswrapper[4919]: I0310 22:13:59.821334 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c696db8f-44d0-42ad-aa56-5e889eef767f-config\") pod \"dnsmasq-dns-97cdf8549-f9vxh\" (UID: \"c696db8f-44d0-42ad-aa56-5e889eef767f\") " pod="openstack/dnsmasq-dns-97cdf8549-f9vxh"
Mar 10 22:13:59 crc kubenswrapper[4919]: I0310 22:13:59.821414 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5qw4h\" (UniqueName: \"kubernetes.io/projected/c696db8f-44d0-42ad-aa56-5e889eef767f-kube-api-access-5qw4h\") pod \"dnsmasq-dns-97cdf8549-f9vxh\" (UID: \"c696db8f-44d0-42ad-aa56-5e889eef767f\") " pod="openstack/dnsmasq-dns-97cdf8549-f9vxh"
Mar 10 22:13:59 crc kubenswrapper[4919]: I0310 22:13:59.821434 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c696db8f-44d0-42ad-aa56-5e889eef767f-ovsdbserver-nb\") pod \"dnsmasq-dns-97cdf8549-f9vxh\" (UID: \"c696db8f-44d0-42ad-aa56-5e889eef767f\") " pod="openstack/dnsmasq-dns-97cdf8549-f9vxh"
Mar 10 22:13:59 crc kubenswrapper[4919]: I0310 22:13:59.821463 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a194fe14-439b-4a8a-acb8-ea5852e0721b-logs\") pod \"nova-metadata-0\" (UID: \"a194fe14-439b-4a8a-acb8-ea5852e0721b\") " pod="openstack/nova-metadata-0"
Mar 10 22:13:59 crc kubenswrapper[4919]: I0310 22:13:59.821499 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c696db8f-44d0-42ad-aa56-5e889eef767f-ovsdbserver-sb\") pod \"dnsmasq-dns-97cdf8549-f9vxh\" (UID: \"c696db8f-44d0-42ad-aa56-5e889eef767f\") " pod="openstack/dnsmasq-dns-97cdf8549-f9vxh"
Mar 10 22:13:59 crc kubenswrapper[4919]: I0310 22:13:59.821515 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c696db8f-44d0-42ad-aa56-5e889eef767f-dns-svc\") pod \"dnsmasq-dns-97cdf8549-f9vxh\" (UID: \"c696db8f-44d0-42ad-aa56-5e889eef767f\") " pod="openstack/dnsmasq-dns-97cdf8549-f9vxh"
Mar 10 22:13:59 crc kubenswrapper[4919]: I0310 22:13:59.821547 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a194fe14-439b-4a8a-acb8-ea5852e0721b-config-data\") pod \"nova-metadata-0\" (UID: \"a194fe14-439b-4a8a-acb8-ea5852e0721b\") " pod="openstack/nova-metadata-0"
Mar 10 22:13:59 crc kubenswrapper[4919]: I0310 22:13:59.821593 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqtm2\" (UniqueName: \"kubernetes.io/projected/9233a6db-477d-49d8-b4cd-e1ff4dcfdf64-kube-api-access-jqtm2\") pod \"nova-api-0\" (UID: \"9233a6db-477d-49d8-b4cd-e1ff4dcfdf64\") " pod="openstack/nova-api-0"
Mar 10 22:13:59 crc kubenswrapper[4919]: I0310 22:13:59.821617 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c696db8f-44d0-42ad-aa56-5e889eef767f-dns-swift-storage-0\") pod \"dnsmasq-dns-97cdf8549-f9vxh\" (UID: \"c696db8f-44d0-42ad-aa56-5e889eef767f\") " pod="openstack/dnsmasq-dns-97cdf8549-f9vxh"
Mar 10 22:13:59 crc kubenswrapper[4919]: I0310 22:13:59.821678 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9233a6db-477d-49d8-b4cd-e1ff4dcfdf64-logs\") pod \"nova-api-0\" (UID: \"9233a6db-477d-49d8-b4cd-e1ff4dcfdf64\") " pod="openstack/nova-api-0"
Mar 10 22:13:59 crc kubenswrapper[4919]: I0310 22:13:59.827468 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a194fe14-439b-4a8a-acb8-ea5852e0721b-logs\") pod \"nova-metadata-0\" (UID: \"a194fe14-439b-4a8a-acb8-ea5852e0721b\") " pod="openstack/nova-metadata-0"
Mar 10 22:13:59 crc kubenswrapper[4919]: I0310 22:13:59.839042 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a194fe14-439b-4a8a-acb8-ea5852e0721b-config-data\") pod \"nova-metadata-0\" (UID: \"a194fe14-439b-4a8a-acb8-ea5852e0721b\") " pod="openstack/nova-metadata-0"
Mar 10 22:13:59 crc kubenswrapper[4919]: I0310 22:13:59.851861 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a194fe14-439b-4a8a-acb8-ea5852e0721b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a194fe14-439b-4a8a-acb8-ea5852e0721b\") " pod="openstack/nova-metadata-0"
Mar 10 22:13:59 crc kubenswrapper[4919]: I0310 22:13:59.874009 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 10 22:13:59 crc kubenswrapper[4919]: I0310 22:13:59.876113 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6fwlj\" (UniqueName: \"kubernetes.io/projected/a194fe14-439b-4a8a-acb8-ea5852e0721b-kube-api-access-6fwlj\") pod \"nova-metadata-0\" (UID: \"a194fe14-439b-4a8a-acb8-ea5852e0721b\") " pod="openstack/nova-metadata-0"
Mar 10 22:13:59 crc kubenswrapper[4919]: I0310 22:13:59.928221 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5qw4h\" (UniqueName: \"kubernetes.io/projected/c696db8f-44d0-42ad-aa56-5e889eef767f-kube-api-access-5qw4h\") pod \"dnsmasq-dns-97cdf8549-f9vxh\" (UID: \"c696db8f-44d0-42ad-aa56-5e889eef767f\") " pod="openstack/dnsmasq-dns-97cdf8549-f9vxh"
Mar 10 22:13:59 crc kubenswrapper[4919]: I0310 22:13:59.928262 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c696db8f-44d0-42ad-aa56-5e889eef767f-ovsdbserver-nb\") pod \"dnsmasq-dns-97cdf8549-f9vxh\" (UID: \"c696db8f-44d0-42ad-aa56-5e889eef767f\") " pod="openstack/dnsmasq-dns-97cdf8549-f9vxh"
Mar 10 22:13:59 crc kubenswrapper[4919]: I0310 22:13:59.928291 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c696db8f-44d0-42ad-aa56-5e889eef767f-ovsdbserver-sb\") pod \"dnsmasq-dns-97cdf8549-f9vxh\" (UID: \"c696db8f-44d0-42ad-aa56-5e889eef767f\") " pod="openstack/dnsmasq-dns-97cdf8549-f9vxh"
Mar 10 22:13:59 crc kubenswrapper[4919]: I0310 22:13:59.928309 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c696db8f-44d0-42ad-aa56-5e889eef767f-dns-svc\") pod \"dnsmasq-dns-97cdf8549-f9vxh\" (UID: \"c696db8f-44d0-42ad-aa56-5e889eef767f\") " pod="openstack/dnsmasq-dns-97cdf8549-f9vxh"
Mar 10 22:13:59 crc kubenswrapper[4919]: I0310 22:13:59.928341 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jqtm2\" (UniqueName: \"kubernetes.io/projected/9233a6db-477d-49d8-b4cd-e1ff4dcfdf64-kube-api-access-jqtm2\") pod \"nova-api-0\" (UID: \"9233a6db-477d-49d8-b4cd-e1ff4dcfdf64\") " pod="openstack/nova-api-0"
Mar 10 22:13:59 crc kubenswrapper[4919]: I0310 22:13:59.928359 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c696db8f-44d0-42ad-aa56-5e889eef767f-dns-swift-storage-0\") pod \"dnsmasq-dns-97cdf8549-f9vxh\" (UID: \"c696db8f-44d0-42ad-aa56-5e889eef767f\") " pod="openstack/dnsmasq-dns-97cdf8549-f9vxh"
Mar 10 22:13:59 crc kubenswrapper[4919]: I0310 22:13:59.928397 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9233a6db-477d-49d8-b4cd-e1ff4dcfdf64-logs\") pod \"nova-api-0\" (UID: \"9233a6db-477d-49d8-b4cd-e1ff4dcfdf64\") " pod="openstack/nova-api-0"
Mar 10 22:13:59 crc kubenswrapper[4919]: I0310 22:13:59.928445 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9233a6db-477d-49d8-b4cd-e1ff4dcfdf64-config-data\") pod \"nova-api-0\" (UID: \"9233a6db-477d-49d8-b4cd-e1ff4dcfdf64\") " pod="openstack/nova-api-0"
Mar 10 22:13:59 crc kubenswrapper[4919]: I0310 22:13:59.928477 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9233a6db-477d-49d8-b4cd-e1ff4dcfdf64-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"9233a6db-477d-49d8-b4cd-e1ff4dcfdf64\") " pod="openstack/nova-api-0"
Mar 10 22:13:59 crc kubenswrapper[4919]: I0310 22:13:59.928522 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c696db8f-44d0-42ad-aa56-5e889eef767f-config\") pod \"dnsmasq-dns-97cdf8549-f9vxh\" (UID: \"c696db8f-44d0-42ad-aa56-5e889eef767f\") " pod="openstack/dnsmasq-dns-97cdf8549-f9vxh"
Mar 10 22:13:59 crc kubenswrapper[4919]: I0310 22:13:59.932751 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c696db8f-44d0-42ad-aa56-5e889eef767f-config\") pod \"dnsmasq-dns-97cdf8549-f9vxh\" (UID: \"c696db8f-44d0-42ad-aa56-5e889eef767f\") " pod="openstack/dnsmasq-dns-97cdf8549-f9vxh"
Mar 10 22:13:59 crc kubenswrapper[4919]: I0310 22:13:59.934490 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c696db8f-44d0-42ad-aa56-5e889eef767f-ovsdbserver-nb\") pod \"dnsmasq-dns-97cdf8549-f9vxh\" (UID: \"c696db8f-44d0-42ad-aa56-5e889eef767f\") " pod="openstack/dnsmasq-dns-97cdf8549-f9vxh"
Mar 10 22:13:59 crc kubenswrapper[4919]: I0310 22:13:59.935020 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c696db8f-44d0-42ad-aa56-5e889eef767f-ovsdbserver-sb\") pod \"dnsmasq-dns-97cdf8549-f9vxh\" (UID: \"c696db8f-44d0-42ad-aa56-5e889eef767f\") " pod="openstack/dnsmasq-dns-97cdf8549-f9vxh"
Mar 10 22:13:59 crc kubenswrapper[4919]: I0310 22:13:59.948292 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c696db8f-44d0-42ad-aa56-5e889eef767f-dns-svc\") pod \"dnsmasq-dns-97cdf8549-f9vxh\" (UID: \"c696db8f-44d0-42ad-aa56-5e889eef767f\") " pod="openstack/dnsmasq-dns-97cdf8549-f9vxh"
Mar 10 22:13:59 crc kubenswrapper[4919]: I0310 22:13:59.949254 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c696db8f-44d0-42ad-aa56-5e889eef767f-dns-swift-storage-0\") pod \"dnsmasq-dns-97cdf8549-f9vxh\" (UID: \"c696db8f-44d0-42ad-aa56-5e889eef767f\") " pod="openstack/dnsmasq-dns-97cdf8549-f9vxh"
Mar 10 22:13:59 crc kubenswrapper[4919]: I0310 22:13:59.949535 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9233a6db-477d-49d8-b4cd-e1ff4dcfdf64-logs\") pod \"nova-api-0\" (UID: \"9233a6db-477d-49d8-b4cd-e1ff4dcfdf64\") " pod="openstack/nova-api-0"
Mar 10 22:13:59 crc kubenswrapper[4919]: I0310 22:13:59.955936 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 10 22:13:59 crc kubenswrapper[4919]: I0310 22:13:59.963504 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 10 22:13:59 crc kubenswrapper[4919]: I0310 22:13:59.964524 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9233a6db-477d-49d8-b4cd-e1ff4dcfdf64-config-data\") pod \"nova-api-0\" (UID: \"9233a6db-477d-49d8-b4cd-e1ff4dcfdf64\") " pod="openstack/nova-api-0"
Mar 10 22:13:59 crc kubenswrapper[4919]: I0310 22:13:59.964746 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Mar 10 22:13:59 crc kubenswrapper[4919]: I0310 22:13:59.971235 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data"
Mar 10 22:13:59 crc kubenswrapper[4919]: I0310 22:13:59.972896 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9233a6db-477d-49d8-b4cd-e1ff4dcfdf64-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"9233a6db-477d-49d8-b4cd-e1ff4dcfdf64\") " pod="openstack/nova-api-0"
Mar 10 22:13:59 crc kubenswrapper[4919]: I0310 22:13:59.975851 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 10 22:13:59 crc kubenswrapper[4919]: I0310 22:13:59.980151 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5qw4h\" (UniqueName: \"kubernetes.io/projected/c696db8f-44d0-42ad-aa56-5e889eef767f-kube-api-access-5qw4h\") pod \"dnsmasq-dns-97cdf8549-f9vxh\" (UID: \"c696db8f-44d0-42ad-aa56-5e889eef767f\") " pod="openstack/dnsmasq-dns-97cdf8549-f9vxh"
Mar 10 22:14:00 crc kubenswrapper[4919]: I0310 22:14:00.006979 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqtm2\" (UniqueName: \"kubernetes.io/projected/9233a6db-477d-49d8-b4cd-e1ff4dcfdf64-kube-api-access-jqtm2\") pod \"nova-api-0\" (UID: \"9233a6db-477d-49d8-b4cd-e1ff4dcfdf64\") " pod="openstack/nova-api-0"
Mar 10 22:14:00 crc kubenswrapper[4919]: I0310 22:14:00.146267 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40528b7c-2c67-4413-ba07-5e3e6af9c18b-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"40528b7c-2c67-4413-ba07-5e3e6af9c18b\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 10 22:14:00 crc kubenswrapper[4919]: I0310 22:14:00.146456 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40528b7c-2c67-4413-ba07-5e3e6af9c18b-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"40528b7c-2c67-4413-ba07-5e3e6af9c18b\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 10 22:14:00 crc kubenswrapper[4919]: I0310 22:14:00.146488 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5trd9\" (UniqueName: \"kubernetes.io/projected/40528b7c-2c67-4413-ba07-5e3e6af9c18b-kube-api-access-5trd9\") pod \"nova-cell1-novncproxy-0\" (UID: \"40528b7c-2c67-4413-ba07-5e3e6af9c18b\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 10 22:14:00 crc kubenswrapper[4919]: I0310 22:14:00.171578 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553014-d5fgg"]
Mar 10 22:14:00 crc kubenswrapper[4919]: I0310 22:14:00.171850 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 10 22:14:00 crc kubenswrapper[4919]: I0310 22:14:00.173894 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553014-d5fgg"
Mar 10 22:14:00 crc kubenswrapper[4919]: I0310 22:14:00.180805 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553014-d5fgg"]
Mar 10 22:14:00 crc kubenswrapper[4919]: I0310 22:14:00.187937 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 10 22:14:00 crc kubenswrapper[4919]: I0310 22:14:00.188019 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 10 22:14:00 crc kubenswrapper[4919]: I0310 22:14:00.188281 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wbvtv"
Mar 10 22:14:00 crc kubenswrapper[4919]: I0310 22:14:00.193044 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-97cdf8549-f9vxh"
Mar 10 22:14:00 crc kubenswrapper[4919]: I0310 22:14:00.248307 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40528b7c-2c67-4413-ba07-5e3e6af9c18b-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"40528b7c-2c67-4413-ba07-5e3e6af9c18b\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 10 22:14:00 crc kubenswrapper[4919]: I0310 22:14:00.248357 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5trd9\" (UniqueName: \"kubernetes.io/projected/40528b7c-2c67-4413-ba07-5e3e6af9c18b-kube-api-access-5trd9\") pod \"nova-cell1-novncproxy-0\" (UID: \"40528b7c-2c67-4413-ba07-5e3e6af9c18b\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 10 22:14:00 crc kubenswrapper[4919]: I0310 22:14:00.248452 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40528b7c-2c67-4413-ba07-5e3e6af9c18b-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"40528b7c-2c67-4413-ba07-5e3e6af9c18b\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 10 22:14:00 crc kubenswrapper[4919]: I0310 22:14:00.253241 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40528b7c-2c67-4413-ba07-5e3e6af9c18b-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"40528b7c-2c67-4413-ba07-5e3e6af9c18b\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 10 22:14:00 crc kubenswrapper[4919]: I0310 22:14:00.253533 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40528b7c-2c67-4413-ba07-5e3e6af9c18b-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"40528b7c-2c67-4413-ba07-5e3e6af9c18b\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 10 22:14:00 crc kubenswrapper[4919]: I0310 22:14:00.270743 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5trd9\" (UniqueName: \"kubernetes.io/projected/40528b7c-2c67-4413-ba07-5e3e6af9c18b-kube-api-access-5trd9\") pod \"nova-cell1-novncproxy-0\" (UID: \"40528b7c-2c67-4413-ba07-5e3e6af9c18b\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 10 22:14:00 crc kubenswrapper[4919]: I0310 22:14:00.352463 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-72gfd\" (UniqueName: \"kubernetes.io/projected/90cad2d4-b151-4000-82ee-fed894ad117a-kube-api-access-72gfd\") pod \"auto-csr-approver-29553014-d5fgg\" (UID: \"90cad2d4-b151-4000-82ee-fed894ad117a\") " pod="openshift-infra/auto-csr-approver-29553014-d5fgg"
Mar 10 22:14:00 crc kubenswrapper[4919]: I0310 22:14:00.386776 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Mar 10 22:14:00 crc kubenswrapper[4919]: I0310 22:14:00.454955 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-72gfd\" (UniqueName: \"kubernetes.io/projected/90cad2d4-b151-4000-82ee-fed894ad117a-kube-api-access-72gfd\") pod \"auto-csr-approver-29553014-d5fgg\" (UID: \"90cad2d4-b151-4000-82ee-fed894ad117a\") " pod="openshift-infra/auto-csr-approver-29553014-d5fgg"
Mar 10 22:14:00 crc kubenswrapper[4919]: I0310 22:14:00.484498 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-72gfd\" (UniqueName: \"kubernetes.io/projected/90cad2d4-b151-4000-82ee-fed894ad117a-kube-api-access-72gfd\") pod \"auto-csr-approver-29553014-d5fgg\" (UID: \"90cad2d4-b151-4000-82ee-fed894ad117a\") " pod="openshift-infra/auto-csr-approver-29553014-d5fgg"
Mar 10 22:14:00 crc kubenswrapper[4919]: I0310 22:14:00.516395 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553014-d5fgg"
Mar 10 22:14:00 crc kubenswrapper[4919]: I0310 22:14:00.621367 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-tkhg8"]
Mar 10 22:14:00 crc kubenswrapper[4919]: I0310 22:14:00.781261 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 10 22:14:00 crc kubenswrapper[4919]: I0310 22:14:00.791014 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Mar 10 22:14:00 crc kubenswrapper[4919]: I0310 22:14:00.813753 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-lxbqg"]
Mar 10 22:14:00 crc kubenswrapper[4919]: I0310 22:14:00.816458 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-lxbqg"
Mar 10 22:14:00 crc kubenswrapper[4919]: I0310 22:14:00.818919 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts"
Mar 10 22:14:00 crc kubenswrapper[4919]: I0310 22:14:00.819123 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data"
Mar 10 22:14:00 crc kubenswrapper[4919]: I0310 22:14:00.852108 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-lxbqg"]
Mar 10 22:14:00 crc kubenswrapper[4919]: I0310 22:14:00.946216 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Mar 10 22:14:00 crc kubenswrapper[4919]: I0310 22:14:00.963539 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da024d80-ca57-41a6-b46a-508015462b2d-scripts\") pod \"nova-cell1-conductor-db-sync-lxbqg\" (UID: \"da024d80-ca57-41a6-b46a-508015462b2d\") " pod="openstack/nova-cell1-conductor-db-sync-lxbqg"
Mar 10 22:14:00 crc kubenswrapper[4919]: I0310 22:14:00.963591 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-86bbb\" (UniqueName: \"kubernetes.io/projected/da024d80-ca57-41a6-b46a-508015462b2d-kube-api-access-86bbb\") pod \"nova-cell1-conductor-db-sync-lxbqg\" (UID: \"da024d80-ca57-41a6-b46a-508015462b2d\") " pod="openstack/nova-cell1-conductor-db-sync-lxbqg"
Mar 10 22:14:00 crc kubenswrapper[4919]: I0310 22:14:00.963652 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da024d80-ca57-41a6-b46a-508015462b2d-config-data\") pod \"nova-cell1-conductor-db-sync-lxbqg\" (UID: \"da024d80-ca57-41a6-b46a-508015462b2d\") " pod="openstack/nova-cell1-conductor-db-sync-lxbqg"
Mar 10 22:14:00 crc kubenswrapper[4919]: I0310 22:14:00.963774 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da024d80-ca57-41a6-b46a-508015462b2d-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-lxbqg\" (UID: \"da024d80-ca57-41a6-b46a-508015462b2d\") " pod="openstack/nova-cell1-conductor-db-sync-lxbqg"
Mar 10 22:14:01 crc kubenswrapper[4919]: I0310 22:14:01.065833 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da024d80-ca57-41a6-b46a-508015462b2d-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-lxbqg\" (UID: \"da024d80-ca57-41a6-b46a-508015462b2d\") " pod="openstack/nova-cell1-conductor-db-sync-lxbqg"
Mar 10 22:14:01 crc kubenswrapper[4919]: I0310 22:14:01.066178 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da024d80-ca57-41a6-b46a-508015462b2d-scripts\") pod \"nova-cell1-conductor-db-sync-lxbqg\" (UID: \"da024d80-ca57-41a6-b46a-508015462b2d\") " pod="openstack/nova-cell1-conductor-db-sync-lxbqg"
Mar 10 22:14:01 crc kubenswrapper[4919]: I0310 22:14:01.066207 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-86bbb\" (UniqueName: \"kubernetes.io/projected/da024d80-ca57-41a6-b46a-508015462b2d-kube-api-access-86bbb\") pod \"nova-cell1-conductor-db-sync-lxbqg\" (UID: \"da024d80-ca57-41a6-b46a-508015462b2d\") " pod="openstack/nova-cell1-conductor-db-sync-lxbqg"
Mar 10 22:14:01 crc kubenswrapper[4919]: I0310 22:14:01.066856 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da024d80-ca57-41a6-b46a-508015462b2d-config-data\") pod \"nova-cell1-conductor-db-sync-lxbqg\" (UID: \"da024d80-ca57-41a6-b46a-508015462b2d\") " 
pod="openstack/nova-cell1-conductor-db-sync-lxbqg" Mar 10 22:14:01 crc kubenswrapper[4919]: I0310 22:14:01.071824 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da024d80-ca57-41a6-b46a-508015462b2d-scripts\") pod \"nova-cell1-conductor-db-sync-lxbqg\" (UID: \"da024d80-ca57-41a6-b46a-508015462b2d\") " pod="openstack/nova-cell1-conductor-db-sync-lxbqg" Mar 10 22:14:01 crc kubenswrapper[4919]: I0310 22:14:01.073981 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da024d80-ca57-41a6-b46a-508015462b2d-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-lxbqg\" (UID: \"da024d80-ca57-41a6-b46a-508015462b2d\") " pod="openstack/nova-cell1-conductor-db-sync-lxbqg" Mar 10 22:14:01 crc kubenswrapper[4919]: I0310 22:14:01.075079 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da024d80-ca57-41a6-b46a-508015462b2d-config-data\") pod \"nova-cell1-conductor-db-sync-lxbqg\" (UID: \"da024d80-ca57-41a6-b46a-508015462b2d\") " pod="openstack/nova-cell1-conductor-db-sync-lxbqg" Mar 10 22:14:01 crc kubenswrapper[4919]: I0310 22:14:01.087128 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-86bbb\" (UniqueName: \"kubernetes.io/projected/da024d80-ca57-41a6-b46a-508015462b2d-kube-api-access-86bbb\") pod \"nova-cell1-conductor-db-sync-lxbqg\" (UID: \"da024d80-ca57-41a6-b46a-508015462b2d\") " pod="openstack/nova-cell1-conductor-db-sync-lxbqg" Mar 10 22:14:01 crc kubenswrapper[4919]: I0310 22:14:01.128653 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-97cdf8549-f9vxh"] Mar 10 22:14:01 crc kubenswrapper[4919]: I0310 22:14:01.137194 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-lxbqg" Mar 10 22:14:01 crc kubenswrapper[4919]: I0310 22:14:01.241631 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553014-d5fgg"] Mar 10 22:14:01 crc kubenswrapper[4919]: I0310 22:14:01.250804 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 10 22:14:01 crc kubenswrapper[4919]: W0310 22:14:01.297846 4919 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod40528b7c_2c67_4413_ba07_5e3e6af9c18b.slice/crio-14fc9c8f0059edfecaa123bb4639e1593c73c895fc6d4e9cd011404563d93cba WatchSource:0}: Error finding container 14fc9c8f0059edfecaa123bb4639e1593c73c895fc6d4e9cd011404563d93cba: Status 404 returned error can't find the container with id 14fc9c8f0059edfecaa123bb4639e1593c73c895fc6d4e9cd011404563d93cba Mar 10 22:14:01 crc kubenswrapper[4919]: I0310 22:14:01.411824 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9233a6db-477d-49d8-b4cd-e1ff4dcfdf64","Type":"ContainerStarted","Data":"4f10c98e7f5ab9df7ceb8a470f6c457a9848d820fdca108c5f1df8a06b4a3361"} Mar 10 22:14:01 crc kubenswrapper[4919]: I0310 22:14:01.414496 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"2b9dac23-a973-4e49-91bc-0a1c0ca4b998","Type":"ContainerStarted","Data":"dfc27311966fe11ec45ad0c6062d2eeeceb81dc5b761887c18ee173c88bdb02e"} Mar 10 22:14:01 crc kubenswrapper[4919]: I0310 22:14:01.424534 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a194fe14-439b-4a8a-acb8-ea5852e0721b","Type":"ContainerStarted","Data":"4bbb82b02c3e7eeb0c6f9e4f3f975f2842975ced8b89f5486b1b4ea219d7c936"} Mar 10 22:14:01 crc kubenswrapper[4919]: I0310 22:14:01.426790 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-97cdf8549-f9vxh" event={"ID":"c696db8f-44d0-42ad-aa56-5e889eef767f","Type":"ContainerStarted","Data":"2318e4f3647fd69ecdd613964e273d02069cbc4fb038b1136fbad533fe7da705"} Mar 10 22:14:01 crc kubenswrapper[4919]: I0310 22:14:01.428473 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"40528b7c-2c67-4413-ba07-5e3e6af9c18b","Type":"ContainerStarted","Data":"14fc9c8f0059edfecaa123bb4639e1593c73c895fc6d4e9cd011404563d93cba"} Mar 10 22:14:01 crc kubenswrapper[4919]: I0310 22:14:01.429361 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-tkhg8" event={"ID":"3a9a0463-5f7a-4164-9fd7-a7bce608bf41","Type":"ContainerStarted","Data":"fcb0215525c257f40fbb026eda215eee7e909386a54b513f6d4e594c4b7c8077"} Mar 10 22:14:01 crc kubenswrapper[4919]: I0310 22:14:01.429414 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-tkhg8" event={"ID":"3a9a0463-5f7a-4164-9fd7-a7bce608bf41","Type":"ContainerStarted","Data":"27914f5c02284e0dbaac986ce931c4e363b466dc24a9649d799ef876de6dab1b"} Mar 10 22:14:01 crc kubenswrapper[4919]: I0310 22:14:01.432079 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553014-d5fgg" event={"ID":"90cad2d4-b151-4000-82ee-fed894ad117a","Type":"ContainerStarted","Data":"b96a3635614bac06a18b76764bea33b8a1270c4b87e2f8b5a8548409e6211882"} Mar 10 22:14:01 crc kubenswrapper[4919]: I0310 22:14:01.456305 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-tkhg8" podStartSLOduration=2.456285125 podStartE2EDuration="2.456285125s" podCreationTimestamp="2026-03-10 22:13:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 22:14:01.451192027 +0000 UTC m=+1428.693072635" watchObservedRunningTime="2026-03-10 
22:14:01.456285125 +0000 UTC m=+1428.698165733" Mar 10 22:14:01 crc kubenswrapper[4919]: I0310 22:14:01.613595 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-lxbqg"] Mar 10 22:14:02 crc kubenswrapper[4919]: I0310 22:14:02.443519 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-lxbqg" event={"ID":"da024d80-ca57-41a6-b46a-508015462b2d","Type":"ContainerStarted","Data":"92cf38036c784cc198b471c45b92409cdc090c7ce641ef5b3d72746ab8ddf341"} Mar 10 22:14:02 crc kubenswrapper[4919]: I0310 22:14:02.444073 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-lxbqg" event={"ID":"da024d80-ca57-41a6-b46a-508015462b2d","Type":"ContainerStarted","Data":"c645d0a50ebfa46a9e6e806503e4b2d6c84235d386af51a86717920e70ac2a2e"} Mar 10 22:14:02 crc kubenswrapper[4919]: I0310 22:14:02.445732 4919 generic.go:334] "Generic (PLEG): container finished" podID="c696db8f-44d0-42ad-aa56-5e889eef767f" containerID="c44074a20c53eb2ed6f9c4abf0f3fd5c15a98589633492a62792d7ca376ffb2b" exitCode=0 Mar 10 22:14:02 crc kubenswrapper[4919]: I0310 22:14:02.445903 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-97cdf8549-f9vxh" event={"ID":"c696db8f-44d0-42ad-aa56-5e889eef767f","Type":"ContainerDied","Data":"c44074a20c53eb2ed6f9c4abf0f3fd5c15a98589633492a62792d7ca376ffb2b"} Mar 10 22:14:02 crc kubenswrapper[4919]: I0310 22:14:02.471172 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-lxbqg" podStartSLOduration=2.471155192 podStartE2EDuration="2.471155192s" podCreationTimestamp="2026-03-10 22:14:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 22:14:02.464922412 +0000 UTC m=+1429.706803020" watchObservedRunningTime="2026-03-10 22:14:02.471155192 +0000 UTC 
m=+1429.713035800" Mar 10 22:14:03 crc kubenswrapper[4919]: I0310 22:14:03.994262 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 10 22:14:04 crc kubenswrapper[4919]: I0310 22:14:04.007914 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 10 22:14:04 crc kubenswrapper[4919]: I0310 22:14:04.323605 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 10 22:14:05 crc kubenswrapper[4919]: I0310 22:14:05.493050 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9233a6db-477d-49d8-b4cd-e1ff4dcfdf64","Type":"ContainerStarted","Data":"8e73b7ac472be8ed998b5633fdf3401fbbfa36afb2a21eae9b9838a31fbf8bc2"} Mar 10 22:14:05 crc kubenswrapper[4919]: I0310 22:14:05.493675 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9233a6db-477d-49d8-b4cd-e1ff4dcfdf64","Type":"ContainerStarted","Data":"aec519983db64c35b7cd0fd778f2f6c5d4d6a401422e68bd689062c806f8eb06"} Mar 10 22:14:05 crc kubenswrapper[4919]: I0310 22:14:05.505705 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"2b9dac23-a973-4e49-91bc-0a1c0ca4b998","Type":"ContainerStarted","Data":"9191c1eb6cc3803fa081fb82c3628a7177f29066dce3a180d7adde16f4c81e3b"} Mar 10 22:14:05 crc kubenswrapper[4919]: I0310 22:14:05.527692 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a194fe14-439b-4a8a-acb8-ea5852e0721b","Type":"ContainerStarted","Data":"a822a4830c2ddac2986bb0da695e888f47bdee1c9537c48e8a9eca34965b4152"} Mar 10 22:14:05 crc kubenswrapper[4919]: I0310 22:14:05.527745 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a194fe14-439b-4a8a-acb8-ea5852e0721b","Type":"ContainerStarted","Data":"9ca64caff83423ef820e88e7066f059696e286b5743e772904209c47956fe9cc"} Mar 
10 22:14:05 crc kubenswrapper[4919]: I0310 22:14:05.527904 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="a194fe14-439b-4a8a-acb8-ea5852e0721b" containerName="nova-metadata-log" containerID="cri-o://9ca64caff83423ef820e88e7066f059696e286b5743e772904209c47956fe9cc" gracePeriod=30 Mar 10 22:14:05 crc kubenswrapper[4919]: I0310 22:14:05.528223 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="a194fe14-439b-4a8a-acb8-ea5852e0721b" containerName="nova-metadata-metadata" containerID="cri-o://a822a4830c2ddac2986bb0da695e888f47bdee1c9537c48e8a9eca34965b4152" gracePeriod=30 Mar 10 22:14:05 crc kubenswrapper[4919]: I0310 22:14:05.535441 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.7670166529999998 podStartE2EDuration="6.535423504s" podCreationTimestamp="2026-03-10 22:13:59 +0000 UTC" firstStartedPulling="2026-03-10 22:14:00.94713694 +0000 UTC m=+1428.189017548" lastFinishedPulling="2026-03-10 22:14:04.715543791 +0000 UTC m=+1431.957424399" observedRunningTime="2026-03-10 22:14:05.532988037 +0000 UTC m=+1432.774868655" watchObservedRunningTime="2026-03-10 22:14:05.535423504 +0000 UTC m=+1432.777304112" Mar 10 22:14:05 crc kubenswrapper[4919]: I0310 22:14:05.538114 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-97cdf8549-f9vxh" event={"ID":"c696db8f-44d0-42ad-aa56-5e889eef767f","Type":"ContainerStarted","Data":"79b5348b47198913c64dcba35b3e10b6d963ff43c6b3676ac87eda8bf5f41849"} Mar 10 22:14:05 crc kubenswrapper[4919]: I0310 22:14:05.538664 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-97cdf8549-f9vxh" Mar 10 22:14:05 crc kubenswrapper[4919]: I0310 22:14:05.550339 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" 
event={"ID":"40528b7c-2c67-4413-ba07-5e3e6af9c18b","Type":"ContainerStarted","Data":"f85528e61f9082369f6714d5db2710672443323119740a66a03227d7ef379e5a"} Mar 10 22:14:05 crc kubenswrapper[4919]: I0310 22:14:05.550541 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="40528b7c-2c67-4413-ba07-5e3e6af9c18b" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://f85528e61f9082369f6714d5db2710672443323119740a66a03227d7ef379e5a" gracePeriod=30 Mar 10 22:14:05 crc kubenswrapper[4919]: I0310 22:14:05.559994 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.627878084 podStartE2EDuration="6.559975709s" podCreationTimestamp="2026-03-10 22:13:59 +0000 UTC" firstStartedPulling="2026-03-10 22:14:00.78326453 +0000 UTC m=+1428.025145138" lastFinishedPulling="2026-03-10 22:14:04.715362155 +0000 UTC m=+1431.957242763" observedRunningTime="2026-03-10 22:14:05.557168893 +0000 UTC m=+1432.799049501" watchObservedRunningTime="2026-03-10 22:14:05.559975709 +0000 UTC m=+1432.801856327" Mar 10 22:14:05 crc kubenswrapper[4919]: I0310 22:14:05.581115 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553014-d5fgg" event={"ID":"90cad2d4-b151-4000-82ee-fed894ad117a","Type":"ContainerStarted","Data":"c87e48a8d3a96530e75f3196af093428f39eb925905d84a7251111e71248682b"} Mar 10 22:14:05 crc kubenswrapper[4919]: I0310 22:14:05.595802 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.721394907 podStartE2EDuration="6.595776749s" podCreationTimestamp="2026-03-10 22:13:59 +0000 UTC" firstStartedPulling="2026-03-10 22:14:00.839617926 +0000 UTC m=+1428.081498524" lastFinishedPulling="2026-03-10 22:14:04.713999758 +0000 UTC m=+1431.955880366" observedRunningTime="2026-03-10 22:14:05.590254379 +0000 UTC m=+1432.832134977" 
watchObservedRunningTime="2026-03-10 22:14:05.595776749 +0000 UTC m=+1432.837657357" Mar 10 22:14:05 crc kubenswrapper[4919]: I0310 22:14:05.628676 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=3.215399782 podStartE2EDuration="6.62865749s" podCreationTimestamp="2026-03-10 22:13:59 +0000 UTC" firstStartedPulling="2026-03-10 22:14:01.302633332 +0000 UTC m=+1428.544513940" lastFinishedPulling="2026-03-10 22:14:04.71589105 +0000 UTC m=+1431.957771648" observedRunningTime="2026-03-10 22:14:05.617817346 +0000 UTC m=+1432.859697954" watchObservedRunningTime="2026-03-10 22:14:05.62865749 +0000 UTC m=+1432.870538088" Mar 10 22:14:05 crc kubenswrapper[4919]: I0310 22:14:05.675621 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-97cdf8549-f9vxh" podStartSLOduration=6.675596861 podStartE2EDuration="6.675596861s" podCreationTimestamp="2026-03-10 22:13:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 22:14:05.646154524 +0000 UTC m=+1432.888035132" watchObservedRunningTime="2026-03-10 22:14:05.675596861 +0000 UTC m=+1432.917477469" Mar 10 22:14:05 crc kubenswrapper[4919]: I0310 22:14:05.718210 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29553014-d5fgg" podStartSLOduration=2.907810713 podStartE2EDuration="5.718189116s" podCreationTimestamp="2026-03-10 22:14:00 +0000 UTC" firstStartedPulling="2026-03-10 22:14:01.287204474 +0000 UTC m=+1428.529085082" lastFinishedPulling="2026-03-10 22:14:04.097582877 +0000 UTC m=+1431.339463485" observedRunningTime="2026-03-10 22:14:05.672541099 +0000 UTC m=+1432.914421717" watchObservedRunningTime="2026-03-10 22:14:05.718189116 +0000 UTC m=+1432.960069724" Mar 10 22:14:06 crc kubenswrapper[4919]: I0310 22:14:06.591876 4919 generic.go:334] 
"Generic (PLEG): container finished" podID="a194fe14-439b-4a8a-acb8-ea5852e0721b" containerID="9ca64caff83423ef820e88e7066f059696e286b5743e772904209c47956fe9cc" exitCode=143 Mar 10 22:14:06 crc kubenswrapper[4919]: I0310 22:14:06.591953 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a194fe14-439b-4a8a-acb8-ea5852e0721b","Type":"ContainerDied","Data":"9ca64caff83423ef820e88e7066f059696e286b5743e772904209c47956fe9cc"} Mar 10 22:14:06 crc kubenswrapper[4919]: I0310 22:14:06.594456 4919 generic.go:334] "Generic (PLEG): container finished" podID="90cad2d4-b151-4000-82ee-fed894ad117a" containerID="c87e48a8d3a96530e75f3196af093428f39eb925905d84a7251111e71248682b" exitCode=0 Mar 10 22:14:06 crc kubenswrapper[4919]: I0310 22:14:06.594536 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553014-d5fgg" event={"ID":"90cad2d4-b151-4000-82ee-fed894ad117a","Type":"ContainerDied","Data":"c87e48a8d3a96530e75f3196af093428f39eb925905d84a7251111e71248682b"} Mar 10 22:14:07 crc kubenswrapper[4919]: I0310 22:14:07.960234 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553014-d5fgg" Mar 10 22:14:08 crc kubenswrapper[4919]: I0310 22:14:08.159063 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-72gfd\" (UniqueName: \"kubernetes.io/projected/90cad2d4-b151-4000-82ee-fed894ad117a-kube-api-access-72gfd\") pod \"90cad2d4-b151-4000-82ee-fed894ad117a\" (UID: \"90cad2d4-b151-4000-82ee-fed894ad117a\") " Mar 10 22:14:08 crc kubenswrapper[4919]: I0310 22:14:08.165230 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90cad2d4-b151-4000-82ee-fed894ad117a-kube-api-access-72gfd" (OuterVolumeSpecName: "kube-api-access-72gfd") pod "90cad2d4-b151-4000-82ee-fed894ad117a" (UID: "90cad2d4-b151-4000-82ee-fed894ad117a"). 
InnerVolumeSpecName "kube-api-access-72gfd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 22:14:08 crc kubenswrapper[4919]: I0310 22:14:08.262241 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-72gfd\" (UniqueName: \"kubernetes.io/projected/90cad2d4-b151-4000-82ee-fed894ad117a-kube-api-access-72gfd\") on node \"crc\" DevicePath \"\"" Mar 10 22:14:08 crc kubenswrapper[4919]: I0310 22:14:08.612333 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553014-d5fgg" event={"ID":"90cad2d4-b151-4000-82ee-fed894ad117a","Type":"ContainerDied","Data":"b96a3635614bac06a18b76764bea33b8a1270c4b87e2f8b5a8548409e6211882"} Mar 10 22:14:08 crc kubenswrapper[4919]: I0310 22:14:08.612371 4919 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b96a3635614bac06a18b76764bea33b8a1270c4b87e2f8b5a8548409e6211882" Mar 10 22:14:08 crc kubenswrapper[4919]: I0310 22:14:08.612479 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553014-d5fgg" Mar 10 22:14:08 crc kubenswrapper[4919]: I0310 22:14:08.898742 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 10 22:14:08 crc kubenswrapper[4919]: I0310 22:14:08.898964 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="28cf0af6-9a5f-445a-98fc-2251bcd48109" containerName="kube-state-metrics" containerID="cri-o://10a8e5e7617ef25582ab8c22f76dbeffa74f59e87e46ef9e71ef4b2667694b2c" gracePeriod=30 Mar 10 22:14:09 crc kubenswrapper[4919]: I0310 22:14:09.039835 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553008-6m44j"] Mar 10 22:14:09 crc kubenswrapper[4919]: I0310 22:14:09.047305 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553008-6m44j"] Mar 10 22:14:09 crc kubenswrapper[4919]: I0310 22:14:09.337856 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 10 22:14:09 crc kubenswrapper[4919]: I0310 22:14:09.386447 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zrcds\" (UniqueName: \"kubernetes.io/projected/28cf0af6-9a5f-445a-98fc-2251bcd48109-kube-api-access-zrcds\") pod \"28cf0af6-9a5f-445a-98fc-2251bcd48109\" (UID: \"28cf0af6-9a5f-445a-98fc-2251bcd48109\") " Mar 10 22:14:09 crc kubenswrapper[4919]: I0310 22:14:09.394643 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28cf0af6-9a5f-445a-98fc-2251bcd48109-kube-api-access-zrcds" (OuterVolumeSpecName: "kube-api-access-zrcds") pod "28cf0af6-9a5f-445a-98fc-2251bcd48109" (UID: "28cf0af6-9a5f-445a-98fc-2251bcd48109"). InnerVolumeSpecName "kube-api-access-zrcds". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 22:14:09 crc kubenswrapper[4919]: I0310 22:14:09.488459 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zrcds\" (UniqueName: \"kubernetes.io/projected/28cf0af6-9a5f-445a-98fc-2251bcd48109-kube-api-access-zrcds\") on node \"crc\" DevicePath \"\"" Mar 10 22:14:09 crc kubenswrapper[4919]: I0310 22:14:09.490974 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a627d2b5-2999-44fc-a23a-af409711896c" path="/var/lib/kubelet/pods/a627d2b5-2999-44fc-a23a-af409711896c/volumes" Mar 10 22:14:09 crc kubenswrapper[4919]: I0310 22:14:09.638603 4919 generic.go:334] "Generic (PLEG): container finished" podID="28cf0af6-9a5f-445a-98fc-2251bcd48109" containerID="10a8e5e7617ef25582ab8c22f76dbeffa74f59e87e46ef9e71ef4b2667694b2c" exitCode=2 Mar 10 22:14:09 crc kubenswrapper[4919]: I0310 22:14:09.638680 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"28cf0af6-9a5f-445a-98fc-2251bcd48109","Type":"ContainerDied","Data":"10a8e5e7617ef25582ab8c22f76dbeffa74f59e87e46ef9e71ef4b2667694b2c"} Mar 10 22:14:09 crc kubenswrapper[4919]: I0310 22:14:09.638690 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 10 22:14:09 crc kubenswrapper[4919]: I0310 22:14:09.638707 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"28cf0af6-9a5f-445a-98fc-2251bcd48109","Type":"ContainerDied","Data":"9e143d8bdc4a66e7c6dc9baeb6f30a72c138caba4c1f6093b7ab272417424c34"} Mar 10 22:14:09 crc kubenswrapper[4919]: I0310 22:14:09.638722 4919 scope.go:117] "RemoveContainer" containerID="10a8e5e7617ef25582ab8c22f76dbeffa74f59e87e46ef9e71ef4b2667694b2c" Mar 10 22:14:09 crc kubenswrapper[4919]: I0310 22:14:09.642916 4919 generic.go:334] "Generic (PLEG): container finished" podID="3a9a0463-5f7a-4164-9fd7-a7bce608bf41" containerID="fcb0215525c257f40fbb026eda215eee7e909386a54b513f6d4e594c4b7c8077" exitCode=0 Mar 10 22:14:09 crc kubenswrapper[4919]: I0310 22:14:09.642962 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-tkhg8" event={"ID":"3a9a0463-5f7a-4164-9fd7-a7bce608bf41","Type":"ContainerDied","Data":"fcb0215525c257f40fbb026eda215eee7e909386a54b513f6d4e594c4b7c8077"} Mar 10 22:14:09 crc kubenswrapper[4919]: I0310 22:14:09.677722 4919 scope.go:117] "RemoveContainer" containerID="10a8e5e7617ef25582ab8c22f76dbeffa74f59e87e46ef9e71ef4b2667694b2c" Mar 10 22:14:09 crc kubenswrapper[4919]: E0310 22:14:09.678292 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"10a8e5e7617ef25582ab8c22f76dbeffa74f59e87e46ef9e71ef4b2667694b2c\": container with ID starting with 10a8e5e7617ef25582ab8c22f76dbeffa74f59e87e46ef9e71ef4b2667694b2c not found: ID does not exist" containerID="10a8e5e7617ef25582ab8c22f76dbeffa74f59e87e46ef9e71ef4b2667694b2c" Mar 10 22:14:09 crc kubenswrapper[4919]: I0310 22:14:09.678324 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10a8e5e7617ef25582ab8c22f76dbeffa74f59e87e46ef9e71ef4b2667694b2c"} 
err="failed to get container status \"10a8e5e7617ef25582ab8c22f76dbeffa74f59e87e46ef9e71ef4b2667694b2c\": rpc error: code = NotFound desc = could not find container \"10a8e5e7617ef25582ab8c22f76dbeffa74f59e87e46ef9e71ef4b2667694b2c\": container with ID starting with 10a8e5e7617ef25582ab8c22f76dbeffa74f59e87e46ef9e71ef4b2667694b2c not found: ID does not exist" Mar 10 22:14:09 crc kubenswrapper[4919]: I0310 22:14:09.692917 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 10 22:14:09 crc kubenswrapper[4919]: I0310 22:14:09.710112 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 10 22:14:09 crc kubenswrapper[4919]: I0310 22:14:09.722599 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Mar 10 22:14:09 crc kubenswrapper[4919]: E0310 22:14:09.723132 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28cf0af6-9a5f-445a-98fc-2251bcd48109" containerName="kube-state-metrics" Mar 10 22:14:09 crc kubenswrapper[4919]: I0310 22:14:09.723155 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="28cf0af6-9a5f-445a-98fc-2251bcd48109" containerName="kube-state-metrics" Mar 10 22:14:09 crc kubenswrapper[4919]: E0310 22:14:09.723187 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90cad2d4-b151-4000-82ee-fed894ad117a" containerName="oc" Mar 10 22:14:09 crc kubenswrapper[4919]: I0310 22:14:09.723194 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="90cad2d4-b151-4000-82ee-fed894ad117a" containerName="oc" Mar 10 22:14:09 crc kubenswrapper[4919]: I0310 22:14:09.723431 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="28cf0af6-9a5f-445a-98fc-2251bcd48109" containerName="kube-state-metrics" Mar 10 22:14:09 crc kubenswrapper[4919]: I0310 22:14:09.723455 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="90cad2d4-b151-4000-82ee-fed894ad117a" containerName="oc" 
Mar 10 22:14:09 crc kubenswrapper[4919]: I0310 22:14:09.724090 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 10 22:14:09 crc kubenswrapper[4919]: I0310 22:14:09.726441 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Mar 10 22:14:09 crc kubenswrapper[4919]: I0310 22:14:09.726902 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Mar 10 22:14:09 crc kubenswrapper[4919]: I0310 22:14:09.748149 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 10 22:14:09 crc kubenswrapper[4919]: I0310 22:14:09.800216 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/7bfe342f-267a-4239-a9cc-8df0e3d14a92-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"7bfe342f-267a-4239-a9cc-8df0e3d14a92\") " pod="openstack/kube-state-metrics-0" Mar 10 22:14:09 crc kubenswrapper[4919]: I0310 22:14:09.800683 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7bfe342f-267a-4239-a9cc-8df0e3d14a92-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"7bfe342f-267a-4239-a9cc-8df0e3d14a92\") " pod="openstack/kube-state-metrics-0" Mar 10 22:14:09 crc kubenswrapper[4919]: I0310 22:14:09.800771 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/7bfe342f-267a-4239-a9cc-8df0e3d14a92-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"7bfe342f-267a-4239-a9cc-8df0e3d14a92\") " pod="openstack/kube-state-metrics-0" Mar 10 22:14:09 crc kubenswrapper[4919]: I0310 22:14:09.800839 4919 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w27nx\" (UniqueName: \"kubernetes.io/projected/7bfe342f-267a-4239-a9cc-8df0e3d14a92-kube-api-access-w27nx\") pod \"kube-state-metrics-0\" (UID: \"7bfe342f-267a-4239-a9cc-8df0e3d14a92\") " pod="openstack/kube-state-metrics-0" Mar 10 22:14:09 crc kubenswrapper[4919]: I0310 22:14:09.876196 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 10 22:14:09 crc kubenswrapper[4919]: I0310 22:14:09.876558 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 10 22:14:09 crc kubenswrapper[4919]: I0310 22:14:09.902570 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7bfe342f-267a-4239-a9cc-8df0e3d14a92-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"7bfe342f-267a-4239-a9cc-8df0e3d14a92\") " pod="openstack/kube-state-metrics-0" Mar 10 22:14:09 crc kubenswrapper[4919]: I0310 22:14:09.902657 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/7bfe342f-267a-4239-a9cc-8df0e3d14a92-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"7bfe342f-267a-4239-a9cc-8df0e3d14a92\") " pod="openstack/kube-state-metrics-0" Mar 10 22:14:09 crc kubenswrapper[4919]: I0310 22:14:09.902725 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w27nx\" (UniqueName: \"kubernetes.io/projected/7bfe342f-267a-4239-a9cc-8df0e3d14a92-kube-api-access-w27nx\") pod \"kube-state-metrics-0\" (UID: \"7bfe342f-267a-4239-a9cc-8df0e3d14a92\") " pod="openstack/kube-state-metrics-0" Mar 10 22:14:09 crc kubenswrapper[4919]: I0310 22:14:09.902797 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/7bfe342f-267a-4239-a9cc-8df0e3d14a92-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"7bfe342f-267a-4239-a9cc-8df0e3d14a92\") " pod="openstack/kube-state-metrics-0" Mar 10 22:14:09 crc kubenswrapper[4919]: I0310 22:14:09.906978 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/7bfe342f-267a-4239-a9cc-8df0e3d14a92-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"7bfe342f-267a-4239-a9cc-8df0e3d14a92\") " pod="openstack/kube-state-metrics-0" Mar 10 22:14:09 crc kubenswrapper[4919]: I0310 22:14:09.909789 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7bfe342f-267a-4239-a9cc-8df0e3d14a92-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"7bfe342f-267a-4239-a9cc-8df0e3d14a92\") " pod="openstack/kube-state-metrics-0" Mar 10 22:14:09 crc kubenswrapper[4919]: I0310 22:14:09.910460 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 10 22:14:09 crc kubenswrapper[4919]: I0310 22:14:09.912488 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/7bfe342f-267a-4239-a9cc-8df0e3d14a92-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"7bfe342f-267a-4239-a9cc-8df0e3d14a92\") " pod="openstack/kube-state-metrics-0" Mar 10 22:14:09 crc kubenswrapper[4919]: I0310 22:14:09.920887 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w27nx\" (UniqueName: \"kubernetes.io/projected/7bfe342f-267a-4239-a9cc-8df0e3d14a92-kube-api-access-w27nx\") pod \"kube-state-metrics-0\" (UID: \"7bfe342f-267a-4239-a9cc-8df0e3d14a92\") " pod="openstack/kube-state-metrics-0" Mar 10 22:14:09 crc kubenswrapper[4919]: 
I0310 22:14:09.957504 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 10 22:14:09 crc kubenswrapper[4919]: I0310 22:14:09.957561 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 10 22:14:10 crc kubenswrapper[4919]: I0310 22:14:10.045107 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 10 22:14:10 crc kubenswrapper[4919]: I0310 22:14:10.174274 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 10 22:14:10 crc kubenswrapper[4919]: I0310 22:14:10.174322 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 10 22:14:10 crc kubenswrapper[4919]: I0310 22:14:10.195549 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-97cdf8549-f9vxh" Mar 10 22:14:10 crc kubenswrapper[4919]: I0310 22:14:10.261046 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5cbf7756bf-6whm5"] Mar 10 22:14:10 crc kubenswrapper[4919]: I0310 22:14:10.261580 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5cbf7756bf-6whm5" podUID="1fa63050-4c25-4c14-a046-6899bf0de3a0" containerName="dnsmasq-dns" containerID="cri-o://105c15b58bc8544c686157743a78596884bc9236556a606b96aa40fe00b37ce9" gracePeriod=10 Mar 10 22:14:10 crc kubenswrapper[4919]: I0310 22:14:10.388686 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Mar 10 22:14:10 crc kubenswrapper[4919]: I0310 22:14:10.527910 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 10 22:14:10 crc kubenswrapper[4919]: I0310 22:14:10.656640 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" 
event={"ID":"7bfe342f-267a-4239-a9cc-8df0e3d14a92","Type":"ContainerStarted","Data":"5d4c7ff057178d1d7de6ad2d49ecc6b3b14b932f465cc80369c181b155bfb8be"} Mar 10 22:14:10 crc kubenswrapper[4919]: I0310 22:14:10.662257 4919 generic.go:334] "Generic (PLEG): container finished" podID="1fa63050-4c25-4c14-a046-6899bf0de3a0" containerID="105c15b58bc8544c686157743a78596884bc9236556a606b96aa40fe00b37ce9" exitCode=0 Mar 10 22:14:10 crc kubenswrapper[4919]: I0310 22:14:10.662343 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5cbf7756bf-6whm5" event={"ID":"1fa63050-4c25-4c14-a046-6899bf0de3a0","Type":"ContainerDied","Data":"105c15b58bc8544c686157743a78596884bc9236556a606b96aa40fe00b37ce9"} Mar 10 22:14:10 crc kubenswrapper[4919]: I0310 22:14:10.712807 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 10 22:14:10 crc kubenswrapper[4919]: I0310 22:14:10.792143 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5cbf7756bf-6whm5" Mar 10 22:14:10 crc kubenswrapper[4919]: I0310 22:14:10.826545 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1fa63050-4c25-4c14-a046-6899bf0de3a0-dns-swift-storage-0\") pod \"1fa63050-4c25-4c14-a046-6899bf0de3a0\" (UID: \"1fa63050-4c25-4c14-a046-6899bf0de3a0\") " Mar 10 22:14:10 crc kubenswrapper[4919]: I0310 22:14:10.826619 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1fa63050-4c25-4c14-a046-6899bf0de3a0-ovsdbserver-sb\") pod \"1fa63050-4c25-4c14-a046-6899bf0de3a0\" (UID: \"1fa63050-4c25-4c14-a046-6899bf0de3a0\") " Mar 10 22:14:10 crc kubenswrapper[4919]: I0310 22:14:10.826708 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1fa63050-4c25-4c14-a046-6899bf0de3a0-config\") pod \"1fa63050-4c25-4c14-a046-6899bf0de3a0\" (UID: \"1fa63050-4c25-4c14-a046-6899bf0de3a0\") " Mar 10 22:14:10 crc kubenswrapper[4919]: I0310 22:14:10.826808 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1fa63050-4c25-4c14-a046-6899bf0de3a0-dns-svc\") pod \"1fa63050-4c25-4c14-a046-6899bf0de3a0\" (UID: \"1fa63050-4c25-4c14-a046-6899bf0de3a0\") " Mar 10 22:14:10 crc kubenswrapper[4919]: I0310 22:14:10.826867 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f89ck\" (UniqueName: \"kubernetes.io/projected/1fa63050-4c25-4c14-a046-6899bf0de3a0-kube-api-access-f89ck\") pod \"1fa63050-4c25-4c14-a046-6899bf0de3a0\" (UID: \"1fa63050-4c25-4c14-a046-6899bf0de3a0\") " Mar 10 22:14:10 crc kubenswrapper[4919]: I0310 22:14:10.826897 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/1fa63050-4c25-4c14-a046-6899bf0de3a0-ovsdbserver-nb\") pod \"1fa63050-4c25-4c14-a046-6899bf0de3a0\" (UID: \"1fa63050-4c25-4c14-a046-6899bf0de3a0\") " Mar 10 22:14:10 crc kubenswrapper[4919]: I0310 22:14:10.833418 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1fa63050-4c25-4c14-a046-6899bf0de3a0-kube-api-access-f89ck" (OuterVolumeSpecName: "kube-api-access-f89ck") pod "1fa63050-4c25-4c14-a046-6899bf0de3a0" (UID: "1fa63050-4c25-4c14-a046-6899bf0de3a0"). InnerVolumeSpecName "kube-api-access-f89ck". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 22:14:10 crc kubenswrapper[4919]: I0310 22:14:10.917229 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1fa63050-4c25-4c14-a046-6899bf0de3a0-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "1fa63050-4c25-4c14-a046-6899bf0de3a0" (UID: "1fa63050-4c25-4c14-a046-6899bf0de3a0"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 22:14:10 crc kubenswrapper[4919]: I0310 22:14:10.920833 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1fa63050-4c25-4c14-a046-6899bf0de3a0-config" (OuterVolumeSpecName: "config") pod "1fa63050-4c25-4c14-a046-6899bf0de3a0" (UID: "1fa63050-4c25-4c14-a046-6899bf0de3a0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 22:14:10 crc kubenswrapper[4919]: I0310 22:14:10.925060 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1fa63050-4c25-4c14-a046-6899bf0de3a0-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "1fa63050-4c25-4c14-a046-6899bf0de3a0" (UID: "1fa63050-4c25-4c14-a046-6899bf0de3a0"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 22:14:10 crc kubenswrapper[4919]: I0310 22:14:10.929649 4919 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1fa63050-4c25-4c14-a046-6899bf0de3a0-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 10 22:14:10 crc kubenswrapper[4919]: I0310 22:14:10.929679 4919 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1fa63050-4c25-4c14-a046-6899bf0de3a0-config\") on node \"crc\" DevicePath \"\"" Mar 10 22:14:10 crc kubenswrapper[4919]: I0310 22:14:10.929689 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f89ck\" (UniqueName: \"kubernetes.io/projected/1fa63050-4c25-4c14-a046-6899bf0de3a0-kube-api-access-f89ck\") on node \"crc\" DevicePath \"\"" Mar 10 22:14:10 crc kubenswrapper[4919]: I0310 22:14:10.929699 4919 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1fa63050-4c25-4c14-a046-6899bf0de3a0-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 10 22:14:10 crc kubenswrapper[4919]: I0310 22:14:10.937499 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1fa63050-4c25-4c14-a046-6899bf0de3a0-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "1fa63050-4c25-4c14-a046-6899bf0de3a0" (UID: "1fa63050-4c25-4c14-a046-6899bf0de3a0"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 22:14:10 crc kubenswrapper[4919]: I0310 22:14:10.955866 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1fa63050-4c25-4c14-a046-6899bf0de3a0-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1fa63050-4c25-4c14-a046-6899bf0de3a0" (UID: "1fa63050-4c25-4c14-a046-6899bf0de3a0"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 22:14:11 crc kubenswrapper[4919]: I0310 22:14:11.031553 4919 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1fa63050-4c25-4c14-a046-6899bf0de3a0-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 10 22:14:11 crc kubenswrapper[4919]: I0310 22:14:11.031589 4919 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1fa63050-4c25-4c14-a046-6899bf0de3a0-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 10 22:14:11 crc kubenswrapper[4919]: I0310 22:14:11.128223 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-tkhg8" Mar 10 22:14:11 crc kubenswrapper[4919]: I0310 22:14:11.234202 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3a9a0463-5f7a-4164-9fd7-a7bce608bf41-scripts\") pod \"3a9a0463-5f7a-4164-9fd7-a7bce608bf41\" (UID: \"3a9a0463-5f7a-4164-9fd7-a7bce608bf41\") " Mar 10 22:14:11 crc kubenswrapper[4919]: I0310 22:14:11.234503 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a9a0463-5f7a-4164-9fd7-a7bce608bf41-config-data\") pod \"3a9a0463-5f7a-4164-9fd7-a7bce608bf41\" (UID: \"3a9a0463-5f7a-4164-9fd7-a7bce608bf41\") " Mar 10 22:14:11 crc kubenswrapper[4919]: I0310 22:14:11.234641 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vrrm9\" (UniqueName: \"kubernetes.io/projected/3a9a0463-5f7a-4164-9fd7-a7bce608bf41-kube-api-access-vrrm9\") pod \"3a9a0463-5f7a-4164-9fd7-a7bce608bf41\" (UID: \"3a9a0463-5f7a-4164-9fd7-a7bce608bf41\") " Mar 10 22:14:11 crc kubenswrapper[4919]: I0310 22:14:11.234709 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/3a9a0463-5f7a-4164-9fd7-a7bce608bf41-combined-ca-bundle\") pod \"3a9a0463-5f7a-4164-9fd7-a7bce608bf41\" (UID: \"3a9a0463-5f7a-4164-9fd7-a7bce608bf41\") " Mar 10 22:14:11 crc kubenswrapper[4919]: I0310 22:14:11.241475 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a9a0463-5f7a-4164-9fd7-a7bce608bf41-scripts" (OuterVolumeSpecName: "scripts") pod "3a9a0463-5f7a-4164-9fd7-a7bce608bf41" (UID: "3a9a0463-5f7a-4164-9fd7-a7bce608bf41"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:14:11 crc kubenswrapper[4919]: I0310 22:14:11.244379 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a9a0463-5f7a-4164-9fd7-a7bce608bf41-kube-api-access-vrrm9" (OuterVolumeSpecName: "kube-api-access-vrrm9") pod "3a9a0463-5f7a-4164-9fd7-a7bce608bf41" (UID: "3a9a0463-5f7a-4164-9fd7-a7bce608bf41"). InnerVolumeSpecName "kube-api-access-vrrm9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 22:14:11 crc kubenswrapper[4919]: I0310 22:14:11.270556 4919 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="9233a6db-477d-49d8-b4cd-e1ff4dcfdf64" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.194:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 10 22:14:11 crc kubenswrapper[4919]: I0310 22:14:11.270838 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a9a0463-5f7a-4164-9fd7-a7bce608bf41-config-data" (OuterVolumeSpecName: "config-data") pod "3a9a0463-5f7a-4164-9fd7-a7bce608bf41" (UID: "3a9a0463-5f7a-4164-9fd7-a7bce608bf41"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:14:11 crc kubenswrapper[4919]: I0310 22:14:11.270739 4919 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="9233a6db-477d-49d8-b4cd-e1ff4dcfdf64" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.194:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 10 22:14:11 crc kubenswrapper[4919]: I0310 22:14:11.281805 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a9a0463-5f7a-4164-9fd7-a7bce608bf41-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3a9a0463-5f7a-4164-9fd7-a7bce608bf41" (UID: "3a9a0463-5f7a-4164-9fd7-a7bce608bf41"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:14:11 crc kubenswrapper[4919]: I0310 22:14:11.337701 4919 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3a9a0463-5f7a-4164-9fd7-a7bce608bf41-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 22:14:11 crc kubenswrapper[4919]: I0310 22:14:11.337762 4919 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a9a0463-5f7a-4164-9fd7-a7bce608bf41-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 22:14:11 crc kubenswrapper[4919]: I0310 22:14:11.337775 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vrrm9\" (UniqueName: \"kubernetes.io/projected/3a9a0463-5f7a-4164-9fd7-a7bce608bf41-kube-api-access-vrrm9\") on node \"crc\" DevicePath \"\"" Mar 10 22:14:11 crc kubenswrapper[4919]: I0310 22:14:11.337786 4919 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a9a0463-5f7a-4164-9fd7-a7bce608bf41-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 22:14:11 crc kubenswrapper[4919]: I0310 22:14:11.491883 4919 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28cf0af6-9a5f-445a-98fc-2251bcd48109" path="/var/lib/kubelet/pods/28cf0af6-9a5f-445a-98fc-2251bcd48109/volumes" Mar 10 22:14:11 crc kubenswrapper[4919]: I0310 22:14:11.514545 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 10 22:14:11 crc kubenswrapper[4919]: I0310 22:14:11.514806 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="185a485d-24ea-44f8-bac9-0f4ddc4298ac" containerName="ceilometer-central-agent" containerID="cri-o://41cea86b72e1810ac1c93113c6c7092dda1306995f03af03db835ebbae38f192" gracePeriod=30 Mar 10 22:14:11 crc kubenswrapper[4919]: I0310 22:14:11.514941 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="185a485d-24ea-44f8-bac9-0f4ddc4298ac" containerName="ceilometer-notification-agent" containerID="cri-o://05c3a5f2993798c334cb01686ebbd4d57cdfe8d58fe0a3ba8802b484208f0600" gracePeriod=30 Mar 10 22:14:11 crc kubenswrapper[4919]: I0310 22:14:11.514954 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="185a485d-24ea-44f8-bac9-0f4ddc4298ac" containerName="sg-core" containerID="cri-o://371d3157e9bbdd52091031696e44edff636a803913a025a8a5308cf9bd2a47de" gracePeriod=30 Mar 10 22:14:11 crc kubenswrapper[4919]: I0310 22:14:11.515082 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="185a485d-24ea-44f8-bac9-0f4ddc4298ac" containerName="proxy-httpd" containerID="cri-o://a127a403f434a8ffe4fb7b043199f72b41de6f08174a50dae3e141aceebcc45f" gracePeriod=30 Mar 10 22:14:11 crc kubenswrapper[4919]: I0310 22:14:11.681816 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-tkhg8" Mar 10 22:14:11 crc kubenswrapper[4919]: I0310 22:14:11.681889 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-tkhg8" event={"ID":"3a9a0463-5f7a-4164-9fd7-a7bce608bf41","Type":"ContainerDied","Data":"27914f5c02284e0dbaac986ce931c4e363b466dc24a9649d799ef876de6dab1b"} Mar 10 22:14:11 crc kubenswrapper[4919]: I0310 22:14:11.682823 4919 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="27914f5c02284e0dbaac986ce931c4e363b466dc24a9649d799ef876de6dab1b" Mar 10 22:14:11 crc kubenswrapper[4919]: I0310 22:14:11.700276 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"7bfe342f-267a-4239-a9cc-8df0e3d14a92","Type":"ContainerStarted","Data":"22b51ffd48cd3ea852a7427f1dfc881e19ae51a0a5028680360ef0179dcc54e1"} Mar 10 22:14:11 crc kubenswrapper[4919]: I0310 22:14:11.701842 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Mar 10 22:14:11 crc kubenswrapper[4919]: I0310 22:14:11.718194 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5cbf7756bf-6whm5" event={"ID":"1fa63050-4c25-4c14-a046-6899bf0de3a0","Type":"ContainerDied","Data":"54a5846f0762b119f28ccf760491f8a56eb41d52575a1e45f9339592a37c3c55"} Mar 10 22:14:11 crc kubenswrapper[4919]: I0310 22:14:11.718259 4919 scope.go:117] "RemoveContainer" containerID="105c15b58bc8544c686157743a78596884bc9236556a606b96aa40fe00b37ce9" Mar 10 22:14:11 crc kubenswrapper[4919]: I0310 22:14:11.718375 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5cbf7756bf-6whm5" Mar 10 22:14:11 crc kubenswrapper[4919]: I0310 22:14:11.732902 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.310410738 podStartE2EDuration="2.732879615s" podCreationTimestamp="2026-03-10 22:14:09 +0000 UTC" firstStartedPulling="2026-03-10 22:14:10.542453952 +0000 UTC m=+1437.784334560" lastFinishedPulling="2026-03-10 22:14:10.964922839 +0000 UTC m=+1438.206803437" observedRunningTime="2026-03-10 22:14:11.725197157 +0000 UTC m=+1438.967077765" watchObservedRunningTime="2026-03-10 22:14:11.732879615 +0000 UTC m=+1438.974760223" Mar 10 22:14:11 crc kubenswrapper[4919]: I0310 22:14:11.739930 4919 generic.go:334] "Generic (PLEG): container finished" podID="185a485d-24ea-44f8-bac9-0f4ddc4298ac" containerID="371d3157e9bbdd52091031696e44edff636a803913a025a8a5308cf9bd2a47de" exitCode=2 Mar 10 22:14:11 crc kubenswrapper[4919]: I0310 22:14:11.741523 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"185a485d-24ea-44f8-bac9-0f4ddc4298ac","Type":"ContainerDied","Data":"371d3157e9bbdd52091031696e44edff636a803913a025a8a5308cf9bd2a47de"} Mar 10 22:14:11 crc kubenswrapper[4919]: I0310 22:14:11.752262 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5cbf7756bf-6whm5"] Mar 10 22:14:11 crc kubenswrapper[4919]: I0310 22:14:11.755509 4919 scope.go:117] "RemoveContainer" containerID="d613139a230ae8aa02383da619842787d20f3871a917a1d083fc34928c5e3425" Mar 10 22:14:11 crc kubenswrapper[4919]: I0310 22:14:11.769576 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5cbf7756bf-6whm5"] Mar 10 22:14:11 crc kubenswrapper[4919]: I0310 22:14:11.834051 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 10 22:14:11 crc kubenswrapper[4919]: I0310 22:14:11.834263 4919 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openstack/nova-api-0" podUID="9233a6db-477d-49d8-b4cd-e1ff4dcfdf64" containerName="nova-api-log" containerID="cri-o://aec519983db64c35b7cd0fd778f2f6c5d4d6a401422e68bd689062c806f8eb06" gracePeriod=30 Mar 10 22:14:11 crc kubenswrapper[4919]: I0310 22:14:11.835802 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="9233a6db-477d-49d8-b4cd-e1ff4dcfdf64" containerName="nova-api-api" containerID="cri-o://8e73b7ac472be8ed998b5633fdf3401fbbfa36afb2a21eae9b9838a31fbf8bc2" gracePeriod=30 Mar 10 22:14:11 crc kubenswrapper[4919]: I0310 22:14:11.958366 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 10 22:14:12 crc kubenswrapper[4919]: I0310 22:14:12.755485 4919 generic.go:334] "Generic (PLEG): container finished" podID="185a485d-24ea-44f8-bac9-0f4ddc4298ac" containerID="a127a403f434a8ffe4fb7b043199f72b41de6f08174a50dae3e141aceebcc45f" exitCode=0 Mar 10 22:14:12 crc kubenswrapper[4919]: I0310 22:14:12.755523 4919 generic.go:334] "Generic (PLEG): container finished" podID="185a485d-24ea-44f8-bac9-0f4ddc4298ac" containerID="05c3a5f2993798c334cb01686ebbd4d57cdfe8d58fe0a3ba8802b484208f0600" exitCode=0 Mar 10 22:14:12 crc kubenswrapper[4919]: I0310 22:14:12.755532 4919 generic.go:334] "Generic (PLEG): container finished" podID="185a485d-24ea-44f8-bac9-0f4ddc4298ac" containerID="41cea86b72e1810ac1c93113c6c7092dda1306995f03af03db835ebbae38f192" exitCode=0 Mar 10 22:14:12 crc kubenswrapper[4919]: I0310 22:14:12.755565 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"185a485d-24ea-44f8-bac9-0f4ddc4298ac","Type":"ContainerDied","Data":"a127a403f434a8ffe4fb7b043199f72b41de6f08174a50dae3e141aceebcc45f"} Mar 10 22:14:12 crc kubenswrapper[4919]: I0310 22:14:12.755587 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"185a485d-24ea-44f8-bac9-0f4ddc4298ac","Type":"ContainerDied","Data":"05c3a5f2993798c334cb01686ebbd4d57cdfe8d58fe0a3ba8802b484208f0600"} Mar 10 22:14:12 crc kubenswrapper[4919]: I0310 22:14:12.755596 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"185a485d-24ea-44f8-bac9-0f4ddc4298ac","Type":"ContainerDied","Data":"41cea86b72e1810ac1c93113c6c7092dda1306995f03af03db835ebbae38f192"} Mar 10 22:14:12 crc kubenswrapper[4919]: I0310 22:14:12.757000 4919 generic.go:334] "Generic (PLEG): container finished" podID="da024d80-ca57-41a6-b46a-508015462b2d" containerID="92cf38036c784cc198b471c45b92409cdc090c7ce641ef5b3d72746ab8ddf341" exitCode=0 Mar 10 22:14:12 crc kubenswrapper[4919]: I0310 22:14:12.757089 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-lxbqg" event={"ID":"da024d80-ca57-41a6-b46a-508015462b2d","Type":"ContainerDied","Data":"92cf38036c784cc198b471c45b92409cdc090c7ce641ef5b3d72746ab8ddf341"} Mar 10 22:14:12 crc kubenswrapper[4919]: I0310 22:14:12.761462 4919 generic.go:334] "Generic (PLEG): container finished" podID="9233a6db-477d-49d8-b4cd-e1ff4dcfdf64" containerID="aec519983db64c35b7cd0fd778f2f6c5d4d6a401422e68bd689062c806f8eb06" exitCode=143 Mar 10 22:14:12 crc kubenswrapper[4919]: I0310 22:14:12.762398 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9233a6db-477d-49d8-b4cd-e1ff4dcfdf64","Type":"ContainerDied","Data":"aec519983db64c35b7cd0fd778f2f6c5d4d6a401422e68bd689062c806f8eb06"} Mar 10 22:14:12 crc kubenswrapper[4919]: I0310 22:14:12.952855 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 10 22:14:13 crc kubenswrapper[4919]: I0310 22:14:13.079349 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/185a485d-24ea-44f8-bac9-0f4ddc4298ac-run-httpd\") pod \"185a485d-24ea-44f8-bac9-0f4ddc4298ac\" (UID: \"185a485d-24ea-44f8-bac9-0f4ddc4298ac\") " Mar 10 22:14:13 crc kubenswrapper[4919]: I0310 22:14:13.079419 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/185a485d-24ea-44f8-bac9-0f4ddc4298ac-scripts\") pod \"185a485d-24ea-44f8-bac9-0f4ddc4298ac\" (UID: \"185a485d-24ea-44f8-bac9-0f4ddc4298ac\") " Mar 10 22:14:13 crc kubenswrapper[4919]: I0310 22:14:13.079517 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/185a485d-24ea-44f8-bac9-0f4ddc4298ac-log-httpd\") pod \"185a485d-24ea-44f8-bac9-0f4ddc4298ac\" (UID: \"185a485d-24ea-44f8-bac9-0f4ddc4298ac\") " Mar 10 22:14:13 crc kubenswrapper[4919]: I0310 22:14:13.079915 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/185a485d-24ea-44f8-bac9-0f4ddc4298ac-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "185a485d-24ea-44f8-bac9-0f4ddc4298ac" (UID: "185a485d-24ea-44f8-bac9-0f4ddc4298ac"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 22:14:13 crc kubenswrapper[4919]: I0310 22:14:13.080101 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/185a485d-24ea-44f8-bac9-0f4ddc4298ac-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "185a485d-24ea-44f8-bac9-0f4ddc4298ac" (UID: "185a485d-24ea-44f8-bac9-0f4ddc4298ac"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 22:14:13 crc kubenswrapper[4919]: I0310 22:14:13.079568 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/185a485d-24ea-44f8-bac9-0f4ddc4298ac-sg-core-conf-yaml\") pod \"185a485d-24ea-44f8-bac9-0f4ddc4298ac\" (UID: \"185a485d-24ea-44f8-bac9-0f4ddc4298ac\") " Mar 10 22:14:13 crc kubenswrapper[4919]: I0310 22:14:13.080200 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h9wjx\" (UniqueName: \"kubernetes.io/projected/185a485d-24ea-44f8-bac9-0f4ddc4298ac-kube-api-access-h9wjx\") pod \"185a485d-24ea-44f8-bac9-0f4ddc4298ac\" (UID: \"185a485d-24ea-44f8-bac9-0f4ddc4298ac\") " Mar 10 22:14:13 crc kubenswrapper[4919]: I0310 22:14:13.080228 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/185a485d-24ea-44f8-bac9-0f4ddc4298ac-config-data\") pod \"185a485d-24ea-44f8-bac9-0f4ddc4298ac\" (UID: \"185a485d-24ea-44f8-bac9-0f4ddc4298ac\") " Mar 10 22:14:13 crc kubenswrapper[4919]: I0310 22:14:13.080257 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/185a485d-24ea-44f8-bac9-0f4ddc4298ac-combined-ca-bundle\") pod \"185a485d-24ea-44f8-bac9-0f4ddc4298ac\" (UID: \"185a485d-24ea-44f8-bac9-0f4ddc4298ac\") " Mar 10 22:14:13 crc kubenswrapper[4919]: I0310 22:14:13.080861 4919 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/185a485d-24ea-44f8-bac9-0f4ddc4298ac-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 10 22:14:13 crc kubenswrapper[4919]: I0310 22:14:13.080881 4919 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/185a485d-24ea-44f8-bac9-0f4ddc4298ac-log-httpd\") on node \"crc\" DevicePath 
\"\"" Mar 10 22:14:13 crc kubenswrapper[4919]: I0310 22:14:13.086099 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/185a485d-24ea-44f8-bac9-0f4ddc4298ac-scripts" (OuterVolumeSpecName: "scripts") pod "185a485d-24ea-44f8-bac9-0f4ddc4298ac" (UID: "185a485d-24ea-44f8-bac9-0f4ddc4298ac"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:14:13 crc kubenswrapper[4919]: I0310 22:14:13.086576 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/185a485d-24ea-44f8-bac9-0f4ddc4298ac-kube-api-access-h9wjx" (OuterVolumeSpecName: "kube-api-access-h9wjx") pod "185a485d-24ea-44f8-bac9-0f4ddc4298ac" (UID: "185a485d-24ea-44f8-bac9-0f4ddc4298ac"). InnerVolumeSpecName "kube-api-access-h9wjx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 22:14:13 crc kubenswrapper[4919]: I0310 22:14:13.126103 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/185a485d-24ea-44f8-bac9-0f4ddc4298ac-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "185a485d-24ea-44f8-bac9-0f4ddc4298ac" (UID: "185a485d-24ea-44f8-bac9-0f4ddc4298ac"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:14:13 crc kubenswrapper[4919]: I0310 22:14:13.187442 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/185a485d-24ea-44f8-bac9-0f4ddc4298ac-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "185a485d-24ea-44f8-bac9-0f4ddc4298ac" (UID: "185a485d-24ea-44f8-bac9-0f4ddc4298ac"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:14:13 crc kubenswrapper[4919]: I0310 22:14:13.191691 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/185a485d-24ea-44f8-bac9-0f4ddc4298ac-combined-ca-bundle\") pod \"185a485d-24ea-44f8-bac9-0f4ddc4298ac\" (UID: \"185a485d-24ea-44f8-bac9-0f4ddc4298ac\") " Mar 10 22:14:13 crc kubenswrapper[4919]: W0310 22:14:13.192082 4919 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/185a485d-24ea-44f8-bac9-0f4ddc4298ac/volumes/kubernetes.io~secret/combined-ca-bundle Mar 10 22:14:13 crc kubenswrapper[4919]: I0310 22:14:13.192814 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/185a485d-24ea-44f8-bac9-0f4ddc4298ac-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "185a485d-24ea-44f8-bac9-0f4ddc4298ac" (UID: "185a485d-24ea-44f8-bac9-0f4ddc4298ac"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:14:13 crc kubenswrapper[4919]: I0310 22:14:13.195114 4919 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/185a485d-24ea-44f8-bac9-0f4ddc4298ac-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 10 22:14:13 crc kubenswrapper[4919]: I0310 22:14:13.195256 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h9wjx\" (UniqueName: \"kubernetes.io/projected/185a485d-24ea-44f8-bac9-0f4ddc4298ac-kube-api-access-h9wjx\") on node \"crc\" DevicePath \"\"" Mar 10 22:14:13 crc kubenswrapper[4919]: I0310 22:14:13.195356 4919 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/185a485d-24ea-44f8-bac9-0f4ddc4298ac-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 22:14:13 crc kubenswrapper[4919]: I0310 22:14:13.195474 4919 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/185a485d-24ea-44f8-bac9-0f4ddc4298ac-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 22:14:13 crc kubenswrapper[4919]: I0310 22:14:13.206320 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/185a485d-24ea-44f8-bac9-0f4ddc4298ac-config-data" (OuterVolumeSpecName: "config-data") pod "185a485d-24ea-44f8-bac9-0f4ddc4298ac" (UID: "185a485d-24ea-44f8-bac9-0f4ddc4298ac"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:14:13 crc kubenswrapper[4919]: I0310 22:14:13.297067 4919 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/185a485d-24ea-44f8-bac9-0f4ddc4298ac-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 22:14:13 crc kubenswrapper[4919]: I0310 22:14:13.505868 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1fa63050-4c25-4c14-a046-6899bf0de3a0" path="/var/lib/kubelet/pods/1fa63050-4c25-4c14-a046-6899bf0de3a0/volumes" Mar 10 22:14:13 crc kubenswrapper[4919]: I0310 22:14:13.782784 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 10 22:14:13 crc kubenswrapper[4919]: I0310 22:14:13.782862 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"185a485d-24ea-44f8-bac9-0f4ddc4298ac","Type":"ContainerDied","Data":"6287dc502f96de7ba8ee9642b48eb9b3fceaf41d386f0f2aada2a7d0d4b8ce1b"} Mar 10 22:14:13 crc kubenswrapper[4919]: I0310 22:14:13.782925 4919 scope.go:117] "RemoveContainer" containerID="a127a403f434a8ffe4fb7b043199f72b41de6f08174a50dae3e141aceebcc45f" Mar 10 22:14:13 crc kubenswrapper[4919]: I0310 22:14:13.783483 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="2b9dac23-a973-4e49-91bc-0a1c0ca4b998" containerName="nova-scheduler-scheduler" containerID="cri-o://9191c1eb6cc3803fa081fb82c3628a7177f29066dce3a180d7adde16f4c81e3b" gracePeriod=30 Mar 10 22:14:13 crc kubenswrapper[4919]: I0310 22:14:13.817259 4919 scope.go:117] "RemoveContainer" containerID="371d3157e9bbdd52091031696e44edff636a803913a025a8a5308cf9bd2a47de" Mar 10 22:14:13 crc kubenswrapper[4919]: I0310 22:14:13.818329 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 10 22:14:13 crc kubenswrapper[4919]: I0310 22:14:13.846505 4919 kubelet.go:2431] "SyncLoop 
REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 10 22:14:13 crc kubenswrapper[4919]: I0310 22:14:13.858124 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 10 22:14:13 crc kubenswrapper[4919]: E0310 22:14:13.858554 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fa63050-4c25-4c14-a046-6899bf0de3a0" containerName="init" Mar 10 22:14:13 crc kubenswrapper[4919]: I0310 22:14:13.858571 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fa63050-4c25-4c14-a046-6899bf0de3a0" containerName="init" Mar 10 22:14:13 crc kubenswrapper[4919]: E0310 22:14:13.858583 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="185a485d-24ea-44f8-bac9-0f4ddc4298ac" containerName="proxy-httpd" Mar 10 22:14:13 crc kubenswrapper[4919]: I0310 22:14:13.858591 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="185a485d-24ea-44f8-bac9-0f4ddc4298ac" containerName="proxy-httpd" Mar 10 22:14:13 crc kubenswrapper[4919]: E0310 22:14:13.858600 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="185a485d-24ea-44f8-bac9-0f4ddc4298ac" containerName="sg-core" Mar 10 22:14:13 crc kubenswrapper[4919]: I0310 22:14:13.858606 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="185a485d-24ea-44f8-bac9-0f4ddc4298ac" containerName="sg-core" Mar 10 22:14:13 crc kubenswrapper[4919]: E0310 22:14:13.858625 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a9a0463-5f7a-4164-9fd7-a7bce608bf41" containerName="nova-manage" Mar 10 22:14:13 crc kubenswrapper[4919]: I0310 22:14:13.858630 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a9a0463-5f7a-4164-9fd7-a7bce608bf41" containerName="nova-manage" Mar 10 22:14:13 crc kubenswrapper[4919]: E0310 22:14:13.858645 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fa63050-4c25-4c14-a046-6899bf0de3a0" containerName="dnsmasq-dns" Mar 10 22:14:13 crc kubenswrapper[4919]: I0310 22:14:13.858651 
4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fa63050-4c25-4c14-a046-6899bf0de3a0" containerName="dnsmasq-dns" Mar 10 22:14:13 crc kubenswrapper[4919]: E0310 22:14:13.858661 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="185a485d-24ea-44f8-bac9-0f4ddc4298ac" containerName="ceilometer-notification-agent" Mar 10 22:14:13 crc kubenswrapper[4919]: I0310 22:14:13.858667 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="185a485d-24ea-44f8-bac9-0f4ddc4298ac" containerName="ceilometer-notification-agent" Mar 10 22:14:13 crc kubenswrapper[4919]: E0310 22:14:13.858675 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="185a485d-24ea-44f8-bac9-0f4ddc4298ac" containerName="ceilometer-central-agent" Mar 10 22:14:13 crc kubenswrapper[4919]: I0310 22:14:13.858680 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="185a485d-24ea-44f8-bac9-0f4ddc4298ac" containerName="ceilometer-central-agent" Mar 10 22:14:13 crc kubenswrapper[4919]: I0310 22:14:13.858833 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="185a485d-24ea-44f8-bac9-0f4ddc4298ac" containerName="proxy-httpd" Mar 10 22:14:13 crc kubenswrapper[4919]: I0310 22:14:13.858852 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="185a485d-24ea-44f8-bac9-0f4ddc4298ac" containerName="ceilometer-notification-agent" Mar 10 22:14:13 crc kubenswrapper[4919]: I0310 22:14:13.858861 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a9a0463-5f7a-4164-9fd7-a7bce608bf41" containerName="nova-manage" Mar 10 22:14:13 crc kubenswrapper[4919]: I0310 22:14:13.858868 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="1fa63050-4c25-4c14-a046-6899bf0de3a0" containerName="dnsmasq-dns" Mar 10 22:14:13 crc kubenswrapper[4919]: I0310 22:14:13.858880 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="185a485d-24ea-44f8-bac9-0f4ddc4298ac" containerName="sg-core" Mar 10 22:14:13 crc 
kubenswrapper[4919]: I0310 22:14:13.858888 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="185a485d-24ea-44f8-bac9-0f4ddc4298ac" containerName="ceilometer-central-agent" Mar 10 22:14:13 crc kubenswrapper[4919]: I0310 22:14:13.866322 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 10 22:14:13 crc kubenswrapper[4919]: I0310 22:14:13.892538 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 10 22:14:13 crc kubenswrapper[4919]: I0310 22:14:13.892928 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 10 22:14:13 crc kubenswrapper[4919]: I0310 22:14:13.915450 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 10 22:14:13 crc kubenswrapper[4919]: I0310 22:14:13.943456 4919 scope.go:117] "RemoveContainer" containerID="05c3a5f2993798c334cb01686ebbd4d57cdfe8d58fe0a3ba8802b484208f0600" Mar 10 22:14:13 crc kubenswrapper[4919]: I0310 22:14:13.969082 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 10 22:14:14 crc kubenswrapper[4919]: I0310 22:14:14.024315 4919 scope.go:117] "RemoveContainer" containerID="41cea86b72e1810ac1c93113c6c7092dda1306995f03af03db835ebbae38f192" Mar 10 22:14:14 crc kubenswrapper[4919]: I0310 22:14:14.026678 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76b0dfed-aec2-4594-84f7-15f4489d87e4-config-data\") pod \"ceilometer-0\" (UID: \"76b0dfed-aec2-4594-84f7-15f4489d87e4\") " pod="openstack/ceilometer-0" Mar 10 22:14:14 crc kubenswrapper[4919]: I0310 22:14:14.026976 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5ftw\" (UniqueName: 
\"kubernetes.io/projected/76b0dfed-aec2-4594-84f7-15f4489d87e4-kube-api-access-r5ftw\") pod \"ceilometer-0\" (UID: \"76b0dfed-aec2-4594-84f7-15f4489d87e4\") " pod="openstack/ceilometer-0" Mar 10 22:14:14 crc kubenswrapper[4919]: I0310 22:14:14.027074 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76b0dfed-aec2-4594-84f7-15f4489d87e4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"76b0dfed-aec2-4594-84f7-15f4489d87e4\") " pod="openstack/ceilometer-0" Mar 10 22:14:14 crc kubenswrapper[4919]: I0310 22:14:14.027152 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/76b0dfed-aec2-4594-84f7-15f4489d87e4-run-httpd\") pod \"ceilometer-0\" (UID: \"76b0dfed-aec2-4594-84f7-15f4489d87e4\") " pod="openstack/ceilometer-0" Mar 10 22:14:14 crc kubenswrapper[4919]: I0310 22:14:14.027268 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/76b0dfed-aec2-4594-84f7-15f4489d87e4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"76b0dfed-aec2-4594-84f7-15f4489d87e4\") " pod="openstack/ceilometer-0" Mar 10 22:14:14 crc kubenswrapper[4919]: I0310 22:14:14.027357 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/76b0dfed-aec2-4594-84f7-15f4489d87e4-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"76b0dfed-aec2-4594-84f7-15f4489d87e4\") " pod="openstack/ceilometer-0" Mar 10 22:14:14 crc kubenswrapper[4919]: I0310 22:14:14.034924 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/76b0dfed-aec2-4594-84f7-15f4489d87e4-log-httpd\") pod \"ceilometer-0\" (UID: 
\"76b0dfed-aec2-4594-84f7-15f4489d87e4\") " pod="openstack/ceilometer-0" Mar 10 22:14:14 crc kubenswrapper[4919]: I0310 22:14:14.035323 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/76b0dfed-aec2-4594-84f7-15f4489d87e4-scripts\") pod \"ceilometer-0\" (UID: \"76b0dfed-aec2-4594-84f7-15f4489d87e4\") " pod="openstack/ceilometer-0" Mar 10 22:14:14 crc kubenswrapper[4919]: I0310 22:14:14.138654 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/76b0dfed-aec2-4594-84f7-15f4489d87e4-scripts\") pod \"ceilometer-0\" (UID: \"76b0dfed-aec2-4594-84f7-15f4489d87e4\") " pod="openstack/ceilometer-0" Mar 10 22:14:14 crc kubenswrapper[4919]: I0310 22:14:14.138937 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76b0dfed-aec2-4594-84f7-15f4489d87e4-config-data\") pod \"ceilometer-0\" (UID: \"76b0dfed-aec2-4594-84f7-15f4489d87e4\") " pod="openstack/ceilometer-0" Mar 10 22:14:14 crc kubenswrapper[4919]: I0310 22:14:14.138985 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r5ftw\" (UniqueName: \"kubernetes.io/projected/76b0dfed-aec2-4594-84f7-15f4489d87e4-kube-api-access-r5ftw\") pod \"ceilometer-0\" (UID: \"76b0dfed-aec2-4594-84f7-15f4489d87e4\") " pod="openstack/ceilometer-0" Mar 10 22:14:14 crc kubenswrapper[4919]: I0310 22:14:14.139084 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76b0dfed-aec2-4594-84f7-15f4489d87e4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"76b0dfed-aec2-4594-84f7-15f4489d87e4\") " pod="openstack/ceilometer-0" Mar 10 22:14:14 crc kubenswrapper[4919]: I0310 22:14:14.139104 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/76b0dfed-aec2-4594-84f7-15f4489d87e4-run-httpd\") pod \"ceilometer-0\" (UID: \"76b0dfed-aec2-4594-84f7-15f4489d87e4\") " pod="openstack/ceilometer-0" Mar 10 22:14:14 crc kubenswrapper[4919]: I0310 22:14:14.139158 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/76b0dfed-aec2-4594-84f7-15f4489d87e4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"76b0dfed-aec2-4594-84f7-15f4489d87e4\") " pod="openstack/ceilometer-0" Mar 10 22:14:14 crc kubenswrapper[4919]: I0310 22:14:14.139194 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/76b0dfed-aec2-4594-84f7-15f4489d87e4-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"76b0dfed-aec2-4594-84f7-15f4489d87e4\") " pod="openstack/ceilometer-0" Mar 10 22:14:14 crc kubenswrapper[4919]: I0310 22:14:14.139215 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/76b0dfed-aec2-4594-84f7-15f4489d87e4-log-httpd\") pod \"ceilometer-0\" (UID: \"76b0dfed-aec2-4594-84f7-15f4489d87e4\") " pod="openstack/ceilometer-0" Mar 10 22:14:14 crc kubenswrapper[4919]: I0310 22:14:14.139663 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/76b0dfed-aec2-4594-84f7-15f4489d87e4-log-httpd\") pod \"ceilometer-0\" (UID: \"76b0dfed-aec2-4594-84f7-15f4489d87e4\") " pod="openstack/ceilometer-0" Mar 10 22:14:14 crc kubenswrapper[4919]: I0310 22:14:14.139870 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/76b0dfed-aec2-4594-84f7-15f4489d87e4-run-httpd\") pod \"ceilometer-0\" (UID: \"76b0dfed-aec2-4594-84f7-15f4489d87e4\") " pod="openstack/ceilometer-0" Mar 10 22:14:14 crc kubenswrapper[4919]: 
I0310 22:14:14.144967 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/76b0dfed-aec2-4594-84f7-15f4489d87e4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"76b0dfed-aec2-4594-84f7-15f4489d87e4\") " pod="openstack/ceilometer-0" Mar 10 22:14:14 crc kubenswrapper[4919]: I0310 22:14:14.145620 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76b0dfed-aec2-4594-84f7-15f4489d87e4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"76b0dfed-aec2-4594-84f7-15f4489d87e4\") " pod="openstack/ceilometer-0" Mar 10 22:14:14 crc kubenswrapper[4919]: I0310 22:14:14.155661 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/76b0dfed-aec2-4594-84f7-15f4489d87e4-scripts\") pod \"ceilometer-0\" (UID: \"76b0dfed-aec2-4594-84f7-15f4489d87e4\") " pod="openstack/ceilometer-0" Mar 10 22:14:14 crc kubenswrapper[4919]: I0310 22:14:14.155743 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r5ftw\" (UniqueName: \"kubernetes.io/projected/76b0dfed-aec2-4594-84f7-15f4489d87e4-kube-api-access-r5ftw\") pod \"ceilometer-0\" (UID: \"76b0dfed-aec2-4594-84f7-15f4489d87e4\") " pod="openstack/ceilometer-0" Mar 10 22:14:14 crc kubenswrapper[4919]: I0310 22:14:14.159005 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76b0dfed-aec2-4594-84f7-15f4489d87e4-config-data\") pod \"ceilometer-0\" (UID: \"76b0dfed-aec2-4594-84f7-15f4489d87e4\") " pod="openstack/ceilometer-0" Mar 10 22:14:14 crc kubenswrapper[4919]: I0310 22:14:14.159239 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/76b0dfed-aec2-4594-84f7-15f4489d87e4-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: 
\"76b0dfed-aec2-4594-84f7-15f4489d87e4\") " pod="openstack/ceilometer-0" Mar 10 22:14:14 crc kubenswrapper[4919]: I0310 22:14:14.231629 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 10 22:14:14 crc kubenswrapper[4919]: I0310 22:14:14.328979 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-lxbqg" Mar 10 22:14:14 crc kubenswrapper[4919]: I0310 22:14:14.447705 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-86bbb\" (UniqueName: \"kubernetes.io/projected/da024d80-ca57-41a6-b46a-508015462b2d-kube-api-access-86bbb\") pod \"da024d80-ca57-41a6-b46a-508015462b2d\" (UID: \"da024d80-ca57-41a6-b46a-508015462b2d\") " Mar 10 22:14:14 crc kubenswrapper[4919]: I0310 22:14:14.448551 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da024d80-ca57-41a6-b46a-508015462b2d-scripts\") pod \"da024d80-ca57-41a6-b46a-508015462b2d\" (UID: \"da024d80-ca57-41a6-b46a-508015462b2d\") " Mar 10 22:14:14 crc kubenswrapper[4919]: I0310 22:14:14.449142 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da024d80-ca57-41a6-b46a-508015462b2d-combined-ca-bundle\") pod \"da024d80-ca57-41a6-b46a-508015462b2d\" (UID: \"da024d80-ca57-41a6-b46a-508015462b2d\") " Mar 10 22:14:14 crc kubenswrapper[4919]: I0310 22:14:14.449202 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da024d80-ca57-41a6-b46a-508015462b2d-config-data\") pod \"da024d80-ca57-41a6-b46a-508015462b2d\" (UID: \"da024d80-ca57-41a6-b46a-508015462b2d\") " Mar 10 22:14:14 crc kubenswrapper[4919]: I0310 22:14:14.453514 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/da024d80-ca57-41a6-b46a-508015462b2d-scripts" (OuterVolumeSpecName: "scripts") pod "da024d80-ca57-41a6-b46a-508015462b2d" (UID: "da024d80-ca57-41a6-b46a-508015462b2d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:14:14 crc kubenswrapper[4919]: I0310 22:14:14.453581 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da024d80-ca57-41a6-b46a-508015462b2d-kube-api-access-86bbb" (OuterVolumeSpecName: "kube-api-access-86bbb") pod "da024d80-ca57-41a6-b46a-508015462b2d" (UID: "da024d80-ca57-41a6-b46a-508015462b2d"). InnerVolumeSpecName "kube-api-access-86bbb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 22:14:14 crc kubenswrapper[4919]: I0310 22:14:14.473892 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da024d80-ca57-41a6-b46a-508015462b2d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "da024d80-ca57-41a6-b46a-508015462b2d" (UID: "da024d80-ca57-41a6-b46a-508015462b2d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:14:14 crc kubenswrapper[4919]: I0310 22:14:14.481157 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da024d80-ca57-41a6-b46a-508015462b2d-config-data" (OuterVolumeSpecName: "config-data") pod "da024d80-ca57-41a6-b46a-508015462b2d" (UID: "da024d80-ca57-41a6-b46a-508015462b2d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:14:14 crc kubenswrapper[4919]: I0310 22:14:14.552203 4919 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da024d80-ca57-41a6-b46a-508015462b2d-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 22:14:14 crc kubenswrapper[4919]: I0310 22:14:14.552467 4919 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da024d80-ca57-41a6-b46a-508015462b2d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 22:14:14 crc kubenswrapper[4919]: I0310 22:14:14.552478 4919 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da024d80-ca57-41a6-b46a-508015462b2d-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 22:14:14 crc kubenswrapper[4919]: I0310 22:14:14.552488 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-86bbb\" (UniqueName: \"kubernetes.io/projected/da024d80-ca57-41a6-b46a-508015462b2d-kube-api-access-86bbb\") on node \"crc\" DevicePath \"\"" Mar 10 22:14:14 crc kubenswrapper[4919]: I0310 22:14:14.713530 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 10 22:14:14 crc kubenswrapper[4919]: I0310 22:14:14.793272 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-lxbqg" Mar 10 22:14:14 crc kubenswrapper[4919]: I0310 22:14:14.793275 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-lxbqg" event={"ID":"da024d80-ca57-41a6-b46a-508015462b2d","Type":"ContainerDied","Data":"c645d0a50ebfa46a9e6e806503e4b2d6c84235d386af51a86717920e70ac2a2e"} Mar 10 22:14:14 crc kubenswrapper[4919]: I0310 22:14:14.793334 4919 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c645d0a50ebfa46a9e6e806503e4b2d6c84235d386af51a86717920e70ac2a2e" Mar 10 22:14:14 crc kubenswrapper[4919]: I0310 22:14:14.795632 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"76b0dfed-aec2-4594-84f7-15f4489d87e4","Type":"ContainerStarted","Data":"152192f4284b66b713a93ee5ef3eec0262ea83f63b3b32f7aa324997e46cfe46"} Mar 10 22:14:14 crc kubenswrapper[4919]: I0310 22:14:14.863468 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 10 22:14:14 crc kubenswrapper[4919]: E0310 22:14:14.863914 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da024d80-ca57-41a6-b46a-508015462b2d" containerName="nova-cell1-conductor-db-sync" Mar 10 22:14:14 crc kubenswrapper[4919]: I0310 22:14:14.863933 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="da024d80-ca57-41a6-b46a-508015462b2d" containerName="nova-cell1-conductor-db-sync" Mar 10 22:14:14 crc kubenswrapper[4919]: I0310 22:14:14.864139 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="da024d80-ca57-41a6-b46a-508015462b2d" containerName="nova-cell1-conductor-db-sync" Mar 10 22:14:14 crc kubenswrapper[4919]: I0310 22:14:14.864796 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 10 22:14:14 crc kubenswrapper[4919]: I0310 22:14:14.867006 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 10 22:14:14 crc kubenswrapper[4919]: I0310 22:14:14.876991 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 10 22:14:14 crc kubenswrapper[4919]: E0310 22:14:14.889736 4919 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9191c1eb6cc3803fa081fb82c3628a7177f29066dce3a180d7adde16f4c81e3b" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 10 22:14:14 crc kubenswrapper[4919]: E0310 22:14:14.892862 4919 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9191c1eb6cc3803fa081fb82c3628a7177f29066dce3a180d7adde16f4c81e3b" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 10 22:14:14 crc kubenswrapper[4919]: E0310 22:14:14.897328 4919 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9191c1eb6cc3803fa081fb82c3628a7177f29066dce3a180d7adde16f4c81e3b" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 10 22:14:14 crc kubenswrapper[4919]: E0310 22:14:14.897572 4919 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="2b9dac23-a973-4e49-91bc-0a1c0ca4b998" containerName="nova-scheduler-scheduler" Mar 10 22:14:14 crc kubenswrapper[4919]: 
I0310 22:14:14.958202 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62814b8d-8679-4350-be7d-5f729f901846-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"62814b8d-8679-4350-be7d-5f729f901846\") " pod="openstack/nova-cell1-conductor-0" Mar 10 22:14:14 crc kubenswrapper[4919]: I0310 22:14:14.958491 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pbrmd\" (UniqueName: \"kubernetes.io/projected/62814b8d-8679-4350-be7d-5f729f901846-kube-api-access-pbrmd\") pod \"nova-cell1-conductor-0\" (UID: \"62814b8d-8679-4350-be7d-5f729f901846\") " pod="openstack/nova-cell1-conductor-0" Mar 10 22:14:14 crc kubenswrapper[4919]: I0310 22:14:14.958601 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62814b8d-8679-4350-be7d-5f729f901846-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"62814b8d-8679-4350-be7d-5f729f901846\") " pod="openstack/nova-cell1-conductor-0" Mar 10 22:14:15 crc kubenswrapper[4919]: I0310 22:14:15.059940 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62814b8d-8679-4350-be7d-5f729f901846-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"62814b8d-8679-4350-be7d-5f729f901846\") " pod="openstack/nova-cell1-conductor-0" Mar 10 22:14:15 crc kubenswrapper[4919]: I0310 22:14:15.060286 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pbrmd\" (UniqueName: \"kubernetes.io/projected/62814b8d-8679-4350-be7d-5f729f901846-kube-api-access-pbrmd\") pod \"nova-cell1-conductor-0\" (UID: \"62814b8d-8679-4350-be7d-5f729f901846\") " pod="openstack/nova-cell1-conductor-0" Mar 10 22:14:15 crc kubenswrapper[4919]: I0310 22:14:15.060448 4919 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62814b8d-8679-4350-be7d-5f729f901846-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"62814b8d-8679-4350-be7d-5f729f901846\") " pod="openstack/nova-cell1-conductor-0" Mar 10 22:14:15 crc kubenswrapper[4919]: I0310 22:14:15.064677 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62814b8d-8679-4350-be7d-5f729f901846-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"62814b8d-8679-4350-be7d-5f729f901846\") " pod="openstack/nova-cell1-conductor-0" Mar 10 22:14:15 crc kubenswrapper[4919]: I0310 22:14:15.065331 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62814b8d-8679-4350-be7d-5f729f901846-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"62814b8d-8679-4350-be7d-5f729f901846\") " pod="openstack/nova-cell1-conductor-0" Mar 10 22:14:15 crc kubenswrapper[4919]: I0310 22:14:15.075730 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pbrmd\" (UniqueName: \"kubernetes.io/projected/62814b8d-8679-4350-be7d-5f729f901846-kube-api-access-pbrmd\") pod \"nova-cell1-conductor-0\" (UID: \"62814b8d-8679-4350-be7d-5f729f901846\") " pod="openstack/nova-cell1-conductor-0" Mar 10 22:14:15 crc kubenswrapper[4919]: I0310 22:14:15.180181 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 10 22:14:15 crc kubenswrapper[4919]: I0310 22:14:15.504125 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="185a485d-24ea-44f8-bac9-0f4ddc4298ac" path="/var/lib/kubelet/pods/185a485d-24ea-44f8-bac9-0f4ddc4298ac/volumes" Mar 10 22:14:15 crc kubenswrapper[4919]: I0310 22:14:15.672470 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 10 22:14:15 crc kubenswrapper[4919]: I0310 22:14:15.810527 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"62814b8d-8679-4350-be7d-5f729f901846","Type":"ContainerStarted","Data":"97fc761a4e4ac4aeb8eb3d0a2697db50bd518ae1c105070a5ec937cea92c0151"} Mar 10 22:14:15 crc kubenswrapper[4919]: I0310 22:14:15.813604 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"76b0dfed-aec2-4594-84f7-15f4489d87e4","Type":"ContainerStarted","Data":"277e57562cee12fa7ec9cd814300875d070319f6b06a2201258191489b069a76"} Mar 10 22:14:16 crc kubenswrapper[4919]: I0310 22:14:16.826761 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"62814b8d-8679-4350-be7d-5f729f901846","Type":"ContainerStarted","Data":"ba1ede56006ea1128e8e67460a4bb03bb7a7ac205f92a9ada4f61f419402b0a6"} Mar 10 22:14:16 crc kubenswrapper[4919]: I0310 22:14:16.827630 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Mar 10 22:14:16 crc kubenswrapper[4919]: I0310 22:14:16.829279 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"76b0dfed-aec2-4594-84f7-15f4489d87e4","Type":"ContainerStarted","Data":"5e17384c3443979b8dcda5b3b440dc722ffa6e656bd894db4223ec39ae6dfc5d"} Mar 10 22:14:16 crc kubenswrapper[4919]: I0310 22:14:16.852855 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.852831164 podStartE2EDuration="2.852831164s" podCreationTimestamp="2026-03-10 22:14:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 22:14:16.842883984 +0000 UTC m=+1444.084764602" watchObservedRunningTime="2026-03-10 22:14:16.852831164 +0000 UTC m=+1444.094711772" Mar 10 22:14:17 crc kubenswrapper[4919]: I0310 22:14:17.666547 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 10 22:14:17 crc kubenswrapper[4919]: I0310 22:14:17.824943 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 10 22:14:17 crc kubenswrapper[4919]: I0310 22:14:17.839700 4919 generic.go:334] "Generic (PLEG): container finished" podID="9233a6db-477d-49d8-b4cd-e1ff4dcfdf64" containerID="8e73b7ac472be8ed998b5633fdf3401fbbfa36afb2a21eae9b9838a31fbf8bc2" exitCode=0 Mar 10 22:14:17 crc kubenswrapper[4919]: I0310 22:14:17.839766 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9233a6db-477d-49d8-b4cd-e1ff4dcfdf64","Type":"ContainerDied","Data":"8e73b7ac472be8ed998b5633fdf3401fbbfa36afb2a21eae9b9838a31fbf8bc2"} Mar 10 22:14:17 crc kubenswrapper[4919]: I0310 22:14:17.839810 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9233a6db-477d-49d8-b4cd-e1ff4dcfdf64","Type":"ContainerDied","Data":"4f10c98e7f5ab9df7ceb8a470f6c457a9848d820fdca108c5f1df8a06b4a3361"} Mar 10 22:14:17 crc kubenswrapper[4919]: I0310 22:14:17.839829 4919 scope.go:117] "RemoveContainer" containerID="8e73b7ac472be8ed998b5633fdf3401fbbfa36afb2a21eae9b9838a31fbf8bc2" Mar 10 22:14:17 crc kubenswrapper[4919]: I0310 22:14:17.839960 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 10 22:14:17 crc kubenswrapper[4919]: I0310 22:14:17.866484 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b9dac23-a973-4e49-91bc-0a1c0ca4b998-combined-ca-bundle\") pod \"2b9dac23-a973-4e49-91bc-0a1c0ca4b998\" (UID: \"2b9dac23-a973-4e49-91bc-0a1c0ca4b998\") " Mar 10 22:14:17 crc kubenswrapper[4919]: I0310 22:14:17.866711 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b9dac23-a973-4e49-91bc-0a1c0ca4b998-config-data\") pod \"2b9dac23-a973-4e49-91bc-0a1c0ca4b998\" (UID: \"2b9dac23-a973-4e49-91bc-0a1c0ca4b998\") " Mar 10 22:14:17 crc kubenswrapper[4919]: I0310 22:14:17.866741 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pmtv4\" (UniqueName: \"kubernetes.io/projected/2b9dac23-a973-4e49-91bc-0a1c0ca4b998-kube-api-access-pmtv4\") pod \"2b9dac23-a973-4e49-91bc-0a1c0ca4b998\" (UID: \"2b9dac23-a973-4e49-91bc-0a1c0ca4b998\") " Mar 10 22:14:17 crc kubenswrapper[4919]: I0310 22:14:17.872679 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"76b0dfed-aec2-4594-84f7-15f4489d87e4","Type":"ContainerStarted","Data":"38e5b0c62a33d19485359f3bc8aa610f75dd2eacaadd6626fc7201f23142cfe4"} Mar 10 22:14:17 crc kubenswrapper[4919]: I0310 22:14:17.876626 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b9dac23-a973-4e49-91bc-0a1c0ca4b998-kube-api-access-pmtv4" (OuterVolumeSpecName: "kube-api-access-pmtv4") pod "2b9dac23-a973-4e49-91bc-0a1c0ca4b998" (UID: "2b9dac23-a973-4e49-91bc-0a1c0ca4b998"). InnerVolumeSpecName "kube-api-access-pmtv4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 22:14:17 crc kubenswrapper[4919]: I0310 22:14:17.882597 4919 scope.go:117] "RemoveContainer" containerID="aec519983db64c35b7cd0fd778f2f6c5d4d6a401422e68bd689062c806f8eb06" Mar 10 22:14:17 crc kubenswrapper[4919]: I0310 22:14:17.886842 4919 generic.go:334] "Generic (PLEG): container finished" podID="2b9dac23-a973-4e49-91bc-0a1c0ca4b998" containerID="9191c1eb6cc3803fa081fb82c3628a7177f29066dce3a180d7adde16f4c81e3b" exitCode=0 Mar 10 22:14:17 crc kubenswrapper[4919]: I0310 22:14:17.887772 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 10 22:14:17 crc kubenswrapper[4919]: I0310 22:14:17.888233 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"2b9dac23-a973-4e49-91bc-0a1c0ca4b998","Type":"ContainerDied","Data":"9191c1eb6cc3803fa081fb82c3628a7177f29066dce3a180d7adde16f4c81e3b"} Mar 10 22:14:17 crc kubenswrapper[4919]: I0310 22:14:17.888264 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"2b9dac23-a973-4e49-91bc-0a1c0ca4b998","Type":"ContainerDied","Data":"dfc27311966fe11ec45ad0c6062d2eeeceb81dc5b761887c18ee173c88bdb02e"} Mar 10 22:14:17 crc kubenswrapper[4919]: I0310 22:14:17.941588 4919 scope.go:117] "RemoveContainer" containerID="8e73b7ac472be8ed998b5633fdf3401fbbfa36afb2a21eae9b9838a31fbf8bc2" Mar 10 22:14:17 crc kubenswrapper[4919]: E0310 22:14:17.945559 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e73b7ac472be8ed998b5633fdf3401fbbfa36afb2a21eae9b9838a31fbf8bc2\": container with ID starting with 8e73b7ac472be8ed998b5633fdf3401fbbfa36afb2a21eae9b9838a31fbf8bc2 not found: ID does not exist" containerID="8e73b7ac472be8ed998b5633fdf3401fbbfa36afb2a21eae9b9838a31fbf8bc2" Mar 10 22:14:17 crc kubenswrapper[4919]: I0310 22:14:17.945607 4919 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e73b7ac472be8ed998b5633fdf3401fbbfa36afb2a21eae9b9838a31fbf8bc2"} err="failed to get container status \"8e73b7ac472be8ed998b5633fdf3401fbbfa36afb2a21eae9b9838a31fbf8bc2\": rpc error: code = NotFound desc = could not find container \"8e73b7ac472be8ed998b5633fdf3401fbbfa36afb2a21eae9b9838a31fbf8bc2\": container with ID starting with 8e73b7ac472be8ed998b5633fdf3401fbbfa36afb2a21eae9b9838a31fbf8bc2 not found: ID does not exist" Mar 10 22:14:17 crc kubenswrapper[4919]: I0310 22:14:17.945632 4919 scope.go:117] "RemoveContainer" containerID="aec519983db64c35b7cd0fd778f2f6c5d4d6a401422e68bd689062c806f8eb06" Mar 10 22:14:17 crc kubenswrapper[4919]: E0310 22:14:17.949645 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aec519983db64c35b7cd0fd778f2f6c5d4d6a401422e68bd689062c806f8eb06\": container with ID starting with aec519983db64c35b7cd0fd778f2f6c5d4d6a401422e68bd689062c806f8eb06 not found: ID does not exist" containerID="aec519983db64c35b7cd0fd778f2f6c5d4d6a401422e68bd689062c806f8eb06" Mar 10 22:14:17 crc kubenswrapper[4919]: I0310 22:14:17.949704 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aec519983db64c35b7cd0fd778f2f6c5d4d6a401422e68bd689062c806f8eb06"} err="failed to get container status \"aec519983db64c35b7cd0fd778f2f6c5d4d6a401422e68bd689062c806f8eb06\": rpc error: code = NotFound desc = could not find container \"aec519983db64c35b7cd0fd778f2f6c5d4d6a401422e68bd689062c806f8eb06\": container with ID starting with aec519983db64c35b7cd0fd778f2f6c5d4d6a401422e68bd689062c806f8eb06 not found: ID does not exist" Mar 10 22:14:17 crc kubenswrapper[4919]: I0310 22:14:17.949735 4919 scope.go:117] "RemoveContainer" containerID="9191c1eb6cc3803fa081fb82c3628a7177f29066dce3a180d7adde16f4c81e3b" Mar 10 22:14:17 crc kubenswrapper[4919]: I0310 
22:14:17.953772 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b9dac23-a973-4e49-91bc-0a1c0ca4b998-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2b9dac23-a973-4e49-91bc-0a1c0ca4b998" (UID: "2b9dac23-a973-4e49-91bc-0a1c0ca4b998"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:14:17 crc kubenswrapper[4919]: I0310 22:14:17.968516 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b9dac23-a973-4e49-91bc-0a1c0ca4b998-config-data" (OuterVolumeSpecName: "config-data") pod "2b9dac23-a973-4e49-91bc-0a1c0ca4b998" (UID: "2b9dac23-a973-4e49-91bc-0a1c0ca4b998"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:14:17 crc kubenswrapper[4919]: I0310 22:14:17.969126 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9233a6db-477d-49d8-b4cd-e1ff4dcfdf64-combined-ca-bundle\") pod \"9233a6db-477d-49d8-b4cd-e1ff4dcfdf64\" (UID: \"9233a6db-477d-49d8-b4cd-e1ff4dcfdf64\") " Mar 10 22:14:17 crc kubenswrapper[4919]: I0310 22:14:17.969192 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9233a6db-477d-49d8-b4cd-e1ff4dcfdf64-config-data\") pod \"9233a6db-477d-49d8-b4cd-e1ff4dcfdf64\" (UID: \"9233a6db-477d-49d8-b4cd-e1ff4dcfdf64\") " Mar 10 22:14:17 crc kubenswrapper[4919]: I0310 22:14:17.969228 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9233a6db-477d-49d8-b4cd-e1ff4dcfdf64-logs\") pod \"9233a6db-477d-49d8-b4cd-e1ff4dcfdf64\" (UID: \"9233a6db-477d-49d8-b4cd-e1ff4dcfdf64\") " Mar 10 22:14:17 crc kubenswrapper[4919]: I0310 22:14:17.969515 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"kube-api-access-jqtm2\" (UniqueName: \"kubernetes.io/projected/9233a6db-477d-49d8-b4cd-e1ff4dcfdf64-kube-api-access-jqtm2\") pod \"9233a6db-477d-49d8-b4cd-e1ff4dcfdf64\" (UID: \"9233a6db-477d-49d8-b4cd-e1ff4dcfdf64\") " Mar 10 22:14:17 crc kubenswrapper[4919]: I0310 22:14:17.970036 4919 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b9dac23-a973-4e49-91bc-0a1c0ca4b998-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 22:14:17 crc kubenswrapper[4919]: I0310 22:14:17.970058 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pmtv4\" (UniqueName: \"kubernetes.io/projected/2b9dac23-a973-4e49-91bc-0a1c0ca4b998-kube-api-access-pmtv4\") on node \"crc\" DevicePath \"\"" Mar 10 22:14:17 crc kubenswrapper[4919]: I0310 22:14:17.970071 4919 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b9dac23-a973-4e49-91bc-0a1c0ca4b998-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 22:14:17 crc kubenswrapper[4919]: I0310 22:14:17.973017 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9233a6db-477d-49d8-b4cd-e1ff4dcfdf64-logs" (OuterVolumeSpecName: "logs") pod "9233a6db-477d-49d8-b4cd-e1ff4dcfdf64" (UID: "9233a6db-477d-49d8-b4cd-e1ff4dcfdf64"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 22:14:17 crc kubenswrapper[4919]: I0310 22:14:17.977846 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9233a6db-477d-49d8-b4cd-e1ff4dcfdf64-kube-api-access-jqtm2" (OuterVolumeSpecName: "kube-api-access-jqtm2") pod "9233a6db-477d-49d8-b4cd-e1ff4dcfdf64" (UID: "9233a6db-477d-49d8-b4cd-e1ff4dcfdf64"). InnerVolumeSpecName "kube-api-access-jqtm2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 22:14:18 crc kubenswrapper[4919]: I0310 22:14:18.015195 4919 scope.go:117] "RemoveContainer" containerID="9191c1eb6cc3803fa081fb82c3628a7177f29066dce3a180d7adde16f4c81e3b" Mar 10 22:14:18 crc kubenswrapper[4919]: E0310 22:14:18.022033 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9191c1eb6cc3803fa081fb82c3628a7177f29066dce3a180d7adde16f4c81e3b\": container with ID starting with 9191c1eb6cc3803fa081fb82c3628a7177f29066dce3a180d7adde16f4c81e3b not found: ID does not exist" containerID="9191c1eb6cc3803fa081fb82c3628a7177f29066dce3a180d7adde16f4c81e3b" Mar 10 22:14:18 crc kubenswrapper[4919]: I0310 22:14:18.022341 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9191c1eb6cc3803fa081fb82c3628a7177f29066dce3a180d7adde16f4c81e3b"} err="failed to get container status \"9191c1eb6cc3803fa081fb82c3628a7177f29066dce3a180d7adde16f4c81e3b\": rpc error: code = NotFound desc = could not find container \"9191c1eb6cc3803fa081fb82c3628a7177f29066dce3a180d7adde16f4c81e3b\": container with ID starting with 9191c1eb6cc3803fa081fb82c3628a7177f29066dce3a180d7adde16f4c81e3b not found: ID does not exist" Mar 10 22:14:18 crc kubenswrapper[4919]: I0310 22:14:18.027522 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9233a6db-477d-49d8-b4cd-e1ff4dcfdf64-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9233a6db-477d-49d8-b4cd-e1ff4dcfdf64" (UID: "9233a6db-477d-49d8-b4cd-e1ff4dcfdf64"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:14:18 crc kubenswrapper[4919]: I0310 22:14:18.027566 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9233a6db-477d-49d8-b4cd-e1ff4dcfdf64-config-data" (OuterVolumeSpecName: "config-data") pod "9233a6db-477d-49d8-b4cd-e1ff4dcfdf64" (UID: "9233a6db-477d-49d8-b4cd-e1ff4dcfdf64"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:14:18 crc kubenswrapper[4919]: I0310 22:14:18.072512 4919 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9233a6db-477d-49d8-b4cd-e1ff4dcfdf64-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 22:14:18 crc kubenswrapper[4919]: I0310 22:14:18.072555 4919 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9233a6db-477d-49d8-b4cd-e1ff4dcfdf64-logs\") on node \"crc\" DevicePath \"\"" Mar 10 22:14:18 crc kubenswrapper[4919]: I0310 22:14:18.072570 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jqtm2\" (UniqueName: \"kubernetes.io/projected/9233a6db-477d-49d8-b4cd-e1ff4dcfdf64-kube-api-access-jqtm2\") on node \"crc\" DevicePath \"\"" Mar 10 22:14:18 crc kubenswrapper[4919]: I0310 22:14:18.072583 4919 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9233a6db-477d-49d8-b4cd-e1ff4dcfdf64-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 22:14:18 crc kubenswrapper[4919]: I0310 22:14:18.222339 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 10 22:14:18 crc kubenswrapper[4919]: I0310 22:14:18.235539 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 10 22:14:18 crc kubenswrapper[4919]: I0310 22:14:18.257484 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 10 
22:14:18 crc kubenswrapper[4919]: I0310 22:14:18.271352 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 10 22:14:18 crc kubenswrapper[4919]: I0310 22:14:18.279739 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 10 22:14:18 crc kubenswrapper[4919]: E0310 22:14:18.280135 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b9dac23-a973-4e49-91bc-0a1c0ca4b998" containerName="nova-scheduler-scheduler" Mar 10 22:14:18 crc kubenswrapper[4919]: I0310 22:14:18.280153 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b9dac23-a973-4e49-91bc-0a1c0ca4b998" containerName="nova-scheduler-scheduler" Mar 10 22:14:18 crc kubenswrapper[4919]: E0310 22:14:18.280161 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9233a6db-477d-49d8-b4cd-e1ff4dcfdf64" containerName="nova-api-log" Mar 10 22:14:18 crc kubenswrapper[4919]: I0310 22:14:18.280167 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="9233a6db-477d-49d8-b4cd-e1ff4dcfdf64" containerName="nova-api-log" Mar 10 22:14:18 crc kubenswrapper[4919]: E0310 22:14:18.280194 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9233a6db-477d-49d8-b4cd-e1ff4dcfdf64" containerName="nova-api-api" Mar 10 22:14:18 crc kubenswrapper[4919]: I0310 22:14:18.280202 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="9233a6db-477d-49d8-b4cd-e1ff4dcfdf64" containerName="nova-api-api" Mar 10 22:14:18 crc kubenswrapper[4919]: I0310 22:14:18.280445 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="9233a6db-477d-49d8-b4cd-e1ff4dcfdf64" containerName="nova-api-log" Mar 10 22:14:18 crc kubenswrapper[4919]: I0310 22:14:18.280457 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="9233a6db-477d-49d8-b4cd-e1ff4dcfdf64" containerName="nova-api-api" Mar 10 22:14:18 crc kubenswrapper[4919]: I0310 22:14:18.280482 4919 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="2b9dac23-a973-4e49-91bc-0a1c0ca4b998" containerName="nova-scheduler-scheduler" Mar 10 22:14:18 crc kubenswrapper[4919]: I0310 22:14:18.281572 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 10 22:14:18 crc kubenswrapper[4919]: I0310 22:14:18.285789 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 10 22:14:18 crc kubenswrapper[4919]: I0310 22:14:18.290299 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 10 22:14:18 crc kubenswrapper[4919]: I0310 22:14:18.291471 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 10 22:14:18 crc kubenswrapper[4919]: I0310 22:14:18.292806 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 10 22:14:18 crc kubenswrapper[4919]: I0310 22:14:18.297872 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 10 22:14:18 crc kubenswrapper[4919]: I0310 22:14:18.311437 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 10 22:14:18 crc kubenswrapper[4919]: I0310 22:14:18.378895 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64c699c8-f2b1-421d-b617-ec6b7b7c806b-config-data\") pod \"nova-scheduler-0\" (UID: \"64c699c8-f2b1-421d-b617-ec6b7b7c806b\") " pod="openstack/nova-scheduler-0" Mar 10 22:14:18 crc kubenswrapper[4919]: I0310 22:14:18.378967 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab170d36-cd71-428b-bb07-0b7192035f1d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ab170d36-cd71-428b-bb07-0b7192035f1d\") " pod="openstack/nova-api-0" Mar 10 22:14:18 crc 
kubenswrapper[4919]: I0310 22:14:18.379005 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ab170d36-cd71-428b-bb07-0b7192035f1d-logs\") pod \"nova-api-0\" (UID: \"ab170d36-cd71-428b-bb07-0b7192035f1d\") " pod="openstack/nova-api-0" Mar 10 22:14:18 crc kubenswrapper[4919]: I0310 22:14:18.379073 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab170d36-cd71-428b-bb07-0b7192035f1d-config-data\") pod \"nova-api-0\" (UID: \"ab170d36-cd71-428b-bb07-0b7192035f1d\") " pod="openstack/nova-api-0" Mar 10 22:14:18 crc kubenswrapper[4919]: I0310 22:14:18.379098 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7qvwd\" (UniqueName: \"kubernetes.io/projected/64c699c8-f2b1-421d-b617-ec6b7b7c806b-kube-api-access-7qvwd\") pod \"nova-scheduler-0\" (UID: \"64c699c8-f2b1-421d-b617-ec6b7b7c806b\") " pod="openstack/nova-scheduler-0" Mar 10 22:14:18 crc kubenswrapper[4919]: I0310 22:14:18.379125 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rcmpx\" (UniqueName: \"kubernetes.io/projected/ab170d36-cd71-428b-bb07-0b7192035f1d-kube-api-access-rcmpx\") pod \"nova-api-0\" (UID: \"ab170d36-cd71-428b-bb07-0b7192035f1d\") " pod="openstack/nova-api-0" Mar 10 22:14:18 crc kubenswrapper[4919]: I0310 22:14:18.379170 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64c699c8-f2b1-421d-b617-ec6b7b7c806b-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"64c699c8-f2b1-421d-b617-ec6b7b7c806b\") " pod="openstack/nova-scheduler-0" Mar 10 22:14:18 crc kubenswrapper[4919]: I0310 22:14:18.480421 4919 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ab170d36-cd71-428b-bb07-0b7192035f1d-logs\") pod \"nova-api-0\" (UID: \"ab170d36-cd71-428b-bb07-0b7192035f1d\") " pod="openstack/nova-api-0" Mar 10 22:14:18 crc kubenswrapper[4919]: I0310 22:14:18.480499 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab170d36-cd71-428b-bb07-0b7192035f1d-config-data\") pod \"nova-api-0\" (UID: \"ab170d36-cd71-428b-bb07-0b7192035f1d\") " pod="openstack/nova-api-0" Mar 10 22:14:18 crc kubenswrapper[4919]: I0310 22:14:18.480537 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7qvwd\" (UniqueName: \"kubernetes.io/projected/64c699c8-f2b1-421d-b617-ec6b7b7c806b-kube-api-access-7qvwd\") pod \"nova-scheduler-0\" (UID: \"64c699c8-f2b1-421d-b617-ec6b7b7c806b\") " pod="openstack/nova-scheduler-0" Mar 10 22:14:18 crc kubenswrapper[4919]: I0310 22:14:18.480558 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rcmpx\" (UniqueName: \"kubernetes.io/projected/ab170d36-cd71-428b-bb07-0b7192035f1d-kube-api-access-rcmpx\") pod \"nova-api-0\" (UID: \"ab170d36-cd71-428b-bb07-0b7192035f1d\") " pod="openstack/nova-api-0" Mar 10 22:14:18 crc kubenswrapper[4919]: I0310 22:14:18.480603 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64c699c8-f2b1-421d-b617-ec6b7b7c806b-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"64c699c8-f2b1-421d-b617-ec6b7b7c806b\") " pod="openstack/nova-scheduler-0" Mar 10 22:14:18 crc kubenswrapper[4919]: I0310 22:14:18.480673 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64c699c8-f2b1-421d-b617-ec6b7b7c806b-config-data\") pod \"nova-scheduler-0\" (UID: \"64c699c8-f2b1-421d-b617-ec6b7b7c806b\") " 
pod="openstack/nova-scheduler-0" Mar 10 22:14:18 crc kubenswrapper[4919]: I0310 22:14:18.480740 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab170d36-cd71-428b-bb07-0b7192035f1d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ab170d36-cd71-428b-bb07-0b7192035f1d\") " pod="openstack/nova-api-0" Mar 10 22:14:18 crc kubenswrapper[4919]: I0310 22:14:18.482077 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ab170d36-cd71-428b-bb07-0b7192035f1d-logs\") pod \"nova-api-0\" (UID: \"ab170d36-cd71-428b-bb07-0b7192035f1d\") " pod="openstack/nova-api-0" Mar 10 22:14:18 crc kubenswrapper[4919]: I0310 22:14:18.484857 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab170d36-cd71-428b-bb07-0b7192035f1d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ab170d36-cd71-428b-bb07-0b7192035f1d\") " pod="openstack/nova-api-0" Mar 10 22:14:18 crc kubenswrapper[4919]: I0310 22:14:18.484992 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64c699c8-f2b1-421d-b617-ec6b7b7c806b-config-data\") pod \"nova-scheduler-0\" (UID: \"64c699c8-f2b1-421d-b617-ec6b7b7c806b\") " pod="openstack/nova-scheduler-0" Mar 10 22:14:18 crc kubenswrapper[4919]: I0310 22:14:18.485554 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab170d36-cd71-428b-bb07-0b7192035f1d-config-data\") pod \"nova-api-0\" (UID: \"ab170d36-cd71-428b-bb07-0b7192035f1d\") " pod="openstack/nova-api-0" Mar 10 22:14:18 crc kubenswrapper[4919]: I0310 22:14:18.493200 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/64c699c8-f2b1-421d-b617-ec6b7b7c806b-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"64c699c8-f2b1-421d-b617-ec6b7b7c806b\") " pod="openstack/nova-scheduler-0" Mar 10 22:14:18 crc kubenswrapper[4919]: I0310 22:14:18.502076 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rcmpx\" (UniqueName: \"kubernetes.io/projected/ab170d36-cd71-428b-bb07-0b7192035f1d-kube-api-access-rcmpx\") pod \"nova-api-0\" (UID: \"ab170d36-cd71-428b-bb07-0b7192035f1d\") " pod="openstack/nova-api-0" Mar 10 22:14:18 crc kubenswrapper[4919]: I0310 22:14:18.512606 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7qvwd\" (UniqueName: \"kubernetes.io/projected/64c699c8-f2b1-421d-b617-ec6b7b7c806b-kube-api-access-7qvwd\") pod \"nova-scheduler-0\" (UID: \"64c699c8-f2b1-421d-b617-ec6b7b7c806b\") " pod="openstack/nova-scheduler-0" Mar 10 22:14:18 crc kubenswrapper[4919]: I0310 22:14:18.609505 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 10 22:14:18 crc kubenswrapper[4919]: I0310 22:14:18.615293 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 10 22:14:18 crc kubenswrapper[4919]: I0310 22:14:18.913799 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"76b0dfed-aec2-4594-84f7-15f4489d87e4","Type":"ContainerStarted","Data":"c632fff9503baa94e2de04d3b24fb507386b4cee0ac9b0604aefc6e9ec7d192e"} Mar 10 22:14:18 crc kubenswrapper[4919]: I0310 22:14:18.915161 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 10 22:14:18 crc kubenswrapper[4919]: I0310 22:14:18.956635 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.171412808 podStartE2EDuration="5.956611163s" podCreationTimestamp="2026-03-10 22:14:13 +0000 UTC" firstStartedPulling="2026-03-10 22:14:14.720997665 +0000 UTC m=+1441.962878273" lastFinishedPulling="2026-03-10 22:14:18.50619603 +0000 UTC m=+1445.748076628" observedRunningTime="2026-03-10 22:14:18.951881655 +0000 UTC m=+1446.193762273" watchObservedRunningTime="2026-03-10 22:14:18.956611163 +0000 UTC m=+1446.198491771" Mar 10 22:14:19 crc kubenswrapper[4919]: I0310 22:14:19.151683 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 10 22:14:19 crc kubenswrapper[4919]: I0310 22:14:19.234385 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 10 22:14:19 crc kubenswrapper[4919]: I0310 22:14:19.528453 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b9dac23-a973-4e49-91bc-0a1c0ca4b998" path="/var/lib/kubelet/pods/2b9dac23-a973-4e49-91bc-0a1c0ca4b998/volumes" Mar 10 22:14:19 crc kubenswrapper[4919]: I0310 22:14:19.529286 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9233a6db-477d-49d8-b4cd-e1ff4dcfdf64" path="/var/lib/kubelet/pods/9233a6db-477d-49d8-b4cd-e1ff4dcfdf64/volumes" Mar 10 22:14:19 crc kubenswrapper[4919]: I0310 22:14:19.945029 4919 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ab170d36-cd71-428b-bb07-0b7192035f1d","Type":"ContainerStarted","Data":"cb94664fd4e7227fd3975dcc4218c4979fab33261ec675f8add916f14eb7ad01"} Mar 10 22:14:19 crc kubenswrapper[4919]: I0310 22:14:19.945416 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ab170d36-cd71-428b-bb07-0b7192035f1d","Type":"ContainerStarted","Data":"55aa25e6e7ad7a9818d2e114c2acded9548467e06b715e3ecbb44ddde875da33"} Mar 10 22:14:19 crc kubenswrapper[4919]: I0310 22:14:19.945437 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ab170d36-cd71-428b-bb07-0b7192035f1d","Type":"ContainerStarted","Data":"6538328591ed81061881cb63e936aadef6c5994ff40a297deec4e853f5f1aa46"} Mar 10 22:14:19 crc kubenswrapper[4919]: I0310 22:14:19.947002 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"64c699c8-f2b1-421d-b617-ec6b7b7c806b","Type":"ContainerStarted","Data":"55b9f720f76d4e27cdd8fce568c10b873fab76107f554ff240b1936a023cca7d"} Mar 10 22:14:19 crc kubenswrapper[4919]: I0310 22:14:19.947038 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"64c699c8-f2b1-421d-b617-ec6b7b7c806b","Type":"ContainerStarted","Data":"e8370c882bb1998db0d8ae6781f21009f518de0cf12fdec144409ca7586ec4ee"} Mar 10 22:14:19 crc kubenswrapper[4919]: I0310 22:14:19.967886 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=1.9678699819999999 podStartE2EDuration="1.967869982s" podCreationTimestamp="2026-03-10 22:14:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 22:14:19.961317915 +0000 UTC m=+1447.203198523" watchObservedRunningTime="2026-03-10 22:14:19.967869982 +0000 UTC m=+1447.209750590" Mar 10 
22:14:19 crc kubenswrapper[4919]: I0310 22:14:19.982456 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=1.982374435 podStartE2EDuration="1.982374435s" podCreationTimestamp="2026-03-10 22:14:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 22:14:19.979792395 +0000 UTC m=+1447.221673023" watchObservedRunningTime="2026-03-10 22:14:19.982374435 +0000 UTC m=+1447.224255043" Mar 10 22:14:20 crc kubenswrapper[4919]: I0310 22:14:20.060410 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Mar 10 22:14:20 crc kubenswrapper[4919]: I0310 22:14:20.218206 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Mar 10 22:14:23 crc kubenswrapper[4919]: I0310 22:14:23.616148 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 10 22:14:28 crc kubenswrapper[4919]: I0310 22:14:28.610455 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 10 22:14:28 crc kubenswrapper[4919]: I0310 22:14:28.612935 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 10 22:14:28 crc kubenswrapper[4919]: I0310 22:14:28.616221 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 10 22:14:28 crc kubenswrapper[4919]: I0310 22:14:28.649956 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 10 22:14:29 crc kubenswrapper[4919]: I0310 22:14:29.068108 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 10 22:14:29 crc kubenswrapper[4919]: I0310 22:14:29.693579 4919 prober.go:107] "Probe 
failed" probeType="Startup" pod="openstack/nova-api-0" podUID="ab170d36-cd71-428b-bb07-0b7192035f1d" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.202:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 10 22:14:29 crc kubenswrapper[4919]: I0310 22:14:29.693963 4919 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="ab170d36-cd71-428b-bb07-0b7192035f1d" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.202:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 10 22:14:35 crc kubenswrapper[4919]: E0310 22:14:35.930875 4919 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda194fe14_439b_4a8a_acb8_ea5852e0721b.slice/crio-conmon-a822a4830c2ddac2986bb0da695e888f47bdee1c9537c48e8a9eca34965b4152.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda194fe14_439b_4a8a_acb8_ea5852e0721b.slice/crio-a822a4830c2ddac2986bb0da695e888f47bdee1c9537c48e8a9eca34965b4152.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod40528b7c_2c67_4413_ba07_5e3e6af9c18b.slice/crio-conmon-f85528e61f9082369f6714d5db2710672443323119740a66a03227d7ef379e5a.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod40528b7c_2c67_4413_ba07_5e3e6af9c18b.slice/crio-f85528e61f9082369f6714d5db2710672443323119740a66a03227d7ef379e5a.scope\": RecentStats: unable to find data in memory cache]" Mar 10 22:14:35 crc kubenswrapper[4919]: I0310 22:14:35.956296 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 10 22:14:36 crc kubenswrapper[4919]: I0310 22:14:36.021382 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 10 22:14:36 crc kubenswrapper[4919]: I0310 22:14:36.065305 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6fwlj\" (UniqueName: \"kubernetes.io/projected/a194fe14-439b-4a8a-acb8-ea5852e0721b-kube-api-access-6fwlj\") pod \"a194fe14-439b-4a8a-acb8-ea5852e0721b\" (UID: \"a194fe14-439b-4a8a-acb8-ea5852e0721b\") " Mar 10 22:14:36 crc kubenswrapper[4919]: I0310 22:14:36.065581 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a194fe14-439b-4a8a-acb8-ea5852e0721b-config-data\") pod \"a194fe14-439b-4a8a-acb8-ea5852e0721b\" (UID: \"a194fe14-439b-4a8a-acb8-ea5852e0721b\") " Mar 10 22:14:36 crc kubenswrapper[4919]: I0310 22:14:36.065688 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a194fe14-439b-4a8a-acb8-ea5852e0721b-logs\") pod \"a194fe14-439b-4a8a-acb8-ea5852e0721b\" (UID: \"a194fe14-439b-4a8a-acb8-ea5852e0721b\") " Mar 10 22:14:36 crc kubenswrapper[4919]: I0310 22:14:36.065728 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a194fe14-439b-4a8a-acb8-ea5852e0721b-combined-ca-bundle\") pod \"a194fe14-439b-4a8a-acb8-ea5852e0721b\" (UID: \"a194fe14-439b-4a8a-acb8-ea5852e0721b\") " Mar 10 22:14:36 crc kubenswrapper[4919]: I0310 22:14:36.067103 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a194fe14-439b-4a8a-acb8-ea5852e0721b-logs" (OuterVolumeSpecName: "logs") pod "a194fe14-439b-4a8a-acb8-ea5852e0721b" (UID: "a194fe14-439b-4a8a-acb8-ea5852e0721b"). 
InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 22:14:36 crc kubenswrapper[4919]: I0310 22:14:36.071044 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a194fe14-439b-4a8a-acb8-ea5852e0721b-kube-api-access-6fwlj" (OuterVolumeSpecName: "kube-api-access-6fwlj") pod "a194fe14-439b-4a8a-acb8-ea5852e0721b" (UID: "a194fe14-439b-4a8a-acb8-ea5852e0721b"). InnerVolumeSpecName "kube-api-access-6fwlj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 22:14:36 crc kubenswrapper[4919]: I0310 22:14:36.091618 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a194fe14-439b-4a8a-acb8-ea5852e0721b-config-data" (OuterVolumeSpecName: "config-data") pod "a194fe14-439b-4a8a-acb8-ea5852e0721b" (UID: "a194fe14-439b-4a8a-acb8-ea5852e0721b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:14:36 crc kubenswrapper[4919]: I0310 22:14:36.099733 4919 generic.go:334] "Generic (PLEG): container finished" podID="40528b7c-2c67-4413-ba07-5e3e6af9c18b" containerID="f85528e61f9082369f6714d5db2710672443323119740a66a03227d7ef379e5a" exitCode=137 Mar 10 22:14:36 crc kubenswrapper[4919]: I0310 22:14:36.099796 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"40528b7c-2c67-4413-ba07-5e3e6af9c18b","Type":"ContainerDied","Data":"f85528e61f9082369f6714d5db2710672443323119740a66a03227d7ef379e5a"} Mar 10 22:14:36 crc kubenswrapper[4919]: I0310 22:14:36.099821 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"40528b7c-2c67-4413-ba07-5e3e6af9c18b","Type":"ContainerDied","Data":"14fc9c8f0059edfecaa123bb4639e1593c73c895fc6d4e9cd011404563d93cba"} Mar 10 22:14:36 crc kubenswrapper[4919]: I0310 22:14:36.099839 4919 scope.go:117] "RemoveContainer" 
containerID="f85528e61f9082369f6714d5db2710672443323119740a66a03227d7ef379e5a" Mar 10 22:14:36 crc kubenswrapper[4919]: I0310 22:14:36.099859 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 10 22:14:36 crc kubenswrapper[4919]: I0310 22:14:36.101180 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a194fe14-439b-4a8a-acb8-ea5852e0721b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a194fe14-439b-4a8a-acb8-ea5852e0721b" (UID: "a194fe14-439b-4a8a-acb8-ea5852e0721b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:14:36 crc kubenswrapper[4919]: I0310 22:14:36.103570 4919 generic.go:334] "Generic (PLEG): container finished" podID="a194fe14-439b-4a8a-acb8-ea5852e0721b" containerID="a822a4830c2ddac2986bb0da695e888f47bdee1c9537c48e8a9eca34965b4152" exitCode=137 Mar 10 22:14:36 crc kubenswrapper[4919]: I0310 22:14:36.103648 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 10 22:14:36 crc kubenswrapper[4919]: I0310 22:14:36.103608 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a194fe14-439b-4a8a-acb8-ea5852e0721b","Type":"ContainerDied","Data":"a822a4830c2ddac2986bb0da695e888f47bdee1c9537c48e8a9eca34965b4152"} Mar 10 22:14:36 crc kubenswrapper[4919]: I0310 22:14:36.103725 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a194fe14-439b-4a8a-acb8-ea5852e0721b","Type":"ContainerDied","Data":"4bbb82b02c3e7eeb0c6f9e4f3f975f2842975ced8b89f5486b1b4ea219d7c936"} Mar 10 22:14:36 crc kubenswrapper[4919]: I0310 22:14:36.136487 4919 scope.go:117] "RemoveContainer" containerID="f85528e61f9082369f6714d5db2710672443323119740a66a03227d7ef379e5a" Mar 10 22:14:36 crc kubenswrapper[4919]: E0310 22:14:36.137127 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f85528e61f9082369f6714d5db2710672443323119740a66a03227d7ef379e5a\": container with ID starting with f85528e61f9082369f6714d5db2710672443323119740a66a03227d7ef379e5a not found: ID does not exist" containerID="f85528e61f9082369f6714d5db2710672443323119740a66a03227d7ef379e5a" Mar 10 22:14:36 crc kubenswrapper[4919]: I0310 22:14:36.137176 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f85528e61f9082369f6714d5db2710672443323119740a66a03227d7ef379e5a"} err="failed to get container status \"f85528e61f9082369f6714d5db2710672443323119740a66a03227d7ef379e5a\": rpc error: code = NotFound desc = could not find container \"f85528e61f9082369f6714d5db2710672443323119740a66a03227d7ef379e5a\": container with ID starting with f85528e61f9082369f6714d5db2710672443323119740a66a03227d7ef379e5a not found: ID does not exist" Mar 10 22:14:36 crc kubenswrapper[4919]: I0310 22:14:36.137203 4919 scope.go:117] "RemoveContainer" 
containerID="a822a4830c2ddac2986bb0da695e888f47bdee1c9537c48e8a9eca34965b4152" Mar 10 22:14:36 crc kubenswrapper[4919]: I0310 22:14:36.141138 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 10 22:14:36 crc kubenswrapper[4919]: I0310 22:14:36.163086 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 10 22:14:36 crc kubenswrapper[4919]: I0310 22:14:36.165843 4919 scope.go:117] "RemoveContainer" containerID="9ca64caff83423ef820e88e7066f059696e286b5743e772904209c47956fe9cc" Mar 10 22:14:36 crc kubenswrapper[4919]: I0310 22:14:36.167004 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5trd9\" (UniqueName: \"kubernetes.io/projected/40528b7c-2c67-4413-ba07-5e3e6af9c18b-kube-api-access-5trd9\") pod \"40528b7c-2c67-4413-ba07-5e3e6af9c18b\" (UID: \"40528b7c-2c67-4413-ba07-5e3e6af9c18b\") " Mar 10 22:14:36 crc kubenswrapper[4919]: I0310 22:14:36.167061 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40528b7c-2c67-4413-ba07-5e3e6af9c18b-combined-ca-bundle\") pod \"40528b7c-2c67-4413-ba07-5e3e6af9c18b\" (UID: \"40528b7c-2c67-4413-ba07-5e3e6af9c18b\") " Mar 10 22:14:36 crc kubenswrapper[4919]: I0310 22:14:36.167207 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40528b7c-2c67-4413-ba07-5e3e6af9c18b-config-data\") pod \"40528b7c-2c67-4413-ba07-5e3e6af9c18b\" (UID: \"40528b7c-2c67-4413-ba07-5e3e6af9c18b\") " Mar 10 22:14:36 crc kubenswrapper[4919]: I0310 22:14:36.167910 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6fwlj\" (UniqueName: \"kubernetes.io/projected/a194fe14-439b-4a8a-acb8-ea5852e0721b-kube-api-access-6fwlj\") on node \"crc\" DevicePath \"\"" Mar 10 22:14:36 crc kubenswrapper[4919]: I0310 22:14:36.167935 4919 
reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a194fe14-439b-4a8a-acb8-ea5852e0721b-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 22:14:36 crc kubenswrapper[4919]: I0310 22:14:36.167947 4919 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a194fe14-439b-4a8a-acb8-ea5852e0721b-logs\") on node \"crc\" DevicePath \"\"" Mar 10 22:14:36 crc kubenswrapper[4919]: I0310 22:14:36.167959 4919 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a194fe14-439b-4a8a-acb8-ea5852e0721b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 22:14:36 crc kubenswrapper[4919]: I0310 22:14:36.174316 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40528b7c-2c67-4413-ba07-5e3e6af9c18b-kube-api-access-5trd9" (OuterVolumeSpecName: "kube-api-access-5trd9") pod "40528b7c-2c67-4413-ba07-5e3e6af9c18b" (UID: "40528b7c-2c67-4413-ba07-5e3e6af9c18b"). InnerVolumeSpecName "kube-api-access-5trd9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 22:14:36 crc kubenswrapper[4919]: I0310 22:14:36.187622 4919 scope.go:117] "RemoveContainer" containerID="a822a4830c2ddac2986bb0da695e888f47bdee1c9537c48e8a9eca34965b4152" Mar 10 22:14:36 crc kubenswrapper[4919]: E0310 22:14:36.188047 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a822a4830c2ddac2986bb0da695e888f47bdee1c9537c48e8a9eca34965b4152\": container with ID starting with a822a4830c2ddac2986bb0da695e888f47bdee1c9537c48e8a9eca34965b4152 not found: ID does not exist" containerID="a822a4830c2ddac2986bb0da695e888f47bdee1c9537c48e8a9eca34965b4152" Mar 10 22:14:36 crc kubenswrapper[4919]: I0310 22:14:36.188089 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a822a4830c2ddac2986bb0da695e888f47bdee1c9537c48e8a9eca34965b4152"} err="failed to get container status \"a822a4830c2ddac2986bb0da695e888f47bdee1c9537c48e8a9eca34965b4152\": rpc error: code = NotFound desc = could not find container \"a822a4830c2ddac2986bb0da695e888f47bdee1c9537c48e8a9eca34965b4152\": container with ID starting with a822a4830c2ddac2986bb0da695e888f47bdee1c9537c48e8a9eca34965b4152 not found: ID does not exist" Mar 10 22:14:36 crc kubenswrapper[4919]: I0310 22:14:36.188117 4919 scope.go:117] "RemoveContainer" containerID="9ca64caff83423ef820e88e7066f059696e286b5743e772904209c47956fe9cc" Mar 10 22:14:36 crc kubenswrapper[4919]: E0310 22:14:36.188304 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ca64caff83423ef820e88e7066f059696e286b5743e772904209c47956fe9cc\": container with ID starting with 9ca64caff83423ef820e88e7066f059696e286b5743e772904209c47956fe9cc not found: ID does not exist" containerID="9ca64caff83423ef820e88e7066f059696e286b5743e772904209c47956fe9cc" Mar 10 22:14:36 crc kubenswrapper[4919]: I0310 22:14:36.188344 
4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ca64caff83423ef820e88e7066f059696e286b5743e772904209c47956fe9cc"} err="failed to get container status \"9ca64caff83423ef820e88e7066f059696e286b5743e772904209c47956fe9cc\": rpc error: code = NotFound desc = could not find container \"9ca64caff83423ef820e88e7066f059696e286b5743e772904209c47956fe9cc\": container with ID starting with 9ca64caff83423ef820e88e7066f059696e286b5743e772904209c47956fe9cc not found: ID does not exist" Mar 10 22:14:36 crc kubenswrapper[4919]: I0310 22:14:36.188747 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 10 22:14:36 crc kubenswrapper[4919]: E0310 22:14:36.189232 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a194fe14-439b-4a8a-acb8-ea5852e0721b" containerName="nova-metadata-metadata" Mar 10 22:14:36 crc kubenswrapper[4919]: I0310 22:14:36.189256 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="a194fe14-439b-4a8a-acb8-ea5852e0721b" containerName="nova-metadata-metadata" Mar 10 22:14:36 crc kubenswrapper[4919]: E0310 22:14:36.189299 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40528b7c-2c67-4413-ba07-5e3e6af9c18b" containerName="nova-cell1-novncproxy-novncproxy" Mar 10 22:14:36 crc kubenswrapper[4919]: I0310 22:14:36.189308 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="40528b7c-2c67-4413-ba07-5e3e6af9c18b" containerName="nova-cell1-novncproxy-novncproxy" Mar 10 22:14:36 crc kubenswrapper[4919]: E0310 22:14:36.189329 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a194fe14-439b-4a8a-acb8-ea5852e0721b" containerName="nova-metadata-log" Mar 10 22:14:36 crc kubenswrapper[4919]: I0310 22:14:36.189337 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="a194fe14-439b-4a8a-acb8-ea5852e0721b" containerName="nova-metadata-log" Mar 10 22:14:36 crc kubenswrapper[4919]: I0310 22:14:36.189632 4919 
memory_manager.go:354] "RemoveStaleState removing state" podUID="40528b7c-2c67-4413-ba07-5e3e6af9c18b" containerName="nova-cell1-novncproxy-novncproxy" Mar 10 22:14:36 crc kubenswrapper[4919]: I0310 22:14:36.189663 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="a194fe14-439b-4a8a-acb8-ea5852e0721b" containerName="nova-metadata-metadata" Mar 10 22:14:36 crc kubenswrapper[4919]: I0310 22:14:36.189685 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="a194fe14-439b-4a8a-acb8-ea5852e0721b" containerName="nova-metadata-log" Mar 10 22:14:36 crc kubenswrapper[4919]: I0310 22:14:36.191006 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 10 22:14:36 crc kubenswrapper[4919]: I0310 22:14:36.193211 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 10 22:14:36 crc kubenswrapper[4919]: I0310 22:14:36.193258 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40528b7c-2c67-4413-ba07-5e3e6af9c18b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "40528b7c-2c67-4413-ba07-5e3e6af9c18b" (UID: "40528b7c-2c67-4413-ba07-5e3e6af9c18b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:14:36 crc kubenswrapper[4919]: I0310 22:14:36.194877 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 10 22:14:36 crc kubenswrapper[4919]: I0310 22:14:36.205356 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40528b7c-2c67-4413-ba07-5e3e6af9c18b-config-data" (OuterVolumeSpecName: "config-data") pod "40528b7c-2c67-4413-ba07-5e3e6af9c18b" (UID: "40528b7c-2c67-4413-ba07-5e3e6af9c18b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:14:36 crc kubenswrapper[4919]: I0310 22:14:36.211304 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 10 22:14:36 crc kubenswrapper[4919]: I0310 22:14:36.269545 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5trd9\" (UniqueName: \"kubernetes.io/projected/40528b7c-2c67-4413-ba07-5e3e6af9c18b-kube-api-access-5trd9\") on node \"crc\" DevicePath \"\"" Mar 10 22:14:36 crc kubenswrapper[4919]: I0310 22:14:36.269592 4919 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40528b7c-2c67-4413-ba07-5e3e6af9c18b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 22:14:36 crc kubenswrapper[4919]: I0310 22:14:36.269606 4919 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40528b7c-2c67-4413-ba07-5e3e6af9c18b-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 22:14:36 crc kubenswrapper[4919]: I0310 22:14:36.371669 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb1090fb-ea87-4cce-a1fb-05cb31306efa-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"fb1090fb-ea87-4cce-a1fb-05cb31306efa\") " pod="openstack/nova-metadata-0" Mar 10 22:14:36 crc kubenswrapper[4919]: I0310 22:14:36.371732 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb1090fb-ea87-4cce-a1fb-05cb31306efa-config-data\") pod \"nova-metadata-0\" (UID: \"fb1090fb-ea87-4cce-a1fb-05cb31306efa\") " pod="openstack/nova-metadata-0" Mar 10 22:14:36 crc kubenswrapper[4919]: I0310 22:14:36.371904 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cqjd4\" (UniqueName: 
\"kubernetes.io/projected/fb1090fb-ea87-4cce-a1fb-05cb31306efa-kube-api-access-cqjd4\") pod \"nova-metadata-0\" (UID: \"fb1090fb-ea87-4cce-a1fb-05cb31306efa\") " pod="openstack/nova-metadata-0" Mar 10 22:14:36 crc kubenswrapper[4919]: I0310 22:14:36.371932 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fb1090fb-ea87-4cce-a1fb-05cb31306efa-logs\") pod \"nova-metadata-0\" (UID: \"fb1090fb-ea87-4cce-a1fb-05cb31306efa\") " pod="openstack/nova-metadata-0" Mar 10 22:14:36 crc kubenswrapper[4919]: I0310 22:14:36.371962 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb1090fb-ea87-4cce-a1fb-05cb31306efa-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"fb1090fb-ea87-4cce-a1fb-05cb31306efa\") " pod="openstack/nova-metadata-0" Mar 10 22:14:36 crc kubenswrapper[4919]: I0310 22:14:36.438751 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 10 22:14:36 crc kubenswrapper[4919]: I0310 22:14:36.460777 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 10 22:14:36 crc kubenswrapper[4919]: I0310 22:14:36.471650 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 10 22:14:36 crc kubenswrapper[4919]: I0310 22:14:36.472736 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 10 22:14:36 crc kubenswrapper[4919]: I0310 22:14:36.475409 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb1090fb-ea87-4cce-a1fb-05cb31306efa-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"fb1090fb-ea87-4cce-a1fb-05cb31306efa\") " pod="openstack/nova-metadata-0" Mar 10 22:14:36 crc kubenswrapper[4919]: I0310 22:14:36.475450 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb1090fb-ea87-4cce-a1fb-05cb31306efa-config-data\") pod \"nova-metadata-0\" (UID: \"fb1090fb-ea87-4cce-a1fb-05cb31306efa\") " pod="openstack/nova-metadata-0" Mar 10 22:14:36 crc kubenswrapper[4919]: I0310 22:14:36.475573 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqjd4\" (UniqueName: \"kubernetes.io/projected/fb1090fb-ea87-4cce-a1fb-05cb31306efa-kube-api-access-cqjd4\") pod \"nova-metadata-0\" (UID: \"fb1090fb-ea87-4cce-a1fb-05cb31306efa\") " pod="openstack/nova-metadata-0" Mar 10 22:14:36 crc kubenswrapper[4919]: I0310 22:14:36.475592 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fb1090fb-ea87-4cce-a1fb-05cb31306efa-logs\") pod \"nova-metadata-0\" (UID: \"fb1090fb-ea87-4cce-a1fb-05cb31306efa\") " pod="openstack/nova-metadata-0" Mar 10 22:14:36 crc kubenswrapper[4919]: I0310 22:14:36.475610 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb1090fb-ea87-4cce-a1fb-05cb31306efa-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"fb1090fb-ea87-4cce-a1fb-05cb31306efa\") " pod="openstack/nova-metadata-0" Mar 10 22:14:36 crc kubenswrapper[4919]: I0310 22:14:36.479739 4919 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb1090fb-ea87-4cce-a1fb-05cb31306efa-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"fb1090fb-ea87-4cce-a1fb-05cb31306efa\") " pod="openstack/nova-metadata-0" Mar 10 22:14:36 crc kubenswrapper[4919]: I0310 22:14:36.481050 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 10 22:14:36 crc kubenswrapper[4919]: I0310 22:14:36.481536 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 10 22:14:36 crc kubenswrapper[4919]: I0310 22:14:36.481706 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Mar 10 22:14:36 crc kubenswrapper[4919]: I0310 22:14:36.481846 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Mar 10 22:14:36 crc kubenswrapper[4919]: I0310 22:14:36.483268 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fb1090fb-ea87-4cce-a1fb-05cb31306efa-logs\") pod \"nova-metadata-0\" (UID: \"fb1090fb-ea87-4cce-a1fb-05cb31306efa\") " pod="openstack/nova-metadata-0" Mar 10 22:14:36 crc kubenswrapper[4919]: I0310 22:14:36.483943 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb1090fb-ea87-4cce-a1fb-05cb31306efa-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"fb1090fb-ea87-4cce-a1fb-05cb31306efa\") " pod="openstack/nova-metadata-0" Mar 10 22:14:36 crc kubenswrapper[4919]: I0310 22:14:36.485358 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb1090fb-ea87-4cce-a1fb-05cb31306efa-config-data\") pod \"nova-metadata-0\" (UID: \"fb1090fb-ea87-4cce-a1fb-05cb31306efa\") " pod="openstack/nova-metadata-0" Mar 
10 22:14:36 crc kubenswrapper[4919]: I0310 22:14:36.525957 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqjd4\" (UniqueName: \"kubernetes.io/projected/fb1090fb-ea87-4cce-a1fb-05cb31306efa-kube-api-access-cqjd4\") pod \"nova-metadata-0\" (UID: \"fb1090fb-ea87-4cce-a1fb-05cb31306efa\") " pod="openstack/nova-metadata-0" Mar 10 22:14:36 crc kubenswrapper[4919]: I0310 22:14:36.577519 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aff6348f-a0cf-4b67-a072-edcde9dcb3c4-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"aff6348f-a0cf-4b67-a072-edcde9dcb3c4\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 22:14:36 crc kubenswrapper[4919]: I0310 22:14:36.577561 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/aff6348f-a0cf-4b67-a072-edcde9dcb3c4-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"aff6348f-a0cf-4b67-a072-edcde9dcb3c4\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 22:14:36 crc kubenswrapper[4919]: I0310 22:14:36.577655 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aff6348f-a0cf-4b67-a072-edcde9dcb3c4-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"aff6348f-a0cf-4b67-a072-edcde9dcb3c4\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 22:14:36 crc kubenswrapper[4919]: I0310 22:14:36.577716 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/aff6348f-a0cf-4b67-a072-edcde9dcb3c4-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"aff6348f-a0cf-4b67-a072-edcde9dcb3c4\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 22:14:36 crc 
kubenswrapper[4919]: I0310 22:14:36.577754 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wv75l\" (UniqueName: \"kubernetes.io/projected/aff6348f-a0cf-4b67-a072-edcde9dcb3c4-kube-api-access-wv75l\") pod \"nova-cell1-novncproxy-0\" (UID: \"aff6348f-a0cf-4b67-a072-edcde9dcb3c4\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 22:14:36 crc kubenswrapper[4919]: I0310 22:14:36.678997 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aff6348f-a0cf-4b67-a072-edcde9dcb3c4-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"aff6348f-a0cf-4b67-a072-edcde9dcb3c4\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 22:14:36 crc kubenswrapper[4919]: I0310 22:14:36.679039 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/aff6348f-a0cf-4b67-a072-edcde9dcb3c4-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"aff6348f-a0cf-4b67-a072-edcde9dcb3c4\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 22:14:36 crc kubenswrapper[4919]: I0310 22:14:36.679075 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aff6348f-a0cf-4b67-a072-edcde9dcb3c4-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"aff6348f-a0cf-4b67-a072-edcde9dcb3c4\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 22:14:36 crc kubenswrapper[4919]: I0310 22:14:36.679122 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/aff6348f-a0cf-4b67-a072-edcde9dcb3c4-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"aff6348f-a0cf-4b67-a072-edcde9dcb3c4\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 22:14:36 crc kubenswrapper[4919]: I0310 22:14:36.679168 4919 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wv75l\" (UniqueName: \"kubernetes.io/projected/aff6348f-a0cf-4b67-a072-edcde9dcb3c4-kube-api-access-wv75l\") pod \"nova-cell1-novncproxy-0\" (UID: \"aff6348f-a0cf-4b67-a072-edcde9dcb3c4\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 22:14:36 crc kubenswrapper[4919]: I0310 22:14:36.683015 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aff6348f-a0cf-4b67-a072-edcde9dcb3c4-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"aff6348f-a0cf-4b67-a072-edcde9dcb3c4\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 22:14:36 crc kubenswrapper[4919]: I0310 22:14:36.683068 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aff6348f-a0cf-4b67-a072-edcde9dcb3c4-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"aff6348f-a0cf-4b67-a072-edcde9dcb3c4\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 22:14:36 crc kubenswrapper[4919]: I0310 22:14:36.683146 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/aff6348f-a0cf-4b67-a072-edcde9dcb3c4-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"aff6348f-a0cf-4b67-a072-edcde9dcb3c4\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 22:14:36 crc kubenswrapper[4919]: I0310 22:14:36.690884 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/aff6348f-a0cf-4b67-a072-edcde9dcb3c4-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"aff6348f-a0cf-4b67-a072-edcde9dcb3c4\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 22:14:36 crc kubenswrapper[4919]: I0310 22:14:36.710833 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wv75l\" 
(UniqueName: \"kubernetes.io/projected/aff6348f-a0cf-4b67-a072-edcde9dcb3c4-kube-api-access-wv75l\") pod \"nova-cell1-novncproxy-0\" (UID: \"aff6348f-a0cf-4b67-a072-edcde9dcb3c4\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 22:14:36 crc kubenswrapper[4919]: I0310 22:14:36.817368 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 10 22:14:36 crc kubenswrapper[4919]: I0310 22:14:36.838205 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 10 22:14:37 crc kubenswrapper[4919]: W0310 22:14:37.270814 4919 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfb1090fb_ea87_4cce_a1fb_05cb31306efa.slice/crio-ffb92b6294a3f113e77f8964fbd84505efd5118e48c503577ec14800906e75ad WatchSource:0}: Error finding container ffb92b6294a3f113e77f8964fbd84505efd5118e48c503577ec14800906e75ad: Status 404 returned error can't find the container with id ffb92b6294a3f113e77f8964fbd84505efd5118e48c503577ec14800906e75ad Mar 10 22:14:37 crc kubenswrapper[4919]: I0310 22:14:37.272238 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 10 22:14:37 crc kubenswrapper[4919]: I0310 22:14:37.331670 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 10 22:14:37 crc kubenswrapper[4919]: W0310 22:14:37.344242 4919 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaff6348f_a0cf_4b67_a072_edcde9dcb3c4.slice/crio-6550cd34ff7b7082c837f1cb6d134519d4a94b56191ff750893f0bcc73b52c00 WatchSource:0}: Error finding container 6550cd34ff7b7082c837f1cb6d134519d4a94b56191ff750893f0bcc73b52c00: Status 404 returned error can't find the container with id 6550cd34ff7b7082c837f1cb6d134519d4a94b56191ff750893f0bcc73b52c00 Mar 10 22:14:37 crc 
kubenswrapper[4919]: I0310 22:14:37.493974 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40528b7c-2c67-4413-ba07-5e3e6af9c18b" path="/var/lib/kubelet/pods/40528b7c-2c67-4413-ba07-5e3e6af9c18b/volumes" Mar 10 22:14:37 crc kubenswrapper[4919]: I0310 22:14:37.494652 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a194fe14-439b-4a8a-acb8-ea5852e0721b" path="/var/lib/kubelet/pods/a194fe14-439b-4a8a-acb8-ea5852e0721b/volumes" Mar 10 22:14:38 crc kubenswrapper[4919]: I0310 22:14:38.128657 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"fb1090fb-ea87-4cce-a1fb-05cb31306efa","Type":"ContainerStarted","Data":"6136b4ecac1972d46f866529492a1786be912fb3fc39d166d85249af6e669d98"} Mar 10 22:14:38 crc kubenswrapper[4919]: I0310 22:14:38.128972 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"fb1090fb-ea87-4cce-a1fb-05cb31306efa","Type":"ContainerStarted","Data":"ac33dde6fb8f040b19539d6229a740b8950b17142b3fefd31965a23cb99759ce"} Mar 10 22:14:38 crc kubenswrapper[4919]: I0310 22:14:38.128984 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"fb1090fb-ea87-4cce-a1fb-05cb31306efa","Type":"ContainerStarted","Data":"ffb92b6294a3f113e77f8964fbd84505efd5118e48c503577ec14800906e75ad"} Mar 10 22:14:38 crc kubenswrapper[4919]: I0310 22:14:38.130256 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"aff6348f-a0cf-4b67-a072-edcde9dcb3c4","Type":"ContainerStarted","Data":"d56c4da656d1a27e68d9c323923f33760324c2a4c5d3e0641f9e0a743d95e2fc"} Mar 10 22:14:38 crc kubenswrapper[4919]: I0310 22:14:38.132289 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" 
event={"ID":"aff6348f-a0cf-4b67-a072-edcde9dcb3c4","Type":"ContainerStarted","Data":"6550cd34ff7b7082c837f1cb6d134519d4a94b56191ff750893f0bcc73b52c00"} Mar 10 22:14:38 crc kubenswrapper[4919]: I0310 22:14:38.154619 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.154603708 podStartE2EDuration="2.154603708s" podCreationTimestamp="2026-03-10 22:14:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 22:14:38.152778599 +0000 UTC m=+1465.394659217" watchObservedRunningTime="2026-03-10 22:14:38.154603708 +0000 UTC m=+1465.396484316" Mar 10 22:14:38 crc kubenswrapper[4919]: I0310 22:14:38.178038 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.178015943 podStartE2EDuration="2.178015943s" podCreationTimestamp="2026-03-10 22:14:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 22:14:38.174648801 +0000 UTC m=+1465.416529429" watchObservedRunningTime="2026-03-10 22:14:38.178015943 +0000 UTC m=+1465.419896551" Mar 10 22:14:38 crc kubenswrapper[4919]: I0310 22:14:38.614032 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 10 22:14:38 crc kubenswrapper[4919]: I0310 22:14:38.614745 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 10 22:14:38 crc kubenswrapper[4919]: I0310 22:14:38.616247 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 10 22:14:38 crc kubenswrapper[4919]: I0310 22:14:38.616762 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 10 22:14:39 crc kubenswrapper[4919]: I0310 22:14:39.139269 4919 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 10 22:14:39 crc kubenswrapper[4919]: I0310 22:14:39.142173 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 10 22:14:39 crc kubenswrapper[4919]: I0310 22:14:39.307970 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-9dd56c4d5-nbpbz"] Mar 10 22:14:39 crc kubenswrapper[4919]: I0310 22:14:39.309819 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-9dd56c4d5-nbpbz" Mar 10 22:14:39 crc kubenswrapper[4919]: I0310 22:14:39.322512 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-9dd56c4d5-nbpbz"] Mar 10 22:14:39 crc kubenswrapper[4919]: I0310 22:14:39.444970 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/37dac1c8-963f-466f-977e-37b2fd98d32c-config\") pod \"dnsmasq-dns-9dd56c4d5-nbpbz\" (UID: \"37dac1c8-963f-466f-977e-37b2fd98d32c\") " pod="openstack/dnsmasq-dns-9dd56c4d5-nbpbz" Mar 10 22:14:39 crc kubenswrapper[4919]: I0310 22:14:39.445079 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/37dac1c8-963f-466f-977e-37b2fd98d32c-dns-swift-storage-0\") pod \"dnsmasq-dns-9dd56c4d5-nbpbz\" (UID: \"37dac1c8-963f-466f-977e-37b2fd98d32c\") " pod="openstack/dnsmasq-dns-9dd56c4d5-nbpbz" Mar 10 22:14:39 crc kubenswrapper[4919]: I0310 22:14:39.445132 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/37dac1c8-963f-466f-977e-37b2fd98d32c-ovsdbserver-sb\") pod \"dnsmasq-dns-9dd56c4d5-nbpbz\" (UID: \"37dac1c8-963f-466f-977e-37b2fd98d32c\") " pod="openstack/dnsmasq-dns-9dd56c4d5-nbpbz" Mar 10 22:14:39 crc 
kubenswrapper[4919]: I0310 22:14:39.445176 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/37dac1c8-963f-466f-977e-37b2fd98d32c-dns-svc\") pod \"dnsmasq-dns-9dd56c4d5-nbpbz\" (UID: \"37dac1c8-963f-466f-977e-37b2fd98d32c\") " pod="openstack/dnsmasq-dns-9dd56c4d5-nbpbz" Mar 10 22:14:39 crc kubenswrapper[4919]: I0310 22:14:39.445194 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvpdr\" (UniqueName: \"kubernetes.io/projected/37dac1c8-963f-466f-977e-37b2fd98d32c-kube-api-access-gvpdr\") pod \"dnsmasq-dns-9dd56c4d5-nbpbz\" (UID: \"37dac1c8-963f-466f-977e-37b2fd98d32c\") " pod="openstack/dnsmasq-dns-9dd56c4d5-nbpbz" Mar 10 22:14:39 crc kubenswrapper[4919]: I0310 22:14:39.445252 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/37dac1c8-963f-466f-977e-37b2fd98d32c-ovsdbserver-nb\") pod \"dnsmasq-dns-9dd56c4d5-nbpbz\" (UID: \"37dac1c8-963f-466f-977e-37b2fd98d32c\") " pod="openstack/dnsmasq-dns-9dd56c4d5-nbpbz" Mar 10 22:14:39 crc kubenswrapper[4919]: I0310 22:14:39.546614 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/37dac1c8-963f-466f-977e-37b2fd98d32c-config\") pod \"dnsmasq-dns-9dd56c4d5-nbpbz\" (UID: \"37dac1c8-963f-466f-977e-37b2fd98d32c\") " pod="openstack/dnsmasq-dns-9dd56c4d5-nbpbz" Mar 10 22:14:39 crc kubenswrapper[4919]: I0310 22:14:39.546704 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/37dac1c8-963f-466f-977e-37b2fd98d32c-dns-swift-storage-0\") pod \"dnsmasq-dns-9dd56c4d5-nbpbz\" (UID: \"37dac1c8-963f-466f-977e-37b2fd98d32c\") " pod="openstack/dnsmasq-dns-9dd56c4d5-nbpbz" Mar 10 22:14:39 crc 
kubenswrapper[4919]: I0310 22:14:39.546762 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/37dac1c8-963f-466f-977e-37b2fd98d32c-ovsdbserver-sb\") pod \"dnsmasq-dns-9dd56c4d5-nbpbz\" (UID: \"37dac1c8-963f-466f-977e-37b2fd98d32c\") " pod="openstack/dnsmasq-dns-9dd56c4d5-nbpbz" Mar 10 22:14:39 crc kubenswrapper[4919]: I0310 22:14:39.547641 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/37dac1c8-963f-466f-977e-37b2fd98d32c-config\") pod \"dnsmasq-dns-9dd56c4d5-nbpbz\" (UID: \"37dac1c8-963f-466f-977e-37b2fd98d32c\") " pod="openstack/dnsmasq-dns-9dd56c4d5-nbpbz" Mar 10 22:14:39 crc kubenswrapper[4919]: I0310 22:14:39.547656 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/37dac1c8-963f-466f-977e-37b2fd98d32c-dns-swift-storage-0\") pod \"dnsmasq-dns-9dd56c4d5-nbpbz\" (UID: \"37dac1c8-963f-466f-977e-37b2fd98d32c\") " pod="openstack/dnsmasq-dns-9dd56c4d5-nbpbz" Mar 10 22:14:39 crc kubenswrapper[4919]: I0310 22:14:39.547930 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/37dac1c8-963f-466f-977e-37b2fd98d32c-ovsdbserver-sb\") pod \"dnsmasq-dns-9dd56c4d5-nbpbz\" (UID: \"37dac1c8-963f-466f-977e-37b2fd98d32c\") " pod="openstack/dnsmasq-dns-9dd56c4d5-nbpbz" Mar 10 22:14:39 crc kubenswrapper[4919]: I0310 22:14:39.548002 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/37dac1c8-963f-466f-977e-37b2fd98d32c-dns-svc\") pod \"dnsmasq-dns-9dd56c4d5-nbpbz\" (UID: \"37dac1c8-963f-466f-977e-37b2fd98d32c\") " pod="openstack/dnsmasq-dns-9dd56c4d5-nbpbz" Mar 10 22:14:39 crc kubenswrapper[4919]: I0310 22:14:39.548676 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/37dac1c8-963f-466f-977e-37b2fd98d32c-dns-svc\") pod \"dnsmasq-dns-9dd56c4d5-nbpbz\" (UID: \"37dac1c8-963f-466f-977e-37b2fd98d32c\") " pod="openstack/dnsmasq-dns-9dd56c4d5-nbpbz" Mar 10 22:14:39 crc kubenswrapper[4919]: I0310 22:14:39.548033 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gvpdr\" (UniqueName: \"kubernetes.io/projected/37dac1c8-963f-466f-977e-37b2fd98d32c-kube-api-access-gvpdr\") pod \"dnsmasq-dns-9dd56c4d5-nbpbz\" (UID: \"37dac1c8-963f-466f-977e-37b2fd98d32c\") " pod="openstack/dnsmasq-dns-9dd56c4d5-nbpbz" Mar 10 22:14:39 crc kubenswrapper[4919]: I0310 22:14:39.548788 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/37dac1c8-963f-466f-977e-37b2fd98d32c-ovsdbserver-nb\") pod \"dnsmasq-dns-9dd56c4d5-nbpbz\" (UID: \"37dac1c8-963f-466f-977e-37b2fd98d32c\") " pod="openstack/dnsmasq-dns-9dd56c4d5-nbpbz" Mar 10 22:14:39 crc kubenswrapper[4919]: I0310 22:14:39.549632 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/37dac1c8-963f-466f-977e-37b2fd98d32c-ovsdbserver-nb\") pod \"dnsmasq-dns-9dd56c4d5-nbpbz\" (UID: \"37dac1c8-963f-466f-977e-37b2fd98d32c\") " pod="openstack/dnsmasq-dns-9dd56c4d5-nbpbz" Mar 10 22:14:39 crc kubenswrapper[4919]: I0310 22:14:39.579808 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvpdr\" (UniqueName: \"kubernetes.io/projected/37dac1c8-963f-466f-977e-37b2fd98d32c-kube-api-access-gvpdr\") pod \"dnsmasq-dns-9dd56c4d5-nbpbz\" (UID: \"37dac1c8-963f-466f-977e-37b2fd98d32c\") " pod="openstack/dnsmasq-dns-9dd56c4d5-nbpbz" Mar 10 22:14:39 crc kubenswrapper[4919]: I0310 22:14:39.634193 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-9dd56c4d5-nbpbz" Mar 10 22:14:40 crc kubenswrapper[4919]: I0310 22:14:40.415634 4919 scope.go:117] "RemoveContainer" containerID="67790202df1a8a341aa2320da216ec0d6f4a974001206eb49de031e559a5355d" Mar 10 22:14:40 crc kubenswrapper[4919]: I0310 22:14:40.837931 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-9dd56c4d5-nbpbz"] Mar 10 22:14:40 crc kubenswrapper[4919]: W0310 22:14:40.838377 4919 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod37dac1c8_963f_466f_977e_37b2fd98d32c.slice/crio-cb55c7a40740c9b346700a78eed74baf6304c24976baed861b49a89477f86eb7 WatchSource:0}: Error finding container cb55c7a40740c9b346700a78eed74baf6304c24976baed861b49a89477f86eb7: Status 404 returned error can't find the container with id cb55c7a40740c9b346700a78eed74baf6304c24976baed861b49a89477f86eb7 Mar 10 22:14:41 crc kubenswrapper[4919]: I0310 22:14:41.169912 4919 generic.go:334] "Generic (PLEG): container finished" podID="37dac1c8-963f-466f-977e-37b2fd98d32c" containerID="7d17cb532f3759c11c504765e96e40250ed2fc93714c74cd7c4b67a68350db0b" exitCode=0 Mar 10 22:14:41 crc kubenswrapper[4919]: I0310 22:14:41.170036 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9dd56c4d5-nbpbz" event={"ID":"37dac1c8-963f-466f-977e-37b2fd98d32c","Type":"ContainerDied","Data":"7d17cb532f3759c11c504765e96e40250ed2fc93714c74cd7c4b67a68350db0b"} Mar 10 22:14:41 crc kubenswrapper[4919]: I0310 22:14:41.170253 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9dd56c4d5-nbpbz" event={"ID":"37dac1c8-963f-466f-977e-37b2fd98d32c","Type":"ContainerStarted","Data":"cb55c7a40740c9b346700a78eed74baf6304c24976baed861b49a89477f86eb7"} Mar 10 22:14:41 crc kubenswrapper[4919]: I0310 22:14:41.474412 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 10 22:14:41 
crc kubenswrapper[4919]: I0310 22:14:41.474905 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="76b0dfed-aec2-4594-84f7-15f4489d87e4" containerName="ceilometer-central-agent" containerID="cri-o://277e57562cee12fa7ec9cd814300875d070319f6b06a2201258191489b069a76" gracePeriod=30 Mar 10 22:14:41 crc kubenswrapper[4919]: I0310 22:14:41.475287 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="76b0dfed-aec2-4594-84f7-15f4489d87e4" containerName="proxy-httpd" containerID="cri-o://c632fff9503baa94e2de04d3b24fb507386b4cee0ac9b0604aefc6e9ec7d192e" gracePeriod=30 Mar 10 22:14:41 crc kubenswrapper[4919]: I0310 22:14:41.475343 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="76b0dfed-aec2-4594-84f7-15f4489d87e4" containerName="sg-core" containerID="cri-o://38e5b0c62a33d19485359f3bc8aa610f75dd2eacaadd6626fc7201f23142cfe4" gracePeriod=30 Mar 10 22:14:41 crc kubenswrapper[4919]: I0310 22:14:41.475372 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="76b0dfed-aec2-4594-84f7-15f4489d87e4" containerName="ceilometer-notification-agent" containerID="cri-o://5e17384c3443979b8dcda5b3b440dc722ffa6e656bd894db4223ec39ae6dfc5d" gracePeriod=30 Mar 10 22:14:41 crc kubenswrapper[4919]: I0310 22:14:41.499079 4919 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="76b0dfed-aec2-4594-84f7-15f4489d87e4" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 10 22:14:41 crc kubenswrapper[4919]: I0310 22:14:41.818515 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 10 22:14:41 crc kubenswrapper[4919]: I0310 22:14:41.818581 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/nova-metadata-0" Mar 10 22:14:41 crc kubenswrapper[4919]: I0310 22:14:41.839076 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Mar 10 22:14:42 crc kubenswrapper[4919]: I0310 22:14:42.093928 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 10 22:14:42 crc kubenswrapper[4919]: I0310 22:14:42.181110 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9dd56c4d5-nbpbz" event={"ID":"37dac1c8-963f-466f-977e-37b2fd98d32c","Type":"ContainerStarted","Data":"e21cf6248decf29aa992c186881d057311c63ce4f642d28d2d190eaeaf041479"} Mar 10 22:14:42 crc kubenswrapper[4919]: I0310 22:14:42.181515 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-9dd56c4d5-nbpbz" Mar 10 22:14:42 crc kubenswrapper[4919]: I0310 22:14:42.185521 4919 generic.go:334] "Generic (PLEG): container finished" podID="76b0dfed-aec2-4594-84f7-15f4489d87e4" containerID="c632fff9503baa94e2de04d3b24fb507386b4cee0ac9b0604aefc6e9ec7d192e" exitCode=0 Mar 10 22:14:42 crc kubenswrapper[4919]: I0310 22:14:42.185548 4919 generic.go:334] "Generic (PLEG): container finished" podID="76b0dfed-aec2-4594-84f7-15f4489d87e4" containerID="38e5b0c62a33d19485359f3bc8aa610f75dd2eacaadd6626fc7201f23142cfe4" exitCode=2 Mar 10 22:14:42 crc kubenswrapper[4919]: I0310 22:14:42.185555 4919 generic.go:334] "Generic (PLEG): container finished" podID="76b0dfed-aec2-4594-84f7-15f4489d87e4" containerID="5e17384c3443979b8dcda5b3b440dc722ffa6e656bd894db4223ec39ae6dfc5d" exitCode=0 Mar 10 22:14:42 crc kubenswrapper[4919]: I0310 22:14:42.185562 4919 generic.go:334] "Generic (PLEG): container finished" podID="76b0dfed-aec2-4594-84f7-15f4489d87e4" containerID="277e57562cee12fa7ec9cd814300875d070319f6b06a2201258191489b069a76" exitCode=0 Mar 10 22:14:42 crc kubenswrapper[4919]: I0310 22:14:42.185585 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"76b0dfed-aec2-4594-84f7-15f4489d87e4","Type":"ContainerDied","Data":"c632fff9503baa94e2de04d3b24fb507386b4cee0ac9b0604aefc6e9ec7d192e"} Mar 10 22:14:42 crc kubenswrapper[4919]: I0310 22:14:42.185653 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"76b0dfed-aec2-4594-84f7-15f4489d87e4","Type":"ContainerDied","Data":"38e5b0c62a33d19485359f3bc8aa610f75dd2eacaadd6626fc7201f23142cfe4"} Mar 10 22:14:42 crc kubenswrapper[4919]: I0310 22:14:42.185663 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"76b0dfed-aec2-4594-84f7-15f4489d87e4","Type":"ContainerDied","Data":"5e17384c3443979b8dcda5b3b440dc722ffa6e656bd894db4223ec39ae6dfc5d"} Mar 10 22:14:42 crc kubenswrapper[4919]: I0310 22:14:42.185673 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"76b0dfed-aec2-4594-84f7-15f4489d87e4","Type":"ContainerDied","Data":"277e57562cee12fa7ec9cd814300875d070319f6b06a2201258191489b069a76"} Mar 10 22:14:42 crc kubenswrapper[4919]: I0310 22:14:42.185711 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="ab170d36-cd71-428b-bb07-0b7192035f1d" containerName="nova-api-log" containerID="cri-o://55aa25e6e7ad7a9818d2e114c2acded9548467e06b715e3ecbb44ddde875da33" gracePeriod=30 Mar 10 22:14:42 crc kubenswrapper[4919]: I0310 22:14:42.186025 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="ab170d36-cd71-428b-bb07-0b7192035f1d" containerName="nova-api-api" containerID="cri-o://cb94664fd4e7227fd3975dcc4218c4979fab33261ec675f8add916f14eb7ad01" gracePeriod=30 Mar 10 22:14:42 crc kubenswrapper[4919]: I0310 22:14:42.209724 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-9dd56c4d5-nbpbz" podStartSLOduration=3.209704886 podStartE2EDuration="3.209704886s" 
podCreationTimestamp="2026-03-10 22:14:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 22:14:42.20098636 +0000 UTC m=+1469.442866968" watchObservedRunningTime="2026-03-10 22:14:42.209704886 +0000 UTC m=+1469.451585494" Mar 10 22:14:42 crc kubenswrapper[4919]: I0310 22:14:42.519655 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 10 22:14:42 crc kubenswrapper[4919]: I0310 22:14:42.620284 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76b0dfed-aec2-4594-84f7-15f4489d87e4-config-data\") pod \"76b0dfed-aec2-4594-84f7-15f4489d87e4\" (UID: \"76b0dfed-aec2-4594-84f7-15f4489d87e4\") " Mar 10 22:14:42 crc kubenswrapper[4919]: I0310 22:14:42.620892 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76b0dfed-aec2-4594-84f7-15f4489d87e4-combined-ca-bundle\") pod \"76b0dfed-aec2-4594-84f7-15f4489d87e4\" (UID: \"76b0dfed-aec2-4594-84f7-15f4489d87e4\") " Mar 10 22:14:42 crc kubenswrapper[4919]: I0310 22:14:42.620988 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r5ftw\" (UniqueName: \"kubernetes.io/projected/76b0dfed-aec2-4594-84f7-15f4489d87e4-kube-api-access-r5ftw\") pod \"76b0dfed-aec2-4594-84f7-15f4489d87e4\" (UID: \"76b0dfed-aec2-4594-84f7-15f4489d87e4\") " Mar 10 22:14:42 crc kubenswrapper[4919]: I0310 22:14:42.621066 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/76b0dfed-aec2-4594-84f7-15f4489d87e4-log-httpd\") pod \"76b0dfed-aec2-4594-84f7-15f4489d87e4\" (UID: \"76b0dfed-aec2-4594-84f7-15f4489d87e4\") " Mar 10 22:14:42 crc kubenswrapper[4919]: I0310 22:14:42.621161 4919 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/76b0dfed-aec2-4594-84f7-15f4489d87e4-run-httpd\") pod \"76b0dfed-aec2-4594-84f7-15f4489d87e4\" (UID: \"76b0dfed-aec2-4594-84f7-15f4489d87e4\") " Mar 10 22:14:42 crc kubenswrapper[4919]: I0310 22:14:42.621377 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/76b0dfed-aec2-4594-84f7-15f4489d87e4-ceilometer-tls-certs\") pod \"76b0dfed-aec2-4594-84f7-15f4489d87e4\" (UID: \"76b0dfed-aec2-4594-84f7-15f4489d87e4\") " Mar 10 22:14:42 crc kubenswrapper[4919]: I0310 22:14:42.621517 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/76b0dfed-aec2-4594-84f7-15f4489d87e4-sg-core-conf-yaml\") pod \"76b0dfed-aec2-4594-84f7-15f4489d87e4\" (UID: \"76b0dfed-aec2-4594-84f7-15f4489d87e4\") " Mar 10 22:14:42 crc kubenswrapper[4919]: I0310 22:14:42.621572 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/76b0dfed-aec2-4594-84f7-15f4489d87e4-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "76b0dfed-aec2-4594-84f7-15f4489d87e4" (UID: "76b0dfed-aec2-4594-84f7-15f4489d87e4"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 22:14:42 crc kubenswrapper[4919]: I0310 22:14:42.621687 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/76b0dfed-aec2-4594-84f7-15f4489d87e4-scripts\") pod \"76b0dfed-aec2-4594-84f7-15f4489d87e4\" (UID: \"76b0dfed-aec2-4594-84f7-15f4489d87e4\") " Mar 10 22:14:42 crc kubenswrapper[4919]: I0310 22:14:42.622534 4919 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/76b0dfed-aec2-4594-84f7-15f4489d87e4-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 10 22:14:42 crc kubenswrapper[4919]: I0310 22:14:42.625573 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/76b0dfed-aec2-4594-84f7-15f4489d87e4-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "76b0dfed-aec2-4594-84f7-15f4489d87e4" (UID: "76b0dfed-aec2-4594-84f7-15f4489d87e4"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 22:14:42 crc kubenswrapper[4919]: I0310 22:14:42.633543 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76b0dfed-aec2-4594-84f7-15f4489d87e4-kube-api-access-r5ftw" (OuterVolumeSpecName: "kube-api-access-r5ftw") pod "76b0dfed-aec2-4594-84f7-15f4489d87e4" (UID: "76b0dfed-aec2-4594-84f7-15f4489d87e4"). InnerVolumeSpecName "kube-api-access-r5ftw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 22:14:42 crc kubenswrapper[4919]: I0310 22:14:42.633972 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76b0dfed-aec2-4594-84f7-15f4489d87e4-scripts" (OuterVolumeSpecName: "scripts") pod "76b0dfed-aec2-4594-84f7-15f4489d87e4" (UID: "76b0dfed-aec2-4594-84f7-15f4489d87e4"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:14:42 crc kubenswrapper[4919]: I0310 22:14:42.651868 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76b0dfed-aec2-4594-84f7-15f4489d87e4-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "76b0dfed-aec2-4594-84f7-15f4489d87e4" (UID: "76b0dfed-aec2-4594-84f7-15f4489d87e4"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:14:42 crc kubenswrapper[4919]: I0310 22:14:42.683940 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76b0dfed-aec2-4594-84f7-15f4489d87e4-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "76b0dfed-aec2-4594-84f7-15f4489d87e4" (UID: "76b0dfed-aec2-4594-84f7-15f4489d87e4"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:14:42 crc kubenswrapper[4919]: I0310 22:14:42.698246 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76b0dfed-aec2-4594-84f7-15f4489d87e4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "76b0dfed-aec2-4594-84f7-15f4489d87e4" (UID: "76b0dfed-aec2-4594-84f7-15f4489d87e4"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:14:42 crc kubenswrapper[4919]: I0310 22:14:42.724761 4919 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76b0dfed-aec2-4594-84f7-15f4489d87e4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 22:14:42 crc kubenswrapper[4919]: I0310 22:14:42.724805 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r5ftw\" (UniqueName: \"kubernetes.io/projected/76b0dfed-aec2-4594-84f7-15f4489d87e4-kube-api-access-r5ftw\") on node \"crc\" DevicePath \"\"" Mar 10 22:14:42 crc kubenswrapper[4919]: I0310 22:14:42.724817 4919 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/76b0dfed-aec2-4594-84f7-15f4489d87e4-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 10 22:14:42 crc kubenswrapper[4919]: I0310 22:14:42.724827 4919 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/76b0dfed-aec2-4594-84f7-15f4489d87e4-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 22:14:42 crc kubenswrapper[4919]: I0310 22:14:42.724836 4919 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/76b0dfed-aec2-4594-84f7-15f4489d87e4-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 10 22:14:42 crc kubenswrapper[4919]: I0310 22:14:42.724845 4919 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/76b0dfed-aec2-4594-84f7-15f4489d87e4-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 22:14:42 crc kubenswrapper[4919]: I0310 22:14:42.729370 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76b0dfed-aec2-4594-84f7-15f4489d87e4-config-data" (OuterVolumeSpecName: "config-data") pod "76b0dfed-aec2-4594-84f7-15f4489d87e4" (UID: 
"76b0dfed-aec2-4594-84f7-15f4489d87e4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:14:42 crc kubenswrapper[4919]: I0310 22:14:42.826860 4919 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76b0dfed-aec2-4594-84f7-15f4489d87e4-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 22:14:43 crc kubenswrapper[4919]: I0310 22:14:43.197074 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 10 22:14:43 crc kubenswrapper[4919]: I0310 22:14:43.197039 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"76b0dfed-aec2-4594-84f7-15f4489d87e4","Type":"ContainerDied","Data":"152192f4284b66b713a93ee5ef3eec0262ea83f63b3b32f7aa324997e46cfe46"} Mar 10 22:14:43 crc kubenswrapper[4919]: I0310 22:14:43.197411 4919 scope.go:117] "RemoveContainer" containerID="c632fff9503baa94e2de04d3b24fb507386b4cee0ac9b0604aefc6e9ec7d192e" Mar 10 22:14:43 crc kubenswrapper[4919]: I0310 22:14:43.199240 4919 generic.go:334] "Generic (PLEG): container finished" podID="ab170d36-cd71-428b-bb07-0b7192035f1d" containerID="55aa25e6e7ad7a9818d2e114c2acded9548467e06b715e3ecbb44ddde875da33" exitCode=143 Mar 10 22:14:43 crc kubenswrapper[4919]: I0310 22:14:43.199332 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ab170d36-cd71-428b-bb07-0b7192035f1d","Type":"ContainerDied","Data":"55aa25e6e7ad7a9818d2e114c2acded9548467e06b715e3ecbb44ddde875da33"} Mar 10 22:14:43 crc kubenswrapper[4919]: I0310 22:14:43.226446 4919 scope.go:117] "RemoveContainer" containerID="38e5b0c62a33d19485359f3bc8aa610f75dd2eacaadd6626fc7201f23142cfe4" Mar 10 22:14:43 crc kubenswrapper[4919]: I0310 22:14:43.238156 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 10 22:14:43 crc kubenswrapper[4919]: I0310 22:14:43.253484 4919 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 10 22:14:43 crc kubenswrapper[4919]: I0310 22:14:43.264338 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 10 22:14:43 crc kubenswrapper[4919]: E0310 22:14:43.264857 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76b0dfed-aec2-4594-84f7-15f4489d87e4" containerName="sg-core" Mar 10 22:14:43 crc kubenswrapper[4919]: I0310 22:14:43.264877 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="76b0dfed-aec2-4594-84f7-15f4489d87e4" containerName="sg-core" Mar 10 22:14:43 crc kubenswrapper[4919]: E0310 22:14:43.264918 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76b0dfed-aec2-4594-84f7-15f4489d87e4" containerName="ceilometer-notification-agent" Mar 10 22:14:43 crc kubenswrapper[4919]: I0310 22:14:43.264927 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="76b0dfed-aec2-4594-84f7-15f4489d87e4" containerName="ceilometer-notification-agent" Mar 10 22:14:43 crc kubenswrapper[4919]: E0310 22:14:43.264943 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76b0dfed-aec2-4594-84f7-15f4489d87e4" containerName="proxy-httpd" Mar 10 22:14:43 crc kubenswrapper[4919]: I0310 22:14:43.264950 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="76b0dfed-aec2-4594-84f7-15f4489d87e4" containerName="proxy-httpd" Mar 10 22:14:43 crc kubenswrapper[4919]: E0310 22:14:43.264989 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76b0dfed-aec2-4594-84f7-15f4489d87e4" containerName="ceilometer-central-agent" Mar 10 22:14:43 crc kubenswrapper[4919]: I0310 22:14:43.264999 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="76b0dfed-aec2-4594-84f7-15f4489d87e4" containerName="ceilometer-central-agent" Mar 10 22:14:43 crc kubenswrapper[4919]: I0310 22:14:43.265340 4919 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="76b0dfed-aec2-4594-84f7-15f4489d87e4" containerName="proxy-httpd" Mar 10 22:14:43 crc kubenswrapper[4919]: I0310 22:14:43.265367 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="76b0dfed-aec2-4594-84f7-15f4489d87e4" containerName="sg-core" Mar 10 22:14:43 crc kubenswrapper[4919]: I0310 22:14:43.265445 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="76b0dfed-aec2-4594-84f7-15f4489d87e4" containerName="ceilometer-notification-agent" Mar 10 22:14:43 crc kubenswrapper[4919]: I0310 22:14:43.265459 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="76b0dfed-aec2-4594-84f7-15f4489d87e4" containerName="ceilometer-central-agent" Mar 10 22:14:43 crc kubenswrapper[4919]: I0310 22:14:43.265884 4919 scope.go:117] "RemoveContainer" containerID="5e17384c3443979b8dcda5b3b440dc722ffa6e656bd894db4223ec39ae6dfc5d" Mar 10 22:14:43 crc kubenswrapper[4919]: I0310 22:14:43.279361 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 10 22:14:43 crc kubenswrapper[4919]: I0310 22:14:43.283294 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 10 22:14:43 crc kubenswrapper[4919]: I0310 22:14:43.283296 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 10 22:14:43 crc kubenswrapper[4919]: I0310 22:14:43.299987 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 10 22:14:43 crc kubenswrapper[4919]: I0310 22:14:43.304095 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 10 22:14:43 crc kubenswrapper[4919]: I0310 22:14:43.316370 4919 scope.go:117] "RemoveContainer" containerID="277e57562cee12fa7ec9cd814300875d070319f6b06a2201258191489b069a76" Mar 10 22:14:43 crc kubenswrapper[4919]: I0310 22:14:43.351313 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/ceilometer-0"] Mar 10 22:14:43 crc kubenswrapper[4919]: E0310 22:14:43.352269 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[ceilometer-tls-certs combined-ca-bundle config-data kube-api-access-cpxqm log-httpd run-httpd scripts sg-core-conf-yaml], unattached volumes=[], failed to process volumes=[ceilometer-tls-certs combined-ca-bundle config-data kube-api-access-cpxqm log-httpd run-httpd scripts sg-core-conf-yaml]: context canceled" pod="openstack/ceilometer-0" podUID="5e3aa364-4f6b-4f53-97fe-4d0a98b04fbe" Mar 10 22:14:43 crc kubenswrapper[4919]: I0310 22:14:43.438284 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5e3aa364-4f6b-4f53-97fe-4d0a98b04fbe-run-httpd\") pod \"ceilometer-0\" (UID: \"5e3aa364-4f6b-4f53-97fe-4d0a98b04fbe\") " pod="openstack/ceilometer-0" Mar 10 22:14:43 crc kubenswrapper[4919]: I0310 22:14:43.438326 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5e3aa364-4f6b-4f53-97fe-4d0a98b04fbe-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5e3aa364-4f6b-4f53-97fe-4d0a98b04fbe\") " pod="openstack/ceilometer-0" Mar 10 22:14:43 crc kubenswrapper[4919]: I0310 22:14:43.438368 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5e3aa364-4f6b-4f53-97fe-4d0a98b04fbe-log-httpd\") pod \"ceilometer-0\" (UID: \"5e3aa364-4f6b-4f53-97fe-4d0a98b04fbe\") " pod="openstack/ceilometer-0" Mar 10 22:14:43 crc kubenswrapper[4919]: I0310 22:14:43.438422 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e3aa364-4f6b-4f53-97fe-4d0a98b04fbe-combined-ca-bundle\") pod \"ceilometer-0\" (UID: 
\"5e3aa364-4f6b-4f53-97fe-4d0a98b04fbe\") " pod="openstack/ceilometer-0" Mar 10 22:14:43 crc kubenswrapper[4919]: I0310 22:14:43.438468 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5e3aa364-4f6b-4f53-97fe-4d0a98b04fbe-scripts\") pod \"ceilometer-0\" (UID: \"5e3aa364-4f6b-4f53-97fe-4d0a98b04fbe\") " pod="openstack/ceilometer-0" Mar 10 22:14:43 crc kubenswrapper[4919]: I0310 22:14:43.438490 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e3aa364-4f6b-4f53-97fe-4d0a98b04fbe-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"5e3aa364-4f6b-4f53-97fe-4d0a98b04fbe\") " pod="openstack/ceilometer-0" Mar 10 22:14:43 crc kubenswrapper[4919]: I0310 22:14:43.438530 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e3aa364-4f6b-4f53-97fe-4d0a98b04fbe-config-data\") pod \"ceilometer-0\" (UID: \"5e3aa364-4f6b-4f53-97fe-4d0a98b04fbe\") " pod="openstack/ceilometer-0" Mar 10 22:14:43 crc kubenswrapper[4919]: I0310 22:14:43.438556 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cpxqm\" (UniqueName: \"kubernetes.io/projected/5e3aa364-4f6b-4f53-97fe-4d0a98b04fbe-kube-api-access-cpxqm\") pod \"ceilometer-0\" (UID: \"5e3aa364-4f6b-4f53-97fe-4d0a98b04fbe\") " pod="openstack/ceilometer-0" Mar 10 22:14:43 crc kubenswrapper[4919]: I0310 22:14:43.491898 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="76b0dfed-aec2-4594-84f7-15f4489d87e4" path="/var/lib/kubelet/pods/76b0dfed-aec2-4594-84f7-15f4489d87e4/volumes" Mar 10 22:14:43 crc kubenswrapper[4919]: I0310 22:14:43.539987 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/5e3aa364-4f6b-4f53-97fe-4d0a98b04fbe-log-httpd\") pod \"ceilometer-0\" (UID: \"5e3aa364-4f6b-4f53-97fe-4d0a98b04fbe\") " pod="openstack/ceilometer-0" Mar 10 22:14:43 crc kubenswrapper[4919]: I0310 22:14:43.540573 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e3aa364-4f6b-4f53-97fe-4d0a98b04fbe-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5e3aa364-4f6b-4f53-97fe-4d0a98b04fbe\") " pod="openstack/ceilometer-0" Mar 10 22:14:43 crc kubenswrapper[4919]: I0310 22:14:43.541271 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5e3aa364-4f6b-4f53-97fe-4d0a98b04fbe-scripts\") pod \"ceilometer-0\" (UID: \"5e3aa364-4f6b-4f53-97fe-4d0a98b04fbe\") " pod="openstack/ceilometer-0" Mar 10 22:14:43 crc kubenswrapper[4919]: I0310 22:14:43.541366 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e3aa364-4f6b-4f53-97fe-4d0a98b04fbe-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"5e3aa364-4f6b-4f53-97fe-4d0a98b04fbe\") " pod="openstack/ceilometer-0" Mar 10 22:14:43 crc kubenswrapper[4919]: I0310 22:14:43.541489 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e3aa364-4f6b-4f53-97fe-4d0a98b04fbe-config-data\") pod \"ceilometer-0\" (UID: \"5e3aa364-4f6b-4f53-97fe-4d0a98b04fbe\") " pod="openstack/ceilometer-0" Mar 10 22:14:43 crc kubenswrapper[4919]: I0310 22:14:43.541598 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cpxqm\" (UniqueName: \"kubernetes.io/projected/5e3aa364-4f6b-4f53-97fe-4d0a98b04fbe-kube-api-access-cpxqm\") pod \"ceilometer-0\" (UID: \"5e3aa364-4f6b-4f53-97fe-4d0a98b04fbe\") " pod="openstack/ceilometer-0" Mar 10 22:14:43 crc 
kubenswrapper[4919]: I0310 22:14:43.540520 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5e3aa364-4f6b-4f53-97fe-4d0a98b04fbe-log-httpd\") pod \"ceilometer-0\" (UID: \"5e3aa364-4f6b-4f53-97fe-4d0a98b04fbe\") " pod="openstack/ceilometer-0" Mar 10 22:14:43 crc kubenswrapper[4919]: I0310 22:14:43.541721 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5e3aa364-4f6b-4f53-97fe-4d0a98b04fbe-run-httpd\") pod \"ceilometer-0\" (UID: \"5e3aa364-4f6b-4f53-97fe-4d0a98b04fbe\") " pod="openstack/ceilometer-0" Mar 10 22:14:43 crc kubenswrapper[4919]: I0310 22:14:43.541823 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5e3aa364-4f6b-4f53-97fe-4d0a98b04fbe-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5e3aa364-4f6b-4f53-97fe-4d0a98b04fbe\") " pod="openstack/ceilometer-0" Mar 10 22:14:43 crc kubenswrapper[4919]: I0310 22:14:43.542086 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5e3aa364-4f6b-4f53-97fe-4d0a98b04fbe-run-httpd\") pod \"ceilometer-0\" (UID: \"5e3aa364-4f6b-4f53-97fe-4d0a98b04fbe\") " pod="openstack/ceilometer-0" Mar 10 22:14:43 crc kubenswrapper[4919]: I0310 22:14:43.546273 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5e3aa364-4f6b-4f53-97fe-4d0a98b04fbe-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5e3aa364-4f6b-4f53-97fe-4d0a98b04fbe\") " pod="openstack/ceilometer-0" Mar 10 22:14:43 crc kubenswrapper[4919]: I0310 22:14:43.546952 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e3aa364-4f6b-4f53-97fe-4d0a98b04fbe-config-data\") pod \"ceilometer-0\" (UID: 
\"5e3aa364-4f6b-4f53-97fe-4d0a98b04fbe\") " pod="openstack/ceilometer-0" Mar 10 22:14:43 crc kubenswrapper[4919]: I0310 22:14:43.547762 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5e3aa364-4f6b-4f53-97fe-4d0a98b04fbe-scripts\") pod \"ceilometer-0\" (UID: \"5e3aa364-4f6b-4f53-97fe-4d0a98b04fbe\") " pod="openstack/ceilometer-0" Mar 10 22:14:43 crc kubenswrapper[4919]: I0310 22:14:43.548247 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e3aa364-4f6b-4f53-97fe-4d0a98b04fbe-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"5e3aa364-4f6b-4f53-97fe-4d0a98b04fbe\") " pod="openstack/ceilometer-0" Mar 10 22:14:43 crc kubenswrapper[4919]: I0310 22:14:43.560185 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cpxqm\" (UniqueName: \"kubernetes.io/projected/5e3aa364-4f6b-4f53-97fe-4d0a98b04fbe-kube-api-access-cpxqm\") pod \"ceilometer-0\" (UID: \"5e3aa364-4f6b-4f53-97fe-4d0a98b04fbe\") " pod="openstack/ceilometer-0" Mar 10 22:14:43 crc kubenswrapper[4919]: I0310 22:14:43.565383 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e3aa364-4f6b-4f53-97fe-4d0a98b04fbe-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5e3aa364-4f6b-4f53-97fe-4d0a98b04fbe\") " pod="openstack/ceilometer-0" Mar 10 22:14:44 crc kubenswrapper[4919]: I0310 22:14:44.208911 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 10 22:14:44 crc kubenswrapper[4919]: I0310 22:14:44.222913 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 10 22:14:44 crc kubenswrapper[4919]: I0310 22:14:44.356960 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cpxqm\" (UniqueName: \"kubernetes.io/projected/5e3aa364-4f6b-4f53-97fe-4d0a98b04fbe-kube-api-access-cpxqm\") pod \"5e3aa364-4f6b-4f53-97fe-4d0a98b04fbe\" (UID: \"5e3aa364-4f6b-4f53-97fe-4d0a98b04fbe\") " Mar 10 22:14:44 crc kubenswrapper[4919]: I0310 22:14:44.357049 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5e3aa364-4f6b-4f53-97fe-4d0a98b04fbe-log-httpd\") pod \"5e3aa364-4f6b-4f53-97fe-4d0a98b04fbe\" (UID: \"5e3aa364-4f6b-4f53-97fe-4d0a98b04fbe\") " Mar 10 22:14:44 crc kubenswrapper[4919]: I0310 22:14:44.357147 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e3aa364-4f6b-4f53-97fe-4d0a98b04fbe-config-data\") pod \"5e3aa364-4f6b-4f53-97fe-4d0a98b04fbe\" (UID: \"5e3aa364-4f6b-4f53-97fe-4d0a98b04fbe\") " Mar 10 22:14:44 crc kubenswrapper[4919]: I0310 22:14:44.357221 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5e3aa364-4f6b-4f53-97fe-4d0a98b04fbe-run-httpd\") pod \"5e3aa364-4f6b-4f53-97fe-4d0a98b04fbe\" (UID: \"5e3aa364-4f6b-4f53-97fe-4d0a98b04fbe\") " Mar 10 22:14:44 crc kubenswrapper[4919]: I0310 22:14:44.357244 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e3aa364-4f6b-4f53-97fe-4d0a98b04fbe-ceilometer-tls-certs\") pod \"5e3aa364-4f6b-4f53-97fe-4d0a98b04fbe\" (UID: \"5e3aa364-4f6b-4f53-97fe-4d0a98b04fbe\") " Mar 10 22:14:44 crc kubenswrapper[4919]: I0310 22:14:44.357268 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/5e3aa364-4f6b-4f53-97fe-4d0a98b04fbe-sg-core-conf-yaml\") pod \"5e3aa364-4f6b-4f53-97fe-4d0a98b04fbe\" (UID: \"5e3aa364-4f6b-4f53-97fe-4d0a98b04fbe\") " Mar 10 22:14:44 crc kubenswrapper[4919]: I0310 22:14:44.357514 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5e3aa364-4f6b-4f53-97fe-4d0a98b04fbe-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "5e3aa364-4f6b-4f53-97fe-4d0a98b04fbe" (UID: "5e3aa364-4f6b-4f53-97fe-4d0a98b04fbe"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 22:14:44 crc kubenswrapper[4919]: I0310 22:14:44.357564 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5e3aa364-4f6b-4f53-97fe-4d0a98b04fbe-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "5e3aa364-4f6b-4f53-97fe-4d0a98b04fbe" (UID: "5e3aa364-4f6b-4f53-97fe-4d0a98b04fbe"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 22:14:44 crc kubenswrapper[4919]: I0310 22:14:44.357671 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5e3aa364-4f6b-4f53-97fe-4d0a98b04fbe-scripts\") pod \"5e3aa364-4f6b-4f53-97fe-4d0a98b04fbe\" (UID: \"5e3aa364-4f6b-4f53-97fe-4d0a98b04fbe\") " Mar 10 22:14:44 crc kubenswrapper[4919]: I0310 22:14:44.357764 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e3aa364-4f6b-4f53-97fe-4d0a98b04fbe-combined-ca-bundle\") pod \"5e3aa364-4f6b-4f53-97fe-4d0a98b04fbe\" (UID: \"5e3aa364-4f6b-4f53-97fe-4d0a98b04fbe\") " Mar 10 22:14:44 crc kubenswrapper[4919]: I0310 22:14:44.358252 4919 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5e3aa364-4f6b-4f53-97fe-4d0a98b04fbe-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 10 22:14:44 
crc kubenswrapper[4919]: I0310 22:14:44.358269 4919 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5e3aa364-4f6b-4f53-97fe-4d0a98b04fbe-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 10 22:14:44 crc kubenswrapper[4919]: I0310 22:14:44.361087 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e3aa364-4f6b-4f53-97fe-4d0a98b04fbe-kube-api-access-cpxqm" (OuterVolumeSpecName: "kube-api-access-cpxqm") pod "5e3aa364-4f6b-4f53-97fe-4d0a98b04fbe" (UID: "5e3aa364-4f6b-4f53-97fe-4d0a98b04fbe"). InnerVolumeSpecName "kube-api-access-cpxqm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 22:14:44 crc kubenswrapper[4919]: I0310 22:14:44.363170 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e3aa364-4f6b-4f53-97fe-4d0a98b04fbe-config-data" (OuterVolumeSpecName: "config-data") pod "5e3aa364-4f6b-4f53-97fe-4d0a98b04fbe" (UID: "5e3aa364-4f6b-4f53-97fe-4d0a98b04fbe"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:14:44 crc kubenswrapper[4919]: I0310 22:14:44.365156 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e3aa364-4f6b-4f53-97fe-4d0a98b04fbe-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "5e3aa364-4f6b-4f53-97fe-4d0a98b04fbe" (UID: "5e3aa364-4f6b-4f53-97fe-4d0a98b04fbe"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:14:44 crc kubenswrapper[4919]: I0310 22:14:44.375716 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e3aa364-4f6b-4f53-97fe-4d0a98b04fbe-scripts" (OuterVolumeSpecName: "scripts") pod "5e3aa364-4f6b-4f53-97fe-4d0a98b04fbe" (UID: "5e3aa364-4f6b-4f53-97fe-4d0a98b04fbe"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:14:44 crc kubenswrapper[4919]: I0310 22:14:44.376764 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e3aa364-4f6b-4f53-97fe-4d0a98b04fbe-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "5e3aa364-4f6b-4f53-97fe-4d0a98b04fbe" (UID: "5e3aa364-4f6b-4f53-97fe-4d0a98b04fbe"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:14:44 crc kubenswrapper[4919]: I0310 22:14:44.377251 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e3aa364-4f6b-4f53-97fe-4d0a98b04fbe-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5e3aa364-4f6b-4f53-97fe-4d0a98b04fbe" (UID: "5e3aa364-4f6b-4f53-97fe-4d0a98b04fbe"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:14:44 crc kubenswrapper[4919]: I0310 22:14:44.460055 4919 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5e3aa364-4f6b-4f53-97fe-4d0a98b04fbe-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 22:14:44 crc kubenswrapper[4919]: I0310 22:14:44.460465 4919 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e3aa364-4f6b-4f53-97fe-4d0a98b04fbe-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 22:14:44 crc kubenswrapper[4919]: I0310 22:14:44.460601 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cpxqm\" (UniqueName: \"kubernetes.io/projected/5e3aa364-4f6b-4f53-97fe-4d0a98b04fbe-kube-api-access-cpxqm\") on node \"crc\" DevicePath \"\"" Mar 10 22:14:44 crc kubenswrapper[4919]: I0310 22:14:44.460711 4919 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e3aa364-4f6b-4f53-97fe-4d0a98b04fbe-config-data\") on node 
\"crc\" DevicePath \"\"" Mar 10 22:14:44 crc kubenswrapper[4919]: I0310 22:14:44.460815 4919 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e3aa364-4f6b-4f53-97fe-4d0a98b04fbe-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 22:14:44 crc kubenswrapper[4919]: I0310 22:14:44.460943 4919 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5e3aa364-4f6b-4f53-97fe-4d0a98b04fbe-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 10 22:14:45 crc kubenswrapper[4919]: I0310 22:14:45.224676 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 10 22:14:45 crc kubenswrapper[4919]: I0310 22:14:45.293256 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 10 22:14:45 crc kubenswrapper[4919]: I0310 22:14:45.311948 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 10 22:14:45 crc kubenswrapper[4919]: I0310 22:14:45.327150 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 10 22:14:45 crc kubenswrapper[4919]: I0310 22:14:45.329736 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 10 22:14:45 crc kubenswrapper[4919]: I0310 22:14:45.334726 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 10 22:14:45 crc kubenswrapper[4919]: I0310 22:14:45.334746 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 10 22:14:45 crc kubenswrapper[4919]: I0310 22:14:45.335038 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 10 22:14:45 crc kubenswrapper[4919]: I0310 22:14:45.336578 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 10 22:14:45 crc kubenswrapper[4919]: I0310 22:14:45.481211 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f324194-64d5-4755-847b-f554b94e652c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1f324194-64d5-4755-847b-f554b94e652c\") " pod="openstack/ceilometer-0" Mar 10 22:14:45 crc kubenswrapper[4919]: I0310 22:14:45.481275 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1f324194-64d5-4755-847b-f554b94e652c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1f324194-64d5-4755-847b-f554b94e652c\") " pod="openstack/ceilometer-0" Mar 10 22:14:45 crc kubenswrapper[4919]: I0310 22:14:45.481322 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/1f324194-64d5-4755-847b-f554b94e652c-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"1f324194-64d5-4755-847b-f554b94e652c\") " pod="openstack/ceilometer-0" Mar 10 22:14:45 crc kubenswrapper[4919]: I0310 22:14:45.481364 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1f324194-64d5-4755-847b-f554b94e652c-log-httpd\") pod \"ceilometer-0\" (UID: \"1f324194-64d5-4755-847b-f554b94e652c\") " pod="openstack/ceilometer-0" Mar 10 22:14:45 crc kubenswrapper[4919]: I0310 22:14:45.481405 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f324194-64d5-4755-847b-f554b94e652c-config-data\") pod \"ceilometer-0\" (UID: \"1f324194-64d5-4755-847b-f554b94e652c\") " pod="openstack/ceilometer-0" Mar 10 22:14:45 crc kubenswrapper[4919]: I0310 22:14:45.481469 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1f324194-64d5-4755-847b-f554b94e652c-run-httpd\") pod \"ceilometer-0\" (UID: \"1f324194-64d5-4755-847b-f554b94e652c\") " pod="openstack/ceilometer-0" Mar 10 22:14:45 crc kubenswrapper[4919]: I0310 22:14:45.481493 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1f324194-64d5-4755-847b-f554b94e652c-scripts\") pod \"ceilometer-0\" (UID: \"1f324194-64d5-4755-847b-f554b94e652c\") " pod="openstack/ceilometer-0" Mar 10 22:14:45 crc kubenswrapper[4919]: I0310 22:14:45.481555 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7bq6r\" (UniqueName: \"kubernetes.io/projected/1f324194-64d5-4755-847b-f554b94e652c-kube-api-access-7bq6r\") pod \"ceilometer-0\" (UID: \"1f324194-64d5-4755-847b-f554b94e652c\") " pod="openstack/ceilometer-0" Mar 10 22:14:45 crc kubenswrapper[4919]: I0310 22:14:45.501671 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e3aa364-4f6b-4f53-97fe-4d0a98b04fbe" path="/var/lib/kubelet/pods/5e3aa364-4f6b-4f53-97fe-4d0a98b04fbe/volumes" Mar 10 22:14:45 crc kubenswrapper[4919]: 
I0310 22:14:45.587294 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7bq6r\" (UniqueName: \"kubernetes.io/projected/1f324194-64d5-4755-847b-f554b94e652c-kube-api-access-7bq6r\") pod \"ceilometer-0\" (UID: \"1f324194-64d5-4755-847b-f554b94e652c\") " pod="openstack/ceilometer-0" Mar 10 22:14:45 crc kubenswrapper[4919]: I0310 22:14:45.587422 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f324194-64d5-4755-847b-f554b94e652c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1f324194-64d5-4755-847b-f554b94e652c\") " pod="openstack/ceilometer-0" Mar 10 22:14:45 crc kubenswrapper[4919]: I0310 22:14:45.587476 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1f324194-64d5-4755-847b-f554b94e652c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1f324194-64d5-4755-847b-f554b94e652c\") " pod="openstack/ceilometer-0" Mar 10 22:14:45 crc kubenswrapper[4919]: I0310 22:14:45.587517 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/1f324194-64d5-4755-847b-f554b94e652c-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"1f324194-64d5-4755-847b-f554b94e652c\") " pod="openstack/ceilometer-0" Mar 10 22:14:45 crc kubenswrapper[4919]: I0310 22:14:45.587550 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1f324194-64d5-4755-847b-f554b94e652c-log-httpd\") pod \"ceilometer-0\" (UID: \"1f324194-64d5-4755-847b-f554b94e652c\") " pod="openstack/ceilometer-0" Mar 10 22:14:45 crc kubenswrapper[4919]: I0310 22:14:45.587568 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/1f324194-64d5-4755-847b-f554b94e652c-config-data\") pod \"ceilometer-0\" (UID: \"1f324194-64d5-4755-847b-f554b94e652c\") " pod="openstack/ceilometer-0" Mar 10 22:14:45 crc kubenswrapper[4919]: I0310 22:14:45.587589 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1f324194-64d5-4755-847b-f554b94e652c-run-httpd\") pod \"ceilometer-0\" (UID: \"1f324194-64d5-4755-847b-f554b94e652c\") " pod="openstack/ceilometer-0" Mar 10 22:14:45 crc kubenswrapper[4919]: I0310 22:14:45.587608 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1f324194-64d5-4755-847b-f554b94e652c-scripts\") pod \"ceilometer-0\" (UID: \"1f324194-64d5-4755-847b-f554b94e652c\") " pod="openstack/ceilometer-0" Mar 10 22:14:45 crc kubenswrapper[4919]: I0310 22:14:45.588223 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1f324194-64d5-4755-847b-f554b94e652c-run-httpd\") pod \"ceilometer-0\" (UID: \"1f324194-64d5-4755-847b-f554b94e652c\") " pod="openstack/ceilometer-0" Mar 10 22:14:45 crc kubenswrapper[4919]: I0310 22:14:45.588245 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1f324194-64d5-4755-847b-f554b94e652c-log-httpd\") pod \"ceilometer-0\" (UID: \"1f324194-64d5-4755-847b-f554b94e652c\") " pod="openstack/ceilometer-0" Mar 10 22:14:45 crc kubenswrapper[4919]: I0310 22:14:45.594364 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1f324194-64d5-4755-847b-f554b94e652c-scripts\") pod \"ceilometer-0\" (UID: \"1f324194-64d5-4755-847b-f554b94e652c\") " pod="openstack/ceilometer-0" Mar 10 22:14:45 crc kubenswrapper[4919]: I0310 22:14:45.597462 4919 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f324194-64d5-4755-847b-f554b94e652c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1f324194-64d5-4755-847b-f554b94e652c\") " pod="openstack/ceilometer-0" Mar 10 22:14:45 crc kubenswrapper[4919]: I0310 22:14:45.597493 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/1f324194-64d5-4755-847b-f554b94e652c-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"1f324194-64d5-4755-847b-f554b94e652c\") " pod="openstack/ceilometer-0" Mar 10 22:14:45 crc kubenswrapper[4919]: I0310 22:14:45.599384 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1f324194-64d5-4755-847b-f554b94e652c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1f324194-64d5-4755-847b-f554b94e652c\") " pod="openstack/ceilometer-0" Mar 10 22:14:45 crc kubenswrapper[4919]: I0310 22:14:45.604801 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f324194-64d5-4755-847b-f554b94e652c-config-data\") pod \"ceilometer-0\" (UID: \"1f324194-64d5-4755-847b-f554b94e652c\") " pod="openstack/ceilometer-0" Mar 10 22:14:45 crc kubenswrapper[4919]: I0310 22:14:45.608128 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7bq6r\" (UniqueName: \"kubernetes.io/projected/1f324194-64d5-4755-847b-f554b94e652c-kube-api-access-7bq6r\") pod \"ceilometer-0\" (UID: \"1f324194-64d5-4755-847b-f554b94e652c\") " pod="openstack/ceilometer-0" Mar 10 22:14:45 crc kubenswrapper[4919]: I0310 22:14:45.746081 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 10 22:14:45 crc kubenswrapper[4919]: I0310 22:14:45.783568 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 10 22:14:45 crc kubenswrapper[4919]: I0310 22:14:45.893429 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ab170d36-cd71-428b-bb07-0b7192035f1d-logs\") pod \"ab170d36-cd71-428b-bb07-0b7192035f1d\" (UID: \"ab170d36-cd71-428b-bb07-0b7192035f1d\") " Mar 10 22:14:45 crc kubenswrapper[4919]: I0310 22:14:45.893721 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab170d36-cd71-428b-bb07-0b7192035f1d-config-data\") pod \"ab170d36-cd71-428b-bb07-0b7192035f1d\" (UID: \"ab170d36-cd71-428b-bb07-0b7192035f1d\") " Mar 10 22:14:45 crc kubenswrapper[4919]: I0310 22:14:45.893752 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rcmpx\" (UniqueName: \"kubernetes.io/projected/ab170d36-cd71-428b-bb07-0b7192035f1d-kube-api-access-rcmpx\") pod \"ab170d36-cd71-428b-bb07-0b7192035f1d\" (UID: \"ab170d36-cd71-428b-bb07-0b7192035f1d\") " Mar 10 22:14:45 crc kubenswrapper[4919]: I0310 22:14:45.893797 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab170d36-cd71-428b-bb07-0b7192035f1d-combined-ca-bundle\") pod \"ab170d36-cd71-428b-bb07-0b7192035f1d\" (UID: \"ab170d36-cd71-428b-bb07-0b7192035f1d\") " Mar 10 22:14:45 crc kubenswrapper[4919]: I0310 22:14:45.894278 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ab170d36-cd71-428b-bb07-0b7192035f1d-logs" (OuterVolumeSpecName: "logs") pod "ab170d36-cd71-428b-bb07-0b7192035f1d" (UID: "ab170d36-cd71-428b-bb07-0b7192035f1d"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 22:14:45 crc kubenswrapper[4919]: I0310 22:14:45.900646 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab170d36-cd71-428b-bb07-0b7192035f1d-kube-api-access-rcmpx" (OuterVolumeSpecName: "kube-api-access-rcmpx") pod "ab170d36-cd71-428b-bb07-0b7192035f1d" (UID: "ab170d36-cd71-428b-bb07-0b7192035f1d"). InnerVolumeSpecName "kube-api-access-rcmpx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 22:14:45 crc kubenswrapper[4919]: I0310 22:14:45.933463 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab170d36-cd71-428b-bb07-0b7192035f1d-config-data" (OuterVolumeSpecName: "config-data") pod "ab170d36-cd71-428b-bb07-0b7192035f1d" (UID: "ab170d36-cd71-428b-bb07-0b7192035f1d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:14:45 crc kubenswrapper[4919]: I0310 22:14:45.937485 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab170d36-cd71-428b-bb07-0b7192035f1d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ab170d36-cd71-428b-bb07-0b7192035f1d" (UID: "ab170d36-cd71-428b-bb07-0b7192035f1d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:14:45 crc kubenswrapper[4919]: I0310 22:14:45.995653 4919 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab170d36-cd71-428b-bb07-0b7192035f1d-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 22:14:45 crc kubenswrapper[4919]: I0310 22:14:45.995945 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rcmpx\" (UniqueName: \"kubernetes.io/projected/ab170d36-cd71-428b-bb07-0b7192035f1d-kube-api-access-rcmpx\") on node \"crc\" DevicePath \"\"" Mar 10 22:14:45 crc kubenswrapper[4919]: I0310 22:14:45.995959 4919 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab170d36-cd71-428b-bb07-0b7192035f1d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 22:14:45 crc kubenswrapper[4919]: I0310 22:14:45.995970 4919 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ab170d36-cd71-428b-bb07-0b7192035f1d-logs\") on node \"crc\" DevicePath \"\"" Mar 10 22:14:46 crc kubenswrapper[4919]: I0310 22:14:46.227950 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 10 22:14:46 crc kubenswrapper[4919]: W0310 22:14:46.230202 4919 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1f324194_64d5_4755_847b_f554b94e652c.slice/crio-ab56ca52c6208a6b06a1615711c68c53b88659f15dafdf22e213fb0d75084cb2 WatchSource:0}: Error finding container ab56ca52c6208a6b06a1615711c68c53b88659f15dafdf22e213fb0d75084cb2: Status 404 returned error can't find the container with id ab56ca52c6208a6b06a1615711c68c53b88659f15dafdf22e213fb0d75084cb2 Mar 10 22:14:46 crc kubenswrapper[4919]: I0310 22:14:46.235248 4919 generic.go:334] "Generic (PLEG): container finished" podID="ab170d36-cd71-428b-bb07-0b7192035f1d" 
containerID="cb94664fd4e7227fd3975dcc4218c4979fab33261ec675f8add916f14eb7ad01" exitCode=0 Mar 10 22:14:46 crc kubenswrapper[4919]: I0310 22:14:46.235288 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ab170d36-cd71-428b-bb07-0b7192035f1d","Type":"ContainerDied","Data":"cb94664fd4e7227fd3975dcc4218c4979fab33261ec675f8add916f14eb7ad01"} Mar 10 22:14:46 crc kubenswrapper[4919]: I0310 22:14:46.235325 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ab170d36-cd71-428b-bb07-0b7192035f1d","Type":"ContainerDied","Data":"6538328591ed81061881cb63e936aadef6c5994ff40a297deec4e853f5f1aa46"} Mar 10 22:14:46 crc kubenswrapper[4919]: I0310 22:14:46.235325 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 10 22:14:46 crc kubenswrapper[4919]: I0310 22:14:46.235345 4919 scope.go:117] "RemoveContainer" containerID="cb94664fd4e7227fd3975dcc4218c4979fab33261ec675f8add916f14eb7ad01" Mar 10 22:14:46 crc kubenswrapper[4919]: I0310 22:14:46.255338 4919 scope.go:117] "RemoveContainer" containerID="55aa25e6e7ad7a9818d2e114c2acded9548467e06b715e3ecbb44ddde875da33" Mar 10 22:14:46 crc kubenswrapper[4919]: I0310 22:14:46.272590 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 10 22:14:46 crc kubenswrapper[4919]: I0310 22:14:46.275123 4919 scope.go:117] "RemoveContainer" containerID="cb94664fd4e7227fd3975dcc4218c4979fab33261ec675f8add916f14eb7ad01" Mar 10 22:14:46 crc kubenswrapper[4919]: E0310 22:14:46.275685 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb94664fd4e7227fd3975dcc4218c4979fab33261ec675f8add916f14eb7ad01\": container with ID starting with cb94664fd4e7227fd3975dcc4218c4979fab33261ec675f8add916f14eb7ad01 not found: ID does not exist" containerID="cb94664fd4e7227fd3975dcc4218c4979fab33261ec675f8add916f14eb7ad01" 
Mar 10 22:14:46 crc kubenswrapper[4919]: I0310 22:14:46.275719 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb94664fd4e7227fd3975dcc4218c4979fab33261ec675f8add916f14eb7ad01"} err="failed to get container status \"cb94664fd4e7227fd3975dcc4218c4979fab33261ec675f8add916f14eb7ad01\": rpc error: code = NotFound desc = could not find container \"cb94664fd4e7227fd3975dcc4218c4979fab33261ec675f8add916f14eb7ad01\": container with ID starting with cb94664fd4e7227fd3975dcc4218c4979fab33261ec675f8add916f14eb7ad01 not found: ID does not exist" Mar 10 22:14:46 crc kubenswrapper[4919]: I0310 22:14:46.275740 4919 scope.go:117] "RemoveContainer" containerID="55aa25e6e7ad7a9818d2e114c2acded9548467e06b715e3ecbb44ddde875da33" Mar 10 22:14:46 crc kubenswrapper[4919]: E0310 22:14:46.276044 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"55aa25e6e7ad7a9818d2e114c2acded9548467e06b715e3ecbb44ddde875da33\": container with ID starting with 55aa25e6e7ad7a9818d2e114c2acded9548467e06b715e3ecbb44ddde875da33 not found: ID does not exist" containerID="55aa25e6e7ad7a9818d2e114c2acded9548467e06b715e3ecbb44ddde875da33" Mar 10 22:14:46 crc kubenswrapper[4919]: I0310 22:14:46.276083 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55aa25e6e7ad7a9818d2e114c2acded9548467e06b715e3ecbb44ddde875da33"} err="failed to get container status \"55aa25e6e7ad7a9818d2e114c2acded9548467e06b715e3ecbb44ddde875da33\": rpc error: code = NotFound desc = could not find container \"55aa25e6e7ad7a9818d2e114c2acded9548467e06b715e3ecbb44ddde875da33\": container with ID starting with 55aa25e6e7ad7a9818d2e114c2acded9548467e06b715e3ecbb44ddde875da33 not found: ID does not exist" Mar 10 22:14:46 crc kubenswrapper[4919]: I0310 22:14:46.302591 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 10 22:14:46 crc 
kubenswrapper[4919]: I0310 22:14:46.314259 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 10 22:14:46 crc kubenswrapper[4919]: E0310 22:14:46.314721 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab170d36-cd71-428b-bb07-0b7192035f1d" containerName="nova-api-log" Mar 10 22:14:46 crc kubenswrapper[4919]: I0310 22:14:46.314741 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab170d36-cd71-428b-bb07-0b7192035f1d" containerName="nova-api-log" Mar 10 22:14:46 crc kubenswrapper[4919]: E0310 22:14:46.314766 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab170d36-cd71-428b-bb07-0b7192035f1d" containerName="nova-api-api" Mar 10 22:14:46 crc kubenswrapper[4919]: I0310 22:14:46.314773 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab170d36-cd71-428b-bb07-0b7192035f1d" containerName="nova-api-api" Mar 10 22:14:46 crc kubenswrapper[4919]: I0310 22:14:46.314940 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab170d36-cd71-428b-bb07-0b7192035f1d" containerName="nova-api-log" Mar 10 22:14:46 crc kubenswrapper[4919]: I0310 22:14:46.314966 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab170d36-cd71-428b-bb07-0b7192035f1d" containerName="nova-api-api" Mar 10 22:14:46 crc kubenswrapper[4919]: I0310 22:14:46.315878 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 10 22:14:46 crc kubenswrapper[4919]: I0310 22:14:46.327293 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Mar 10 22:14:46 crc kubenswrapper[4919]: I0310 22:14:46.328815 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 10 22:14:46 crc kubenswrapper[4919]: I0310 22:14:46.337273 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Mar 10 22:14:46 crc kubenswrapper[4919]: I0310 22:14:46.340988 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 10 22:14:46 crc kubenswrapper[4919]: I0310 22:14:46.409605 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2r689\" (UniqueName: \"kubernetes.io/projected/45cb0e95-6f81-41a4-ae3a-a0e499ba1973-kube-api-access-2r689\") pod \"nova-api-0\" (UID: \"45cb0e95-6f81-41a4-ae3a-a0e499ba1973\") " pod="openstack/nova-api-0" Mar 10 22:14:46 crc kubenswrapper[4919]: I0310 22:14:46.409883 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/45cb0e95-6f81-41a4-ae3a-a0e499ba1973-logs\") pod \"nova-api-0\" (UID: \"45cb0e95-6f81-41a4-ae3a-a0e499ba1973\") " pod="openstack/nova-api-0" Mar 10 22:14:46 crc kubenswrapper[4919]: I0310 22:14:46.409905 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45cb0e95-6f81-41a4-ae3a-a0e499ba1973-config-data\") pod \"nova-api-0\" (UID: \"45cb0e95-6f81-41a4-ae3a-a0e499ba1973\") " pod="openstack/nova-api-0" Mar 10 22:14:46 crc kubenswrapper[4919]: I0310 22:14:46.409934 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/45cb0e95-6f81-41a4-ae3a-a0e499ba1973-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"45cb0e95-6f81-41a4-ae3a-a0e499ba1973\") " pod="openstack/nova-api-0" Mar 10 22:14:46 crc kubenswrapper[4919]: I0310 22:14:46.409971 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/45cb0e95-6f81-41a4-ae3a-a0e499ba1973-internal-tls-certs\") pod \"nova-api-0\" (UID: \"45cb0e95-6f81-41a4-ae3a-a0e499ba1973\") " pod="openstack/nova-api-0" Mar 10 22:14:46 crc kubenswrapper[4919]: I0310 22:14:46.410031 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/45cb0e95-6f81-41a4-ae3a-a0e499ba1973-public-tls-certs\") pod \"nova-api-0\" (UID: \"45cb0e95-6f81-41a4-ae3a-a0e499ba1973\") " pod="openstack/nova-api-0" Mar 10 22:14:46 crc kubenswrapper[4919]: I0310 22:14:46.512297 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2r689\" (UniqueName: \"kubernetes.io/projected/45cb0e95-6f81-41a4-ae3a-a0e499ba1973-kube-api-access-2r689\") pod \"nova-api-0\" (UID: \"45cb0e95-6f81-41a4-ae3a-a0e499ba1973\") " pod="openstack/nova-api-0" Mar 10 22:14:46 crc kubenswrapper[4919]: I0310 22:14:46.512371 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/45cb0e95-6f81-41a4-ae3a-a0e499ba1973-logs\") pod \"nova-api-0\" (UID: \"45cb0e95-6f81-41a4-ae3a-a0e499ba1973\") " pod="openstack/nova-api-0" Mar 10 22:14:46 crc kubenswrapper[4919]: I0310 22:14:46.512430 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45cb0e95-6f81-41a4-ae3a-a0e499ba1973-config-data\") pod \"nova-api-0\" (UID: \"45cb0e95-6f81-41a4-ae3a-a0e499ba1973\") " pod="openstack/nova-api-0" Mar 10 22:14:46 crc 
kubenswrapper[4919]: I0310 22:14:46.512470 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45cb0e95-6f81-41a4-ae3a-a0e499ba1973-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"45cb0e95-6f81-41a4-ae3a-a0e499ba1973\") " pod="openstack/nova-api-0" Mar 10 22:14:46 crc kubenswrapper[4919]: I0310 22:14:46.512517 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/45cb0e95-6f81-41a4-ae3a-a0e499ba1973-internal-tls-certs\") pod \"nova-api-0\" (UID: \"45cb0e95-6f81-41a4-ae3a-a0e499ba1973\") " pod="openstack/nova-api-0" Mar 10 22:14:46 crc kubenswrapper[4919]: I0310 22:14:46.512541 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/45cb0e95-6f81-41a4-ae3a-a0e499ba1973-public-tls-certs\") pod \"nova-api-0\" (UID: \"45cb0e95-6f81-41a4-ae3a-a0e499ba1973\") " pod="openstack/nova-api-0" Mar 10 22:14:46 crc kubenswrapper[4919]: I0310 22:14:46.512824 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/45cb0e95-6f81-41a4-ae3a-a0e499ba1973-logs\") pod \"nova-api-0\" (UID: \"45cb0e95-6f81-41a4-ae3a-a0e499ba1973\") " pod="openstack/nova-api-0" Mar 10 22:14:46 crc kubenswrapper[4919]: I0310 22:14:46.518093 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45cb0e95-6f81-41a4-ae3a-a0e499ba1973-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"45cb0e95-6f81-41a4-ae3a-a0e499ba1973\") " pod="openstack/nova-api-0" Mar 10 22:14:46 crc kubenswrapper[4919]: I0310 22:14:46.518128 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45cb0e95-6f81-41a4-ae3a-a0e499ba1973-config-data\") pod \"nova-api-0\" (UID: 
\"45cb0e95-6f81-41a4-ae3a-a0e499ba1973\") " pod="openstack/nova-api-0" Mar 10 22:14:46 crc kubenswrapper[4919]: I0310 22:14:46.518161 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/45cb0e95-6f81-41a4-ae3a-a0e499ba1973-public-tls-certs\") pod \"nova-api-0\" (UID: \"45cb0e95-6f81-41a4-ae3a-a0e499ba1973\") " pod="openstack/nova-api-0" Mar 10 22:14:46 crc kubenswrapper[4919]: I0310 22:14:46.519776 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/45cb0e95-6f81-41a4-ae3a-a0e499ba1973-internal-tls-certs\") pod \"nova-api-0\" (UID: \"45cb0e95-6f81-41a4-ae3a-a0e499ba1973\") " pod="openstack/nova-api-0" Mar 10 22:14:46 crc kubenswrapper[4919]: I0310 22:14:46.530090 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2r689\" (UniqueName: \"kubernetes.io/projected/45cb0e95-6f81-41a4-ae3a-a0e499ba1973-kube-api-access-2r689\") pod \"nova-api-0\" (UID: \"45cb0e95-6f81-41a4-ae3a-a0e499ba1973\") " pod="openstack/nova-api-0" Mar 10 22:14:46 crc kubenswrapper[4919]: I0310 22:14:46.643164 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 10 22:14:46 crc kubenswrapper[4919]: I0310 22:14:46.817807 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 10 22:14:46 crc kubenswrapper[4919]: I0310 22:14:46.817866 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 10 22:14:46 crc kubenswrapper[4919]: I0310 22:14:46.838493 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Mar 10 22:14:46 crc kubenswrapper[4919]: I0310 22:14:46.866255 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Mar 10 22:14:47 crc kubenswrapper[4919]: W0310 22:14:47.156965 4919 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod45cb0e95_6f81_41a4_ae3a_a0e499ba1973.slice/crio-38a21be2ddb15576d8f97a4464784b69b0f4bfba9238f897215fb817ddd9d50e WatchSource:0}: Error finding container 38a21be2ddb15576d8f97a4464784b69b0f4bfba9238f897215fb817ddd9d50e: Status 404 returned error can't find the container with id 38a21be2ddb15576d8f97a4464784b69b0f4bfba9238f897215fb817ddd9d50e Mar 10 22:14:47 crc kubenswrapper[4919]: I0310 22:14:47.158323 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 10 22:14:47 crc kubenswrapper[4919]: I0310 22:14:47.246049 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1f324194-64d5-4755-847b-f554b94e652c","Type":"ContainerStarted","Data":"69fdc3c8e2ab199f6bb93d9e3a1a78edfb949241e41f07473825a631528f1dde"} Mar 10 22:14:47 crc kubenswrapper[4919]: I0310 22:14:47.246091 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"1f324194-64d5-4755-847b-f554b94e652c","Type":"ContainerStarted","Data":"ab56ca52c6208a6b06a1615711c68c53b88659f15dafdf22e213fb0d75084cb2"} Mar 10 22:14:47 crc kubenswrapper[4919]: I0310 22:14:47.254353 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"45cb0e95-6f81-41a4-ae3a-a0e499ba1973","Type":"ContainerStarted","Data":"38a21be2ddb15576d8f97a4464784b69b0f4bfba9238f897215fb817ddd9d50e"} Mar 10 22:14:47 crc kubenswrapper[4919]: I0310 22:14:47.277988 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Mar 10 22:14:47 crc kubenswrapper[4919]: I0310 22:14:47.498477 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab170d36-cd71-428b-bb07-0b7192035f1d" path="/var/lib/kubelet/pods/ab170d36-cd71-428b-bb07-0b7192035f1d/volumes" Mar 10 22:14:47 crc kubenswrapper[4919]: I0310 22:14:47.502273 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-x2nwk"] Mar 10 22:14:47 crc kubenswrapper[4919]: I0310 22:14:47.505449 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-x2nwk" Mar 10 22:14:47 crc kubenswrapper[4919]: I0310 22:14:47.507639 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Mar 10 22:14:47 crc kubenswrapper[4919]: I0310 22:14:47.507849 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Mar 10 22:14:47 crc kubenswrapper[4919]: I0310 22:14:47.518126 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-x2nwk"] Mar 10 22:14:47 crc kubenswrapper[4919]: I0310 22:14:47.637539 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xdn9\" (UniqueName: \"kubernetes.io/projected/6da154ef-0415-4972-8282-cf5161c8fa71-kube-api-access-5xdn9\") pod \"nova-cell1-cell-mapping-x2nwk\" (UID: \"6da154ef-0415-4972-8282-cf5161c8fa71\") " pod="openstack/nova-cell1-cell-mapping-x2nwk" Mar 10 22:14:47 crc kubenswrapper[4919]: I0310 22:14:47.637764 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6da154ef-0415-4972-8282-cf5161c8fa71-scripts\") pod \"nova-cell1-cell-mapping-x2nwk\" (UID: \"6da154ef-0415-4972-8282-cf5161c8fa71\") " pod="openstack/nova-cell1-cell-mapping-x2nwk" Mar 10 22:14:47 crc kubenswrapper[4919]: I0310 22:14:47.637936 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6da154ef-0415-4972-8282-cf5161c8fa71-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-x2nwk\" (UID: \"6da154ef-0415-4972-8282-cf5161c8fa71\") " pod="openstack/nova-cell1-cell-mapping-x2nwk" Mar 10 22:14:47 crc kubenswrapper[4919]: I0310 22:14:47.638010 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/6da154ef-0415-4972-8282-cf5161c8fa71-config-data\") pod \"nova-cell1-cell-mapping-x2nwk\" (UID: \"6da154ef-0415-4972-8282-cf5161c8fa71\") " pod="openstack/nova-cell1-cell-mapping-x2nwk" Mar 10 22:14:47 crc kubenswrapper[4919]: I0310 22:14:47.739353 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6da154ef-0415-4972-8282-cf5161c8fa71-config-data\") pod \"nova-cell1-cell-mapping-x2nwk\" (UID: \"6da154ef-0415-4972-8282-cf5161c8fa71\") " pod="openstack/nova-cell1-cell-mapping-x2nwk" Mar 10 22:14:47 crc kubenswrapper[4919]: I0310 22:14:47.739541 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5xdn9\" (UniqueName: \"kubernetes.io/projected/6da154ef-0415-4972-8282-cf5161c8fa71-kube-api-access-5xdn9\") pod \"nova-cell1-cell-mapping-x2nwk\" (UID: \"6da154ef-0415-4972-8282-cf5161c8fa71\") " pod="openstack/nova-cell1-cell-mapping-x2nwk" Mar 10 22:14:47 crc kubenswrapper[4919]: I0310 22:14:47.739613 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6da154ef-0415-4972-8282-cf5161c8fa71-scripts\") pod \"nova-cell1-cell-mapping-x2nwk\" (UID: \"6da154ef-0415-4972-8282-cf5161c8fa71\") " pod="openstack/nova-cell1-cell-mapping-x2nwk" Mar 10 22:14:47 crc kubenswrapper[4919]: I0310 22:14:47.739652 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6da154ef-0415-4972-8282-cf5161c8fa71-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-x2nwk\" (UID: \"6da154ef-0415-4972-8282-cf5161c8fa71\") " pod="openstack/nova-cell1-cell-mapping-x2nwk" Mar 10 22:14:47 crc kubenswrapper[4919]: I0310 22:14:47.743109 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/6da154ef-0415-4972-8282-cf5161c8fa71-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-x2nwk\" (UID: \"6da154ef-0415-4972-8282-cf5161c8fa71\") " pod="openstack/nova-cell1-cell-mapping-x2nwk" Mar 10 22:14:47 crc kubenswrapper[4919]: I0310 22:14:47.743141 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6da154ef-0415-4972-8282-cf5161c8fa71-config-data\") pod \"nova-cell1-cell-mapping-x2nwk\" (UID: \"6da154ef-0415-4972-8282-cf5161c8fa71\") " pod="openstack/nova-cell1-cell-mapping-x2nwk" Mar 10 22:14:47 crc kubenswrapper[4919]: I0310 22:14:47.743519 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6da154ef-0415-4972-8282-cf5161c8fa71-scripts\") pod \"nova-cell1-cell-mapping-x2nwk\" (UID: \"6da154ef-0415-4972-8282-cf5161c8fa71\") " pod="openstack/nova-cell1-cell-mapping-x2nwk" Mar 10 22:14:47 crc kubenswrapper[4919]: I0310 22:14:47.764817 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xdn9\" (UniqueName: \"kubernetes.io/projected/6da154ef-0415-4972-8282-cf5161c8fa71-kube-api-access-5xdn9\") pod \"nova-cell1-cell-mapping-x2nwk\" (UID: \"6da154ef-0415-4972-8282-cf5161c8fa71\") " pod="openstack/nova-cell1-cell-mapping-x2nwk" Mar 10 22:14:47 crc kubenswrapper[4919]: I0310 22:14:47.832654 4919 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="fb1090fb-ea87-4cce-a1fb-05cb31306efa" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.204:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 10 22:14:47 crc kubenswrapper[4919]: I0310 22:14:47.832662 4919 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="fb1090fb-ea87-4cce-a1fb-05cb31306efa" containerName="nova-metadata-metadata" probeResult="failure" output="Get 
\"https://10.217.0.204:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 10 22:14:47 crc kubenswrapper[4919]: I0310 22:14:47.975381 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-x2nwk" Mar 10 22:14:48 crc kubenswrapper[4919]: I0310 22:14:48.267365 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1f324194-64d5-4755-847b-f554b94e652c","Type":"ContainerStarted","Data":"4f27c36666ba7ecf2d24cedee59efe2a08c7b4e6c86f4fe4198918504c6bf578"} Mar 10 22:14:48 crc kubenswrapper[4919]: I0310 22:14:48.270008 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"45cb0e95-6f81-41a4-ae3a-a0e499ba1973","Type":"ContainerStarted","Data":"c5a3f22ca744df1885bd34ee7c6f2cac53d401216e9e850512a048e2e94b6213"} Mar 10 22:14:48 crc kubenswrapper[4919]: I0310 22:14:48.270056 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"45cb0e95-6f81-41a4-ae3a-a0e499ba1973","Type":"ContainerStarted","Data":"4ffce0327144e9e04de688766d0cea8ab705d60996201f837692241fce940e48"} Mar 10 22:14:48 crc kubenswrapper[4919]: I0310 22:14:48.297187 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.297171378 podStartE2EDuration="2.297171378s" podCreationTimestamp="2026-03-10 22:14:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 22:14:48.29687765 +0000 UTC m=+1475.538758258" watchObservedRunningTime="2026-03-10 22:14:48.297171378 +0000 UTC m=+1475.539051986" Mar 10 22:14:48 crc kubenswrapper[4919]: I0310 22:14:48.436382 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-x2nwk"] Mar 10 22:14:48 crc kubenswrapper[4919]: W0310 22:14:48.466865 4919 manager.go:1169] Failed to 
process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6da154ef_0415_4972_8282_cf5161c8fa71.slice/crio-109c3cd554496e84cba4bf9eebc8a8592736092999ff3296aadf983dd0ff447a WatchSource:0}: Error finding container 109c3cd554496e84cba4bf9eebc8a8592736092999ff3296aadf983dd0ff447a: Status 404 returned error can't find the container with id 109c3cd554496e84cba4bf9eebc8a8592736092999ff3296aadf983dd0ff447a Mar 10 22:14:49 crc kubenswrapper[4919]: I0310 22:14:49.282521 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-x2nwk" event={"ID":"6da154ef-0415-4972-8282-cf5161c8fa71","Type":"ContainerStarted","Data":"7f26055d95c56893f487c89ea638cb9eb78e36cad1498490bf8ab54a3e7bac8c"} Mar 10 22:14:49 crc kubenswrapper[4919]: I0310 22:14:49.282852 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-x2nwk" event={"ID":"6da154ef-0415-4972-8282-cf5161c8fa71","Type":"ContainerStarted","Data":"109c3cd554496e84cba4bf9eebc8a8592736092999ff3296aadf983dd0ff447a"} Mar 10 22:14:49 crc kubenswrapper[4919]: I0310 22:14:49.286193 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1f324194-64d5-4755-847b-f554b94e652c","Type":"ContainerStarted","Data":"c84f2ef693d15394b292c9104df5d25964c0ba501f0d8a5f3f3a4710e2c7b5e1"} Mar 10 22:14:49 crc kubenswrapper[4919]: I0310 22:14:49.309593 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-x2nwk" podStartSLOduration=2.309567877 podStartE2EDuration="2.309567877s" podCreationTimestamp="2026-03-10 22:14:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 22:14:49.299930686 +0000 UTC m=+1476.541811294" watchObservedRunningTime="2026-03-10 22:14:49.309567877 +0000 UTC m=+1476.551448515" Mar 10 22:14:49 crc kubenswrapper[4919]: I0310 
22:14:49.635763 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-9dd56c4d5-nbpbz" Mar 10 22:14:49 crc kubenswrapper[4919]: I0310 22:14:49.711592 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-97cdf8549-f9vxh"] Mar 10 22:14:49 crc kubenswrapper[4919]: I0310 22:14:49.711870 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-97cdf8549-f9vxh" podUID="c696db8f-44d0-42ad-aa56-5e889eef767f" containerName="dnsmasq-dns" containerID="cri-o://79b5348b47198913c64dcba35b3e10b6d963ff43c6b3676ac87eda8bf5f41849" gracePeriod=10 Mar 10 22:14:50 crc kubenswrapper[4919]: I0310 22:14:50.213216 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-97cdf8549-f9vxh" Mar 10 22:14:50 crc kubenswrapper[4919]: I0310 22:14:50.295235 4919 generic.go:334] "Generic (PLEG): container finished" podID="c696db8f-44d0-42ad-aa56-5e889eef767f" containerID="79b5348b47198913c64dcba35b3e10b6d963ff43c6b3676ac87eda8bf5f41849" exitCode=0 Mar 10 22:14:50 crc kubenswrapper[4919]: I0310 22:14:50.295382 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-97cdf8549-f9vxh" Mar 10 22:14:50 crc kubenswrapper[4919]: I0310 22:14:50.296502 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-97cdf8549-f9vxh" event={"ID":"c696db8f-44d0-42ad-aa56-5e889eef767f","Type":"ContainerDied","Data":"79b5348b47198913c64dcba35b3e10b6d963ff43c6b3676ac87eda8bf5f41849"} Mar 10 22:14:50 crc kubenswrapper[4919]: I0310 22:14:50.296540 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-97cdf8549-f9vxh" event={"ID":"c696db8f-44d0-42ad-aa56-5e889eef767f","Type":"ContainerDied","Data":"2318e4f3647fd69ecdd613964e273d02069cbc4fb038b1136fbad533fe7da705"} Mar 10 22:14:50 crc kubenswrapper[4919]: I0310 22:14:50.296563 4919 scope.go:117] "RemoveContainer" containerID="79b5348b47198913c64dcba35b3e10b6d963ff43c6b3676ac87eda8bf5f41849" Mar 10 22:14:50 crc kubenswrapper[4919]: I0310 22:14:50.301766 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c696db8f-44d0-42ad-aa56-5e889eef767f-dns-svc\") pod \"c696db8f-44d0-42ad-aa56-5e889eef767f\" (UID: \"c696db8f-44d0-42ad-aa56-5e889eef767f\") " Mar 10 22:14:50 crc kubenswrapper[4919]: I0310 22:14:50.301938 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c696db8f-44d0-42ad-aa56-5e889eef767f-config\") pod \"c696db8f-44d0-42ad-aa56-5e889eef767f\" (UID: \"c696db8f-44d0-42ad-aa56-5e889eef767f\") " Mar 10 22:14:50 crc kubenswrapper[4919]: I0310 22:14:50.301976 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c696db8f-44d0-42ad-aa56-5e889eef767f-ovsdbserver-sb\") pod \"c696db8f-44d0-42ad-aa56-5e889eef767f\" (UID: \"c696db8f-44d0-42ad-aa56-5e889eef767f\") " Mar 10 22:14:50 crc kubenswrapper[4919]: I0310 22:14:50.302030 4919 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c696db8f-44d0-42ad-aa56-5e889eef767f-ovsdbserver-nb\") pod \"c696db8f-44d0-42ad-aa56-5e889eef767f\" (UID: \"c696db8f-44d0-42ad-aa56-5e889eef767f\") "
Mar 10 22:14:50 crc kubenswrapper[4919]: I0310 22:14:50.302062 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5qw4h\" (UniqueName: \"kubernetes.io/projected/c696db8f-44d0-42ad-aa56-5e889eef767f-kube-api-access-5qw4h\") pod \"c696db8f-44d0-42ad-aa56-5e889eef767f\" (UID: \"c696db8f-44d0-42ad-aa56-5e889eef767f\") "
Mar 10 22:14:50 crc kubenswrapper[4919]: I0310 22:14:50.302117 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c696db8f-44d0-42ad-aa56-5e889eef767f-dns-swift-storage-0\") pod \"c696db8f-44d0-42ad-aa56-5e889eef767f\" (UID: \"c696db8f-44d0-42ad-aa56-5e889eef767f\") "
Mar 10 22:14:50 crc kubenswrapper[4919]: I0310 22:14:50.313564 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c696db8f-44d0-42ad-aa56-5e889eef767f-kube-api-access-5qw4h" (OuterVolumeSpecName: "kube-api-access-5qw4h") pod "c696db8f-44d0-42ad-aa56-5e889eef767f" (UID: "c696db8f-44d0-42ad-aa56-5e889eef767f"). InnerVolumeSpecName "kube-api-access-5qw4h". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 22:14:50 crc kubenswrapper[4919]: I0310 22:14:50.327127 4919 scope.go:117] "RemoveContainer" containerID="c44074a20c53eb2ed6f9c4abf0f3fd5c15a98589633492a62792d7ca376ffb2b"
Mar 10 22:14:50 crc kubenswrapper[4919]: I0310 22:14:50.362151 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c696db8f-44d0-42ad-aa56-5e889eef767f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c696db8f-44d0-42ad-aa56-5e889eef767f" (UID: "c696db8f-44d0-42ad-aa56-5e889eef767f"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 22:14:50 crc kubenswrapper[4919]: I0310 22:14:50.367918 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c696db8f-44d0-42ad-aa56-5e889eef767f-config" (OuterVolumeSpecName: "config") pod "c696db8f-44d0-42ad-aa56-5e889eef767f" (UID: "c696db8f-44d0-42ad-aa56-5e889eef767f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 22:14:50 crc kubenswrapper[4919]: I0310 22:14:50.403911 4919 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c696db8f-44d0-42ad-aa56-5e889eef767f-config\") on node \"crc\" DevicePath \"\""
Mar 10 22:14:50 crc kubenswrapper[4919]: I0310 22:14:50.403936 4919 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c696db8f-44d0-42ad-aa56-5e889eef767f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Mar 10 22:14:50 crc kubenswrapper[4919]: I0310 22:14:50.403946 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5qw4h\" (UniqueName: \"kubernetes.io/projected/c696db8f-44d0-42ad-aa56-5e889eef767f-kube-api-access-5qw4h\") on node \"crc\" DevicePath \"\""
Mar 10 22:14:50 crc kubenswrapper[4919]: I0310 22:14:50.404243 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c696db8f-44d0-42ad-aa56-5e889eef767f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c696db8f-44d0-42ad-aa56-5e889eef767f" (UID: "c696db8f-44d0-42ad-aa56-5e889eef767f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 22:14:50 crc kubenswrapper[4919]: I0310 22:14:50.420671 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c696db8f-44d0-42ad-aa56-5e889eef767f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c696db8f-44d0-42ad-aa56-5e889eef767f" (UID: "c696db8f-44d0-42ad-aa56-5e889eef767f"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 22:14:50 crc kubenswrapper[4919]: I0310 22:14:50.422945 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c696db8f-44d0-42ad-aa56-5e889eef767f-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "c696db8f-44d0-42ad-aa56-5e889eef767f" (UID: "c696db8f-44d0-42ad-aa56-5e889eef767f"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 22:14:50 crc kubenswrapper[4919]: I0310 22:14:50.440535 4919 scope.go:117] "RemoveContainer" containerID="79b5348b47198913c64dcba35b3e10b6d963ff43c6b3676ac87eda8bf5f41849"
Mar 10 22:14:50 crc kubenswrapper[4919]: E0310 22:14:50.440997 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"79b5348b47198913c64dcba35b3e10b6d963ff43c6b3676ac87eda8bf5f41849\": container with ID starting with 79b5348b47198913c64dcba35b3e10b6d963ff43c6b3676ac87eda8bf5f41849 not found: ID does not exist" containerID="79b5348b47198913c64dcba35b3e10b6d963ff43c6b3676ac87eda8bf5f41849"
Mar 10 22:14:50 crc kubenswrapper[4919]: I0310 22:14:50.441030 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79b5348b47198913c64dcba35b3e10b6d963ff43c6b3676ac87eda8bf5f41849"} err="failed to get container status \"79b5348b47198913c64dcba35b3e10b6d963ff43c6b3676ac87eda8bf5f41849\": rpc error: code = NotFound desc = could not find container \"79b5348b47198913c64dcba35b3e10b6d963ff43c6b3676ac87eda8bf5f41849\": container with ID starting with 79b5348b47198913c64dcba35b3e10b6d963ff43c6b3676ac87eda8bf5f41849 not found: ID does not exist"
Mar 10 22:14:50 crc kubenswrapper[4919]: I0310 22:14:50.441141 4919 scope.go:117] "RemoveContainer" containerID="c44074a20c53eb2ed6f9c4abf0f3fd5c15a98589633492a62792d7ca376ffb2b"
Mar 10 22:14:50 crc kubenswrapper[4919]: E0310 22:14:50.441433 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c44074a20c53eb2ed6f9c4abf0f3fd5c15a98589633492a62792d7ca376ffb2b\": container with ID starting with c44074a20c53eb2ed6f9c4abf0f3fd5c15a98589633492a62792d7ca376ffb2b not found: ID does not exist" containerID="c44074a20c53eb2ed6f9c4abf0f3fd5c15a98589633492a62792d7ca376ffb2b"
Mar 10 22:14:50 crc kubenswrapper[4919]: I0310 22:14:50.441456 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c44074a20c53eb2ed6f9c4abf0f3fd5c15a98589633492a62792d7ca376ffb2b"} err="failed to get container status \"c44074a20c53eb2ed6f9c4abf0f3fd5c15a98589633492a62792d7ca376ffb2b\": rpc error: code = NotFound desc = could not find container \"c44074a20c53eb2ed6f9c4abf0f3fd5c15a98589633492a62792d7ca376ffb2b\": container with ID starting with c44074a20c53eb2ed6f9c4abf0f3fd5c15a98589633492a62792d7ca376ffb2b not found: ID does not exist"
Mar 10 22:14:50 crc kubenswrapper[4919]: I0310 22:14:50.505197 4919 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c696db8f-44d0-42ad-aa56-5e889eef767f-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 10 22:14:50 crc kubenswrapper[4919]: I0310 22:14:50.505471 4919 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c696db8f-44d0-42ad-aa56-5e889eef767f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Mar 10 22:14:50 crc kubenswrapper[4919]: I0310 22:14:50.505487 4919 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c696db8f-44d0-42ad-aa56-5e889eef767f-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Mar 10 22:14:50 crc kubenswrapper[4919]: I0310 22:14:50.642151 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-97cdf8549-f9vxh"]
Mar 10 22:14:50 crc kubenswrapper[4919]: I0310 22:14:50.654729 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-97cdf8549-f9vxh"]
Mar 10 22:14:51 crc kubenswrapper[4919]: I0310 22:14:51.307353 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1f324194-64d5-4755-847b-f554b94e652c","Type":"ContainerStarted","Data":"8dd9c6db1ef3f3090c173c377cd48fad2c1b903961ef2c219973ba650ae92aa3"}
Mar 10 22:14:51 crc kubenswrapper[4919]: I0310 22:14:51.308622 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Mar 10 22:14:51 crc kubenswrapper[4919]: I0310 22:14:51.343649 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.083096345 podStartE2EDuration="6.343624208s" podCreationTimestamp="2026-03-10 22:14:45 +0000 UTC" firstStartedPulling="2026-03-10 22:14:46.233213958 +0000 UTC m=+1473.475094566" lastFinishedPulling="2026-03-10 22:14:50.493741821 +0000 UTC m=+1477.735622429" observedRunningTime="2026-03-10 22:14:51.335050156 +0000 UTC m=+1478.576930794" watchObservedRunningTime="2026-03-10 22:14:51.343624208 +0000 UTC m=+1478.585504816"
Mar 10 22:14:51 crc kubenswrapper[4919]: I0310 22:14:51.490783 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c696db8f-44d0-42ad-aa56-5e889eef767f" path="/var/lib/kubelet/pods/c696db8f-44d0-42ad-aa56-5e889eef767f/volumes"
Mar 10 22:14:54 crc kubenswrapper[4919]: I0310 22:14:54.335641 4919 generic.go:334] "Generic (PLEG): container finished" podID="6da154ef-0415-4972-8282-cf5161c8fa71" containerID="7f26055d95c56893f487c89ea638cb9eb78e36cad1498490bf8ab54a3e7bac8c" exitCode=0
Mar 10 22:14:54 crc kubenswrapper[4919]: I0310 22:14:54.335692 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-x2nwk" event={"ID":"6da154ef-0415-4972-8282-cf5161c8fa71","Type":"ContainerDied","Data":"7f26055d95c56893f487c89ea638cb9eb78e36cad1498490bf8ab54a3e7bac8c"}
Mar 10 22:14:55 crc kubenswrapper[4919]: I0310 22:14:55.194623 4919 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-97cdf8549-f9vxh" podUID="c696db8f-44d0-42ad-aa56-5e889eef767f" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.195:5353: i/o timeout"
Mar 10 22:14:55 crc kubenswrapper[4919]: I0310 22:14:55.680182 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-x2nwk"
Mar 10 22:14:55 crc kubenswrapper[4919]: I0310 22:14:55.744937 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6da154ef-0415-4972-8282-cf5161c8fa71-scripts\") pod \"6da154ef-0415-4972-8282-cf5161c8fa71\" (UID: \"6da154ef-0415-4972-8282-cf5161c8fa71\") "
Mar 10 22:14:55 crc kubenswrapper[4919]: I0310 22:14:55.745026 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6da154ef-0415-4972-8282-cf5161c8fa71-config-data\") pod \"6da154ef-0415-4972-8282-cf5161c8fa71\" (UID: \"6da154ef-0415-4972-8282-cf5161c8fa71\") "
Mar 10 22:14:55 crc kubenswrapper[4919]: I0310 22:14:55.745125 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6da154ef-0415-4972-8282-cf5161c8fa71-combined-ca-bundle\") pod \"6da154ef-0415-4972-8282-cf5161c8fa71\" (UID: \"6da154ef-0415-4972-8282-cf5161c8fa71\") "
Mar 10 22:14:55 crc kubenswrapper[4919]: I0310 22:14:55.745469 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5xdn9\" (UniqueName: \"kubernetes.io/projected/6da154ef-0415-4972-8282-cf5161c8fa71-kube-api-access-5xdn9\") pod \"6da154ef-0415-4972-8282-cf5161c8fa71\" (UID: \"6da154ef-0415-4972-8282-cf5161c8fa71\") "
Mar 10 22:14:55 crc kubenswrapper[4919]: I0310 22:14:55.753242 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6da154ef-0415-4972-8282-cf5161c8fa71-kube-api-access-5xdn9" (OuterVolumeSpecName: "kube-api-access-5xdn9") pod "6da154ef-0415-4972-8282-cf5161c8fa71" (UID: "6da154ef-0415-4972-8282-cf5161c8fa71"). InnerVolumeSpecName "kube-api-access-5xdn9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 22:14:55 crc kubenswrapper[4919]: I0310 22:14:55.762598 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6da154ef-0415-4972-8282-cf5161c8fa71-scripts" (OuterVolumeSpecName: "scripts") pod "6da154ef-0415-4972-8282-cf5161c8fa71" (UID: "6da154ef-0415-4972-8282-cf5161c8fa71"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 22:14:55 crc kubenswrapper[4919]: I0310 22:14:55.780282 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6da154ef-0415-4972-8282-cf5161c8fa71-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6da154ef-0415-4972-8282-cf5161c8fa71" (UID: "6da154ef-0415-4972-8282-cf5161c8fa71"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 22:14:55 crc kubenswrapper[4919]: I0310 22:14:55.796757 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6da154ef-0415-4972-8282-cf5161c8fa71-config-data" (OuterVolumeSpecName: "config-data") pod "6da154ef-0415-4972-8282-cf5161c8fa71" (UID: "6da154ef-0415-4972-8282-cf5161c8fa71"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 22:14:55 crc kubenswrapper[4919]: I0310 22:14:55.847752 4919 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6da154ef-0415-4972-8282-cf5161c8fa71-scripts\") on node \"crc\" DevicePath \"\""
Mar 10 22:14:55 crc kubenswrapper[4919]: I0310 22:14:55.847784 4919 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6da154ef-0415-4972-8282-cf5161c8fa71-config-data\") on node \"crc\" DevicePath \"\""
Mar 10 22:14:55 crc kubenswrapper[4919]: I0310 22:14:55.847795 4919 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6da154ef-0415-4972-8282-cf5161c8fa71-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 10 22:14:55 crc kubenswrapper[4919]: I0310 22:14:55.847806 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5xdn9\" (UniqueName: \"kubernetes.io/projected/6da154ef-0415-4972-8282-cf5161c8fa71-kube-api-access-5xdn9\") on node \"crc\" DevicePath \"\""
Mar 10 22:14:56 crc kubenswrapper[4919]: I0310 22:14:56.368293 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-x2nwk" event={"ID":"6da154ef-0415-4972-8282-cf5161c8fa71","Type":"ContainerDied","Data":"109c3cd554496e84cba4bf9eebc8a8592736092999ff3296aadf983dd0ff447a"}
Mar 10 22:14:56 crc kubenswrapper[4919]: I0310 22:14:56.368596 4919 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="109c3cd554496e84cba4bf9eebc8a8592736092999ff3296aadf983dd0ff447a"
Mar 10 22:14:56 crc kubenswrapper[4919]: I0310 22:14:56.368381 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-x2nwk"
Mar 10 22:14:56 crc kubenswrapper[4919]: E0310 22:14:56.413219 4919 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6da154ef_0415_4972_8282_cf5161c8fa71.slice\": RecentStats: unable to find data in memory cache]"
Mar 10 22:14:56 crc kubenswrapper[4919]: I0310 22:14:56.533973 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Mar 10 22:14:56 crc kubenswrapper[4919]: I0310 22:14:56.534432 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="45cb0e95-6f81-41a4-ae3a-a0e499ba1973" containerName="nova-api-log" containerID="cri-o://4ffce0327144e9e04de688766d0cea8ab705d60996201f837692241fce940e48" gracePeriod=30
Mar 10 22:14:56 crc kubenswrapper[4919]: I0310 22:14:56.534469 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="45cb0e95-6f81-41a4-ae3a-a0e499ba1973" containerName="nova-api-api" containerID="cri-o://c5a3f22ca744df1885bd34ee7c6f2cac53d401216e9e850512a048e2e94b6213" gracePeriod=30
Mar 10 22:14:56 crc kubenswrapper[4919]: I0310 22:14:56.546264 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 10 22:14:56 crc kubenswrapper[4919]: I0310 22:14:56.546525 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="64c699c8-f2b1-421d-b617-ec6b7b7c806b" containerName="nova-scheduler-scheduler" containerID="cri-o://55b9f720f76d4e27cdd8fce568c10b873fab76107f554ff240b1936a023cca7d" gracePeriod=30
Mar 10 22:14:56 crc kubenswrapper[4919]: I0310 22:14:56.616796 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Mar 10 22:14:56 crc kubenswrapper[4919]: I0310 22:14:56.617078 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="fb1090fb-ea87-4cce-a1fb-05cb31306efa" containerName="nova-metadata-log" containerID="cri-o://ac33dde6fb8f040b19539d6229a740b8950b17142b3fefd31965a23cb99759ce" gracePeriod=30
Mar 10 22:14:56 crc kubenswrapper[4919]: I0310 22:14:56.617535 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="fb1090fb-ea87-4cce-a1fb-05cb31306efa" containerName="nova-metadata-metadata" containerID="cri-o://6136b4ecac1972d46f866529492a1786be912fb3fc39d166d85249af6e669d98" gracePeriod=30
Mar 10 22:14:57 crc kubenswrapper[4919]: I0310 22:14:57.110973 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 10 22:14:57 crc kubenswrapper[4919]: I0310 22:14:57.170993 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/45cb0e95-6f81-41a4-ae3a-a0e499ba1973-internal-tls-certs\") pod \"45cb0e95-6f81-41a4-ae3a-a0e499ba1973\" (UID: \"45cb0e95-6f81-41a4-ae3a-a0e499ba1973\") "
Mar 10 22:14:57 crc kubenswrapper[4919]: I0310 22:14:57.171119 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2r689\" (UniqueName: \"kubernetes.io/projected/45cb0e95-6f81-41a4-ae3a-a0e499ba1973-kube-api-access-2r689\") pod \"45cb0e95-6f81-41a4-ae3a-a0e499ba1973\" (UID: \"45cb0e95-6f81-41a4-ae3a-a0e499ba1973\") "
Mar 10 22:14:57 crc kubenswrapper[4919]: I0310 22:14:57.171147 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45cb0e95-6f81-41a4-ae3a-a0e499ba1973-combined-ca-bundle\") pod \"45cb0e95-6f81-41a4-ae3a-a0e499ba1973\" (UID: \"45cb0e95-6f81-41a4-ae3a-a0e499ba1973\") "
Mar 10 22:14:57 crc kubenswrapper[4919]: I0310 22:14:57.171341 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/45cb0e95-6f81-41a4-ae3a-a0e499ba1973-logs\") pod \"45cb0e95-6f81-41a4-ae3a-a0e499ba1973\" (UID: \"45cb0e95-6f81-41a4-ae3a-a0e499ba1973\") "
Mar 10 22:14:57 crc kubenswrapper[4919]: I0310 22:14:57.171508 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/45cb0e95-6f81-41a4-ae3a-a0e499ba1973-public-tls-certs\") pod \"45cb0e95-6f81-41a4-ae3a-a0e499ba1973\" (UID: \"45cb0e95-6f81-41a4-ae3a-a0e499ba1973\") "
Mar 10 22:14:57 crc kubenswrapper[4919]: I0310 22:14:57.171573 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45cb0e95-6f81-41a4-ae3a-a0e499ba1973-config-data\") pod \"45cb0e95-6f81-41a4-ae3a-a0e499ba1973\" (UID: \"45cb0e95-6f81-41a4-ae3a-a0e499ba1973\") "
Mar 10 22:14:57 crc kubenswrapper[4919]: I0310 22:14:57.171854 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/45cb0e95-6f81-41a4-ae3a-a0e499ba1973-logs" (OuterVolumeSpecName: "logs") pod "45cb0e95-6f81-41a4-ae3a-a0e499ba1973" (UID: "45cb0e95-6f81-41a4-ae3a-a0e499ba1973"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 22:14:57 crc kubenswrapper[4919]: I0310 22:14:57.172107 4919 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/45cb0e95-6f81-41a4-ae3a-a0e499ba1973-logs\") on node \"crc\" DevicePath \"\""
Mar 10 22:14:57 crc kubenswrapper[4919]: I0310 22:14:57.179631 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45cb0e95-6f81-41a4-ae3a-a0e499ba1973-kube-api-access-2r689" (OuterVolumeSpecName: "kube-api-access-2r689") pod "45cb0e95-6f81-41a4-ae3a-a0e499ba1973" (UID: "45cb0e95-6f81-41a4-ae3a-a0e499ba1973"). InnerVolumeSpecName "kube-api-access-2r689". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 22:14:57 crc kubenswrapper[4919]: I0310 22:14:57.198719 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45cb0e95-6f81-41a4-ae3a-a0e499ba1973-config-data" (OuterVolumeSpecName: "config-data") pod "45cb0e95-6f81-41a4-ae3a-a0e499ba1973" (UID: "45cb0e95-6f81-41a4-ae3a-a0e499ba1973"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 22:14:57 crc kubenswrapper[4919]: I0310 22:14:57.227368 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45cb0e95-6f81-41a4-ae3a-a0e499ba1973-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "45cb0e95-6f81-41a4-ae3a-a0e499ba1973" (UID: "45cb0e95-6f81-41a4-ae3a-a0e499ba1973"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 22:14:57 crc kubenswrapper[4919]: I0310 22:14:57.228578 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45cb0e95-6f81-41a4-ae3a-a0e499ba1973-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "45cb0e95-6f81-41a4-ae3a-a0e499ba1973" (UID: "45cb0e95-6f81-41a4-ae3a-a0e499ba1973"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 22:14:57 crc kubenswrapper[4919]: I0310 22:14:57.237003 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45cb0e95-6f81-41a4-ae3a-a0e499ba1973-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "45cb0e95-6f81-41a4-ae3a-a0e499ba1973" (UID: "45cb0e95-6f81-41a4-ae3a-a0e499ba1973"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 22:14:57 crc kubenswrapper[4919]: I0310 22:14:57.273502 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2r689\" (UniqueName: \"kubernetes.io/projected/45cb0e95-6f81-41a4-ae3a-a0e499ba1973-kube-api-access-2r689\") on node \"crc\" DevicePath \"\""
Mar 10 22:14:57 crc kubenswrapper[4919]: I0310 22:14:57.273531 4919 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45cb0e95-6f81-41a4-ae3a-a0e499ba1973-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 10 22:14:57 crc kubenswrapper[4919]: I0310 22:14:57.273540 4919 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/45cb0e95-6f81-41a4-ae3a-a0e499ba1973-public-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 10 22:14:57 crc kubenswrapper[4919]: I0310 22:14:57.273549 4919 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45cb0e95-6f81-41a4-ae3a-a0e499ba1973-config-data\") on node \"crc\" DevicePath \"\""
Mar 10 22:14:57 crc kubenswrapper[4919]: I0310 22:14:57.273558 4919 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/45cb0e95-6f81-41a4-ae3a-a0e499ba1973-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 10 22:14:57 crc kubenswrapper[4919]: I0310 22:14:57.379090 4919 generic.go:334] "Generic (PLEG): container finished" podID="fb1090fb-ea87-4cce-a1fb-05cb31306efa" containerID="ac33dde6fb8f040b19539d6229a740b8950b17142b3fefd31965a23cb99759ce" exitCode=143
Mar 10 22:14:57 crc kubenswrapper[4919]: I0310 22:14:57.379155 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"fb1090fb-ea87-4cce-a1fb-05cb31306efa","Type":"ContainerDied","Data":"ac33dde6fb8f040b19539d6229a740b8950b17142b3fefd31965a23cb99759ce"}
Mar 10 22:14:57 crc kubenswrapper[4919]: I0310 22:14:57.382588 4919 generic.go:334] "Generic (PLEG): container finished" podID="45cb0e95-6f81-41a4-ae3a-a0e499ba1973" containerID="c5a3f22ca744df1885bd34ee7c6f2cac53d401216e9e850512a048e2e94b6213" exitCode=0
Mar 10 22:14:57 crc kubenswrapper[4919]: I0310 22:14:57.382622 4919 generic.go:334] "Generic (PLEG): container finished" podID="45cb0e95-6f81-41a4-ae3a-a0e499ba1973" containerID="4ffce0327144e9e04de688766d0cea8ab705d60996201f837692241fce940e48" exitCode=143
Mar 10 22:14:57 crc kubenswrapper[4919]: I0310 22:14:57.382661 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"45cb0e95-6f81-41a4-ae3a-a0e499ba1973","Type":"ContainerDied","Data":"c5a3f22ca744df1885bd34ee7c6f2cac53d401216e9e850512a048e2e94b6213"}
Mar 10 22:14:57 crc kubenswrapper[4919]: I0310 22:14:57.382689 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"45cb0e95-6f81-41a4-ae3a-a0e499ba1973","Type":"ContainerDied","Data":"4ffce0327144e9e04de688766d0cea8ab705d60996201f837692241fce940e48"}
Mar 10 22:14:57 crc kubenswrapper[4919]: I0310 22:14:57.382699 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"45cb0e95-6f81-41a4-ae3a-a0e499ba1973","Type":"ContainerDied","Data":"38a21be2ddb15576d8f97a4464784b69b0f4bfba9238f897215fb817ddd9d50e"}
Mar 10 22:14:57 crc kubenswrapper[4919]: I0310 22:14:57.382713 4919 scope.go:117] "RemoveContainer" containerID="c5a3f22ca744df1885bd34ee7c6f2cac53d401216e9e850512a048e2e94b6213"
Mar 10 22:14:57 crc kubenswrapper[4919]: I0310 22:14:57.382854 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 10 22:14:57 crc kubenswrapper[4919]: I0310 22:14:57.390555 4919 generic.go:334] "Generic (PLEG): container finished" podID="64c699c8-f2b1-421d-b617-ec6b7b7c806b" containerID="55b9f720f76d4e27cdd8fce568c10b873fab76107f554ff240b1936a023cca7d" exitCode=0
Mar 10 22:14:57 crc kubenswrapper[4919]: I0310 22:14:57.390604 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"64c699c8-f2b1-421d-b617-ec6b7b7c806b","Type":"ContainerDied","Data":"55b9f720f76d4e27cdd8fce568c10b873fab76107f554ff240b1936a023cca7d"}
Mar 10 22:14:57 crc kubenswrapper[4919]: I0310 22:14:57.413674 4919 scope.go:117] "RemoveContainer" containerID="4ffce0327144e9e04de688766d0cea8ab705d60996201f837692241fce940e48"
Mar 10 22:14:57 crc kubenswrapper[4919]: I0310 22:14:57.438701 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Mar 10 22:14:57 crc kubenswrapper[4919]: I0310 22:14:57.447267 4919 scope.go:117] "RemoveContainer" containerID="c5a3f22ca744df1885bd34ee7c6f2cac53d401216e9e850512a048e2e94b6213"
Mar 10 22:14:57 crc kubenswrapper[4919]: E0310 22:14:57.447784 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c5a3f22ca744df1885bd34ee7c6f2cac53d401216e9e850512a048e2e94b6213\": container with ID starting with c5a3f22ca744df1885bd34ee7c6f2cac53d401216e9e850512a048e2e94b6213 not found: ID does not exist" containerID="c5a3f22ca744df1885bd34ee7c6f2cac53d401216e9e850512a048e2e94b6213"
Mar 10 22:14:57 crc kubenswrapper[4919]: I0310 22:14:57.447825 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5a3f22ca744df1885bd34ee7c6f2cac53d401216e9e850512a048e2e94b6213"} err="failed to get container status \"c5a3f22ca744df1885bd34ee7c6f2cac53d401216e9e850512a048e2e94b6213\": rpc error: code = NotFound desc = could not find container \"c5a3f22ca744df1885bd34ee7c6f2cac53d401216e9e850512a048e2e94b6213\": container with ID starting with c5a3f22ca744df1885bd34ee7c6f2cac53d401216e9e850512a048e2e94b6213 not found: ID does not exist"
Mar 10 22:14:57 crc kubenswrapper[4919]: I0310 22:14:57.447848 4919 scope.go:117] "RemoveContainer" containerID="4ffce0327144e9e04de688766d0cea8ab705d60996201f837692241fce940e48"
Mar 10 22:14:57 crc kubenswrapper[4919]: E0310 22:14:57.448110 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ffce0327144e9e04de688766d0cea8ab705d60996201f837692241fce940e48\": container with ID starting with 4ffce0327144e9e04de688766d0cea8ab705d60996201f837692241fce940e48 not found: ID does not exist" containerID="4ffce0327144e9e04de688766d0cea8ab705d60996201f837692241fce940e48"
Mar 10 22:14:57 crc kubenswrapper[4919]: I0310 22:14:57.448143 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ffce0327144e9e04de688766d0cea8ab705d60996201f837692241fce940e48"} err="failed to get container status \"4ffce0327144e9e04de688766d0cea8ab705d60996201f837692241fce940e48\": rpc error: code = NotFound desc = could not find container \"4ffce0327144e9e04de688766d0cea8ab705d60996201f837692241fce940e48\": container with ID starting with 4ffce0327144e9e04de688766d0cea8ab705d60996201f837692241fce940e48 not found: ID does not exist"
Mar 10 22:14:57 crc kubenswrapper[4919]: I0310 22:14:57.448160 4919 scope.go:117] "RemoveContainer" containerID="c5a3f22ca744df1885bd34ee7c6f2cac53d401216e9e850512a048e2e94b6213"
Mar 10 22:14:57 crc kubenswrapper[4919]: I0310 22:14:57.448355 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5a3f22ca744df1885bd34ee7c6f2cac53d401216e9e850512a048e2e94b6213"} err="failed to get container status \"c5a3f22ca744df1885bd34ee7c6f2cac53d401216e9e850512a048e2e94b6213\": rpc error: code = NotFound desc = could not find container \"c5a3f22ca744df1885bd34ee7c6f2cac53d401216e9e850512a048e2e94b6213\": container with ID starting with c5a3f22ca744df1885bd34ee7c6f2cac53d401216e9e850512a048e2e94b6213 not found: ID does not exist"
Mar 10 22:14:57 crc kubenswrapper[4919]: I0310 22:14:57.448376 4919 scope.go:117] "RemoveContainer" containerID="4ffce0327144e9e04de688766d0cea8ab705d60996201f837692241fce940e48"
Mar 10 22:14:57 crc kubenswrapper[4919]: I0310 22:14:57.450645 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ffce0327144e9e04de688766d0cea8ab705d60996201f837692241fce940e48"} err="failed to get container status \"4ffce0327144e9e04de688766d0cea8ab705d60996201f837692241fce940e48\": rpc error: code = NotFound desc = could not find container \"4ffce0327144e9e04de688766d0cea8ab705d60996201f837692241fce940e48\": container with ID starting with 4ffce0327144e9e04de688766d0cea8ab705d60996201f837692241fce940e48 not found: ID does not exist"
Mar 10 22:14:57 crc kubenswrapper[4919]: I0310 22:14:57.451340 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Mar 10 22:14:57 crc kubenswrapper[4919]: I0310 22:14:57.468049 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Mar 10 22:14:57 crc kubenswrapper[4919]: E0310 22:14:57.468580 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45cb0e95-6f81-41a4-ae3a-a0e499ba1973" containerName="nova-api-api"
Mar 10 22:14:57 crc kubenswrapper[4919]: I0310 22:14:57.468597 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="45cb0e95-6f81-41a4-ae3a-a0e499ba1973" containerName="nova-api-api"
Mar 10 22:14:57 crc kubenswrapper[4919]: E0310 22:14:57.468622 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c696db8f-44d0-42ad-aa56-5e889eef767f" containerName="init"
Mar 10 22:14:57 crc kubenswrapper[4919]: I0310 22:14:57.468629 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="c696db8f-44d0-42ad-aa56-5e889eef767f" containerName="init"
Mar 10 22:14:57 crc kubenswrapper[4919]: E0310 22:14:57.468639 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45cb0e95-6f81-41a4-ae3a-a0e499ba1973" containerName="nova-api-log"
Mar 10 22:14:57 crc kubenswrapper[4919]: I0310 22:14:57.468648 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="45cb0e95-6f81-41a4-ae3a-a0e499ba1973" containerName="nova-api-log"
Mar 10 22:14:57 crc kubenswrapper[4919]: E0310 22:14:57.468674 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6da154ef-0415-4972-8282-cf5161c8fa71" containerName="nova-manage"
Mar 10 22:14:57 crc kubenswrapper[4919]: I0310 22:14:57.468683 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="6da154ef-0415-4972-8282-cf5161c8fa71" containerName="nova-manage"
Mar 10 22:14:57 crc kubenswrapper[4919]: E0310 22:14:57.468708 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c696db8f-44d0-42ad-aa56-5e889eef767f" containerName="dnsmasq-dns"
Mar 10 22:14:57 crc kubenswrapper[4919]: I0310 22:14:57.468715 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="c696db8f-44d0-42ad-aa56-5e889eef767f" containerName="dnsmasq-dns"
Mar 10 22:14:57 crc kubenswrapper[4919]: I0310 22:14:57.468924 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="45cb0e95-6f81-41a4-ae3a-a0e499ba1973" containerName="nova-api-log"
Mar 10 22:14:57 crc kubenswrapper[4919]: I0310 22:14:57.468939 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="c696db8f-44d0-42ad-aa56-5e889eef767f" containerName="dnsmasq-dns"
Mar 10 22:14:57 crc kubenswrapper[4919]: I0310 22:14:57.468955 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="45cb0e95-6f81-41a4-ae3a-a0e499ba1973" containerName="nova-api-api"
Mar 10 22:14:57 crc kubenswrapper[4919]: I0310 22:14:57.468966 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="6da154ef-0415-4972-8282-cf5161c8fa71" containerName="nova-manage"
Mar 10 22:14:57 crc kubenswrapper[4919]: I0310 22:14:57.470136 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 10 22:14:57 crc kubenswrapper[4919]: I0310 22:14:57.475569 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Mar 10 22:14:57 crc kubenswrapper[4919]: I0310 22:14:57.475714 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc"
Mar 10 22:14:57 crc kubenswrapper[4919]: I0310 22:14:57.475782 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc"
Mar 10 22:14:57 crc kubenswrapper[4919]: I0310 22:14:57.479125 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Mar 10 22:14:57 crc kubenswrapper[4919]: I0310 22:14:57.504450 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45cb0e95-6f81-41a4-ae3a-a0e499ba1973" path="/var/lib/kubelet/pods/45cb0e95-6f81-41a4-ae3a-a0e499ba1973/volumes"
Mar 10 22:14:57 crc kubenswrapper[4919]: I0310 22:14:57.579222 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xzsn\" (UniqueName: \"kubernetes.io/projected/515105ef-e538-4276-b682-7e05881dc7e8-kube-api-access-6xzsn\") pod \"nova-api-0\" (UID: \"515105ef-e538-4276-b682-7e05881dc7e8\") " pod="openstack/nova-api-0"
Mar 10 22:14:57 crc kubenswrapper[4919]: I0310 22:14:57.579299 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/515105ef-e538-4276-b682-7e05881dc7e8-config-data\") pod \"nova-api-0\" (UID: \"515105ef-e538-4276-b682-7e05881dc7e8\") " pod="openstack/nova-api-0"
Mar 10 22:14:57 crc kubenswrapper[4919]: I0310 22:14:57.579382 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/515105ef-e538-4276-b682-7e05881dc7e8-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"515105ef-e538-4276-b682-7e05881dc7e8\") " pod="openstack/nova-api-0"
Mar 10 22:14:57 crc kubenswrapper[4919]: I0310 22:14:57.579462 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/515105ef-e538-4276-b682-7e05881dc7e8-internal-tls-certs\") pod \"nova-api-0\" (UID: \"515105ef-e538-4276-b682-7e05881dc7e8\") " pod="openstack/nova-api-0"
Mar 10 22:14:57 crc kubenswrapper[4919]: I0310 22:14:57.579500 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/515105ef-e538-4276-b682-7e05881dc7e8-public-tls-certs\") pod \"nova-api-0\" (UID: \"515105ef-e538-4276-b682-7e05881dc7e8\") " pod="openstack/nova-api-0"
Mar 10 22:14:57 crc kubenswrapper[4919]: I0310 22:14:57.579552 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/515105ef-e538-4276-b682-7e05881dc7e8-logs\") pod \"nova-api-0\" (UID: \"515105ef-e538-4276-b682-7e05881dc7e8\") " pod="openstack/nova-api-0"
Mar 10 22:14:57 crc kubenswrapper[4919]: I0310 22:14:57.681490 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/515105ef-e538-4276-b682-7e05881dc7e8-config-data\") pod \"nova-api-0\" (UID: \"515105ef-e538-4276-b682-7e05881dc7e8\") " pod="openstack/nova-api-0"
Mar 10 22:14:57 crc kubenswrapper[4919]: I0310 22:14:57.681569 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/515105ef-e538-4276-b682-7e05881dc7e8-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"515105ef-e538-4276-b682-7e05881dc7e8\") " pod="openstack/nova-api-0"
Mar 10 22:14:57 crc kubenswrapper[4919]: I0310 22:14:57.681624 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/515105ef-e538-4276-b682-7e05881dc7e8-internal-tls-certs\") pod \"nova-api-0\" (UID: \"515105ef-e538-4276-b682-7e05881dc7e8\") " pod="openstack/nova-api-0"
Mar 10 22:14:57 crc kubenswrapper[4919]: I0310 22:14:57.681643 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/515105ef-e538-4276-b682-7e05881dc7e8-public-tls-certs\") pod \"nova-api-0\" (UID: \"515105ef-e538-4276-b682-7e05881dc7e8\") " pod="openstack/nova-api-0"
Mar 10 22:14:57 crc kubenswrapper[4919]: I0310 22:14:57.681680 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/515105ef-e538-4276-b682-7e05881dc7e8-logs\") pod \"nova-api-0\" (UID: \"515105ef-e538-4276-b682-7e05881dc7e8\") " pod="openstack/nova-api-0"
Mar 10 22:14:57 crc kubenswrapper[4919]: I0310 22:14:57.681739 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6xzsn\" (UniqueName: \"kubernetes.io/projected/515105ef-e538-4276-b682-7e05881dc7e8-kube-api-access-6xzsn\") pod \"nova-api-0\" (UID: \"515105ef-e538-4276-b682-7e05881dc7e8\") " pod="openstack/nova-api-0"
Mar 10 22:14:57 crc kubenswrapper[4919]: I0310 22:14:57.682373 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/515105ef-e538-4276-b682-7e05881dc7e8-logs\") pod \"nova-api-0\" (UID: \"515105ef-e538-4276-b682-7e05881dc7e8\") "
pod="openstack/nova-api-0" Mar 10 22:14:57 crc kubenswrapper[4919]: I0310 22:14:57.687457 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/515105ef-e538-4276-b682-7e05881dc7e8-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"515105ef-e538-4276-b682-7e05881dc7e8\") " pod="openstack/nova-api-0" Mar 10 22:14:57 crc kubenswrapper[4919]: I0310 22:14:57.687933 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/515105ef-e538-4276-b682-7e05881dc7e8-internal-tls-certs\") pod \"nova-api-0\" (UID: \"515105ef-e538-4276-b682-7e05881dc7e8\") " pod="openstack/nova-api-0" Mar 10 22:14:57 crc kubenswrapper[4919]: I0310 22:14:57.689153 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/515105ef-e538-4276-b682-7e05881dc7e8-config-data\") pod \"nova-api-0\" (UID: \"515105ef-e538-4276-b682-7e05881dc7e8\") " pod="openstack/nova-api-0" Mar 10 22:14:57 crc kubenswrapper[4919]: I0310 22:14:57.689543 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/515105ef-e538-4276-b682-7e05881dc7e8-public-tls-certs\") pod \"nova-api-0\" (UID: \"515105ef-e538-4276-b682-7e05881dc7e8\") " pod="openstack/nova-api-0" Mar 10 22:14:57 crc kubenswrapper[4919]: I0310 22:14:57.698925 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xzsn\" (UniqueName: \"kubernetes.io/projected/515105ef-e538-4276-b682-7e05881dc7e8-kube-api-access-6xzsn\") pod \"nova-api-0\" (UID: \"515105ef-e538-4276-b682-7e05881dc7e8\") " pod="openstack/nova-api-0" Mar 10 22:14:57 crc kubenswrapper[4919]: I0310 22:14:57.793871 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 10 22:14:57 crc kubenswrapper[4919]: I0310 22:14:57.801708 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 10 22:14:57 crc kubenswrapper[4919]: I0310 22:14:57.885635 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7qvwd\" (UniqueName: \"kubernetes.io/projected/64c699c8-f2b1-421d-b617-ec6b7b7c806b-kube-api-access-7qvwd\") pod \"64c699c8-f2b1-421d-b617-ec6b7b7c806b\" (UID: \"64c699c8-f2b1-421d-b617-ec6b7b7c806b\") " Mar 10 22:14:57 crc kubenswrapper[4919]: I0310 22:14:57.885716 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64c699c8-f2b1-421d-b617-ec6b7b7c806b-config-data\") pod \"64c699c8-f2b1-421d-b617-ec6b7b7c806b\" (UID: \"64c699c8-f2b1-421d-b617-ec6b7b7c806b\") " Mar 10 22:14:57 crc kubenswrapper[4919]: I0310 22:14:57.885735 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64c699c8-f2b1-421d-b617-ec6b7b7c806b-combined-ca-bundle\") pod \"64c699c8-f2b1-421d-b617-ec6b7b7c806b\" (UID: \"64c699c8-f2b1-421d-b617-ec6b7b7c806b\") " Mar 10 22:14:57 crc kubenswrapper[4919]: I0310 22:14:57.890465 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64c699c8-f2b1-421d-b617-ec6b7b7c806b-kube-api-access-7qvwd" (OuterVolumeSpecName: "kube-api-access-7qvwd") pod "64c699c8-f2b1-421d-b617-ec6b7b7c806b" (UID: "64c699c8-f2b1-421d-b617-ec6b7b7c806b"). InnerVolumeSpecName "kube-api-access-7qvwd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 22:14:57 crc kubenswrapper[4919]: I0310 22:14:57.947646 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64c699c8-f2b1-421d-b617-ec6b7b7c806b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "64c699c8-f2b1-421d-b617-ec6b7b7c806b" (UID: "64c699c8-f2b1-421d-b617-ec6b7b7c806b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:14:57 crc kubenswrapper[4919]: I0310 22:14:57.957945 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64c699c8-f2b1-421d-b617-ec6b7b7c806b-config-data" (OuterVolumeSpecName: "config-data") pod "64c699c8-f2b1-421d-b617-ec6b7b7c806b" (UID: "64c699c8-f2b1-421d-b617-ec6b7b7c806b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:14:57 crc kubenswrapper[4919]: I0310 22:14:57.987962 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7qvwd\" (UniqueName: \"kubernetes.io/projected/64c699c8-f2b1-421d-b617-ec6b7b7c806b-kube-api-access-7qvwd\") on node \"crc\" DevicePath \"\"" Mar 10 22:14:57 crc kubenswrapper[4919]: I0310 22:14:57.987987 4919 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64c699c8-f2b1-421d-b617-ec6b7b7c806b-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 22:14:57 crc kubenswrapper[4919]: I0310 22:14:57.987999 4919 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64c699c8-f2b1-421d-b617-ec6b7b7c806b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 22:14:58 crc kubenswrapper[4919]: I0310 22:14:58.273663 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 10 22:14:58 crc kubenswrapper[4919]: W0310 22:14:58.279792 4919 manager.go:1169] Failed to process watch 
event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod515105ef_e538_4276_b682_7e05881dc7e8.slice/crio-8ae11bf041c61f12bbef2122a68135eae2ca34470bbe4685da29db5222a51146 WatchSource:0}: Error finding container 8ae11bf041c61f12bbef2122a68135eae2ca34470bbe4685da29db5222a51146: Status 404 returned error can't find the container with id 8ae11bf041c61f12bbef2122a68135eae2ca34470bbe4685da29db5222a51146 Mar 10 22:14:58 crc kubenswrapper[4919]: I0310 22:14:58.404521 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"64c699c8-f2b1-421d-b617-ec6b7b7c806b","Type":"ContainerDied","Data":"e8370c882bb1998db0d8ae6781f21009f518de0cf12fdec144409ca7586ec4ee"} Mar 10 22:14:58 crc kubenswrapper[4919]: I0310 22:14:58.404571 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 10 22:14:58 crc kubenswrapper[4919]: I0310 22:14:58.404584 4919 scope.go:117] "RemoveContainer" containerID="55b9f720f76d4e27cdd8fce568c10b873fab76107f554ff240b1936a023cca7d" Mar 10 22:14:58 crc kubenswrapper[4919]: I0310 22:14:58.405622 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"515105ef-e538-4276-b682-7e05881dc7e8","Type":"ContainerStarted","Data":"8ae11bf041c61f12bbef2122a68135eae2ca34470bbe4685da29db5222a51146"} Mar 10 22:14:58 crc kubenswrapper[4919]: I0310 22:14:58.444480 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 10 22:14:58 crc kubenswrapper[4919]: I0310 22:14:58.452931 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 10 22:14:58 crc kubenswrapper[4919]: I0310 22:14:58.475715 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 10 22:14:58 crc kubenswrapper[4919]: E0310 22:14:58.487769 4919 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="64c699c8-f2b1-421d-b617-ec6b7b7c806b" containerName="nova-scheduler-scheduler" Mar 10 22:14:58 crc kubenswrapper[4919]: I0310 22:14:58.487807 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="64c699c8-f2b1-421d-b617-ec6b7b7c806b" containerName="nova-scheduler-scheduler" Mar 10 22:14:58 crc kubenswrapper[4919]: I0310 22:14:58.488115 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="64c699c8-f2b1-421d-b617-ec6b7b7c806b" containerName="nova-scheduler-scheduler" Mar 10 22:14:58 crc kubenswrapper[4919]: I0310 22:14:58.489227 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 10 22:14:58 crc kubenswrapper[4919]: I0310 22:14:58.491414 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 10 22:14:58 crc kubenswrapper[4919]: I0310 22:14:58.491493 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 10 22:14:58 crc kubenswrapper[4919]: I0310 22:14:58.601561 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9700fb27-6a74-428d-a2e6-71c237b3e054-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"9700fb27-6a74-428d-a2e6-71c237b3e054\") " pod="openstack/nova-scheduler-0" Mar 10 22:14:58 crc kubenswrapper[4919]: I0310 22:14:58.604077 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lm9xr\" (UniqueName: \"kubernetes.io/projected/9700fb27-6a74-428d-a2e6-71c237b3e054-kube-api-access-lm9xr\") pod \"nova-scheduler-0\" (UID: \"9700fb27-6a74-428d-a2e6-71c237b3e054\") " pod="openstack/nova-scheduler-0" Mar 10 22:14:58 crc kubenswrapper[4919]: I0310 22:14:58.604220 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/9700fb27-6a74-428d-a2e6-71c237b3e054-config-data\") pod \"nova-scheduler-0\" (UID: \"9700fb27-6a74-428d-a2e6-71c237b3e054\") " pod="openstack/nova-scheduler-0" Mar 10 22:14:58 crc kubenswrapper[4919]: I0310 22:14:58.706478 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lm9xr\" (UniqueName: \"kubernetes.io/projected/9700fb27-6a74-428d-a2e6-71c237b3e054-kube-api-access-lm9xr\") pod \"nova-scheduler-0\" (UID: \"9700fb27-6a74-428d-a2e6-71c237b3e054\") " pod="openstack/nova-scheduler-0" Mar 10 22:14:58 crc kubenswrapper[4919]: I0310 22:14:58.706769 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9700fb27-6a74-428d-a2e6-71c237b3e054-config-data\") pod \"nova-scheduler-0\" (UID: \"9700fb27-6a74-428d-a2e6-71c237b3e054\") " pod="openstack/nova-scheduler-0" Mar 10 22:14:58 crc kubenswrapper[4919]: I0310 22:14:58.706925 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9700fb27-6a74-428d-a2e6-71c237b3e054-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"9700fb27-6a74-428d-a2e6-71c237b3e054\") " pod="openstack/nova-scheduler-0" Mar 10 22:14:58 crc kubenswrapper[4919]: I0310 22:14:58.713677 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9700fb27-6a74-428d-a2e6-71c237b3e054-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"9700fb27-6a74-428d-a2e6-71c237b3e054\") " pod="openstack/nova-scheduler-0" Mar 10 22:14:58 crc kubenswrapper[4919]: I0310 22:14:58.716356 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9700fb27-6a74-428d-a2e6-71c237b3e054-config-data\") pod \"nova-scheduler-0\" (UID: \"9700fb27-6a74-428d-a2e6-71c237b3e054\") " pod="openstack/nova-scheduler-0" 
Mar 10 22:14:58 crc kubenswrapper[4919]: I0310 22:14:58.722152 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lm9xr\" (UniqueName: \"kubernetes.io/projected/9700fb27-6a74-428d-a2e6-71c237b3e054-kube-api-access-lm9xr\") pod \"nova-scheduler-0\" (UID: \"9700fb27-6a74-428d-a2e6-71c237b3e054\") " pod="openstack/nova-scheduler-0" Mar 10 22:14:58 crc kubenswrapper[4919]: I0310 22:14:58.812673 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 10 22:14:59 crc kubenswrapper[4919]: I0310 22:14:59.175497 4919 patch_prober.go:28] interesting pod/machine-config-daemon-z7v4t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 22:14:59 crc kubenswrapper[4919]: I0310 22:14:59.175548 4919 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 22:14:59 crc kubenswrapper[4919]: W0310 22:14:59.310984 4919 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9700fb27_6a74_428d_a2e6_71c237b3e054.slice/crio-54964dbcb0afc9490d2734ef68d0cbd04f49fcea1dc31452b540f5a029cb55be WatchSource:0}: Error finding container 54964dbcb0afc9490d2734ef68d0cbd04f49fcea1dc31452b540f5a029cb55be: Status 404 returned error can't find the container with id 54964dbcb0afc9490d2734ef68d0cbd04f49fcea1dc31452b540f5a029cb55be Mar 10 22:14:59 crc kubenswrapper[4919]: I0310 22:14:59.313133 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 10 22:14:59 
crc kubenswrapper[4919]: I0310 22:14:59.416406 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"9700fb27-6a74-428d-a2e6-71c237b3e054","Type":"ContainerStarted","Data":"54964dbcb0afc9490d2734ef68d0cbd04f49fcea1dc31452b540f5a029cb55be"} Mar 10 22:14:59 crc kubenswrapper[4919]: I0310 22:14:59.418524 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"515105ef-e538-4276-b682-7e05881dc7e8","Type":"ContainerStarted","Data":"6998100dd91ae8a0c4934a4b8c43b07c72cd35d8be7e9f5c1635f9179079c2ed"} Mar 10 22:14:59 crc kubenswrapper[4919]: I0310 22:14:59.418558 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"515105ef-e538-4276-b682-7e05881dc7e8","Type":"ContainerStarted","Data":"9f98722ad0d1eaba724d3a1905d648a1f6aa5618c0b6cbe31210b8bf1f36f489"} Mar 10 22:14:59 crc kubenswrapper[4919]: I0310 22:14:59.450737 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.450716229 podStartE2EDuration="2.450716229s" podCreationTimestamp="2026-03-10 22:14:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 22:14:59.439843345 +0000 UTC m=+1486.681723973" watchObservedRunningTime="2026-03-10 22:14:59.450716229 +0000 UTC m=+1486.692596837" Mar 10 22:14:59 crc kubenswrapper[4919]: I0310 22:14:59.501268 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="64c699c8-f2b1-421d-b617-ec6b7b7c806b" path="/var/lib/kubelet/pods/64c699c8-f2b1-421d-b617-ec6b7b7c806b/volumes" Mar 10 22:15:00 crc kubenswrapper[4919]: I0310 22:15:00.144521 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29553015-mmvrg"] Mar 10 22:15:00 crc kubenswrapper[4919]: I0310 22:15:00.146422 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29553015-mmvrg" Mar 10 22:15:00 crc kubenswrapper[4919]: I0310 22:15:00.149535 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 10 22:15:00 crc kubenswrapper[4919]: I0310 22:15:00.149713 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 10 22:15:00 crc kubenswrapper[4919]: I0310 22:15:00.159316 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29553015-mmvrg"] Mar 10 22:15:00 crc kubenswrapper[4919]: I0310 22:15:00.233154 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7179315d-d730-4a2f-8ed2-7b06ff2fd2ff-secret-volume\") pod \"collect-profiles-29553015-mmvrg\" (UID: \"7179315d-d730-4a2f-8ed2-7b06ff2fd2ff\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553015-mmvrg" Mar 10 22:15:00 crc kubenswrapper[4919]: I0310 22:15:00.233357 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-htbww\" (UniqueName: \"kubernetes.io/projected/7179315d-d730-4a2f-8ed2-7b06ff2fd2ff-kube-api-access-htbww\") pod \"collect-profiles-29553015-mmvrg\" (UID: \"7179315d-d730-4a2f-8ed2-7b06ff2fd2ff\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553015-mmvrg" Mar 10 22:15:00 crc kubenswrapper[4919]: I0310 22:15:00.233417 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7179315d-d730-4a2f-8ed2-7b06ff2fd2ff-config-volume\") pod \"collect-profiles-29553015-mmvrg\" (UID: \"7179315d-d730-4a2f-8ed2-7b06ff2fd2ff\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29553015-mmvrg" Mar 10 22:15:00 crc kubenswrapper[4919]: I0310 22:15:00.282303 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 10 22:15:00 crc kubenswrapper[4919]: I0310 22:15:00.335137 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb1090fb-ea87-4cce-a1fb-05cb31306efa-config-data\") pod \"fb1090fb-ea87-4cce-a1fb-05cb31306efa\" (UID: \"fb1090fb-ea87-4cce-a1fb-05cb31306efa\") " Mar 10 22:15:00 crc kubenswrapper[4919]: I0310 22:15:00.335352 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb1090fb-ea87-4cce-a1fb-05cb31306efa-nova-metadata-tls-certs\") pod \"fb1090fb-ea87-4cce-a1fb-05cb31306efa\" (UID: \"fb1090fb-ea87-4cce-a1fb-05cb31306efa\") " Mar 10 22:15:00 crc kubenswrapper[4919]: I0310 22:15:00.335423 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cqjd4\" (UniqueName: \"kubernetes.io/projected/fb1090fb-ea87-4cce-a1fb-05cb31306efa-kube-api-access-cqjd4\") pod \"fb1090fb-ea87-4cce-a1fb-05cb31306efa\" (UID: \"fb1090fb-ea87-4cce-a1fb-05cb31306efa\") " Mar 10 22:15:00 crc kubenswrapper[4919]: I0310 22:15:00.335472 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb1090fb-ea87-4cce-a1fb-05cb31306efa-combined-ca-bundle\") pod \"fb1090fb-ea87-4cce-a1fb-05cb31306efa\" (UID: \"fb1090fb-ea87-4cce-a1fb-05cb31306efa\") " Mar 10 22:15:00 crc kubenswrapper[4919]: I0310 22:15:00.335582 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fb1090fb-ea87-4cce-a1fb-05cb31306efa-logs\") pod \"fb1090fb-ea87-4cce-a1fb-05cb31306efa\" (UID: 
\"fb1090fb-ea87-4cce-a1fb-05cb31306efa\") " Mar 10 22:15:00 crc kubenswrapper[4919]: I0310 22:15:00.335975 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7179315d-d730-4a2f-8ed2-7b06ff2fd2ff-secret-volume\") pod \"collect-profiles-29553015-mmvrg\" (UID: \"7179315d-d730-4a2f-8ed2-7b06ff2fd2ff\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553015-mmvrg" Mar 10 22:15:00 crc kubenswrapper[4919]: I0310 22:15:00.336078 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-htbww\" (UniqueName: \"kubernetes.io/projected/7179315d-d730-4a2f-8ed2-7b06ff2fd2ff-kube-api-access-htbww\") pod \"collect-profiles-29553015-mmvrg\" (UID: \"7179315d-d730-4a2f-8ed2-7b06ff2fd2ff\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553015-mmvrg" Mar 10 22:15:00 crc kubenswrapper[4919]: I0310 22:15:00.336111 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7179315d-d730-4a2f-8ed2-7b06ff2fd2ff-config-volume\") pod \"collect-profiles-29553015-mmvrg\" (UID: \"7179315d-d730-4a2f-8ed2-7b06ff2fd2ff\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553015-mmvrg" Mar 10 22:15:00 crc kubenswrapper[4919]: I0310 22:15:00.337006 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7179315d-d730-4a2f-8ed2-7b06ff2fd2ff-config-volume\") pod \"collect-profiles-29553015-mmvrg\" (UID: \"7179315d-d730-4a2f-8ed2-7b06ff2fd2ff\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553015-mmvrg" Mar 10 22:15:00 crc kubenswrapper[4919]: I0310 22:15:00.337328 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb1090fb-ea87-4cce-a1fb-05cb31306efa-logs" (OuterVolumeSpecName: "logs") pod 
"fb1090fb-ea87-4cce-a1fb-05cb31306efa" (UID: "fb1090fb-ea87-4cce-a1fb-05cb31306efa"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 22:15:00 crc kubenswrapper[4919]: I0310 22:15:00.343371 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7179315d-d730-4a2f-8ed2-7b06ff2fd2ff-secret-volume\") pod \"collect-profiles-29553015-mmvrg\" (UID: \"7179315d-d730-4a2f-8ed2-7b06ff2fd2ff\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553015-mmvrg" Mar 10 22:15:00 crc kubenswrapper[4919]: I0310 22:15:00.358655 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb1090fb-ea87-4cce-a1fb-05cb31306efa-kube-api-access-cqjd4" (OuterVolumeSpecName: "kube-api-access-cqjd4") pod "fb1090fb-ea87-4cce-a1fb-05cb31306efa" (UID: "fb1090fb-ea87-4cce-a1fb-05cb31306efa"). InnerVolumeSpecName "kube-api-access-cqjd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 22:15:00 crc kubenswrapper[4919]: I0310 22:15:00.367199 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-htbww\" (UniqueName: \"kubernetes.io/projected/7179315d-d730-4a2f-8ed2-7b06ff2fd2ff-kube-api-access-htbww\") pod \"collect-profiles-29553015-mmvrg\" (UID: \"7179315d-d730-4a2f-8ed2-7b06ff2fd2ff\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553015-mmvrg" Mar 10 22:15:00 crc kubenswrapper[4919]: I0310 22:15:00.381311 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb1090fb-ea87-4cce-a1fb-05cb31306efa-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fb1090fb-ea87-4cce-a1fb-05cb31306efa" (UID: "fb1090fb-ea87-4cce-a1fb-05cb31306efa"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:15:00 crc kubenswrapper[4919]: I0310 22:15:00.392538 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb1090fb-ea87-4cce-a1fb-05cb31306efa-config-data" (OuterVolumeSpecName: "config-data") pod "fb1090fb-ea87-4cce-a1fb-05cb31306efa" (UID: "fb1090fb-ea87-4cce-a1fb-05cb31306efa"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:15:00 crc kubenswrapper[4919]: I0310 22:15:00.420538 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb1090fb-ea87-4cce-a1fb-05cb31306efa-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "fb1090fb-ea87-4cce-a1fb-05cb31306efa" (UID: "fb1090fb-ea87-4cce-a1fb-05cb31306efa"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:15:00 crc kubenswrapper[4919]: I0310 22:15:00.431128 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"9700fb27-6a74-428d-a2e6-71c237b3e054","Type":"ContainerStarted","Data":"8bc20ce51b10a668d26fcfd7ec96ed2a288dcdabfde25fcc336eb1a622b6f4e5"} Mar 10 22:15:00 crc kubenswrapper[4919]: I0310 22:15:00.433161 4919 generic.go:334] "Generic (PLEG): container finished" podID="fb1090fb-ea87-4cce-a1fb-05cb31306efa" containerID="6136b4ecac1972d46f866529492a1786be912fb3fc39d166d85249af6e669d98" exitCode=0 Mar 10 22:15:00 crc kubenswrapper[4919]: I0310 22:15:00.433221 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 10 22:15:00 crc kubenswrapper[4919]: I0310 22:15:00.433273 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"fb1090fb-ea87-4cce-a1fb-05cb31306efa","Type":"ContainerDied","Data":"6136b4ecac1972d46f866529492a1786be912fb3fc39d166d85249af6e669d98"} Mar 10 22:15:00 crc kubenswrapper[4919]: I0310 22:15:00.433295 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"fb1090fb-ea87-4cce-a1fb-05cb31306efa","Type":"ContainerDied","Data":"ffb92b6294a3f113e77f8964fbd84505efd5118e48c503577ec14800906e75ad"} Mar 10 22:15:00 crc kubenswrapper[4919]: I0310 22:15:00.433313 4919 scope.go:117] "RemoveContainer" containerID="6136b4ecac1972d46f866529492a1786be912fb3fc39d166d85249af6e669d98" Mar 10 22:15:00 crc kubenswrapper[4919]: I0310 22:15:00.437399 4919 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fb1090fb-ea87-4cce-a1fb-05cb31306efa-logs\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:00 crc kubenswrapper[4919]: I0310 22:15:00.437420 4919 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb1090fb-ea87-4cce-a1fb-05cb31306efa-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:00 crc kubenswrapper[4919]: I0310 22:15:00.437430 4919 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb1090fb-ea87-4cce-a1fb-05cb31306efa-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:00 crc kubenswrapper[4919]: I0310 22:15:00.437441 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cqjd4\" (UniqueName: \"kubernetes.io/projected/fb1090fb-ea87-4cce-a1fb-05cb31306efa-kube-api-access-cqjd4\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:00 crc kubenswrapper[4919]: I0310 22:15:00.437451 4919 reconciler_common.go:293] 
"Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb1090fb-ea87-4cce-a1fb-05cb31306efa-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:00 crc kubenswrapper[4919]: I0310 22:15:00.450911 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.450888127 podStartE2EDuration="2.450888127s" podCreationTimestamp="2026-03-10 22:14:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 22:15:00.447311191 +0000 UTC m=+1487.689191799" watchObservedRunningTime="2026-03-10 22:15:00.450888127 +0000 UTC m=+1487.692768735" Mar 10 22:15:00 crc kubenswrapper[4919]: I0310 22:15:00.474091 4919 scope.go:117] "RemoveContainer" containerID="ac33dde6fb8f040b19539d6229a740b8950b17142b3fefd31965a23cb99759ce" Mar 10 22:15:00 crc kubenswrapper[4919]: I0310 22:15:00.485056 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 10 22:15:00 crc kubenswrapper[4919]: I0310 22:15:00.496494 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 10 22:15:00 crc kubenswrapper[4919]: I0310 22:15:00.508551 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 10 22:15:00 crc kubenswrapper[4919]: E0310 22:15:00.509026 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb1090fb-ea87-4cce-a1fb-05cb31306efa" containerName="nova-metadata-metadata" Mar 10 22:15:00 crc kubenswrapper[4919]: I0310 22:15:00.509042 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb1090fb-ea87-4cce-a1fb-05cb31306efa" containerName="nova-metadata-metadata" Mar 10 22:15:00 crc kubenswrapper[4919]: E0310 22:15:00.509070 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb1090fb-ea87-4cce-a1fb-05cb31306efa" containerName="nova-metadata-log" Mar 10 22:15:00 
crc kubenswrapper[4919]: I0310 22:15:00.509080 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb1090fb-ea87-4cce-a1fb-05cb31306efa" containerName="nova-metadata-log" Mar 10 22:15:00 crc kubenswrapper[4919]: I0310 22:15:00.509238 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb1090fb-ea87-4cce-a1fb-05cb31306efa" containerName="nova-metadata-metadata" Mar 10 22:15:00 crc kubenswrapper[4919]: I0310 22:15:00.509258 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb1090fb-ea87-4cce-a1fb-05cb31306efa" containerName="nova-metadata-log" Mar 10 22:15:00 crc kubenswrapper[4919]: I0310 22:15:00.510196 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 10 22:15:00 crc kubenswrapper[4919]: I0310 22:15:00.510653 4919 scope.go:117] "RemoveContainer" containerID="6136b4ecac1972d46f866529492a1786be912fb3fc39d166d85249af6e669d98" Mar 10 22:15:00 crc kubenswrapper[4919]: E0310 22:15:00.511117 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6136b4ecac1972d46f866529492a1786be912fb3fc39d166d85249af6e669d98\": container with ID starting with 6136b4ecac1972d46f866529492a1786be912fb3fc39d166d85249af6e669d98 not found: ID does not exist" containerID="6136b4ecac1972d46f866529492a1786be912fb3fc39d166d85249af6e669d98" Mar 10 22:15:00 crc kubenswrapper[4919]: I0310 22:15:00.511146 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6136b4ecac1972d46f866529492a1786be912fb3fc39d166d85249af6e669d98"} err="failed to get container status \"6136b4ecac1972d46f866529492a1786be912fb3fc39d166d85249af6e669d98\": rpc error: code = NotFound desc = could not find container \"6136b4ecac1972d46f866529492a1786be912fb3fc39d166d85249af6e669d98\": container with ID starting with 6136b4ecac1972d46f866529492a1786be912fb3fc39d166d85249af6e669d98 not found: ID does not 
exist" Mar 10 22:15:00 crc kubenswrapper[4919]: I0310 22:15:00.511167 4919 scope.go:117] "RemoveContainer" containerID="ac33dde6fb8f040b19539d6229a740b8950b17142b3fefd31965a23cb99759ce" Mar 10 22:15:00 crc kubenswrapper[4919]: E0310 22:15:00.511639 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac33dde6fb8f040b19539d6229a740b8950b17142b3fefd31965a23cb99759ce\": container with ID starting with ac33dde6fb8f040b19539d6229a740b8950b17142b3fefd31965a23cb99759ce not found: ID does not exist" containerID="ac33dde6fb8f040b19539d6229a740b8950b17142b3fefd31965a23cb99759ce" Mar 10 22:15:00 crc kubenswrapper[4919]: I0310 22:15:00.511673 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac33dde6fb8f040b19539d6229a740b8950b17142b3fefd31965a23cb99759ce"} err="failed to get container status \"ac33dde6fb8f040b19539d6229a740b8950b17142b3fefd31965a23cb99759ce\": rpc error: code = NotFound desc = could not find container \"ac33dde6fb8f040b19539d6229a740b8950b17142b3fefd31965a23cb99759ce\": container with ID starting with ac33dde6fb8f040b19539d6229a740b8950b17142b3fefd31965a23cb99759ce not found: ID does not exist" Mar 10 22:15:00 crc kubenswrapper[4919]: I0310 22:15:00.512329 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 10 22:15:00 crc kubenswrapper[4919]: I0310 22:15:00.512851 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 10 22:15:00 crc kubenswrapper[4919]: I0310 22:15:00.520293 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 10 22:15:00 crc kubenswrapper[4919]: I0310 22:15:00.572913 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29553015-mmvrg" Mar 10 22:15:00 crc kubenswrapper[4919]: I0310 22:15:00.641027 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81489e39-0246-4065-8835-31b1e5da8431-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"81489e39-0246-4065-8835-31b1e5da8431\") " pod="openstack/nova-metadata-0" Mar 10 22:15:00 crc kubenswrapper[4919]: I0310 22:15:00.641135 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/81489e39-0246-4065-8835-31b1e5da8431-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"81489e39-0246-4065-8835-31b1e5da8431\") " pod="openstack/nova-metadata-0" Mar 10 22:15:00 crc kubenswrapper[4919]: I0310 22:15:00.641165 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-czfj2\" (UniqueName: \"kubernetes.io/projected/81489e39-0246-4065-8835-31b1e5da8431-kube-api-access-czfj2\") pod \"nova-metadata-0\" (UID: \"81489e39-0246-4065-8835-31b1e5da8431\") " pod="openstack/nova-metadata-0" Mar 10 22:15:00 crc kubenswrapper[4919]: I0310 22:15:00.641246 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81489e39-0246-4065-8835-31b1e5da8431-config-data\") pod \"nova-metadata-0\" (UID: \"81489e39-0246-4065-8835-31b1e5da8431\") " pod="openstack/nova-metadata-0" Mar 10 22:15:00 crc kubenswrapper[4919]: I0310 22:15:00.641295 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/81489e39-0246-4065-8835-31b1e5da8431-logs\") pod \"nova-metadata-0\" (UID: \"81489e39-0246-4065-8835-31b1e5da8431\") " 
pod="openstack/nova-metadata-0" Mar 10 22:15:00 crc kubenswrapper[4919]: I0310 22:15:00.742644 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/81489e39-0246-4065-8835-31b1e5da8431-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"81489e39-0246-4065-8835-31b1e5da8431\") " pod="openstack/nova-metadata-0" Mar 10 22:15:00 crc kubenswrapper[4919]: I0310 22:15:00.742698 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-czfj2\" (UniqueName: \"kubernetes.io/projected/81489e39-0246-4065-8835-31b1e5da8431-kube-api-access-czfj2\") pod \"nova-metadata-0\" (UID: \"81489e39-0246-4065-8835-31b1e5da8431\") " pod="openstack/nova-metadata-0" Mar 10 22:15:00 crc kubenswrapper[4919]: I0310 22:15:00.742808 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81489e39-0246-4065-8835-31b1e5da8431-config-data\") pod \"nova-metadata-0\" (UID: \"81489e39-0246-4065-8835-31b1e5da8431\") " pod="openstack/nova-metadata-0" Mar 10 22:15:00 crc kubenswrapper[4919]: I0310 22:15:00.742860 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/81489e39-0246-4065-8835-31b1e5da8431-logs\") pod \"nova-metadata-0\" (UID: \"81489e39-0246-4065-8835-31b1e5da8431\") " pod="openstack/nova-metadata-0" Mar 10 22:15:00 crc kubenswrapper[4919]: I0310 22:15:00.742907 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81489e39-0246-4065-8835-31b1e5da8431-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"81489e39-0246-4065-8835-31b1e5da8431\") " pod="openstack/nova-metadata-0" Mar 10 22:15:00 crc kubenswrapper[4919]: I0310 22:15:00.753489 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/81489e39-0246-4065-8835-31b1e5da8431-logs\") pod \"nova-metadata-0\" (UID: \"81489e39-0246-4065-8835-31b1e5da8431\") " pod="openstack/nova-metadata-0" Mar 10 22:15:00 crc kubenswrapper[4919]: I0310 22:15:00.757670 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/81489e39-0246-4065-8835-31b1e5da8431-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"81489e39-0246-4065-8835-31b1e5da8431\") " pod="openstack/nova-metadata-0" Mar 10 22:15:00 crc kubenswrapper[4919]: I0310 22:15:00.759247 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81489e39-0246-4065-8835-31b1e5da8431-config-data\") pod \"nova-metadata-0\" (UID: \"81489e39-0246-4065-8835-31b1e5da8431\") " pod="openstack/nova-metadata-0" Mar 10 22:15:00 crc kubenswrapper[4919]: I0310 22:15:00.766403 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81489e39-0246-4065-8835-31b1e5da8431-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"81489e39-0246-4065-8835-31b1e5da8431\") " pod="openstack/nova-metadata-0" Mar 10 22:15:00 crc kubenswrapper[4919]: I0310 22:15:00.767256 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-czfj2\" (UniqueName: \"kubernetes.io/projected/81489e39-0246-4065-8835-31b1e5da8431-kube-api-access-czfj2\") pod \"nova-metadata-0\" (UID: \"81489e39-0246-4065-8835-31b1e5da8431\") " pod="openstack/nova-metadata-0" Mar 10 22:15:00 crc kubenswrapper[4919]: I0310 22:15:00.836275 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 10 22:15:01 crc kubenswrapper[4919]: I0310 22:15:01.039999 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29553015-mmvrg"] Mar 10 22:15:01 crc kubenswrapper[4919]: W0310 22:15:01.249690 4919 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod81489e39_0246_4065_8835_31b1e5da8431.slice/crio-24bef484547c2d8e19b8a71bb9c628aac6dba8274e3bc46da702bf218bca2791 WatchSource:0}: Error finding container 24bef484547c2d8e19b8a71bb9c628aac6dba8274e3bc46da702bf218bca2791: Status 404 returned error can't find the container with id 24bef484547c2d8e19b8a71bb9c628aac6dba8274e3bc46da702bf218bca2791 Mar 10 22:15:01 crc kubenswrapper[4919]: I0310 22:15:01.251289 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 10 22:15:01 crc kubenswrapper[4919]: I0310 22:15:01.445235 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"81489e39-0246-4065-8835-31b1e5da8431","Type":"ContainerStarted","Data":"77810b20846ff06e6e2286529394046c226fcdd21d8e8615e097fa35b3579513"} Mar 10 22:15:01 crc kubenswrapper[4919]: I0310 22:15:01.445724 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"81489e39-0246-4065-8835-31b1e5da8431","Type":"ContainerStarted","Data":"24bef484547c2d8e19b8a71bb9c628aac6dba8274e3bc46da702bf218bca2791"} Mar 10 22:15:01 crc kubenswrapper[4919]: I0310 22:15:01.448381 4919 generic.go:334] "Generic (PLEG): container finished" podID="7179315d-d730-4a2f-8ed2-7b06ff2fd2ff" containerID="3b8bb041798f51d982f4bcd80e3c1947b5868b1f2dc277d0ff8d6ecfa13d1c9c" exitCode=0 Mar 10 22:15:01 crc kubenswrapper[4919]: I0310 22:15:01.448474 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29553015-mmvrg" 
event={"ID":"7179315d-d730-4a2f-8ed2-7b06ff2fd2ff","Type":"ContainerDied","Data":"3b8bb041798f51d982f4bcd80e3c1947b5868b1f2dc277d0ff8d6ecfa13d1c9c"} Mar 10 22:15:01 crc kubenswrapper[4919]: I0310 22:15:01.448513 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29553015-mmvrg" event={"ID":"7179315d-d730-4a2f-8ed2-7b06ff2fd2ff","Type":"ContainerStarted","Data":"b7bcc678db776f2bd8efcb49884d987023565c520a3913c764bbadf2ba7f67a7"} Mar 10 22:15:01 crc kubenswrapper[4919]: I0310 22:15:01.494324 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb1090fb-ea87-4cce-a1fb-05cb31306efa" path="/var/lib/kubelet/pods/fb1090fb-ea87-4cce-a1fb-05cb31306efa/volumes" Mar 10 22:15:02 crc kubenswrapper[4919]: I0310 22:15:02.477741 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"81489e39-0246-4065-8835-31b1e5da8431","Type":"ContainerStarted","Data":"0d25c07eec1b4976670c75603fd5da5476a97bed8734c71857c7dda9c1fa75bb"} Mar 10 22:15:02 crc kubenswrapper[4919]: I0310 22:15:02.503083 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.502885285 podStartE2EDuration="2.502885285s" podCreationTimestamp="2026-03-10 22:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 22:15:02.499984966 +0000 UTC m=+1489.741865594" watchObservedRunningTime="2026-03-10 22:15:02.502885285 +0000 UTC m=+1489.744765903" Mar 10 22:15:02 crc kubenswrapper[4919]: I0310 22:15:02.845770 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29553015-mmvrg" Mar 10 22:15:02 crc kubenswrapper[4919]: I0310 22:15:02.985520 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7179315d-d730-4a2f-8ed2-7b06ff2fd2ff-config-volume\") pod \"7179315d-d730-4a2f-8ed2-7b06ff2fd2ff\" (UID: \"7179315d-d730-4a2f-8ed2-7b06ff2fd2ff\") " Mar 10 22:15:02 crc kubenswrapper[4919]: I0310 22:15:02.985700 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7179315d-d730-4a2f-8ed2-7b06ff2fd2ff-secret-volume\") pod \"7179315d-d730-4a2f-8ed2-7b06ff2fd2ff\" (UID: \"7179315d-d730-4a2f-8ed2-7b06ff2fd2ff\") " Mar 10 22:15:02 crc kubenswrapper[4919]: I0310 22:15:02.985747 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htbww\" (UniqueName: \"kubernetes.io/projected/7179315d-d730-4a2f-8ed2-7b06ff2fd2ff-kube-api-access-htbww\") pod \"7179315d-d730-4a2f-8ed2-7b06ff2fd2ff\" (UID: \"7179315d-d730-4a2f-8ed2-7b06ff2fd2ff\") " Mar 10 22:15:02 crc kubenswrapper[4919]: I0310 22:15:02.986463 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7179315d-d730-4a2f-8ed2-7b06ff2fd2ff-config-volume" (OuterVolumeSpecName: "config-volume") pod "7179315d-d730-4a2f-8ed2-7b06ff2fd2ff" (UID: "7179315d-d730-4a2f-8ed2-7b06ff2fd2ff"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 22:15:02 crc kubenswrapper[4919]: I0310 22:15:02.990720 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7179315d-d730-4a2f-8ed2-7b06ff2fd2ff-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "7179315d-d730-4a2f-8ed2-7b06ff2fd2ff" (UID: "7179315d-d730-4a2f-8ed2-7b06ff2fd2ff"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:15:02 crc kubenswrapper[4919]: I0310 22:15:02.990723 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7179315d-d730-4a2f-8ed2-7b06ff2fd2ff-kube-api-access-htbww" (OuterVolumeSpecName: "kube-api-access-htbww") pod "7179315d-d730-4a2f-8ed2-7b06ff2fd2ff" (UID: "7179315d-d730-4a2f-8ed2-7b06ff2fd2ff"). InnerVolumeSpecName "kube-api-access-htbww". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 22:15:03 crc kubenswrapper[4919]: I0310 22:15:03.088076 4919 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7179315d-d730-4a2f-8ed2-7b06ff2fd2ff-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:03 crc kubenswrapper[4919]: I0310 22:15:03.088119 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htbww\" (UniqueName: \"kubernetes.io/projected/7179315d-d730-4a2f-8ed2-7b06ff2fd2ff-kube-api-access-htbww\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:03 crc kubenswrapper[4919]: I0310 22:15:03.088131 4919 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7179315d-d730-4a2f-8ed2-7b06ff2fd2ff-config-volume\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:03 crc kubenswrapper[4919]: I0310 22:15:03.499774 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29553015-mmvrg" Mar 10 22:15:03 crc kubenswrapper[4919]: I0310 22:15:03.504719 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29553015-mmvrg" event={"ID":"7179315d-d730-4a2f-8ed2-7b06ff2fd2ff","Type":"ContainerDied","Data":"b7bcc678db776f2bd8efcb49884d987023565c520a3913c764bbadf2ba7f67a7"} Mar 10 22:15:03 crc kubenswrapper[4919]: I0310 22:15:03.504747 4919 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b7bcc678db776f2bd8efcb49884d987023565c520a3913c764bbadf2ba7f67a7" Mar 10 22:15:03 crc kubenswrapper[4919]: I0310 22:15:03.814146 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 10 22:15:05 crc kubenswrapper[4919]: I0310 22:15:05.836579 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 10 22:15:05 crc kubenswrapper[4919]: I0310 22:15:05.836975 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 10 22:15:07 crc kubenswrapper[4919]: I0310 22:15:07.523850 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-86mcr"] Mar 10 22:15:07 crc kubenswrapper[4919]: E0310 22:15:07.524635 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7179315d-d730-4a2f-8ed2-7b06ff2fd2ff" containerName="collect-profiles" Mar 10 22:15:07 crc kubenswrapper[4919]: I0310 22:15:07.524660 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="7179315d-d730-4a2f-8ed2-7b06ff2fd2ff" containerName="collect-profiles" Mar 10 22:15:07 crc kubenswrapper[4919]: I0310 22:15:07.524997 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="7179315d-d730-4a2f-8ed2-7b06ff2fd2ff" containerName="collect-profiles" Mar 10 22:15:07 crc kubenswrapper[4919]: I0310 22:15:07.528015 4919 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-86mcr" Mar 10 22:15:07 crc kubenswrapper[4919]: I0310 22:15:07.543478 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-86mcr"] Mar 10 22:15:07 crc kubenswrapper[4919]: I0310 22:15:07.673237 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tf7pd\" (UniqueName: \"kubernetes.io/projected/63307c10-4c05-4bc7-8db7-fbd6f51a8f37-kube-api-access-tf7pd\") pod \"redhat-operators-86mcr\" (UID: \"63307c10-4c05-4bc7-8db7-fbd6f51a8f37\") " pod="openshift-marketplace/redhat-operators-86mcr" Mar 10 22:15:07 crc kubenswrapper[4919]: I0310 22:15:07.673291 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63307c10-4c05-4bc7-8db7-fbd6f51a8f37-utilities\") pod \"redhat-operators-86mcr\" (UID: \"63307c10-4c05-4bc7-8db7-fbd6f51a8f37\") " pod="openshift-marketplace/redhat-operators-86mcr" Mar 10 22:15:07 crc kubenswrapper[4919]: I0310 22:15:07.673373 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63307c10-4c05-4bc7-8db7-fbd6f51a8f37-catalog-content\") pod \"redhat-operators-86mcr\" (UID: \"63307c10-4c05-4bc7-8db7-fbd6f51a8f37\") " pod="openshift-marketplace/redhat-operators-86mcr" Mar 10 22:15:07 crc kubenswrapper[4919]: I0310 22:15:07.774621 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tf7pd\" (UniqueName: \"kubernetes.io/projected/63307c10-4c05-4bc7-8db7-fbd6f51a8f37-kube-api-access-tf7pd\") pod \"redhat-operators-86mcr\" (UID: \"63307c10-4c05-4bc7-8db7-fbd6f51a8f37\") " pod="openshift-marketplace/redhat-operators-86mcr" Mar 10 22:15:07 crc kubenswrapper[4919]: I0310 22:15:07.774667 4919 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63307c10-4c05-4bc7-8db7-fbd6f51a8f37-utilities\") pod \"redhat-operators-86mcr\" (UID: \"63307c10-4c05-4bc7-8db7-fbd6f51a8f37\") " pod="openshift-marketplace/redhat-operators-86mcr" Mar 10 22:15:07 crc kubenswrapper[4919]: I0310 22:15:07.774706 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63307c10-4c05-4bc7-8db7-fbd6f51a8f37-catalog-content\") pod \"redhat-operators-86mcr\" (UID: \"63307c10-4c05-4bc7-8db7-fbd6f51a8f37\") " pod="openshift-marketplace/redhat-operators-86mcr" Mar 10 22:15:07 crc kubenswrapper[4919]: I0310 22:15:07.775384 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63307c10-4c05-4bc7-8db7-fbd6f51a8f37-catalog-content\") pod \"redhat-operators-86mcr\" (UID: \"63307c10-4c05-4bc7-8db7-fbd6f51a8f37\") " pod="openshift-marketplace/redhat-operators-86mcr" Mar 10 22:15:07 crc kubenswrapper[4919]: I0310 22:15:07.775819 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63307c10-4c05-4bc7-8db7-fbd6f51a8f37-utilities\") pod \"redhat-operators-86mcr\" (UID: \"63307c10-4c05-4bc7-8db7-fbd6f51a8f37\") " pod="openshift-marketplace/redhat-operators-86mcr" Mar 10 22:15:07 crc kubenswrapper[4919]: I0310 22:15:07.794717 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tf7pd\" (UniqueName: \"kubernetes.io/projected/63307c10-4c05-4bc7-8db7-fbd6f51a8f37-kube-api-access-tf7pd\") pod \"redhat-operators-86mcr\" (UID: \"63307c10-4c05-4bc7-8db7-fbd6f51a8f37\") " pod="openshift-marketplace/redhat-operators-86mcr" Mar 10 22:15:07 crc kubenswrapper[4919]: I0310 22:15:07.804726 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack/nova-api-0" Mar 10 22:15:07 crc kubenswrapper[4919]: I0310 22:15:07.804819 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 10 22:15:07 crc kubenswrapper[4919]: I0310 22:15:07.860091 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-86mcr" Mar 10 22:15:08 crc kubenswrapper[4919]: I0310 22:15:08.329582 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-86mcr"] Mar 10 22:15:08 crc kubenswrapper[4919]: I0310 22:15:08.547693 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-86mcr" event={"ID":"63307c10-4c05-4bc7-8db7-fbd6f51a8f37","Type":"ContainerStarted","Data":"e863a8bba3e233f423b03dbbe3000f36aaa875042f656eea015e9df4f2d3c2f8"} Mar 10 22:15:08 crc kubenswrapper[4919]: I0310 22:15:08.548823 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-86mcr" event={"ID":"63307c10-4c05-4bc7-8db7-fbd6f51a8f37","Type":"ContainerStarted","Data":"b5c606ff5ca9e739e5852fc8ede8a9d57fc6ed729dd426d9c6819d55b0d874f3"} Mar 10 22:15:08 crc kubenswrapper[4919]: I0310 22:15:08.814106 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 10 22:15:08 crc kubenswrapper[4919]: I0310 22:15:08.818593 4919 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="515105ef-e538-4276-b682-7e05881dc7e8" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.211:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 10 22:15:08 crc kubenswrapper[4919]: I0310 22:15:08.818904 4919 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="515105ef-e538-4276-b682-7e05881dc7e8" containerName="nova-api-log" probeResult="failure" output="Get 
\"https://10.217.0.211:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 10 22:15:08 crc kubenswrapper[4919]: I0310 22:15:08.859153 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 10 22:15:09 crc kubenswrapper[4919]: I0310 22:15:09.612889 4919 generic.go:334] "Generic (PLEG): container finished" podID="63307c10-4c05-4bc7-8db7-fbd6f51a8f37" containerID="e863a8bba3e233f423b03dbbe3000f36aaa875042f656eea015e9df4f2d3c2f8" exitCode=0 Mar 10 22:15:09 crc kubenswrapper[4919]: I0310 22:15:09.613597 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-86mcr" event={"ID":"63307c10-4c05-4bc7-8db7-fbd6f51a8f37","Type":"ContainerDied","Data":"e863a8bba3e233f423b03dbbe3000f36aaa875042f656eea015e9df4f2d3c2f8"} Mar 10 22:15:09 crc kubenswrapper[4919]: I0310 22:15:09.663573 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 10 22:15:10 crc kubenswrapper[4919]: I0310 22:15:10.624864 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-86mcr" event={"ID":"63307c10-4c05-4bc7-8db7-fbd6f51a8f37","Type":"ContainerStarted","Data":"ceb6066414da4868afecf730b3a5210fca32f0b6d27f64f35a9544410304c2c8"} Mar 10 22:15:10 crc kubenswrapper[4919]: I0310 22:15:10.837590 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 10 22:15:10 crc kubenswrapper[4919]: I0310 22:15:10.837663 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 10 22:15:11 crc kubenswrapper[4919]: I0310 22:15:11.853610 4919 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="81489e39-0246-4065-8835-31b1e5da8431" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.214:8775/\": net/http: request 
canceled (Client.Timeout exceeded while awaiting headers)" Mar 10 22:15:11 crc kubenswrapper[4919]: I0310 22:15:11.853932 4919 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="81489e39-0246-4065-8835-31b1e5da8431" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.214:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 10 22:15:12 crc kubenswrapper[4919]: I0310 22:15:12.645262 4919 generic.go:334] "Generic (PLEG): container finished" podID="63307c10-4c05-4bc7-8db7-fbd6f51a8f37" containerID="ceb6066414da4868afecf730b3a5210fca32f0b6d27f64f35a9544410304c2c8" exitCode=0 Mar 10 22:15:12 crc kubenswrapper[4919]: I0310 22:15:12.645323 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-86mcr" event={"ID":"63307c10-4c05-4bc7-8db7-fbd6f51a8f37","Type":"ContainerDied","Data":"ceb6066414da4868afecf730b3a5210fca32f0b6d27f64f35a9544410304c2c8"} Mar 10 22:15:13 crc kubenswrapper[4919]: I0310 22:15:13.668085 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-86mcr" event={"ID":"63307c10-4c05-4bc7-8db7-fbd6f51a8f37","Type":"ContainerStarted","Data":"14e547a91f6c17ebfbcfbea5c1bcd030d1bb1a58a87b7aed45b42aac44b056e7"} Mar 10 22:15:13 crc kubenswrapper[4919]: I0310 22:15:13.693254 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-86mcr" podStartSLOduration=3.006164757 podStartE2EDuration="6.693236163s" podCreationTimestamp="2026-03-10 22:15:07 +0000 UTC" firstStartedPulling="2026-03-10 22:15:09.617844966 +0000 UTC m=+1496.859725574" lastFinishedPulling="2026-03-10 22:15:13.304916372 +0000 UTC m=+1500.546796980" observedRunningTime="2026-03-10 22:15:13.686910472 +0000 UTC m=+1500.928791080" watchObservedRunningTime="2026-03-10 22:15:13.693236163 +0000 UTC m=+1500.935116771" Mar 10 22:15:15 crc 
kubenswrapper[4919]: I0310 22:15:15.767944 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0"
Mar 10 22:15:17 crc kubenswrapper[4919]: I0310 22:15:17.808109 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Mar 10 22:15:17 crc kubenswrapper[4919]: I0310 22:15:17.808842 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Mar 10 22:15:17 crc kubenswrapper[4919]: I0310 22:15:17.814220 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Mar 10 22:15:17 crc kubenswrapper[4919]: I0310 22:15:17.814965 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Mar 10 22:15:17 crc kubenswrapper[4919]: I0310 22:15:17.860920 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-86mcr"
Mar 10 22:15:17 crc kubenswrapper[4919]: I0310 22:15:17.860970 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-86mcr"
Mar 10 22:15:18 crc kubenswrapper[4919]: I0310 22:15:18.710848 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Mar 10 22:15:18 crc kubenswrapper[4919]: I0310 22:15:18.715692 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Mar 10 22:15:18 crc kubenswrapper[4919]: I0310 22:15:18.911362 4919 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-86mcr" podUID="63307c10-4c05-4bc7-8db7-fbd6f51a8f37" containerName="registry-server" probeResult="failure" output=<
Mar 10 22:15:18 crc kubenswrapper[4919]: timeout: failed to connect service ":50051" within 1s
Mar 10 22:15:18 crc kubenswrapper[4919]: >
Mar 10 22:15:20 crc kubenswrapper[4919]: I0310 22:15:20.847337 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Mar 10 22:15:20 crc kubenswrapper[4919]: I0310 22:15:20.852957 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Mar 10 22:15:20 crc kubenswrapper[4919]: I0310 22:15:20.856483 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Mar 10 22:15:21 crc kubenswrapper[4919]: I0310 22:15:21.738104 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Mar 10 22:15:27 crc kubenswrapper[4919]: I0310 22:15:27.914124 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-86mcr"
Mar 10 22:15:27 crc kubenswrapper[4919]: I0310 22:15:27.963162 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-86mcr"
Mar 10 22:15:28 crc kubenswrapper[4919]: I0310 22:15:28.151858 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-86mcr"]
Mar 10 22:15:29 crc kubenswrapper[4919]: I0310 22:15:29.175356 4919 patch_prober.go:28] interesting pod/machine-config-daemon-z7v4t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 10 22:15:29 crc kubenswrapper[4919]: I0310 22:15:29.175462 4919 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 10 22:15:29 crc kubenswrapper[4919]: I0310 22:15:29.825550 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-86mcr" podUID="63307c10-4c05-4bc7-8db7-fbd6f51a8f37" containerName="registry-server" containerID="cri-o://14e547a91f6c17ebfbcfbea5c1bcd030d1bb1a58a87b7aed45b42aac44b056e7" gracePeriod=2
Mar 10 22:15:30 crc kubenswrapper[4919]: I0310 22:15:30.329182 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-86mcr"
Mar 10 22:15:30 crc kubenswrapper[4919]: I0310 22:15:30.479050 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tf7pd\" (UniqueName: \"kubernetes.io/projected/63307c10-4c05-4bc7-8db7-fbd6f51a8f37-kube-api-access-tf7pd\") pod \"63307c10-4c05-4bc7-8db7-fbd6f51a8f37\" (UID: \"63307c10-4c05-4bc7-8db7-fbd6f51a8f37\") "
Mar 10 22:15:30 crc kubenswrapper[4919]: I0310 22:15:30.480004 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63307c10-4c05-4bc7-8db7-fbd6f51a8f37-catalog-content\") pod \"63307c10-4c05-4bc7-8db7-fbd6f51a8f37\" (UID: \"63307c10-4c05-4bc7-8db7-fbd6f51a8f37\") "
Mar 10 22:15:30 crc kubenswrapper[4919]: I0310 22:15:30.480091 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63307c10-4c05-4bc7-8db7-fbd6f51a8f37-utilities\") pod \"63307c10-4c05-4bc7-8db7-fbd6f51a8f37\" (UID: \"63307c10-4c05-4bc7-8db7-fbd6f51a8f37\") "
Mar 10 22:15:30 crc kubenswrapper[4919]: I0310 22:15:30.480805 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/63307c10-4c05-4bc7-8db7-fbd6f51a8f37-utilities" (OuterVolumeSpecName: "utilities") pod "63307c10-4c05-4bc7-8db7-fbd6f51a8f37" (UID: "63307c10-4c05-4bc7-8db7-fbd6f51a8f37"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 22:15:30 crc kubenswrapper[4919]: I0310 22:15:30.484213 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63307c10-4c05-4bc7-8db7-fbd6f51a8f37-kube-api-access-tf7pd" (OuterVolumeSpecName: "kube-api-access-tf7pd") pod "63307c10-4c05-4bc7-8db7-fbd6f51a8f37" (UID: "63307c10-4c05-4bc7-8db7-fbd6f51a8f37"). InnerVolumeSpecName "kube-api-access-tf7pd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 22:15:30 crc kubenswrapper[4919]: I0310 22:15:30.583704 4919 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63307c10-4c05-4bc7-8db7-fbd6f51a8f37-utilities\") on node \"crc\" DevicePath \"\""
Mar 10 22:15:30 crc kubenswrapper[4919]: I0310 22:15:30.583733 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tf7pd\" (UniqueName: \"kubernetes.io/projected/63307c10-4c05-4bc7-8db7-fbd6f51a8f37-kube-api-access-tf7pd\") on node \"crc\" DevicePath \"\""
Mar 10 22:15:30 crc kubenswrapper[4919]: I0310 22:15:30.602340 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/63307c10-4c05-4bc7-8db7-fbd6f51a8f37-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "63307c10-4c05-4bc7-8db7-fbd6f51a8f37" (UID: "63307c10-4c05-4bc7-8db7-fbd6f51a8f37"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 22:15:30 crc kubenswrapper[4919]: I0310 22:15:30.685039 4919 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63307c10-4c05-4bc7-8db7-fbd6f51a8f37-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 10 22:15:30 crc kubenswrapper[4919]: I0310 22:15:30.837925 4919 generic.go:334] "Generic (PLEG): container finished" podID="63307c10-4c05-4bc7-8db7-fbd6f51a8f37" containerID="14e547a91f6c17ebfbcfbea5c1bcd030d1bb1a58a87b7aed45b42aac44b056e7" exitCode=0
Mar 10 22:15:30 crc kubenswrapper[4919]: I0310 22:15:30.837985 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-86mcr"
Mar 10 22:15:30 crc kubenswrapper[4919]: I0310 22:15:30.838016 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-86mcr" event={"ID":"63307c10-4c05-4bc7-8db7-fbd6f51a8f37","Type":"ContainerDied","Data":"14e547a91f6c17ebfbcfbea5c1bcd030d1bb1a58a87b7aed45b42aac44b056e7"}
Mar 10 22:15:30 crc kubenswrapper[4919]: I0310 22:15:30.838064 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-86mcr" event={"ID":"63307c10-4c05-4bc7-8db7-fbd6f51a8f37","Type":"ContainerDied","Data":"b5c606ff5ca9e739e5852fc8ede8a9d57fc6ed729dd426d9c6819d55b0d874f3"}
Mar 10 22:15:30 crc kubenswrapper[4919]: I0310 22:15:30.838086 4919 scope.go:117] "RemoveContainer" containerID="14e547a91f6c17ebfbcfbea5c1bcd030d1bb1a58a87b7aed45b42aac44b056e7"
Mar 10 22:15:30 crc kubenswrapper[4919]: I0310 22:15:30.860096 4919 scope.go:117] "RemoveContainer" containerID="ceb6066414da4868afecf730b3a5210fca32f0b6d27f64f35a9544410304c2c8"
Mar 10 22:15:30 crc kubenswrapper[4919]: I0310 22:15:30.877706 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-86mcr"]
Mar 10 22:15:30 crc kubenswrapper[4919]: I0310 22:15:30.885124 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-86mcr"]
Mar 10 22:15:30 crc kubenswrapper[4919]: I0310 22:15:30.896743 4919 scope.go:117] "RemoveContainer" containerID="e863a8bba3e233f423b03dbbe3000f36aaa875042f656eea015e9df4f2d3c2f8"
Mar 10 22:15:30 crc kubenswrapper[4919]: I0310 22:15:30.941681 4919 scope.go:117] "RemoveContainer" containerID="14e547a91f6c17ebfbcfbea5c1bcd030d1bb1a58a87b7aed45b42aac44b056e7"
Mar 10 22:15:30 crc kubenswrapper[4919]: E0310 22:15:30.942773 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"14e547a91f6c17ebfbcfbea5c1bcd030d1bb1a58a87b7aed45b42aac44b056e7\": container with ID starting with 14e547a91f6c17ebfbcfbea5c1bcd030d1bb1a58a87b7aed45b42aac44b056e7 not found: ID does not exist" containerID="14e547a91f6c17ebfbcfbea5c1bcd030d1bb1a58a87b7aed45b42aac44b056e7"
Mar 10 22:15:30 crc kubenswrapper[4919]: I0310 22:15:30.942836 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14e547a91f6c17ebfbcfbea5c1bcd030d1bb1a58a87b7aed45b42aac44b056e7"} err="failed to get container status \"14e547a91f6c17ebfbcfbea5c1bcd030d1bb1a58a87b7aed45b42aac44b056e7\": rpc error: code = NotFound desc = could not find container \"14e547a91f6c17ebfbcfbea5c1bcd030d1bb1a58a87b7aed45b42aac44b056e7\": container with ID starting with 14e547a91f6c17ebfbcfbea5c1bcd030d1bb1a58a87b7aed45b42aac44b056e7 not found: ID does not exist"
Mar 10 22:15:30 crc kubenswrapper[4919]: I0310 22:15:30.942882 4919 scope.go:117] "RemoveContainer" containerID="ceb6066414da4868afecf730b3a5210fca32f0b6d27f64f35a9544410304c2c8"
Mar 10 22:15:30 crc kubenswrapper[4919]: E0310 22:15:30.943347 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ceb6066414da4868afecf730b3a5210fca32f0b6d27f64f35a9544410304c2c8\": container with ID starting with ceb6066414da4868afecf730b3a5210fca32f0b6d27f64f35a9544410304c2c8 not found: ID does not exist" containerID="ceb6066414da4868afecf730b3a5210fca32f0b6d27f64f35a9544410304c2c8"
Mar 10 22:15:30 crc kubenswrapper[4919]: I0310 22:15:30.943475 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ceb6066414da4868afecf730b3a5210fca32f0b6d27f64f35a9544410304c2c8"} err="failed to get container status \"ceb6066414da4868afecf730b3a5210fca32f0b6d27f64f35a9544410304c2c8\": rpc error: code = NotFound desc = could not find container \"ceb6066414da4868afecf730b3a5210fca32f0b6d27f64f35a9544410304c2c8\": container with ID starting with ceb6066414da4868afecf730b3a5210fca32f0b6d27f64f35a9544410304c2c8 not found: ID does not exist"
Mar 10 22:15:30 crc kubenswrapper[4919]: I0310 22:15:30.943524 4919 scope.go:117] "RemoveContainer" containerID="e863a8bba3e233f423b03dbbe3000f36aaa875042f656eea015e9df4f2d3c2f8"
Mar 10 22:15:30 crc kubenswrapper[4919]: E0310 22:15:30.943952 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e863a8bba3e233f423b03dbbe3000f36aaa875042f656eea015e9df4f2d3c2f8\": container with ID starting with e863a8bba3e233f423b03dbbe3000f36aaa875042f656eea015e9df4f2d3c2f8 not found: ID does not exist" containerID="e863a8bba3e233f423b03dbbe3000f36aaa875042f656eea015e9df4f2d3c2f8"
Mar 10 22:15:30 crc kubenswrapper[4919]: I0310 22:15:30.943989 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e863a8bba3e233f423b03dbbe3000f36aaa875042f656eea015e9df4f2d3c2f8"} err="failed to get container status \"e863a8bba3e233f423b03dbbe3000f36aaa875042f656eea015e9df4f2d3c2f8\": rpc error: code = NotFound desc = could not find container \"e863a8bba3e233f423b03dbbe3000f36aaa875042f656eea015e9df4f2d3c2f8\": container with ID starting with e863a8bba3e233f423b03dbbe3000f36aaa875042f656eea015e9df4f2d3c2f8 not found: ID does not exist"
Mar 10 22:15:31 crc kubenswrapper[4919]: I0310 22:15:31.492688 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="63307c10-4c05-4bc7-8db7-fbd6f51a8f37" path="/var/lib/kubelet/pods/63307c10-4c05-4bc7-8db7-fbd6f51a8f37/volumes"
Mar 10 22:15:41 crc kubenswrapper[4919]: I0310 22:15:41.024951 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-xlzz4"]
Mar 10 22:15:41 crc kubenswrapper[4919]: E0310 22:15:41.026035 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63307c10-4c05-4bc7-8db7-fbd6f51a8f37" containerName="extract-utilities"
Mar 10 22:15:41 crc kubenswrapper[4919]: I0310 22:15:41.026054 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="63307c10-4c05-4bc7-8db7-fbd6f51a8f37" containerName="extract-utilities"
Mar 10 22:15:41 crc kubenswrapper[4919]: E0310 22:15:41.026073 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63307c10-4c05-4bc7-8db7-fbd6f51a8f37" containerName="extract-content"
Mar 10 22:15:41 crc kubenswrapper[4919]: I0310 22:15:41.026083 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="63307c10-4c05-4bc7-8db7-fbd6f51a8f37" containerName="extract-content"
Mar 10 22:15:41 crc kubenswrapper[4919]: E0310 22:15:41.026097 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63307c10-4c05-4bc7-8db7-fbd6f51a8f37" containerName="registry-server"
Mar 10 22:15:41 crc kubenswrapper[4919]: I0310 22:15:41.026105 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="63307c10-4c05-4bc7-8db7-fbd6f51a8f37" containerName="registry-server"
Mar 10 22:15:41 crc kubenswrapper[4919]: I0310 22:15:41.026330 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="63307c10-4c05-4bc7-8db7-fbd6f51a8f37" containerName="registry-server"
Mar 10 22:15:41 crc kubenswrapper[4919]: I0310 22:15:41.027105 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-xlzz4"
Mar 10 22:15:41 crc kubenswrapper[4919]: I0310 22:15:41.029708 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret"
Mar 10 22:15:41 crc kubenswrapper[4919]: I0310 22:15:41.115226 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-xlzz4"]
Mar 10 22:15:41 crc kubenswrapper[4919]: I0310 22:15:41.148992 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-dn8pf"]
Mar 10 22:15:41 crc kubenswrapper[4919]: I0310 22:15:41.191711 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2vf95\" (UniqueName: \"kubernetes.io/projected/76a514a0-0d4c-4f6b-8ba7-cd5b4834d625-kube-api-access-2vf95\") pod \"root-account-create-update-xlzz4\" (UID: \"76a514a0-0d4c-4f6b-8ba7-cd5b4834d625\") " pod="openstack/root-account-create-update-xlzz4"
Mar 10 22:15:41 crc kubenswrapper[4919]: I0310 22:15:41.191857 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/76a514a0-0d4c-4f6b-8ba7-cd5b4834d625-operator-scripts\") pod \"root-account-create-update-xlzz4\" (UID: \"76a514a0-0d4c-4f6b-8ba7-cd5b4834d625\") " pod="openstack/root-account-create-update-xlzz4"
Mar 10 22:15:41 crc kubenswrapper[4919]: I0310 22:15:41.222338 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-dn8pf"]
Mar 10 22:15:41 crc kubenswrapper[4919]: I0310 22:15:41.269874 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"]
Mar 10 22:15:41 crc kubenswrapper[4919]: I0310 22:15:41.270417 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstackclient" podUID="6bff1404-f9b1-48f8-b093-95c3bb206c6a" containerName="openstackclient" containerID="cri-o://9356369f3992f46c3635f399cef0e5e3d0db351590dc5b463547b642e4ea9e30" gracePeriod=2
Mar 10 22:15:41 crc kubenswrapper[4919]: I0310 22:15:41.294054 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2vf95\" (UniqueName: \"kubernetes.io/projected/76a514a0-0d4c-4f6b-8ba7-cd5b4834d625-kube-api-access-2vf95\") pod \"root-account-create-update-xlzz4\" (UID: \"76a514a0-0d4c-4f6b-8ba7-cd5b4834d625\") " pod="openstack/root-account-create-update-xlzz4"
Mar 10 22:15:41 crc kubenswrapper[4919]: I0310 22:15:41.294260 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/76a514a0-0d4c-4f6b-8ba7-cd5b4834d625-operator-scripts\") pod \"root-account-create-update-xlzz4\" (UID: \"76a514a0-0d4c-4f6b-8ba7-cd5b4834d625\") " pod="openstack/root-account-create-update-xlzz4"
Mar 10 22:15:41 crc kubenswrapper[4919]: I0310 22:15:41.295165 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/76a514a0-0d4c-4f6b-8ba7-cd5b4834d625-operator-scripts\") pod \"root-account-create-update-xlzz4\" (UID: \"76a514a0-0d4c-4f6b-8ba7-cd5b4834d625\") " pod="openstack/root-account-create-update-xlzz4"
Mar 10 22:15:41 crc kubenswrapper[4919]: I0310 22:15:41.314476 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"]
Mar 10 22:15:41 crc kubenswrapper[4919]: I0310 22:15:41.341827 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-5e3c-account-create-update-m2swx"]
Mar 10 22:15:41 crc kubenswrapper[4919]: I0310 22:15:41.348001 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-5e3c-account-create-update-m2swx"]
Mar 10 22:15:41 crc kubenswrapper[4919]: I0310 22:15:41.356092 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2vf95\" (UniqueName: \"kubernetes.io/projected/76a514a0-0d4c-4f6b-8ba7-cd5b4834d625-kube-api-access-2vf95\") pod \"root-account-create-update-xlzz4\" (UID: \"76a514a0-0d4c-4f6b-8ba7-cd5b4834d625\") " pod="openstack/root-account-create-update-xlzz4"
Mar 10 22:15:41 crc kubenswrapper[4919]: I0310 22:15:41.409437 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-8e40-account-create-update-c8zqv"]
Mar 10 22:15:41 crc kubenswrapper[4919]: I0310 22:15:41.417503 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-8e40-account-create-update-c8zqv"]
Mar 10 22:15:41 crc kubenswrapper[4919]: I0310 22:15:41.431021 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-8e40-account-create-update-kkddk"]
Mar 10 22:15:41 crc kubenswrapper[4919]: E0310 22:15:41.434749 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6bff1404-f9b1-48f8-b093-95c3bb206c6a" containerName="openstackclient"
Mar 10 22:15:41 crc kubenswrapper[4919]: I0310 22:15:41.434787 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bff1404-f9b1-48f8-b093-95c3bb206c6a" containerName="openstackclient"
Mar 10 22:15:41 crc kubenswrapper[4919]: I0310 22:15:41.435107 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="6bff1404-f9b1-48f8-b093-95c3bb206c6a" containerName="openstackclient"
Mar 10 22:15:41 crc kubenswrapper[4919]: I0310 22:15:41.435797 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-8e40-account-create-update-kkddk"
Mar 10 22:15:41 crc kubenswrapper[4919]: I0310 22:15:41.447424 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret"
Mar 10 22:15:41 crc kubenswrapper[4919]: I0310 22:15:41.469795 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-8e40-account-create-update-kkddk"]
Mar 10 22:15:41 crc kubenswrapper[4919]: I0310 22:15:41.495871 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5759d1d2-d713-4e24-a2fb-c1c6804a4c39" path="/var/lib/kubelet/pods/5759d1d2-d713-4e24-a2fb-c1c6804a4c39/volumes"
Mar 10 22:15:41 crc kubenswrapper[4919]: I0310 22:15:41.496611 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6399309a-f5f6-4a74-ac8c-8e806984cee9" path="/var/lib/kubelet/pods/6399309a-f5f6-4a74-ac8c-8e806984cee9/volumes"
Mar 10 22:15:41 crc kubenswrapper[4919]: I0310 22:15:41.497204 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d920fbb-265a-4fbd-8a5b-02a8fdcf216f" path="/var/lib/kubelet/pods/9d920fbb-265a-4fbd-8a5b-02a8fdcf216f/volumes"
Mar 10 22:15:41 crc kubenswrapper[4919]: I0310 22:15:41.576912 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-nb-0"]
Mar 10 22:15:41 crc kubenswrapper[4919]: I0310 22:15:41.578746 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-nb-0" podUID="9e69198f-c4ad-40c4-b0f4-1a6e9dd17940" containerName="openstack-network-exporter" containerID="cri-o://1cbaf6bf606b116c1dcc6c8ceb022ba5bf13acdeef539d78bf1c08e4dba722aa" gracePeriod=300
Mar 10 22:15:41 crc kubenswrapper[4919]: I0310 22:15:41.607443 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-whlhh\" (UniqueName: \"kubernetes.io/projected/ec49f65c-e8af-44a1-b464-af2a86b299fc-kube-api-access-whlhh\") pod \"barbican-8e40-account-create-update-kkddk\" (UID: \"ec49f65c-e8af-44a1-b464-af2a86b299fc\") " pod="openstack/barbican-8e40-account-create-update-kkddk"
Mar 10 22:15:41 crc kubenswrapper[4919]: I0310 22:15:41.607579 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ec49f65c-e8af-44a1-b464-af2a86b299fc-operator-scripts\") pod \"barbican-8e40-account-create-update-kkddk\" (UID: \"ec49f65c-e8af-44a1-b464-af2a86b299fc\") " pod="openstack/barbican-8e40-account-create-update-kkddk"
Mar 10 22:15:41 crc kubenswrapper[4919]: E0310 22:15:41.608267 4919 configmap.go:193] Couldn't get configMap openstack/ovnnorthd-scripts: configmap "ovnnorthd-scripts" not found
Mar 10 22:15:41 crc kubenswrapper[4919]: E0310 22:15:41.608331 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f4c91bdf-1945-4c8a-ab05-3b619e4bdb2b-scripts podName:f4c91bdf-1945-4c8a-ab05-3b619e4bdb2b nodeName:}" failed. No retries permitted until 2026-03-10 22:15:42.108313388 +0000 UTC m=+1529.350193996 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "scripts" (UniqueName: "kubernetes.io/configmap/f4c91bdf-1945-4c8a-ab05-3b619e4bdb2b-scripts") pod "ovn-northd-0" (UID: "f4c91bdf-1945-4c8a-ab05-3b619e4bdb2b") : configmap "ovnnorthd-scripts" not found
Mar 10 22:15:41 crc kubenswrapper[4919]: E0310 22:15:41.608539 4919 configmap.go:193] Couldn't get configMap openstack/ovnnorthd-config: configmap "ovnnorthd-config" not found
Mar 10 22:15:41 crc kubenswrapper[4919]: E0310 22:15:41.608571 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f4c91bdf-1945-4c8a-ab05-3b619e4bdb2b-config podName:f4c91bdf-1945-4c8a-ab05-3b619e4bdb2b nodeName:}" failed. No retries permitted until 2026-03-10 22:15:42.108564515 +0000 UTC m=+1529.350445123 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/f4c91bdf-1945-4c8a-ab05-3b619e4bdb2b-config") pod "ovn-northd-0" (UID: "f4c91bdf-1945-4c8a-ab05-3b619e4bdb2b") : configmap "ovnnorthd-config" not found
Mar 10 22:15:41 crc kubenswrapper[4919]: I0310 22:15:41.645569 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Mar 10 22:15:41 crc kubenswrapper[4919]: I0310 22:15:41.648801 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-xlzz4"
Mar 10 22:15:41 crc kubenswrapper[4919]: I0310 22:15:41.674163 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-45c6-account-create-update-zczhm"]
Mar 10 22:15:41 crc kubenswrapper[4919]: I0310 22:15:41.675438 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-45c6-account-create-update-zczhm"
Mar 10 22:15:41 crc kubenswrapper[4919]: I0310 22:15:41.680441 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret"
Mar 10 22:15:41 crc kubenswrapper[4919]: I0310 22:15:41.712403 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-whlhh\" (UniqueName: \"kubernetes.io/projected/ec49f65c-e8af-44a1-b464-af2a86b299fc-kube-api-access-whlhh\") pod \"barbican-8e40-account-create-update-kkddk\" (UID: \"ec49f65c-e8af-44a1-b464-af2a86b299fc\") " pod="openstack/barbican-8e40-account-create-update-kkddk"
Mar 10 22:15:41 crc kubenswrapper[4919]: I0310 22:15:41.712537 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ec49f65c-e8af-44a1-b464-af2a86b299fc-operator-scripts\") pod \"barbican-8e40-account-create-update-kkddk\" (UID: \"ec49f65c-e8af-44a1-b464-af2a86b299fc\") " pod="openstack/barbican-8e40-account-create-update-kkddk"
Mar 10 22:15:41 crc kubenswrapper[4919]: I0310 22:15:41.713219 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ec49f65c-e8af-44a1-b464-af2a86b299fc-operator-scripts\") pod \"barbican-8e40-account-create-update-kkddk\" (UID: \"ec49f65c-e8af-44a1-b464-af2a86b299fc\") " pod="openstack/barbican-8e40-account-create-update-kkddk"
Mar 10 22:15:41 crc kubenswrapper[4919]: I0310 22:15:41.719678 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-9e8c-account-create-update-6q4j2"]
Mar 10 22:15:41 crc kubenswrapper[4919]: I0310 22:15:41.720809 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-9e8c-account-create-update-6q4j2"
Mar 10 22:15:41 crc kubenswrapper[4919]: I0310 22:15:41.733069 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret"
Mar 10 22:15:41 crc kubenswrapper[4919]: I0310 22:15:41.740821 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-45c6-account-create-update-zczhm"]
Mar 10 22:15:41 crc kubenswrapper[4919]: I0310 22:15:41.756453 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-105f-account-create-update-g2tcv"]
Mar 10 22:15:41 crc kubenswrapper[4919]: I0310 22:15:41.769628 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-105f-account-create-update-g2tcv"]
Mar 10 22:15:41 crc kubenswrapper[4919]: I0310 22:15:41.774339 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-whlhh\" (UniqueName: \"kubernetes.io/projected/ec49f65c-e8af-44a1-b464-af2a86b299fc-kube-api-access-whlhh\") pod \"barbican-8e40-account-create-update-kkddk\" (UID: \"ec49f65c-e8af-44a1-b464-af2a86b299fc\") " pod="openstack/barbican-8e40-account-create-update-kkddk"
Mar 10 22:15:41 crc kubenswrapper[4919]: I0310 22:15:41.787883 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-8e40-account-create-update-kkddk"
Mar 10 22:15:41 crc kubenswrapper[4919]: I0310 22:15:41.787969 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-9e8c-account-create-update-6q4j2"]
Mar 10 22:15:41 crc kubenswrapper[4919]: I0310 22:15:41.813564 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1c34af84-e5e2-4219-b7b2-bf1e2c2a731b-operator-scripts\") pod \"neutron-45c6-account-create-update-zczhm\" (UID: \"1c34af84-e5e2-4219-b7b2-bf1e2c2a731b\") " pod="openstack/neutron-45c6-account-create-update-zczhm"
Mar 10 22:15:41 crc kubenswrapper[4919]: I0310 22:15:41.813956 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a01f4397-9fee-4ff4-af76-ed0b37f04b28-operator-scripts\") pod \"glance-9e8c-account-create-update-6q4j2\" (UID: \"a01f4397-9fee-4ff4-af76-ed0b37f04b28\") " pod="openstack/glance-9e8c-account-create-update-6q4j2"
Mar 10 22:15:41 crc kubenswrapper[4919]: I0310 22:15:41.814083 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tc4lr\" (UniqueName: \"kubernetes.io/projected/a01f4397-9fee-4ff4-af76-ed0b37f04b28-kube-api-access-tc4lr\") pod \"glance-9e8c-account-create-update-6q4j2\" (UID: \"a01f4397-9fee-4ff4-af76-ed0b37f04b28\") " pod="openstack/glance-9e8c-account-create-update-6q4j2"
Mar 10 22:15:41 crc kubenswrapper[4919]: I0310 22:15:41.814218 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdhrj\" (UniqueName: \"kubernetes.io/projected/1c34af84-e5e2-4219-b7b2-bf1e2c2a731b-kube-api-access-kdhrj\") pod \"neutron-45c6-account-create-update-zczhm\" (UID: \"1c34af84-e5e2-4219-b7b2-bf1e2c2a731b\") " pod="openstack/neutron-45c6-account-create-update-zczhm"
Mar 10 22:15:41 crc kubenswrapper[4919]: I0310 22:15:41.820459 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-0"]
Mar 10 22:15:41 crc kubenswrapper[4919]: I0310 22:15:41.820955 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-sb-0" podUID="8782e985-ff23-4580-bdfb-ef2dd9b540bc" containerName="openstack-network-exporter" containerID="cri-o://fa482cd6ab218644c93efa05d2493a56acc39a947142b9aa93cab897fbd8faee" gracePeriod=300
Mar 10 22:15:41 crc kubenswrapper[4919]: I0310 22:15:41.865205 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-cdcf-account-create-update-qh8rx"]
Mar 10 22:15:41 crc kubenswrapper[4919]: I0310 22:15:41.866431 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-cdcf-account-create-update-qh8rx"
Mar 10 22:15:41 crc kubenswrapper[4919]: I0310 22:15:41.876676 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret"
Mar 10 22:15:41 crc kubenswrapper[4919]: I0310 22:15:41.885151 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-cdcf-account-create-update-qh8rx"]
Mar 10 22:15:41 crc kubenswrapper[4919]: I0310 22:15:41.908160 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-northd-0"]
Mar 10 22:15:41 crc kubenswrapper[4919]: I0310 22:15:41.919814 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tc4lr\" (UniqueName: \"kubernetes.io/projected/a01f4397-9fee-4ff4-af76-ed0b37f04b28-kube-api-access-tc4lr\") pod \"glance-9e8c-account-create-update-6q4j2\" (UID: \"a01f4397-9fee-4ff4-af76-ed0b37f04b28\") " pod="openstack/glance-9e8c-account-create-update-6q4j2"
Mar 10 22:15:41 crc kubenswrapper[4919]: I0310 22:15:41.919909 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kdhrj\" (UniqueName: \"kubernetes.io/projected/1c34af84-e5e2-4219-b7b2-bf1e2c2a731b-kube-api-access-kdhrj\") pod \"neutron-45c6-account-create-update-zczhm\" (UID: \"1c34af84-e5e2-4219-b7b2-bf1e2c2a731b\") " pod="openstack/neutron-45c6-account-create-update-zczhm"
Mar 10 22:15:41 crc kubenswrapper[4919]: I0310 22:15:41.919948 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1c34af84-e5e2-4219-b7b2-bf1e2c2a731b-operator-scripts\") pod \"neutron-45c6-account-create-update-zczhm\" (UID: \"1c34af84-e5e2-4219-b7b2-bf1e2c2a731b\") " pod="openstack/neutron-45c6-account-create-update-zczhm"
Mar 10 22:15:41 crc kubenswrapper[4919]: I0310 22:15:41.920023 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a01f4397-9fee-4ff4-af76-ed0b37f04b28-operator-scripts\") pod \"glance-9e8c-account-create-update-6q4j2\" (UID: \"a01f4397-9fee-4ff4-af76-ed0b37f04b28\") " pod="openstack/glance-9e8c-account-create-update-6q4j2"
Mar 10 22:15:41 crc kubenswrapper[4919]: I0310 22:15:41.920784 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a01f4397-9fee-4ff4-af76-ed0b37f04b28-operator-scripts\") pod \"glance-9e8c-account-create-update-6q4j2\" (UID: \"a01f4397-9fee-4ff4-af76-ed0b37f04b28\") " pod="openstack/glance-9e8c-account-create-update-6q4j2"
Mar 10 22:15:41 crc kubenswrapper[4919]: I0310 22:15:41.921179 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1c34af84-e5e2-4219-b7b2-bf1e2c2a731b-operator-scripts\") pod \"neutron-45c6-account-create-update-zczhm\" (UID: \"1c34af84-e5e2-4219-b7b2-bf1e2c2a731b\") " pod="openstack/neutron-45c6-account-create-update-zczhm"
Mar 10 22:15:41 crc kubenswrapper[4919]: I0310 22:15:41.950445 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-dff6-account-create-update-8l4wr"]
Mar 10 22:15:41 crc kubenswrapper[4919]: I0310 22:15:41.951741 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-dff6-account-create-update-8l4wr"
Mar 10 22:15:41 crc kubenswrapper[4919]: I0310 22:15:41.962211 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret"
Mar 10 22:15:42 crc kubenswrapper[4919]: I0310 22:15:42.006619 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-dff6-account-create-update-8l4wr"]
Mar 10 22:15:42 crc kubenswrapper[4919]: I0310 22:15:42.015794 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-nb-0" podUID="9e69198f-c4ad-40c4-b0f4-1a6e9dd17940" containerName="ovsdbserver-nb" containerID="cri-o://7f67034bb3fc927d4123b73a239402611e74fab959e166ee008ced152b098c44" gracePeriod=300
Mar 10 22:15:42 crc kubenswrapper[4919]: I0310 22:15:42.016851 4919 generic.go:334] "Generic (PLEG): container finished" podID="9e69198f-c4ad-40c4-b0f4-1a6e9dd17940" containerID="1cbaf6bf606b116c1dcc6c8ceb022ba5bf13acdeef539d78bf1c08e4dba722aa" exitCode=2
Mar 10 22:15:42 crc kubenswrapper[4919]: I0310 22:15:42.017008 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"9e69198f-c4ad-40c4-b0f4-1a6e9dd17940","Type":"ContainerDied","Data":"1cbaf6bf606b116c1dcc6c8ceb022ba5bf13acdeef539d78bf1c08e4dba722aa"}
Mar 10 22:15:42 crc kubenswrapper[4919]: I0310 22:15:42.017153 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-northd-0" podUID="f4c91bdf-1945-4c8a-ab05-3b619e4bdb2b" containerName="ovn-northd" containerID="cri-o://b0faad1e090c5549c73d125d875997379a03b88c3b976099f2f3aa7ea1f1ca9a" gracePeriod=30
Mar 10 22:15:42 crc kubenswrapper[4919]: I0310 22:15:42.017275 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-northd-0" podUID="f4c91bdf-1945-4c8a-ab05-3b619e4bdb2b" containerName="openstack-network-exporter" containerID="cri-o://95aec50666a5cbb9eb2fc08bcc44915e765c29007dcbf5a2bca002bcee7be03b" gracePeriod=30
Mar 10 22:15:42 crc kubenswrapper[4919]: I0310 22:15:42.022959 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-76m5t\" (UniqueName: \"kubernetes.io/projected/9d5850b9-d946-4b1a-9171-718243c78596-kube-api-access-76m5t\") pod \"nova-api-cdcf-account-create-update-qh8rx\" (UID: \"9d5850b9-d946-4b1a-9171-718243c78596\") " pod="openstack/nova-api-cdcf-account-create-update-qh8rx"
Mar 10 22:15:42 crc kubenswrapper[4919]: I0310 22:15:42.023085 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9d5850b9-d946-4b1a-9171-718243c78596-operator-scripts\") pod \"nova-api-cdcf-account-create-update-qh8rx\" (UID: \"9d5850b9-d946-4b1a-9171-718243c78596\") " pod="openstack/nova-api-cdcf-account-create-update-qh8rx"
Mar 10 22:15:42 crc kubenswrapper[4919]: I0310 22:15:42.023162 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qwxxg\" (UniqueName: \"kubernetes.io/projected/a946243a-c6de-4499-9c3c-c11073d02f8e-kube-api-access-qwxxg\") pod \"nova-cell0-dff6-account-create-update-8l4wr\" (UID: \"a946243a-c6de-4499-9c3c-c11073d02f8e\") " pod="openstack/nova-cell0-dff6-account-create-update-8l4wr"
Mar 10 22:15:42 crc kubenswrapper[4919]: I0310 22:15:42.023211 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a946243a-c6de-4499-9c3c-c11073d02f8e-operator-scripts\") pod \"nova-cell0-dff6-account-create-update-8l4wr\" (UID: \"a946243a-c6de-4499-9c3c-c11073d02f8e\") " pod="openstack/nova-cell0-dff6-account-create-update-8l4wr"
Mar 10 22:15:42 crc kubenswrapper[4919]: I0310 22:15:42.031631 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-ea40-account-create-update-7bp9g"]
Mar 10 22:15:42 crc kubenswrapper[4919]: I0310 22:15:42.037890 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-ea40-account-create-update-7bp9g"
Mar 10 22:15:42 crc kubenswrapper[4919]: I0310 22:15:42.057666 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret"
Mar 10 22:15:42 crc kubenswrapper[4919]: I0310 22:15:42.062250 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdhrj\" (UniqueName: \"kubernetes.io/projected/1c34af84-e5e2-4219-b7b2-bf1e2c2a731b-kube-api-access-kdhrj\") pod \"neutron-45c6-account-create-update-zczhm\" (UID: \"1c34af84-e5e2-4219-b7b2-bf1e2c2a731b\") " pod="openstack/neutron-45c6-account-create-update-zczhm"
Mar 10 22:15:42 crc kubenswrapper[4919]: I0310 22:15:42.075431 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tc4lr\" (UniqueName: \"kubernetes.io/projected/a01f4397-9fee-4ff4-af76-ed0b37f04b28-kube-api-access-tc4lr\") pod \"glance-9e8c-account-create-update-6q4j2\" (UID: \"a01f4397-9fee-4ff4-af76-ed0b37f04b28\") " pod="openstack/glance-9e8c-account-create-update-6q4j2"
Mar 10 22:15:42 crc kubenswrapper[4919]: I0310 22:15:42.093678 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-45c6-account-create-update-cz7k8"]
Mar 10 22:15:42 crc kubenswrapper[4919]: E0310 22:15:42.099582 4919 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b0faad1e090c5549c73d125d875997379a03b88c3b976099f2f3aa7ea1f1ca9a"
cmd=["/usr/local/bin/container-scripts/status_check.sh"] Mar 10 22:15:42 crc kubenswrapper[4919]: E0310 22:15:42.103542 4919 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b0faad1e090c5549c73d125d875997379a03b88c3b976099f2f3aa7ea1f1ca9a" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Mar 10 22:15:42 crc kubenswrapper[4919]: I0310 22:15:42.125786 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9d5850b9-d946-4b1a-9171-718243c78596-operator-scripts\") pod \"nova-api-cdcf-account-create-update-qh8rx\" (UID: \"9d5850b9-d946-4b1a-9171-718243c78596\") " pod="openstack/nova-api-cdcf-account-create-update-qh8rx" Mar 10 22:15:42 crc kubenswrapper[4919]: I0310 22:15:42.125883 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qwxxg\" (UniqueName: \"kubernetes.io/projected/a946243a-c6de-4499-9c3c-c11073d02f8e-kube-api-access-qwxxg\") pod \"nova-cell0-dff6-account-create-update-8l4wr\" (UID: \"a946243a-c6de-4499-9c3c-c11073d02f8e\") " pod="openstack/nova-cell0-dff6-account-create-update-8l4wr" Mar 10 22:15:42 crc kubenswrapper[4919]: I0310 22:15:42.125930 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a946243a-c6de-4499-9c3c-c11073d02f8e-operator-scripts\") pod \"nova-cell0-dff6-account-create-update-8l4wr\" (UID: \"a946243a-c6de-4499-9c3c-c11073d02f8e\") " pod="openstack/nova-cell0-dff6-account-create-update-8l4wr" Mar 10 22:15:42 crc kubenswrapper[4919]: I0310 22:15:42.125953 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/cdcceccb-6413-4cf8-972e-7744b99c626e-operator-scripts\") pod \"nova-cell1-ea40-account-create-update-7bp9g\" (UID: \"cdcceccb-6413-4cf8-972e-7744b99c626e\") " pod="openstack/nova-cell1-ea40-account-create-update-7bp9g" Mar 10 22:15:42 crc kubenswrapper[4919]: I0310 22:15:42.125984 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-76m5t\" (UniqueName: \"kubernetes.io/projected/9d5850b9-d946-4b1a-9171-718243c78596-kube-api-access-76m5t\") pod \"nova-api-cdcf-account-create-update-qh8rx\" (UID: \"9d5850b9-d946-4b1a-9171-718243c78596\") " pod="openstack/nova-api-cdcf-account-create-update-qh8rx" Mar 10 22:15:42 crc kubenswrapper[4919]: I0310 22:15:42.126040 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5rtgk\" (UniqueName: \"kubernetes.io/projected/cdcceccb-6413-4cf8-972e-7744b99c626e-kube-api-access-5rtgk\") pod \"nova-cell1-ea40-account-create-update-7bp9g\" (UID: \"cdcceccb-6413-4cf8-972e-7744b99c626e\") " pod="openstack/nova-cell1-ea40-account-create-update-7bp9g" Mar 10 22:15:42 crc kubenswrapper[4919]: I0310 22:15:42.126969 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9d5850b9-d946-4b1a-9171-718243c78596-operator-scripts\") pod \"nova-api-cdcf-account-create-update-qh8rx\" (UID: \"9d5850b9-d946-4b1a-9171-718243c78596\") " pod="openstack/nova-api-cdcf-account-create-update-qh8rx" Mar 10 22:15:42 crc kubenswrapper[4919]: E0310 22:15:42.127025 4919 configmap.go:193] Couldn't get configMap openstack/ovnnorthd-config: configmap "ovnnorthd-config" not found Mar 10 22:15:42 crc kubenswrapper[4919]: E0310 22:15:42.127065 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f4c91bdf-1945-4c8a-ab05-3b619e4bdb2b-config podName:f4c91bdf-1945-4c8a-ab05-3b619e4bdb2b nodeName:}" failed. 
No retries permitted until 2026-03-10 22:15:43.127051382 +0000 UTC m=+1530.368931990 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/f4c91bdf-1945-4c8a-ab05-3b619e4bdb2b-config") pod "ovn-northd-0" (UID: "f4c91bdf-1945-4c8a-ab05-3b619e4bdb2b") : configmap "ovnnorthd-config" not found Mar 10 22:15:42 crc kubenswrapper[4919]: E0310 22:15:42.127515 4919 configmap.go:193] Couldn't get configMap openstack/ovnnorthd-scripts: configmap "ovnnorthd-scripts" not found Mar 10 22:15:42 crc kubenswrapper[4919]: E0310 22:15:42.127540 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f4c91bdf-1945-4c8a-ab05-3b619e4bdb2b-scripts podName:f4c91bdf-1945-4c8a-ab05-3b619e4bdb2b nodeName:}" failed. No retries permitted until 2026-03-10 22:15:43.127531805 +0000 UTC m=+1530.369412413 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "scripts" (UniqueName: "kubernetes.io/configmap/f4c91bdf-1945-4c8a-ab05-3b619e4bdb2b-scripts") pod "ovn-northd-0" (UID: "f4c91bdf-1945-4c8a-ab05-3b619e4bdb2b") : configmap "ovnnorthd-scripts" not found Mar 10 22:15:42 crc kubenswrapper[4919]: E0310 22:15:42.128332 4919 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b0faad1e090c5549c73d125d875997379a03b88c3b976099f2f3aa7ea1f1ca9a" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Mar 10 22:15:42 crc kubenswrapper[4919]: E0310 22:15:42.128372 4919 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-northd-0" podUID="f4c91bdf-1945-4c8a-ab05-3b619e4bdb2b" containerName="ovn-northd" Mar 10 22:15:42 crc kubenswrapper[4919]: I0310 22:15:42.128538 4919 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a946243a-c6de-4499-9c3c-c11073d02f8e-operator-scripts\") pod \"nova-cell0-dff6-account-create-update-8l4wr\" (UID: \"a946243a-c6de-4499-9c3c-c11073d02f8e\") " pod="openstack/nova-cell0-dff6-account-create-update-8l4wr" Mar 10 22:15:42 crc kubenswrapper[4919]: I0310 22:15:42.134533 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-45c6-account-create-update-cz7k8"] Mar 10 22:15:42 crc kubenswrapper[4919]: I0310 22:15:42.147620 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-ea40-account-create-update-7bp9g"] Mar 10 22:15:42 crc kubenswrapper[4919]: I0310 22:15:42.195070 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-q9dwb"] Mar 10 22:15:42 crc kubenswrapper[4919]: I0310 22:15:42.222816 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-q9dwb"] Mar 10 22:15:42 crc kubenswrapper[4919]: I0310 22:15:42.227478 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cdcceccb-6413-4cf8-972e-7744b99c626e-operator-scripts\") pod \"nova-cell1-ea40-account-create-update-7bp9g\" (UID: \"cdcceccb-6413-4cf8-972e-7744b99c626e\") " pod="openstack/nova-cell1-ea40-account-create-update-7bp9g" Mar 10 22:15:42 crc kubenswrapper[4919]: I0310 22:15:42.227570 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5rtgk\" (UniqueName: \"kubernetes.io/projected/cdcceccb-6413-4cf8-972e-7744b99c626e-kube-api-access-5rtgk\") pod \"nova-cell1-ea40-account-create-update-7bp9g\" (UID: \"cdcceccb-6413-4cf8-972e-7744b99c626e\") " pod="openstack/nova-cell1-ea40-account-create-update-7bp9g" Mar 10 22:15:42 crc kubenswrapper[4919]: I0310 22:15:42.229123 4919 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cdcceccb-6413-4cf8-972e-7744b99c626e-operator-scripts\") pod \"nova-cell1-ea40-account-create-update-7bp9g\" (UID: \"cdcceccb-6413-4cf8-972e-7744b99c626e\") " pod="openstack/nova-cell1-ea40-account-create-update-7bp9g" Mar 10 22:15:42 crc kubenswrapper[4919]: I0310 22:15:42.277517 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-76m5t\" (UniqueName: \"kubernetes.io/projected/9d5850b9-d946-4b1a-9171-718243c78596-kube-api-access-76m5t\") pod \"nova-api-cdcf-account-create-update-qh8rx\" (UID: \"9d5850b9-d946-4b1a-9171-718243c78596\") " pod="openstack/nova-api-cdcf-account-create-update-qh8rx" Mar 10 22:15:42 crc kubenswrapper[4919]: I0310 22:15:42.343574 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5rtgk\" (UniqueName: \"kubernetes.io/projected/cdcceccb-6413-4cf8-972e-7744b99c626e-kube-api-access-5rtgk\") pod \"nova-cell1-ea40-account-create-update-7bp9g\" (UID: \"cdcceccb-6413-4cf8-972e-7744b99c626e\") " pod="openstack/nova-cell1-ea40-account-create-update-7bp9g" Mar 10 22:15:42 crc kubenswrapper[4919]: I0310 22:15:42.367438 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-45c6-account-create-update-zczhm" Mar 10 22:15:42 crc kubenswrapper[4919]: I0310 22:15:42.418078 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 10 22:15:42 crc kubenswrapper[4919]: I0310 22:15:42.442758 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-9e8c-account-create-update-6q4j2" Mar 10 22:15:42 crc kubenswrapper[4919]: I0310 22:15:42.450126 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-cdcf-account-create-update-qh8rx" Mar 10 22:15:42 crc kubenswrapper[4919]: I0310 22:15:42.480727 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ovs-5wz82"] Mar 10 22:15:42 crc kubenswrapper[4919]: I0310 22:15:42.585048 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-ea40-account-create-update-7bp9g" Mar 10 22:15:42 crc kubenswrapper[4919]: I0310 22:15:42.588913 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qwxxg\" (UniqueName: \"kubernetes.io/projected/a946243a-c6de-4499-9c3c-c11073d02f8e-kube-api-access-qwxxg\") pod \"nova-cell0-dff6-account-create-update-8l4wr\" (UID: \"a946243a-c6de-4499-9c3c-c11073d02f8e\") " pod="openstack/nova-cell0-dff6-account-create-update-8l4wr" Mar 10 22:15:42 crc kubenswrapper[4919]: I0310 22:15:42.600067 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-sb-0" podUID="8782e985-ff23-4580-bdfb-ef2dd9b540bc" containerName="ovsdbserver-sb" containerID="cri-o://27c978f4a20203cc94acb177ddbfd66f76de9f4ddcde70ca6c04fb027e8101f9" gracePeriod=300 Mar 10 22:15:42 crc kubenswrapper[4919]: I0310 22:15:42.601194 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-metrics-wgnf2"] Mar 10 22:15:42 crc kubenswrapper[4919]: I0310 22:15:42.609711 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-metrics-wgnf2" podUID="e61256d0-9ca6-4524-a19d-7efd32ab9724" containerName="openstack-network-exporter" containerID="cri-o://1bb99faa0c9dbd8195614221f50fa8ce1965d14085b52684adacebddfa8030f0" gracePeriod=30 Mar 10 22:15:42 crc kubenswrapper[4919]: I0310 22:15:42.632689 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-fbfnm"] Mar 10 22:15:42 crc kubenswrapper[4919]: I0310 22:15:42.675521 4919 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openstack/glance-9e8c-account-create-update-prqsg"] Mar 10 22:15:42 crc kubenswrapper[4919]: I0310 22:15:42.703460 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-9e8c-account-create-update-prqsg"] Mar 10 22:15:42 crc kubenswrapper[4919]: I0310 22:15:42.716321 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-tbkwn"] Mar 10 22:15:42 crc kubenswrapper[4919]: I0310 22:15:42.727941 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-ea40-account-create-update-qnmrj"] Mar 10 22:15:42 crc kubenswrapper[4919]: E0310 22:15:42.737765 4919 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Mar 10 22:15:42 crc kubenswrapper[4919]: I0310 22:15:42.737991 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-tbkwn"] Mar 10 22:15:42 crc kubenswrapper[4919]: E0310 22:15:42.738705 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3fe05756-9202-4514-8eea-0c786a2b6d56-config-data podName:3fe05756-9202-4514-8eea-0c786a2b6d56 nodeName:}" failed. No retries permitted until 2026-03-10 22:15:43.238587361 +0000 UTC m=+1530.480467969 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/3fe05756-9202-4514-8eea-0c786a2b6d56-config-data") pod "rabbitmq-cell1-server-0" (UID: "3fe05756-9202-4514-8eea-0c786a2b6d56") : configmap "rabbitmq-cell1-config-data" not found Mar 10 22:15:42 crc kubenswrapper[4919]: I0310 22:15:42.748667 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-ea40-account-create-update-qnmrj"] Mar 10 22:15:42 crc kubenswrapper[4919]: I0310 22:15:42.754583 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-cdcf-account-create-update-t7ns5"] Mar 10 22:15:42 crc kubenswrapper[4919]: I0310 22:15:42.767370 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-7xmvf"] Mar 10 22:15:42 crc kubenswrapper[4919]: I0310 22:15:42.783379 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-7xmvf"] Mar 10 22:15:42 crc kubenswrapper[4919]: I0310 22:15:42.793546 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-cdcf-account-create-update-t7ns5"] Mar 10 22:15:42 crc kubenswrapper[4919]: I0310 22:15:42.801936 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-dff6-account-create-update-8l4wr" Mar 10 22:15:42 crc kubenswrapper[4919]: I0310 22:15:42.806783 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-9dd56c4d5-nbpbz"] Mar 10 22:15:42 crc kubenswrapper[4919]: I0310 22:15:42.807034 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-9dd56c4d5-nbpbz" podUID="37dac1c8-963f-466f-977e-37b2fd98d32c" containerName="dnsmasq-dns" containerID="cri-o://e21cf6248decf29aa992c186881d057311c63ce4f642d28d2d190eaeaf041479" gracePeriod=10 Mar 10 22:15:42 crc kubenswrapper[4919]: I0310 22:15:42.815462 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-dff6-account-create-update-mnrpf"] Mar 10 22:15:42 crc kubenswrapper[4919]: I0310 22:15:42.844083 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-dff6-account-create-update-mnrpf"] Mar 10 22:15:42 crc kubenswrapper[4919]: I0310 22:15:42.847374 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-jbk56"] Mar 10 22:15:42 crc kubenswrapper[4919]: I0310 22:15:42.907748 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-jbk56"] Mar 10 22:15:43 crc kubenswrapper[4919]: I0310 22:15:43.051486 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-rcb6h"] Mar 10 22:15:43 crc kubenswrapper[4919]: I0310 22:15:43.095931 4919 generic.go:334] "Generic (PLEG): container finished" podID="f4c91bdf-1945-4c8a-ab05-3b619e4bdb2b" containerID="95aec50666a5cbb9eb2fc08bcc44915e765c29007dcbf5a2bca002bcee7be03b" exitCode=2 Mar 10 22:15:43 crc kubenswrapper[4919]: I0310 22:15:43.096016 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"f4c91bdf-1945-4c8a-ab05-3b619e4bdb2b","Type":"ContainerDied","Data":"95aec50666a5cbb9eb2fc08bcc44915e765c29007dcbf5a2bca002bcee7be03b"} Mar 10 
22:15:43 crc kubenswrapper[4919]: I0310 22:15:43.132799 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-rcb6h"] Mar 10 22:15:43 crc kubenswrapper[4919]: I0310 22:15:43.140611 4919 generic.go:334] "Generic (PLEG): container finished" podID="37dac1c8-963f-466f-977e-37b2fd98d32c" containerID="e21cf6248decf29aa992c186881d057311c63ce4f642d28d2d190eaeaf041479" exitCode=0 Mar 10 22:15:43 crc kubenswrapper[4919]: I0310 22:15:43.140731 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9dd56c4d5-nbpbz" event={"ID":"37dac1c8-963f-466f-977e-37b2fd98d32c","Type":"ContainerDied","Data":"e21cf6248decf29aa992c186881d057311c63ce4f642d28d2d190eaeaf041479"} Mar 10 22:15:43 crc kubenswrapper[4919]: E0310 22:15:43.162478 4919 configmap.go:193] Couldn't get configMap openstack/ovnnorthd-scripts: configmap "ovnnorthd-scripts" not found Mar 10 22:15:43 crc kubenswrapper[4919]: E0310 22:15:43.162740 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f4c91bdf-1945-4c8a-ab05-3b619e4bdb2b-scripts podName:f4c91bdf-1945-4c8a-ab05-3b619e4bdb2b nodeName:}" failed. No retries permitted until 2026-03-10 22:15:45.162723772 +0000 UTC m=+1532.404604380 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "scripts" (UniqueName: "kubernetes.io/configmap/f4c91bdf-1945-4c8a-ab05-3b619e4bdb2b-scripts") pod "ovn-northd-0" (UID: "f4c91bdf-1945-4c8a-ab05-3b619e4bdb2b") : configmap "ovnnorthd-scripts" not found Mar 10 22:15:43 crc kubenswrapper[4919]: E0310 22:15:43.162876 4919 configmap.go:193] Couldn't get configMap openstack/ovnnorthd-config: configmap "ovnnorthd-config" not found Mar 10 22:15:43 crc kubenswrapper[4919]: E0310 22:15:43.162959 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f4c91bdf-1945-4c8a-ab05-3b619e4bdb2b-config podName:f4c91bdf-1945-4c8a-ab05-3b619e4bdb2b nodeName:}" failed. 
No retries permitted until 2026-03-10 22:15:45.162950528 +0000 UTC m=+1532.404831136 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/f4c91bdf-1945-4c8a-ab05-3b619e4bdb2b-config") pod "ovn-northd-0" (UID: "f4c91bdf-1945-4c8a-ab05-3b619e4bdb2b") : configmap "ovnnorthd-config" not found Mar 10 22:15:43 crc kubenswrapper[4919]: I0310 22:15:43.172712 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-854d8d6bf4-kknjq"] Mar 10 22:15:43 crc kubenswrapper[4919]: I0310 22:15:43.172949 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-854d8d6bf4-kknjq" podUID="981bb03c-23be-4bf8-a9f6-cb8a552f66a5" containerName="placement-log" containerID="cri-o://e4d70f78ccff4f0649cd6b2f0b66c626faab3e22bb0695b32e2d8f790b8b831d" gracePeriod=30 Mar 10 22:15:43 crc kubenswrapper[4919]: I0310 22:15:43.173038 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-854d8d6bf4-kknjq" podUID="981bb03c-23be-4bf8-a9f6-cb8a552f66a5" containerName="placement-api" containerID="cri-o://d88c0958bf40600808f9977f231a5fa1e34419ab9909560a692746f53c31f4f0" gracePeriod=30 Mar 10 22:15:43 crc kubenswrapper[4919]: I0310 22:15:43.173469 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_8782e985-ff23-4580-bdfb-ef2dd9b540bc/ovsdbserver-sb/0.log" Mar 10 22:15:43 crc kubenswrapper[4919]: I0310 22:15:43.173503 4919 generic.go:334] "Generic (PLEG): container finished" podID="8782e985-ff23-4580-bdfb-ef2dd9b540bc" containerID="fa482cd6ab218644c93efa05d2493a56acc39a947142b9aa93cab897fbd8faee" exitCode=2 Mar 10 22:15:43 crc kubenswrapper[4919]: I0310 22:15:43.173515 4919 generic.go:334] "Generic (PLEG): container finished" podID="8782e985-ff23-4580-bdfb-ef2dd9b540bc" containerID="27c978f4a20203cc94acb177ddbfd66f76de9f4ddcde70ca6c04fb027e8101f9" exitCode=143 Mar 10 22:15:43 crc 
kubenswrapper[4919]: I0310 22:15:43.173573 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"8782e985-ff23-4580-bdfb-ef2dd9b540bc","Type":"ContainerDied","Data":"fa482cd6ab218644c93efa05d2493a56acc39a947142b9aa93cab897fbd8faee"} Mar 10 22:15:43 crc kubenswrapper[4919]: I0310 22:15:43.173591 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"8782e985-ff23-4580-bdfb-ef2dd9b540bc","Type":"ContainerDied","Data":"27c978f4a20203cc94acb177ddbfd66f76de9f4ddcde70ca6c04fb027e8101f9"} Mar 10 22:15:43 crc kubenswrapper[4919]: I0310 22:15:43.185227 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-jm7n6"] Mar 10 22:15:43 crc kubenswrapper[4919]: I0310 22:15:43.216249 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-jm7n6"] Mar 10 22:15:43 crc kubenswrapper[4919]: I0310 22:15:43.226928 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_9e69198f-c4ad-40c4-b0f4-1a6e9dd17940/ovsdbserver-nb/0.log" Mar 10 22:15:43 crc kubenswrapper[4919]: I0310 22:15:43.227150 4919 generic.go:334] "Generic (PLEG): container finished" podID="9e69198f-c4ad-40c4-b0f4-1a6e9dd17940" containerID="7f67034bb3fc927d4123b73a239402611e74fab959e166ee008ced152b098c44" exitCode=143 Mar 10 22:15:43 crc kubenswrapper[4919]: I0310 22:15:43.227226 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"9e69198f-c4ad-40c4-b0f4-1a6e9dd17940","Type":"ContainerDied","Data":"7f67034bb3fc927d4123b73a239402611e74fab959e166ee008ced152b098c44"} Mar 10 22:15:43 crc kubenswrapper[4919]: I0310 22:15:43.253125 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-x2nwk"] Mar 10 22:15:43 crc kubenswrapper[4919]: E0310 22:15:43.263748 4919 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap 
"rabbitmq-cell1-config-data" not found Mar 10 22:15:43 crc kubenswrapper[4919]: E0310 22:15:43.263948 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3fe05756-9202-4514-8eea-0c786a2b6d56-config-data podName:3fe05756-9202-4514-8eea-0c786a2b6d56 nodeName:}" failed. No retries permitted until 2026-03-10 22:15:44.263927954 +0000 UTC m=+1531.505808562 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/3fe05756-9202-4514-8eea-0c786a2b6d56-config-data") pod "rabbitmq-cell1-server-0" (UID: "3fe05756-9202-4514-8eea-0c786a2b6d56") : configmap "rabbitmq-cell1-config-data" not found Mar 10 22:15:43 crc kubenswrapper[4919]: I0310 22:15:43.271576 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-x2nwk"] Mar 10 22:15:43 crc kubenswrapper[4919]: I0310 22:15:43.281663 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-storage-0"] Mar 10 22:15:43 crc kubenswrapper[4919]: I0310 22:15:43.282268 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="91c8bbf6-8824-4e21-a491-86f2f657549a" containerName="account-server" containerID="cri-o://d87f46352498df2659c2cb9f7933207812d2b038ce489fc1d9131bb625190926" gracePeriod=30 Mar 10 22:15:43 crc kubenswrapper[4919]: I0310 22:15:43.282771 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="91c8bbf6-8824-4e21-a491-86f2f657549a" containerName="swift-recon-cron" containerID="cri-o://7d6f077cbbd4ed720f528f14962aa759a22f7956036b9e97a87b6414a73da0ba" gracePeriod=30 Mar 10 22:15:43 crc kubenswrapper[4919]: I0310 22:15:43.282828 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="91c8bbf6-8824-4e21-a491-86f2f657549a" containerName="rsync" 
containerID="cri-o://d4f089226b859cf9e472a23e4abfff98df12043dbab19a145b4c4fdfb8923fe6" gracePeriod=30 Mar 10 22:15:43 crc kubenswrapper[4919]: I0310 22:15:43.282872 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="91c8bbf6-8824-4e21-a491-86f2f657549a" containerName="object-expirer" containerID="cri-o://b9b5e9d2ec6d050219cb0112cd09d62a15be687c7cb44e610e55e8bc795ce60f" gracePeriod=30 Mar 10 22:15:43 crc kubenswrapper[4919]: I0310 22:15:43.282917 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="91c8bbf6-8824-4e21-a491-86f2f657549a" containerName="object-updater" containerID="cri-o://ae368876337c9d3b40fae442133c17eb2b857ea33f295ec663500c6b4ebd5fb3" gracePeriod=30 Mar 10 22:15:43 crc kubenswrapper[4919]: I0310 22:15:43.282953 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="91c8bbf6-8824-4e21-a491-86f2f657549a" containerName="object-auditor" containerID="cri-o://48c8ed23f216106829b3f2774da116741d89515f11a103cf91248086130b0d18" gracePeriod=30 Mar 10 22:15:43 crc kubenswrapper[4919]: I0310 22:15:43.282994 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="91c8bbf6-8824-4e21-a491-86f2f657549a" containerName="object-replicator" containerID="cri-o://fe3a820aaaecc3dbdc2c00e963ea20455282c2cd0ff782ad5b053eaa18c2a728" gracePeriod=30 Mar 10 22:15:43 crc kubenswrapper[4919]: I0310 22:15:43.283052 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="91c8bbf6-8824-4e21-a491-86f2f657549a" containerName="account-auditor" containerID="cri-o://f84605aa41e553170463de5ffc1ffb79d9063aeb307fca1c7396bcab45897e17" gracePeriod=30 Mar 10 22:15:43 crc kubenswrapper[4919]: I0310 22:15:43.283064 4919 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/swift-storage-0" podUID="91c8bbf6-8824-4e21-a491-86f2f657549a" containerName="container-server" containerID="cri-o://75261d329b8223b1135cf1458a80f97a9d45e26831f1353325eb35731b37f5d2" gracePeriod=30 Mar 10 22:15:43 crc kubenswrapper[4919]: I0310 22:15:43.283092 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="91c8bbf6-8824-4e21-a491-86f2f657549a" containerName="account-replicator" containerID="cri-o://2c39ba342378f6fa53b3c9079da22731dd39cfe3e751a36987c68151f2ee3a38" gracePeriod=30 Mar 10 22:15:43 crc kubenswrapper[4919]: I0310 22:15:43.283232 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="91c8bbf6-8824-4e21-a491-86f2f657549a" containerName="container-auditor" containerID="cri-o://5f25a2d98dbdc44c683281bef776d43cbc4121e4fa9cb5254edd868686128030" gracePeriod=30 Mar 10 22:15:43 crc kubenswrapper[4919]: I0310 22:15:43.283280 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="91c8bbf6-8824-4e21-a491-86f2f657549a" containerName="object-server" containerID="cri-o://1ab1280d2d068201d8bde5433767f956a9a5d9d033ba36ade9a16c74b928af27" gracePeriod=30 Mar 10 22:15:43 crc kubenswrapper[4919]: I0310 22:15:43.283320 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="91c8bbf6-8824-4e21-a491-86f2f657549a" containerName="container-updater" containerID="cri-o://ea01dce7b06601c355d1e0c1f5bc23af1e7381afe0f09aa8a98efd1dceeab0e9" gracePeriod=30 Mar 10 22:15:43 crc kubenswrapper[4919]: I0310 22:15:43.283612 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="91c8bbf6-8824-4e21-a491-86f2f657549a" containerName="container-replicator" containerID="cri-o://8e08fa3055d8aa50b60d141bc84fe0f51f25e9fc45728c0dd1491d1bc7a66860" gracePeriod=30 Mar 10 22:15:43 crc 
kubenswrapper[4919]: I0310 22:15:43.283714 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="91c8bbf6-8824-4e21-a491-86f2f657549a" containerName="account-reaper" containerID="cri-o://8406c65e24e9f21076831c641cab8e21b69c623c1d92a90d609d4a3c30670852" gracePeriod=30 Mar 10 22:15:43 crc kubenswrapper[4919]: I0310 22:15:43.328781 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-j6dcn"] Mar 10 22:15:43 crc kubenswrapper[4919]: I0310 22:15:43.351547 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-j6dcn"] Mar 10 22:15:43 crc kubenswrapper[4919]: I0310 22:15:43.359079 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-tkhg8"] Mar 10 22:15:43 crc kubenswrapper[4919]: I0310 22:15:43.368467 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 10 22:15:43 crc kubenswrapper[4919]: I0310 22:15:43.368735 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="b1f5a3b8-c9ca-403a-aecf-f6fbf286b145" containerName="cinder-scheduler" containerID="cri-o://8788e8f4a8fdff775edb373c25584b90721c4e93529ebfa7f6ee7f0858b36923" gracePeriod=30 Mar 10 22:15:43 crc kubenswrapper[4919]: I0310 22:15:43.369210 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="b1f5a3b8-c9ca-403a-aecf-f6fbf286b145" containerName="probe" containerID="cri-o://3b0a6033d190b7300fe815ac8e78922471798cec143251073885cde5a79cf846" gracePeriod=30 Mar 10 22:15:43 crc kubenswrapper[4919]: I0310 22:15:43.383804 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-tkhg8"] Mar 10 22:15:43 crc kubenswrapper[4919]: I0310 22:15:43.405560 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 10 22:15:43 crc 
kubenswrapper[4919]: I0310 22:15:43.405892 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="4865c8ed-670d-41a0-b9fc-ba7697085e6b" containerName="cinder-api-log" containerID="cri-o://160abe754665fdfbb180f5bc071ae95530a567d07d30547bc0bedcf6ffce0c0d" gracePeriod=30 Mar 10 22:15:43 crc kubenswrapper[4919]: I0310 22:15:43.406348 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="4865c8ed-670d-41a0-b9fc-ba7697085e6b" containerName="cinder-api" containerID="cri-o://df8f79c23e11b14d9212f9cd7c7b374f297dc4c6a1b8f62a2988cc7af5ea3b27" gracePeriod=30 Mar 10 22:15:43 crc kubenswrapper[4919]: I0310 22:15:43.412851 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-jdgb7"] Mar 10 22:15:43 crc kubenswrapper[4919]: I0310 22:15:43.415819 4919 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="4865c8ed-670d-41a0-b9fc-ba7697085e6b" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.0.173:8776/healthcheck\": EOF" Mar 10 22:15:43 crc kubenswrapper[4919]: I0310 22:15:43.424782 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-jdgb7"] Mar 10 22:15:43 crc kubenswrapper[4919]: E0310 22:15:43.436176 4919 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 27c978f4a20203cc94acb177ddbfd66f76de9f4ddcde70ca6c04fb027e8101f9 is running failed: container process not found" containerID="27c978f4a20203cc94acb177ddbfd66f76de9f4ddcde70ca6c04fb027e8101f9" cmd=["/usr/bin/pidof","ovsdb-server"] Mar 10 22:15:43 crc kubenswrapper[4919]: E0310 22:15:43.440858 4919 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 
27c978f4a20203cc94acb177ddbfd66f76de9f4ddcde70ca6c04fb027e8101f9 is running failed: container process not found" containerID="27c978f4a20203cc94acb177ddbfd66f76de9f4ddcde70ca6c04fb027e8101f9" cmd=["/usr/bin/pidof","ovsdb-server"] Mar 10 22:15:43 crc kubenswrapper[4919]: E0310 22:15:43.464186 4919 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 27c978f4a20203cc94acb177ddbfd66f76de9f4ddcde70ca6c04fb027e8101f9 is running failed: container process not found" containerID="27c978f4a20203cc94acb177ddbfd66f76de9f4ddcde70ca6c04fb027e8101f9" cmd=["/usr/bin/pidof","ovsdb-server"] Mar 10 22:15:43 crc kubenswrapper[4919]: E0310 22:15:43.464239 4919 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 27c978f4a20203cc94acb177ddbfd66f76de9f4ddcde70ca6c04fb027e8101f9 is running failed: container process not found" probeType="Readiness" pod="openstack/ovsdbserver-sb-0" podUID="8782e985-ff23-4580-bdfb-ef2dd9b540bc" containerName="ovsdbserver-sb" Mar 10 22:15:43 crc kubenswrapper[4919]: I0310 22:15:43.666686 4919 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." 
pod="openstack/neutron-846dbc6cd5-kg4kx" secret="" err="secret \"neutron-neutron-dockercfg-phplg\" not found" Mar 10 22:15:43 crc kubenswrapper[4919]: I0310 22:15:43.719480 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0067a7fe-f5db-4832-a519-848ac8b771c0" path="/var/lib/kubelet/pods/0067a7fe-f5db-4832-a519-848ac8b771c0/volumes" Mar 10 22:15:43 crc kubenswrapper[4919]: I0310 22:15:43.725282 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="02d206af-330a-4526-8a3e-7826a1acb153" path="/var/lib/kubelet/pods/02d206af-330a-4526-8a3e-7826a1acb153/volumes" Mar 10 22:15:43 crc kubenswrapper[4919]: I0310 22:15:43.726676 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0376622a-15ed-42d8-98b9-ffa1138134ee" path="/var/lib/kubelet/pods/0376622a-15ed-42d8-98b9-ffa1138134ee/volumes" Mar 10 22:15:43 crc kubenswrapper[4919]: I0310 22:15:43.727332 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="14a85d0b-0c54-4da3-8646-b3e0ca6fe5d5" path="/var/lib/kubelet/pods/14a85d0b-0c54-4da3-8646-b3e0ca6fe5d5/volumes" Mar 10 22:15:43 crc kubenswrapper[4919]: I0310 22:15:43.739438 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15160303-4913-49a5-8cd3-e8255ba657f6" path="/var/lib/kubelet/pods/15160303-4913-49a5-8cd3-e8255ba657f6/volumes" Mar 10 22:15:43 crc kubenswrapper[4919]: I0310 22:15:43.754565 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31e6114e-103b-4653-b4f0-ba3c216e3437" path="/var/lib/kubelet/pods/31e6114e-103b-4653-b4f0-ba3c216e3437/volumes" Mar 10 22:15:43 crc kubenswrapper[4919]: I0310 22:15:43.755176 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a9a0463-5f7a-4164-9fd7-a7bce608bf41" path="/var/lib/kubelet/pods/3a9a0463-5f7a-4164-9fd7-a7bce608bf41/volumes" Mar 10 22:15:43 crc kubenswrapper[4919]: I0310 22:15:43.755673 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="48251540-9da9-4f40-b01f-27188fe69056" path="/var/lib/kubelet/pods/48251540-9da9-4f40-b01f-27188fe69056/volumes" Mar 10 22:15:43 crc kubenswrapper[4919]: I0310 22:15:43.769659 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5384b251-66f7-451a-ab29-0b88b8207838" path="/var/lib/kubelet/pods/5384b251-66f7-451a-ab29-0b88b8207838/volumes" Mar 10 22:15:43 crc kubenswrapper[4919]: I0310 22:15:43.788285 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53c9d4c1-5253-49d5-8ade-272d01956b72" path="/var/lib/kubelet/pods/53c9d4c1-5253-49d5-8ade-272d01956b72/volumes" Mar 10 22:15:43 crc kubenswrapper[4919]: I0310 22:15:43.789244 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6da154ef-0415-4972-8282-cf5161c8fa71" path="/var/lib/kubelet/pods/6da154ef-0415-4972-8282-cf5161c8fa71/volumes" Mar 10 22:15:43 crc kubenswrapper[4919]: E0310 22:15:43.799894 4919 secret.go:188] Couldn't get secret openstack/neutron-config: secret "neutron-config" not found Mar 10 22:15:43 crc kubenswrapper[4919]: E0310 22:15:43.799976 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0a44bcbb-6e2e-48bb-b7a7-16a4e916001d-config podName:0a44bcbb-6e2e-48bb-b7a7-16a4e916001d nodeName:}" failed. No retries permitted until 2026-03-10 22:15:44.299957037 +0000 UTC m=+1531.541837645 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/secret/0a44bcbb-6e2e-48bb-b7a7-16a4e916001d-config") pod "neutron-846dbc6cd5-kg4kx" (UID: "0a44bcbb-6e2e-48bb-b7a7-16a4e916001d") : secret "neutron-config" not found Mar 10 22:15:43 crc kubenswrapper[4919]: E0310 22:15:43.800798 4919 secret.go:188] Couldn't get secret openstack/neutron-httpd-config: secret "neutron-httpd-config" not found Mar 10 22:15:43 crc kubenswrapper[4919]: E0310 22:15:43.800874 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0a44bcbb-6e2e-48bb-b7a7-16a4e916001d-httpd-config podName:0a44bcbb-6e2e-48bb-b7a7-16a4e916001d nodeName:}" failed. No retries permitted until 2026-03-10 22:15:44.30083114 +0000 UTC m=+1531.542711748 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "httpd-config" (UniqueName: "kubernetes.io/secret/0a44bcbb-6e2e-48bb-b7a7-16a4e916001d-httpd-config") pod "neutron-846dbc6cd5-kg4kx" (UID: "0a44bcbb-6e2e-48bb-b7a7-16a4e916001d") : secret "neutron-httpd-config" not found Mar 10 22:15:43 crc kubenswrapper[4919]: I0310 22:15:43.806725 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7fdec3e1-893d-44ec-bd70-90c66e304ba7" path="/var/lib/kubelet/pods/7fdec3e1-893d-44ec-bd70-90c66e304ba7/volumes" Mar 10 22:15:43 crc kubenswrapper[4919]: I0310 22:15:43.810442 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8662d67d-6dbb-4156-8a34-a13e650bb745" path="/var/lib/kubelet/pods/8662d67d-6dbb-4156-8a34-a13e650bb745/volumes" Mar 10 22:15:43 crc kubenswrapper[4919]: I0310 22:15:43.822855 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f16822b-b7de-48ca-8d05-938c50f0837d" path="/var/lib/kubelet/pods/8f16822b-b7de-48ca-8d05-938c50f0837d/volumes" Mar 10 22:15:43 crc kubenswrapper[4919]: I0310 22:15:43.841978 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4d007fb-1bda-48dd-ad03-6601dc770a2a" 
path="/var/lib/kubelet/pods/a4d007fb-1bda-48dd-ad03-6601dc770a2a/volumes" Mar 10 22:15:43 crc kubenswrapper[4919]: I0310 22:15:43.842565 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dbaea2e3-ed2a-41ca-96d1-9a837b1b2b15" path="/var/lib/kubelet/pods/dbaea2e3-ed2a-41ca-96d1-9a837b1b2b15/volumes" Mar 10 22:15:43 crc kubenswrapper[4919]: I0310 22:15:43.843073 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-846dbc6cd5-kg4kx"] Mar 10 22:15:43 crc kubenswrapper[4919]: I0310 22:15:43.843100 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-7whbn"] Mar 10 22:15:43 crc kubenswrapper[4919]: I0310 22:15:43.843111 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-7whbn"] Mar 10 22:15:43 crc kubenswrapper[4919]: I0310 22:15:43.843124 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-8e40-account-create-update-kkddk"] Mar 10 22:15:43 crc kubenswrapper[4919]: I0310 22:15:43.843137 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-zcrjg"] Mar 10 22:15:43 crc kubenswrapper[4919]: I0310 22:15:43.843145 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-zcrjg"] Mar 10 22:15:43 crc kubenswrapper[4919]: I0310 22:15:43.843155 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-4bjxd"] Mar 10 22:15:43 crc kubenswrapper[4919]: I0310 22:15:43.843163 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-9e8c-account-create-update-6q4j2"] Mar 10 22:15:43 crc kubenswrapper[4919]: I0310 22:15:43.843174 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-45c6-account-create-update-zczhm"] Mar 10 22:15:43 crc kubenswrapper[4919]: I0310 22:15:43.843183 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-4bjxd"] Mar 10 22:15:43 crc 
kubenswrapper[4919]: I0310 22:15:43.843192 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 10 22:15:43 crc kubenswrapper[4919]: I0310 22:15:43.843201 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 10 22:15:43 crc kubenswrapper[4919]: I0310 22:15:43.843210 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 10 22:15:43 crc kubenswrapper[4919]: I0310 22:15:43.843776 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="81489e39-0246-4065-8835-31b1e5da8431" containerName="nova-metadata-log" containerID="cri-o://77810b20846ff06e6e2286529394046c226fcdd21d8e8615e097fa35b3579513" gracePeriod=30 Mar 10 22:15:43 crc kubenswrapper[4919]: I0310 22:15:43.843899 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="81489e39-0246-4065-8835-31b1e5da8431" containerName="nova-metadata-metadata" containerID="cri-o://0d25c07eec1b4976670c75603fd5da5476a97bed8734c71857c7dda9c1fa75bb" gracePeriod=30 Mar 10 22:15:43 crc kubenswrapper[4919]: I0310 22:15:43.844001 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="91a933f1-aa44-4375-8f5c-e5f3567e6c8e" containerName="glance-log" containerID="cri-o://ceb6023d0f542d943ccfa4398a55aeeb75cf652b0f4a2b2be0237840184075d5" gracePeriod=30 Mar 10 22:15:43 crc kubenswrapper[4919]: I0310 22:15:43.844059 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="91a933f1-aa44-4375-8f5c-e5f3567e6c8e" containerName="glance-httpd" containerID="cri-o://79e87bdd987eb81ea9f7ad47745afc59d0dd4ce7a69aa1af13ca054411b4739c" gracePeriod=30 Mar 10 22:15:43 crc kubenswrapper[4919]: I0310 22:15:43.848802 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/glance-default-internal-api-0"] Mar 10 22:15:43 crc kubenswrapper[4919]: I0310 22:15:43.849067 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="ab479995-b87a-46b8-9a4e-d9e95d556775" containerName="glance-log" containerID="cri-o://0fa75078e11e8939f0a39526b2508ccb8e7f4c3ea23641588f1b3b509c8c8e82" gracePeriod=30 Mar 10 22:15:43 crc kubenswrapper[4919]: I0310 22:15:43.849194 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="ab479995-b87a-46b8-9a4e-d9e95d556775" containerName="glance-httpd" containerID="cri-o://22265554a653026f7008b3a597b22efe9ebe95b2013255f792f08efd3682fc62" gracePeriod=30 Mar 10 22:15:43 crc kubenswrapper[4919]: I0310 22:15:43.895633 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 10 22:15:43 crc kubenswrapper[4919]: I0310 22:15:43.910521 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-fd8f54c58-gtj5m"] Mar 10 22:15:43 crc kubenswrapper[4919]: I0310 22:15:43.910841 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-fd8f54c58-gtj5m" podUID="31690f34-6b68-4470-a13e-e16121ec25d2" containerName="barbican-keystone-listener-log" containerID="cri-o://94166ce461bf0d2113bc5e17cabdb4512063e9d512e978c63a5719892a6a8251" gracePeriod=30 Mar 10 22:15:43 crc kubenswrapper[4919]: I0310 22:15:43.910979 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-fd8f54c58-gtj5m" podUID="31690f34-6b68-4470-a13e-e16121ec25d2" containerName="barbican-keystone-listener" containerID="cri-o://800bb678ea7e53aea235c1816dd3b24e1d5cc3ca3910d7d45b290926f9b56fcd" gracePeriod=30 Mar 10 22:15:43 crc kubenswrapper[4919]: I0310 22:15:43.925989 4919 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/barbican-worker-68449cb44c-wmmzf"] Mar 10 22:15:43 crc kubenswrapper[4919]: I0310 22:15:43.926202 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-68449cb44c-wmmzf" podUID="7c9bf9de-7c1a-436e-a8a0-6c987b34a5e0" containerName="barbican-worker-log" containerID="cri-o://3a14995c1656c2836f0b87135f426135188e582e6956faccde8a4adfc21e5fcd" gracePeriod=30 Mar 10 22:15:43 crc kubenswrapper[4919]: I0310 22:15:43.926355 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-68449cb44c-wmmzf" podUID="7c9bf9de-7c1a-436e-a8a0-6c987b34a5e0" containerName="barbican-worker" containerID="cri-o://46d7c27293d48d5ebbae63a3af00729e46f409eb8eff14da8312c3010d4b3df5" gracePeriod=30 Mar 10 22:15:43 crc kubenswrapper[4919]: I0310 22:15:43.941073 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-cdcf-account-create-update-qh8rx"] Mar 10 22:15:43 crc kubenswrapper[4919]: I0310 22:15:43.954188 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-ea40-account-create-update-7bp9g"] Mar 10 22:15:43 crc kubenswrapper[4919]: I0310 22:15:43.967534 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 10 22:15:43 crc kubenswrapper[4919]: I0310 22:15:43.967812 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="515105ef-e538-4276-b682-7e05881dc7e8" containerName="nova-api-log" containerID="cri-o://9f98722ad0d1eaba724d3a1905d648a1f6aa5618c0b6cbe31210b8bf1f36f489" gracePeriod=30 Mar 10 22:15:43 crc kubenswrapper[4919]: I0310 22:15:43.968280 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="515105ef-e538-4276-b682-7e05881dc7e8" containerName="nova-api-api" containerID="cri-o://6998100dd91ae8a0c4934a4b8c43b07c72cd35d8be7e9f5c1635f9179079c2ed" gracePeriod=30 Mar 10 
22:15:43 crc kubenswrapper[4919]: I0310 22:15:43.996221 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="fa3e6892-7a97-4563-b339-6c3acfd36dd3" containerName="rabbitmq" containerID="cri-o://8e9a7cee8d15c0ec29a2604cb6af26be2d7540dda5209902519b1a0222c5362d" gracePeriod=604800 Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.040456 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-8585d"] Mar 10 22:15:44 crc kubenswrapper[4919]: E0310 22:15:44.051869 4919 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 7f67034bb3fc927d4123b73a239402611e74fab959e166ee008ced152b098c44 is running failed: container process not found" containerID="7f67034bb3fc927d4123b73a239402611e74fab959e166ee008ced152b098c44" cmd=["/usr/bin/pidof","ovsdb-server"] Mar 10 22:15:44 crc kubenswrapper[4919]: E0310 22:15:44.061770 4919 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 7f67034bb3fc927d4123b73a239402611e74fab959e166ee008ced152b098c44 is running failed: container process not found" containerID="7f67034bb3fc927d4123b73a239402611e74fab959e166ee008ced152b098c44" cmd=["/usr/bin/pidof","ovsdb-server"] Mar 10 22:15:44 crc kubenswrapper[4919]: E0310 22:15:44.062817 4919 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 7f67034bb3fc927d4123b73a239402611e74fab959e166ee008ced152b098c44 is running failed: container process not found" containerID="7f67034bb3fc927d4123b73a239402611e74fab959e166ee008ced152b098c44" cmd=["/usr/bin/pidof","ovsdb-server"] Mar 10 22:15:44 crc kubenswrapper[4919]: E0310 22:15:44.062843 4919 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not 
created or running: checking if PID of 7f67034bb3fc927d4123b73a239402611e74fab959e166ee008ced152b098c44 is running failed: container process not found" probeType="Readiness" pod="openstack/ovsdbserver-nb-0" podUID="9e69198f-c4ad-40c4-b0f4-1a6e9dd17940" containerName="ovsdbserver-nb" Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.065576 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-8585d"] Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.077531 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-pp7db"] Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.084411 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-pp7db"] Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.096442 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-dff6-account-create-update-8l4wr"] Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.099428 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-75f54b97c6-fj5s7"] Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.099654 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-75f54b97c6-fj5s7" podUID="28d81dfb-640f-4748-ab70-e0b393e1e595" containerName="barbican-api-log" containerID="cri-o://4f419ebd70f99390116647c66037cbdcde78f060d7d9ba4c1e4bafbc7b53452c" gracePeriod=30 Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.100065 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-75f54b97c6-fj5s7" podUID="28d81dfb-640f-4748-ab70-e0b393e1e595" containerName="barbican-api" containerID="cri-o://fca95d527891107e2d3047ae093c871e26b3b61e0a45e89186497f024bbb6624" gracePeriod=30 Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.111455 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-nrfll"] 
Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.120069 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.120282 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="aff6348f-a0cf-4b67-a072-edcde9dcb3c4" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://d56c4da656d1a27e68d9c323923f33760324c2a4c5d3e0641f9e0a743d95e2fc" gracePeriod=30 Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.130297 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-nrfll"] Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.137479 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.145544 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-8e40-account-create-update-kkddk"] Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.158252 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-xlzz4"] Mar 10 22:15:44 crc kubenswrapper[4919]: E0310 22:15:44.203691 4919 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err=< Mar 10 22:15:44 crc kubenswrapper[4919]: command '/usr/local/bin/container-scripts/stop-ovsdb-server.sh' exited with 137: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Mar 10 22:15:44 crc kubenswrapper[4919]: + source /usr/local/bin/container-scripts/functions Mar 10 22:15:44 crc kubenswrapper[4919]: ++ OVNBridge=br-int Mar 10 22:15:44 crc kubenswrapper[4919]: ++ OVNRemote=tcp:localhost:6642 Mar 10 22:15:44 crc kubenswrapper[4919]: ++ OVNEncapType=geneve Mar 10 22:15:44 crc kubenswrapper[4919]: ++ OVNAvailabilityZones= Mar 10 22:15:44 crc kubenswrapper[4919]: ++ EnableChassisAsGateway=true Mar 10 22:15:44 crc kubenswrapper[4919]: ++ 
PhysicalNetworks= Mar 10 22:15:44 crc kubenswrapper[4919]: ++ OVNHostName= Mar 10 22:15:44 crc kubenswrapper[4919]: ++ DB_FILE=/etc/openvswitch/conf.db Mar 10 22:15:44 crc kubenswrapper[4919]: ++ ovs_dir=/var/lib/openvswitch Mar 10 22:15:44 crc kubenswrapper[4919]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Mar 10 22:15:44 crc kubenswrapper[4919]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Mar 10 22:15:44 crc kubenswrapper[4919]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Mar 10 22:15:44 crc kubenswrapper[4919]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Mar 10 22:15:44 crc kubenswrapper[4919]: + sleep 0.5 Mar 10 22:15:44 crc kubenswrapper[4919]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Mar 10 22:15:44 crc kubenswrapper[4919]: + sleep 0.5 Mar 10 22:15:44 crc kubenswrapper[4919]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Mar 10 22:15:44 crc kubenswrapper[4919]: + cleanup_ovsdb_server_semaphore Mar 10 22:15:44 crc kubenswrapper[4919]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Mar 10 22:15:44 crc kubenswrapper[4919]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Mar 10 22:15:44 crc kubenswrapper[4919]: > execCommand=["/usr/local/bin/container-scripts/stop-ovsdb-server.sh"] containerName="ovsdb-server" pod="openstack/ovn-controller-ovs-5wz82" message=< Mar 10 22:15:44 crc kubenswrapper[4919]: Exiting ovsdb-server (5) [ OK ] Mar 10 22:15:44 crc kubenswrapper[4919]: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Mar 10 22:15:44 crc kubenswrapper[4919]: + source /usr/local/bin/container-scripts/functions Mar 10 22:15:44 crc kubenswrapper[4919]: ++ OVNBridge=br-int Mar 10 22:15:44 crc kubenswrapper[4919]: ++ OVNRemote=tcp:localhost:6642 Mar 10 22:15:44 crc kubenswrapper[4919]: ++ OVNEncapType=geneve Mar 10 22:15:44 crc kubenswrapper[4919]: ++ OVNAvailabilityZones= Mar 10 
22:15:44 crc kubenswrapper[4919]: ++ EnableChassisAsGateway=true Mar 10 22:15:44 crc kubenswrapper[4919]: ++ PhysicalNetworks= Mar 10 22:15:44 crc kubenswrapper[4919]: ++ OVNHostName= Mar 10 22:15:44 crc kubenswrapper[4919]: ++ DB_FILE=/etc/openvswitch/conf.db Mar 10 22:15:44 crc kubenswrapper[4919]: ++ ovs_dir=/var/lib/openvswitch Mar 10 22:15:44 crc kubenswrapper[4919]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Mar 10 22:15:44 crc kubenswrapper[4919]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Mar 10 22:15:44 crc kubenswrapper[4919]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Mar 10 22:15:44 crc kubenswrapper[4919]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Mar 10 22:15:44 crc kubenswrapper[4919]: + sleep 0.5 Mar 10 22:15:44 crc kubenswrapper[4919]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Mar 10 22:15:44 crc kubenswrapper[4919]: + sleep 0.5 Mar 10 22:15:44 crc kubenswrapper[4919]: + '[' '!' 
-f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Mar 10 22:15:44 crc kubenswrapper[4919]: + cleanup_ovsdb_server_semaphore Mar 10 22:15:44 crc kubenswrapper[4919]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Mar 10 22:15:44 crc kubenswrapper[4919]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Mar 10 22:15:44 crc kubenswrapper[4919]: > Mar 10 22:15:44 crc kubenswrapper[4919]: E0310 22:15:44.204021 4919 kuberuntime_container.go:691] "PreStop hook failed" err=< Mar 10 22:15:44 crc kubenswrapper[4919]: command '/usr/local/bin/container-scripts/stop-ovsdb-server.sh' exited with 137: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Mar 10 22:15:44 crc kubenswrapper[4919]: + source /usr/local/bin/container-scripts/functions Mar 10 22:15:44 crc kubenswrapper[4919]: ++ OVNBridge=br-int Mar 10 22:15:44 crc kubenswrapper[4919]: ++ OVNRemote=tcp:localhost:6642 Mar 10 22:15:44 crc kubenswrapper[4919]: ++ OVNEncapType=geneve Mar 10 22:15:44 crc kubenswrapper[4919]: ++ OVNAvailabilityZones= Mar 10 22:15:44 crc kubenswrapper[4919]: ++ EnableChassisAsGateway=true Mar 10 22:15:44 crc kubenswrapper[4919]: ++ PhysicalNetworks= Mar 10 22:15:44 crc kubenswrapper[4919]: ++ OVNHostName= Mar 10 22:15:44 crc kubenswrapper[4919]: ++ DB_FILE=/etc/openvswitch/conf.db Mar 10 22:15:44 crc kubenswrapper[4919]: ++ ovs_dir=/var/lib/openvswitch Mar 10 22:15:44 crc kubenswrapper[4919]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Mar 10 22:15:44 crc kubenswrapper[4919]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Mar 10 22:15:44 crc kubenswrapper[4919]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Mar 10 22:15:44 crc kubenswrapper[4919]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Mar 10 22:15:44 crc kubenswrapper[4919]: + sleep 0.5 Mar 10 22:15:44 crc kubenswrapper[4919]: + '[' '!' 
-f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Mar 10 22:15:44 crc kubenswrapper[4919]: + sleep 0.5 Mar 10 22:15:44 crc kubenswrapper[4919]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Mar 10 22:15:44 crc kubenswrapper[4919]: + cleanup_ovsdb_server_semaphore Mar 10 22:15:44 crc kubenswrapper[4919]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Mar 10 22:15:44 crc kubenswrapper[4919]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Mar 10 22:15:44 crc kubenswrapper[4919]: > pod="openstack/ovn-controller-ovs-5wz82" podUID="a525725f-407a-4e99-96a1-a0eaba714487" containerName="ovsdb-server" containerID="cri-o://85e85aa8a7e78a2f2c6fc7044ebf9c1dcb554abd0c952c366102b9a1f2fa0880" Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.204060 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-ovs-5wz82" podUID="a525725f-407a-4e99-96a1-a0eaba714487" containerName="ovsdb-server" containerID="cri-o://85e85aa8a7e78a2f2c6fc7044ebf9c1dcb554abd0c952c366102b9a1f2fa0880" gracePeriod=29 Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.230143 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.231029 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="9700fb27-6a74-428d-a2e6-71c237b3e054" containerName="nova-scheduler-scheduler" containerID="cri-o://8bc20ce51b10a668d26fcfd7ec96ed2a288dcdabfde25fcc336eb1a622b6f4e5" gracePeriod=30 Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.254484 4919 generic.go:334] "Generic (PLEG): container finished" podID="7c9bf9de-7c1a-436e-a8a0-6c987b34a5e0" containerID="3a14995c1656c2836f0b87135f426135188e582e6956faccde8a4adfc21e5fcd" exitCode=143 Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.254566 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/barbican-worker-68449cb44c-wmmzf" event={"ID":"7c9bf9de-7c1a-436e-a8a0-6c987b34a5e0","Type":"ContainerDied","Data":"3a14995c1656c2836f0b87135f426135188e582e6956faccde8a4adfc21e5fcd"} Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.260342 4919 generic.go:334] "Generic (PLEG): container finished" podID="981bb03c-23be-4bf8-a9f6-cb8a552f66a5" containerID="e4d70f78ccff4f0649cd6b2f0b66c626faab3e22bb0695b32e2d8f790b8b831d" exitCode=143 Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.260410 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-854d8d6bf4-kknjq" event={"ID":"981bb03c-23be-4bf8-a9f6-cb8a552f66a5","Type":"ContainerDied","Data":"e4d70f78ccff4f0649cd6b2f0b66c626faab3e22bb0695b32e2d8f790b8b831d"} Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.266810 4919 generic.go:334] "Generic (PLEG): container finished" podID="a525725f-407a-4e99-96a1-a0eaba714487" containerID="85e85aa8a7e78a2f2c6fc7044ebf9c1dcb554abd0c952c366102b9a1f2fa0880" exitCode=0 Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.266895 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-5wz82" event={"ID":"a525725f-407a-4e99-96a1-a0eaba714487","Type":"ContainerDied","Data":"85e85aa8a7e78a2f2c6fc7044ebf9c1dcb554abd0c952c366102b9a1f2fa0880"} Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.273499 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9dd56c4d5-nbpbz" event={"ID":"37dac1c8-963f-466f-977e-37b2fd98d32c","Type":"ContainerDied","Data":"cb55c7a40740c9b346700a78eed74baf6304c24976baed861b49a89477f86eb7"} Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.273533 4919 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cb55c7a40740c9b346700a78eed74baf6304c24976baed861b49a89477f86eb7" Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.275181 4919 generic.go:334] "Generic (PLEG): container finished" 
podID="6bff1404-f9b1-48f8-b093-95c3bb206c6a" containerID="9356369f3992f46c3635f399cef0e5e3d0db351590dc5b463547b642e4ea9e30" exitCode=137 Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.277603 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-ovs-5wz82" podUID="a525725f-407a-4e99-96a1-a0eaba714487" containerName="ovs-vswitchd" containerID="cri-o://e7e7fcd09cc0c969ac9f3c21aebb85f2b23a2c01eb8ef776f788577ffa3c96d5" gracePeriod=29 Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.277647 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_9e69198f-c4ad-40c4-b0f4-1a6e9dd17940/ovsdbserver-nb/0.log" Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.277707 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"9e69198f-c4ad-40c4-b0f4-1a6e9dd17940","Type":"ContainerDied","Data":"3113a147d726107b80aa072cdf5b7d9c13082cb21a706f52b9eed76ba756222d"} Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.277726 4919 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3113a147d726107b80aa072cdf5b7d9c13082cb21a706f52b9eed76ba756222d" Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.301276 4919 generic.go:334] "Generic (PLEG): container finished" podID="28d81dfb-640f-4748-ab70-e0b393e1e595" containerID="4f419ebd70f99390116647c66037cbdcde78f060d7d9ba4c1e4bafbc7b53452c" exitCode=143 Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.301437 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-75f54b97c6-fj5s7" event={"ID":"28d81dfb-640f-4748-ab70-e0b393e1e595","Type":"ContainerDied","Data":"4f419ebd70f99390116647c66037cbdcde78f060d7d9ba4c1e4bafbc7b53452c"} Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.303771 4919 generic.go:334] "Generic (PLEG): container finished" podID="31690f34-6b68-4470-a13e-e16121ec25d2" 
containerID="94166ce461bf0d2113bc5e17cabdb4512063e9d512e978c63a5719892a6a8251" exitCode=143 Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.303822 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-fd8f54c58-gtj5m" event={"ID":"31690f34-6b68-4470-a13e-e16121ec25d2","Type":"ContainerDied","Data":"94166ce461bf0d2113bc5e17cabdb4512063e9d512e978c63a5719892a6a8251"} Mar 10 22:15:44 crc kubenswrapper[4919]: E0310 22:15:44.324943 4919 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Mar 10 22:15:44 crc kubenswrapper[4919]: E0310 22:15:44.325002 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3fe05756-9202-4514-8eea-0c786a2b6d56-config-data podName:3fe05756-9202-4514-8eea-0c786a2b6d56 nodeName:}" failed. No retries permitted until 2026-03-10 22:15:46.324988302 +0000 UTC m=+1533.566868900 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/3fe05756-9202-4514-8eea-0c786a2b6d56-config-data") pod "rabbitmq-cell1-server-0" (UID: "3fe05756-9202-4514-8eea-0c786a2b6d56") : configmap "rabbitmq-cell1-config-data" not found Mar 10 22:15:44 crc kubenswrapper[4919]: E0310 22:15:44.325353 4919 secret.go:188] Couldn't get secret openstack/neutron-config: secret "neutron-config" not found Mar 10 22:15:44 crc kubenswrapper[4919]: E0310 22:15:44.325376 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0a44bcbb-6e2e-48bb-b7a7-16a4e916001d-config podName:0a44bcbb-6e2e-48bb-b7a7-16a4e916001d nodeName:}" failed. No retries permitted until 2026-03-10 22:15:45.325369392 +0000 UTC m=+1532.567250000 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/secret/0a44bcbb-6e2e-48bb-b7a7-16a4e916001d-config") pod "neutron-846dbc6cd5-kg4kx" (UID: "0a44bcbb-6e2e-48bb-b7a7-16a4e916001d") : secret "neutron-config" not found Mar 10 22:15:44 crc kubenswrapper[4919]: E0310 22:15:44.325505 4919 secret.go:188] Couldn't get secret openstack/neutron-httpd-config: secret "neutron-httpd-config" not found Mar 10 22:15:44 crc kubenswrapper[4919]: E0310 22:15:44.325526 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0a44bcbb-6e2e-48bb-b7a7-16a4e916001d-httpd-config podName:0a44bcbb-6e2e-48bb-b7a7-16a4e916001d nodeName:}" failed. No retries permitted until 2026-03-10 22:15:45.325519706 +0000 UTC m=+1532.567400314 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "httpd-config" (UniqueName: "kubernetes.io/secret/0a44bcbb-6e2e-48bb-b7a7-16a4e916001d-httpd-config") pod "neutron-846dbc6cd5-kg4kx" (UID: "0a44bcbb-6e2e-48bb-b7a7-16a4e916001d") : secret "neutron-httpd-config" not found Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.341060 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="3fe05756-9202-4514-8eea-0c786a2b6d56" containerName="rabbitmq" containerID="cri-o://574989ae30539b8ab9b813ac1ebeaa0a635b60aa0d2eba085c37e336b3216913" gracePeriod=604800 Mar 10 22:15:44 crc kubenswrapper[4919]: E0310 22:15:44.350987 4919 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 10 22:15:44 crc kubenswrapper[4919]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:763d1f1e8a1cf877c151c59609960fd2fa29e7e50001f8818122a2d51878befa,Command:[/bin/sh -c #!/bin/bash Mar 10 22:15:44 crc kubenswrapper[4919]: Mar 10 22:15:44 crc kubenswrapper[4919]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Mar 10 22:15:44 crc kubenswrapper[4919]: Mar 10 
22:15:44 crc kubenswrapper[4919]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Mar 10 22:15:44 crc kubenswrapper[4919]: Mar 10 22:15:44 crc kubenswrapper[4919]: MYSQL_CMD="mysql -h -u root -P 3306" Mar 10 22:15:44 crc kubenswrapper[4919]: Mar 10 22:15:44 crc kubenswrapper[4919]: if [ -n "barbican" ]; then Mar 10 22:15:44 crc kubenswrapper[4919]: GRANT_DATABASE="barbican" Mar 10 22:15:44 crc kubenswrapper[4919]: else Mar 10 22:15:44 crc kubenswrapper[4919]: GRANT_DATABASE="*" Mar 10 22:15:44 crc kubenswrapper[4919]: fi Mar 10 22:15:44 crc kubenswrapper[4919]: Mar 10 22:15:44 crc kubenswrapper[4919]: # going for maximum compatibility here: Mar 10 22:15:44 crc kubenswrapper[4919]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Mar 10 22:15:44 crc kubenswrapper[4919]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Mar 10 22:15:44 crc kubenswrapper[4919]: # 3. create user with CREATE but then do all password and TLS with ALTER to Mar 10 22:15:44 crc kubenswrapper[4919]: # support updates Mar 10 22:15:44 crc kubenswrapper[4919]: Mar 10 22:15:44 crc kubenswrapper[4919]: $MYSQL_CMD < logger="UnhandledError" Mar 10 22:15:44 crc kubenswrapper[4919]: E0310 22:15:44.352118 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"barbican-db-secret\\\" not found\"" pod="openstack/barbican-8e40-account-create-update-kkddk" podUID="ec49f65c-e8af-44a1-b464-af2a86b299fc" Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.359371 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_9e69198f-c4ad-40c4-b0f4-1a6e9dd17940/ovsdbserver-nb/0.log" Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.359489 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.373044 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-9dd56c4d5-nbpbz" Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.376632 4919 generic.go:334] "Generic (PLEG): container finished" podID="91c8bbf6-8824-4e21-a491-86f2f657549a" containerID="d4f089226b859cf9e472a23e4abfff98df12043dbab19a145b4c4fdfb8923fe6" exitCode=0 Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.376676 4919 generic.go:334] "Generic (PLEG): container finished" podID="91c8bbf6-8824-4e21-a491-86f2f657549a" containerID="b9b5e9d2ec6d050219cb0112cd09d62a15be687c7cb44e610e55e8bc795ce60f" exitCode=0 Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.376688 4919 generic.go:334] "Generic (PLEG): container finished" podID="91c8bbf6-8824-4e21-a491-86f2f657549a" containerID="ae368876337c9d3b40fae442133c17eb2b857ea33f295ec663500c6b4ebd5fb3" exitCode=0 Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.376696 4919 generic.go:334] "Generic (PLEG): container finished" podID="91c8bbf6-8824-4e21-a491-86f2f657549a" containerID="48c8ed23f216106829b3f2774da116741d89515f11a103cf91248086130b0d18" exitCode=0 Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.376705 4919 generic.go:334] "Generic (PLEG): container finished" podID="91c8bbf6-8824-4e21-a491-86f2f657549a" containerID="fe3a820aaaecc3dbdc2c00e963ea20455282c2cd0ff782ad5b053eaa18c2a728" exitCode=0 Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.376713 4919 generic.go:334] "Generic (PLEG): container finished" podID="91c8bbf6-8824-4e21-a491-86f2f657549a" containerID="1ab1280d2d068201d8bde5433767f956a9a5d9d033ba36ade9a16c74b928af27" exitCode=0 Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.376719 4919 generic.go:334] "Generic (PLEG): container finished" podID="91c8bbf6-8824-4e21-a491-86f2f657549a" 
containerID="ea01dce7b06601c355d1e0c1f5bc23af1e7381afe0f09aa8a98efd1dceeab0e9" exitCode=0 Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.376725 4919 generic.go:334] "Generic (PLEG): container finished" podID="91c8bbf6-8824-4e21-a491-86f2f657549a" containerID="5f25a2d98dbdc44c683281bef776d43cbc4121e4fa9cb5254edd868686128030" exitCode=0 Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.376731 4919 generic.go:334] "Generic (PLEG): container finished" podID="91c8bbf6-8824-4e21-a491-86f2f657549a" containerID="8e08fa3055d8aa50b60d141bc84fe0f51f25e9fc45728c0dd1491d1bc7a66860" exitCode=0 Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.376737 4919 generic.go:334] "Generic (PLEG): container finished" podID="91c8bbf6-8824-4e21-a491-86f2f657549a" containerID="75261d329b8223b1135cf1458a80f97a9d45e26831f1353325eb35731b37f5d2" exitCode=0 Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.376744 4919 generic.go:334] "Generic (PLEG): container finished" podID="91c8bbf6-8824-4e21-a491-86f2f657549a" containerID="8406c65e24e9f21076831c641cab8e21b69c623c1d92a90d609d4a3c30670852" exitCode=0 Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.376752 4919 generic.go:334] "Generic (PLEG): container finished" podID="91c8bbf6-8824-4e21-a491-86f2f657549a" containerID="f84605aa41e553170463de5ffc1ffb79d9063aeb307fca1c7396bcab45897e17" exitCode=0 Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.376758 4919 generic.go:334] "Generic (PLEG): container finished" podID="91c8bbf6-8824-4e21-a491-86f2f657549a" containerID="2c39ba342378f6fa53b3c9079da22731dd39cfe3e751a36987c68151f2ee3a38" exitCode=0 Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.376764 4919 generic.go:334] "Generic (PLEG): container finished" podID="91c8bbf6-8824-4e21-a491-86f2f657549a" containerID="d87f46352498df2659c2cb9f7933207812d2b038ce489fc1d9131bb625190926" exitCode=0 Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.376842 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/swift-storage-0" event={"ID":"91c8bbf6-8824-4e21-a491-86f2f657549a","Type":"ContainerDied","Data":"d4f089226b859cf9e472a23e4abfff98df12043dbab19a145b4c4fdfb8923fe6"} Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.376879 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"91c8bbf6-8824-4e21-a491-86f2f657549a","Type":"ContainerDied","Data":"b9b5e9d2ec6d050219cb0112cd09d62a15be687c7cb44e610e55e8bc795ce60f"} Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.376899 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"91c8bbf6-8824-4e21-a491-86f2f657549a","Type":"ContainerDied","Data":"ae368876337c9d3b40fae442133c17eb2b857ea33f295ec663500c6b4ebd5fb3"} Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.376913 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"91c8bbf6-8824-4e21-a491-86f2f657549a","Type":"ContainerDied","Data":"48c8ed23f216106829b3f2774da116741d89515f11a103cf91248086130b0d18"} Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.376924 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"91c8bbf6-8824-4e21-a491-86f2f657549a","Type":"ContainerDied","Data":"fe3a820aaaecc3dbdc2c00e963ea20455282c2cd0ff782ad5b053eaa18c2a728"} Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.376935 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"91c8bbf6-8824-4e21-a491-86f2f657549a","Type":"ContainerDied","Data":"1ab1280d2d068201d8bde5433767f956a9a5d9d033ba36ade9a16c74b928af27"} Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.376945 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"91c8bbf6-8824-4e21-a491-86f2f657549a","Type":"ContainerDied","Data":"ea01dce7b06601c355d1e0c1f5bc23af1e7381afe0f09aa8a98efd1dceeab0e9"} Mar 10 22:15:44 crc 
kubenswrapper[4919]: I0310 22:15:44.376956 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"91c8bbf6-8824-4e21-a491-86f2f657549a","Type":"ContainerDied","Data":"5f25a2d98dbdc44c683281bef776d43cbc4121e4fa9cb5254edd868686128030"} Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.376967 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"91c8bbf6-8824-4e21-a491-86f2f657549a","Type":"ContainerDied","Data":"8e08fa3055d8aa50b60d141bc84fe0f51f25e9fc45728c0dd1491d1bc7a66860"} Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.376978 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"91c8bbf6-8824-4e21-a491-86f2f657549a","Type":"ContainerDied","Data":"75261d329b8223b1135cf1458a80f97a9d45e26831f1353325eb35731b37f5d2"} Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.376989 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"91c8bbf6-8824-4e21-a491-86f2f657549a","Type":"ContainerDied","Data":"8406c65e24e9f21076831c641cab8e21b69c623c1d92a90d609d4a3c30670852"} Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.377005 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"91c8bbf6-8824-4e21-a491-86f2f657549a","Type":"ContainerDied","Data":"f84605aa41e553170463de5ffc1ffb79d9063aeb307fca1c7396bcab45897e17"} Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.377019 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"91c8bbf6-8824-4e21-a491-86f2f657549a","Type":"ContainerDied","Data":"2c39ba342378f6fa53b3c9079da22731dd39cfe3e751a36987c68151f2ee3a38"} Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.377030 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"91c8bbf6-8824-4e21-a491-86f2f657549a","Type":"ContainerDied","Data":"d87f46352498df2659c2cb9f7933207812d2b038ce489fc1d9131bb625190926"} Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.403114 4919 generic.go:334] "Generic (PLEG): container finished" podID="4865c8ed-670d-41a0-b9fc-ba7697085e6b" containerID="160abe754665fdfbb180f5bc071ae95530a567d07d30547bc0bedcf6ffce0c0d" exitCode=143 Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.403261 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"4865c8ed-670d-41a0-b9fc-ba7697085e6b","Type":"ContainerDied","Data":"160abe754665fdfbb180f5bc071ae95530a567d07d30547bc0bedcf6ffce0c0d"} Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.439938 4919 generic.go:334] "Generic (PLEG): container finished" podID="91a933f1-aa44-4375-8f5c-e5f3567e6c8e" containerID="ceb6023d0f542d943ccfa4398a55aeeb75cf652b0f4a2b2be0237840184075d5" exitCode=143 Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.440158 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"91a933f1-aa44-4375-8f5c-e5f3567e6c8e","Type":"ContainerDied","Data":"ceb6023d0f542d943ccfa4398a55aeeb75cf652b0f4a2b2be0237840184075d5"} Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.451684 4919 generic.go:334] "Generic (PLEG): container finished" podID="515105ef-e538-4276-b682-7e05881dc7e8" containerID="9f98722ad0d1eaba724d3a1905d648a1f6aa5618c0b6cbe31210b8bf1f36f489" exitCode=143 Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.451793 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"515105ef-e538-4276-b682-7e05881dc7e8","Type":"ContainerDied","Data":"9f98722ad0d1eaba724d3a1905d648a1f6aa5618c0b6cbe31210b8bf1f36f489"} Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.497869 4919 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-controller-metrics-wgnf2_e61256d0-9ca6-4524-a19d-7efd32ab9724/openstack-network-exporter/0.log" Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.497926 4919 generic.go:334] "Generic (PLEG): container finished" podID="e61256d0-9ca6-4524-a19d-7efd32ab9724" containerID="1bb99faa0c9dbd8195614221f50fa8ce1965d14085b52684adacebddfa8030f0" exitCode=2 Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.498013 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-wgnf2" event={"ID":"e61256d0-9ca6-4524-a19d-7efd32ab9724","Type":"ContainerDied","Data":"1bb99faa0c9dbd8195614221f50fa8ce1965d14085b52684adacebddfa8030f0"} Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.516526 4919 generic.go:334] "Generic (PLEG): container finished" podID="81489e39-0246-4065-8835-31b1e5da8431" containerID="77810b20846ff06e6e2286529394046c226fcdd21d8e8615e097fa35b3579513" exitCode=143 Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.516633 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"81489e39-0246-4065-8835-31b1e5da8431","Type":"ContainerDied","Data":"77810b20846ff06e6e2286529394046c226fcdd21d8e8615e097fa35b3579513"} Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.537155 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9e69198f-c4ad-40c4-b0f4-1a6e9dd17940-metrics-certs-tls-certs\") pod \"9e69198f-c4ad-40c4-b0f4-1a6e9dd17940\" (UID: \"9e69198f-c4ad-40c4-b0f4-1a6e9dd17940\") " Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.537255 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-nb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"9e69198f-c4ad-40c4-b0f4-1a6e9dd17940\" (UID: \"9e69198f-c4ad-40c4-b0f4-1a6e9dd17940\") " Mar 10 22:15:44 crc kubenswrapper[4919]: 
I0310 22:15:44.537309 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gvpdr\" (UniqueName: \"kubernetes.io/projected/37dac1c8-963f-466f-977e-37b2fd98d32c-kube-api-access-gvpdr\") pod \"37dac1c8-963f-466f-977e-37b2fd98d32c\" (UID: \"37dac1c8-963f-466f-977e-37b2fd98d32c\") " Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.537462 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9e69198f-c4ad-40c4-b0f4-1a6e9dd17940-scripts\") pod \"9e69198f-c4ad-40c4-b0f4-1a6e9dd17940\" (UID: \"9e69198f-c4ad-40c4-b0f4-1a6e9dd17940\") " Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.537512 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e69198f-c4ad-40c4-b0f4-1a6e9dd17940-config\") pod \"9e69198f-c4ad-40c4-b0f4-1a6e9dd17940\" (UID: \"9e69198f-c4ad-40c4-b0f4-1a6e9dd17940\") " Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.537545 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/37dac1c8-963f-466f-977e-37b2fd98d32c-dns-svc\") pod \"37dac1c8-963f-466f-977e-37b2fd98d32c\" (UID: \"37dac1c8-963f-466f-977e-37b2fd98d32c\") " Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.537588 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xzs87\" (UniqueName: \"kubernetes.io/projected/9e69198f-c4ad-40c4-b0f4-1a6e9dd17940-kube-api-access-xzs87\") pod \"9e69198f-c4ad-40c4-b0f4-1a6e9dd17940\" (UID: \"9e69198f-c4ad-40c4-b0f4-1a6e9dd17940\") " Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.537708 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/37dac1c8-963f-466f-977e-37b2fd98d32c-ovsdbserver-sb\") pod 
\"37dac1c8-963f-466f-977e-37b2fd98d32c\" (UID: \"37dac1c8-963f-466f-977e-37b2fd98d32c\") " Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.537748 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9e69198f-c4ad-40c4-b0f4-1a6e9dd17940-ovsdb-rundir\") pod \"9e69198f-c4ad-40c4-b0f4-1a6e9dd17940\" (UID: \"9e69198f-c4ad-40c4-b0f4-1a6e9dd17940\") " Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.538699 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/37dac1c8-963f-466f-977e-37b2fd98d32c-config\") pod \"37dac1c8-963f-466f-977e-37b2fd98d32c\" (UID: \"37dac1c8-963f-466f-977e-37b2fd98d32c\") " Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.538791 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/37dac1c8-963f-466f-977e-37b2fd98d32c-ovsdbserver-nb\") pod \"37dac1c8-963f-466f-977e-37b2fd98d32c\" (UID: \"37dac1c8-963f-466f-977e-37b2fd98d32c\") " Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.538828 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/37dac1c8-963f-466f-977e-37b2fd98d32c-dns-swift-storage-0\") pod \"37dac1c8-963f-466f-977e-37b2fd98d32c\" (UID: \"37dac1c8-963f-466f-977e-37b2fd98d32c\") " Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.538854 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9e69198f-c4ad-40c4-b0f4-1a6e9dd17940-ovsdbserver-nb-tls-certs\") pod \"9e69198f-c4ad-40c4-b0f4-1a6e9dd17940\" (UID: \"9e69198f-c4ad-40c4-b0f4-1a6e9dd17940\") " Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.538886 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e69198f-c4ad-40c4-b0f4-1a6e9dd17940-combined-ca-bundle\") pod \"9e69198f-c4ad-40c4-b0f4-1a6e9dd17940\" (UID: \"9e69198f-c4ad-40c4-b0f4-1a6e9dd17940\") " Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.548346 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9e69198f-c4ad-40c4-b0f4-1a6e9dd17940-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "9e69198f-c4ad-40c4-b0f4-1a6e9dd17940" (UID: "9e69198f-c4ad-40c4-b0f4-1a6e9dd17940"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.548673 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37dac1c8-963f-466f-977e-37b2fd98d32c-kube-api-access-gvpdr" (OuterVolumeSpecName: "kube-api-access-gvpdr") pod "37dac1c8-963f-466f-977e-37b2fd98d32c" (UID: "37dac1c8-963f-466f-977e-37b2fd98d32c"). InnerVolumeSpecName "kube-api-access-gvpdr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.550934 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9e69198f-c4ad-40c4-b0f4-1a6e9dd17940-scripts" (OuterVolumeSpecName: "scripts") pod "9e69198f-c4ad-40c4-b0f4-1a6e9dd17940" (UID: "9e69198f-c4ad-40c4-b0f4-1a6e9dd17940"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.553018 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9e69198f-c4ad-40c4-b0f4-1a6e9dd17940-config" (OuterVolumeSpecName: "config") pod "9e69198f-c4ad-40c4-b0f4-1a6e9dd17940" (UID: "9e69198f-c4ad-40c4-b0f4-1a6e9dd17940"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.554249 4919 generic.go:334] "Generic (PLEG): container finished" podID="ab479995-b87a-46b8-9a4e-d9e95d556775" containerID="0fa75078e11e8939f0a39526b2508ccb8e7f4c3ea23641588f1b3b509c8c8e82" exitCode=143 Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.554686 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-846dbc6cd5-kg4kx" podUID="0a44bcbb-6e2e-48bb-b7a7-16a4e916001d" containerName="neutron-api" containerID="cri-o://18dd8dafec6aecc8efed736cca0a71f9d1628505bf0335dc2c01e5a11aba23af" gracePeriod=30 Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.554961 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ab479995-b87a-46b8-9a4e-d9e95d556775","Type":"ContainerDied","Data":"0fa75078e11e8939f0a39526b2508ccb8e7f4c3ea23641588f1b3b509c8c8e82"} Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.555373 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-846dbc6cd5-kg4kx" podUID="0a44bcbb-6e2e-48bb-b7a7-16a4e916001d" containerName="neutron-httpd" containerID="cri-o://9cdb7599c01cdc95ab93aaa9cd850cf9d1c5bc23e81939b0310b3ed9a9214bc6" gracePeriod=30 Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.568750 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e69198f-c4ad-40c4-b0f4-1a6e9dd17940-kube-api-access-xzs87" (OuterVolumeSpecName: "kube-api-access-xzs87") pod "9e69198f-c4ad-40c4-b0f4-1a6e9dd17940" (UID: "9e69198f-c4ad-40c4-b0f4-1a6e9dd17940"). InnerVolumeSpecName "kube-api-access-xzs87". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.582300 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-lxbqg"] Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.587461 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "ovndbcluster-nb-etc-ovn") pod "9e69198f-c4ad-40c4-b0f4-1a6e9dd17940" (UID: "9e69198f-c4ad-40c4-b0f4-1a6e9dd17940"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.592920 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.593132 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-conductor-0" podUID="62814b8d-8679-4350-be7d-5f729f901846" containerName="nova-cell1-conductor-conductor" containerID="cri-o://ba1ede56006ea1128e8e67460a4bb03bb7a7ac205f92a9ada4f61f419402b0a6" gracePeriod=30 Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.605380 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-lxbqg"] Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.614129 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.614336 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="4a7ad3ed-9144-4a21-808c-23d613354a2f" containerName="nova-cell0-conductor-conductor" containerID="cri-o://9a7f54f0ad1bc99653d56471ca107558d95729ca0a75e6040163ca4e8d3452b4" gracePeriod=30 Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.621192 4919 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e69198f-c4ad-40c4-b0f4-1a6e9dd17940-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9e69198f-c4ad-40c4-b0f4-1a6e9dd17940" (UID: "9e69198f-c4ad-40c4-b0f4-1a6e9dd17940"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.640109 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-zr9cc"] Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.642039 4919 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9e69198f-c4ad-40c4-b0f4-1a6e9dd17940-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.642078 4919 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e69198f-c4ad-40c4-b0f4-1a6e9dd17940-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.642106 4919 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.642117 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gvpdr\" (UniqueName: \"kubernetes.io/projected/37dac1c8-963f-466f-977e-37b2fd98d32c-kube-api-access-gvpdr\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.642129 4919 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9e69198f-c4ad-40c4-b0f4-1a6e9dd17940-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.642140 4919 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/9e69198f-c4ad-40c4-b0f4-1a6e9dd17940-config\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.642150 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xzs87\" (UniqueName: \"kubernetes.io/projected/9e69198f-c4ad-40c4-b0f4-1a6e9dd17940-kube-api-access-xzs87\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.647057 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-zr9cc"] Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.657258 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-cell1-galera-0" podUID="37ef9179-69db-49ab-a4e6-2e2b815fc260" containerName="galera" containerID="cri-o://3a27cd1d1040a8d26e69883aaeb7f8132c3852f88812a287ec745156fd408f80" gracePeriod=30 Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.678894 4919 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.690289 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/37dac1c8-963f-466f-977e-37b2fd98d32c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "37dac1c8-963f-466f-977e-37b2fd98d32c" (UID: "37dac1c8-963f-466f-977e-37b2fd98d32c"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.690919 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/37dac1c8-963f-466f-977e-37b2fd98d32c-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "37dac1c8-963f-466f-977e-37b2fd98d32c" (UID: "37dac1c8-963f-466f-977e-37b2fd98d32c"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.699737 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/37dac1c8-963f-466f-977e-37b2fd98d32c-config" (OuterVolumeSpecName: "config") pod "37dac1c8-963f-466f-977e-37b2fd98d32c" (UID: "37dac1c8-963f-466f-977e-37b2fd98d32c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.736945 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e69198f-c4ad-40c4-b0f4-1a6e9dd17940-ovsdbserver-nb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-nb-tls-certs") pod "9e69198f-c4ad-40c4-b0f4-1a6e9dd17940" (UID: "9e69198f-c4ad-40c4-b0f4-1a6e9dd17940"). InnerVolumeSpecName "ovsdbserver-nb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.739466 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e69198f-c4ad-40c4-b0f4-1a6e9dd17940-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "9e69198f-c4ad-40c4-b0f4-1a6e9dd17940" (UID: "9e69198f-c4ad-40c4-b0f4-1a6e9dd17940"). InnerVolumeSpecName "metrics-certs-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.744282 4919 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/37dac1c8-963f-466f-977e-37b2fd98d32c-config\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.744304 4919 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/37dac1c8-963f-466f-977e-37b2fd98d32c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.744313 4919 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/37dac1c8-963f-466f-977e-37b2fd98d32c-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.744323 4919 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9e69198f-c4ad-40c4-b0f4-1a6e9dd17940-ovsdbserver-nb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.744332 4919 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9e69198f-c4ad-40c4-b0f4-1a6e9dd17940-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.744340 4919 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.757492 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/37dac1c8-963f-466f-977e-37b2fd98d32c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "37dac1c8-963f-466f-977e-37b2fd98d32c" (UID: 
"37dac1c8-963f-466f-977e-37b2fd98d32c"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.771344 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-wgnf2_e61256d0-9ca6-4524-a19d-7efd32ab9724/openstack-network-exporter/0.log" Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.771757 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-wgnf2" Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.779540 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.801347 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/37dac1c8-963f-466f-977e-37b2fd98d32c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "37dac1c8-963f-466f-977e-37b2fd98d32c" (UID: "37dac1c8-963f-466f-977e-37b2fd98d32c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.802514 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_8782e985-ff23-4580-bdfb-ef2dd9b540bc/ovsdbserver-sb/0.log" Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.802572 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.845050 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e61256d0-9ca6-4524-a19d-7efd32ab9724-metrics-certs-tls-certs\") pod \"e61256d0-9ca6-4524-a19d-7efd32ab9724\" (UID: \"e61256d0-9ca6-4524-a19d-7efd32ab9724\") " Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.845125 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e61256d0-9ca6-4524-a19d-7efd32ab9724-combined-ca-bundle\") pod \"e61256d0-9ca6-4524-a19d-7efd32ab9724\" (UID: \"e61256d0-9ca6-4524-a19d-7efd32ab9724\") " Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.845197 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/e61256d0-9ca6-4524-a19d-7efd32ab9724-ovs-rundir\") pod \"e61256d0-9ca6-4524-a19d-7efd32ab9724\" (UID: \"e61256d0-9ca6-4524-a19d-7efd32ab9724\") " Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.845227 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e61256d0-9ca6-4524-a19d-7efd32ab9724-config\") pod \"e61256d0-9ca6-4524-a19d-7efd32ab9724\" (UID: \"e61256d0-9ca6-4524-a19d-7efd32ab9724\") " Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.845264 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/e61256d0-9ca6-4524-a19d-7efd32ab9724-ovn-rundir\") pod \"e61256d0-9ca6-4524-a19d-7efd32ab9724\" (UID: \"e61256d0-9ca6-4524-a19d-7efd32ab9724\") " Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.845416 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ftt6c\" 
(UniqueName: \"kubernetes.io/projected/e61256d0-9ca6-4524-a19d-7efd32ab9724-kube-api-access-ftt6c\") pod \"e61256d0-9ca6-4524-a19d-7efd32ab9724\" (UID: \"e61256d0-9ca6-4524-a19d-7efd32ab9724\") " Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.845828 4919 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/37dac1c8-963f-466f-977e-37b2fd98d32c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.845846 4919 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/37dac1c8-963f-466f-977e-37b2fd98d32c-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.847530 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e61256d0-9ca6-4524-a19d-7efd32ab9724-ovs-rundir" (OuterVolumeSpecName: "ovs-rundir") pod "e61256d0-9ca6-4524-a19d-7efd32ab9724" (UID: "e61256d0-9ca6-4524-a19d-7efd32ab9724"). InnerVolumeSpecName "ovs-rundir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.849083 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e61256d0-9ca6-4524-a19d-7efd32ab9724-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "e61256d0-9ca6-4524-a19d-7efd32ab9724" (UID: "e61256d0-9ca6-4524-a19d-7efd32ab9724"). InnerVolumeSpecName "ovn-rundir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.849255 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e61256d0-9ca6-4524-a19d-7efd32ab9724-kube-api-access-ftt6c" (OuterVolumeSpecName: "kube-api-access-ftt6c") pod "e61256d0-9ca6-4524-a19d-7efd32ab9724" (UID: "e61256d0-9ca6-4524-a19d-7efd32ab9724"). InnerVolumeSpecName "kube-api-access-ftt6c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.849317 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e61256d0-9ca6-4524-a19d-7efd32ab9724-config" (OuterVolumeSpecName: "config") pod "e61256d0-9ca6-4524-a19d-7efd32ab9724" (UID: "e61256d0-9ca6-4524-a19d-7efd32ab9724"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.877937 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e61256d0-9ca6-4524-a19d-7efd32ab9724-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e61256d0-9ca6-4524-a19d-7efd32ab9724" (UID: "e61256d0-9ca6-4524-a19d-7efd32ab9724"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.887617 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-68449cb44c-wmmzf" Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.928595 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e61256d0-9ca6-4524-a19d-7efd32ab9724-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "e61256d0-9ca6-4524-a19d-7efd32ab9724" (UID: "e61256d0-9ca6-4524-a19d-7efd32ab9724"). InnerVolumeSpecName "metrics-certs-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.947004 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6bff1404-f9b1-48f8-b093-95c3bb206c6a-combined-ca-bundle\") pod \"6bff1404-f9b1-48f8-b093-95c3bb206c6a\" (UID: \"6bff1404-f9b1-48f8-b093-95c3bb206c6a\") " Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.947063 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8782e985-ff23-4580-bdfb-ef2dd9b540bc-combined-ca-bundle\") pod \"8782e985-ff23-4580-bdfb-ef2dd9b540bc\" (UID: \"8782e985-ff23-4580-bdfb-ef2dd9b540bc\") " Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.947106 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/6bff1404-f9b1-48f8-b093-95c3bb206c6a-openstack-config-secret\") pod \"6bff1404-f9b1-48f8-b093-95c3bb206c6a\" (UID: \"6bff1404-f9b1-48f8-b093-95c3bb206c6a\") " Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.947131 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htt2d\" (UniqueName: \"kubernetes.io/projected/8782e985-ff23-4580-bdfb-ef2dd9b540bc-kube-api-access-htt2d\") pod \"8782e985-ff23-4580-bdfb-ef2dd9b540bc\" (UID: \"8782e985-ff23-4580-bdfb-ef2dd9b540bc\") " Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.947191 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/6bff1404-f9b1-48f8-b093-95c3bb206c6a-openstack-config\") pod \"6bff1404-f9b1-48f8-b093-95c3bb206c6a\" (UID: \"6bff1404-f9b1-48f8-b093-95c3bb206c6a\") " Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.947246 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"ovndbcluster-sb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"8782e985-ff23-4580-bdfb-ef2dd9b540bc\" (UID: \"8782e985-ff23-4580-bdfb-ef2dd9b540bc\") " Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.947305 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pc9fw\" (UniqueName: \"kubernetes.io/projected/6bff1404-f9b1-48f8-b093-95c3bb206c6a-kube-api-access-pc9fw\") pod \"6bff1404-f9b1-48f8-b093-95c3bb206c6a\" (UID: \"6bff1404-f9b1-48f8-b093-95c3bb206c6a\") " Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.947341 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8782e985-ff23-4580-bdfb-ef2dd9b540bc-ovsdbserver-sb-tls-certs\") pod \"8782e985-ff23-4580-bdfb-ef2dd9b540bc\" (UID: \"8782e985-ff23-4580-bdfb-ef2dd9b540bc\") " Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.947403 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8782e985-ff23-4580-bdfb-ef2dd9b540bc-scripts\") pod \"8782e985-ff23-4580-bdfb-ef2dd9b540bc\" (UID: \"8782e985-ff23-4580-bdfb-ef2dd9b540bc\") " Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.947466 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/8782e985-ff23-4580-bdfb-ef2dd9b540bc-metrics-certs-tls-certs\") pod \"8782e985-ff23-4580-bdfb-ef2dd9b540bc\" (UID: \"8782e985-ff23-4580-bdfb-ef2dd9b540bc\") " Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.947482 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8782e985-ff23-4580-bdfb-ef2dd9b540bc-config\") pod \"8782e985-ff23-4580-bdfb-ef2dd9b540bc\" (UID: \"8782e985-ff23-4580-bdfb-ef2dd9b540bc\") " Mar 10 
22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.947503 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/8782e985-ff23-4580-bdfb-ef2dd9b540bc-ovsdb-rundir\") pod \"8782e985-ff23-4580-bdfb-ef2dd9b540bc\" (UID: \"8782e985-ff23-4580-bdfb-ef2dd9b540bc\") " Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.947919 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8782e985-ff23-4580-bdfb-ef2dd9b540bc-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "8782e985-ff23-4580-bdfb-ef2dd9b540bc" (UID: "8782e985-ff23-4580-bdfb-ef2dd9b540bc"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.948476 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8782e985-ff23-4580-bdfb-ef2dd9b540bc-scripts" (OuterVolumeSpecName: "scripts") pod "8782e985-ff23-4580-bdfb-ef2dd9b540bc" (UID: "8782e985-ff23-4580-bdfb-ef2dd9b540bc"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.948740 4919 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8782e985-ff23-4580-bdfb-ef2dd9b540bc-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.948821 4919 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/8782e985-ff23-4580-bdfb-ef2dd9b540bc-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.948950 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ftt6c\" (UniqueName: \"kubernetes.io/projected/e61256d0-9ca6-4524-a19d-7efd32ab9724-kube-api-access-ftt6c\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.950659 4919 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e61256d0-9ca6-4524-a19d-7efd32ab9724-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.950739 4919 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e61256d0-9ca6-4524-a19d-7efd32ab9724-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.950800 4919 reconciler_common.go:293] "Volume detached for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/e61256d0-9ca6-4524-a19d-7efd32ab9724-ovs-rundir\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.950853 4919 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e61256d0-9ca6-4524-a19d-7efd32ab9724-config\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.951136 4919 
reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/e61256d0-9ca6-4524-a19d-7efd32ab9724-ovn-rundir\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.951723 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8782e985-ff23-4580-bdfb-ef2dd9b540bc-config" (OuterVolumeSpecName: "config") pod "8782e985-ff23-4580-bdfb-ef2dd9b540bc" (UID: "8782e985-ff23-4580-bdfb-ef2dd9b540bc"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.963127 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6bff1404-f9b1-48f8-b093-95c3bb206c6a-kube-api-access-pc9fw" (OuterVolumeSpecName: "kube-api-access-pc9fw") pod "6bff1404-f9b1-48f8-b093-95c3bb206c6a" (UID: "6bff1404-f9b1-48f8-b093-95c3bb206c6a"). InnerVolumeSpecName "kube-api-access-pc9fw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.976098 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "ovndbcluster-sb-etc-ovn") pod "8782e985-ff23-4580-bdfb-ef2dd9b540bc" (UID: "8782e985-ff23-4580-bdfb-ef2dd9b540bc"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.979261 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8782e985-ff23-4580-bdfb-ef2dd9b540bc-kube-api-access-htt2d" (OuterVolumeSpecName: "kube-api-access-htt2d") pod "8782e985-ff23-4580-bdfb-ef2dd9b540bc" (UID: "8782e985-ff23-4580-bdfb-ef2dd9b540bc"). InnerVolumeSpecName "kube-api-access-htt2d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 22:15:44 crc kubenswrapper[4919]: I0310 22:15:44.990056 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6bff1404-f9b1-48f8-b093-95c3bb206c6a-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "6bff1404-f9b1-48f8-b093-95c3bb206c6a" (UID: "6bff1404-f9b1-48f8-b093-95c3bb206c6a"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 22:15:45 crc kubenswrapper[4919]: I0310 22:15:45.010745 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6bff1404-f9b1-48f8-b093-95c3bb206c6a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6bff1404-f9b1-48f8-b093-95c3bb206c6a" (UID: "6bff1404-f9b1-48f8-b093-95c3bb206c6a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:15:45 crc kubenswrapper[4919]: I0310 22:15:45.051477 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-fbf4c94d9-4mg9b"] Mar 10 22:15:45 crc kubenswrapper[4919]: I0310 22:15:45.051753 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-fbf4c94d9-4mg9b" podUID="d0f89c3b-5242-409b-a318-5b69410e9680" containerName="proxy-httpd" containerID="cri-o://704f926a26eb1f83943af82f73395b8827735ef4003ab0f0d955ebb8116c59f5" gracePeriod=30 Mar 10 22:15:45 crc kubenswrapper[4919]: I0310 22:15:45.052005 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-fbf4c94d9-4mg9b" podUID="d0f89c3b-5242-409b-a318-5b69410e9680" containerName="proxy-server" containerID="cri-o://64708e20bac95ef817be6f849912905726f0a35138271eb315056e779dec07f3" gracePeriod=30 Mar 10 22:15:45 crc kubenswrapper[4919]: I0310 22:15:45.052585 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-84sz2\" (UniqueName: \"kubernetes.io/projected/7c9bf9de-7c1a-436e-a8a0-6c987b34a5e0-kube-api-access-84sz2\") pod \"7c9bf9de-7c1a-436e-a8a0-6c987b34a5e0\" (UID: \"7c9bf9de-7c1a-436e-a8a0-6c987b34a5e0\") " Mar 10 22:15:45 crc kubenswrapper[4919]: I0310 22:15:45.052665 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7c9bf9de-7c1a-436e-a8a0-6c987b34a5e0-config-data-custom\") pod \"7c9bf9de-7c1a-436e-a8a0-6c987b34a5e0\" (UID: \"7c9bf9de-7c1a-436e-a8a0-6c987b34a5e0\") " Mar 10 22:15:45 crc kubenswrapper[4919]: I0310 22:15:45.052790 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c9bf9de-7c1a-436e-a8a0-6c987b34a5e0-config-data\") pod \"7c9bf9de-7c1a-436e-a8a0-6c987b34a5e0\" (UID: \"7c9bf9de-7c1a-436e-a8a0-6c987b34a5e0\") " Mar 10 22:15:45 crc kubenswrapper[4919]: I0310 22:15:45.052835 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c9bf9de-7c1a-436e-a8a0-6c987b34a5e0-combined-ca-bundle\") pod \"7c9bf9de-7c1a-436e-a8a0-6c987b34a5e0\" (UID: \"7c9bf9de-7c1a-436e-a8a0-6c987b34a5e0\") " Mar 10 22:15:45 crc kubenswrapper[4919]: I0310 22:15:45.052863 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7c9bf9de-7c1a-436e-a8a0-6c987b34a5e0-logs\") pod \"7c9bf9de-7c1a-436e-a8a0-6c987b34a5e0\" (UID: \"7c9bf9de-7c1a-436e-a8a0-6c987b34a5e0\") " Mar 10 22:15:45 crc kubenswrapper[4919]: I0310 22:15:45.053441 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htt2d\" (UniqueName: \"kubernetes.io/projected/8782e985-ff23-4580-bdfb-ef2dd9b540bc-kube-api-access-htt2d\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:45 crc kubenswrapper[4919]: I0310 22:15:45.053454 4919 reconciler_common.go:293] 
"Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/6bff1404-f9b1-48f8-b093-95c3bb206c6a-openstack-config\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:45 crc kubenswrapper[4919]: I0310 22:15:45.053472 4919 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Mar 10 22:15:45 crc kubenswrapper[4919]: I0310 22:15:45.053483 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pc9fw\" (UniqueName: \"kubernetes.io/projected/6bff1404-f9b1-48f8-b093-95c3bb206c6a-kube-api-access-pc9fw\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:45 crc kubenswrapper[4919]: I0310 22:15:45.053492 4919 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8782e985-ff23-4580-bdfb-ef2dd9b540bc-config\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:45 crc kubenswrapper[4919]: I0310 22:15:45.053502 4919 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6bff1404-f9b1-48f8-b093-95c3bb206c6a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:45 crc kubenswrapper[4919]: I0310 22:15:45.055241 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7c9bf9de-7c1a-436e-a8a0-6c987b34a5e0-logs" (OuterVolumeSpecName: "logs") pod "7c9bf9de-7c1a-436e-a8a0-6c987b34a5e0" (UID: "7c9bf9de-7c1a-436e-a8a0-6c987b34a5e0"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 22:15:45 crc kubenswrapper[4919]: I0310 22:15:45.056378 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8782e985-ff23-4580-bdfb-ef2dd9b540bc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8782e985-ff23-4580-bdfb-ef2dd9b540bc" (UID: "8782e985-ff23-4580-bdfb-ef2dd9b540bc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:15:45 crc kubenswrapper[4919]: I0310 22:15:45.068718 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c9bf9de-7c1a-436e-a8a0-6c987b34a5e0-kube-api-access-84sz2" (OuterVolumeSpecName: "kube-api-access-84sz2") pod "7c9bf9de-7c1a-436e-a8a0-6c987b34a5e0" (UID: "7c9bf9de-7c1a-436e-a8a0-6c987b34a5e0"). InnerVolumeSpecName "kube-api-access-84sz2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 22:15:45 crc kubenswrapper[4919]: I0310 22:15:45.076219 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c9bf9de-7c1a-436e-a8a0-6c987b34a5e0-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "7c9bf9de-7c1a-436e-a8a0-6c987b34a5e0" (UID: "7c9bf9de-7c1a-436e-a8a0-6c987b34a5e0"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:15:45 crc kubenswrapper[4919]: I0310 22:15:45.090356 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8782e985-ff23-4580-bdfb-ef2dd9b540bc-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "8782e985-ff23-4580-bdfb-ef2dd9b540bc" (UID: "8782e985-ff23-4580-bdfb-ef2dd9b540bc"). InnerVolumeSpecName "metrics-certs-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:15:45 crc kubenswrapper[4919]: I0310 22:15:45.105239 4919 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Mar 10 22:15:45 crc kubenswrapper[4919]: I0310 22:15:45.110512 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c9bf9de-7c1a-436e-a8a0-6c987b34a5e0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7c9bf9de-7c1a-436e-a8a0-6c987b34a5e0" (UID: "7c9bf9de-7c1a-436e-a8a0-6c987b34a5e0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:15:45 crc kubenswrapper[4919]: I0310 22:15:45.155955 4919 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8782e985-ff23-4580-bdfb-ef2dd9b540bc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:45 crc kubenswrapper[4919]: I0310 22:15:45.159908 4919 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7c9bf9de-7c1a-436e-a8a0-6c987b34a5e0-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:45 crc kubenswrapper[4919]: I0310 22:15:45.159980 4919 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c9bf9de-7c1a-436e-a8a0-6c987b34a5e0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:45 crc kubenswrapper[4919]: I0310 22:15:45.160062 4919 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:45 crc kubenswrapper[4919]: I0310 22:15:45.160121 4919 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/7c9bf9de-7c1a-436e-a8a0-6c987b34a5e0-logs\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:45 crc kubenswrapper[4919]: I0310 22:15:45.160197 4919 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/8782e985-ff23-4580-bdfb-ef2dd9b540bc-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:45 crc kubenswrapper[4919]: I0310 22:15:45.160254 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-84sz2\" (UniqueName: \"kubernetes.io/projected/7c9bf9de-7c1a-436e-a8a0-6c987b34a5e0-kube-api-access-84sz2\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:45 crc kubenswrapper[4919]: I0310 22:15:45.158614 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6bff1404-f9b1-48f8-b093-95c3bb206c6a-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "6bff1404-f9b1-48f8-b093-95c3bb206c6a" (UID: "6bff1404-f9b1-48f8-b093-95c3bb206c6a"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:15:45 crc kubenswrapper[4919]: I0310 22:15:45.158994 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c9bf9de-7c1a-436e-a8a0-6c987b34a5e0-config-data" (OuterVolumeSpecName: "config-data") pod "7c9bf9de-7c1a-436e-a8a0-6c987b34a5e0" (UID: "7c9bf9de-7c1a-436e-a8a0-6c987b34a5e0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:15:45 crc kubenswrapper[4919]: I0310 22:15:45.159100 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8782e985-ff23-4580-bdfb-ef2dd9b540bc-ovsdbserver-sb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-sb-tls-certs") pod "8782e985-ff23-4580-bdfb-ef2dd9b540bc" (UID: "8782e985-ff23-4580-bdfb-ef2dd9b540bc"). InnerVolumeSpecName "ovsdbserver-sb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:15:45 crc kubenswrapper[4919]: E0310 22:15:45.194707 4919 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ba1ede56006ea1128e8e67460a4bb03bb7a7ac205f92a9ada4f61f419402b0a6" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 10 22:15:45 crc kubenswrapper[4919]: I0310 22:15:45.203683 4919 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/swift-proxy-fbf4c94d9-4mg9b" podUID="d0f89c3b-5242-409b-a318-5b69410e9680" containerName="proxy-server" probeResult="failure" output="Get \"https://10.217.0.176:8080/healthcheck\": dial tcp 10.217.0.176:8080: connect: connection refused" Mar 10 22:15:45 crc kubenswrapper[4919]: I0310 22:15:45.203995 4919 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/swift-proxy-fbf4c94d9-4mg9b" podUID="d0f89c3b-5242-409b-a318-5b69410e9680" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.176:8080/healthcheck\": dial tcp 10.217.0.176:8080: connect: connection refused" Mar 10 22:15:45 crc kubenswrapper[4919]: E0310 22:15:45.205816 4919 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ba1ede56006ea1128e8e67460a4bb03bb7a7ac205f92a9ada4f61f419402b0a6" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 10 22:15:45 crc kubenswrapper[4919]: E0310 22:15:45.208009 4919 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ba1ede56006ea1128e8e67460a4bb03bb7a7ac205f92a9ada4f61f419402b0a6" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 10 22:15:45 crc 
kubenswrapper[4919]: E0310 22:15:45.208059 4919 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell1-conductor-0" podUID="62814b8d-8679-4350-be7d-5f729f901846" containerName="nova-cell1-conductor-conductor" Mar 10 22:15:45 crc kubenswrapper[4919]: I0310 22:15:45.261780 4919 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8782e985-ff23-4580-bdfb-ef2dd9b540bc-ovsdbserver-sb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:45 crc kubenswrapper[4919]: I0310 22:15:45.261812 4919 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/6bff1404-f9b1-48f8-b093-95c3bb206c6a-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:45 crc kubenswrapper[4919]: I0310 22:15:45.261821 4919 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c9bf9de-7c1a-436e-a8a0-6c987b34a5e0-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:45 crc kubenswrapper[4919]: E0310 22:15:45.261892 4919 configmap.go:193] Couldn't get configMap openstack/ovnnorthd-scripts: configmap "ovnnorthd-scripts" not found Mar 10 22:15:45 crc kubenswrapper[4919]: E0310 22:15:45.261946 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f4c91bdf-1945-4c8a-ab05-3b619e4bdb2b-scripts podName:f4c91bdf-1945-4c8a-ab05-3b619e4bdb2b nodeName:}" failed. No retries permitted until 2026-03-10 22:15:49.261929517 +0000 UTC m=+1536.503810125 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "scripts" (UniqueName: "kubernetes.io/configmap/f4c91bdf-1945-4c8a-ab05-3b619e4bdb2b-scripts") pod "ovn-northd-0" (UID: "f4c91bdf-1945-4c8a-ab05-3b619e4bdb2b") : configmap "ovnnorthd-scripts" not found Mar 10 22:15:45 crc kubenswrapper[4919]: E0310 22:15:45.262253 4919 configmap.go:193] Couldn't get configMap openstack/ovnnorthd-config: configmap "ovnnorthd-config" not found Mar 10 22:15:45 crc kubenswrapper[4919]: E0310 22:15:45.262279 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f4c91bdf-1945-4c8a-ab05-3b619e4bdb2b-config podName:f4c91bdf-1945-4c8a-ab05-3b619e4bdb2b nodeName:}" failed. No retries permitted until 2026-03-10 22:15:49.262270876 +0000 UTC m=+1536.504151484 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/f4c91bdf-1945-4c8a-ab05-3b619e4bdb2b-config") pod "ovn-northd-0" (UID: "f4c91bdf-1945-4c8a-ab05-3b619e4bdb2b") : configmap "ovnnorthd-config" not found Mar 10 22:15:45 crc kubenswrapper[4919]: E0310 22:15:45.359042 4919 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e7e7fcd09cc0c969ac9f3c21aebb85f2b23a2c01eb8ef776f788577ffa3c96d5" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 10 22:15:45 crc kubenswrapper[4919]: E0310 22:15:45.359123 4919 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 85e85aa8a7e78a2f2c6fc7044ebf9c1dcb554abd0c952c366102b9a1f2fa0880 is running failed: container process not found" containerID="85e85aa8a7e78a2f2c6fc7044ebf9c1dcb554abd0c952c366102b9a1f2fa0880" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 10 22:15:45 crc kubenswrapper[4919]: E0310 22:15:45.360494 4919 log.go:32] 
"ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 85e85aa8a7e78a2f2c6fc7044ebf9c1dcb554abd0c952c366102b9a1f2fa0880 is running failed: container process not found" containerID="85e85aa8a7e78a2f2c6fc7044ebf9c1dcb554abd0c952c366102b9a1f2fa0880" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 10 22:15:45 crc kubenswrapper[4919]: E0310 22:15:45.360575 4919 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e7e7fcd09cc0c969ac9f3c21aebb85f2b23a2c01eb8ef776f788577ffa3c96d5" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 10 22:15:45 crc kubenswrapper[4919]: E0310 22:15:45.361661 4919 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e7e7fcd09cc0c969ac9f3c21aebb85f2b23a2c01eb8ef776f788577ffa3c96d5" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 10 22:15:45 crc kubenswrapper[4919]: E0310 22:15:45.361693 4919 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-5wz82" podUID="a525725f-407a-4e99-96a1-a0eaba714487" containerName="ovs-vswitchd" Mar 10 22:15:45 crc kubenswrapper[4919]: E0310 22:15:45.362108 4919 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 85e85aa8a7e78a2f2c6fc7044ebf9c1dcb554abd0c952c366102b9a1f2fa0880 is running failed: container process not found" containerID="85e85aa8a7e78a2f2c6fc7044ebf9c1dcb554abd0c952c366102b9a1f2fa0880" 
cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 10 22:15:45 crc kubenswrapper[4919]: E0310 22:15:45.362132 4919 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 85e85aa8a7e78a2f2c6fc7044ebf9c1dcb554abd0c952c366102b9a1f2fa0880 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-5wz82" podUID="a525725f-407a-4e99-96a1-a0eaba714487" containerName="ovsdb-server" Mar 10 22:15:45 crc kubenswrapper[4919]: E0310 22:15:45.364904 4919 secret.go:188] Couldn't get secret openstack/neutron-httpd-config: secret "neutron-httpd-config" not found Mar 10 22:15:45 crc kubenswrapper[4919]: E0310 22:15:45.364985 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0a44bcbb-6e2e-48bb-b7a7-16a4e916001d-httpd-config podName:0a44bcbb-6e2e-48bb-b7a7-16a4e916001d nodeName:}" failed. No retries permitted until 2026-03-10 22:15:47.364966329 +0000 UTC m=+1534.606846937 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "httpd-config" (UniqueName: "kubernetes.io/secret/0a44bcbb-6e2e-48bb-b7a7-16a4e916001d-httpd-config") pod "neutron-846dbc6cd5-kg4kx" (UID: "0a44bcbb-6e2e-48bb-b7a7-16a4e916001d") : secret "neutron-httpd-config" not found Mar 10 22:15:45 crc kubenswrapper[4919]: E0310 22:15:45.365517 4919 secret.go:188] Couldn't get secret openstack/neutron-config: secret "neutron-config" not found Mar 10 22:15:45 crc kubenswrapper[4919]: E0310 22:15:45.365651 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0a44bcbb-6e2e-48bb-b7a7-16a4e916001d-config podName:0a44bcbb-6e2e-48bb-b7a7-16a4e916001d nodeName:}" failed. No retries permitted until 2026-03-10 22:15:47.365634437 +0000 UTC m=+1534.607515055 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/secret/0a44bcbb-6e2e-48bb-b7a7-16a4e916001d-config") pod "neutron-846dbc6cd5-kg4kx" (UID: "0a44bcbb-6e2e-48bb-b7a7-16a4e916001d") : secret "neutron-config" not found Mar 10 22:15:45 crc kubenswrapper[4919]: I0310 22:15:45.389781 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 10 22:15:45 crc kubenswrapper[4919]: I0310 22:15:45.442857 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-ea40-account-create-update-7bp9g"] Mar 10 22:15:45 crc kubenswrapper[4919]: E0310 22:15:45.464725 4919 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 10 22:15:45 crc kubenswrapper[4919]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:763d1f1e8a1cf877c151c59609960fd2fa29e7e50001f8818122a2d51878befa,Command:[/bin/sh -c #!/bin/bash Mar 10 22:15:45 crc kubenswrapper[4919]: Mar 10 22:15:45 crc kubenswrapper[4919]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Mar 10 22:15:45 crc kubenswrapper[4919]: Mar 10 22:15:45 crc kubenswrapper[4919]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Mar 10 22:15:45 crc kubenswrapper[4919]: Mar 10 22:15:45 crc kubenswrapper[4919]: MYSQL_CMD="mysql -h -u root -P 3306" Mar 10 22:15:45 crc kubenswrapper[4919]: Mar 10 22:15:45 crc kubenswrapper[4919]: if [ -n "nova_cell1" ]; then Mar 10 22:15:45 crc kubenswrapper[4919]: GRANT_DATABASE="nova_cell1" Mar 10 22:15:45 crc kubenswrapper[4919]: else Mar 10 22:15:45 crc kubenswrapper[4919]: GRANT_DATABASE="*" Mar 10 22:15:45 crc kubenswrapper[4919]: fi Mar 10 22:15:45 crc kubenswrapper[4919]: Mar 10 22:15:45 crc kubenswrapper[4919]: # going for maximum compatibility here: Mar 10 22:15:45 crc kubenswrapper[4919]: # 1. 
MySQL 8 no longer allows implicit create user when GRANT is used Mar 10 22:15:45 crc kubenswrapper[4919]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Mar 10 22:15:45 crc kubenswrapper[4919]: # 3. create user with CREATE but then do all password and TLS with ALTER to Mar 10 22:15:45 crc kubenswrapper[4919]: # support updates Mar 10 22:15:45 crc kubenswrapper[4919]: Mar 10 22:15:45 crc kubenswrapper[4919]: $MYSQL_CMD < logger="UnhandledError" Mar 10 22:15:45 crc kubenswrapper[4919]: E0310 22:15:45.467434 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"nova-cell1-db-secret\\\" not found\"" pod="openstack/nova-cell1-ea40-account-create-update-7bp9g" podUID="cdcceccb-6413-4cf8-972e-7744b99c626e" Mar 10 22:15:45 crc kubenswrapper[4919]: I0310 22:15:45.520252 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a8d8a3d-169b-4fea-9848-b8998625b1d2" path="/var/lib/kubelet/pods/0a8d8a3d-169b-4fea-9848-b8998625b1d2/volumes" Mar 10 22:15:45 crc kubenswrapper[4919]: I0310 22:15:45.520972 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c190e44-b111-4a65-9700-d0255aa11800" path="/var/lib/kubelet/pods/0c190e44-b111-4a65-9700-d0255aa11800/volumes" Mar 10 22:15:45 crc kubenswrapper[4919]: I0310 22:15:45.521623 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bad7d2b-c98f-49e7-86a8-3467f75830f2" path="/var/lib/kubelet/pods/1bad7d2b-c98f-49e7-86a8-3467f75830f2/volumes" Mar 10 22:15:45 crc kubenswrapper[4919]: I0310 22:15:45.522223 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4512eb0d-2445-4ab8-833d-80f0500243b6" path="/var/lib/kubelet/pods/4512eb0d-2445-4ab8-833d-80f0500243b6/volumes" Mar 10 22:15:45 crc kubenswrapper[4919]: I0310 22:15:45.523722 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="6bff1404-f9b1-48f8-b093-95c3bb206c6a" path="/var/lib/kubelet/pods/6bff1404-f9b1-48f8-b093-95c3bb206c6a/volumes" Mar 10 22:15:45 crc kubenswrapper[4919]: I0310 22:15:45.524507 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="765c20cf-cede-45c6-867e-c3fa0749238d" path="/var/lib/kubelet/pods/765c20cf-cede-45c6-867e-c3fa0749238d/volumes" Mar 10 22:15:45 crc kubenswrapper[4919]: I0310 22:15:45.525141 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c25ebb1f-15ad-48b3-b7a4-7bdb1fd40b88" path="/var/lib/kubelet/pods/c25ebb1f-15ad-48b3-b7a4-7bdb1fd40b88/volumes" Mar 10 22:15:45 crc kubenswrapper[4919]: I0310 22:15:45.526461 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d891cb6e-7d23-40d0-9fd4-28ab980f207c" path="/var/lib/kubelet/pods/d891cb6e-7d23-40d0-9fd4-28ab980f207c/volumes" Mar 10 22:15:45 crc kubenswrapper[4919]: I0310 22:15:45.527908 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da024d80-ca57-41a6-b46a-508015462b2d" path="/var/lib/kubelet/pods/da024d80-ca57-41a6-b46a-508015462b2d/volumes" Mar 10 22:15:45 crc kubenswrapper[4919]: E0310 22:15:45.550479 4919 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 10 22:15:45 crc kubenswrapper[4919]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:763d1f1e8a1cf877c151c59609960fd2fa29e7e50001f8818122a2d51878befa,Command:[/bin/sh -c #!/bin/bash Mar 10 22:15:45 crc kubenswrapper[4919]: Mar 10 22:15:45 crc kubenswrapper[4919]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Mar 10 22:15:45 crc kubenswrapper[4919]: Mar 10 22:15:45 crc kubenswrapper[4919]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Mar 10 22:15:45 crc kubenswrapper[4919]: Mar 10 22:15:45 crc kubenswrapper[4919]: MYSQL_CMD="mysql -h -u root -P 3306" Mar 10 22:15:45 crc kubenswrapper[4919]: Mar 10 
22:15:45 crc kubenswrapper[4919]: if [ -n "nova_api" ]; then Mar 10 22:15:45 crc kubenswrapper[4919]: GRANT_DATABASE="nova_api" Mar 10 22:15:45 crc kubenswrapper[4919]: else Mar 10 22:15:45 crc kubenswrapper[4919]: GRANT_DATABASE="*" Mar 10 22:15:45 crc kubenswrapper[4919]: fi Mar 10 22:15:45 crc kubenswrapper[4919]: Mar 10 22:15:45 crc kubenswrapper[4919]: # going for maximum compatibility here: Mar 10 22:15:45 crc kubenswrapper[4919]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Mar 10 22:15:45 crc kubenswrapper[4919]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Mar 10 22:15:45 crc kubenswrapper[4919]: # 3. create user with CREATE but then do all password and TLS with ALTER to Mar 10 22:15:45 crc kubenswrapper[4919]: # support updates Mar 10 22:15:45 crc kubenswrapper[4919]: Mar 10 22:15:45 crc kubenswrapper[4919]: $MYSQL_CMD < logger="UnhandledError" Mar 10 22:15:45 crc kubenswrapper[4919]: E0310 22:15:45.553312 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"nova-api-db-secret\\\" not found\"" pod="openstack/nova-api-cdcf-account-create-update-qh8rx" podUID="9d5850b9-d946-4b1a-9171-718243c78596" Mar 10 22:15:45 crc kubenswrapper[4919]: E0310 22:15:45.565578 4919 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 10 22:15:45 crc kubenswrapper[4919]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:763d1f1e8a1cf877c151c59609960fd2fa29e7e50001f8818122a2d51878befa,Command:[/bin/sh -c #!/bin/bash Mar 10 22:15:45 crc kubenswrapper[4919]: Mar 10 22:15:45 crc kubenswrapper[4919]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Mar 10 22:15:45 crc kubenswrapper[4919]: Mar 10 22:15:45 crc kubenswrapper[4919]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Mar 10 
22:15:45 crc kubenswrapper[4919]: Mar 10 22:15:45 crc kubenswrapper[4919]: MYSQL_CMD="mysql -h -u root -P 3306" Mar 10 22:15:45 crc kubenswrapper[4919]: Mar 10 22:15:45 crc kubenswrapper[4919]: if [ -n "nova_cell0" ]; then Mar 10 22:15:45 crc kubenswrapper[4919]: GRANT_DATABASE="nova_cell0" Mar 10 22:15:45 crc kubenswrapper[4919]: else Mar 10 22:15:45 crc kubenswrapper[4919]: GRANT_DATABASE="*" Mar 10 22:15:45 crc kubenswrapper[4919]: fi Mar 10 22:15:45 crc kubenswrapper[4919]: Mar 10 22:15:45 crc kubenswrapper[4919]: # going for maximum compatibility here: Mar 10 22:15:45 crc kubenswrapper[4919]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Mar 10 22:15:45 crc kubenswrapper[4919]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Mar 10 22:15:45 crc kubenswrapper[4919]: # 3. create user with CREATE but then do all password and TLS with ALTER to Mar 10 22:15:45 crc kubenswrapper[4919]: # support updates Mar 10 22:15:45 crc kubenswrapper[4919]: Mar 10 22:15:45 crc kubenswrapper[4919]: $MYSQL_CMD < logger="UnhandledError" Mar 10 22:15:45 crc kubenswrapper[4919]: E0310 22:15:45.566778 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"nova-cell0-db-secret\\\" not found\"" pod="openstack/nova-cell0-dff6-account-create-update-8l4wr" podUID="a946243a-c6de-4499-9c3c-c11073d02f8e" Mar 10 22:15:45 crc kubenswrapper[4919]: I0310 22:15:45.569632 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/aff6348f-a0cf-4b67-a072-edcde9dcb3c4-nova-novncproxy-tls-certs\") pod \"aff6348f-a0cf-4b67-a072-edcde9dcb3c4\" (UID: \"aff6348f-a0cf-4b67-a072-edcde9dcb3c4\") " Mar 10 22:15:45 crc kubenswrapper[4919]: I0310 22:15:45.569682 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/aff6348f-a0cf-4b67-a072-edcde9dcb3c4-config-data\") pod \"aff6348f-a0cf-4b67-a072-edcde9dcb3c4\" (UID: \"aff6348f-a0cf-4b67-a072-edcde9dcb3c4\") " Mar 10 22:15:45 crc kubenswrapper[4919]: I0310 22:15:45.569731 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wv75l\" (UniqueName: \"kubernetes.io/projected/aff6348f-a0cf-4b67-a072-edcde9dcb3c4-kube-api-access-wv75l\") pod \"aff6348f-a0cf-4b67-a072-edcde9dcb3c4\" (UID: \"aff6348f-a0cf-4b67-a072-edcde9dcb3c4\") " Mar 10 22:15:45 crc kubenswrapper[4919]: I0310 22:15:45.569814 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/aff6348f-a0cf-4b67-a072-edcde9dcb3c4-vencrypt-tls-certs\") pod \"aff6348f-a0cf-4b67-a072-edcde9dcb3c4\" (UID: \"aff6348f-a0cf-4b67-a072-edcde9dcb3c4\") " Mar 10 22:15:45 crc kubenswrapper[4919]: I0310 22:15:45.569861 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aff6348f-a0cf-4b67-a072-edcde9dcb3c4-combined-ca-bundle\") pod \"aff6348f-a0cf-4b67-a072-edcde9dcb3c4\" (UID: \"aff6348f-a0cf-4b67-a072-edcde9dcb3c4\") " Mar 10 22:15:45 crc kubenswrapper[4919]: I0310 22:15:45.576046 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aff6348f-a0cf-4b67-a072-edcde9dcb3c4-kube-api-access-wv75l" (OuterVolumeSpecName: "kube-api-access-wv75l") pod "aff6348f-a0cf-4b67-a072-edcde9dcb3c4" (UID: "aff6348f-a0cf-4b67-a072-edcde9dcb3c4"). InnerVolumeSpecName "kube-api-access-wv75l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 22:15:45 crc kubenswrapper[4919]: I0310 22:15:45.616973 4919 generic.go:334] "Generic (PLEG): container finished" podID="aff6348f-a0cf-4b67-a072-edcde9dcb3c4" containerID="d56c4da656d1a27e68d9c323923f33760324c2a4c5d3e0641f9e0a743d95e2fc" exitCode=0 Mar 10 22:15:45 crc kubenswrapper[4919]: I0310 22:15:45.617325 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 10 22:15:45 crc kubenswrapper[4919]: I0310 22:15:45.628320 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-wgnf2_e61256d0-9ca6-4524-a19d-7efd32ab9724/openstack-network-exporter/0.log" Mar 10 22:15:45 crc kubenswrapper[4919]: I0310 22:15:45.628607 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-wgnf2" Mar 10 22:15:45 crc kubenswrapper[4919]: E0310 22:15:45.657619 4919 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 10 22:15:45 crc kubenswrapper[4919]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:763d1f1e8a1cf877c151c59609960fd2fa29e7e50001f8818122a2d51878befa,Command:[/bin/sh -c #!/bin/bash Mar 10 22:15:45 crc kubenswrapper[4919]: Mar 10 22:15:45 crc kubenswrapper[4919]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Mar 10 22:15:45 crc kubenswrapper[4919]: Mar 10 22:15:45 crc kubenswrapper[4919]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Mar 10 22:15:45 crc kubenswrapper[4919]: Mar 10 22:15:45 crc kubenswrapper[4919]: MYSQL_CMD="mysql -h -u root -P 3306" Mar 10 22:15:45 crc kubenswrapper[4919]: Mar 10 22:15:45 crc kubenswrapper[4919]: if [ -n "neutron" ]; then Mar 10 22:15:45 crc kubenswrapper[4919]: GRANT_DATABASE="neutron" Mar 10 22:15:45 crc kubenswrapper[4919]: else Mar 10 22:15:45 crc 
kubenswrapper[4919]: GRANT_DATABASE="*" Mar 10 22:15:45 crc kubenswrapper[4919]: fi Mar 10 22:15:45 crc kubenswrapper[4919]: Mar 10 22:15:45 crc kubenswrapper[4919]: # going for maximum compatibility here: Mar 10 22:15:45 crc kubenswrapper[4919]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Mar 10 22:15:45 crc kubenswrapper[4919]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Mar 10 22:15:45 crc kubenswrapper[4919]: # 3. create user with CREATE but then do all password and TLS with ALTER to Mar 10 22:15:45 crc kubenswrapper[4919]: # support updates Mar 10 22:15:45 crc kubenswrapper[4919]: Mar 10 22:15:45 crc kubenswrapper[4919]: $MYSQL_CMD < logger="UnhandledError" Mar 10 22:15:45 crc kubenswrapper[4919]: I0310 22:15:45.657920 4919 generic.go:334] "Generic (PLEG): container finished" podID="37ef9179-69db-49ab-a4e6-2e2b815fc260" containerID="3a27cd1d1040a8d26e69883aaeb7f8132c3852f88812a287ec745156fd408f80" exitCode=0 Mar 10 22:15:45 crc kubenswrapper[4919]: E0310 22:15:45.659180 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"neutron-db-secret\\\" not found\"" pod="openstack/neutron-45c6-account-create-update-zczhm" podUID="1c34af84-e5e2-4219-b7b2-bf1e2c2a731b" Mar 10 22:15:45 crc kubenswrapper[4919]: I0310 22:15:45.676677 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aff6348f-a0cf-4b67-a072-edcde9dcb3c4-vencrypt-tls-certs" (OuterVolumeSpecName: "vencrypt-tls-certs") pod "aff6348f-a0cf-4b67-a072-edcde9dcb3c4" (UID: "aff6348f-a0cf-4b67-a072-edcde9dcb3c4"). InnerVolumeSpecName "vencrypt-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:15:45 crc kubenswrapper[4919]: I0310 22:15:45.680408 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/aff6348f-a0cf-4b67-a072-edcde9dcb3c4-vencrypt-tls-certs\") pod \"aff6348f-a0cf-4b67-a072-edcde9dcb3c4\" (UID: \"aff6348f-a0cf-4b67-a072-edcde9dcb3c4\") " Mar 10 22:15:45 crc kubenswrapper[4919]: W0310 22:15:45.683435 4919 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/aff6348f-a0cf-4b67-a072-edcde9dcb3c4/volumes/kubernetes.io~secret/vencrypt-tls-certs Mar 10 22:15:45 crc kubenswrapper[4919]: I0310 22:15:45.683463 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aff6348f-a0cf-4b67-a072-edcde9dcb3c4-vencrypt-tls-certs" (OuterVolumeSpecName: "vencrypt-tls-certs") pod "aff6348f-a0cf-4b67-a072-edcde9dcb3c4" (UID: "aff6348f-a0cf-4b67-a072-edcde9dcb3c4"). InnerVolumeSpecName "vencrypt-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:15:45 crc kubenswrapper[4919]: I0310 22:15:45.686756 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-dff6-account-create-update-8l4wr" event={"ID":"a946243a-c6de-4499-9c3c-c11073d02f8e","Type":"ContainerStarted","Data":"500d43b92e366140a950293fc3306004e89295d8761ca2fbdf0c5d6c84dc907e"} Mar 10 22:15:45 crc kubenswrapper[4919]: I0310 22:15:45.686797 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-cdcf-account-create-update-qh8rx" event={"ID":"9d5850b9-d946-4b1a-9171-718243c78596","Type":"ContainerStarted","Data":"15b9ed1342d63b43062e26908b90d75f1071fec02128f34a156bebf7ab641c39"} Mar 10 22:15:45 crc kubenswrapper[4919]: I0310 22:15:45.686810 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"aff6348f-a0cf-4b67-a072-edcde9dcb3c4","Type":"ContainerDied","Data":"d56c4da656d1a27e68d9c323923f33760324c2a4c5d3e0641f9e0a743d95e2fc"} Mar 10 22:15:45 crc kubenswrapper[4919]: I0310 22:15:45.686826 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-cdcf-account-create-update-qh8rx"] Mar 10 22:15:45 crc kubenswrapper[4919]: I0310 22:15:45.686840 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"aff6348f-a0cf-4b67-a072-edcde9dcb3c4","Type":"ContainerDied","Data":"6550cd34ff7b7082c837f1cb6d134519d4a94b56191ff750893f0bcc73b52c00"} Mar 10 22:15:45 crc kubenswrapper[4919]: I0310 22:15:45.686849 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-dff6-account-create-update-8l4wr"] Mar 10 22:15:45 crc kubenswrapper[4919]: I0310 22:15:45.686859 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-wgnf2" event={"ID":"e61256d0-9ca6-4524-a19d-7efd32ab9724","Type":"ContainerDied","Data":"9c5bddf690b8b0b70c899f52c552126c69a3e2fab725a171603e151e6b5627c5"} Mar 10 
22:15:45 crc kubenswrapper[4919]: I0310 22:15:45.686870 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-45c6-account-create-update-zczhm"] Mar 10 22:15:45 crc kubenswrapper[4919]: I0310 22:15:45.686879 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"37ef9179-69db-49ab-a4e6-2e2b815fc260","Type":"ContainerDied","Data":"3a27cd1d1040a8d26e69883aaeb7f8132c3852f88812a287ec745156fd408f80"} Mar 10 22:15:45 crc kubenswrapper[4919]: I0310 22:15:45.686897 4919 scope.go:117] "RemoveContainer" containerID="d56c4da656d1a27e68d9c323923f33760324c2a4c5d3e0641f9e0a743d95e2fc" Mar 10 22:15:45 crc kubenswrapper[4919]: I0310 22:15:45.691004 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wv75l\" (UniqueName: \"kubernetes.io/projected/aff6348f-a0cf-4b67-a072-edcde9dcb3c4-kube-api-access-wv75l\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:45 crc kubenswrapper[4919]: I0310 22:15:45.691071 4919 reconciler_common.go:293] "Volume detached for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/aff6348f-a0cf-4b67-a072-edcde9dcb3c4-vencrypt-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:45 crc kubenswrapper[4919]: I0310 22:15:45.695851 4919 generic.go:334] "Generic (PLEG): container finished" podID="d0f89c3b-5242-409b-a318-5b69410e9680" containerID="704f926a26eb1f83943af82f73395b8827735ef4003ab0f0d955ebb8116c59f5" exitCode=0 Mar 10 22:15:45 crc kubenswrapper[4919]: I0310 22:15:45.695938 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-fbf4c94d9-4mg9b" event={"ID":"d0f89c3b-5242-409b-a318-5b69410e9680","Type":"ContainerDied","Data":"704f926a26eb1f83943af82f73395b8827735ef4003ab0f0d955ebb8116c59f5"} Mar 10 22:15:45 crc kubenswrapper[4919]: I0310 22:15:45.703269 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aff6348f-a0cf-4b67-a072-edcde9dcb3c4-combined-ca-bundle" 
(OuterVolumeSpecName: "combined-ca-bundle") pod "aff6348f-a0cf-4b67-a072-edcde9dcb3c4" (UID: "aff6348f-a0cf-4b67-a072-edcde9dcb3c4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:15:45 crc kubenswrapper[4919]: E0310 22:15:45.703997 4919 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 10 22:15:45 crc kubenswrapper[4919]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:763d1f1e8a1cf877c151c59609960fd2fa29e7e50001f8818122a2d51878befa,Command:[/bin/sh -c #!/bin/bash Mar 10 22:15:45 crc kubenswrapper[4919]: Mar 10 22:15:45 crc kubenswrapper[4919]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Mar 10 22:15:45 crc kubenswrapper[4919]: Mar 10 22:15:45 crc kubenswrapper[4919]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Mar 10 22:15:45 crc kubenswrapper[4919]: Mar 10 22:15:45 crc kubenswrapper[4919]: MYSQL_CMD="mysql -h -u root -P 3306" Mar 10 22:15:45 crc kubenswrapper[4919]: Mar 10 22:15:45 crc kubenswrapper[4919]: if [ -n "glance" ]; then Mar 10 22:15:45 crc kubenswrapper[4919]: GRANT_DATABASE="glance" Mar 10 22:15:45 crc kubenswrapper[4919]: else Mar 10 22:15:45 crc kubenswrapper[4919]: GRANT_DATABASE="*" Mar 10 22:15:45 crc kubenswrapper[4919]: fi Mar 10 22:15:45 crc kubenswrapper[4919]: Mar 10 22:15:45 crc kubenswrapper[4919]: # going for maximum compatibility here: Mar 10 22:15:45 crc kubenswrapper[4919]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Mar 10 22:15:45 crc kubenswrapper[4919]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Mar 10 22:15:45 crc kubenswrapper[4919]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Mar 10 22:15:45 crc kubenswrapper[4919]: # support updates Mar 10 22:15:45 crc kubenswrapper[4919]: Mar 10 22:15:45 crc kubenswrapper[4919]: $MYSQL_CMD < logger="UnhandledError" Mar 10 22:15:45 crc kubenswrapper[4919]: E0310 22:15:45.705242 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"glance-db-secret\\\" not found\"" pod="openstack/glance-9e8c-account-create-update-6q4j2" podUID="a01f4397-9fee-4ff4-af76-ed0b37f04b28" Mar 10 22:15:45 crc kubenswrapper[4919]: I0310 22:15:45.715460 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-9e8c-account-create-update-6q4j2"] Mar 10 22:15:45 crc kubenswrapper[4919]: I0310 22:15:45.720942 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aff6348f-a0cf-4b67-a072-edcde9dcb3c4-config-data" (OuterVolumeSpecName: "config-data") pod "aff6348f-a0cf-4b67-a072-edcde9dcb3c4" (UID: "aff6348f-a0cf-4b67-a072-edcde9dcb3c4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:15:45 crc kubenswrapper[4919]: I0310 22:15:45.741902 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aff6348f-a0cf-4b67-a072-edcde9dcb3c4-nova-novncproxy-tls-certs" (OuterVolumeSpecName: "nova-novncproxy-tls-certs") pod "aff6348f-a0cf-4b67-a072-edcde9dcb3c4" (UID: "aff6348f-a0cf-4b67-a072-edcde9dcb3c4"). InnerVolumeSpecName "nova-novncproxy-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:15:45 crc kubenswrapper[4919]: I0310 22:15:45.743505 4919 generic.go:334] "Generic (PLEG): container finished" podID="76a514a0-0d4c-4f6b-8ba7-cd5b4834d625" containerID="090cce7d6d9affd1cfe7890b42642492b8cbefcf21721a4ef0b83c1e4db42a0c" exitCode=1 Mar 10 22:15:45 crc kubenswrapper[4919]: I0310 22:15:45.743566 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-xlzz4" event={"ID":"76a514a0-0d4c-4f6b-8ba7-cd5b4834d625","Type":"ContainerDied","Data":"090cce7d6d9affd1cfe7890b42642492b8cbefcf21721a4ef0b83c1e4db42a0c"} Mar 10 22:15:45 crc kubenswrapper[4919]: I0310 22:15:45.743592 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-xlzz4" event={"ID":"76a514a0-0d4c-4f6b-8ba7-cd5b4834d625","Type":"ContainerStarted","Data":"2d86ae89ca9eacf088e2483fdc8b7fc27d54e8d4395301cbc4733433a8296bf1"} Mar 10 22:15:45 crc kubenswrapper[4919]: I0310 22:15:45.744095 4919 scope.go:117] "RemoveContainer" containerID="090cce7d6d9affd1cfe7890b42642492b8cbefcf21721a4ef0b83c1e4db42a0c" Mar 10 22:15:45 crc kubenswrapper[4919]: I0310 22:15:45.753859 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 10 22:15:45 crc kubenswrapper[4919]: I0310 22:15:45.762263 4919 generic.go:334] "Generic (PLEG): container finished" podID="0a44bcbb-6e2e-48bb-b7a7-16a4e916001d" containerID="9cdb7599c01cdc95ab93aaa9cd850cf9d1c5bc23e81939b0310b3ed9a9214bc6" exitCode=0 Mar 10 22:15:45 crc kubenswrapper[4919]: I0310 22:15:45.762370 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-846dbc6cd5-kg4kx" event={"ID":"0a44bcbb-6e2e-48bb-b7a7-16a4e916001d","Type":"ContainerDied","Data":"9cdb7599c01cdc95ab93aaa9cd850cf9d1c5bc23e81939b0310b3ed9a9214bc6"} Mar 10 22:15:45 crc kubenswrapper[4919]: I0310 22:15:45.769297 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_8782e985-ff23-4580-bdfb-ef2dd9b540bc/ovsdbserver-sb/0.log" Mar 10 22:15:45 crc kubenswrapper[4919]: I0310 22:15:45.769437 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"8782e985-ff23-4580-bdfb-ef2dd9b540bc","Type":"ContainerDied","Data":"499235ca0c8240a2e542faae80a04221ced8eed0d502e7f5ba10a70192104c54"} Mar 10 22:15:45 crc kubenswrapper[4919]: I0310 22:15:45.769721 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 10 22:15:45 crc kubenswrapper[4919]: I0310 22:15:45.775525 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-8e40-account-create-update-kkddk" event={"ID":"ec49f65c-e8af-44a1-b464-af2a86b299fc","Type":"ContainerStarted","Data":"dd5815edd58117899f7992441a905b99e59e8c8dfff238ce799f6fdbe61eeec9"} Mar 10 22:15:45 crc kubenswrapper[4919]: I0310 22:15:45.792714 4919 generic.go:334] "Generic (PLEG): container finished" podID="b1f5a3b8-c9ca-403a-aecf-f6fbf286b145" containerID="3b0a6033d190b7300fe815ac8e78922471798cec143251073885cde5a79cf846" exitCode=0 Mar 10 22:15:45 crc kubenswrapper[4919]: I0310 22:15:45.794275 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"b1f5a3b8-c9ca-403a-aecf-f6fbf286b145","Type":"ContainerDied","Data":"3b0a6033d190b7300fe815ac8e78922471798cec143251073885cde5a79cf846"} Mar 10 22:15:45 crc kubenswrapper[4919]: I0310 22:15:45.798305 4919 generic.go:334] "Generic (PLEG): container finished" podID="7c9bf9de-7c1a-436e-a8a0-6c987b34a5e0" containerID="46d7c27293d48d5ebbae63a3af00729e46f409eb8eff14da8312c3010d4b3df5" exitCode=0 Mar 10 22:15:45 crc kubenswrapper[4919]: I0310 22:15:45.798374 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-68449cb44c-wmmzf" event={"ID":"7c9bf9de-7c1a-436e-a8a0-6c987b34a5e0","Type":"ContainerDied","Data":"46d7c27293d48d5ebbae63a3af00729e46f409eb8eff14da8312c3010d4b3df5"} Mar 10 22:15:45 crc kubenswrapper[4919]: I0310 22:15:45.798429 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-68449cb44c-wmmzf" event={"ID":"7c9bf9de-7c1a-436e-a8a0-6c987b34a5e0","Type":"ContainerDied","Data":"86bd79f50ff0d5fef2fbc0b8bd95d50c28ad34e052fd53736b71a374fa81dc64"} Mar 10 22:15:45 crc kubenswrapper[4919]: I0310 22:15:45.798489 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-68449cb44c-wmmzf" Mar 10 22:15:45 crc kubenswrapper[4919]: I0310 22:15:45.800268 4919 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aff6348f-a0cf-4b67-a072-edcde9dcb3c4-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:45 crc kubenswrapper[4919]: I0310 22:15:45.800373 4919 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aff6348f-a0cf-4b67-a072-edcde9dcb3c4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:45 crc kubenswrapper[4919]: I0310 22:15:45.800512 4919 reconciler_common.go:293] "Volume detached for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/aff6348f-a0cf-4b67-a072-edcde9dcb3c4-nova-novncproxy-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:45 crc kubenswrapper[4919]: I0310 22:15:45.802730 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-ea40-account-create-update-7bp9g" event={"ID":"cdcceccb-6413-4cf8-972e-7744b99c626e","Type":"ContainerStarted","Data":"1e9ea45ecb3be44973497c53f355da1f9cf805220e00fe1ac761a45cf7f731d6"} Mar 10 22:15:45 crc kubenswrapper[4919]: I0310 22:15:45.802876 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-9dd56c4d5-nbpbz" Mar 10 22:15:45 crc kubenswrapper[4919]: I0310 22:15:45.803049 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 10 22:15:45 crc kubenswrapper[4919]: I0310 22:15:45.836755 4919 scope.go:117] "RemoveContainer" containerID="d56c4da656d1a27e68d9c323923f33760324c2a4c5d3e0641f9e0a743d95e2fc" Mar 10 22:15:45 crc kubenswrapper[4919]: E0310 22:15:45.840030 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d56c4da656d1a27e68d9c323923f33760324c2a4c5d3e0641f9e0a743d95e2fc\": container with ID starting with d56c4da656d1a27e68d9c323923f33760324c2a4c5d3e0641f9e0a743d95e2fc not found: ID does not exist" containerID="d56c4da656d1a27e68d9c323923f33760324c2a4c5d3e0641f9e0a743d95e2fc" Mar 10 22:15:45 crc kubenswrapper[4919]: I0310 22:15:45.840071 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d56c4da656d1a27e68d9c323923f33760324c2a4c5d3e0641f9e0a743d95e2fc"} err="failed to get container status \"d56c4da656d1a27e68d9c323923f33760324c2a4c5d3e0641f9e0a743d95e2fc\": rpc error: code = NotFound desc = could not find container \"d56c4da656d1a27e68d9c323923f33760324c2a4c5d3e0641f9e0a743d95e2fc\": container with ID starting with d56c4da656d1a27e68d9c323923f33760324c2a4c5d3e0641f9e0a743d95e2fc not found: ID does not exist" Mar 10 22:15:45 crc kubenswrapper[4919]: I0310 22:15:45.840097 4919 scope.go:117] "RemoveContainer" containerID="1bb99faa0c9dbd8195614221f50fa8ce1965d14085b52684adacebddfa8030f0" Mar 10 22:15:45 crc kubenswrapper[4919]: I0310 22:15:45.907976 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-metrics-wgnf2"] Mar 10 22:15:45 crc kubenswrapper[4919]: I0310 22:15:45.915325 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-metrics-wgnf2"] Mar 10 22:15:45 crc kubenswrapper[4919]: I0310 22:15:45.917946 4919 scope.go:117] "RemoveContainer" containerID="9356369f3992f46c3635f399cef0e5e3d0db351590dc5b463547b642e4ea9e30" Mar 10 22:15:45 
crc kubenswrapper[4919]: I0310 22:15:45.945438 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-68449cb44c-wmmzf"] Mar 10 22:15:45 crc kubenswrapper[4919]: I0310 22:15:45.959486 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 10 22:15:45 crc kubenswrapper[4919]: I0310 22:15:45.970464 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-worker-68449cb44c-wmmzf"] Mar 10 22:15:45 crc kubenswrapper[4919]: I0310 22:15:45.979143 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 10 22:15:45 crc kubenswrapper[4919]: I0310 22:15:45.987482 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 10 22:15:46 crc kubenswrapper[4919]: I0310 22:15:46.004071 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 10 22:15:46 crc kubenswrapper[4919]: I0310 22:15:46.005147 4919 scope.go:117] "RemoveContainer" containerID="fa482cd6ab218644c93efa05d2493a56acc39a947142b9aa93cab897fbd8faee" Mar 10 22:15:46 crc kubenswrapper[4919]: I0310 22:15:46.025334 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 10 22:15:46 crc kubenswrapper[4919]: I0310 22:15:46.039464 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-9dd56c4d5-nbpbz"] Mar 10 22:15:46 crc kubenswrapper[4919]: I0310 22:15:46.048070 4919 scope.go:117] "RemoveContainer" containerID="27c978f4a20203cc94acb177ddbfd66f76de9f4ddcde70ca6c04fb027e8101f9" Mar 10 22:15:46 crc kubenswrapper[4919]: I0310 22:15:46.048197 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-9dd56c4d5-nbpbz"] Mar 10 22:15:46 crc kubenswrapper[4919]: I0310 22:15:46.059911 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-dff6-account-create-update-8l4wr" Mar 10 22:15:46 crc kubenswrapper[4919]: I0310 22:15:46.063310 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 10 22:15:46 crc kubenswrapper[4919]: I0310 22:15:46.072120 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 10 22:15:46 crc kubenswrapper[4919]: I0310 22:15:46.094092 4919 scope.go:117] "RemoveContainer" containerID="46d7c27293d48d5ebbae63a3af00729e46f409eb8eff14da8312c3010d4b3df5" Mar 10 22:15:46 crc kubenswrapper[4919]: I0310 22:15:46.106526 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"37ef9179-69db-49ab-a4e6-2e2b815fc260\" (UID: \"37ef9179-69db-49ab-a4e6-2e2b815fc260\") " Mar 10 22:15:46 crc kubenswrapper[4919]: I0310 22:15:46.106638 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/37ef9179-69db-49ab-a4e6-2e2b815fc260-config-data-generated\") pod \"37ef9179-69db-49ab-a4e6-2e2b815fc260\" (UID: \"37ef9179-69db-49ab-a4e6-2e2b815fc260\") " Mar 10 22:15:46 crc kubenswrapper[4919]: I0310 22:15:46.106660 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/37ef9179-69db-49ab-a4e6-2e2b815fc260-kolla-config\") pod \"37ef9179-69db-49ab-a4e6-2e2b815fc260\" (UID: \"37ef9179-69db-49ab-a4e6-2e2b815fc260\") " Mar 10 22:15:46 crc kubenswrapper[4919]: I0310 22:15:46.106713 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/37ef9179-69db-49ab-a4e6-2e2b815fc260-config-data-default\") pod \"37ef9179-69db-49ab-a4e6-2e2b815fc260\" (UID: \"37ef9179-69db-49ab-a4e6-2e2b815fc260\") " Mar 10 22:15:46 
crc kubenswrapper[4919]: I0310 22:15:46.106728 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/37ef9179-69db-49ab-a4e6-2e2b815fc260-operator-scripts\") pod \"37ef9179-69db-49ab-a4e6-2e2b815fc260\" (UID: \"37ef9179-69db-49ab-a4e6-2e2b815fc260\") " Mar 10 22:15:46 crc kubenswrapper[4919]: I0310 22:15:46.106772 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37ef9179-69db-49ab-a4e6-2e2b815fc260-combined-ca-bundle\") pod \"37ef9179-69db-49ab-a4e6-2e2b815fc260\" (UID: \"37ef9179-69db-49ab-a4e6-2e2b815fc260\") " Mar 10 22:15:46 crc kubenswrapper[4919]: I0310 22:15:46.106825 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/37ef9179-69db-49ab-a4e6-2e2b815fc260-galera-tls-certs\") pod \"37ef9179-69db-49ab-a4e6-2e2b815fc260\" (UID: \"37ef9179-69db-49ab-a4e6-2e2b815fc260\") " Mar 10 22:15:46 crc kubenswrapper[4919]: I0310 22:15:46.106852 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c85b9\" (UniqueName: \"kubernetes.io/projected/37ef9179-69db-49ab-a4e6-2e2b815fc260-kube-api-access-c85b9\") pod \"37ef9179-69db-49ab-a4e6-2e2b815fc260\" (UID: \"37ef9179-69db-49ab-a4e6-2e2b815fc260\") " Mar 10 22:15:46 crc kubenswrapper[4919]: I0310 22:15:46.108457 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/37ef9179-69db-49ab-a4e6-2e2b815fc260-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "37ef9179-69db-49ab-a4e6-2e2b815fc260" (UID: "37ef9179-69db-49ab-a4e6-2e2b815fc260"). InnerVolumeSpecName "config-data-default". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 22:15:46 crc kubenswrapper[4919]: I0310 22:15:46.109047 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/37ef9179-69db-49ab-a4e6-2e2b815fc260-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "37ef9179-69db-49ab-a4e6-2e2b815fc260" (UID: "37ef9179-69db-49ab-a4e6-2e2b815fc260"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 22:15:46 crc kubenswrapper[4919]: I0310 22:15:46.109132 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/37ef9179-69db-49ab-a4e6-2e2b815fc260-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "37ef9179-69db-49ab-a4e6-2e2b815fc260" (UID: "37ef9179-69db-49ab-a4e6-2e2b815fc260"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 22:15:46 crc kubenswrapper[4919]: I0310 22:15:46.109278 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/37ef9179-69db-49ab-a4e6-2e2b815fc260-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "37ef9179-69db-49ab-a4e6-2e2b815fc260" (UID: "37ef9179-69db-49ab-a4e6-2e2b815fc260"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 22:15:46 crc kubenswrapper[4919]: I0310 22:15:46.118013 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37ef9179-69db-49ab-a4e6-2e2b815fc260-kube-api-access-c85b9" (OuterVolumeSpecName: "kube-api-access-c85b9") pod "37ef9179-69db-49ab-a4e6-2e2b815fc260" (UID: "37ef9179-69db-49ab-a4e6-2e2b815fc260"). InnerVolumeSpecName "kube-api-access-c85b9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 22:15:46 crc kubenswrapper[4919]: I0310 22:15:46.139842 4919 scope.go:117] "RemoveContainer" containerID="3a14995c1656c2836f0b87135f426135188e582e6956faccde8a4adfc21e5fcd" Mar 10 22:15:46 crc kubenswrapper[4919]: I0310 22:15:46.147447 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "mysql-db") pod "37ef9179-69db-49ab-a4e6-2e2b815fc260" (UID: "37ef9179-69db-49ab-a4e6-2e2b815fc260"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 10 22:15:46 crc kubenswrapper[4919]: I0310 22:15:46.173502 4919 scope.go:117] "RemoveContainer" containerID="46d7c27293d48d5ebbae63a3af00729e46f409eb8eff14da8312c3010d4b3df5" Mar 10 22:15:46 crc kubenswrapper[4919]: E0310 22:15:46.174492 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"46d7c27293d48d5ebbae63a3af00729e46f409eb8eff14da8312c3010d4b3df5\": container with ID starting with 46d7c27293d48d5ebbae63a3af00729e46f409eb8eff14da8312c3010d4b3df5 not found: ID does not exist" containerID="46d7c27293d48d5ebbae63a3af00729e46f409eb8eff14da8312c3010d4b3df5" Mar 10 22:15:46 crc kubenswrapper[4919]: I0310 22:15:46.174516 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"46d7c27293d48d5ebbae63a3af00729e46f409eb8eff14da8312c3010d4b3df5"} err="failed to get container status \"46d7c27293d48d5ebbae63a3af00729e46f409eb8eff14da8312c3010d4b3df5\": rpc error: code = NotFound desc = could not find container \"46d7c27293d48d5ebbae63a3af00729e46f409eb8eff14da8312c3010d4b3df5\": container with ID starting with 46d7c27293d48d5ebbae63a3af00729e46f409eb8eff14da8312c3010d4b3df5 not found: ID does not exist" Mar 10 22:15:46 crc kubenswrapper[4919]: I0310 22:15:46.174537 4919 scope.go:117] "RemoveContainer" 
containerID="3a14995c1656c2836f0b87135f426135188e582e6956faccde8a4adfc21e5fcd" Mar 10 22:15:46 crc kubenswrapper[4919]: E0310 22:15:46.176699 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a14995c1656c2836f0b87135f426135188e582e6956faccde8a4adfc21e5fcd\": container with ID starting with 3a14995c1656c2836f0b87135f426135188e582e6956faccde8a4adfc21e5fcd not found: ID does not exist" containerID="3a14995c1656c2836f0b87135f426135188e582e6956faccde8a4adfc21e5fcd" Mar 10 22:15:46 crc kubenswrapper[4919]: I0310 22:15:46.176722 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a14995c1656c2836f0b87135f426135188e582e6956faccde8a4adfc21e5fcd"} err="failed to get container status \"3a14995c1656c2836f0b87135f426135188e582e6956faccde8a4adfc21e5fcd\": rpc error: code = NotFound desc = could not find container \"3a14995c1656c2836f0b87135f426135188e582e6956faccde8a4adfc21e5fcd\": container with ID starting with 3a14995c1656c2836f0b87135f426135188e582e6956faccde8a4adfc21e5fcd not found: ID does not exist" Mar 10 22:15:46 crc kubenswrapper[4919]: I0310 22:15:46.207455 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37ef9179-69db-49ab-a4e6-2e2b815fc260-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "37ef9179-69db-49ab-a4e6-2e2b815fc260" (UID: "37ef9179-69db-49ab-a4e6-2e2b815fc260"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:15:46 crc kubenswrapper[4919]: I0310 22:15:46.208969 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a946243a-c6de-4499-9c3c-c11073d02f8e-operator-scripts\") pod \"a946243a-c6de-4499-9c3c-c11073d02f8e\" (UID: \"a946243a-c6de-4499-9c3c-c11073d02f8e\") " Mar 10 22:15:46 crc kubenswrapper[4919]: I0310 22:15:46.209134 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qwxxg\" (UniqueName: \"kubernetes.io/projected/a946243a-c6de-4499-9c3c-c11073d02f8e-kube-api-access-qwxxg\") pod \"a946243a-c6de-4499-9c3c-c11073d02f8e\" (UID: \"a946243a-c6de-4499-9c3c-c11073d02f8e\") " Mar 10 22:15:46 crc kubenswrapper[4919]: I0310 22:15:46.209941 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a946243a-c6de-4499-9c3c-c11073d02f8e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a946243a-c6de-4499-9c3c-c11073d02f8e" (UID: "a946243a-c6de-4499-9c3c-c11073d02f8e"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 22:15:46 crc kubenswrapper[4919]: I0310 22:15:46.210312 4919 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/37ef9179-69db-49ab-a4e6-2e2b815fc260-config-data-default\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:46 crc kubenswrapper[4919]: I0310 22:15:46.210329 4919 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/37ef9179-69db-49ab-a4e6-2e2b815fc260-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:46 crc kubenswrapper[4919]: I0310 22:15:46.210341 4919 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37ef9179-69db-49ab-a4e6-2e2b815fc260-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:46 crc kubenswrapper[4919]: I0310 22:15:46.210352 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c85b9\" (UniqueName: \"kubernetes.io/projected/37ef9179-69db-49ab-a4e6-2e2b815fc260-kube-api-access-c85b9\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:46 crc kubenswrapper[4919]: I0310 22:15:46.210363 4919 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a946243a-c6de-4499-9c3c-c11073d02f8e-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:46 crc kubenswrapper[4919]: I0310 22:15:46.210384 4919 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Mar 10 22:15:46 crc kubenswrapper[4919]: I0310 22:15:46.210456 4919 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/37ef9179-69db-49ab-a4e6-2e2b815fc260-config-data-generated\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:46 crc 
kubenswrapper[4919]: I0310 22:15:46.210467 4919 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/37ef9179-69db-49ab-a4e6-2e2b815fc260-kolla-config\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:46 crc kubenswrapper[4919]: I0310 22:15:46.217239 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a946243a-c6de-4499-9c3c-c11073d02f8e-kube-api-access-qwxxg" (OuterVolumeSpecName: "kube-api-access-qwxxg") pod "a946243a-c6de-4499-9c3c-c11073d02f8e" (UID: "a946243a-c6de-4499-9c3c-c11073d02f8e"). InnerVolumeSpecName "kube-api-access-qwxxg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 22:15:46 crc kubenswrapper[4919]: I0310 22:15:46.234416 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-cdcf-account-create-update-qh8rx" Mar 10 22:15:46 crc kubenswrapper[4919]: I0310 22:15:46.237347 4919 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Mar 10 22:15:46 crc kubenswrapper[4919]: I0310 22:15:46.277716 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37ef9179-69db-49ab-a4e6-2e2b815fc260-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "37ef9179-69db-49ab-a4e6-2e2b815fc260" (UID: "37ef9179-69db-49ab-a4e6-2e2b815fc260"). InnerVolumeSpecName "galera-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:15:46 crc kubenswrapper[4919]: I0310 22:15:46.312792 4919 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/37ef9179-69db-49ab-a4e6-2e2b815fc260-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:46 crc kubenswrapper[4919]: I0310 22:15:46.312827 4919 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:46 crc kubenswrapper[4919]: I0310 22:15:46.312837 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qwxxg\" (UniqueName: \"kubernetes.io/projected/a946243a-c6de-4499-9c3c-c11073d02f8e-kube-api-access-qwxxg\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:46 crc kubenswrapper[4919]: I0310 22:15:46.414223 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-76m5t\" (UniqueName: \"kubernetes.io/projected/9d5850b9-d946-4b1a-9171-718243c78596-kube-api-access-76m5t\") pod \"9d5850b9-d946-4b1a-9171-718243c78596\" (UID: \"9d5850b9-d946-4b1a-9171-718243c78596\") " Mar 10 22:15:46 crc kubenswrapper[4919]: I0310 22:15:46.414582 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9d5850b9-d946-4b1a-9171-718243c78596-operator-scripts\") pod \"9d5850b9-d946-4b1a-9171-718243c78596\" (UID: \"9d5850b9-d946-4b1a-9171-718243c78596\") " Mar 10 22:15:46 crc kubenswrapper[4919]: E0310 22:15:46.415134 4919 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Mar 10 22:15:46 crc kubenswrapper[4919]: E0310 22:15:46.415195 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3fe05756-9202-4514-8eea-0c786a2b6d56-config-data 
podName:3fe05756-9202-4514-8eea-0c786a2b6d56 nodeName:}" failed. No retries permitted until 2026-03-10 22:15:50.415178434 +0000 UTC m=+1537.657059042 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/3fe05756-9202-4514-8eea-0c786a2b6d56-config-data") pod "rabbitmq-cell1-server-0" (UID: "3fe05756-9202-4514-8eea-0c786a2b6d56") : configmap "rabbitmq-cell1-config-data" not found Mar 10 22:15:46 crc kubenswrapper[4919]: I0310 22:15:46.416663 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d5850b9-d946-4b1a-9171-718243c78596-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9d5850b9-d946-4b1a-9171-718243c78596" (UID: "9d5850b9-d946-4b1a-9171-718243c78596"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 22:15:46 crc kubenswrapper[4919]: I0310 22:15:46.421825 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d5850b9-d946-4b1a-9171-718243c78596-kube-api-access-76m5t" (OuterVolumeSpecName: "kube-api-access-76m5t") pod "9d5850b9-d946-4b1a-9171-718243c78596" (UID: "9d5850b9-d946-4b1a-9171-718243c78596"). InnerVolumeSpecName "kube-api-access-76m5t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 22:15:46 crc kubenswrapper[4919]: I0310 22:15:46.518658 4919 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9d5850b9-d946-4b1a-9171-718243c78596-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:46 crc kubenswrapper[4919]: I0310 22:15:46.518687 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-76m5t\" (UniqueName: \"kubernetes.io/projected/9d5850b9-d946-4b1a-9171-718243c78596-kube-api-access-76m5t\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:46 crc kubenswrapper[4919]: I0310 22:15:46.554845 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 10 22:15:46 crc kubenswrapper[4919]: I0310 22:15:46.556795 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1f324194-64d5-4755-847b-f554b94e652c" containerName="sg-core" containerID="cri-o://c84f2ef693d15394b292c9104df5d25964c0ba501f0d8a5f3f3a4710e2c7b5e1" gracePeriod=30 Mar 10 22:15:46 crc kubenswrapper[4919]: I0310 22:15:46.557030 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1f324194-64d5-4755-847b-f554b94e652c" containerName="proxy-httpd" containerID="cri-o://8dd9c6db1ef3f3090c173c377cd48fad2c1b903961ef2c219973ba650ae92aa3" gracePeriod=30 Mar 10 22:15:46 crc kubenswrapper[4919]: I0310 22:15:46.557454 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1f324194-64d5-4755-847b-f554b94e652c" containerName="ceilometer-notification-agent" containerID="cri-o://4f27c36666ba7ecf2d24cedee59efe2a08c7b4e6c86f4fe4198918504c6bf578" gracePeriod=30 Mar 10 22:15:46 crc kubenswrapper[4919]: I0310 22:15:46.559668 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="1f324194-64d5-4755-847b-f554b94e652c" containerName="ceilometer-central-agent" containerID="cri-o://69fdc3c8e2ab199f6bb93d9e3a1a78edfb949241e41f07473825a631528f1dde" gracePeriod=30 Mar 10 22:15:46 crc kubenswrapper[4919]: I0310 22:15:46.609199 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 10 22:15:46 crc kubenswrapper[4919]: I0310 22:15:46.609874 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="7bfe342f-267a-4239-a9cc-8df0e3d14a92" containerName="kube-state-metrics" containerID="cri-o://22b51ffd48cd3ea852a7427f1dfc881e19ae51a0a5028680360ef0179dcc54e1" gracePeriod=30 Mar 10 22:15:46 crc kubenswrapper[4919]: I0310 22:15:46.812671 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/memcached-0"] Mar 10 22:15:46 crc kubenswrapper[4919]: I0310 22:15:46.813198 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/memcached-0" podUID="a4a88061-cba8-4535-bf01-5285d8cbb79f" containerName="memcached" containerID="cri-o://fb28d2d8eb9c98873f08b6f1830499a051d20df33a610f4a9e6624fa224475b0" gracePeriod=30 Mar 10 22:15:46 crc kubenswrapper[4919]: I0310 22:15:46.829544 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-dff6-account-create-update-8l4wr" event={"ID":"a946243a-c6de-4499-9c3c-c11073d02f8e","Type":"ContainerDied","Data":"500d43b92e366140a950293fc3306004e89295d8761ca2fbdf0c5d6c84dc907e"} Mar 10 22:15:46 crc kubenswrapper[4919]: I0310 22:15:46.829667 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-dff6-account-create-update-8l4wr"
Mar 10 22:15:46 crc kubenswrapper[4919]: I0310 22:15:46.874314 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bbb7-account-create-update-tn746"]
Mar 10 22:15:46 crc kubenswrapper[4919]: I0310 22:15:46.874754 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-ea40-account-create-update-7bp9g" event={"ID":"cdcceccb-6413-4cf8-972e-7744b99c626e","Type":"ContainerDied","Data":"1e9ea45ecb3be44973497c53f355da1f9cf805220e00fe1ac761a45cf7f731d6"}
Mar 10 22:15:46 crc kubenswrapper[4919]: I0310 22:15:46.874790 4919 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1e9ea45ecb3be44973497c53f355da1f9cf805220e00fe1ac761a45cf7f731d6"
Mar 10 22:15:46 crc kubenswrapper[4919]: I0310 22:15:46.904999 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-8e40-account-create-update-kkddk"
Mar 10 22:15:46 crc kubenswrapper[4919]: I0310 22:15:46.914480 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bbb7-account-create-update-pmxgp"]
Mar 10 22:15:46 crc kubenswrapper[4919]: E0310 22:15:46.914918 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c9bf9de-7c1a-436e-a8a0-6c987b34a5e0" containerName="barbican-worker-log"
Mar 10 22:15:46 crc kubenswrapper[4919]: I0310 22:15:46.914940 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c9bf9de-7c1a-436e-a8a0-6c987b34a5e0" containerName="barbican-worker-log"
Mar 10 22:15:46 crc kubenswrapper[4919]: E0310 22:15:46.914959 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8782e985-ff23-4580-bdfb-ef2dd9b540bc" containerName="ovsdbserver-sb"
Mar 10 22:15:46 crc kubenswrapper[4919]: I0310 22:15:46.914969 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="8782e985-ff23-4580-bdfb-ef2dd9b540bc" containerName="ovsdbserver-sb"
Mar 10 22:15:46 crc kubenswrapper[4919]: E0310 22:15:46.914978 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8782e985-ff23-4580-bdfb-ef2dd9b540bc" containerName="openstack-network-exporter"
Mar 10 22:15:46 crc kubenswrapper[4919]: I0310 22:15:46.914985 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="8782e985-ff23-4580-bdfb-ef2dd9b540bc" containerName="openstack-network-exporter"
Mar 10 22:15:46 crc kubenswrapper[4919]: E0310 22:15:46.915002 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e69198f-c4ad-40c4-b0f4-1a6e9dd17940" containerName="ovsdbserver-nb"
Mar 10 22:15:46 crc kubenswrapper[4919]: I0310 22:15:46.915010 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e69198f-c4ad-40c4-b0f4-1a6e9dd17940" containerName="ovsdbserver-nb"
Mar 10 22:15:46 crc kubenswrapper[4919]: E0310 22:15:46.915020 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37dac1c8-963f-466f-977e-37b2fd98d32c" containerName="init"
Mar 10 22:15:46 crc kubenswrapper[4919]: I0310 22:15:46.915026 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="37dac1c8-963f-466f-977e-37b2fd98d32c" containerName="init"
Mar 10 22:15:46 crc kubenswrapper[4919]: E0310 22:15:46.915039 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37dac1c8-963f-466f-977e-37b2fd98d32c" containerName="dnsmasq-dns"
Mar 10 22:15:46 crc kubenswrapper[4919]: I0310 22:15:46.915047 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="37dac1c8-963f-466f-977e-37b2fd98d32c" containerName="dnsmasq-dns"
Mar 10 22:15:46 crc kubenswrapper[4919]: E0310 22:15:46.915056 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37ef9179-69db-49ab-a4e6-2e2b815fc260" containerName="galera"
Mar 10 22:15:46 crc kubenswrapper[4919]: I0310 22:15:46.915063 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="37ef9179-69db-49ab-a4e6-2e2b815fc260" containerName="galera"
Mar 10 22:15:46 crc kubenswrapper[4919]: E0310 22:15:46.915079 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e69198f-c4ad-40c4-b0f4-1a6e9dd17940" containerName="openstack-network-exporter"
Mar 10 22:15:46 crc kubenswrapper[4919]: I0310 22:15:46.915087 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e69198f-c4ad-40c4-b0f4-1a6e9dd17940" containerName="openstack-network-exporter"
Mar 10 22:15:46 crc kubenswrapper[4919]: E0310 22:15:46.915104 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aff6348f-a0cf-4b67-a072-edcde9dcb3c4" containerName="nova-cell1-novncproxy-novncproxy"
Mar 10 22:15:46 crc kubenswrapper[4919]: I0310 22:15:46.915112 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="aff6348f-a0cf-4b67-a072-edcde9dcb3c4" containerName="nova-cell1-novncproxy-novncproxy"
Mar 10 22:15:46 crc kubenswrapper[4919]: E0310 22:15:46.915125 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c9bf9de-7c1a-436e-a8a0-6c987b34a5e0" containerName="barbican-worker"
Mar 10 22:15:46 crc kubenswrapper[4919]: I0310 22:15:46.915133 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c9bf9de-7c1a-436e-a8a0-6c987b34a5e0" containerName="barbican-worker"
Mar 10 22:15:46 crc kubenswrapper[4919]: E0310 22:15:46.915142 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37ef9179-69db-49ab-a4e6-2e2b815fc260" containerName="mysql-bootstrap"
Mar 10 22:15:46 crc kubenswrapper[4919]: I0310 22:15:46.915149 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="37ef9179-69db-49ab-a4e6-2e2b815fc260" containerName="mysql-bootstrap"
Mar 10 22:15:46 crc kubenswrapper[4919]: E0310 22:15:46.915169 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e61256d0-9ca6-4524-a19d-7efd32ab9724" containerName="openstack-network-exporter"
Mar 10 22:15:46 crc kubenswrapper[4919]: I0310 22:15:46.915192 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="e61256d0-9ca6-4524-a19d-7efd32ab9724" containerName="openstack-network-exporter"
Mar 10 22:15:46 crc kubenswrapper[4919]: I0310 22:15:46.915423 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e69198f-c4ad-40c4-b0f4-1a6e9dd17940" containerName="ovsdbserver-nb"
Mar 10 22:15:46 crc kubenswrapper[4919]: I0310 22:15:46.915449 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="37ef9179-69db-49ab-a4e6-2e2b815fc260" containerName="galera"
Mar 10 22:15:46 crc kubenswrapper[4919]: I0310 22:15:46.915463 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="aff6348f-a0cf-4b67-a072-edcde9dcb3c4" containerName="nova-cell1-novncproxy-novncproxy"
Mar 10 22:15:46 crc kubenswrapper[4919]: I0310 22:15:46.915475 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="e61256d0-9ca6-4524-a19d-7efd32ab9724" containerName="openstack-network-exporter"
Mar 10 22:15:46 crc kubenswrapper[4919]: I0310 22:15:46.915485 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="8782e985-ff23-4580-bdfb-ef2dd9b540bc" containerName="ovsdbserver-sb"
Mar 10 22:15:46 crc kubenswrapper[4919]: I0310 22:15:46.915499 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="8782e985-ff23-4580-bdfb-ef2dd9b540bc" containerName="openstack-network-exporter"
Mar 10 22:15:46 crc kubenswrapper[4919]: I0310 22:15:46.915512 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e69198f-c4ad-40c4-b0f4-1a6e9dd17940" containerName="openstack-network-exporter"
Mar 10 22:15:46 crc kubenswrapper[4919]: I0310 22:15:46.915521 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="37dac1c8-963f-466f-977e-37b2fd98d32c" containerName="dnsmasq-dns"
Mar 10 22:15:46 crc kubenswrapper[4919]: I0310 22:15:46.915533 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c9bf9de-7c1a-436e-a8a0-6c987b34a5e0" containerName="barbican-worker-log"
Mar 10 22:15:46 crc kubenswrapper[4919]: I0310 22:15:46.915544 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c9bf9de-7c1a-436e-a8a0-6c987b34a5e0" containerName="barbican-worker"
Mar 10 22:15:46 crc kubenswrapper[4919]: I0310 22:15:46.921069 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bbb7-account-create-update-pmxgp"
Mar 10 22:15:46 crc kubenswrapper[4919]: I0310 22:15:46.930719 4919 generic.go:334] "Generic (PLEG): container finished" podID="76a514a0-0d4c-4f6b-8ba7-cd5b4834d625" containerID="3f184ab546adc93a6838bca186ee1a27fb82249fb7be1e6ac56b84aa0dcb13c5" exitCode=1
Mar 10 22:15:46 crc kubenswrapper[4919]: I0310 22:15:46.930791 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-xlzz4" event={"ID":"76a514a0-0d4c-4f6b-8ba7-cd5b4834d625","Type":"ContainerDied","Data":"3f184ab546adc93a6838bca186ee1a27fb82249fb7be1e6ac56b84aa0dcb13c5"}
Mar 10 22:15:46 crc kubenswrapper[4919]: I0310 22:15:46.930830 4919 scope.go:117] "RemoveContainer" containerID="090cce7d6d9affd1cfe7890b42642492b8cbefcf21721a4ef0b83c1e4db42a0c"
Mar 10 22:15:46 crc kubenswrapper[4919]: I0310 22:15:46.931564 4919 scope.go:117] "RemoveContainer" containerID="3f184ab546adc93a6838bca186ee1a27fb82249fb7be1e6ac56b84aa0dcb13c5"
Mar 10 22:15:46 crc kubenswrapper[4919]: E0310 22:15:46.931988 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CrashLoopBackOff: \"back-off 10s restarting failed container=mariadb-account-create-update pod=root-account-create-update-xlzz4_openstack(76a514a0-0d4c-4f6b-8ba7-cd5b4834d625)\"" pod="openstack/root-account-create-update-xlzz4" podUID="76a514a0-0d4c-4f6b-8ba7-cd5b4834d625"
Mar 10 22:15:46 crc kubenswrapper[4919]: I0310 22:15:46.932596 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-ea40-account-create-update-7bp9g"
Mar 10 22:15:46 crc kubenswrapper[4919]: I0310 22:15:46.937614 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-fbf4c94d9-4mg9b"
Mar 10 22:15:46 crc kubenswrapper[4919]: I0310 22:15:46.947680 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret"
Mar 10 22:15:46 crc kubenswrapper[4919]: I0310 22:15:46.957484 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d0f89c3b-5242-409b-a318-5b69410e9680-internal-tls-certs\") pod \"d0f89c3b-5242-409b-a318-5b69410e9680\" (UID: \"d0f89c3b-5242-409b-a318-5b69410e9680\") "
Mar 10 22:15:46 crc kubenswrapper[4919]: I0310 22:15:46.957531 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cdcceccb-6413-4cf8-972e-7744b99c626e-operator-scripts\") pod \"cdcceccb-6413-4cf8-972e-7744b99c626e\" (UID: \"cdcceccb-6413-4cf8-972e-7744b99c626e\") "
Mar 10 22:15:46 crc kubenswrapper[4919]: I0310 22:15:46.957578 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d0f89c3b-5242-409b-a318-5b69410e9680-public-tls-certs\") pod \"d0f89c3b-5242-409b-a318-5b69410e9680\" (UID: \"d0f89c3b-5242-409b-a318-5b69410e9680\") "
Mar 10 22:15:46 crc kubenswrapper[4919]: I0310 22:15:46.957610 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5rtgk\" (UniqueName: \"kubernetes.io/projected/cdcceccb-6413-4cf8-972e-7744b99c626e-kube-api-access-5rtgk\") pod \"cdcceccb-6413-4cf8-972e-7744b99c626e\" (UID: \"cdcceccb-6413-4cf8-972e-7744b99c626e\") "
Mar 10 22:15:46 crc kubenswrapper[4919]: I0310 22:15:46.957634 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ec49f65c-e8af-44a1-b464-af2a86b299fc-operator-scripts\") pod \"ec49f65c-e8af-44a1-b464-af2a86b299fc\" (UID: \"ec49f65c-e8af-44a1-b464-af2a86b299fc\") "
Mar 10 22:15:46 crc kubenswrapper[4919]: I0310 22:15:46.957655 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d0f89c3b-5242-409b-a318-5b69410e9680-log-httpd\") pod \"d0f89c3b-5242-409b-a318-5b69410e9680\" (UID: \"d0f89c3b-5242-409b-a318-5b69410e9680\") "
Mar 10 22:15:46 crc kubenswrapper[4919]: I0310 22:15:46.957690 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0f89c3b-5242-409b-a318-5b69410e9680-config-data\") pod \"d0f89c3b-5242-409b-a318-5b69410e9680\" (UID: \"d0f89c3b-5242-409b-a318-5b69410e9680\") "
Mar 10 22:15:46 crc kubenswrapper[4919]: I0310 22:15:46.957784 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0f89c3b-5242-409b-a318-5b69410e9680-combined-ca-bundle\") pod \"d0f89c3b-5242-409b-a318-5b69410e9680\" (UID: \"d0f89c3b-5242-409b-a318-5b69410e9680\") "
Mar 10 22:15:46 crc kubenswrapper[4919]: I0310 22:15:46.957817 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-whlhh\" (UniqueName: \"kubernetes.io/projected/ec49f65c-e8af-44a1-b464-af2a86b299fc-kube-api-access-whlhh\") pod \"ec49f65c-e8af-44a1-b464-af2a86b299fc\" (UID: \"ec49f65c-e8af-44a1-b464-af2a86b299fc\") "
Mar 10 22:15:46 crc kubenswrapper[4919]: I0310 22:15:46.957833 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d0f89c3b-5242-409b-a318-5b69410e9680-run-httpd\") pod \"d0f89c3b-5242-409b-a318-5b69410e9680\" (UID: \"d0f89c3b-5242-409b-a318-5b69410e9680\") "
Mar 10 22:15:46 crc kubenswrapper[4919]: I0310 22:15:46.957855 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d0f89c3b-5242-409b-a318-5b69410e9680-etc-swift\") pod \"d0f89c3b-5242-409b-a318-5b69410e9680\" (UID: \"d0f89c3b-5242-409b-a318-5b69410e9680\") "
Mar 10 22:15:46 crc kubenswrapper[4919]: I0310 22:15:46.957896 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b4mrb\" (UniqueName: \"kubernetes.io/projected/d0f89c3b-5242-409b-a318-5b69410e9680-kube-api-access-b4mrb\") pod \"d0f89c3b-5242-409b-a318-5b69410e9680\" (UID: \"d0f89c3b-5242-409b-a318-5b69410e9680\") "
Mar 10 22:15:46 crc kubenswrapper[4919]: I0310 22:15:46.958208 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jln88\" (UniqueName: \"kubernetes.io/projected/008d3aa2-636e-48bc-a09a-00541bc3bd5e-kube-api-access-jln88\") pod \"keystone-bbb7-account-create-update-pmxgp\" (UID: \"008d3aa2-636e-48bc-a09a-00541bc3bd5e\") " pod="openstack/keystone-bbb7-account-create-update-pmxgp"
Mar 10 22:15:46 crc kubenswrapper[4919]: I0310 22:15:46.958311 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/008d3aa2-636e-48bc-a09a-00541bc3bd5e-operator-scripts\") pod \"keystone-bbb7-account-create-update-pmxgp\" (UID: \"008d3aa2-636e-48bc-a09a-00541bc3bd5e\") " pod="openstack/keystone-bbb7-account-create-update-pmxgp"
Mar 10 22:15:46 crc kubenswrapper[4919]: E0310 22:15:46.963963 4919 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found
Mar 10 22:15:46 crc kubenswrapper[4919]: E0310 22:15:46.964040 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/76a514a0-0d4c-4f6b-8ba7-cd5b4834d625-operator-scripts podName:76a514a0-0d4c-4f6b-8ba7-cd5b4834d625 nodeName:}" failed. No retries permitted until 2026-03-10 22:15:47.464020864 +0000 UTC m=+1534.705901482 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/76a514a0-0d4c-4f6b-8ba7-cd5b4834d625-operator-scripts") pod "root-account-create-update-xlzz4" (UID: "76a514a0-0d4c-4f6b-8ba7-cd5b4834d625") : configmap "openstack-scripts" not found
Mar 10 22:15:46 crc kubenswrapper[4919]: I0310 22:15:46.964363 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d0f89c3b-5242-409b-a318-5b69410e9680-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "d0f89c3b-5242-409b-a318-5b69410e9680" (UID: "d0f89c3b-5242-409b-a318-5b69410e9680"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 22:15:46 crc kubenswrapper[4919]: I0310 22:15:46.968346 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bbb7-account-create-update-tn746"]
Mar 10 22:15:46 crc kubenswrapper[4919]: I0310 22:15:46.968924 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cdcceccb-6413-4cf8-972e-7744b99c626e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "cdcceccb-6413-4cf8-972e-7744b99c626e" (UID: "cdcceccb-6413-4cf8-972e-7744b99c626e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 22:15:46 crc kubenswrapper[4919]: I0310 22:15:46.969585 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d0f89c3b-5242-409b-a318-5b69410e9680-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "d0f89c3b-5242-409b-a318-5b69410e9680" (UID: "d0f89c3b-5242-409b-a318-5b69410e9680"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 22:15:46 crc kubenswrapper[4919]: I0310 22:15:46.969993 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec49f65c-e8af-44a1-b464-af2a86b299fc-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ec49f65c-e8af-44a1-b464-af2a86b299fc" (UID: "ec49f65c-e8af-44a1-b464-af2a86b299fc"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 22:15:46 crc kubenswrapper[4919]: I0310 22:15:46.977267 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"37ef9179-69db-49ab-a4e6-2e2b815fc260","Type":"ContainerDied","Data":"5526a0e2dbd7db7dba05ceea6c12956ec629b63fa1bcbee8601c93257949fe8b"}
Mar 10 22:15:46 crc kubenswrapper[4919]: I0310 22:15:46.977358 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0"
Mar 10 22:15:46 crc kubenswrapper[4919]: I0310 22:15:46.981085 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec49f65c-e8af-44a1-b464-af2a86b299fc-kube-api-access-whlhh" (OuterVolumeSpecName: "kube-api-access-whlhh") pod "ec49f65c-e8af-44a1-b464-af2a86b299fc" (UID: "ec49f65c-e8af-44a1-b464-af2a86b299fc"). InnerVolumeSpecName "kube-api-access-whlhh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 22:15:47 crc kubenswrapper[4919]: I0310 22:15:47.010409 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-45c6-account-create-update-zczhm" event={"ID":"1c34af84-e5e2-4219-b7b2-bf1e2c2a731b","Type":"ContainerStarted","Data":"f277710fbf5ffb809d7f8e3ce03b09ed5a04584573c19e7fcd445716cb202ab0"}
Mar 10 22:15:47 crc kubenswrapper[4919]: I0310 22:15:47.014263 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cdcceccb-6413-4cf8-972e-7744b99c626e-kube-api-access-5rtgk" (OuterVolumeSpecName: "kube-api-access-5rtgk") pod "cdcceccb-6413-4cf8-972e-7744b99c626e" (UID: "cdcceccb-6413-4cf8-972e-7744b99c626e"). InnerVolumeSpecName "kube-api-access-5rtgk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 22:15:47 crc kubenswrapper[4919]: I0310 22:15:47.032059 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-9e8c-account-create-update-6q4j2" event={"ID":"a01f4397-9fee-4ff4-af76-ed0b37f04b28","Type":"ContainerStarted","Data":"1744a839035086036e24ae77885b6783bbfb67bf0d0fade6f10eed98757e44fe"}
Mar 10 22:15:47 crc kubenswrapper[4919]: I0310 22:15:47.038328 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0f89c3b-5242-409b-a318-5b69410e9680-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "d0f89c3b-5242-409b-a318-5b69410e9680" (UID: "d0f89c3b-5242-409b-a318-5b69410e9680"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 22:15:47 crc kubenswrapper[4919]: I0310 22:15:47.045280 4919 scope.go:117] "RemoveContainer" containerID="3a27cd1d1040a8d26e69883aaeb7f8132c3852f88812a287ec745156fd408f80"
Mar 10 22:15:47 crc kubenswrapper[4919]: I0310 22:15:47.045727 4919 generic.go:334] "Generic (PLEG): container finished" podID="981bb03c-23be-4bf8-a9f6-cb8a552f66a5" containerID="d88c0958bf40600808f9977f231a5fa1e34419ab9909560a692746f53c31f4f0" exitCode=0
Mar 10 22:15:47 crc kubenswrapper[4919]: I0310 22:15:47.045800 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-854d8d6bf4-kknjq" event={"ID":"981bb03c-23be-4bf8-a9f6-cb8a552f66a5","Type":"ContainerDied","Data":"d88c0958bf40600808f9977f231a5fa1e34419ab9909560a692746f53c31f4f0"}
Mar 10 22:15:47 crc kubenswrapper[4919]: I0310 22:15:47.056516 4919 generic.go:334] "Generic (PLEG): container finished" podID="1f324194-64d5-4755-847b-f554b94e652c" containerID="8dd9c6db1ef3f3090c173c377cd48fad2c1b903961ef2c219973ba650ae92aa3" exitCode=0
Mar 10 22:15:47 crc kubenswrapper[4919]: I0310 22:15:47.056559 4919 generic.go:334] "Generic (PLEG): container finished" podID="1f324194-64d5-4755-847b-f554b94e652c" containerID="c84f2ef693d15394b292c9104df5d25964c0ba501f0d8a5f3f3a4710e2c7b5e1" exitCode=2
Mar 10 22:15:47 crc kubenswrapper[4919]: I0310 22:15:47.056607 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1f324194-64d5-4755-847b-f554b94e652c","Type":"ContainerDied","Data":"8dd9c6db1ef3f3090c173c377cd48fad2c1b903961ef2c219973ba650ae92aa3"}
Mar 10 22:15:47 crc kubenswrapper[4919]: I0310 22:15:47.056637 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1f324194-64d5-4755-847b-f554b94e652c","Type":"ContainerDied","Data":"c84f2ef693d15394b292c9104df5d25964c0ba501f0d8a5f3f3a4710e2c7b5e1"}
Mar 10 22:15:47 crc kubenswrapper[4919]: I0310 22:15:47.058011 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0f89c3b-5242-409b-a318-5b69410e9680-kube-api-access-b4mrb" (OuterVolumeSpecName: "kube-api-access-b4mrb") pod "d0f89c3b-5242-409b-a318-5b69410e9680" (UID: "d0f89c3b-5242-409b-a318-5b69410e9680"). InnerVolumeSpecName "kube-api-access-b4mrb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 22:15:47 crc kubenswrapper[4919]: I0310 22:15:47.058079 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bbb7-account-create-update-pmxgp"]
Mar 10 22:15:47 crc kubenswrapper[4919]: I0310 22:15:47.060135 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-cdcf-account-create-update-qh8rx" event={"ID":"9d5850b9-d946-4b1a-9171-718243c78596","Type":"ContainerDied","Data":"15b9ed1342d63b43062e26908b90d75f1071fec02128f34a156bebf7ab641c39"}
Mar 10 22:15:47 crc kubenswrapper[4919]: I0310 22:15:47.060277 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-cdcf-account-create-update-qh8rx"
Mar 10 22:15:47 crc kubenswrapper[4919]: I0310 22:15:47.062256 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jln88\" (UniqueName: \"kubernetes.io/projected/008d3aa2-636e-48bc-a09a-00541bc3bd5e-kube-api-access-jln88\") pod \"keystone-bbb7-account-create-update-pmxgp\" (UID: \"008d3aa2-636e-48bc-a09a-00541bc3bd5e\") " pod="openstack/keystone-bbb7-account-create-update-pmxgp"
Mar 10 22:15:47 crc kubenswrapper[4919]: I0310 22:15:47.062422 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/008d3aa2-636e-48bc-a09a-00541bc3bd5e-operator-scripts\") pod \"keystone-bbb7-account-create-update-pmxgp\" (UID: \"008d3aa2-636e-48bc-a09a-00541bc3bd5e\") " pod="openstack/keystone-bbb7-account-create-update-pmxgp"
Mar 10 22:15:47 crc kubenswrapper[4919]: I0310 22:15:47.062504 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b4mrb\" (UniqueName: \"kubernetes.io/projected/d0f89c3b-5242-409b-a318-5b69410e9680-kube-api-access-b4mrb\") on node \"crc\" DevicePath \"\""
Mar 10 22:15:47 crc kubenswrapper[4919]: I0310 22:15:47.062520 4919 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cdcceccb-6413-4cf8-972e-7744b99c626e-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 10 22:15:47 crc kubenswrapper[4919]: I0310 22:15:47.062532 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5rtgk\" (UniqueName: \"kubernetes.io/projected/cdcceccb-6413-4cf8-972e-7744b99c626e-kube-api-access-5rtgk\") on node \"crc\" DevicePath \"\""
Mar 10 22:15:47 crc kubenswrapper[4919]: I0310 22:15:47.062543 4919 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ec49f65c-e8af-44a1-b464-af2a86b299fc-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 10 22:15:47 crc kubenswrapper[4919]: I0310 22:15:47.062553 4919 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d0f89c3b-5242-409b-a318-5b69410e9680-log-httpd\") on node \"crc\" DevicePath \"\""
Mar 10 22:15:47 crc kubenswrapper[4919]: I0310 22:15:47.062564 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-whlhh\" (UniqueName: \"kubernetes.io/projected/ec49f65c-e8af-44a1-b464-af2a86b299fc-kube-api-access-whlhh\") on node \"crc\" DevicePath \"\""
Mar 10 22:15:47 crc kubenswrapper[4919]: I0310 22:15:47.062578 4919 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d0f89c3b-5242-409b-a318-5b69410e9680-run-httpd\") on node \"crc\" DevicePath \"\""
Mar 10 22:15:47 crc kubenswrapper[4919]: I0310 22:15:47.062588 4919 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d0f89c3b-5242-409b-a318-5b69410e9680-etc-swift\") on node \"crc\" DevicePath \"\""
Mar 10 22:15:47 crc kubenswrapper[4919]: E0310 22:15:47.062652 4919 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found
Mar 10 22:15:47 crc kubenswrapper[4919]: E0310 22:15:47.062707 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/008d3aa2-636e-48bc-a09a-00541bc3bd5e-operator-scripts podName:008d3aa2-636e-48bc-a09a-00541bc3bd5e nodeName:}" failed. No retries permitted until 2026-03-10 22:15:47.562687356 +0000 UTC m=+1534.804567974 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/008d3aa2-636e-48bc-a09a-00541bc3bd5e-operator-scripts") pod "keystone-bbb7-account-create-update-pmxgp" (UID: "008d3aa2-636e-48bc-a09a-00541bc3bd5e") : configmap "openstack-scripts" not found
Mar 10 22:15:47 crc kubenswrapper[4919]: E0310 22:15:47.078427 4919 projected.go:194] Error preparing data for projected volume kube-api-access-jln88 for pod openstack/keystone-bbb7-account-create-update-pmxgp: failed to fetch token: serviceaccounts "galera-openstack" not found
Mar 10 22:15:47 crc kubenswrapper[4919]: E0310 22:15:47.078494 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/008d3aa2-636e-48bc-a09a-00541bc3bd5e-kube-api-access-jln88 podName:008d3aa2-636e-48bc-a09a-00541bc3bd5e nodeName:}" failed. No retries permitted until 2026-03-10 22:15:47.578474294 +0000 UTC m=+1534.820354902 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-jln88" (UniqueName: "kubernetes.io/projected/008d3aa2-636e-48bc-a09a-00541bc3bd5e-kube-api-access-jln88") pod "keystone-bbb7-account-create-update-pmxgp" (UID: "008d3aa2-636e-48bc-a09a-00541bc3bd5e") : failed to fetch token: serviceaccounts "galera-openstack" not found
Mar 10 22:15:47 crc kubenswrapper[4919]: I0310 22:15:47.078789 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-8e40-account-create-update-kkddk" event={"ID":"ec49f65c-e8af-44a1-b464-af2a86b299fc","Type":"ContainerDied","Data":"dd5815edd58117899f7992441a905b99e59e8c8dfff238ce799f6fdbe61eeec9"}
Mar 10 22:15:47 crc kubenswrapper[4919]: I0310 22:15:47.078869 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-8e40-account-create-update-kkddk"
Mar 10 22:15:47 crc kubenswrapper[4919]: I0310 22:15:47.081069 4919 generic.go:334] "Generic (PLEG): container finished" podID="7bfe342f-267a-4239-a9cc-8df0e3d14a92" containerID="22b51ffd48cd3ea852a7427f1dfc881e19ae51a0a5028680360ef0179dcc54e1" exitCode=2
Mar 10 22:15:47 crc kubenswrapper[4919]: I0310 22:15:47.081123 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"7bfe342f-267a-4239-a9cc-8df0e3d14a92","Type":"ContainerDied","Data":"22b51ffd48cd3ea852a7427f1dfc881e19ae51a0a5028680360ef0179dcc54e1"}
Mar 10 22:15:47 crc kubenswrapper[4919]: I0310 22:15:47.098807 4919 generic.go:334] "Generic (PLEG): container finished" podID="d0f89c3b-5242-409b-a318-5b69410e9680" containerID="64708e20bac95ef817be6f849912905726f0a35138271eb315056e779dec07f3" exitCode=0
Mar 10 22:15:47 crc kubenswrapper[4919]: I0310 22:15:47.098868 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-fbf4c94d9-4mg9b" event={"ID":"d0f89c3b-5242-409b-a318-5b69410e9680","Type":"ContainerDied","Data":"64708e20bac95ef817be6f849912905726f0a35138271eb315056e779dec07f3"}
Mar 10 22:15:47 crc kubenswrapper[4919]: I0310 22:15:47.098898 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-fbf4c94d9-4mg9b" event={"ID":"d0f89c3b-5242-409b-a318-5b69410e9680","Type":"ContainerDied","Data":"e153a03c272aa21dca94c411ddadd23c66e624d1242061994242ef8efc636065"}
Mar 10 22:15:47 crc kubenswrapper[4919]: I0310 22:15:47.098964 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-fbf4c94d9-4mg9b"
Mar 10 22:15:47 crc kubenswrapper[4919]: I0310 22:15:47.099496 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-ctpbj"]
Mar 10 22:15:47 crc kubenswrapper[4919]: I0310 22:15:47.122052 4919 scope.go:117] "RemoveContainer" containerID="8c17bba3a2afe23e23c08f322962c8fa5f82fede04d8db1307f9c7ffd15139b9"
Mar 10 22:15:47 crc kubenswrapper[4919]: I0310 22:15:47.143456 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-ctpbj"]
Mar 10 22:15:47 crc kubenswrapper[4919]: I0310 22:15:47.144266 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0f89c3b-5242-409b-a318-5b69410e9680-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d0f89c3b-5242-409b-a318-5b69410e9680" (UID: "d0f89c3b-5242-409b-a318-5b69410e9680"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 22:15:47 crc kubenswrapper[4919]: I0310 22:15:47.168727 4919 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0f89c3b-5242-409b-a318-5b69410e9680-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 10 22:15:47 crc kubenswrapper[4919]: I0310 22:15:47.178931 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-k96hb"]
Mar 10 22:15:47 crc kubenswrapper[4919]: I0310 22:15:47.182032 4919 scope.go:117] "RemoveContainer" containerID="64708e20bac95ef817be6f849912905726f0a35138271eb315056e779dec07f3"
Mar 10 22:15:47 crc kubenswrapper[4919]: I0310 22:15:47.201170 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0f89c3b-5242-409b-a318-5b69410e9680-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "d0f89c3b-5242-409b-a318-5b69410e9680" (UID: "d0f89c3b-5242-409b-a318-5b69410e9680"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 22:15:47 crc kubenswrapper[4919]: I0310 22:15:47.226876 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-k96hb"]
Mar 10 22:15:47 crc kubenswrapper[4919]: I0310 22:15:47.235438 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0f89c3b-5242-409b-a318-5b69410e9680-config-data" (OuterVolumeSpecName: "config-data") pod "d0f89c3b-5242-409b-a318-5b69410e9680" (UID: "d0f89c3b-5242-409b-a318-5b69410e9680"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 22:15:47 crc kubenswrapper[4919]: I0310 22:15:47.247620 4919 scope.go:117] "RemoveContainer" containerID="704f926a26eb1f83943af82f73395b8827735ef4003ab0f0d955ebb8116c59f5"
Mar 10 22:15:47 crc kubenswrapper[4919]: I0310 22:15:47.252780 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0f89c3b-5242-409b-a318-5b69410e9680-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "d0f89c3b-5242-409b-a318-5b69410e9680" (UID: "d0f89c3b-5242-409b-a318-5b69410e9680"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 22:15:47 crc kubenswrapper[4919]: I0310 22:15:47.270802 4919 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d0f89c3b-5242-409b-a318-5b69410e9680-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 10 22:15:47 crc kubenswrapper[4919]: I0310 22:15:47.270846 4919 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d0f89c3b-5242-409b-a318-5b69410e9680-public-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 10 22:15:47 crc kubenswrapper[4919]: I0310 22:15:47.270857 4919 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0f89c3b-5242-409b-a318-5b69410e9680-config-data\") on node \"crc\" DevicePath \"\""
Mar 10 22:15:47 crc kubenswrapper[4919]: I0310 22:15:47.299475 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-599f4d795-pgnpd"]
Mar 10 22:15:47 crc kubenswrapper[4919]: I0310 22:15:47.299768 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/keystone-599f4d795-pgnpd" podUID="408722a8-2c8a-4bda-82d5-1d2f58bda7d7" containerName="keystone-api" containerID="cri-o://d441cb2cbe08ef1a248f7014e5b13a5f2346dcbbe5baae3176348c48f4842be7" gracePeriod=30
Mar 10 22:15:47 crc kubenswrapper[4919]: I0310 22:15:47.304608 4919 scope.go:117] "RemoveContainer" containerID="64708e20bac95ef817be6f849912905726f0a35138271eb315056e779dec07f3"
Mar 10 22:15:47 crc kubenswrapper[4919]: E0310 22:15:47.305648 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"64708e20bac95ef817be6f849912905726f0a35138271eb315056e779dec07f3\": container with ID starting with 64708e20bac95ef817be6f849912905726f0a35138271eb315056e779dec07f3 not found: ID does not exist" containerID="64708e20bac95ef817be6f849912905726f0a35138271eb315056e779dec07f3"
Mar 10 22:15:47 crc kubenswrapper[4919]: I0310 22:15:47.305692 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64708e20bac95ef817be6f849912905726f0a35138271eb315056e779dec07f3"} err="failed to get container status \"64708e20bac95ef817be6f849912905726f0a35138271eb315056e779dec07f3\": rpc error: code = NotFound desc = could not find container \"64708e20bac95ef817be6f849912905726f0a35138271eb315056e779dec07f3\": container with ID starting with 64708e20bac95ef817be6f849912905726f0a35138271eb315056e779dec07f3 not found: ID does not exist"
Mar 10 22:15:47 crc kubenswrapper[4919]: I0310 22:15:47.305715 4919 scope.go:117] "RemoveContainer" containerID="704f926a26eb1f83943af82f73395b8827735ef4003ab0f0d955ebb8116c59f5"
Mar 10 22:15:47 crc kubenswrapper[4919]: E0310 22:15:47.307287 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"704f926a26eb1f83943af82f73395b8827735ef4003ab0f0d955ebb8116c59f5\": container with ID starting with 704f926a26eb1f83943af82f73395b8827735ef4003ab0f0d955ebb8116c59f5 not found: ID does not exist" containerID="704f926a26eb1f83943af82f73395b8827735ef4003ab0f0d955ebb8116c59f5"
Mar 10 22:15:47 crc kubenswrapper[4919]: I0310 22:15:47.307324 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"704f926a26eb1f83943af82f73395b8827735ef4003ab0f0d955ebb8116c59f5"} err="failed to get container status \"704f926a26eb1f83943af82f73395b8827735ef4003ab0f0d955ebb8116c59f5\": rpc error: code = NotFound desc = could not find container \"704f926a26eb1f83943af82f73395b8827735ef4003ab0f0d955ebb8116c59f5\": container with ID starting with 704f926a26eb1f83943af82f73395b8827735ef4003ab0f0d955ebb8116c59f5 not found: ID does not exist"
Mar 10 22:15:47 crc kubenswrapper[4919]: I0310 22:15:47.323470 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-dff6-account-create-update-8l4wr"]
Mar 10 22:15:47 crc kubenswrapper[4919]: I0310 22:15:47.333375 4919 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="81489e39-0246-4065-8835-31b1e5da8431" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.214:8775/\": read tcp 10.217.0.2:58954->10.217.0.214:8775: read: connection reset by peer"
Mar 10 22:15:47 crc kubenswrapper[4919]: I0310 22:15:47.333636 4919 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="81489e39-0246-4065-8835-31b1e5da8431" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.214:8775/\": read tcp 10.217.0.2:58944->10.217.0.214:8775: read: connection reset by peer"
Mar 10 22:15:47 crc kubenswrapper[4919]: I0310 22:15:47.354615 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-galera-0"]
Mar 10 22:15:47 crc kubenswrapper[4919]: I0310 22:15:47.375084 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-h94vd"]
Mar 10 22:15:47 crc kubenswrapper[4919]: E0310 22:15:47.376866 4919 secret.go:188] Couldn't get secret openstack/neutron-config: secret "neutron-config" not found
Mar 10 22:15:47 crc kubenswrapper[4919]: E0310 22:15:47.376940 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0a44bcbb-6e2e-48bb-b7a7-16a4e916001d-config podName:0a44bcbb-6e2e-48bb-b7a7-16a4e916001d nodeName:}" failed. No retries permitted until 2026-03-10 22:15:51.376920541 +0000 UTC m=+1538.618801149 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/secret/0a44bcbb-6e2e-48bb-b7a7-16a4e916001d-config") pod "neutron-846dbc6cd5-kg4kx" (UID: "0a44bcbb-6e2e-48bb-b7a7-16a4e916001d") : secret "neutron-config" not found
Mar 10 22:15:47 crc kubenswrapper[4919]: E0310 22:15:47.377440 4919 secret.go:188] Couldn't get secret openstack/neutron-httpd-config: secret "neutron-httpd-config" not found
Mar 10 22:15:47 crc kubenswrapper[4919]: E0310 22:15:47.377482 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0a44bcbb-6e2e-48bb-b7a7-16a4e916001d-httpd-config podName:0a44bcbb-6e2e-48bb-b7a7-16a4e916001d nodeName:}" failed. No retries permitted until 2026-03-10 22:15:51.377472245 +0000 UTC m=+1538.619352853 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "httpd-config" (UniqueName: "kubernetes.io/secret/0a44bcbb-6e2e-48bb-b7a7-16a4e916001d-httpd-config") pod "neutron-846dbc6cd5-kg4kx" (UID: "0a44bcbb-6e2e-48bb-b7a7-16a4e916001d") : secret "neutron-httpd-config" not found
Mar 10 22:15:47 crc kubenswrapper[4919]: I0310 22:15:47.380432 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-dff6-account-create-update-8l4wr"]
Mar 10 22:15:47 crc kubenswrapper[4919]: I0310 22:15:47.386017 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-h94vd"]
Mar 10 22:15:47 crc kubenswrapper[4919]: I0310 22:15:47.392607 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bbb7-account-create-update-pmxgp"]
Mar 10 22:15:47 crc kubenswrapper[4919]: E0310 22:15:47.393383 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-jln88 operator-scripts], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/keystone-bbb7-account-create-update-pmxgp" podUID="008d3aa2-636e-48bc-a09a-00541bc3bd5e"
Mar 10 22:15:47 crc kubenswrapper[4919]: I0310 22:15:47.406002
4919 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-75f54b97c6-fj5s7" podUID="28d81dfb-640f-4748-ab70-e0b393e1e595" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.167:9311/healthcheck\": read tcp 10.217.0.2:53170->10.217.0.167:9311: read: connection reset by peer" Mar 10 22:15:47 crc kubenswrapper[4919]: I0310 22:15:47.406011 4919 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-75f54b97c6-fj5s7" podUID="28d81dfb-640f-4748-ab70-e0b393e1e595" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.0.167:9311/healthcheck\": read tcp 10.217.0.2:53156->10.217.0.167:9311: read: connection reset by peer" Mar 10 22:15:47 crc kubenswrapper[4919]: I0310 22:15:47.414845 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-xlzz4"] Mar 10 22:15:47 crc kubenswrapper[4919]: E0310 22:15:47.485652 4919 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Mar 10 22:15:47 crc kubenswrapper[4919]: E0310 22:15:47.485726 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/76a514a0-0d4c-4f6b-8ba7-cd5b4834d625-operator-scripts podName:76a514a0-0d4c-4f6b-8ba7-cd5b4834d625 nodeName:}" failed. No retries permitted until 2026-03-10 22:15:48.485706717 +0000 UTC m=+1535.727587326 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/76a514a0-0d4c-4f6b-8ba7-cd5b4834d625-operator-scripts") pod "root-account-create-update-xlzz4" (UID: "76a514a0-0d4c-4f6b-8ba7-cd5b4834d625") : configmap "openstack-scripts" not found Mar 10 22:15:47 crc kubenswrapper[4919]: I0310 22:15:47.508975 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37dac1c8-963f-466f-977e-37b2fd98d32c" path="/var/lib/kubelet/pods/37dac1c8-963f-466f-977e-37b2fd98d32c/volumes" Mar 10 22:15:47 crc kubenswrapper[4919]: I0310 22:15:47.509605 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ce2ca0f-3cf5-43aa-9ea4-687557fbc1ca" path="/var/lib/kubelet/pods/3ce2ca0f-3cf5-43aa-9ea4-687557fbc1ca/volumes" Mar 10 22:15:47 crc kubenswrapper[4919]: I0310 22:15:47.510122 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c9bf9de-7c1a-436e-a8a0-6c987b34a5e0" path="/var/lib/kubelet/pods/7c9bf9de-7c1a-436e-a8a0-6c987b34a5e0/volumes" Mar 10 22:15:47 crc kubenswrapper[4919]: I0310 22:15:47.512048 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8782e985-ff23-4580-bdfb-ef2dd9b540bc" path="/var/lib/kubelet/pods/8782e985-ff23-4580-bdfb-ef2dd9b540bc/volumes" Mar 10 22:15:47 crc kubenswrapper[4919]: I0310 22:15:47.513897 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e69198f-c4ad-40c4-b0f4-1a6e9dd17940" path="/var/lib/kubelet/pods/9e69198f-c4ad-40c4-b0f4-1a6e9dd17940/volumes" Mar 10 22:15:47 crc kubenswrapper[4919]: I0310 22:15:47.520252 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a946243a-c6de-4499-9c3c-c11073d02f8e" path="/var/lib/kubelet/pods/a946243a-c6de-4499-9c3c-c11073d02f8e/volumes" Mar 10 22:15:47 crc kubenswrapper[4919]: I0310 22:15:47.524628 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aff6348f-a0cf-4b67-a072-edcde9dcb3c4" 
path="/var/lib/kubelet/pods/aff6348f-a0cf-4b67-a072-edcde9dcb3c4/volumes" Mar 10 22:15:47 crc kubenswrapper[4919]: I0310 22:15:47.529110 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b20682db-f5f4-4102-b0f9-662aeab1bd2a" path="/var/lib/kubelet/pods/b20682db-f5f4-4102-b0f9-662aeab1bd2a/volumes" Mar 10 22:15:47 crc kubenswrapper[4919]: I0310 22:15:47.529747 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e61256d0-9ca6-4524-a19d-7efd32ab9724" path="/var/lib/kubelet/pods/e61256d0-9ca6-4524-a19d-7efd32ab9724/volumes" Mar 10 22:15:47 crc kubenswrapper[4919]: I0310 22:15:47.530840 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f60753e3-36e6-4155-8fcd-7460f2803ea4" path="/var/lib/kubelet/pods/f60753e3-36e6-4155-8fcd-7460f2803ea4/volumes" Mar 10 22:15:47 crc kubenswrapper[4919]: I0310 22:15:47.533681 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f8965b29-3ef4-4db7-a67f-d905fe2e8c2c" path="/var/lib/kubelet/pods/f8965b29-3ef4-4db7-a67f-d905fe2e8c2c/volumes" Mar 10 22:15:47 crc kubenswrapper[4919]: I0310 22:15:47.586929 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jln88\" (UniqueName: \"kubernetes.io/projected/008d3aa2-636e-48bc-a09a-00541bc3bd5e-kube-api-access-jln88\") pod \"keystone-bbb7-account-create-update-pmxgp\" (UID: \"008d3aa2-636e-48bc-a09a-00541bc3bd5e\") " pod="openstack/keystone-bbb7-account-create-update-pmxgp" Mar 10 22:15:47 crc kubenswrapper[4919]: I0310 22:15:47.587276 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/008d3aa2-636e-48bc-a09a-00541bc3bd5e-operator-scripts\") pod \"keystone-bbb7-account-create-update-pmxgp\" (UID: \"008d3aa2-636e-48bc-a09a-00541bc3bd5e\") " pod="openstack/keystone-bbb7-account-create-update-pmxgp" Mar 10 22:15:47 crc kubenswrapper[4919]: E0310 22:15:47.587591 4919 
configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Mar 10 22:15:47 crc kubenswrapper[4919]: E0310 22:15:47.587663 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/008d3aa2-636e-48bc-a09a-00541bc3bd5e-operator-scripts podName:008d3aa2-636e-48bc-a09a-00541bc3bd5e nodeName:}" failed. No retries permitted until 2026-03-10 22:15:48.58764369 +0000 UTC m=+1535.829524288 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/008d3aa2-636e-48bc-a09a-00541bc3bd5e-operator-scripts") pod "keystone-bbb7-account-create-update-pmxgp" (UID: "008d3aa2-636e-48bc-a09a-00541bc3bd5e") : configmap "openstack-scripts" not found Mar 10 22:15:47 crc kubenswrapper[4919]: E0310 22:15:47.592518 4919 projected.go:194] Error preparing data for projected volume kube-api-access-jln88 for pod openstack/keystone-bbb7-account-create-update-pmxgp: failed to fetch token: serviceaccounts "galera-openstack" not found Mar 10 22:15:47 crc kubenswrapper[4919]: E0310 22:15:47.593955 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/008d3aa2-636e-48bc-a09a-00541bc3bd5e-kube-api-access-jln88 podName:008d3aa2-636e-48bc-a09a-00541bc3bd5e nodeName:}" failed. No retries permitted until 2026-03-10 22:15:48.59393381 +0000 UTC m=+1535.835814418 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-jln88" (UniqueName: "kubernetes.io/projected/008d3aa2-636e-48bc-a09a-00541bc3bd5e-kube-api-access-jln88") pod "keystone-bbb7-account-create-update-pmxgp" (UID: "008d3aa2-636e-48bc-a09a-00541bc3bd5e") : failed to fetch token: serviceaccounts "galera-openstack" not found Mar 10 22:15:47 crc kubenswrapper[4919]: I0310 22:15:47.654640 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-8e40-account-create-update-kkddk"] Mar 10 22:15:47 crc kubenswrapper[4919]: I0310 22:15:47.706848 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-8e40-account-create-update-kkddk"] Mar 10 22:15:47 crc kubenswrapper[4919]: I0310 22:15:47.722381 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-galera-0" podUID="9372011b-416f-484d-a873-fdda67baf9fe" containerName="galera" containerID="cri-o://c4f0d5a04934f6107a3721bf5a429219c7956700786a5ddf3b089b5208e91ed4" gracePeriod=30 Mar 10 22:15:47 crc kubenswrapper[4919]: I0310 22:15:47.780132 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-cdcf-account-create-update-qh8rx"] Mar 10 22:15:47 crc kubenswrapper[4919]: I0310 22:15:47.780591 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-cdcf-account-create-update-qh8rx"] Mar 10 22:15:47 crc kubenswrapper[4919]: I0310 22:15:47.787531 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 10 22:15:47 crc kubenswrapper[4919]: I0310 22:15:47.791842 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 10 22:15:47 crc kubenswrapper[4919]: E0310 22:15:47.825252 4919 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod31690f34_6b68_4470_a13e_e16121ec25d2.slice/crio-conmon-800bb678ea7e53aea235c1816dd3b24e1d5cc3ca3910d7d45b290926f9b56fcd.scope\": RecentStats: unable to find data in memory cache]" Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.111842 4919 generic.go:334] "Generic (PLEG): container finished" podID="28d81dfb-640f-4748-ab70-e0b393e1e595" containerID="fca95d527891107e2d3047ae093c871e26b3b61e0a45e89186497f024bbb6624" exitCode=0 Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.112202 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-75f54b97c6-fj5s7" event={"ID":"28d81dfb-640f-4748-ab70-e0b393e1e595","Type":"ContainerDied","Data":"fca95d527891107e2d3047ae093c871e26b3b61e0a45e89186497f024bbb6624"} Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.115787 4919 generic.go:334] "Generic (PLEG): container finished" podID="1f324194-64d5-4755-847b-f554b94e652c" containerID="4f27c36666ba7ecf2d24cedee59efe2a08c7b4e6c86f4fe4198918504c6bf578" exitCode=0 Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.115818 4919 generic.go:334] "Generic (PLEG): container finished" podID="1f324194-64d5-4755-847b-f554b94e652c" containerID="69fdc3c8e2ab199f6bb93d9e3a1a78edfb949241e41f07473825a631528f1dde" exitCode=0 Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.115849 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1f324194-64d5-4755-847b-f554b94e652c","Type":"ContainerDied","Data":"4f27c36666ba7ecf2d24cedee59efe2a08c7b4e6c86f4fe4198918504c6bf578"} Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.115866 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1f324194-64d5-4755-847b-f554b94e652c","Type":"ContainerDied","Data":"69fdc3c8e2ab199f6bb93d9e3a1a78edfb949241e41f07473825a631528f1dde"} Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.115880 4919 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1f324194-64d5-4755-847b-f554b94e652c","Type":"ContainerDied","Data":"ab56ca52c6208a6b06a1615711c68c53b88659f15dafdf22e213fb0d75084cb2"} Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.115891 4919 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ab56ca52c6208a6b06a1615711c68c53b88659f15dafdf22e213fb0d75084cb2" Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.119117 4919 generic.go:334] "Generic (PLEG): container finished" podID="91a933f1-aa44-4375-8f5c-e5f3567e6c8e" containerID="79e87bdd987eb81ea9f7ad47745afc59d0dd4ce7a69aa1af13ca054411b4739c" exitCode=0 Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.119157 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"91a933f1-aa44-4375-8f5c-e5f3567e6c8e","Type":"ContainerDied","Data":"79e87bdd987eb81ea9f7ad47745afc59d0dd4ce7a69aa1af13ca054411b4739c"} Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.124955 4919 generic.go:334] "Generic (PLEG): container finished" podID="515105ef-e538-4276-b682-7e05881dc7e8" containerID="6998100dd91ae8a0c4934a4b8c43b07c72cd35d8be7e9f5c1635f9179079c2ed" exitCode=0 Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.125048 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"515105ef-e538-4276-b682-7e05881dc7e8","Type":"ContainerDied","Data":"6998100dd91ae8a0c4934a4b8c43b07c72cd35d8be7e9f5c1635f9179079c2ed"} Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.126563 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-9e8c-account-create-update-6q4j2" event={"ID":"a01f4397-9fee-4ff4-af76-ed0b37f04b28","Type":"ContainerDied","Data":"1744a839035086036e24ae77885b6783bbfb67bf0d0fade6f10eed98757e44fe"} Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.126584 4919 pod_container_deletor.go:80] "Container not 
found in pod's containers" containerID="1744a839035086036e24ae77885b6783bbfb67bf0d0fade6f10eed98757e44fe" Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.128403 4919 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="openstack/root-account-create-update-xlzz4" secret="" err="secret \"galera-openstack-dockercfg-nbkzl\" not found" Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.128467 4919 scope.go:117] "RemoveContainer" containerID="3f184ab546adc93a6838bca186ee1a27fb82249fb7be1e6ac56b84aa0dcb13c5" Mar 10 22:15:48 crc kubenswrapper[4919]: E0310 22:15:48.129256 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CrashLoopBackOff: \"back-off 10s restarting failed container=mariadb-account-create-update pod=root-account-create-update-xlzz4_openstack(76a514a0-0d4c-4f6b-8ba7-cd5b4834d625)\"" pod="openstack/root-account-create-update-xlzz4" podUID="76a514a0-0d4c-4f6b-8ba7-cd5b4834d625" Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.129880 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-45c6-account-create-update-zczhm" event={"ID":"1c34af84-e5e2-4219-b7b2-bf1e2c2a731b","Type":"ContainerDied","Data":"f277710fbf5ffb809d7f8e3ce03b09ed5a04584573c19e7fcd445716cb202ab0"} Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.129903 4919 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f277710fbf5ffb809d7f8e3ce03b09ed5a04584573c19e7fcd445716cb202ab0" Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.131677 4919 generic.go:334] "Generic (PLEG): container finished" podID="31690f34-6b68-4470-a13e-e16121ec25d2" containerID="800bb678ea7e53aea235c1816dd3b24e1d5cc3ca3910d7d45b290926f9b56fcd" exitCode=0 Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.131725 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/barbican-keystone-listener-fd8f54c58-gtj5m" event={"ID":"31690f34-6b68-4470-a13e-e16121ec25d2","Type":"ContainerDied","Data":"800bb678ea7e53aea235c1816dd3b24e1d5cc3ca3910d7d45b290926f9b56fcd"} Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.134200 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-854d8d6bf4-kknjq" event={"ID":"981bb03c-23be-4bf8-a9f6-cb8a552f66a5","Type":"ContainerDied","Data":"cf163b74a00b3fc9c5f8e802c09f733ed4f45ba0e95aadd586f4fd561306b85d"} Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.134221 4919 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cf163b74a00b3fc9c5f8e802c09f733ed4f45ba0e95aadd586f4fd561306b85d" Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.140335 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"7bfe342f-267a-4239-a9cc-8df0e3d14a92","Type":"ContainerDied","Data":"5d4c7ff057178d1d7de6ad2d49ecc6b3b14b932f465cc80369c181b155bfb8be"} Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.140362 4919 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5d4c7ff057178d1d7de6ad2d49ecc6b3b14b932f465cc80369c181b155bfb8be" Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.142381 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.147727 4919 generic.go:334] "Generic (PLEG): container finished" podID="a4a88061-cba8-4535-bf01-5285d8cbb79f" containerID="fb28d2d8eb9c98873f08b6f1830499a051d20df33a610f4a9e6624fa224475b0" exitCode=0 Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.147816 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"a4a88061-cba8-4535-bf01-5285d8cbb79f","Type":"ContainerDied","Data":"fb28d2d8eb9c98873f08b6f1830499a051d20df33a610f4a9e6624fa224475b0"} Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.160766 4919 generic.go:334] "Generic (PLEG): container finished" podID="4865c8ed-670d-41a0-b9fc-ba7697085e6b" containerID="df8f79c23e11b14d9212f9cd7c7b374f297dc4c6a1b8f62a2988cc7af5ea3b27" exitCode=0 Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.160819 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"4865c8ed-670d-41a0-b9fc-ba7697085e6b","Type":"ContainerDied","Data":"df8f79c23e11b14d9212f9cd7c7b374f297dc4c6a1b8f62a2988cc7af5ea3b27"} Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.161278 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-854d8d6bf4-kknjq" Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.165073 4919 generic.go:334] "Generic (PLEG): container finished" podID="81489e39-0246-4065-8835-31b1e5da8431" containerID="0d25c07eec1b4976670c75603fd5da5476a97bed8734c71857c7dda9c1fa75bb" exitCode=0 Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.165136 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"81489e39-0246-4065-8835-31b1e5da8431","Type":"ContainerDied","Data":"0d25c07eec1b4976670c75603fd5da5476a97bed8734c71857c7dda9c1fa75bb"} Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.165165 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"81489e39-0246-4065-8835-31b1e5da8431","Type":"ContainerDied","Data":"24bef484547c2d8e19b8a71bb9c628aac6dba8274e3bc46da702bf218bca2791"} Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.165177 4919 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="24bef484547c2d8e19b8a71bb9c628aac6dba8274e3bc46da702bf218bca2791" Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.169141 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-45c6-account-create-update-zczhm" Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.174842 4919 generic.go:334] "Generic (PLEG): container finished" podID="ab479995-b87a-46b8-9a4e-d9e95d556775" containerID="22265554a653026f7008b3a597b22efe9ebe95b2013255f792f08efd3682fc62" exitCode=0 Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.174933 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ab479995-b87a-46b8-9a4e-d9e95d556775","Type":"ContainerDied","Data":"22265554a653026f7008b3a597b22efe9ebe95b2013255f792f08efd3682fc62"} Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.176340 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-ea40-account-create-update-7bp9g" Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.177078 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bbb7-account-create-update-pmxgp" Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.178381 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-9e8c-account-create-update-6q4j2" Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.190021 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.271962 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bbb7-account-create-update-pmxgp" Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.277305 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.306991 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/981bb03c-23be-4bf8-a9f6-cb8a552f66a5-logs\") pod \"981bb03c-23be-4bf8-a9f6-cb8a552f66a5\" (UID: \"981bb03c-23be-4bf8-a9f6-cb8a552f66a5\") " Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.307043 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1c34af84-e5e2-4219-b7b2-bf1e2c2a731b-operator-scripts\") pod \"1c34af84-e5e2-4219-b7b2-bf1e2c2a731b\" (UID: \"1c34af84-e5e2-4219-b7b2-bf1e2c2a731b\") " Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.307084 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/981bb03c-23be-4bf8-a9f6-cb8a552f66a5-config-data\") pod \"981bb03c-23be-4bf8-a9f6-cb8a552f66a5\" (UID: \"981bb03c-23be-4bf8-a9f6-cb8a552f66a5\") " Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.307105 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-czfj2\" (UniqueName: \"kubernetes.io/projected/81489e39-0246-4065-8835-31b1e5da8431-kube-api-access-czfj2\") pod \"81489e39-0246-4065-8835-31b1e5da8431\" (UID: \"81489e39-0246-4065-8835-31b1e5da8431\") " Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.307131 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/81489e39-0246-4065-8835-31b1e5da8431-nova-metadata-tls-certs\") pod \"81489e39-0246-4065-8835-31b1e5da8431\" (UID: \"81489e39-0246-4065-8835-31b1e5da8431\") " Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.307148 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tc4lr\" 
(UniqueName: \"kubernetes.io/projected/a01f4397-9fee-4ff4-af76-ed0b37f04b28-kube-api-access-tc4lr\") pod \"a01f4397-9fee-4ff4-af76-ed0b37f04b28\" (UID: \"a01f4397-9fee-4ff4-af76-ed0b37f04b28\") " Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.307203 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81489e39-0246-4065-8835-31b1e5da8431-config-data\") pod \"81489e39-0246-4065-8835-31b1e5da8431\" (UID: \"81489e39-0246-4065-8835-31b1e5da8431\") " Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.307244 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w27nx\" (UniqueName: \"kubernetes.io/projected/7bfe342f-267a-4239-a9cc-8df0e3d14a92-kube-api-access-w27nx\") pod \"7bfe342f-267a-4239-a9cc-8df0e3d14a92\" (UID: \"7bfe342f-267a-4239-a9cc-8df0e3d14a92\") " Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.307268 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7bfe342f-267a-4239-a9cc-8df0e3d14a92-combined-ca-bundle\") pod \"7bfe342f-267a-4239-a9cc-8df0e3d14a92\" (UID: \"7bfe342f-267a-4239-a9cc-8df0e3d14a92\") " Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.307303 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-md4gv\" (UniqueName: \"kubernetes.io/projected/981bb03c-23be-4bf8-a9f6-cb8a552f66a5-kube-api-access-md4gv\") pod \"981bb03c-23be-4bf8-a9f6-cb8a552f66a5\" (UID: \"981bb03c-23be-4bf8-a9f6-cb8a552f66a5\") " Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.307332 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/7bfe342f-267a-4239-a9cc-8df0e3d14a92-kube-state-metrics-tls-certs\") pod \"7bfe342f-267a-4239-a9cc-8df0e3d14a92\" (UID: 
\"7bfe342f-267a-4239-a9cc-8df0e3d14a92\") " Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.307362 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/981bb03c-23be-4bf8-a9f6-cb8a552f66a5-internal-tls-certs\") pod \"981bb03c-23be-4bf8-a9f6-cb8a552f66a5\" (UID: \"981bb03c-23be-4bf8-a9f6-cb8a552f66a5\") " Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.307380 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/981bb03c-23be-4bf8-a9f6-cb8a552f66a5-scripts\") pod \"981bb03c-23be-4bf8-a9f6-cb8a552f66a5\" (UID: \"981bb03c-23be-4bf8-a9f6-cb8a552f66a5\") " Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.307420 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/981bb03c-23be-4bf8-a9f6-cb8a552f66a5-combined-ca-bundle\") pod \"981bb03c-23be-4bf8-a9f6-cb8a552f66a5\" (UID: \"981bb03c-23be-4bf8-a9f6-cb8a552f66a5\") " Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.307442 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/7bfe342f-267a-4239-a9cc-8df0e3d14a92-kube-state-metrics-tls-config\") pod \"7bfe342f-267a-4239-a9cc-8df0e3d14a92\" (UID: \"7bfe342f-267a-4239-a9cc-8df0e3d14a92\") " Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.307462 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/981bb03c-23be-4bf8-a9f6-cb8a552f66a5-public-tls-certs\") pod \"981bb03c-23be-4bf8-a9f6-cb8a552f66a5\" (UID: \"981bb03c-23be-4bf8-a9f6-cb8a552f66a5\") " Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.307508 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-kdhrj\" (UniqueName: \"kubernetes.io/projected/1c34af84-e5e2-4219-b7b2-bf1e2c2a731b-kube-api-access-kdhrj\") pod \"1c34af84-e5e2-4219-b7b2-bf1e2c2a731b\" (UID: \"1c34af84-e5e2-4219-b7b2-bf1e2c2a731b\") " Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.307526 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a01f4397-9fee-4ff4-af76-ed0b37f04b28-operator-scripts\") pod \"a01f4397-9fee-4ff4-af76-ed0b37f04b28\" (UID: \"a01f4397-9fee-4ff4-af76-ed0b37f04b28\") " Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.307553 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81489e39-0246-4065-8835-31b1e5da8431-combined-ca-bundle\") pod \"81489e39-0246-4065-8835-31b1e5da8431\" (UID: \"81489e39-0246-4065-8835-31b1e5da8431\") " Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.307572 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/81489e39-0246-4065-8835-31b1e5da8431-logs\") pod \"81489e39-0246-4065-8835-31b1e5da8431\" (UID: \"81489e39-0246-4065-8835-31b1e5da8431\") " Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.308358 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/81489e39-0246-4065-8835-31b1e5da8431-logs" (OuterVolumeSpecName: "logs") pod "81489e39-0246-4065-8835-31b1e5da8431" (UID: "81489e39-0246-4065-8835-31b1e5da8431"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.313474 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/981bb03c-23be-4bf8-a9f6-cb8a552f66a5-logs" (OuterVolumeSpecName: "logs") pod "981bb03c-23be-4bf8-a9f6-cb8a552f66a5" (UID: "981bb03c-23be-4bf8-a9f6-cb8a552f66a5"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.313932 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a01f4397-9fee-4ff4-af76-ed0b37f04b28-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a01f4397-9fee-4ff4-af76-ed0b37f04b28" (UID: "a01f4397-9fee-4ff4-af76-ed0b37f04b28"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.322887 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c34af84-e5e2-4219-b7b2-bf1e2c2a731b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1c34af84-e5e2-4219-b7b2-bf1e2c2a731b" (UID: "1c34af84-e5e2-4219-b7b2-bf1e2c2a731b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.323729 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/981bb03c-23be-4bf8-a9f6-cb8a552f66a5-scripts" (OuterVolumeSpecName: "scripts") pod "981bb03c-23be-4bf8-a9f6-cb8a552f66a5" (UID: "981bb03c-23be-4bf8-a9f6-cb8a552f66a5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.337944 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.345662 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81489e39-0246-4065-8835-31b1e5da8431-kube-api-access-czfj2" (OuterVolumeSpecName: "kube-api-access-czfj2") pod "81489e39-0246-4065-8835-31b1e5da8431" (UID: "81489e39-0246-4065-8835-31b1e5da8431"). InnerVolumeSpecName "kube-api-access-czfj2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.345721 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c34af84-e5e2-4219-b7b2-bf1e2c2a731b-kube-api-access-kdhrj" (OuterVolumeSpecName: "kube-api-access-kdhrj") pod "1c34af84-e5e2-4219-b7b2-bf1e2c2a731b" (UID: "1c34af84-e5e2-4219-b7b2-bf1e2c2a731b"). InnerVolumeSpecName "kube-api-access-kdhrj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.346551 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a01f4397-9fee-4ff4-af76-ed0b37f04b28-kube-api-access-tc4lr" (OuterVolumeSpecName: "kube-api-access-tc4lr") pod "a01f4397-9fee-4ff4-af76-ed0b37f04b28" (UID: "a01f4397-9fee-4ff4-af76-ed0b37f04b28"). InnerVolumeSpecName "kube-api-access-tc4lr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.353813 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bfe342f-267a-4239-a9cc-8df0e3d14a92-kube-api-access-w27nx" (OuterVolumeSpecName: "kube-api-access-w27nx") pod "7bfe342f-267a-4239-a9cc-8df0e3d14a92" (UID: "7bfe342f-267a-4239-a9cc-8df0e3d14a92"). InnerVolumeSpecName "kube-api-access-w27nx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.354153 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/981bb03c-23be-4bf8-a9f6-cb8a552f66a5-kube-api-access-md4gv" (OuterVolumeSpecName: "kube-api-access-md4gv") pod "981bb03c-23be-4bf8-a9f6-cb8a552f66a5" (UID: "981bb03c-23be-4bf8-a9f6-cb8a552f66a5"). InnerVolumeSpecName "kube-api-access-md4gv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.372985 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-ea40-account-create-update-7bp9g"] Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.380668 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-ea40-account-create-update-7bp9g"] Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.402167 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-fd8f54c58-gtj5m" Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.412512 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1f324194-64d5-4755-847b-f554b94e652c-run-httpd\") pod \"1f324194-64d5-4755-847b-f554b94e652c\" (UID: \"1f324194-64d5-4755-847b-f554b94e652c\") " Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.412596 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/1f324194-64d5-4755-847b-f554b94e652c-ceilometer-tls-certs\") pod \"1f324194-64d5-4755-847b-f554b94e652c\" (UID: \"1f324194-64d5-4755-847b-f554b94e652c\") " Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.412650 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/515105ef-e538-4276-b682-7e05881dc7e8-internal-tls-certs\") pod \"515105ef-e538-4276-b682-7e05881dc7e8\" (UID: \"515105ef-e538-4276-b682-7e05881dc7e8\") " Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.412684 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1f324194-64d5-4755-847b-f554b94e652c-sg-core-conf-yaml\") pod \"1f324194-64d5-4755-847b-f554b94e652c\" (UID: \"1f324194-64d5-4755-847b-f554b94e652c\") " Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.412702 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1f324194-64d5-4755-847b-f554b94e652c-log-httpd\") pod \"1f324194-64d5-4755-847b-f554b94e652c\" (UID: \"1f324194-64d5-4755-847b-f554b94e652c\") " Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.412743 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f324194-64d5-4755-847b-f554b94e652c-combined-ca-bundle\") pod \"1f324194-64d5-4755-847b-f554b94e652c\" (UID: \"1f324194-64d5-4755-847b-f554b94e652c\") " Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.412762 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/515105ef-e538-4276-b682-7e05881dc7e8-combined-ca-bundle\") pod \"515105ef-e538-4276-b682-7e05881dc7e8\" (UID: \"515105ef-e538-4276-b682-7e05881dc7e8\") " Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.412784 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f324194-64d5-4755-847b-f554b94e652c-config-data\") pod \"1f324194-64d5-4755-847b-f554b94e652c\" (UID: \"1f324194-64d5-4755-847b-f554b94e652c\") " Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 
22:15:48.412804 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/515105ef-e538-4276-b682-7e05881dc7e8-config-data\") pod \"515105ef-e538-4276-b682-7e05881dc7e8\" (UID: \"515105ef-e538-4276-b682-7e05881dc7e8\") " Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.412823 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/515105ef-e538-4276-b682-7e05881dc7e8-logs\") pod \"515105ef-e538-4276-b682-7e05881dc7e8\" (UID: \"515105ef-e538-4276-b682-7e05881dc7e8\") " Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.412853 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6xzsn\" (UniqueName: \"kubernetes.io/projected/515105ef-e538-4276-b682-7e05881dc7e8-kube-api-access-6xzsn\") pod \"515105ef-e538-4276-b682-7e05881dc7e8\" (UID: \"515105ef-e538-4276-b682-7e05881dc7e8\") " Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.412868 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1f324194-64d5-4755-847b-f554b94e652c-scripts\") pod \"1f324194-64d5-4755-847b-f554b94e652c\" (UID: \"1f324194-64d5-4755-847b-f554b94e652c\") " Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.412933 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7bq6r\" (UniqueName: \"kubernetes.io/projected/1f324194-64d5-4755-847b-f554b94e652c-kube-api-access-7bq6r\") pod \"1f324194-64d5-4755-847b-f554b94e652c\" (UID: \"1f324194-64d5-4755-847b-f554b94e652c\") " Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.412966 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/515105ef-e538-4276-b682-7e05881dc7e8-public-tls-certs\") pod 
\"515105ef-e538-4276-b682-7e05881dc7e8\" (UID: \"515105ef-e538-4276-b682-7e05881dc7e8\") " Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.413339 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-czfj2\" (UniqueName: \"kubernetes.io/projected/81489e39-0246-4065-8835-31b1e5da8431-kube-api-access-czfj2\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.413364 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tc4lr\" (UniqueName: \"kubernetes.io/projected/a01f4397-9fee-4ff4-af76-ed0b37f04b28-kube-api-access-tc4lr\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.413376 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w27nx\" (UniqueName: \"kubernetes.io/projected/7bfe342f-267a-4239-a9cc-8df0e3d14a92-kube-api-access-w27nx\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.413400 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-md4gv\" (UniqueName: \"kubernetes.io/projected/981bb03c-23be-4bf8-a9f6-cb8a552f66a5-kube-api-access-md4gv\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.413409 4919 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/981bb03c-23be-4bf8-a9f6-cb8a552f66a5-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.413421 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kdhrj\" (UniqueName: \"kubernetes.io/projected/1c34af84-e5e2-4219-b7b2-bf1e2c2a731b-kube-api-access-kdhrj\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.413429 4919 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a01f4397-9fee-4ff4-af76-ed0b37f04b28-operator-scripts\") on node 
\"crc\" DevicePath \"\"" Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.413438 4919 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/81489e39-0246-4065-8835-31b1e5da8431-logs\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.413447 4919 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/981bb03c-23be-4bf8-a9f6-cb8a552f66a5-logs\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.413455 4919 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1c34af84-e5e2-4219-b7b2-bf1e2c2a731b-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.413629 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/515105ef-e538-4276-b682-7e05881dc7e8-logs" (OuterVolumeSpecName: "logs") pod "515105ef-e538-4276-b682-7e05881dc7e8" (UID: "515105ef-e538-4276-b682-7e05881dc7e8"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.417059 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.420438 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1f324194-64d5-4755-847b-f554b94e652c-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "1f324194-64d5-4755-847b-f554b94e652c" (UID: "1f324194-64d5-4755-847b-f554b94e652c"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.422527 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1f324194-64d5-4755-847b-f554b94e652c-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "1f324194-64d5-4755-847b-f554b94e652c" (UID: "1f324194-64d5-4755-847b-f554b94e652c"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.424011 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-75f54b97c6-fj5s7" Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.434516 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7bfe342f-267a-4239-a9cc-8df0e3d14a92-kube-state-metrics-tls-config" (OuterVolumeSpecName: "kube-state-metrics-tls-config") pod "7bfe342f-267a-4239-a9cc-8df0e3d14a92" (UID: "7bfe342f-267a-4239-a9cc-8df0e3d14a92"). InnerVolumeSpecName "kube-state-metrics-tls-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.467585 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f324194-64d5-4755-847b-f554b94e652c-scripts" (OuterVolumeSpecName: "scripts") pod "1f324194-64d5-4755-847b-f554b94e652c" (UID: "1f324194-64d5-4755-847b-f554b94e652c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.467789 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/515105ef-e538-4276-b682-7e05881dc7e8-kube-api-access-6xzsn" (OuterVolumeSpecName: "kube-api-access-6xzsn") pod "515105ef-e538-4276-b682-7e05881dc7e8" (UID: "515105ef-e538-4276-b682-7e05881dc7e8"). InnerVolumeSpecName "kube-api-access-6xzsn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.468651 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f324194-64d5-4755-847b-f554b94e652c-kube-api-access-7bq6r" (OuterVolumeSpecName: "kube-api-access-7bq6r") pod "1f324194-64d5-4755-847b-f554b94e652c" (UID: "1f324194-64d5-4755-847b-f554b94e652c"). InnerVolumeSpecName "kube-api-access-7bq6r". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.522205 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28d81dfb-640f-4748-ab70-e0b393e1e595-combined-ca-bundle\") pod \"28d81dfb-640f-4748-ab70-e0b393e1e595\" (UID: \"28d81dfb-640f-4748-ab70-e0b393e1e595\") " Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.522283 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/28d81dfb-640f-4748-ab70-e0b393e1e595-logs\") pod \"28d81dfb-640f-4748-ab70-e0b393e1e595\" (UID: \"28d81dfb-640f-4748-ab70-e0b393e1e595\") " Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.522325 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31690f34-6b68-4470-a13e-e16121ec25d2-config-data\") pod \"31690f34-6b68-4470-a13e-e16121ec25d2\" (UID: \"31690f34-6b68-4470-a13e-e16121ec25d2\") " Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.522360 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4t2rj\" (UniqueName: \"kubernetes.io/projected/91a933f1-aa44-4375-8f5c-e5f3567e6c8e-kube-api-access-4t2rj\") pod \"91a933f1-aa44-4375-8f5c-e5f3567e6c8e\" (UID: \"91a933f1-aa44-4375-8f5c-e5f3567e6c8e\") " Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.522401 4919 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/28d81dfb-640f-4748-ab70-e0b393e1e595-config-data-custom\") pod \"28d81dfb-640f-4748-ab70-e0b393e1e595\" (UID: \"28d81dfb-640f-4748-ab70-e0b393e1e595\") " Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.522493 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/28d81dfb-640f-4748-ab70-e0b393e1e595-internal-tls-certs\") pod \"28d81dfb-640f-4748-ab70-e0b393e1e595\" (UID: \"28d81dfb-640f-4748-ab70-e0b393e1e595\") " Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.522519 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31690f34-6b68-4470-a13e-e16121ec25d2-combined-ca-bundle\") pod \"31690f34-6b68-4470-a13e-e16121ec25d2\" (UID: \"31690f34-6b68-4470-a13e-e16121ec25d2\") " Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.522539 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/91a933f1-aa44-4375-8f5c-e5f3567e6c8e-public-tls-certs\") pod \"91a933f1-aa44-4375-8f5c-e5f3567e6c8e\" (UID: \"91a933f1-aa44-4375-8f5c-e5f3567e6c8e\") " Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.522645 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7f8bc\" (UniqueName: \"kubernetes.io/projected/28d81dfb-640f-4748-ab70-e0b393e1e595-kube-api-access-7f8bc\") pod \"28d81dfb-640f-4748-ab70-e0b393e1e595\" (UID: \"28d81dfb-640f-4748-ab70-e0b393e1e595\") " Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.522703 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/31690f34-6b68-4470-a13e-e16121ec25d2-config-data-custom\") pod 
\"31690f34-6b68-4470-a13e-e16121ec25d2\" (UID: \"31690f34-6b68-4470-a13e-e16121ec25d2\") " Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.522732 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-25lf7\" (UniqueName: \"kubernetes.io/projected/31690f34-6b68-4470-a13e-e16121ec25d2-kube-api-access-25lf7\") pod \"31690f34-6b68-4470-a13e-e16121ec25d2\" (UID: \"31690f34-6b68-4470-a13e-e16121ec25d2\") " Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.522746 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91a933f1-aa44-4375-8f5c-e5f3567e6c8e-config-data\") pod \"91a933f1-aa44-4375-8f5c-e5f3567e6c8e\" (UID: \"91a933f1-aa44-4375-8f5c-e5f3567e6c8e\") " Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.522762 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28d81dfb-640f-4748-ab70-e0b393e1e595-config-data\") pod \"28d81dfb-640f-4748-ab70-e0b393e1e595\" (UID: \"28d81dfb-640f-4748-ab70-e0b393e1e595\") " Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.522779 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"91a933f1-aa44-4375-8f5c-e5f3567e6c8e\" (UID: \"91a933f1-aa44-4375-8f5c-e5f3567e6c8e\") " Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.522797 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/31690f34-6b68-4470-a13e-e16121ec25d2-logs\") pod \"31690f34-6b68-4470-a13e-e16121ec25d2\" (UID: \"31690f34-6b68-4470-a13e-e16121ec25d2\") " Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.522815 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/28d81dfb-640f-4748-ab70-e0b393e1e595-public-tls-certs\") pod \"28d81dfb-640f-4748-ab70-e0b393e1e595\" (UID: \"28d81dfb-640f-4748-ab70-e0b393e1e595\") " Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.522832 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91a933f1-aa44-4375-8f5c-e5f3567e6c8e-combined-ca-bundle\") pod \"91a933f1-aa44-4375-8f5c-e5f3567e6c8e\" (UID: \"91a933f1-aa44-4375-8f5c-e5f3567e6c8e\") " Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.522854 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/91a933f1-aa44-4375-8f5c-e5f3567e6c8e-httpd-run\") pod \"91a933f1-aa44-4375-8f5c-e5f3567e6c8e\" (UID: \"91a933f1-aa44-4375-8f5c-e5f3567e6c8e\") " Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.522877 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/91a933f1-aa44-4375-8f5c-e5f3567e6c8e-logs\") pod \"91a933f1-aa44-4375-8f5c-e5f3567e6c8e\" (UID: \"91a933f1-aa44-4375-8f5c-e5f3567e6c8e\") " Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.522903 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/91a933f1-aa44-4375-8f5c-e5f3567e6c8e-scripts\") pod \"91a933f1-aa44-4375-8f5c-e5f3567e6c8e\" (UID: \"91a933f1-aa44-4375-8f5c-e5f3567e6c8e\") " Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.523512 4919 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1f324194-64d5-4755-847b-f554b94e652c-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.523524 4919 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/515105ef-e538-4276-b682-7e05881dc7e8-logs\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.523533 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6xzsn\" (UniqueName: \"kubernetes.io/projected/515105ef-e538-4276-b682-7e05881dc7e8-kube-api-access-6xzsn\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.523543 4919 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1f324194-64d5-4755-847b-f554b94e652c-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.523552 4919 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/7bfe342f-267a-4239-a9cc-8df0e3d14a92-kube-state-metrics-tls-config\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.523560 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7bq6r\" (UniqueName: \"kubernetes.io/projected/1f324194-64d5-4755-847b-f554b94e652c-kube-api-access-7bq6r\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.523569 4919 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1f324194-64d5-4755-847b-f554b94e652c-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:48 crc kubenswrapper[4919]: E0310 22:15:48.523620 4919 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Mar 10 22:15:48 crc kubenswrapper[4919]: E0310 22:15:48.523662 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/76a514a0-0d4c-4f6b-8ba7-cd5b4834d625-operator-scripts podName:76a514a0-0d4c-4f6b-8ba7-cd5b4834d625 nodeName:}" failed. 
No retries permitted until 2026-03-10 22:15:50.52364762 +0000 UTC m=+1537.765528228 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/76a514a0-0d4c-4f6b-8ba7-cd5b4834d625-operator-scripts") pod "root-account-create-update-xlzz4" (UID: "76a514a0-0d4c-4f6b-8ba7-cd5b4834d625") : configmap "openstack-scripts" not found Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.525598 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/91a933f1-aa44-4375-8f5c-e5f3567e6c8e-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "91a933f1-aa44-4375-8f5c-e5f3567e6c8e" (UID: "91a933f1-aa44-4375-8f5c-e5f3567e6c8e"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.527271 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/91a933f1-aa44-4375-8f5c-e5f3567e6c8e-logs" (OuterVolumeSpecName: "logs") pod "91a933f1-aa44-4375-8f5c-e5f3567e6c8e" (UID: "91a933f1-aa44-4375-8f5c-e5f3567e6c8e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.528117 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/31690f34-6b68-4470-a13e-e16121ec25d2-logs" (OuterVolumeSpecName: "logs") pod "31690f34-6b68-4470-a13e-e16121ec25d2" (UID: "31690f34-6b68-4470-a13e-e16121ec25d2"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.528644 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7bfe342f-267a-4239-a9cc-8df0e3d14a92-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7bfe342f-267a-4239-a9cc-8df0e3d14a92" (UID: "7bfe342f-267a-4239-a9cc-8df0e3d14a92"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.531246 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91a933f1-aa44-4375-8f5c-e5f3567e6c8e-kube-api-access-4t2rj" (OuterVolumeSpecName: "kube-api-access-4t2rj") pod "91a933f1-aa44-4375-8f5c-e5f3567e6c8e" (UID: "91a933f1-aa44-4375-8f5c-e5f3567e6c8e"). InnerVolumeSpecName "kube-api-access-4t2rj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.534241 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/28d81dfb-640f-4748-ab70-e0b393e1e595-logs" (OuterVolumeSpecName: "logs") pod "28d81dfb-640f-4748-ab70-e0b393e1e595" (UID: "28d81dfb-640f-4748-ab70-e0b393e1e595"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.556958 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91a933f1-aa44-4375-8f5c-e5f3567e6c8e-scripts" (OuterVolumeSpecName: "scripts") pod "91a933f1-aa44-4375-8f5c-e5f3567e6c8e" (UID: "91a933f1-aa44-4375-8f5c-e5f3567e6c8e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.567178 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "glance") pod "91a933f1-aa44-4375-8f5c-e5f3567e6c8e" (UID: "91a933f1-aa44-4375-8f5c-e5f3567e6c8e"). InnerVolumeSpecName "local-storage10-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.589206 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28d81dfb-640f-4748-ab70-e0b393e1e595-kube-api-access-7f8bc" (OuterVolumeSpecName: "kube-api-access-7f8bc") pod "28d81dfb-640f-4748-ab70-e0b393e1e595" (UID: "28d81dfb-640f-4748-ab70-e0b393e1e595"). InnerVolumeSpecName "kube-api-access-7f8bc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.589356 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31690f34-6b68-4470-a13e-e16121ec25d2-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "31690f34-6b68-4470-a13e-e16121ec25d2" (UID: "31690f34-6b68-4470-a13e-e16121ec25d2"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.589612 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28d81dfb-640f-4748-ab70-e0b393e1e595-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "28d81dfb-640f-4748-ab70-e0b393e1e595" (UID: "28d81dfb-640f-4748-ab70-e0b393e1e595"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.593305 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/515105ef-e538-4276-b682-7e05881dc7e8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "515105ef-e538-4276-b682-7e05881dc7e8" (UID: "515105ef-e538-4276-b682-7e05881dc7e8"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.597131 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31690f34-6b68-4470-a13e-e16121ec25d2-kube-api-access-25lf7" (OuterVolumeSpecName: "kube-api-access-25lf7") pod "31690f34-6b68-4470-a13e-e16121ec25d2" (UID: "31690f34-6b68-4470-a13e-e16121ec25d2"). InnerVolumeSpecName "kube-api-access-25lf7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.631382 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jln88\" (UniqueName: \"kubernetes.io/projected/008d3aa2-636e-48bc-a09a-00541bc3bd5e-kube-api-access-jln88\") pod \"keystone-bbb7-account-create-update-pmxgp\" (UID: \"008d3aa2-636e-48bc-a09a-00541bc3bd5e\") " pod="openstack/keystone-bbb7-account-create-update-pmxgp" Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.631626 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/008d3aa2-636e-48bc-a09a-00541bc3bd5e-operator-scripts\") pod \"keystone-bbb7-account-create-update-pmxgp\" (UID: \"008d3aa2-636e-48bc-a09a-00541bc3bd5e\") " pod="openstack/keystone-bbb7-account-create-update-pmxgp" Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.631864 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7f8bc\" (UniqueName: \"kubernetes.io/projected/28d81dfb-640f-4748-ab70-e0b393e1e595-kube-api-access-7f8bc\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.631878 4919 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/31690f34-6b68-4470-a13e-e16121ec25d2-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.631887 4919 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-25lf7\" (UniqueName: \"kubernetes.io/projected/31690f34-6b68-4470-a13e-e16121ec25d2-kube-api-access-25lf7\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.631907 4919 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.631917 4919 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/31690f34-6b68-4470-a13e-e16121ec25d2-logs\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.631925 4919 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/91a933f1-aa44-4375-8f5c-e5f3567e6c8e-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.631936 4919 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/91a933f1-aa44-4375-8f5c-e5f3567e6c8e-logs\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.631946 4919 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/91a933f1-aa44-4375-8f5c-e5f3567e6c8e-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.631957 4919 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/28d81dfb-640f-4748-ab70-e0b393e1e595-logs\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.631967 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4t2rj\" (UniqueName: \"kubernetes.io/projected/91a933f1-aa44-4375-8f5c-e5f3567e6c8e-kube-api-access-4t2rj\") on node \"crc\" DevicePath 
\"\"" Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.631976 4919 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/28d81dfb-640f-4748-ab70-e0b393e1e595-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.631985 4919 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7bfe342f-267a-4239-a9cc-8df0e3d14a92-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.631994 4919 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/515105ef-e538-4276-b682-7e05881dc7e8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:48 crc kubenswrapper[4919]: E0310 22:15:48.634433 4919 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Mar 10 22:15:48 crc kubenswrapper[4919]: E0310 22:15:48.634508 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/008d3aa2-636e-48bc-a09a-00541bc3bd5e-operator-scripts podName:008d3aa2-636e-48bc-a09a-00541bc3bd5e nodeName:}" failed. No retries permitted until 2026-03-10 22:15:50.634492993 +0000 UTC m=+1537.876373601 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/008d3aa2-636e-48bc-a09a-00541bc3bd5e-operator-scripts") pod "keystone-bbb7-account-create-update-pmxgp" (UID: "008d3aa2-636e-48bc-a09a-00541bc3bd5e") : configmap "openstack-scripts" not found Mar 10 22:15:48 crc kubenswrapper[4919]: E0310 22:15:48.636709 4919 projected.go:194] Error preparing data for projected volume kube-api-access-jln88 for pod openstack/keystone-bbb7-account-create-update-pmxgp: failed to fetch token: serviceaccounts "galera-openstack" not found Mar 10 22:15:48 crc kubenswrapper[4919]: E0310 22:15:48.636869 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/008d3aa2-636e-48bc-a09a-00541bc3bd5e-kube-api-access-jln88 podName:008d3aa2-636e-48bc-a09a-00541bc3bd5e nodeName:}" failed. No retries permitted until 2026-03-10 22:15:50.636846116 +0000 UTC m=+1537.878726794 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-jln88" (UniqueName: "kubernetes.io/projected/008d3aa2-636e-48bc-a09a-00541bc3bd5e-kube-api-access-jln88") pod "keystone-bbb7-account-create-update-pmxgp" (UID: "008d3aa2-636e-48bc-a09a-00541bc3bd5e") : failed to fetch token: serviceaccounts "galera-openstack" not found Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.655733 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81489e39-0246-4065-8835-31b1e5da8431-config-data" (OuterVolumeSpecName: "config-data") pod "81489e39-0246-4065-8835-31b1e5da8431" (UID: "81489e39-0246-4065-8835-31b1e5da8431"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.697772 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/515105ef-e538-4276-b682-7e05881dc7e8-config-data" (OuterVolumeSpecName: "config-data") pod "515105ef-e538-4276-b682-7e05881dc7e8" (UID: "515105ef-e538-4276-b682-7e05881dc7e8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.699439 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/981bb03c-23be-4bf8-a9f6-cb8a552f66a5-config-data" (OuterVolumeSpecName: "config-data") pod "981bb03c-23be-4bf8-a9f6-cb8a552f66a5" (UID: "981bb03c-23be-4bf8-a9f6-cb8a552f66a5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.700005 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31690f34-6b68-4470-a13e-e16121ec25d2-config-data" (OuterVolumeSpecName: "config-data") pod "31690f34-6b68-4470-a13e-e16121ec25d2" (UID: "31690f34-6b68-4470-a13e-e16121ec25d2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.737894 4919 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31690f34-6b68-4470-a13e-e16121ec25d2-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.737926 4919 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81489e39-0246-4065-8835-31b1e5da8431-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.737935 4919 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/515105ef-e538-4276-b682-7e05881dc7e8-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.737945 4919 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/981bb03c-23be-4bf8-a9f6-cb8a552f66a5-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:48 crc kubenswrapper[4919]: E0310 22:15:48.740573 4919 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9a7f54f0ad1bc99653d56471ca107558d95729ca0a75e6040163ca4e8d3452b4" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 10 22:15:48 crc kubenswrapper[4919]: E0310 22:15:48.741881 4919 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9a7f54f0ad1bc99653d56471ca107558d95729ca0a75e6040163ca4e8d3452b4" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 10 22:15:48 crc kubenswrapper[4919]: E0310 22:15:48.744154 4919 log.go:32] "ExecSync cmd from runtime service failed" 
err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9a7f54f0ad1bc99653d56471ca107558d95729ca0a75e6040163ca4e8d3452b4" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 10 22:15:48 crc kubenswrapper[4919]: E0310 22:15:48.744197 4919 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="4a7ad3ed-9144-4a21-808c-23d613354a2f" containerName="nova-cell0-conductor-conductor" Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.764422 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28d81dfb-640f-4748-ab70-e0b393e1e595-config-data" (OuterVolumeSpecName: "config-data") pod "28d81dfb-640f-4748-ab70-e0b393e1e595" (UID: "28d81dfb-640f-4748-ab70-e0b393e1e595"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.791164 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91a933f1-aa44-4375-8f5c-e5f3567e6c8e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "91a933f1-aa44-4375-8f5c-e5f3567e6c8e" (UID: "91a933f1-aa44-4375-8f5c-e5f3567e6c8e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.795320 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/981bb03c-23be-4bf8-a9f6-cb8a552f66a5-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "981bb03c-23be-4bf8-a9f6-cb8a552f66a5" (UID: "981bb03c-23be-4bf8-a9f6-cb8a552f66a5"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.800691 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f324194-64d5-4755-847b-f554b94e652c-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "1f324194-64d5-4755-847b-f554b94e652c" (UID: "1f324194-64d5-4755-847b-f554b94e652c"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:15:48 crc kubenswrapper[4919]: E0310 22:15:48.815442 4919 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8bc20ce51b10a668d26fcfd7ec96ed2a288dcdabfde25fcc336eb1a622b6f4e5" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 10 22:15:48 crc kubenswrapper[4919]: E0310 22:15:48.817019 4919 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8bc20ce51b10a668d26fcfd7ec96ed2a288dcdabfde25fcc336eb1a622b6f4e5" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 10 22:15:48 crc kubenswrapper[4919]: E0310 22:15:48.823425 4919 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8bc20ce51b10a668d26fcfd7ec96ed2a288dcdabfde25fcc336eb1a622b6f4e5" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 10 22:15:48 crc kubenswrapper[4919]: E0310 22:15:48.823480 4919 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" 
podUID="9700fb27-6a74-428d-a2e6-71c237b3e054" containerName="nova-scheduler-scheduler" Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.824547 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81489e39-0246-4065-8835-31b1e5da8431-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "81489e39-0246-4065-8835-31b1e5da8431" (UID: "81489e39-0246-4065-8835-31b1e5da8431"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.839826 4919 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/1f324194-64d5-4755-847b-f554b94e652c-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.839849 4919 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/81489e39-0246-4065-8835-31b1e5da8431-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.839862 4919 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/981bb03c-23be-4bf8-a9f6-cb8a552f66a5-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.839872 4919 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28d81dfb-640f-4748-ab70-e0b393e1e595-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.839881 4919 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91a933f1-aa44-4375-8f5c-e5f3567e6c8e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.843177 4919 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f324194-64d5-4755-847b-f554b94e652c-config-data" (OuterVolumeSpecName: "config-data") pod "1f324194-64d5-4755-847b-f554b94e652c" (UID: "1f324194-64d5-4755-847b-f554b94e652c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.852328 4919 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.862508 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28d81dfb-640f-4748-ab70-e0b393e1e595-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "28d81dfb-640f-4748-ab70-e0b393e1e595" (UID: "28d81dfb-640f-4748-ab70-e0b393e1e595"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.863295 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28d81dfb-640f-4748-ab70-e0b393e1e595-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "28d81dfb-640f-4748-ab70-e0b393e1e595" (UID: "28d81dfb-640f-4748-ab70-e0b393e1e595"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.863533 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7bfe342f-267a-4239-a9cc-8df0e3d14a92-kube-state-metrics-tls-certs" (OuterVolumeSpecName: "kube-state-metrics-tls-certs") pod "7bfe342f-267a-4239-a9cc-8df0e3d14a92" (UID: "7bfe342f-267a-4239-a9cc-8df0e3d14a92"). InnerVolumeSpecName "kube-state-metrics-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.869840 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.872370 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f324194-64d5-4755-847b-f554b94e652c-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "1f324194-64d5-4755-847b-f554b94e652c" (UID: "1f324194-64d5-4755-847b-f554b94e652c"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.877166 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.880525 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31690f34-6b68-4470-a13e-e16121ec25d2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "31690f34-6b68-4470-a13e-e16121ec25d2" (UID: "31690f34-6b68-4470-a13e-e16121ec25d2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.881566 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81489e39-0246-4065-8835-31b1e5da8431-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "81489e39-0246-4065-8835-31b1e5da8431" (UID: "81489e39-0246-4065-8835-31b1e5da8431"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.883713 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/515105ef-e538-4276-b682-7e05881dc7e8-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "515105ef-e538-4276-b682-7e05881dc7e8" (UID: "515105ef-e538-4276-b682-7e05881dc7e8"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.893746 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.903351 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91a933f1-aa44-4375-8f5c-e5f3567e6c8e-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "91a933f1-aa44-4375-8f5c-e5f3567e6c8e" (UID: "91a933f1-aa44-4375-8f5c-e5f3567e6c8e"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.903958 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f324194-64d5-4755-847b-f554b94e652c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1f324194-64d5-4755-847b-f554b94e652c" (UID: "1f324194-64d5-4755-847b-f554b94e652c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.908564 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/981bb03c-23be-4bf8-a9f6-cb8a552f66a5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "981bb03c-23be-4bf8-a9f6-cb8a552f66a5" (UID: "981bb03c-23be-4bf8-a9f6-cb8a552f66a5"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.910029 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91a933f1-aa44-4375-8f5c-e5f3567e6c8e-config-data" (OuterVolumeSpecName: "config-data") pod "91a933f1-aa44-4375-8f5c-e5f3567e6c8e" (UID: "91a933f1-aa44-4375-8f5c-e5f3567e6c8e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.918938 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28d81dfb-640f-4748-ab70-e0b393e1e595-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "28d81dfb-640f-4748-ab70-e0b393e1e595" (UID: "28d81dfb-640f-4748-ab70-e0b393e1e595"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.942900 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8msqp\" (UniqueName: \"kubernetes.io/projected/ab479995-b87a-46b8-9a4e-d9e95d556775-kube-api-access-8msqp\") pod \"ab479995-b87a-46b8-9a4e-d9e95d556775\" (UID: \"ab479995-b87a-46b8-9a4e-d9e95d556775\") " Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.942952 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ab479995-b87a-46b8-9a4e-d9e95d556775-httpd-run\") pod \"ab479995-b87a-46b8-9a4e-d9e95d556775\" (UID: \"ab479995-b87a-46b8-9a4e-d9e95d556775\") " Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.942971 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4865c8ed-670d-41a0-b9fc-ba7697085e6b-etc-machine-id\") pod \"4865c8ed-670d-41a0-b9fc-ba7697085e6b\" (UID: \"4865c8ed-670d-41a0-b9fc-ba7697085e6b\") " Mar 10 
22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.942988 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ab479995-b87a-46b8-9a4e-d9e95d556775\" (UID: \"ab479995-b87a-46b8-9a4e-d9e95d556775\") " Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.943019 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4865c8ed-670d-41a0-b9fc-ba7697085e6b-scripts\") pod \"4865c8ed-670d-41a0-b9fc-ba7697085e6b\" (UID: \"4865c8ed-670d-41a0-b9fc-ba7697085e6b\") " Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.943040 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ab479995-b87a-46b8-9a4e-d9e95d556775-scripts\") pod \"ab479995-b87a-46b8-9a4e-d9e95d556775\" (UID: \"ab479995-b87a-46b8-9a4e-d9e95d556775\") " Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.943076 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5j46h\" (UniqueName: \"kubernetes.io/projected/4865c8ed-670d-41a0-b9fc-ba7697085e6b-kube-api-access-5j46h\") pod \"4865c8ed-670d-41a0-b9fc-ba7697085e6b\" (UID: \"4865c8ed-670d-41a0-b9fc-ba7697085e6b\") " Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.943092 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab479995-b87a-46b8-9a4e-d9e95d556775-config-data\") pod \"ab479995-b87a-46b8-9a4e-d9e95d556775\" (UID: \"ab479995-b87a-46b8-9a4e-d9e95d556775\") " Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.943110 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab479995-b87a-46b8-9a4e-d9e95d556775-internal-tls-certs\") pod 
\"ab479995-b87a-46b8-9a4e-d9e95d556775\" (UID: \"ab479995-b87a-46b8-9a4e-d9e95d556775\") " Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.943148 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab479995-b87a-46b8-9a4e-d9e95d556775-combined-ca-bundle\") pod \"ab479995-b87a-46b8-9a4e-d9e95d556775\" (UID: \"ab479995-b87a-46b8-9a4e-d9e95d556775\") " Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.943181 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4865c8ed-670d-41a0-b9fc-ba7697085e6b-public-tls-certs\") pod \"4865c8ed-670d-41a0-b9fc-ba7697085e6b\" (UID: \"4865c8ed-670d-41a0-b9fc-ba7697085e6b\") " Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.943207 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4865c8ed-670d-41a0-b9fc-ba7697085e6b-combined-ca-bundle\") pod \"4865c8ed-670d-41a0-b9fc-ba7697085e6b\" (UID: \"4865c8ed-670d-41a0-b9fc-ba7697085e6b\") " Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.944480 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4865c8ed-670d-41a0-b9fc-ba7697085e6b-logs\") pod \"4865c8ed-670d-41a0-b9fc-ba7697085e6b\" (UID: \"4865c8ed-670d-41a0-b9fc-ba7697085e6b\") " Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.944521 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ab479995-b87a-46b8-9a4e-d9e95d556775-logs\") pod \"ab479995-b87a-46b8-9a4e-d9e95d556775\" (UID: \"ab479995-b87a-46b8-9a4e-d9e95d556775\") " Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.944539 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4865c8ed-670d-41a0-b9fc-ba7697085e6b-config-data-custom\") pod \"4865c8ed-670d-41a0-b9fc-ba7697085e6b\" (UID: \"4865c8ed-670d-41a0-b9fc-ba7697085e6b\") " Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.944570 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4865c8ed-670d-41a0-b9fc-ba7697085e6b-config-data\") pod \"4865c8ed-670d-41a0-b9fc-ba7697085e6b\" (UID: \"4865c8ed-670d-41a0-b9fc-ba7697085e6b\") " Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.944585 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4865c8ed-670d-41a0-b9fc-ba7697085e6b-internal-tls-certs\") pod \"4865c8ed-670d-41a0-b9fc-ba7697085e6b\" (UID: \"4865c8ed-670d-41a0-b9fc-ba7697085e6b\") " Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.944649 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4865c8ed-670d-41a0-b9fc-ba7697085e6b-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "4865c8ed-670d-41a0-b9fc-ba7697085e6b" (UID: "4865c8ed-670d-41a0-b9fc-ba7697085e6b"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.944958 4919 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31690f34-6b68-4470-a13e-e16121ec25d2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.944974 4919 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/91a933f1-aa44-4375-8f5c-e5f3567e6c8e-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.944984 4919 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/7bfe342f-267a-4239-a9cc-8df0e3d14a92-kube-state-metrics-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.944994 4919 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/981bb03c-23be-4bf8-a9f6-cb8a552f66a5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.945002 4919 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/515105ef-e538-4276-b682-7e05881dc7e8-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.945011 4919 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81489e39-0246-4065-8835-31b1e5da8431-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.945019 4919 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91a933f1-aa44-4375-8f5c-e5f3567e6c8e-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:48 crc 
kubenswrapper[4919]: I0310 22:15:48.945027 4919 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.945035 4919 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4865c8ed-670d-41a0-b9fc-ba7697085e6b-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.945043 4919 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/28d81dfb-640f-4748-ab70-e0b393e1e595-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.945051 4919 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28d81dfb-640f-4748-ab70-e0b393e1e595-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.945061 4919 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1f324194-64d5-4755-847b-f554b94e652c-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.945069 4919 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f324194-64d5-4755-847b-f554b94e652c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.945078 4919 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f324194-64d5-4755-847b-f554b94e652c-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.945086 4919 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/28d81dfb-640f-4748-ab70-e0b393e1e595-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.945336 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ab479995-b87a-46b8-9a4e-d9e95d556775-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "ab479995-b87a-46b8-9a4e-d9e95d556775" (UID: "ab479995-b87a-46b8-9a4e-d9e95d556775"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.949784 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4865c8ed-670d-41a0-b9fc-ba7697085e6b-kube-api-access-5j46h" (OuterVolumeSpecName: "kube-api-access-5j46h") pod "4865c8ed-670d-41a0-b9fc-ba7697085e6b" (UID: "4865c8ed-670d-41a0-b9fc-ba7697085e6b"). InnerVolumeSpecName "kube-api-access-5j46h". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.949861 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4865c8ed-670d-41a0-b9fc-ba7697085e6b-logs" (OuterVolumeSpecName: "logs") pod "4865c8ed-670d-41a0-b9fc-ba7697085e6b" (UID: "4865c8ed-670d-41a0-b9fc-ba7697085e6b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.950369 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ab479995-b87a-46b8-9a4e-d9e95d556775-logs" (OuterVolumeSpecName: "logs") pod "ab479995-b87a-46b8-9a4e-d9e95d556775" (UID: "ab479995-b87a-46b8-9a4e-d9e95d556775"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.950781 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab479995-b87a-46b8-9a4e-d9e95d556775-kube-api-access-8msqp" (OuterVolumeSpecName: "kube-api-access-8msqp") pod "ab479995-b87a-46b8-9a4e-d9e95d556775" (UID: "ab479995-b87a-46b8-9a4e-d9e95d556775"). InnerVolumeSpecName "kube-api-access-8msqp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.951848 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4865c8ed-670d-41a0-b9fc-ba7697085e6b-scripts" (OuterVolumeSpecName: "scripts") pod "4865c8ed-670d-41a0-b9fc-ba7697085e6b" (UID: "4865c8ed-670d-41a0-b9fc-ba7697085e6b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.954364 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "glance") pod "ab479995-b87a-46b8-9a4e-d9e95d556775" (UID: "ab479995-b87a-46b8-9a4e-d9e95d556775"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.959580 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/981bb03c-23be-4bf8-a9f6-cb8a552f66a5-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "981bb03c-23be-4bf8-a9f6-cb8a552f66a5" (UID: "981bb03c-23be-4bf8-a9f6-cb8a552f66a5"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.960574 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab479995-b87a-46b8-9a4e-d9e95d556775-scripts" (OuterVolumeSpecName: "scripts") pod "ab479995-b87a-46b8-9a4e-d9e95d556775" (UID: "ab479995-b87a-46b8-9a4e-d9e95d556775"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.968014 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4865c8ed-670d-41a0-b9fc-ba7697085e6b-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "4865c8ed-670d-41a0-b9fc-ba7697085e6b" (UID: "4865c8ed-670d-41a0-b9fc-ba7697085e6b"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.979708 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab479995-b87a-46b8-9a4e-d9e95d556775-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ab479995-b87a-46b8-9a4e-d9e95d556775" (UID: "ab479995-b87a-46b8-9a4e-d9e95d556775"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.983367 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/515105ef-e538-4276-b682-7e05881dc7e8-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "515105ef-e538-4276-b682-7e05881dc7e8" (UID: "515105ef-e538-4276-b682-7e05881dc7e8"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:15:48 crc kubenswrapper[4919]: I0310 22:15:48.997323 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4865c8ed-670d-41a0-b9fc-ba7697085e6b-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "4865c8ed-670d-41a0-b9fc-ba7697085e6b" (UID: "4865c8ed-670d-41a0-b9fc-ba7697085e6b"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:15:49 crc kubenswrapper[4919]: I0310 22:15:49.019866 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4865c8ed-670d-41a0-b9fc-ba7697085e6b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4865c8ed-670d-41a0-b9fc-ba7697085e6b" (UID: "4865c8ed-670d-41a0-b9fc-ba7697085e6b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:15:49 crc kubenswrapper[4919]: I0310 22:15:49.024141 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4865c8ed-670d-41a0-b9fc-ba7697085e6b-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "4865c8ed-670d-41a0-b9fc-ba7697085e6b" (UID: "4865c8ed-670d-41a0-b9fc-ba7697085e6b"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:15:49 crc kubenswrapper[4919]: I0310 22:15:49.034119 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab479995-b87a-46b8-9a4e-d9e95d556775-config-data" (OuterVolumeSpecName: "config-data") pod "ab479995-b87a-46b8-9a4e-d9e95d556775" (UID: "ab479995-b87a-46b8-9a4e-d9e95d556775"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:15:49 crc kubenswrapper[4919]: I0310 22:15:49.037569 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab479995-b87a-46b8-9a4e-d9e95d556775-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "ab479995-b87a-46b8-9a4e-d9e95d556775" (UID: "ab479995-b87a-46b8-9a4e-d9e95d556775"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:15:49 crc kubenswrapper[4919]: I0310 22:15:49.046524 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4a88061-cba8-4535-bf01-5285d8cbb79f-memcached-tls-certs\") pod \"a4a88061-cba8-4535-bf01-5285d8cbb79f\" (UID: \"a4a88061-cba8-4535-bf01-5285d8cbb79f\") " Mar 10 22:15:49 crc kubenswrapper[4919]: I0310 22:15:49.046667 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sj8lc\" (UniqueName: \"kubernetes.io/projected/a4a88061-cba8-4535-bf01-5285d8cbb79f-kube-api-access-sj8lc\") pod \"a4a88061-cba8-4535-bf01-5285d8cbb79f\" (UID: \"a4a88061-cba8-4535-bf01-5285d8cbb79f\") " Mar 10 22:15:49 crc kubenswrapper[4919]: I0310 22:15:49.046779 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4a88061-cba8-4535-bf01-5285d8cbb79f-combined-ca-bundle\") pod \"a4a88061-cba8-4535-bf01-5285d8cbb79f\" (UID: \"a4a88061-cba8-4535-bf01-5285d8cbb79f\") " Mar 10 22:15:49 crc kubenswrapper[4919]: I0310 22:15:49.046853 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a4a88061-cba8-4535-bf01-5285d8cbb79f-config-data\") pod \"a4a88061-cba8-4535-bf01-5285d8cbb79f\" (UID: \"a4a88061-cba8-4535-bf01-5285d8cbb79f\") " Mar 10 22:15:49 crc kubenswrapper[4919]: I0310 22:15:49.046877 
4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a4a88061-cba8-4535-bf01-5285d8cbb79f-kolla-config\") pod \"a4a88061-cba8-4535-bf01-5285d8cbb79f\" (UID: \"a4a88061-cba8-4535-bf01-5285d8cbb79f\") " Mar 10 22:15:49 crc kubenswrapper[4919]: I0310 22:15:49.047331 4919 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ab479995-b87a-46b8-9a4e-d9e95d556775-logs\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:49 crc kubenswrapper[4919]: I0310 22:15:49.047350 4919 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4865c8ed-670d-41a0-b9fc-ba7697085e6b-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:49 crc kubenswrapper[4919]: I0310 22:15:49.047363 4919 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/981bb03c-23be-4bf8-a9f6-cb8a552f66a5-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:49 crc kubenswrapper[4919]: I0310 22:15:49.047373 4919 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4865c8ed-670d-41a0-b9fc-ba7697085e6b-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:49 crc kubenswrapper[4919]: I0310 22:15:49.047386 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8msqp\" (UniqueName: \"kubernetes.io/projected/ab479995-b87a-46b8-9a4e-d9e95d556775-kube-api-access-8msqp\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:49 crc kubenswrapper[4919]: I0310 22:15:49.047416 4919 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ab479995-b87a-46b8-9a4e-d9e95d556775-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:49 crc kubenswrapper[4919]: I0310 22:15:49.047451 4919 reconciler_common.go:286] 
"operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Mar 10 22:15:49 crc kubenswrapper[4919]: I0310 22:15:49.047464 4919 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4865c8ed-670d-41a0-b9fc-ba7697085e6b-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:49 crc kubenswrapper[4919]: I0310 22:15:49.047476 4919 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ab479995-b87a-46b8-9a4e-d9e95d556775-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:49 crc kubenswrapper[4919]: I0310 22:15:49.047487 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5j46h\" (UniqueName: \"kubernetes.io/projected/4865c8ed-670d-41a0-b9fc-ba7697085e6b-kube-api-access-5j46h\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:49 crc kubenswrapper[4919]: I0310 22:15:49.047498 4919 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab479995-b87a-46b8-9a4e-d9e95d556775-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:49 crc kubenswrapper[4919]: I0310 22:15:49.047510 4919 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab479995-b87a-46b8-9a4e-d9e95d556775-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:49 crc kubenswrapper[4919]: I0310 22:15:49.047521 4919 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab479995-b87a-46b8-9a4e-d9e95d556775-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:49 crc kubenswrapper[4919]: I0310 22:15:49.047533 4919 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/515105ef-e538-4276-b682-7e05881dc7e8-internal-tls-certs\") on node 
\"crc\" DevicePath \"\"" Mar 10 22:15:49 crc kubenswrapper[4919]: I0310 22:15:49.047543 4919 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4865c8ed-670d-41a0-b9fc-ba7697085e6b-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:49 crc kubenswrapper[4919]: I0310 22:15:49.047556 4919 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4865c8ed-670d-41a0-b9fc-ba7697085e6b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:49 crc kubenswrapper[4919]: I0310 22:15:49.047569 4919 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4865c8ed-670d-41a0-b9fc-ba7697085e6b-logs\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:49 crc kubenswrapper[4919]: I0310 22:15:49.050488 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a4a88061-cba8-4535-bf01-5285d8cbb79f-config-data" (OuterVolumeSpecName: "config-data") pod "a4a88061-cba8-4535-bf01-5285d8cbb79f" (UID: "a4a88061-cba8-4535-bf01-5285d8cbb79f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 22:15:49 crc kubenswrapper[4919]: I0310 22:15:49.050512 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a4a88061-cba8-4535-bf01-5285d8cbb79f-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "a4a88061-cba8-4535-bf01-5285d8cbb79f" (UID: "a4a88061-cba8-4535-bf01-5285d8cbb79f"). InnerVolumeSpecName "kolla-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 22:15:49 crc kubenswrapper[4919]: I0310 22:15:49.055032 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4a88061-cba8-4535-bf01-5285d8cbb79f-kube-api-access-sj8lc" (OuterVolumeSpecName: "kube-api-access-sj8lc") pod "a4a88061-cba8-4535-bf01-5285d8cbb79f" (UID: "a4a88061-cba8-4535-bf01-5285d8cbb79f"). InnerVolumeSpecName "kube-api-access-sj8lc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 22:15:49 crc kubenswrapper[4919]: I0310 22:15:49.068243 4919 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Mar 10 22:15:49 crc kubenswrapper[4919]: I0310 22:15:49.077980 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4865c8ed-670d-41a0-b9fc-ba7697085e6b-config-data" (OuterVolumeSpecName: "config-data") pod "4865c8ed-670d-41a0-b9fc-ba7697085e6b" (UID: "4865c8ed-670d-41a0-b9fc-ba7697085e6b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:15:49 crc kubenswrapper[4919]: I0310 22:15:49.086024 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4a88061-cba8-4535-bf01-5285d8cbb79f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a4a88061-cba8-4535-bf01-5285d8cbb79f" (UID: "a4a88061-cba8-4535-bf01-5285d8cbb79f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:15:49 crc kubenswrapper[4919]: I0310 22:15:49.095935 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4a88061-cba8-4535-bf01-5285d8cbb79f-memcached-tls-certs" (OuterVolumeSpecName: "memcached-tls-certs") pod "a4a88061-cba8-4535-bf01-5285d8cbb79f" (UID: "a4a88061-cba8-4535-bf01-5285d8cbb79f"). 
InnerVolumeSpecName "memcached-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:15:49 crc kubenswrapper[4919]: I0310 22:15:49.149657 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sj8lc\" (UniqueName: \"kubernetes.io/projected/a4a88061-cba8-4535-bf01-5285d8cbb79f-kube-api-access-sj8lc\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:49 crc kubenswrapper[4919]: I0310 22:15:49.149688 4919 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4865c8ed-670d-41a0-b9fc-ba7697085e6b-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:49 crc kubenswrapper[4919]: I0310 22:15:49.149699 4919 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4a88061-cba8-4535-bf01-5285d8cbb79f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:49 crc kubenswrapper[4919]: I0310 22:15:49.149710 4919 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:49 crc kubenswrapper[4919]: I0310 22:15:49.149720 4919 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a4a88061-cba8-4535-bf01-5285d8cbb79f-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:49 crc kubenswrapper[4919]: I0310 22:15:49.149729 4919 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a4a88061-cba8-4535-bf01-5285d8cbb79f-kolla-config\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:49 crc kubenswrapper[4919]: I0310 22:15:49.149737 4919 reconciler_common.go:293] "Volume detached for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4a88061-cba8-4535-bf01-5285d8cbb79f-memcached-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:49 crc kubenswrapper[4919]: 
I0310 22:15:49.192149 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-75f54b97c6-fj5s7" event={"ID":"28d81dfb-640f-4748-ab70-e0b393e1e595","Type":"ContainerDied","Data":"31a2f38b1aa47f4dfe221b42d83a3e6d2c6d06a0c2690f09cd1a5afb08ce4465"} Mar 10 22:15:49 crc kubenswrapper[4919]: I0310 22:15:49.192192 4919 scope.go:117] "RemoveContainer" containerID="fca95d527891107e2d3047ae093c871e26b3b61e0a45e89186497f024bbb6624" Mar 10 22:15:49 crc kubenswrapper[4919]: I0310 22:15:49.192317 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-75f54b97c6-fj5s7" Mar 10 22:15:49 crc kubenswrapper[4919]: I0310 22:15:49.195508 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-fd8f54c58-gtj5m" event={"ID":"31690f34-6b68-4470-a13e-e16121ec25d2","Type":"ContainerDied","Data":"3cafad43b499322396f941900bf49a47d1ab90bc7b553d3f2f80bfff3b36dfe5"} Mar 10 22:15:49 crc kubenswrapper[4919]: I0310 22:15:49.195606 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-fd8f54c58-gtj5m" Mar 10 22:15:49 crc kubenswrapper[4919]: I0310 22:15:49.200022 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"a4a88061-cba8-4535-bf01-5285d8cbb79f","Type":"ContainerDied","Data":"f59a877853ec059b4132b0cedbe4db1e550e4486c4e86be55ab299173c3b9e43"} Mar 10 22:15:49 crc kubenswrapper[4919]: I0310 22:15:49.200118 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Mar 10 22:15:49 crc kubenswrapper[4919]: I0310 22:15:49.205367 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"4865c8ed-670d-41a0-b9fc-ba7697085e6b","Type":"ContainerDied","Data":"1d98b5abcd7706e0887327cf5b568d96c7e0525ca9b5bfec7812cdf1a518c16e"} Mar 10 22:15:49 crc kubenswrapper[4919]: I0310 22:15:49.205401 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 10 22:15:49 crc kubenswrapper[4919]: I0310 22:15:49.208509 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"91a933f1-aa44-4375-8f5c-e5f3567e6c8e","Type":"ContainerDied","Data":"952015c9db412080f67ce57047b4d32bb65eacfc5d636344933b226c8847b438"} Mar 10 22:15:49 crc kubenswrapper[4919]: I0310 22:15:49.208537 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 10 22:15:49 crc kubenswrapper[4919]: I0310 22:15:49.210426 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"515105ef-e538-4276-b682-7e05881dc7e8","Type":"ContainerDied","Data":"8ae11bf041c61f12bbef2122a68135eae2ca34470bbe4685da29db5222a51146"} Mar 10 22:15:49 crc kubenswrapper[4919]: I0310 22:15:49.210580 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 10 22:15:49 crc kubenswrapper[4919]: I0310 22:15:49.214490 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bbb7-account-create-update-pmxgp" Mar 10 22:15:49 crc kubenswrapper[4919]: I0310 22:15:49.214565 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-9e8c-account-create-update-6q4j2" Mar 10 22:15:49 crc kubenswrapper[4919]: I0310 22:15:49.216782 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 10 22:15:49 crc kubenswrapper[4919]: I0310 22:15:49.216811 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-45c6-account-create-update-zczhm" Mar 10 22:15:49 crc kubenswrapper[4919]: I0310 22:15:49.216943 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ab479995-b87a-46b8-9a4e-d9e95d556775","Type":"ContainerDied","Data":"98e94965efcd8e804b4844271d3615e1dbc917b10cf7d2bc677998fc1d6a9654"} Mar 10 22:15:49 crc kubenswrapper[4919]: I0310 22:15:49.216994 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 10 22:15:49 crc kubenswrapper[4919]: I0310 22:15:49.217037 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 10 22:15:49 crc kubenswrapper[4919]: I0310 22:15:49.217255 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-854d8d6bf4-kknjq" Mar 10 22:15:49 crc kubenswrapper[4919]: I0310 22:15:49.217628 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 10 22:15:49 crc kubenswrapper[4919]: I0310 22:15:49.311646 4919 scope.go:117] "RemoveContainer" containerID="4f419ebd70f99390116647c66037cbdcde78f060d7d9ba4c1e4bafbc7b53452c" Mar 10 22:15:49 crc kubenswrapper[4919]: E0310 22:15:49.356707 4919 configmap.go:193] Couldn't get configMap openstack/ovnnorthd-scripts: configmap "ovnnorthd-scripts" not found Mar 10 22:15:49 crc kubenswrapper[4919]: E0310 22:15:49.356788 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f4c91bdf-1945-4c8a-ab05-3b619e4bdb2b-scripts podName:f4c91bdf-1945-4c8a-ab05-3b619e4bdb2b nodeName:}" failed. No retries permitted until 2026-03-10 22:15:57.356768212 +0000 UTC m=+1544.598648860 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "scripts" (UniqueName: "kubernetes.io/configmap/f4c91bdf-1945-4c8a-ab05-3b619e4bdb2b-scripts") pod "ovn-northd-0" (UID: "f4c91bdf-1945-4c8a-ab05-3b619e4bdb2b") : configmap "ovnnorthd-scripts" not found Mar 10 22:15:49 crc kubenswrapper[4919]: E0310 22:15:49.356808 4919 configmap.go:193] Couldn't get configMap openstack/ovnnorthd-config: configmap "ovnnorthd-config" not found Mar 10 22:15:49 crc kubenswrapper[4919]: E0310 22:15:49.356846 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f4c91bdf-1945-4c8a-ab05-3b619e4bdb2b-config podName:f4c91bdf-1945-4c8a-ab05-3b619e4bdb2b nodeName:}" failed. No retries permitted until 2026-03-10 22:15:57.356833984 +0000 UTC m=+1544.598714632 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/f4c91bdf-1945-4c8a-ab05-3b619e4bdb2b-config") pod "ovn-northd-0" (UID: "f4c91bdf-1945-4c8a-ab05-3b619e4bdb2b") : configmap "ovnnorthd-config" not found Mar 10 22:15:49 crc kubenswrapper[4919]: I0310 22:15:49.416900 4919 scope.go:117] "RemoveContainer" containerID="800bb678ea7e53aea235c1816dd3b24e1d5cc3ca3910d7d45b290926f9b56fcd" Mar 10 22:15:49 crc kubenswrapper[4919]: I0310 22:15:49.457591 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-75f54b97c6-fj5s7"] Mar 10 22:15:49 crc kubenswrapper[4919]: I0310 22:15:49.460321 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-75f54b97c6-fj5s7"] Mar 10 22:15:49 crc kubenswrapper[4919]: I0310 22:15:49.490734 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28d81dfb-640f-4748-ab70-e0b393e1e595" path="/var/lib/kubelet/pods/28d81dfb-640f-4748-ab70-e0b393e1e595/volumes" Mar 10 22:15:49 crc kubenswrapper[4919]: I0310 22:15:49.512728 4919 scope.go:117] "RemoveContainer" containerID="94166ce461bf0d2113bc5e17cabdb4512063e9d512e978c63a5719892a6a8251" Mar 10 22:15:49 crc kubenswrapper[4919]: I0310 22:15:49.513182 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37ef9179-69db-49ab-a4e6-2e2b815fc260" path="/var/lib/kubelet/pods/37ef9179-69db-49ab-a4e6-2e2b815fc260/volumes" Mar 10 22:15:49 crc kubenswrapper[4919]: I0310 22:15:49.515072 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d5850b9-d946-4b1a-9171-718243c78596" path="/var/lib/kubelet/pods/9d5850b9-d946-4b1a-9171-718243c78596/volumes" Mar 10 22:15:49 crc kubenswrapper[4919]: I0310 22:15:49.515494 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cdcceccb-6413-4cf8-972e-7744b99c626e" path="/var/lib/kubelet/pods/cdcceccb-6413-4cf8-972e-7744b99c626e/volumes" Mar 10 22:15:49 crc kubenswrapper[4919]: I0310 22:15:49.515931 4919 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec49f65c-e8af-44a1-b464-af2a86b299fc" path="/var/lib/kubelet/pods/ec49f65c-e8af-44a1-b464-af2a86b299fc/volumes" Mar 10 22:15:49 crc kubenswrapper[4919]: I0310 22:15:49.525288 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 10 22:15:49 crc kubenswrapper[4919]: I0310 22:15:49.537587 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 10 22:15:49 crc kubenswrapper[4919]: I0310 22:15:49.561880 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bbb7-account-create-update-pmxgp"] Mar 10 22:15:49 crc kubenswrapper[4919]: I0310 22:15:49.576362 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bbb7-account-create-update-pmxgp"] Mar 10 22:15:49 crc kubenswrapper[4919]: I0310 22:15:49.588055 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 10 22:15:49 crc kubenswrapper[4919]: I0310 22:15:49.605885 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 10 22:15:49 crc kubenswrapper[4919]: I0310 22:15:49.608686 4919 scope.go:117] "RemoveContainer" containerID="fb28d2d8eb9c98873f08b6f1830499a051d20df33a610f4a9e6624fa224475b0" Mar 10 22:15:49 crc kubenswrapper[4919]: I0310 22:15:49.615868 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-fd8f54c58-gtj5m"] Mar 10 22:15:49 crc kubenswrapper[4919]: I0310 22:15:49.624774 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-keystone-listener-fd8f54c58-gtj5m"] Mar 10 22:15:49 crc kubenswrapper[4919]: I0310 22:15:49.635727 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/memcached-0"] Mar 10 22:15:49 crc kubenswrapper[4919]: I0310 22:15:49.643116 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/memcached-0"] Mar 10 
22:15:49 crc kubenswrapper[4919]: I0310 22:15:49.664574 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 10 22:15:49 crc kubenswrapper[4919]: I0310 22:15:49.668931 4919 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/008d3aa2-636e-48bc-a09a-00541bc3bd5e-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:49 crc kubenswrapper[4919]: I0310 22:15:49.668961 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jln88\" (UniqueName: \"kubernetes.io/projected/008d3aa2-636e-48bc-a09a-00541bc3bd5e-kube-api-access-jln88\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:49 crc kubenswrapper[4919]: I0310 22:15:49.696547 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 10 22:15:49 crc kubenswrapper[4919]: I0310 22:15:49.698488 4919 scope.go:117] "RemoveContainer" containerID="df8f79c23e11b14d9212f9cd7c7b374f297dc4c6a1b8f62a2988cc7af5ea3b27" Mar 10 22:15:49 crc kubenswrapper[4919]: I0310 22:15:49.707953 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 10 22:15:49 crc kubenswrapper[4919]: I0310 22:15:49.718646 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 10 22:15:49 crc kubenswrapper[4919]: I0310 22:15:49.728125 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 10 22:15:49 crc kubenswrapper[4919]: I0310 22:15:49.746915 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 10 22:15:49 crc kubenswrapper[4919]: I0310 22:15:49.759206 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 10 22:15:49 crc kubenswrapper[4919]: I0310 22:15:49.764458 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 10 22:15:49 crc 
kubenswrapper[4919]: I0310 22:15:49.777135 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-9e8c-account-create-update-6q4j2"] Mar 10 22:15:49 crc kubenswrapper[4919]: I0310 22:15:49.784577 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-9e8c-account-create-update-6q4j2"] Mar 10 22:15:49 crc kubenswrapper[4919]: I0310 22:15:49.791206 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 10 22:15:49 crc kubenswrapper[4919]: I0310 22:15:49.795889 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Mar 10 22:15:49 crc kubenswrapper[4919]: I0310 22:15:49.801049 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-854d8d6bf4-kknjq"] Mar 10 22:15:49 crc kubenswrapper[4919]: I0310 22:15:49.806108 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-854d8d6bf4-kknjq"] Mar 10 22:15:49 crc kubenswrapper[4919]: I0310 22:15:49.815855 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-45c6-account-create-update-zczhm"] Mar 10 22:15:49 crc kubenswrapper[4919]: I0310 22:15:49.818075 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-xlzz4" Mar 10 22:15:49 crc kubenswrapper[4919]: I0310 22:15:49.821849 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-45c6-account-create-update-zczhm"] Mar 10 22:15:49 crc kubenswrapper[4919]: I0310 22:15:49.823762 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_f4c91bdf-1945-4c8a-ab05-3b619e4bdb2b/ovn-northd/0.log" Mar 10 22:15:49 crc kubenswrapper[4919]: I0310 22:15:49.823840 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Mar 10 22:15:49 crc kubenswrapper[4919]: I0310 22:15:49.836096 4919 scope.go:117] "RemoveContainer" containerID="160abe754665fdfbb180f5bc071ae95530a567d07d30547bc0bedcf6ffce0c0d" Mar 10 22:15:49 crc kubenswrapper[4919]: I0310 22:15:49.879478 4919 scope.go:117] "RemoveContainer" containerID="79e87bdd987eb81ea9f7ad47745afc59d0dd4ce7a69aa1af13ca054411b4739c" Mar 10 22:15:49 crc kubenswrapper[4919]: I0310 22:15:49.899534 4919 scope.go:117] "RemoveContainer" containerID="ceb6023d0f542d943ccfa4398a55aeeb75cf652b0f4a2b2be0237840184075d5" Mar 10 22:15:49 crc kubenswrapper[4919]: I0310 22:15:49.924325 4919 scope.go:117] "RemoveContainer" containerID="6998100dd91ae8a0c4934a4b8c43b07c72cd35d8be7e9f5c1635f9179079c2ed" Mar 10 22:15:49 crc kubenswrapper[4919]: I0310 22:15:49.942151 4919 scope.go:117] "RemoveContainer" containerID="9f98722ad0d1eaba724d3a1905d648a1f6aa5618c0b6cbe31210b8bf1f36f489" Mar 10 22:15:49 crc kubenswrapper[4919]: I0310 22:15:49.961916 4919 scope.go:117] "RemoveContainer" containerID="22265554a653026f7008b3a597b22efe9ebe95b2013255f792f08efd3682fc62" Mar 10 22:15:49 crc kubenswrapper[4919]: I0310 22:15:49.972728 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/76a514a0-0d4c-4f6b-8ba7-cd5b4834d625-operator-scripts\") pod \"76a514a0-0d4c-4f6b-8ba7-cd5b4834d625\" (UID: \"76a514a0-0d4c-4f6b-8ba7-cd5b4834d625\") " Mar 10 22:15:49 crc kubenswrapper[4919]: I0310 22:15:49.972774 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f4c91bdf-1945-4c8a-ab05-3b619e4bdb2b-metrics-certs-tls-certs\") pod \"f4c91bdf-1945-4c8a-ab05-3b619e4bdb2b\" (UID: \"f4c91bdf-1945-4c8a-ab05-3b619e4bdb2b\") " Mar 10 22:15:49 crc kubenswrapper[4919]: I0310 22:15:49.972820 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"kube-api-access-rmhrb\" (UniqueName: \"kubernetes.io/projected/f4c91bdf-1945-4c8a-ab05-3b619e4bdb2b-kube-api-access-rmhrb\") pod \"f4c91bdf-1945-4c8a-ab05-3b619e4bdb2b\" (UID: \"f4c91bdf-1945-4c8a-ab05-3b619e4bdb2b\") " Mar 10 22:15:49 crc kubenswrapper[4919]: I0310 22:15:49.972855 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2vf95\" (UniqueName: \"kubernetes.io/projected/76a514a0-0d4c-4f6b-8ba7-cd5b4834d625-kube-api-access-2vf95\") pod \"76a514a0-0d4c-4f6b-8ba7-cd5b4834d625\" (UID: \"76a514a0-0d4c-4f6b-8ba7-cd5b4834d625\") " Mar 10 22:15:49 crc kubenswrapper[4919]: I0310 22:15:49.972884 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/f4c91bdf-1945-4c8a-ab05-3b619e4bdb2b-ovn-rundir\") pod \"f4c91bdf-1945-4c8a-ab05-3b619e4bdb2b\" (UID: \"f4c91bdf-1945-4c8a-ab05-3b619e4bdb2b\") " Mar 10 22:15:49 crc kubenswrapper[4919]: I0310 22:15:49.972964 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f4c91bdf-1945-4c8a-ab05-3b619e4bdb2b-scripts\") pod \"f4c91bdf-1945-4c8a-ab05-3b619e4bdb2b\" (UID: \"f4c91bdf-1945-4c8a-ab05-3b619e4bdb2b\") " Mar 10 22:15:49 crc kubenswrapper[4919]: I0310 22:15:49.972992 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4c91bdf-1945-4c8a-ab05-3b619e4bdb2b-combined-ca-bundle\") pod \"f4c91bdf-1945-4c8a-ab05-3b619e4bdb2b\" (UID: \"f4c91bdf-1945-4c8a-ab05-3b619e4bdb2b\") " Mar 10 22:15:49 crc kubenswrapper[4919]: I0310 22:15:49.973032 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f4c91bdf-1945-4c8a-ab05-3b619e4bdb2b-config\") pod \"f4c91bdf-1945-4c8a-ab05-3b619e4bdb2b\" (UID: \"f4c91bdf-1945-4c8a-ab05-3b619e4bdb2b\") " Mar 10 22:15:49 crc 
kubenswrapper[4919]: I0310 22:15:49.973052 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/f4c91bdf-1945-4c8a-ab05-3b619e4bdb2b-ovn-northd-tls-certs\") pod \"f4c91bdf-1945-4c8a-ab05-3b619e4bdb2b\" (UID: \"f4c91bdf-1945-4c8a-ab05-3b619e4bdb2b\") " Mar 10 22:15:49 crc kubenswrapper[4919]: I0310 22:15:49.973236 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/76a514a0-0d4c-4f6b-8ba7-cd5b4834d625-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "76a514a0-0d4c-4f6b-8ba7-cd5b4834d625" (UID: "76a514a0-0d4c-4f6b-8ba7-cd5b4834d625"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 22:15:49 crc kubenswrapper[4919]: I0310 22:15:49.973659 4919 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/76a514a0-0d4c-4f6b-8ba7-cd5b4834d625-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:49 crc kubenswrapper[4919]: I0310 22:15:49.974110 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f4c91bdf-1945-4c8a-ab05-3b619e4bdb2b-scripts" (OuterVolumeSpecName: "scripts") pod "f4c91bdf-1945-4c8a-ab05-3b619e4bdb2b" (UID: "f4c91bdf-1945-4c8a-ab05-3b619e4bdb2b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 22:15:49 crc kubenswrapper[4919]: I0310 22:15:49.975664 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f4c91bdf-1945-4c8a-ab05-3b619e4bdb2b-config" (OuterVolumeSpecName: "config") pod "f4c91bdf-1945-4c8a-ab05-3b619e4bdb2b" (UID: "f4c91bdf-1945-4c8a-ab05-3b619e4bdb2b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 22:15:49 crc kubenswrapper[4919]: I0310 22:15:49.975720 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f4c91bdf-1945-4c8a-ab05-3b619e4bdb2b-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "f4c91bdf-1945-4c8a-ab05-3b619e4bdb2b" (UID: "f4c91bdf-1945-4c8a-ab05-3b619e4bdb2b"). InnerVolumeSpecName "ovn-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 22:15:49 crc kubenswrapper[4919]: I0310 22:15:49.993667 4919 scope.go:117] "RemoveContainer" containerID="0fa75078e11e8939f0a39526b2508ccb8e7f4c3ea23641588f1b3b509c8c8e82" Mar 10 22:15:49 crc kubenswrapper[4919]: I0310 22:15:49.993747 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4c91bdf-1945-4c8a-ab05-3b619e4bdb2b-kube-api-access-rmhrb" (OuterVolumeSpecName: "kube-api-access-rmhrb") pod "f4c91bdf-1945-4c8a-ab05-3b619e4bdb2b" (UID: "f4c91bdf-1945-4c8a-ab05-3b619e4bdb2b"). InnerVolumeSpecName "kube-api-access-rmhrb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 22:15:49 crc kubenswrapper[4919]: I0310 22:15:49.993885 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76a514a0-0d4c-4f6b-8ba7-cd5b4834d625-kube-api-access-2vf95" (OuterVolumeSpecName: "kube-api-access-2vf95") pod "76a514a0-0d4c-4f6b-8ba7-cd5b4834d625" (UID: "76a514a0-0d4c-4f6b-8ba7-cd5b4834d625"). InnerVolumeSpecName "kube-api-access-2vf95". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 22:15:50 crc kubenswrapper[4919]: I0310 22:15:50.013526 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4c91bdf-1945-4c8a-ab05-3b619e4bdb2b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f4c91bdf-1945-4c8a-ab05-3b619e4bdb2b" (UID: "f4c91bdf-1945-4c8a-ab05-3b619e4bdb2b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:15:50 crc kubenswrapper[4919]: I0310 22:15:50.053343 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4c91bdf-1945-4c8a-ab05-3b619e4bdb2b-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "f4c91bdf-1945-4c8a-ab05-3b619e4bdb2b" (UID: "f4c91bdf-1945-4c8a-ab05-3b619e4bdb2b"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:15:50 crc kubenswrapper[4919]: I0310 22:15:50.055623 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4c91bdf-1945-4c8a-ab05-3b619e4bdb2b-ovn-northd-tls-certs" (OuterVolumeSpecName: "ovn-northd-tls-certs") pod "f4c91bdf-1945-4c8a-ab05-3b619e4bdb2b" (UID: "f4c91bdf-1945-4c8a-ab05-3b619e4bdb2b"). InnerVolumeSpecName "ovn-northd-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:15:50 crc kubenswrapper[4919]: I0310 22:15:50.076437 4919 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f4c91bdf-1945-4c8a-ab05-3b619e4bdb2b-config\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:50 crc kubenswrapper[4919]: I0310 22:15:50.076468 4919 reconciler_common.go:293] "Volume detached for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/f4c91bdf-1945-4c8a-ab05-3b619e4bdb2b-ovn-northd-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:50 crc kubenswrapper[4919]: I0310 22:15:50.076481 4919 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f4c91bdf-1945-4c8a-ab05-3b619e4bdb2b-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:50 crc kubenswrapper[4919]: I0310 22:15:50.076493 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rmhrb\" (UniqueName: 
\"kubernetes.io/projected/f4c91bdf-1945-4c8a-ab05-3b619e4bdb2b-kube-api-access-rmhrb\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:50 crc kubenswrapper[4919]: I0310 22:15:50.076504 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2vf95\" (UniqueName: \"kubernetes.io/projected/76a514a0-0d4c-4f6b-8ba7-cd5b4834d625-kube-api-access-2vf95\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:50 crc kubenswrapper[4919]: I0310 22:15:50.076517 4919 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/f4c91bdf-1945-4c8a-ab05-3b619e4bdb2b-ovn-rundir\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:50 crc kubenswrapper[4919]: I0310 22:15:50.076528 4919 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f4c91bdf-1945-4c8a-ab05-3b619e4bdb2b-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:50 crc kubenswrapper[4919]: I0310 22:15:50.076540 4919 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4c91bdf-1945-4c8a-ab05-3b619e4bdb2b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:50 crc kubenswrapper[4919]: E0310 22:15:50.183704 4919 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ba1ede56006ea1128e8e67460a4bb03bb7a7ac205f92a9ada4f61f419402b0a6" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 10 22:15:50 crc kubenswrapper[4919]: E0310 22:15:50.185105 4919 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ba1ede56006ea1128e8e67460a4bb03bb7a7ac205f92a9ada4f61f419402b0a6" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 10 22:15:50 crc 
kubenswrapper[4919]: E0310 22:15:50.186578 4919 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ba1ede56006ea1128e8e67460a4bb03bb7a7ac205f92a9ada4f61f419402b0a6" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 10 22:15:50 crc kubenswrapper[4919]: E0310 22:15:50.186659 4919 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell1-conductor-0" podUID="62814b8d-8679-4350-be7d-5f729f901846" containerName="nova-cell1-conductor-conductor" Mar 10 22:15:50 crc kubenswrapper[4919]: I0310 22:15:50.228593 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-xlzz4" event={"ID":"76a514a0-0d4c-4f6b-8ba7-cd5b4834d625","Type":"ContainerDied","Data":"2d86ae89ca9eacf088e2483fdc8b7fc27d54e8d4395301cbc4733433a8296bf1"} Mar 10 22:15:50 crc kubenswrapper[4919]: I0310 22:15:50.228635 4919 scope.go:117] "RemoveContainer" containerID="3f184ab546adc93a6838bca186ee1a27fb82249fb7be1e6ac56b84aa0dcb13c5" Mar 10 22:15:50 crc kubenswrapper[4919]: I0310 22:15:50.228696 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-xlzz4" Mar 10 22:15:50 crc kubenswrapper[4919]: I0310 22:15:50.235664 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_f4c91bdf-1945-4c8a-ab05-3b619e4bdb2b/ovn-northd/0.log" Mar 10 22:15:50 crc kubenswrapper[4919]: I0310 22:15:50.235816 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"f4c91bdf-1945-4c8a-ab05-3b619e4bdb2b","Type":"ContainerDied","Data":"b0faad1e090c5549c73d125d875997379a03b88c3b976099f2f3aa7ea1f1ca9a"} Mar 10 22:15:50 crc kubenswrapper[4919]: I0310 22:15:50.235825 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Mar 10 22:15:50 crc kubenswrapper[4919]: I0310 22:15:50.235781 4919 generic.go:334] "Generic (PLEG): container finished" podID="f4c91bdf-1945-4c8a-ab05-3b619e4bdb2b" containerID="b0faad1e090c5549c73d125d875997379a03b88c3b976099f2f3aa7ea1f1ca9a" exitCode=139 Mar 10 22:15:50 crc kubenswrapper[4919]: I0310 22:15:50.244118 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"f4c91bdf-1945-4c8a-ab05-3b619e4bdb2b","Type":"ContainerDied","Data":"7eb0e5753f33d6c038d98c271a29b6f581084fd55a480e34dff0747a3ff91a53"} Mar 10 22:15:50 crc kubenswrapper[4919]: I0310 22:15:50.256626 4919 scope.go:117] "RemoveContainer" containerID="95aec50666a5cbb9eb2fc08bcc44915e765c29007dcbf5a2bca002bcee7be03b" Mar 10 22:15:50 crc kubenswrapper[4919]: I0310 22:15:50.290048 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-northd-0"] Mar 10 22:15:50 crc kubenswrapper[4919]: I0310 22:15:50.293295 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-northd-0"] Mar 10 22:15:50 crc kubenswrapper[4919]: I0310 22:15:50.303187 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-xlzz4"] Mar 10 22:15:50 crc kubenswrapper[4919]: I0310 
22:15:50.305319 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-xlzz4"] Mar 10 22:15:50 crc kubenswrapper[4919]: I0310 22:15:50.305652 4919 scope.go:117] "RemoveContainer" containerID="b0faad1e090c5549c73d125d875997379a03b88c3b976099f2f3aa7ea1f1ca9a" Mar 10 22:15:50 crc kubenswrapper[4919]: E0310 22:15:50.360921 4919 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 85e85aa8a7e78a2f2c6fc7044ebf9c1dcb554abd0c952c366102b9a1f2fa0880 is running failed: container process not found" containerID="85e85aa8a7e78a2f2c6fc7044ebf9c1dcb554abd0c952c366102b9a1f2fa0880" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 10 22:15:50 crc kubenswrapper[4919]: E0310 22:15:50.366659 4919 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e7e7fcd09cc0c969ac9f3c21aebb85f2b23a2c01eb8ef776f788577ffa3c96d5" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 10 22:15:50 crc kubenswrapper[4919]: E0310 22:15:50.370153 4919 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 85e85aa8a7e78a2f2c6fc7044ebf9c1dcb554abd0c952c366102b9a1f2fa0880 is running failed: container process not found" containerID="85e85aa8a7e78a2f2c6fc7044ebf9c1dcb554abd0c952c366102b9a1f2fa0880" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 10 22:15:50 crc kubenswrapper[4919]: E0310 22:15:50.373788 4919 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e7e7fcd09cc0c969ac9f3c21aebb85f2b23a2c01eb8ef776f788577ffa3c96d5" 
cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 10 22:15:50 crc kubenswrapper[4919]: E0310 22:15:50.376370 4919 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 85e85aa8a7e78a2f2c6fc7044ebf9c1dcb554abd0c952c366102b9a1f2fa0880 is running failed: container process not found" containerID="85e85aa8a7e78a2f2c6fc7044ebf9c1dcb554abd0c952c366102b9a1f2fa0880" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 10 22:15:50 crc kubenswrapper[4919]: E0310 22:15:50.376414 4919 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 85e85aa8a7e78a2f2c6fc7044ebf9c1dcb554abd0c952c366102b9a1f2fa0880 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-5wz82" podUID="a525725f-407a-4e99-96a1-a0eaba714487" containerName="ovsdb-server" Mar 10 22:15:50 crc kubenswrapper[4919]: E0310 22:15:50.377629 4919 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e7e7fcd09cc0c969ac9f3c21aebb85f2b23a2c01eb8ef776f788577ffa3c96d5" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 10 22:15:50 crc kubenswrapper[4919]: E0310 22:15:50.377659 4919 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-5wz82" podUID="a525725f-407a-4e99-96a1-a0eaba714487" containerName="ovs-vswitchd" Mar 10 22:15:50 crc kubenswrapper[4919]: I0310 22:15:50.387205 4919 scope.go:117] "RemoveContainer" containerID="95aec50666a5cbb9eb2fc08bcc44915e765c29007dcbf5a2bca002bcee7be03b" Mar 10 22:15:50 crc kubenswrapper[4919]: E0310 
22:15:50.387629 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"95aec50666a5cbb9eb2fc08bcc44915e765c29007dcbf5a2bca002bcee7be03b\": container with ID starting with 95aec50666a5cbb9eb2fc08bcc44915e765c29007dcbf5a2bca002bcee7be03b not found: ID does not exist" containerID="95aec50666a5cbb9eb2fc08bcc44915e765c29007dcbf5a2bca002bcee7be03b" Mar 10 22:15:50 crc kubenswrapper[4919]: I0310 22:15:50.387658 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95aec50666a5cbb9eb2fc08bcc44915e765c29007dcbf5a2bca002bcee7be03b"} err="failed to get container status \"95aec50666a5cbb9eb2fc08bcc44915e765c29007dcbf5a2bca002bcee7be03b\": rpc error: code = NotFound desc = could not find container \"95aec50666a5cbb9eb2fc08bcc44915e765c29007dcbf5a2bca002bcee7be03b\": container with ID starting with 95aec50666a5cbb9eb2fc08bcc44915e765c29007dcbf5a2bca002bcee7be03b not found: ID does not exist" Mar 10 22:15:50 crc kubenswrapper[4919]: I0310 22:15:50.387678 4919 scope.go:117] "RemoveContainer" containerID="b0faad1e090c5549c73d125d875997379a03b88c3b976099f2f3aa7ea1f1ca9a" Mar 10 22:15:50 crc kubenswrapper[4919]: E0310 22:15:50.387867 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b0faad1e090c5549c73d125d875997379a03b88c3b976099f2f3aa7ea1f1ca9a\": container with ID starting with b0faad1e090c5549c73d125d875997379a03b88c3b976099f2f3aa7ea1f1ca9a not found: ID does not exist" containerID="b0faad1e090c5549c73d125d875997379a03b88c3b976099f2f3aa7ea1f1ca9a" Mar 10 22:15:50 crc kubenswrapper[4919]: I0310 22:15:50.387883 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0faad1e090c5549c73d125d875997379a03b88c3b976099f2f3aa7ea1f1ca9a"} err="failed to get container status \"b0faad1e090c5549c73d125d875997379a03b88c3b976099f2f3aa7ea1f1ca9a\": rpc 
error: code = NotFound desc = could not find container \"b0faad1e090c5549c73d125d875997379a03b88c3b976099f2f3aa7ea1f1ca9a\": container with ID starting with b0faad1e090c5549c73d125d875997379a03b88c3b976099f2f3aa7ea1f1ca9a not found: ID does not exist" Mar 10 22:15:50 crc kubenswrapper[4919]: E0310 22:15:50.484516 4919 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Mar 10 22:15:50 crc kubenswrapper[4919]: E0310 22:15:50.484590 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3fe05756-9202-4514-8eea-0c786a2b6d56-config-data podName:3fe05756-9202-4514-8eea-0c786a2b6d56 nodeName:}" failed. No retries permitted until 2026-03-10 22:15:58.484570388 +0000 UTC m=+1545.726450996 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/3fe05756-9202-4514-8eea-0c786a2b6d56-config-data") pod "rabbitmq-cell1-server-0" (UID: "3fe05756-9202-4514-8eea-0c786a2b6d56") : configmap "rabbitmq-cell1-config-data" not found Mar 10 22:15:50 crc kubenswrapper[4919]: I0310 22:15:50.665092 4919 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="3fe05756-9202-4514-8eea-0c786a2b6d56" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.104:5671: connect: connection refused" Mar 10 22:15:50 crc kubenswrapper[4919]: I0310 22:15:50.788640 4919 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-fbfnm" podUID="783e3f3a-7a6f-4b95-a7d2-6988c8a6149b" containerName="ovn-controller" probeResult="failure" output="command timed out" Mar 10 22:15:50 crc kubenswrapper[4919]: I0310 22:15:50.844734 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 10 22:15:50 crc kubenswrapper[4919]: I0310 22:15:50.854283 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Mar 10 22:15:50 crc kubenswrapper[4919]: I0310 22:15:50.855605 4919 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-fbfnm" podUID="783e3f3a-7a6f-4b95-a7d2-6988c8a6149b" containerName="ovn-controller" probeResult="failure" output=< Mar 10 22:15:50 crc kubenswrapper[4919]: ERROR - Failed to get connection status from ovn-controller, ovn-appctl exit status: 0 Mar 10 22:15:50 crc kubenswrapper[4919]: > Mar 10 22:15:50 crc kubenswrapper[4919]: I0310 22:15:50.904055 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-599f4d795-pgnpd" Mar 10 22:15:50 crc kubenswrapper[4919]: I0310 22:15:50.951030 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 10 22:15:50 crc kubenswrapper[4919]: I0310 22:15:50.994804 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g4p55\" (UniqueName: \"kubernetes.io/projected/408722a8-2c8a-4bda-82d5-1d2f58bda7d7-kube-api-access-g4p55\") pod \"408722a8-2c8a-4bda-82d5-1d2f58bda7d7\" (UID: \"408722a8-2c8a-4bda-82d5-1d2f58bda7d7\") " Mar 10 22:15:50 crc kubenswrapper[4919]: I0310 22:15:50.994867 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/fa3e6892-7a97-4563-b339-6c3acfd36dd3-rabbitmq-confd\") pod \"fa3e6892-7a97-4563-b339-6c3acfd36dd3\" (UID: \"fa3e6892-7a97-4563-b339-6c3acfd36dd3\") " Mar 10 22:15:50 crc kubenswrapper[4919]: I0310 22:15:50.994901 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/408722a8-2c8a-4bda-82d5-1d2f58bda7d7-fernet-keys\") pod \"408722a8-2c8a-4bda-82d5-1d2f58bda7d7\" (UID: \"408722a8-2c8a-4bda-82d5-1d2f58bda7d7\") " Mar 10 22:15:50 crc kubenswrapper[4919]: I0310 22:15:50.994920 
4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/9372011b-416f-484d-a873-fdda67baf9fe-galera-tls-certs\") pod \"9372011b-416f-484d-a873-fdda67baf9fe\" (UID: \"9372011b-416f-484d-a873-fdda67baf9fe\") " Mar 10 22:15:50 crc kubenswrapper[4919]: I0310 22:15:50.994986 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/9372011b-416f-484d-a873-fdda67baf9fe-config-data-generated\") pod \"9372011b-416f-484d-a873-fdda67baf9fe\" (UID: \"9372011b-416f-484d-a873-fdda67baf9fe\") " Mar 10 22:15:50 crc kubenswrapper[4919]: I0310 22:15:50.995014 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/408722a8-2c8a-4bda-82d5-1d2f58bda7d7-internal-tls-certs\") pod \"408722a8-2c8a-4bda-82d5-1d2f58bda7d7\" (UID: \"408722a8-2c8a-4bda-82d5-1d2f58bda7d7\") " Mar 10 22:15:50 crc kubenswrapper[4919]: I0310 22:15:50.995036 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/408722a8-2c8a-4bda-82d5-1d2f58bda7d7-combined-ca-bundle\") pod \"408722a8-2c8a-4bda-82d5-1d2f58bda7d7\" (UID: \"408722a8-2c8a-4bda-82d5-1d2f58bda7d7\") " Mar 10 22:15:50 crc kubenswrapper[4919]: I0310 22:15:50.995057 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/fa3e6892-7a97-4563-b339-6c3acfd36dd3-rabbitmq-tls\") pod \"fa3e6892-7a97-4563-b339-6c3acfd36dd3\" (UID: \"fa3e6892-7a97-4563-b339-6c3acfd36dd3\") " Mar 10 22:15:50 crc kubenswrapper[4919]: I0310 22:15:50.995097 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/fa3e6892-7a97-4563-b339-6c3acfd36dd3-pod-info\") pod 
\"fa3e6892-7a97-4563-b339-6c3acfd36dd3\" (UID: \"fa3e6892-7a97-4563-b339-6c3acfd36dd3\") " Mar 10 22:15:50 crc kubenswrapper[4919]: I0310 22:15:50.995121 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f6qtw\" (UniqueName: \"kubernetes.io/projected/9372011b-416f-484d-a873-fdda67baf9fe-kube-api-access-f6qtw\") pod \"9372011b-416f-484d-a873-fdda67baf9fe\" (UID: \"9372011b-416f-484d-a873-fdda67baf9fe\") " Mar 10 22:15:50 crc kubenswrapper[4919]: I0310 22:15:50.995152 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/fa3e6892-7a97-4563-b339-6c3acfd36dd3-rabbitmq-erlang-cookie\") pod \"fa3e6892-7a97-4563-b339-6c3acfd36dd3\" (UID: \"fa3e6892-7a97-4563-b339-6c3acfd36dd3\") " Mar 10 22:15:50 crc kubenswrapper[4919]: I0310 22:15:50.995187 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fa3e6892-7a97-4563-b339-6c3acfd36dd3-config-data\") pod \"fa3e6892-7a97-4563-b339-6c3acfd36dd3\" (UID: \"fa3e6892-7a97-4563-b339-6c3acfd36dd3\") " Mar 10 22:15:50 crc kubenswrapper[4919]: I0310 22:15:50.995214 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/9372011b-416f-484d-a873-fdda67baf9fe-kolla-config\") pod \"9372011b-416f-484d-a873-fdda67baf9fe\" (UID: \"9372011b-416f-484d-a873-fdda67baf9fe\") " Mar 10 22:15:50 crc kubenswrapper[4919]: I0310 22:15:50.995238 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9372011b-416f-484d-a873-fdda67baf9fe-combined-ca-bundle\") pod \"9372011b-416f-484d-a873-fdda67baf9fe\" (UID: \"9372011b-416f-484d-a873-fdda67baf9fe\") " Mar 10 22:15:50 crc kubenswrapper[4919]: I0310 22:15:50.995275 4919 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/9372011b-416f-484d-a873-fdda67baf9fe-config-data-default\") pod \"9372011b-416f-484d-a873-fdda67baf9fe\" (UID: \"9372011b-416f-484d-a873-fdda67baf9fe\") " Mar 10 22:15:50 crc kubenswrapper[4919]: I0310 22:15:50.995306 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9372011b-416f-484d-a873-fdda67baf9fe-operator-scripts\") pod \"9372011b-416f-484d-a873-fdda67baf9fe\" (UID: \"9372011b-416f-484d-a873-fdda67baf9fe\") " Mar 10 22:15:50 crc kubenswrapper[4919]: I0310 22:15:50.995347 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/408722a8-2c8a-4bda-82d5-1d2f58bda7d7-credential-keys\") pod \"408722a8-2c8a-4bda-82d5-1d2f58bda7d7\" (UID: \"408722a8-2c8a-4bda-82d5-1d2f58bda7d7\") " Mar 10 22:15:50 crc kubenswrapper[4919]: I0310 22:15:50.995379 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/408722a8-2c8a-4bda-82d5-1d2f58bda7d7-public-tls-certs\") pod \"408722a8-2c8a-4bda-82d5-1d2f58bda7d7\" (UID: \"408722a8-2c8a-4bda-82d5-1d2f58bda7d7\") " Mar 10 22:15:50 crc kubenswrapper[4919]: I0310 22:15:50.996189 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/fa3e6892-7a97-4563-b339-6c3acfd36dd3-server-conf\") pod \"fa3e6892-7a97-4563-b339-6c3acfd36dd3\" (UID: \"fa3e6892-7a97-4563-b339-6c3acfd36dd3\") " Mar 10 22:15:50 crc kubenswrapper[4919]: I0310 22:15:50.996226 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/fa3e6892-7a97-4563-b339-6c3acfd36dd3-plugins-conf\") pod \"fa3e6892-7a97-4563-b339-6c3acfd36dd3\" (UID: 
\"fa3e6892-7a97-4563-b339-6c3acfd36dd3\") " Mar 10 22:15:50 crc kubenswrapper[4919]: I0310 22:15:50.996263 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/408722a8-2c8a-4bda-82d5-1d2f58bda7d7-config-data\") pod \"408722a8-2c8a-4bda-82d5-1d2f58bda7d7\" (UID: \"408722a8-2c8a-4bda-82d5-1d2f58bda7d7\") " Mar 10 22:15:50 crc kubenswrapper[4919]: I0310 22:15:50.996303 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/fa3e6892-7a97-4563-b339-6c3acfd36dd3-rabbitmq-plugins\") pod \"fa3e6892-7a97-4563-b339-6c3acfd36dd3\" (UID: \"fa3e6892-7a97-4563-b339-6c3acfd36dd3\") " Mar 10 22:15:50 crc kubenswrapper[4919]: I0310 22:15:50.996330 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/fa3e6892-7a97-4563-b339-6c3acfd36dd3-erlang-cookie-secret\") pod \"fa3e6892-7a97-4563-b339-6c3acfd36dd3\" (UID: \"fa3e6892-7a97-4563-b339-6c3acfd36dd3\") " Mar 10 22:15:50 crc kubenswrapper[4919]: I0310 22:15:50.996353 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/408722a8-2c8a-4bda-82d5-1d2f58bda7d7-scripts\") pod \"408722a8-2c8a-4bda-82d5-1d2f58bda7d7\" (UID: \"408722a8-2c8a-4bda-82d5-1d2f58bda7d7\") " Mar 10 22:15:50 crc kubenswrapper[4919]: I0310 22:15:50.996374 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"fa3e6892-7a97-4563-b339-6c3acfd36dd3\" (UID: \"fa3e6892-7a97-4563-b339-6c3acfd36dd3\") " Mar 10 22:15:50 crc kubenswrapper[4919]: I0310 22:15:50.996423 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r9rjm\" (UniqueName: 
\"kubernetes.io/projected/fa3e6892-7a97-4563-b339-6c3acfd36dd3-kube-api-access-r9rjm\") pod \"fa3e6892-7a97-4563-b339-6c3acfd36dd3\" (UID: \"fa3e6892-7a97-4563-b339-6c3acfd36dd3\") " Mar 10 22:15:50 crc kubenswrapper[4919]: I0310 22:15:50.996443 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"9372011b-416f-484d-a873-fdda67baf9fe\" (UID: \"9372011b-416f-484d-a873-fdda67baf9fe\") " Mar 10 22:15:50 crc kubenswrapper[4919]: I0310 22:15:50.998330 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fa3e6892-7a97-4563-b339-6c3acfd36dd3-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "fa3e6892-7a97-4563-b339-6c3acfd36dd3" (UID: "fa3e6892-7a97-4563-b339-6c3acfd36dd3"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.001256 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/fa3e6892-7a97-4563-b339-6c3acfd36dd3-pod-info" (OuterVolumeSpecName: "pod-info") pod "fa3e6892-7a97-4563-b339-6c3acfd36dd3" (UID: "fa3e6892-7a97-4563-b339-6c3acfd36dd3"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.001667 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9372011b-416f-484d-a873-fdda67baf9fe-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "9372011b-416f-484d-a873-fdda67baf9fe" (UID: "9372011b-416f-484d-a873-fdda67baf9fe"). InnerVolumeSpecName "config-data-generated". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.002105 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa3e6892-7a97-4563-b339-6c3acfd36dd3-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "fa3e6892-7a97-4563-b339-6c3acfd36dd3" (UID: "fa3e6892-7a97-4563-b339-6c3acfd36dd3"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.003320 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9372011b-416f-484d-a873-fdda67baf9fe-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "9372011b-416f-484d-a873-fdda67baf9fe" (UID: "9372011b-416f-484d-a873-fdda67baf9fe"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.004528 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/408722a8-2c8a-4bda-82d5-1d2f58bda7d7-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "408722a8-2c8a-4bda-82d5-1d2f58bda7d7" (UID: "408722a8-2c8a-4bda-82d5-1d2f58bda7d7"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.010452 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9372011b-416f-484d-a873-fdda67baf9fe-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "9372011b-416f-484d-a873-fdda67baf9fe" (UID: "9372011b-416f-484d-a873-fdda67baf9fe"). InnerVolumeSpecName "config-data-default". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.010984 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9372011b-416f-484d-a873-fdda67baf9fe-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9372011b-416f-484d-a873-fdda67baf9fe" (UID: "9372011b-416f-484d-a873-fdda67baf9fe"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.011321 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/408722a8-2c8a-4bda-82d5-1d2f58bda7d7-scripts" (OuterVolumeSpecName: "scripts") pod "408722a8-2c8a-4bda-82d5-1d2f58bda7d7" (UID: "408722a8-2c8a-4bda-82d5-1d2f58bda7d7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.011614 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fa3e6892-7a97-4563-b339-6c3acfd36dd3-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "fa3e6892-7a97-4563-b339-6c3acfd36dd3" (UID: "fa3e6892-7a97-4563-b339-6c3acfd36dd3"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.012736 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/408722a8-2c8a-4bda-82d5-1d2f58bda7d7-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "408722a8-2c8a-4bda-82d5-1d2f58bda7d7" (UID: "408722a8-2c8a-4bda-82d5-1d2f58bda7d7"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.012750 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "persistence") pod "fa3e6892-7a97-4563-b339-6c3acfd36dd3" (UID: "fa3e6892-7a97-4563-b339-6c3acfd36dd3"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.013861 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "mysql-db") pod "9372011b-416f-484d-a873-fdda67baf9fe" (UID: "9372011b-416f-484d-a873-fdda67baf9fe"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.014236 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa3e6892-7a97-4563-b339-6c3acfd36dd3-kube-api-access-r9rjm" (OuterVolumeSpecName: "kube-api-access-r9rjm") pod "fa3e6892-7a97-4563-b339-6c3acfd36dd3" (UID: "fa3e6892-7a97-4563-b339-6c3acfd36dd3"). InnerVolumeSpecName "kube-api-access-r9rjm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.014645 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa3e6892-7a97-4563-b339-6c3acfd36dd3-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "fa3e6892-7a97-4563-b339-6c3acfd36dd3" (UID: "fa3e6892-7a97-4563-b339-6c3acfd36dd3"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.019104 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa3e6892-7a97-4563-b339-6c3acfd36dd3-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "fa3e6892-7a97-4563-b339-6c3acfd36dd3" (UID: "fa3e6892-7a97-4563-b339-6c3acfd36dd3"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.019500 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/408722a8-2c8a-4bda-82d5-1d2f58bda7d7-kube-api-access-g4p55" (OuterVolumeSpecName: "kube-api-access-g4p55") pod "408722a8-2c8a-4bda-82d5-1d2f58bda7d7" (UID: "408722a8-2c8a-4bda-82d5-1d2f58bda7d7"). InnerVolumeSpecName "kube-api-access-g4p55". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.019970 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9372011b-416f-484d-a873-fdda67baf9fe-kube-api-access-f6qtw" (OuterVolumeSpecName: "kube-api-access-f6qtw") pod "9372011b-416f-484d-a873-fdda67baf9fe" (UID: "9372011b-416f-484d-a873-fdda67baf9fe"). InnerVolumeSpecName "kube-api-access-f6qtw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.043895 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/408722a8-2c8a-4bda-82d5-1d2f58bda7d7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "408722a8-2c8a-4bda-82d5-1d2f58bda7d7" (UID: "408722a8-2c8a-4bda-82d5-1d2f58bda7d7"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.049303 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa3e6892-7a97-4563-b339-6c3acfd36dd3-config-data" (OuterVolumeSpecName: "config-data") pod "fa3e6892-7a97-4563-b339-6c3acfd36dd3" (UID: "fa3e6892-7a97-4563-b339-6c3acfd36dd3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.050008 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa3e6892-7a97-4563-b339-6c3acfd36dd3-server-conf" (OuterVolumeSpecName: "server-conf") pod "fa3e6892-7a97-4563-b339-6c3acfd36dd3" (UID: "fa3e6892-7a97-4563-b339-6c3acfd36dd3"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.051617 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/408722a8-2c8a-4bda-82d5-1d2f58bda7d7-config-data" (OuterVolumeSpecName: "config-data") pod "408722a8-2c8a-4bda-82d5-1d2f58bda7d7" (UID: "408722a8-2c8a-4bda-82d5-1d2f58bda7d7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.052258 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9372011b-416f-484d-a873-fdda67baf9fe-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9372011b-416f-484d-a873-fdda67baf9fe" (UID: "9372011b-416f-484d-a873-fdda67baf9fe"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.063729 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9372011b-416f-484d-a873-fdda67baf9fe-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "9372011b-416f-484d-a873-fdda67baf9fe" (UID: "9372011b-416f-484d-a873-fdda67baf9fe"). InnerVolumeSpecName "galera-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.068842 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/408722a8-2c8a-4bda-82d5-1d2f58bda7d7-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "408722a8-2c8a-4bda-82d5-1d2f58bda7d7" (UID: "408722a8-2c8a-4bda-82d5-1d2f58bda7d7"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.076833 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/408722a8-2c8a-4bda-82d5-1d2f58bda7d7-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "408722a8-2c8a-4bda-82d5-1d2f58bda7d7" (UID: "408722a8-2c8a-4bda-82d5-1d2f58bda7d7"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:15:51 crc kubenswrapper[4919]: E0310 22:15:51.082959 4919 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err=< Mar 10 22:15:51 crc kubenswrapper[4919]: command '/usr/share/ovn/scripts/ovn-ctl stop_controller' exited with 137: 2026-03-10T22:15:43Z|00001|fatal_signal|WARN|terminating with signal 14 (Alarm clock) Mar 10 22:15:51 crc kubenswrapper[4919]: /etc/init.d/functions: line 589: 400 Alarm clock "$@" Mar 10 22:15:51 crc kubenswrapper[4919]: > execCommand=["/usr/share/ovn/scripts/ovn-ctl","stop_controller"] containerName="ovn-controller" pod="openstack/ovn-controller-fbfnm" message=< Mar 10 22:15:51 crc kubenswrapper[4919]: Exiting ovn-controller (1) [FAILED] Mar 10 22:15:51 crc kubenswrapper[4919]: Killing ovn-controller (1) [ OK ] Mar 10 22:15:51 crc kubenswrapper[4919]: Killing ovn-controller (1) with SIGKILL [ OK ] Mar 10 22:15:51 crc kubenswrapper[4919]: 2026-03-10T22:15:43Z|00001|fatal_signal|WARN|terminating with signal 14 (Alarm clock) Mar 10 22:15:51 crc kubenswrapper[4919]: /etc/init.d/functions: line 589: 400 Alarm clock "$@" Mar 10 22:15:51 crc kubenswrapper[4919]: > Mar 10 22:15:51 crc kubenswrapper[4919]: E0310 22:15:51.082995 4919 kuberuntime_container.go:691] "PreStop hook failed" err=< Mar 10 22:15:51 crc kubenswrapper[4919]: command '/usr/share/ovn/scripts/ovn-ctl stop_controller' exited with 137: 2026-03-10T22:15:43Z|00001|fatal_signal|WARN|terminating with signal 14 (Alarm clock) Mar 10 22:15:51 crc kubenswrapper[4919]: /etc/init.d/functions: line 589: 400 Alarm clock "$@" Mar 10 22:15:51 crc kubenswrapper[4919]: > pod="openstack/ovn-controller-fbfnm" podUID="783e3f3a-7a6f-4b95-a7d2-6988c8a6149b" containerName="ovn-controller" containerID="cri-o://f38ac54b5abf8ebe29460d44b16de61bf12705b9f6a4ac5d48ab1694b6482b7e" Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.083032 4919 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/ovn-controller-fbfnm" podUID="783e3f3a-7a6f-4b95-a7d2-6988c8a6149b" containerName="ovn-controller" containerID="cri-o://f38ac54b5abf8ebe29460d44b16de61bf12705b9f6a4ac5d48ab1694b6482b7e" gracePeriod=22 Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.098063 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3fe05756-9202-4514-8eea-0c786a2b6d56-rabbitmq-tls\") pod \"3fe05756-9202-4514-8eea-0c786a2b6d56\" (UID: \"3fe05756-9202-4514-8eea-0c786a2b6d56\") " Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.098146 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-54x5x\" (UniqueName: \"kubernetes.io/projected/3fe05756-9202-4514-8eea-0c786a2b6d56-kube-api-access-54x5x\") pod \"3fe05756-9202-4514-8eea-0c786a2b6d56\" (UID: \"3fe05756-9202-4514-8eea-0c786a2b6d56\") " Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.098180 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3fe05756-9202-4514-8eea-0c786a2b6d56-rabbitmq-confd\") pod \"3fe05756-9202-4514-8eea-0c786a2b6d56\" (UID: \"3fe05756-9202-4514-8eea-0c786a2b6d56\") " Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.098236 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3fe05756-9202-4514-8eea-0c786a2b6d56-rabbitmq-erlang-cookie\") pod \"3fe05756-9202-4514-8eea-0c786a2b6d56\" (UID: \"3fe05756-9202-4514-8eea-0c786a2b6d56\") " Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.098263 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3fe05756-9202-4514-8eea-0c786a2b6d56-pod-info\") pod \"3fe05756-9202-4514-8eea-0c786a2b6d56\" (UID: 
\"3fe05756-9202-4514-8eea-0c786a2b6d56\") " Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.098325 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"3fe05756-9202-4514-8eea-0c786a2b6d56\" (UID: \"3fe05756-9202-4514-8eea-0c786a2b6d56\") " Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.098348 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3fe05756-9202-4514-8eea-0c786a2b6d56-rabbitmq-plugins\") pod \"3fe05756-9202-4514-8eea-0c786a2b6d56\" (UID: \"3fe05756-9202-4514-8eea-0c786a2b6d56\") " Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.098376 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3fe05756-9202-4514-8eea-0c786a2b6d56-config-data\") pod \"3fe05756-9202-4514-8eea-0c786a2b6d56\" (UID: \"3fe05756-9202-4514-8eea-0c786a2b6d56\") " Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.098436 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3fe05756-9202-4514-8eea-0c786a2b6d56-erlang-cookie-secret\") pod \"3fe05756-9202-4514-8eea-0c786a2b6d56\" (UID: \"3fe05756-9202-4514-8eea-0c786a2b6d56\") " Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.098460 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3fe05756-9202-4514-8eea-0c786a2b6d56-plugins-conf\") pod \"3fe05756-9202-4514-8eea-0c786a2b6d56\" (UID: \"3fe05756-9202-4514-8eea-0c786a2b6d56\") " Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.098508 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/3fe05756-9202-4514-8eea-0c786a2b6d56-server-conf\") pod \"3fe05756-9202-4514-8eea-0c786a2b6d56\" (UID: \"3fe05756-9202-4514-8eea-0c786a2b6d56\") " Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.098852 4919 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/408722a8-2c8a-4bda-82d5-1d2f58bda7d7-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.098871 4919 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/fa3e6892-7a97-4563-b339-6c3acfd36dd3-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.098883 4919 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/fa3e6892-7a97-4563-b339-6c3acfd36dd3-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.098895 4919 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/408722a8-2c8a-4bda-82d5-1d2f58bda7d7-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.098917 4919 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.098929 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r9rjm\" (UniqueName: \"kubernetes.io/projected/fa3e6892-7a97-4563-b339-6c3acfd36dd3-kube-api-access-r9rjm\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.098946 4919 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.098957 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g4p55\" (UniqueName: \"kubernetes.io/projected/408722a8-2c8a-4bda-82d5-1d2f58bda7d7-kube-api-access-g4p55\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.098968 4919 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/9372011b-416f-484d-a873-fdda67baf9fe-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.098979 4919 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/408722a8-2c8a-4bda-82d5-1d2f58bda7d7-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.098990 4919 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/9372011b-416f-484d-a873-fdda67baf9fe-config-data-generated\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.099001 4919 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/408722a8-2c8a-4bda-82d5-1d2f58bda7d7-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.099012 4919 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/408722a8-2c8a-4bda-82d5-1d2f58bda7d7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.099022 4919 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/fa3e6892-7a97-4563-b339-6c3acfd36dd3-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:51 crc 
kubenswrapper[4919]: I0310 22:15:51.099032 4919 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/fa3e6892-7a97-4563-b339-6c3acfd36dd3-pod-info\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.099047 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f6qtw\" (UniqueName: \"kubernetes.io/projected/9372011b-416f-484d-a873-fdda67baf9fe-kube-api-access-f6qtw\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.099058 4919 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/fa3e6892-7a97-4563-b339-6c3acfd36dd3-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.099070 4919 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fa3e6892-7a97-4563-b339-6c3acfd36dd3-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.099080 4919 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/9372011b-416f-484d-a873-fdda67baf9fe-kolla-config\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.099092 4919 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9372011b-416f-484d-a873-fdda67baf9fe-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.099103 4919 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/9372011b-416f-484d-a873-fdda67baf9fe-config-data-default\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.099500 4919 reconciler_common.go:293] "Volume detached for 
volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9372011b-416f-484d-a873-fdda67baf9fe-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.099513 4919 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/408722a8-2c8a-4bda-82d5-1d2f58bda7d7-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.099527 4919 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/408722a8-2c8a-4bda-82d5-1d2f58bda7d7-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.099539 4919 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/fa3e6892-7a97-4563-b339-6c3acfd36dd3-server-conf\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.099549 4919 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/fa3e6892-7a97-4563-b339-6c3acfd36dd3-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.099737 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3fe05756-9202-4514-8eea-0c786a2b6d56-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "3fe05756-9202-4514-8eea-0c786a2b6d56" (UID: "3fe05756-9202-4514-8eea-0c786a2b6d56"). InnerVolumeSpecName "rabbitmq-plugins". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.101474 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3fe05756-9202-4514-8eea-0c786a2b6d56-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "3fe05756-9202-4514-8eea-0c786a2b6d56" (UID: "3fe05756-9202-4514-8eea-0c786a2b6d56"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.102369 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3fe05756-9202-4514-8eea-0c786a2b6d56-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "3fe05756-9202-4514-8eea-0c786a2b6d56" (UID: "3fe05756-9202-4514-8eea-0c786a2b6d56"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.104822 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3fe05756-9202-4514-8eea-0c786a2b6d56-kube-api-access-54x5x" (OuterVolumeSpecName: "kube-api-access-54x5x") pod "3fe05756-9202-4514-8eea-0c786a2b6d56" (UID: "3fe05756-9202-4514-8eea-0c786a2b6d56"). InnerVolumeSpecName "kube-api-access-54x5x". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.106089 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/3fe05756-9202-4514-8eea-0c786a2b6d56-pod-info" (OuterVolumeSpecName: "pod-info") pod "3fe05756-9202-4514-8eea-0c786a2b6d56" (UID: "3fe05756-9202-4514-8eea-0c786a2b6d56"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.107014 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "persistence") pod "3fe05756-9202-4514-8eea-0c786a2b6d56" (UID: "3fe05756-9202-4514-8eea-0c786a2b6d56"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.115534 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3fe05756-9202-4514-8eea-0c786a2b6d56-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "3fe05756-9202-4514-8eea-0c786a2b6d56" (UID: "3fe05756-9202-4514-8eea-0c786a2b6d56"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.119201 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa3e6892-7a97-4563-b339-6c3acfd36dd3-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "fa3e6892-7a97-4563-b339-6c3acfd36dd3" (UID: "fa3e6892-7a97-4563-b339-6c3acfd36dd3"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.131301 4919 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.135533 4919 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.138768 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3fe05756-9202-4514-8eea-0c786a2b6d56-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "3fe05756-9202-4514-8eea-0c786a2b6d56" (UID: "3fe05756-9202-4514-8eea-0c786a2b6d56"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.145180 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3fe05756-9202-4514-8eea-0c786a2b6d56-config-data" (OuterVolumeSpecName: "config-data") pod "3fe05756-9202-4514-8eea-0c786a2b6d56" (UID: "3fe05756-9202-4514-8eea-0c786a2b6d56"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.202410 4919 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3fe05756-9202-4514-8eea-0c786a2b6d56-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.202434 4919 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3fe05756-9202-4514-8eea-0c786a2b6d56-pod-info\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.202461 4919 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.202471 4919 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3fe05756-9202-4514-8eea-0c786a2b6d56-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.202479 4919 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3fe05756-9202-4514-8eea-0c786a2b6d56-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.202487 4919 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3fe05756-9202-4514-8eea-0c786a2b6d56-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.202495 4919 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3fe05756-9202-4514-8eea-0c786a2b6d56-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.202503 4919 
reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.202512 4919 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.202520 4919 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/fa3e6892-7a97-4563-b339-6c3acfd36dd3-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.202528 4919 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3fe05756-9202-4514-8eea-0c786a2b6d56-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.202536 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-54x5x\" (UniqueName: \"kubernetes.io/projected/3fe05756-9202-4514-8eea-0c786a2b6d56-kube-api-access-54x5x\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.203209 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3fe05756-9202-4514-8eea-0c786a2b6d56-server-conf" (OuterVolumeSpecName: "server-conf") pod "3fe05756-9202-4514-8eea-0c786a2b6d56" (UID: "3fe05756-9202-4514-8eea-0c786a2b6d56"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.216080 4919 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.223983 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3fe05756-9202-4514-8eea-0c786a2b6d56-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "3fe05756-9202-4514-8eea-0c786a2b6d56" (UID: "3fe05756-9202-4514-8eea-0c786a2b6d56"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.266976 4919 generic.go:334] "Generic (PLEG): container finished" podID="408722a8-2c8a-4bda-82d5-1d2f58bda7d7" containerID="d441cb2cbe08ef1a248f7014e5b13a5f2346dcbbe5baae3176348c48f4842be7" exitCode=0 Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.267104 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-599f4d795-pgnpd" Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.267605 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-599f4d795-pgnpd" event={"ID":"408722a8-2c8a-4bda-82d5-1d2f58bda7d7","Type":"ContainerDied","Data":"d441cb2cbe08ef1a248f7014e5b13a5f2346dcbbe5baae3176348c48f4842be7"} Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.267685 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-599f4d795-pgnpd" event={"ID":"408722a8-2c8a-4bda-82d5-1d2f58bda7d7","Type":"ContainerDied","Data":"5ff781be5fb816909a57c89be66125e402a38480598bb68dfb13293aa870a6de"} Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.267705 4919 scope.go:117] "RemoveContainer" containerID="d441cb2cbe08ef1a248f7014e5b13a5f2346dcbbe5baae3176348c48f4842be7" Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.270736 4919 generic.go:334] "Generic (PLEG): container finished" podID="9372011b-416f-484d-a873-fdda67baf9fe" containerID="c4f0d5a04934f6107a3721bf5a429219c7956700786a5ddf3b089b5208e91ed4" exitCode=0 Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.270947 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.271288 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"9372011b-416f-484d-a873-fdda67baf9fe","Type":"ContainerDied","Data":"c4f0d5a04934f6107a3721bf5a429219c7956700786a5ddf3b089b5208e91ed4"} Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.271350 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"9372011b-416f-484d-a873-fdda67baf9fe","Type":"ContainerDied","Data":"a81aced24f8ba96209b6fd098a3080c41e122cc2baa8d4abc6cd37a16a08dc96"} Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.303830 4919 generic.go:334] "Generic (PLEG): container finished" podID="fa3e6892-7a97-4563-b339-6c3acfd36dd3" containerID="8e9a7cee8d15c0ec29a2604cb6af26be2d7540dda5209902519b1a0222c5362d" exitCode=0 Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.303903 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"fa3e6892-7a97-4563-b339-6c3acfd36dd3","Type":"ContainerDied","Data":"8e9a7cee8d15c0ec29a2604cb6af26be2d7540dda5209902519b1a0222c5362d"} Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.303934 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"fa3e6892-7a97-4563-b339-6c3acfd36dd3","Type":"ContainerDied","Data":"42e225731776032e5374a48a26b6d33c41f6b89f012d9128261ba5561d589429"} Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.304026 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.304408 4919 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3fe05756-9202-4514-8eea-0c786a2b6d56-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.304790 4919 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.304833 4919 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3fe05756-9202-4514-8eea-0c786a2b6d56-server-conf\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.309269 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-fbfnm_783e3f3a-7a6f-4b95-a7d2-6988c8a6149b/ovn-controller/0.log" Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.309351 4919 generic.go:334] "Generic (PLEG): container finished" podID="783e3f3a-7a6f-4b95-a7d2-6988c8a6149b" containerID="f38ac54b5abf8ebe29460d44b16de61bf12705b9f6a4ac5d48ab1694b6482b7e" exitCode=137 Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.309451 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-fbfnm" event={"ID":"783e3f3a-7a6f-4b95-a7d2-6988c8a6149b","Type":"ContainerDied","Data":"f38ac54b5abf8ebe29460d44b16de61bf12705b9f6a4ac5d48ab1694b6482b7e"} Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.313514 4919 generic.go:334] "Generic (PLEG): container finished" podID="3fe05756-9202-4514-8eea-0c786a2b6d56" containerID="574989ae30539b8ab9b813ac1ebeaa0a635b60aa0d2eba085c37e336b3216913" exitCode=0 Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.313558 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/rabbitmq-cell1-server-0" event={"ID":"3fe05756-9202-4514-8eea-0c786a2b6d56","Type":"ContainerDied","Data":"574989ae30539b8ab9b813ac1ebeaa0a635b60aa0d2eba085c37e336b3216913"} Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.313584 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"3fe05756-9202-4514-8eea-0c786a2b6d56","Type":"ContainerDied","Data":"13e0fb5aa9484b55f86cc7cbbfe20d7f1e22a61a4af9e0dc5e34c9452254e23c"} Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.313665 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.360557 4919 scope.go:117] "RemoveContainer" containerID="d441cb2cbe08ef1a248f7014e5b13a5f2346dcbbe5baae3176348c48f4842be7" Mar 10 22:15:51 crc kubenswrapper[4919]: E0310 22:15:51.362950 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d441cb2cbe08ef1a248f7014e5b13a5f2346dcbbe5baae3176348c48f4842be7\": container with ID starting with d441cb2cbe08ef1a248f7014e5b13a5f2346dcbbe5baae3176348c48f4842be7 not found: ID does not exist" containerID="d441cb2cbe08ef1a248f7014e5b13a5f2346dcbbe5baae3176348c48f4842be7" Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.362995 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d441cb2cbe08ef1a248f7014e5b13a5f2346dcbbe5baae3176348c48f4842be7"} err="failed to get container status \"d441cb2cbe08ef1a248f7014e5b13a5f2346dcbbe5baae3176348c48f4842be7\": rpc error: code = NotFound desc = could not find container \"d441cb2cbe08ef1a248f7014e5b13a5f2346dcbbe5baae3176348c48f4842be7\": container with ID starting with d441cb2cbe08ef1a248f7014e5b13a5f2346dcbbe5baae3176348c48f4842be7 not found: ID does not exist" Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.363025 4919 scope.go:117] 
"RemoveContainer" containerID="c4f0d5a04934f6107a3721bf5a429219c7956700786a5ddf3b089b5208e91ed4" Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.388665 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-galera-0"] Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.394009 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstack-galera-0"] Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.401574 4919 scope.go:117] "RemoveContainer" containerID="9ceea61b15efc0e45f1d8d250401b28f29cdb09cb74682cd8386f87edef5f74c" Mar 10 22:15:51 crc kubenswrapper[4919]: E0310 22:15:51.406985 4919 secret.go:188] Couldn't get secret openstack/neutron-config: secret "neutron-config" not found Mar 10 22:15:51 crc kubenswrapper[4919]: E0310 22:15:51.407151 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0a44bcbb-6e2e-48bb-b7a7-16a4e916001d-config podName:0a44bcbb-6e2e-48bb-b7a7-16a4e916001d nodeName:}" failed. No retries permitted until 2026-03-10 22:15:59.407121863 +0000 UTC m=+1546.649002471 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/secret/0a44bcbb-6e2e-48bb-b7a7-16a4e916001d-config") pod "neutron-846dbc6cd5-kg4kx" (UID: "0a44bcbb-6e2e-48bb-b7a7-16a4e916001d") : secret "neutron-config" not found Mar 10 22:15:51 crc kubenswrapper[4919]: E0310 22:15:51.407670 4919 secret.go:188] Couldn't get secret openstack/neutron-httpd-config: secret "neutron-httpd-config" not found Mar 10 22:15:51 crc kubenswrapper[4919]: E0310 22:15:51.407767 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0a44bcbb-6e2e-48bb-b7a7-16a4e916001d-httpd-config podName:0a44bcbb-6e2e-48bb-b7a7-16a4e916001d nodeName:}" failed. No retries permitted until 2026-03-10 22:15:59.40775835 +0000 UTC m=+1546.649638958 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "httpd-config" (UniqueName: "kubernetes.io/secret/0a44bcbb-6e2e-48bb-b7a7-16a4e916001d-httpd-config") pod "neutron-846dbc6cd5-kg4kx" (UID: "0a44bcbb-6e2e-48bb-b7a7-16a4e916001d") : secret "neutron-httpd-config" not found Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.411461 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-599f4d795-pgnpd"] Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.420142 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-599f4d795-pgnpd"] Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.428130 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-fbfnm_783e3f3a-7a6f-4b95-a7d2-6988c8a6149b/ovn-controller/0.log" Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.428351 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-fbfnm" Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.432865 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.440758 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.446187 4919 scope.go:117] "RemoveContainer" containerID="c4f0d5a04934f6107a3721bf5a429219c7956700786a5ddf3b089b5208e91ed4" Mar 10 22:15:51 crc kubenswrapper[4919]: E0310 22:15:51.446769 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c4f0d5a04934f6107a3721bf5a429219c7956700786a5ddf3b089b5208e91ed4\": container with ID starting with c4f0d5a04934f6107a3721bf5a429219c7956700786a5ddf3b089b5208e91ed4 not found: ID does not exist" containerID="c4f0d5a04934f6107a3721bf5a429219c7956700786a5ddf3b089b5208e91ed4" Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 
22:15:51.446809 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4f0d5a04934f6107a3721bf5a429219c7956700786a5ddf3b089b5208e91ed4"} err="failed to get container status \"c4f0d5a04934f6107a3721bf5a429219c7956700786a5ddf3b089b5208e91ed4\": rpc error: code = NotFound desc = could not find container \"c4f0d5a04934f6107a3721bf5a429219c7956700786a5ddf3b089b5208e91ed4\": container with ID starting with c4f0d5a04934f6107a3721bf5a429219c7956700786a5ddf3b089b5208e91ed4 not found: ID does not exist" Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.446835 4919 scope.go:117] "RemoveContainer" containerID="9ceea61b15efc0e45f1d8d250401b28f29cdb09cb74682cd8386f87edef5f74c" Mar 10 22:15:51 crc kubenswrapper[4919]: E0310 22:15:51.447213 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ceea61b15efc0e45f1d8d250401b28f29cdb09cb74682cd8386f87edef5f74c\": container with ID starting with 9ceea61b15efc0e45f1d8d250401b28f29cdb09cb74682cd8386f87edef5f74c not found: ID does not exist" containerID="9ceea61b15efc0e45f1d8d250401b28f29cdb09cb74682cd8386f87edef5f74c" Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.447240 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ceea61b15efc0e45f1d8d250401b28f29cdb09cb74682cd8386f87edef5f74c"} err="failed to get container status \"9ceea61b15efc0e45f1d8d250401b28f29cdb09cb74682cd8386f87edef5f74c\": rpc error: code = NotFound desc = could not find container \"9ceea61b15efc0e45f1d8d250401b28f29cdb09cb74682cd8386f87edef5f74c\": container with ID starting with 9ceea61b15efc0e45f1d8d250401b28f29cdb09cb74682cd8386f87edef5f74c not found: ID does not exist" Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.447257 4919 scope.go:117] "RemoveContainer" containerID="8e9a7cee8d15c0ec29a2604cb6af26be2d7540dda5209902519b1a0222c5362d" Mar 10 22:15:51 crc 
kubenswrapper[4919]: I0310 22:15:51.447418 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.461736 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.499559 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="008d3aa2-636e-48bc-a09a-00541bc3bd5e" path="/var/lib/kubelet/pods/008d3aa2-636e-48bc-a09a-00541bc3bd5e/volumes" Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.500050 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c34af84-e5e2-4219-b7b2-bf1e2c2a731b" path="/var/lib/kubelet/pods/1c34af84-e5e2-4219-b7b2-bf1e2c2a731b/volumes" Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.500554 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f324194-64d5-4755-847b-f554b94e652c" path="/var/lib/kubelet/pods/1f324194-64d5-4755-847b-f554b94e652c/volumes" Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.501721 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31690f34-6b68-4470-a13e-e16121ec25d2" path="/var/lib/kubelet/pods/31690f34-6b68-4470-a13e-e16121ec25d2/volumes" Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.503348 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3fe05756-9202-4514-8eea-0c786a2b6d56" path="/var/lib/kubelet/pods/3fe05756-9202-4514-8eea-0c786a2b6d56/volumes" Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.504054 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="408722a8-2c8a-4bda-82d5-1d2f58bda7d7" path="/var/lib/kubelet/pods/408722a8-2c8a-4bda-82d5-1d2f58bda7d7/volumes" Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.505100 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4865c8ed-670d-41a0-b9fc-ba7697085e6b" 
path="/var/lib/kubelet/pods/4865c8ed-670d-41a0-b9fc-ba7697085e6b/volumes" Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.505971 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="515105ef-e538-4276-b682-7e05881dc7e8" path="/var/lib/kubelet/pods/515105ef-e538-4276-b682-7e05881dc7e8/volumes" Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.506691 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="76a514a0-0d4c-4f6b-8ba7-cd5b4834d625" path="/var/lib/kubelet/pods/76a514a0-0d4c-4f6b-8ba7-cd5b4834d625/volumes" Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.508250 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bfe342f-267a-4239-a9cc-8df0e3d14a92" path="/var/lib/kubelet/pods/7bfe342f-267a-4239-a9cc-8df0e3d14a92/volumes" Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.508821 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/783e3f3a-7a6f-4b95-a7d2-6988c8a6149b-var-log-ovn\") pod \"783e3f3a-7a6f-4b95-a7d2-6988c8a6149b\" (UID: \"783e3f3a-7a6f-4b95-a7d2-6988c8a6149b\") " Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.508919 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/783e3f3a-7a6f-4b95-a7d2-6988c8a6149b-combined-ca-bundle\") pod \"783e3f3a-7a6f-4b95-a7d2-6988c8a6149b\" (UID: \"783e3f3a-7a6f-4b95-a7d2-6988c8a6149b\") " Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.508907 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/783e3f3a-7a6f-4b95-a7d2-6988c8a6149b-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "783e3f3a-7a6f-4b95-a7d2-6988c8a6149b" (UID: "783e3f3a-7a6f-4b95-a7d2-6988c8a6149b"). InnerVolumeSpecName "var-log-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.508967 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/783e3f3a-7a6f-4b95-a7d2-6988c8a6149b-var-run-ovn\") pod \"783e3f3a-7a6f-4b95-a7d2-6988c8a6149b\" (UID: \"783e3f3a-7a6f-4b95-a7d2-6988c8a6149b\") " Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.509004 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/783e3f3a-7a6f-4b95-a7d2-6988c8a6149b-var-run\") pod \"783e3f3a-7a6f-4b95-a7d2-6988c8a6149b\" (UID: \"783e3f3a-7a6f-4b95-a7d2-6988c8a6149b\") " Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.509025 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lq5jc\" (UniqueName: \"kubernetes.io/projected/783e3f3a-7a6f-4b95-a7d2-6988c8a6149b-kube-api-access-lq5jc\") pod \"783e3f3a-7a6f-4b95-a7d2-6988c8a6149b\" (UID: \"783e3f3a-7a6f-4b95-a7d2-6988c8a6149b\") " Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.509052 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/783e3f3a-7a6f-4b95-a7d2-6988c8a6149b-ovn-controller-tls-certs\") pod \"783e3f3a-7a6f-4b95-a7d2-6988c8a6149b\" (UID: \"783e3f3a-7a6f-4b95-a7d2-6988c8a6149b\") " Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.509129 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/783e3f3a-7a6f-4b95-a7d2-6988c8a6149b-scripts\") pod \"783e3f3a-7a6f-4b95-a7d2-6988c8a6149b\" (UID: \"783e3f3a-7a6f-4b95-a7d2-6988c8a6149b\") " Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.509456 4919 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: 
\"kubernetes.io/host-path/783e3f3a-7a6f-4b95-a7d2-6988c8a6149b-var-log-ovn\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.509501 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/783e3f3a-7a6f-4b95-a7d2-6988c8a6149b-var-run" (OuterVolumeSpecName: "var-run") pod "783e3f3a-7a6f-4b95-a7d2-6988c8a6149b" (UID: "783e3f3a-7a6f-4b95-a7d2-6988c8a6149b"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.509629 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/783e3f3a-7a6f-4b95-a7d2-6988c8a6149b-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "783e3f3a-7a6f-4b95-a7d2-6988c8a6149b" (UID: "783e3f3a-7a6f-4b95-a7d2-6988c8a6149b"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.510747 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/783e3f3a-7a6f-4b95-a7d2-6988c8a6149b-scripts" (OuterVolumeSpecName: "scripts") pod "783e3f3a-7a6f-4b95-a7d2-6988c8a6149b" (UID: "783e3f3a-7a6f-4b95-a7d2-6988c8a6149b"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.510943 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="81489e39-0246-4065-8835-31b1e5da8431" path="/var/lib/kubelet/pods/81489e39-0246-4065-8835-31b1e5da8431/volumes" Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.511850 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="91a933f1-aa44-4375-8f5c-e5f3567e6c8e" path="/var/lib/kubelet/pods/91a933f1-aa44-4375-8f5c-e5f3567e6c8e/volumes" Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.512341 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/783e3f3a-7a6f-4b95-a7d2-6988c8a6149b-kube-api-access-lq5jc" (OuterVolumeSpecName: "kube-api-access-lq5jc") pod "783e3f3a-7a6f-4b95-a7d2-6988c8a6149b" (UID: "783e3f3a-7a6f-4b95-a7d2-6988c8a6149b"). InnerVolumeSpecName "kube-api-access-lq5jc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.513513 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9372011b-416f-484d-a873-fdda67baf9fe" path="/var/lib/kubelet/pods/9372011b-416f-484d-a873-fdda67baf9fe/volumes" Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.514160 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="981bb03c-23be-4bf8-a9f6-cb8a552f66a5" path="/var/lib/kubelet/pods/981bb03c-23be-4bf8-a9f6-cb8a552f66a5/volumes" Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.514795 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a01f4397-9fee-4ff4-af76-ed0b37f04b28" path="/var/lib/kubelet/pods/a01f4397-9fee-4ff4-af76-ed0b37f04b28/volumes" Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.515823 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4a88061-cba8-4535-bf01-5285d8cbb79f" 
path="/var/lib/kubelet/pods/a4a88061-cba8-4535-bf01-5285d8cbb79f/volumes" Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.516448 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab479995-b87a-46b8-9a4e-d9e95d556775" path="/var/lib/kubelet/pods/ab479995-b87a-46b8-9a4e-d9e95d556775/volumes" Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.517212 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4c91bdf-1945-4c8a-ab05-3b619e4bdb2b" path="/var/lib/kubelet/pods/f4c91bdf-1945-4c8a-ab05-3b619e4bdb2b/volumes" Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.518443 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa3e6892-7a97-4563-b339-6c3acfd36dd3" path="/var/lib/kubelet/pods/fa3e6892-7a97-4563-b339-6c3acfd36dd3/volumes" Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.537796 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/783e3f3a-7a6f-4b95-a7d2-6988c8a6149b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "783e3f3a-7a6f-4b95-a7d2-6988c8a6149b" (UID: "783e3f3a-7a6f-4b95-a7d2-6988c8a6149b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.571770 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/783e3f3a-7a6f-4b95-a7d2-6988c8a6149b-ovn-controller-tls-certs" (OuterVolumeSpecName: "ovn-controller-tls-certs") pod "783e3f3a-7a6f-4b95-a7d2-6988c8a6149b" (UID: "783e3f3a-7a6f-4b95-a7d2-6988c8a6149b"). InnerVolumeSpecName "ovn-controller-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.612058 4919 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/783e3f3a-7a6f-4b95-a7d2-6988c8a6149b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.612090 4919 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/783e3f3a-7a6f-4b95-a7d2-6988c8a6149b-var-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.612103 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lq5jc\" (UniqueName: \"kubernetes.io/projected/783e3f3a-7a6f-4b95-a7d2-6988c8a6149b-kube-api-access-lq5jc\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.612115 4919 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/783e3f3a-7a6f-4b95-a7d2-6988c8a6149b-var-run\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.612125 4919 reconciler_common.go:293] "Volume detached for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/783e3f3a-7a6f-4b95-a7d2-6988c8a6149b-ovn-controller-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.612135 4919 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/783e3f3a-7a6f-4b95-a7d2-6988c8a6149b-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.816377 4919 scope.go:117] "RemoveContainer" containerID="1ed5abf42f687ad1c4876f258add313618dad5f265e35efc4895ebc955fec9a3" Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.837253 4919 scope.go:117] "RemoveContainer" 
containerID="8e9a7cee8d15c0ec29a2604cb6af26be2d7540dda5209902519b1a0222c5362d" Mar 10 22:15:51 crc kubenswrapper[4919]: E0310 22:15:51.837737 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e9a7cee8d15c0ec29a2604cb6af26be2d7540dda5209902519b1a0222c5362d\": container with ID starting with 8e9a7cee8d15c0ec29a2604cb6af26be2d7540dda5209902519b1a0222c5362d not found: ID does not exist" containerID="8e9a7cee8d15c0ec29a2604cb6af26be2d7540dda5209902519b1a0222c5362d" Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.837766 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e9a7cee8d15c0ec29a2604cb6af26be2d7540dda5209902519b1a0222c5362d"} err="failed to get container status \"8e9a7cee8d15c0ec29a2604cb6af26be2d7540dda5209902519b1a0222c5362d\": rpc error: code = NotFound desc = could not find container \"8e9a7cee8d15c0ec29a2604cb6af26be2d7540dda5209902519b1a0222c5362d\": container with ID starting with 8e9a7cee8d15c0ec29a2604cb6af26be2d7540dda5209902519b1a0222c5362d not found: ID does not exist" Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.837786 4919 scope.go:117] "RemoveContainer" containerID="1ed5abf42f687ad1c4876f258add313618dad5f265e35efc4895ebc955fec9a3" Mar 10 22:15:51 crc kubenswrapper[4919]: E0310 22:15:51.838856 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ed5abf42f687ad1c4876f258add313618dad5f265e35efc4895ebc955fec9a3\": container with ID starting with 1ed5abf42f687ad1c4876f258add313618dad5f265e35efc4895ebc955fec9a3 not found: ID does not exist" containerID="1ed5abf42f687ad1c4876f258add313618dad5f265e35efc4895ebc955fec9a3" Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.838883 4919 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"1ed5abf42f687ad1c4876f258add313618dad5f265e35efc4895ebc955fec9a3"} err="failed to get container status \"1ed5abf42f687ad1c4876f258add313618dad5f265e35efc4895ebc955fec9a3\": rpc error: code = NotFound desc = could not find container \"1ed5abf42f687ad1c4876f258add313618dad5f265e35efc4895ebc955fec9a3\": container with ID starting with 1ed5abf42f687ad1c4876f258add313618dad5f265e35efc4895ebc955fec9a3 not found: ID does not exist" Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.838896 4919 scope.go:117] "RemoveContainer" containerID="574989ae30539b8ab9b813ac1ebeaa0a635b60aa0d2eba085c37e336b3216913" Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.854105 4919 scope.go:117] "RemoveContainer" containerID="751af40a46c32202c740dcad6ce5d333888d6711ba4fa0cefd841e26d404db99" Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.880848 4919 scope.go:117] "RemoveContainer" containerID="574989ae30539b8ab9b813ac1ebeaa0a635b60aa0d2eba085c37e336b3216913" Mar 10 22:15:51 crc kubenswrapper[4919]: E0310 22:15:51.882779 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"574989ae30539b8ab9b813ac1ebeaa0a635b60aa0d2eba085c37e336b3216913\": container with ID starting with 574989ae30539b8ab9b813ac1ebeaa0a635b60aa0d2eba085c37e336b3216913 not found: ID does not exist" containerID="574989ae30539b8ab9b813ac1ebeaa0a635b60aa0d2eba085c37e336b3216913" Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.882808 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"574989ae30539b8ab9b813ac1ebeaa0a635b60aa0d2eba085c37e336b3216913"} err="failed to get container status \"574989ae30539b8ab9b813ac1ebeaa0a635b60aa0d2eba085c37e336b3216913\": rpc error: code = NotFound desc = could not find container \"574989ae30539b8ab9b813ac1ebeaa0a635b60aa0d2eba085c37e336b3216913\": container with ID starting with 
574989ae30539b8ab9b813ac1ebeaa0a635b60aa0d2eba085c37e336b3216913 not found: ID does not exist" Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.882828 4919 scope.go:117] "RemoveContainer" containerID="751af40a46c32202c740dcad6ce5d333888d6711ba4fa0cefd841e26d404db99" Mar 10 22:15:51 crc kubenswrapper[4919]: E0310 22:15:51.883093 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"751af40a46c32202c740dcad6ce5d333888d6711ba4fa0cefd841e26d404db99\": container with ID starting with 751af40a46c32202c740dcad6ce5d333888d6711ba4fa0cefd841e26d404db99 not found: ID does not exist" containerID="751af40a46c32202c740dcad6ce5d333888d6711ba4fa0cefd841e26d404db99" Mar 10 22:15:51 crc kubenswrapper[4919]: I0310 22:15:51.883110 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"751af40a46c32202c740dcad6ce5d333888d6711ba4fa0cefd841e26d404db99"} err="failed to get container status \"751af40a46c32202c740dcad6ce5d333888d6711ba4fa0cefd841e26d404db99\": rpc error: code = NotFound desc = could not find container \"751af40a46c32202c740dcad6ce5d333888d6711ba4fa0cefd841e26d404db99\": container with ID starting with 751af40a46c32202c740dcad6ce5d333888d6711ba4fa0cefd841e26d404db99 not found: ID does not exist" Mar 10 22:15:52 crc kubenswrapper[4919]: I0310 22:15:52.023893 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 10 22:15:52 crc kubenswrapper[4919]: I0310 22:15:52.096229 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 10 22:15:52 crc kubenswrapper[4919]: I0310 22:15:52.107901 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 10 22:15:52 crc kubenswrapper[4919]: I0310 22:15:52.118157 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a7ad3ed-9144-4a21-808c-23d613354a2f-config-data\") pod \"4a7ad3ed-9144-4a21-808c-23d613354a2f\" (UID: \"4a7ad3ed-9144-4a21-808c-23d613354a2f\") " Mar 10 22:15:52 crc kubenswrapper[4919]: I0310 22:15:52.118290 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sdfp5\" (UniqueName: \"kubernetes.io/projected/4a7ad3ed-9144-4a21-808c-23d613354a2f-kube-api-access-sdfp5\") pod \"4a7ad3ed-9144-4a21-808c-23d613354a2f\" (UID: \"4a7ad3ed-9144-4a21-808c-23d613354a2f\") " Mar 10 22:15:52 crc kubenswrapper[4919]: I0310 22:15:52.118343 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a7ad3ed-9144-4a21-808c-23d613354a2f-combined-ca-bundle\") pod \"4a7ad3ed-9144-4a21-808c-23d613354a2f\" (UID: \"4a7ad3ed-9144-4a21-808c-23d613354a2f\") " Mar 10 22:15:52 crc kubenswrapper[4919]: I0310 22:15:52.125565 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a7ad3ed-9144-4a21-808c-23d613354a2f-kube-api-access-sdfp5" (OuterVolumeSpecName: "kube-api-access-sdfp5") pod "4a7ad3ed-9144-4a21-808c-23d613354a2f" (UID: "4a7ad3ed-9144-4a21-808c-23d613354a2f"). InnerVolumeSpecName "kube-api-access-sdfp5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 22:15:52 crc kubenswrapper[4919]: I0310 22:15:52.143738 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a7ad3ed-9144-4a21-808c-23d613354a2f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4a7ad3ed-9144-4a21-808c-23d613354a2f" (UID: "4a7ad3ed-9144-4a21-808c-23d613354a2f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:15:52 crc kubenswrapper[4919]: I0310 22:15:52.144558 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a7ad3ed-9144-4a21-808c-23d613354a2f-config-data" (OuterVolumeSpecName: "config-data") pod "4a7ad3ed-9144-4a21-808c-23d613354a2f" (UID: "4a7ad3ed-9144-4a21-808c-23d613354a2f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:15:52 crc kubenswrapper[4919]: I0310 22:15:52.219965 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lm9xr\" (UniqueName: \"kubernetes.io/projected/9700fb27-6a74-428d-a2e6-71c237b3e054-kube-api-access-lm9xr\") pod \"9700fb27-6a74-428d-a2e6-71c237b3e054\" (UID: \"9700fb27-6a74-428d-a2e6-71c237b3e054\") " Mar 10 22:15:52 crc kubenswrapper[4919]: I0310 22:15:52.220021 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1f5a3b8-c9ca-403a-aecf-f6fbf286b145-config-data\") pod \"b1f5a3b8-c9ca-403a-aecf-f6fbf286b145\" (UID: \"b1f5a3b8-c9ca-403a-aecf-f6fbf286b145\") " Mar 10 22:15:52 crc kubenswrapper[4919]: I0310 22:15:52.220079 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9700fb27-6a74-428d-a2e6-71c237b3e054-config-data\") pod \"9700fb27-6a74-428d-a2e6-71c237b3e054\" (UID: \"9700fb27-6a74-428d-a2e6-71c237b3e054\") " Mar 10 22:15:52 crc kubenswrapper[4919]: I0310 22:15:52.220106 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9700fb27-6a74-428d-a2e6-71c237b3e054-combined-ca-bundle\") pod \"9700fb27-6a74-428d-a2e6-71c237b3e054\" (UID: \"9700fb27-6a74-428d-a2e6-71c237b3e054\") " Mar 10 22:15:52 crc kubenswrapper[4919]: I0310 22:15:52.220140 4919 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b1f5a3b8-c9ca-403a-aecf-f6fbf286b145-config-data-custom\") pod \"b1f5a3b8-c9ca-403a-aecf-f6fbf286b145\" (UID: \"b1f5a3b8-c9ca-403a-aecf-f6fbf286b145\") " Mar 10 22:15:52 crc kubenswrapper[4919]: I0310 22:15:52.220167 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1f5a3b8-c9ca-403a-aecf-f6fbf286b145-combined-ca-bundle\") pod \"b1f5a3b8-c9ca-403a-aecf-f6fbf286b145\" (UID: \"b1f5a3b8-c9ca-403a-aecf-f6fbf286b145\") " Mar 10 22:15:52 crc kubenswrapper[4919]: I0310 22:15:52.220199 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b1f5a3b8-c9ca-403a-aecf-f6fbf286b145-scripts\") pod \"b1f5a3b8-c9ca-403a-aecf-f6fbf286b145\" (UID: \"b1f5a3b8-c9ca-403a-aecf-f6fbf286b145\") " Mar 10 22:15:52 crc kubenswrapper[4919]: I0310 22:15:52.220225 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b1f5a3b8-c9ca-403a-aecf-f6fbf286b145-etc-machine-id\") pod \"b1f5a3b8-c9ca-403a-aecf-f6fbf286b145\" (UID: \"b1f5a3b8-c9ca-403a-aecf-f6fbf286b145\") " Mar 10 22:15:52 crc kubenswrapper[4919]: I0310 22:15:52.220292 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c282s\" (UniqueName: \"kubernetes.io/projected/b1f5a3b8-c9ca-403a-aecf-f6fbf286b145-kube-api-access-c282s\") pod \"b1f5a3b8-c9ca-403a-aecf-f6fbf286b145\" (UID: \"b1f5a3b8-c9ca-403a-aecf-f6fbf286b145\") " Mar 10 22:15:52 crc kubenswrapper[4919]: I0310 22:15:52.220686 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sdfp5\" (UniqueName: \"kubernetes.io/projected/4a7ad3ed-9144-4a21-808c-23d613354a2f-kube-api-access-sdfp5\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:52 crc 
kubenswrapper[4919]: I0310 22:15:52.220703 4919 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a7ad3ed-9144-4a21-808c-23d613354a2f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:52 crc kubenswrapper[4919]: I0310 22:15:52.220716 4919 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a7ad3ed-9144-4a21-808c-23d613354a2f-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:52 crc kubenswrapper[4919]: I0310 22:15:52.221054 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b1f5a3b8-c9ca-403a-aecf-f6fbf286b145-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "b1f5a3b8-c9ca-403a-aecf-f6fbf286b145" (UID: "b1f5a3b8-c9ca-403a-aecf-f6fbf286b145"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 22:15:52 crc kubenswrapper[4919]: I0310 22:15:52.223281 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9700fb27-6a74-428d-a2e6-71c237b3e054-kube-api-access-lm9xr" (OuterVolumeSpecName: "kube-api-access-lm9xr") pod "9700fb27-6a74-428d-a2e6-71c237b3e054" (UID: "9700fb27-6a74-428d-a2e6-71c237b3e054"). InnerVolumeSpecName "kube-api-access-lm9xr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 22:15:52 crc kubenswrapper[4919]: I0310 22:15:52.223412 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1f5a3b8-c9ca-403a-aecf-f6fbf286b145-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "b1f5a3b8-c9ca-403a-aecf-f6fbf286b145" (UID: "b1f5a3b8-c9ca-403a-aecf-f6fbf286b145"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:15:52 crc kubenswrapper[4919]: I0310 22:15:52.223682 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1f5a3b8-c9ca-403a-aecf-f6fbf286b145-kube-api-access-c282s" (OuterVolumeSpecName: "kube-api-access-c282s") pod "b1f5a3b8-c9ca-403a-aecf-f6fbf286b145" (UID: "b1f5a3b8-c9ca-403a-aecf-f6fbf286b145"). InnerVolumeSpecName "kube-api-access-c282s". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 22:15:52 crc kubenswrapper[4919]: I0310 22:15:52.223712 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1f5a3b8-c9ca-403a-aecf-f6fbf286b145-scripts" (OuterVolumeSpecName: "scripts") pod "b1f5a3b8-c9ca-403a-aecf-f6fbf286b145" (UID: "b1f5a3b8-c9ca-403a-aecf-f6fbf286b145"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:15:52 crc kubenswrapper[4919]: I0310 22:15:52.240178 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9700fb27-6a74-428d-a2e6-71c237b3e054-config-data" (OuterVolumeSpecName: "config-data") pod "9700fb27-6a74-428d-a2e6-71c237b3e054" (UID: "9700fb27-6a74-428d-a2e6-71c237b3e054"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:15:52 crc kubenswrapper[4919]: I0310 22:15:52.251730 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9700fb27-6a74-428d-a2e6-71c237b3e054-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9700fb27-6a74-428d-a2e6-71c237b3e054" (UID: "9700fb27-6a74-428d-a2e6-71c237b3e054"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:15:52 crc kubenswrapper[4919]: I0310 22:15:52.264159 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1f5a3b8-c9ca-403a-aecf-f6fbf286b145-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b1f5a3b8-c9ca-403a-aecf-f6fbf286b145" (UID: "b1f5a3b8-c9ca-403a-aecf-f6fbf286b145"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:15:52 crc kubenswrapper[4919]: I0310 22:15:52.321741 4919 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b1f5a3b8-c9ca-403a-aecf-f6fbf286b145-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:52 crc kubenswrapper[4919]: I0310 22:15:52.322882 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c282s\" (UniqueName: \"kubernetes.io/projected/b1f5a3b8-c9ca-403a-aecf-f6fbf286b145-kube-api-access-c282s\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:52 crc kubenswrapper[4919]: I0310 22:15:52.323022 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lm9xr\" (UniqueName: \"kubernetes.io/projected/9700fb27-6a74-428d-a2e6-71c237b3e054-kube-api-access-lm9xr\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:52 crc kubenswrapper[4919]: I0310 22:15:52.323104 4919 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9700fb27-6a74-428d-a2e6-71c237b3e054-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:52 crc kubenswrapper[4919]: I0310 22:15:52.323197 4919 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9700fb27-6a74-428d-a2e6-71c237b3e054-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:52 crc kubenswrapper[4919]: I0310 22:15:52.323294 4919 reconciler_common.go:293] "Volume detached for volume 
\"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b1f5a3b8-c9ca-403a-aecf-f6fbf286b145-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:52 crc kubenswrapper[4919]: I0310 22:15:52.323353 4919 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1f5a3b8-c9ca-403a-aecf-f6fbf286b145-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:52 crc kubenswrapper[4919]: I0310 22:15:52.323430 4919 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b1f5a3b8-c9ca-403a-aecf-f6fbf286b145-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:52 crc kubenswrapper[4919]: I0310 22:15:52.324938 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1f5a3b8-c9ca-403a-aecf-f6fbf286b145-config-data" (OuterVolumeSpecName: "config-data") pod "b1f5a3b8-c9ca-403a-aecf-f6fbf286b145" (UID: "b1f5a3b8-c9ca-403a-aecf-f6fbf286b145"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:15:52 crc kubenswrapper[4919]: I0310 22:15:52.326839 4919 generic.go:334] "Generic (PLEG): container finished" podID="9700fb27-6a74-428d-a2e6-71c237b3e054" containerID="8bc20ce51b10a668d26fcfd7ec96ed2a288dcdabfde25fcc336eb1a622b6f4e5" exitCode=0 Mar 10 22:15:52 crc kubenswrapper[4919]: I0310 22:15:52.326885 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"9700fb27-6a74-428d-a2e6-71c237b3e054","Type":"ContainerDied","Data":"8bc20ce51b10a668d26fcfd7ec96ed2a288dcdabfde25fcc336eb1a622b6f4e5"} Mar 10 22:15:52 crc kubenswrapper[4919]: I0310 22:15:52.326909 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"9700fb27-6a74-428d-a2e6-71c237b3e054","Type":"ContainerDied","Data":"54964dbcb0afc9490d2734ef68d0cbd04f49fcea1dc31452b540f5a029cb55be"} Mar 10 22:15:52 crc kubenswrapper[4919]: I0310 22:15:52.326927 4919 scope.go:117] "RemoveContainer" containerID="8bc20ce51b10a668d26fcfd7ec96ed2a288dcdabfde25fcc336eb1a622b6f4e5" Mar 10 22:15:52 crc kubenswrapper[4919]: I0310 22:15:52.327009 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 10 22:15:52 crc kubenswrapper[4919]: I0310 22:15:52.331676 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 10 22:15:52 crc kubenswrapper[4919]: I0310 22:15:52.337803 4919 generic.go:334] "Generic (PLEG): container finished" podID="b1f5a3b8-c9ca-403a-aecf-f6fbf286b145" containerID="8788e8f4a8fdff775edb373c25584b90721c4e93529ebfa7f6ee7f0858b36923" exitCode=0 Mar 10 22:15:52 crc kubenswrapper[4919]: I0310 22:15:52.337961 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"b1f5a3b8-c9ca-403a-aecf-f6fbf286b145","Type":"ContainerDied","Data":"8788e8f4a8fdff775edb373c25584b90721c4e93529ebfa7f6ee7f0858b36923"} Mar 10 22:15:52 crc kubenswrapper[4919]: I0310 22:15:52.337993 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"b1f5a3b8-c9ca-403a-aecf-f6fbf286b145","Type":"ContainerDied","Data":"6bf1a770ba875694b09781b4028dc4d2d064955bd543ca5b52b1039d941ab946"} Mar 10 22:15:52 crc kubenswrapper[4919]: I0310 22:15:52.338187 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 10 22:15:52 crc kubenswrapper[4919]: I0310 22:15:52.340643 4919 generic.go:334] "Generic (PLEG): container finished" podID="4a7ad3ed-9144-4a21-808c-23d613354a2f" containerID="9a7f54f0ad1bc99653d56471ca107558d95729ca0a75e6040163ca4e8d3452b4" exitCode=0 Mar 10 22:15:52 crc kubenswrapper[4919]: I0310 22:15:52.340697 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"4a7ad3ed-9144-4a21-808c-23d613354a2f","Type":"ContainerDied","Data":"9a7f54f0ad1bc99653d56471ca107558d95729ca0a75e6040163ca4e8d3452b4"} Mar 10 22:15:52 crc kubenswrapper[4919]: I0310 22:15:52.340740 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"4a7ad3ed-9144-4a21-808c-23d613354a2f","Type":"ContainerDied","Data":"3c1cec38dad6b500fe1db67b824dc4b2fa037a697e7256d18f0d6bb402e4e332"} Mar 10 22:15:52 crc kubenswrapper[4919]: I0310 22:15:52.340913 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 10 22:15:52 crc kubenswrapper[4919]: I0310 22:15:52.343216 4919 generic.go:334] "Generic (PLEG): container finished" podID="62814b8d-8679-4350-be7d-5f729f901846" containerID="ba1ede56006ea1128e8e67460a4bb03bb7a7ac205f92a9ada4f61f419402b0a6" exitCode=0 Mar 10 22:15:52 crc kubenswrapper[4919]: I0310 22:15:52.343264 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"62814b8d-8679-4350-be7d-5f729f901846","Type":"ContainerDied","Data":"ba1ede56006ea1128e8e67460a4bb03bb7a7ac205f92a9ada4f61f419402b0a6"} Mar 10 22:15:52 crc kubenswrapper[4919]: I0310 22:15:52.343323 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 10 22:15:52 crc kubenswrapper[4919]: I0310 22:15:52.345682 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-fbfnm_783e3f3a-7a6f-4b95-a7d2-6988c8a6149b/ovn-controller/0.log" Mar 10 22:15:52 crc kubenswrapper[4919]: I0310 22:15:52.345744 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-fbfnm" event={"ID":"783e3f3a-7a6f-4b95-a7d2-6988c8a6149b","Type":"ContainerDied","Data":"3bc851905a66630d9ff057ffa31ad0069018b7e623a4e77511edc9d62258d0db"} Mar 10 22:15:52 crc kubenswrapper[4919]: I0310 22:15:52.345819 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-fbfnm" Mar 10 22:15:52 crc kubenswrapper[4919]: I0310 22:15:52.394054 4919 scope.go:117] "RemoveContainer" containerID="8bc20ce51b10a668d26fcfd7ec96ed2a288dcdabfde25fcc336eb1a622b6f4e5" Mar 10 22:15:52 crc kubenswrapper[4919]: E0310 22:15:52.395086 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8bc20ce51b10a668d26fcfd7ec96ed2a288dcdabfde25fcc336eb1a622b6f4e5\": container with ID starting with 8bc20ce51b10a668d26fcfd7ec96ed2a288dcdabfde25fcc336eb1a622b6f4e5 not found: ID does not exist" containerID="8bc20ce51b10a668d26fcfd7ec96ed2a288dcdabfde25fcc336eb1a622b6f4e5" Mar 10 22:15:52 crc kubenswrapper[4919]: I0310 22:15:52.395124 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8bc20ce51b10a668d26fcfd7ec96ed2a288dcdabfde25fcc336eb1a622b6f4e5"} err="failed to get container status \"8bc20ce51b10a668d26fcfd7ec96ed2a288dcdabfde25fcc336eb1a622b6f4e5\": rpc error: code = NotFound desc = could not find container \"8bc20ce51b10a668d26fcfd7ec96ed2a288dcdabfde25fcc336eb1a622b6f4e5\": container with ID starting with 8bc20ce51b10a668d26fcfd7ec96ed2a288dcdabfde25fcc336eb1a622b6f4e5 not found: ID does not 
exist" Mar 10 22:15:52 crc kubenswrapper[4919]: I0310 22:15:52.395176 4919 scope.go:117] "RemoveContainer" containerID="3b0a6033d190b7300fe815ac8e78922471798cec143251073885cde5a79cf846" Mar 10 22:15:52 crc kubenswrapper[4919]: I0310 22:15:52.398830 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 10 22:15:52 crc kubenswrapper[4919]: I0310 22:15:52.408446 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 10 22:15:52 crc kubenswrapper[4919]: I0310 22:15:52.424373 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pbrmd\" (UniqueName: \"kubernetes.io/projected/62814b8d-8679-4350-be7d-5f729f901846-kube-api-access-pbrmd\") pod \"62814b8d-8679-4350-be7d-5f729f901846\" (UID: \"62814b8d-8679-4350-be7d-5f729f901846\") " Mar 10 22:15:52 crc kubenswrapper[4919]: I0310 22:15:52.424488 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62814b8d-8679-4350-be7d-5f729f901846-combined-ca-bundle\") pod \"62814b8d-8679-4350-be7d-5f729f901846\" (UID: \"62814b8d-8679-4350-be7d-5f729f901846\") " Mar 10 22:15:52 crc kubenswrapper[4919]: I0310 22:15:52.424530 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62814b8d-8679-4350-be7d-5f729f901846-config-data\") pod \"62814b8d-8679-4350-be7d-5f729f901846\" (UID: \"62814b8d-8679-4350-be7d-5f729f901846\") " Mar 10 22:15:52 crc kubenswrapper[4919]: I0310 22:15:52.424906 4919 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1f5a3b8-c9ca-403a-aecf-f6fbf286b145-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:52 crc kubenswrapper[4919]: I0310 22:15:52.431344 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/62814b8d-8679-4350-be7d-5f729f901846-kube-api-access-pbrmd" (OuterVolumeSpecName: "kube-api-access-pbrmd") pod "62814b8d-8679-4350-be7d-5f729f901846" (UID: "62814b8d-8679-4350-be7d-5f729f901846"). InnerVolumeSpecName "kube-api-access-pbrmd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 22:15:52 crc kubenswrapper[4919]: I0310 22:15:52.439928 4919 scope.go:117] "RemoveContainer" containerID="8788e8f4a8fdff775edb373c25584b90721c4e93529ebfa7f6ee7f0858b36923" Mar 10 22:15:52 crc kubenswrapper[4919]: I0310 22:15:52.440990 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 10 22:15:52 crc kubenswrapper[4919]: I0310 22:15:52.448741 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 10 22:15:52 crc kubenswrapper[4919]: I0310 22:15:52.450534 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62814b8d-8679-4350-be7d-5f729f901846-config-data" (OuterVolumeSpecName: "config-data") pod "62814b8d-8679-4350-be7d-5f729f901846" (UID: "62814b8d-8679-4350-be7d-5f729f901846"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:15:52 crc kubenswrapper[4919]: I0310 22:15:52.454975 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-fbfnm"] Mar 10 22:15:52 crc kubenswrapper[4919]: I0310 22:15:52.459765 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-fbfnm"] Mar 10 22:15:52 crc kubenswrapper[4919]: I0310 22:15:52.462689 4919 scope.go:117] "RemoveContainer" containerID="3b0a6033d190b7300fe815ac8e78922471798cec143251073885cde5a79cf846" Mar 10 22:15:52 crc kubenswrapper[4919]: E0310 22:15:52.463051 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b0a6033d190b7300fe815ac8e78922471798cec143251073885cde5a79cf846\": container with ID starting with 3b0a6033d190b7300fe815ac8e78922471798cec143251073885cde5a79cf846 not found: ID does not exist" containerID="3b0a6033d190b7300fe815ac8e78922471798cec143251073885cde5a79cf846" Mar 10 22:15:52 crc kubenswrapper[4919]: I0310 22:15:52.463175 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b0a6033d190b7300fe815ac8e78922471798cec143251073885cde5a79cf846"} err="failed to get container status \"3b0a6033d190b7300fe815ac8e78922471798cec143251073885cde5a79cf846\": rpc error: code = NotFound desc = could not find container \"3b0a6033d190b7300fe815ac8e78922471798cec143251073885cde5a79cf846\": container with ID starting with 3b0a6033d190b7300fe815ac8e78922471798cec143251073885cde5a79cf846 not found: ID does not exist" Mar 10 22:15:52 crc kubenswrapper[4919]: I0310 22:15:52.463255 4919 scope.go:117] "RemoveContainer" containerID="8788e8f4a8fdff775edb373c25584b90721c4e93529ebfa7f6ee7f0858b36923" Mar 10 22:15:52 crc kubenswrapper[4919]: E0310 22:15:52.463918 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"8788e8f4a8fdff775edb373c25584b90721c4e93529ebfa7f6ee7f0858b36923\": container with ID starting with 8788e8f4a8fdff775edb373c25584b90721c4e93529ebfa7f6ee7f0858b36923 not found: ID does not exist" containerID="8788e8f4a8fdff775edb373c25584b90721c4e93529ebfa7f6ee7f0858b36923" Mar 10 22:15:52 crc kubenswrapper[4919]: I0310 22:15:52.463959 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8788e8f4a8fdff775edb373c25584b90721c4e93529ebfa7f6ee7f0858b36923"} err="failed to get container status \"8788e8f4a8fdff775edb373c25584b90721c4e93529ebfa7f6ee7f0858b36923\": rpc error: code = NotFound desc = could not find container \"8788e8f4a8fdff775edb373c25584b90721c4e93529ebfa7f6ee7f0858b36923\": container with ID starting with 8788e8f4a8fdff775edb373c25584b90721c4e93529ebfa7f6ee7f0858b36923 not found: ID does not exist" Mar 10 22:15:52 crc kubenswrapper[4919]: I0310 22:15:52.463986 4919 scope.go:117] "RemoveContainer" containerID="9a7f54f0ad1bc99653d56471ca107558d95729ca0a75e6040163ca4e8d3452b4" Mar 10 22:15:52 crc kubenswrapper[4919]: I0310 22:15:52.465105 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 10 22:15:52 crc kubenswrapper[4919]: I0310 22:15:52.478729 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 10 22:15:52 crc kubenswrapper[4919]: I0310 22:15:52.479456 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62814b8d-8679-4350-be7d-5f729f901846-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "62814b8d-8679-4350-be7d-5f729f901846" (UID: "62814b8d-8679-4350-be7d-5f729f901846"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:15:52 crc kubenswrapper[4919]: I0310 22:15:52.526416 4919 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62814b8d-8679-4350-be7d-5f729f901846-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:52 crc kubenswrapper[4919]: I0310 22:15:52.526455 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pbrmd\" (UniqueName: \"kubernetes.io/projected/62814b8d-8679-4350-be7d-5f729f901846-kube-api-access-pbrmd\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:52 crc kubenswrapper[4919]: I0310 22:15:52.526467 4919 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62814b8d-8679-4350-be7d-5f729f901846-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 22:15:52 crc kubenswrapper[4919]: I0310 22:15:52.563078 4919 scope.go:117] "RemoveContainer" containerID="9a7f54f0ad1bc99653d56471ca107558d95729ca0a75e6040163ca4e8d3452b4" Mar 10 22:15:52 crc kubenswrapper[4919]: E0310 22:15:52.563626 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9a7f54f0ad1bc99653d56471ca107558d95729ca0a75e6040163ca4e8d3452b4\": container with ID starting with 9a7f54f0ad1bc99653d56471ca107558d95729ca0a75e6040163ca4e8d3452b4 not found: ID does not exist" containerID="9a7f54f0ad1bc99653d56471ca107558d95729ca0a75e6040163ca4e8d3452b4" Mar 10 22:15:52 crc kubenswrapper[4919]: I0310 22:15:52.563764 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a7f54f0ad1bc99653d56471ca107558d95729ca0a75e6040163ca4e8d3452b4"} err="failed to get container status \"9a7f54f0ad1bc99653d56471ca107558d95729ca0a75e6040163ca4e8d3452b4\": rpc error: code = NotFound desc = could not find container \"9a7f54f0ad1bc99653d56471ca107558d95729ca0a75e6040163ca4e8d3452b4\": container with ID 
starting with 9a7f54f0ad1bc99653d56471ca107558d95729ca0a75e6040163ca4e8d3452b4 not found: ID does not exist" Mar 10 22:15:52 crc kubenswrapper[4919]: I0310 22:15:52.563892 4919 scope.go:117] "RemoveContainer" containerID="ba1ede56006ea1128e8e67460a4bb03bb7a7ac205f92a9ada4f61f419402b0a6" Mar 10 22:15:52 crc kubenswrapper[4919]: I0310 22:15:52.581145 4919 scope.go:117] "RemoveContainer" containerID="f38ac54b5abf8ebe29460d44b16de61bf12705b9f6a4ac5d48ab1694b6482b7e" Mar 10 22:15:52 crc kubenswrapper[4919]: I0310 22:15:52.669254 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 10 22:15:52 crc kubenswrapper[4919]: I0310 22:15:52.676600 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 10 22:15:53 crc kubenswrapper[4919]: I0310 22:15:53.411474 4919 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="4865c8ed-670d-41a0-b9fc-ba7697085e6b" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.0.173:8776/healthcheck\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 10 22:15:53 crc kubenswrapper[4919]: I0310 22:15:53.493914 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a7ad3ed-9144-4a21-808c-23d613354a2f" path="/var/lib/kubelet/pods/4a7ad3ed-9144-4a21-808c-23d613354a2f/volumes" Mar 10 22:15:53 crc kubenswrapper[4919]: I0310 22:15:53.494579 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="62814b8d-8679-4350-be7d-5f729f901846" path="/var/lib/kubelet/pods/62814b8d-8679-4350-be7d-5f729f901846/volumes" Mar 10 22:15:53 crc kubenswrapper[4919]: I0310 22:15:53.495049 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="783e3f3a-7a6f-4b95-a7d2-6988c8a6149b" path="/var/lib/kubelet/pods/783e3f3a-7a6f-4b95-a7d2-6988c8a6149b/volumes" Mar 10 22:15:53 crc kubenswrapper[4919]: I0310 
22:15:53.496000 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9700fb27-6a74-428d-a2e6-71c237b3e054" path="/var/lib/kubelet/pods/9700fb27-6a74-428d-a2e6-71c237b3e054/volumes" Mar 10 22:15:53 crc kubenswrapper[4919]: I0310 22:15:53.496666 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b1f5a3b8-c9ca-403a-aecf-f6fbf286b145" path="/var/lib/kubelet/pods/b1f5a3b8-c9ca-403a-aecf-f6fbf286b145/volumes" Mar 10 22:15:53 crc kubenswrapper[4919]: I0310 22:15:53.827216 4919 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/memcached-0" podUID="a4a88061-cba8-4535-bf01-5285d8cbb79f" containerName="memcached" probeResult="failure" output="dial tcp 10.217.0.108:11211: i/o timeout" Mar 10 22:15:55 crc kubenswrapper[4919]: E0310 22:15:55.358776 4919 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 85e85aa8a7e78a2f2c6fc7044ebf9c1dcb554abd0c952c366102b9a1f2fa0880 is running failed: container process not found" containerID="85e85aa8a7e78a2f2c6fc7044ebf9c1dcb554abd0c952c366102b9a1f2fa0880" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 10 22:15:55 crc kubenswrapper[4919]: E0310 22:15:55.359259 4919 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 85e85aa8a7e78a2f2c6fc7044ebf9c1dcb554abd0c952c366102b9a1f2fa0880 is running failed: container process not found" containerID="85e85aa8a7e78a2f2c6fc7044ebf9c1dcb554abd0c952c366102b9a1f2fa0880" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 10 22:15:55 crc kubenswrapper[4919]: E0310 22:15:55.359574 4919 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 85e85aa8a7e78a2f2c6fc7044ebf9c1dcb554abd0c952c366102b9a1f2fa0880 is running failed: 
container process not found" containerID="85e85aa8a7e78a2f2c6fc7044ebf9c1dcb554abd0c952c366102b9a1f2fa0880" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 10 22:15:55 crc kubenswrapper[4919]: E0310 22:15:55.359605 4919 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 85e85aa8a7e78a2f2c6fc7044ebf9c1dcb554abd0c952c366102b9a1f2fa0880 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-5wz82" podUID="a525725f-407a-4e99-96a1-a0eaba714487" containerName="ovsdb-server" Mar 10 22:15:55 crc kubenswrapper[4919]: E0310 22:15:55.360027 4919 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e7e7fcd09cc0c969ac9f3c21aebb85f2b23a2c01eb8ef776f788577ffa3c96d5" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 10 22:15:55 crc kubenswrapper[4919]: E0310 22:15:55.361174 4919 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e7e7fcd09cc0c969ac9f3c21aebb85f2b23a2c01eb8ef776f788577ffa3c96d5" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 10 22:15:55 crc kubenswrapper[4919]: E0310 22:15:55.362174 4919 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e7e7fcd09cc0c969ac9f3c21aebb85f2b23a2c01eb8ef776f788577ffa3c96d5" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 10 22:15:55 crc kubenswrapper[4919]: E0310 22:15:55.362240 4919 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an 
exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-5wz82" podUID="a525725f-407a-4e99-96a1-a0eaba714487" containerName="ovs-vswitchd" Mar 10 22:15:59 crc kubenswrapper[4919]: I0310 22:15:59.176159 4919 patch_prober.go:28] interesting pod/machine-config-daemon-z7v4t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 22:15:59 crc kubenswrapper[4919]: I0310 22:15:59.176931 4919 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 22:15:59 crc kubenswrapper[4919]: I0310 22:15:59.177487 4919 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" Mar 10 22:15:59 crc kubenswrapper[4919]: I0310 22:15:59.178709 4919 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"fce2ab31f6ae341422fcdee59d32194b41ef9122dd92f9a1264d329e9e490637"} pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 10 22:15:59 crc kubenswrapper[4919]: I0310 22:15:59.178822 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" containerName="machine-config-daemon" containerID="cri-o://fce2ab31f6ae341422fcdee59d32194b41ef9122dd92f9a1264d329e9e490637" gracePeriod=600 Mar 10 
22:15:59 crc kubenswrapper[4919]: E0310 22:15:59.304676 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" Mar 10 22:15:59 crc kubenswrapper[4919]: I0310 22:15:59.414762 4919 generic.go:334] "Generic (PLEG): container finished" podID="566678d1-f416-4116-ab20-b30dceb86cdc" containerID="fce2ab31f6ae341422fcdee59d32194b41ef9122dd92f9a1264d329e9e490637" exitCode=0 Mar 10 22:15:59 crc kubenswrapper[4919]: I0310 22:15:59.414812 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" event={"ID":"566678d1-f416-4116-ab20-b30dceb86cdc","Type":"ContainerDied","Data":"fce2ab31f6ae341422fcdee59d32194b41ef9122dd92f9a1264d329e9e490637"} Mar 10 22:15:59 crc kubenswrapper[4919]: I0310 22:15:59.414877 4919 scope.go:117] "RemoveContainer" containerID="1dccae4c12e9eba18bc8d7756e50538a70d75c0bc02ce7c79c284d496783301e" Mar 10 22:15:59 crc kubenswrapper[4919]: I0310 22:15:59.415765 4919 scope.go:117] "RemoveContainer" containerID="fce2ab31f6ae341422fcdee59d32194b41ef9122dd92f9a1264d329e9e490637" Mar 10 22:15:59 crc kubenswrapper[4919]: E0310 22:15:59.416166 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" Mar 10 22:15:59 crc kubenswrapper[4919]: E0310 22:15:59.443137 4919 
secret.go:188] Couldn't get secret openstack/neutron-config: secret "neutron-config" not found Mar 10 22:15:59 crc kubenswrapper[4919]: E0310 22:15:59.443199 4919 secret.go:188] Couldn't get secret openstack/neutron-httpd-config: secret "neutron-httpd-config" not found Mar 10 22:15:59 crc kubenswrapper[4919]: E0310 22:15:59.443242 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0a44bcbb-6e2e-48bb-b7a7-16a4e916001d-config podName:0a44bcbb-6e2e-48bb-b7a7-16a4e916001d nodeName:}" failed. No retries permitted until 2026-03-10 22:16:15.44322072 +0000 UTC m=+1562.685101328 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/secret/0a44bcbb-6e2e-48bb-b7a7-16a4e916001d-config") pod "neutron-846dbc6cd5-kg4kx" (UID: "0a44bcbb-6e2e-48bb-b7a7-16a4e916001d") : secret "neutron-config" not found Mar 10 22:15:59 crc kubenswrapper[4919]: E0310 22:15:59.444239 4919 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0a44bcbb-6e2e-48bb-b7a7-16a4e916001d-httpd-config podName:0a44bcbb-6e2e-48bb-b7a7-16a4e916001d nodeName:}" failed. No retries permitted until 2026-03-10 22:16:15.444222008 +0000 UTC m=+1562.686102626 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "httpd-config" (UniqueName: "kubernetes.io/secret/0a44bcbb-6e2e-48bb-b7a7-16a4e916001d-httpd-config") pod "neutron-846dbc6cd5-kg4kx" (UID: "0a44bcbb-6e2e-48bb-b7a7-16a4e916001d") : secret "neutron-httpd-config" not found Mar 10 22:16:00 crc kubenswrapper[4919]: I0310 22:16:00.142155 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553016-d8blk"] Mar 10 22:16:00 crc kubenswrapper[4919]: E0310 22:16:00.142514 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81489e39-0246-4065-8835-31b1e5da8431" containerName="nova-metadata-log" Mar 10 22:16:00 crc kubenswrapper[4919]: I0310 22:16:00.142529 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="81489e39-0246-4065-8835-31b1e5da8431" containerName="nova-metadata-log" Mar 10 22:16:00 crc kubenswrapper[4919]: E0310 22:16:00.142546 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1f5a3b8-c9ca-403a-aecf-f6fbf286b145" containerName="probe" Mar 10 22:16:00 crc kubenswrapper[4919]: I0310 22:16:00.142554 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1f5a3b8-c9ca-403a-aecf-f6fbf286b145" containerName="probe" Mar 10 22:16:00 crc kubenswrapper[4919]: E0310 22:16:00.142565 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="981bb03c-23be-4bf8-a9f6-cb8a552f66a5" containerName="placement-log" Mar 10 22:16:00 crc kubenswrapper[4919]: I0310 22:16:00.142574 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="981bb03c-23be-4bf8-a9f6-cb8a552f66a5" containerName="placement-log" Mar 10 22:16:00 crc kubenswrapper[4919]: E0310 22:16:00.142589 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="515105ef-e538-4276-b682-7e05881dc7e8" containerName="nova-api-api" Mar 10 22:16:00 crc kubenswrapper[4919]: I0310 22:16:00.142596 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="515105ef-e538-4276-b682-7e05881dc7e8" containerName="nova-api-api" 
Mar 10 22:16:00 crc kubenswrapper[4919]: E0310 22:16:00.142613 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76a514a0-0d4c-4f6b-8ba7-cd5b4834d625" containerName="mariadb-account-create-update" Mar 10 22:16:00 crc kubenswrapper[4919]: I0310 22:16:00.142620 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="76a514a0-0d4c-4f6b-8ba7-cd5b4834d625" containerName="mariadb-account-create-update" Mar 10 22:16:00 crc kubenswrapper[4919]: E0310 22:16:00.142632 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4c91bdf-1945-4c8a-ab05-3b619e4bdb2b" containerName="ovn-northd" Mar 10 22:16:00 crc kubenswrapper[4919]: I0310 22:16:00.142639 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4c91bdf-1945-4c8a-ab05-3b619e4bdb2b" containerName="ovn-northd" Mar 10 22:16:00 crc kubenswrapper[4919]: E0310 22:16:00.142649 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7bfe342f-267a-4239-a9cc-8df0e3d14a92" containerName="kube-state-metrics" Mar 10 22:16:00 crc kubenswrapper[4919]: I0310 22:16:00.142656 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bfe342f-267a-4239-a9cc-8df0e3d14a92" containerName="kube-state-metrics" Mar 10 22:16:00 crc kubenswrapper[4919]: E0310 22:16:00.142671 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa3e6892-7a97-4563-b339-6c3acfd36dd3" containerName="setup-container" Mar 10 22:16:00 crc kubenswrapper[4919]: I0310 22:16:00.142678 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa3e6892-7a97-4563-b339-6c3acfd36dd3" containerName="setup-container" Mar 10 22:16:00 crc kubenswrapper[4919]: E0310 22:16:00.142691 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9372011b-416f-484d-a873-fdda67baf9fe" containerName="galera" Mar 10 22:16:00 crc kubenswrapper[4919]: I0310 22:16:00.142699 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="9372011b-416f-484d-a873-fdda67baf9fe" containerName="galera" Mar 10 
22:16:00 crc kubenswrapper[4919]: E0310 22:16:00.142708 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f324194-64d5-4755-847b-f554b94e652c" containerName="ceilometer-notification-agent" Mar 10 22:16:00 crc kubenswrapper[4919]: I0310 22:16:00.142716 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f324194-64d5-4755-847b-f554b94e652c" containerName="ceilometer-notification-agent" Mar 10 22:16:00 crc kubenswrapper[4919]: E0310 22:16:00.142732 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0f89c3b-5242-409b-a318-5b69410e9680" containerName="proxy-server" Mar 10 22:16:00 crc kubenswrapper[4919]: I0310 22:16:00.142739 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0f89c3b-5242-409b-a318-5b69410e9680" containerName="proxy-server" Mar 10 22:16:00 crc kubenswrapper[4919]: E0310 22:16:00.142752 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3fe05756-9202-4514-8eea-0c786a2b6d56" containerName="setup-container" Mar 10 22:16:00 crc kubenswrapper[4919]: I0310 22:16:00.142759 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="3fe05756-9202-4514-8eea-0c786a2b6d56" containerName="setup-container" Mar 10 22:16:00 crc kubenswrapper[4919]: E0310 22:16:00.142776 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0f89c3b-5242-409b-a318-5b69410e9680" containerName="proxy-httpd" Mar 10 22:16:00 crc kubenswrapper[4919]: I0310 22:16:00.142784 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0f89c3b-5242-409b-a318-5b69410e9680" containerName="proxy-httpd" Mar 10 22:16:00 crc kubenswrapper[4919]: E0310 22:16:00.142795 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76a514a0-0d4c-4f6b-8ba7-cd5b4834d625" containerName="mariadb-account-create-update" Mar 10 22:16:00 crc kubenswrapper[4919]: I0310 22:16:00.142803 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="76a514a0-0d4c-4f6b-8ba7-cd5b4834d625" 
containerName="mariadb-account-create-update" Mar 10 22:16:00 crc kubenswrapper[4919]: E0310 22:16:00.142818 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f324194-64d5-4755-847b-f554b94e652c" containerName="proxy-httpd" Mar 10 22:16:00 crc kubenswrapper[4919]: I0310 22:16:00.142826 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f324194-64d5-4755-847b-f554b94e652c" containerName="proxy-httpd" Mar 10 22:16:00 crc kubenswrapper[4919]: E0310 22:16:00.142836 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1f5a3b8-c9ca-403a-aecf-f6fbf286b145" containerName="cinder-scheduler" Mar 10 22:16:00 crc kubenswrapper[4919]: I0310 22:16:00.142843 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1f5a3b8-c9ca-403a-aecf-f6fbf286b145" containerName="cinder-scheduler" Mar 10 22:16:00 crc kubenswrapper[4919]: E0310 22:16:00.142855 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a7ad3ed-9144-4a21-808c-23d613354a2f" containerName="nova-cell0-conductor-conductor" Mar 10 22:16:00 crc kubenswrapper[4919]: I0310 22:16:00.142863 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a7ad3ed-9144-4a21-808c-23d613354a2f" containerName="nova-cell0-conductor-conductor" Mar 10 22:16:00 crc kubenswrapper[4919]: E0310 22:16:00.142871 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f324194-64d5-4755-847b-f554b94e652c" containerName="sg-core" Mar 10 22:16:00 crc kubenswrapper[4919]: I0310 22:16:00.142878 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f324194-64d5-4755-847b-f554b94e652c" containerName="sg-core" Mar 10 22:16:00 crc kubenswrapper[4919]: E0310 22:16:00.142889 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa3e6892-7a97-4563-b339-6c3acfd36dd3" containerName="rabbitmq" Mar 10 22:16:00 crc kubenswrapper[4919]: I0310 22:16:00.142898 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa3e6892-7a97-4563-b339-6c3acfd36dd3" 
containerName="rabbitmq" Mar 10 22:16:00 crc kubenswrapper[4919]: E0310 22:16:00.142910 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9700fb27-6a74-428d-a2e6-71c237b3e054" containerName="nova-scheduler-scheduler" Mar 10 22:16:00 crc kubenswrapper[4919]: I0310 22:16:00.142917 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="9700fb27-6a74-428d-a2e6-71c237b3e054" containerName="nova-scheduler-scheduler" Mar 10 22:16:00 crc kubenswrapper[4919]: E0310 22:16:00.142927 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91a933f1-aa44-4375-8f5c-e5f3567e6c8e" containerName="glance-log" Mar 10 22:16:00 crc kubenswrapper[4919]: I0310 22:16:00.142934 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="91a933f1-aa44-4375-8f5c-e5f3567e6c8e" containerName="glance-log" Mar 10 22:16:00 crc kubenswrapper[4919]: E0310 22:16:00.142948 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62814b8d-8679-4350-be7d-5f729f901846" containerName="nova-cell1-conductor-conductor" Mar 10 22:16:00 crc kubenswrapper[4919]: I0310 22:16:00.142956 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="62814b8d-8679-4350-be7d-5f729f901846" containerName="nova-cell1-conductor-conductor" Mar 10 22:16:00 crc kubenswrapper[4919]: E0310 22:16:00.142967 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f324194-64d5-4755-847b-f554b94e652c" containerName="ceilometer-central-agent" Mar 10 22:16:00 crc kubenswrapper[4919]: I0310 22:16:00.142975 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f324194-64d5-4755-847b-f554b94e652c" containerName="ceilometer-central-agent" Mar 10 22:16:00 crc kubenswrapper[4919]: E0310 22:16:00.142985 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28d81dfb-640f-4748-ab70-e0b393e1e595" containerName="barbican-api" Mar 10 22:16:00 crc kubenswrapper[4919]: I0310 22:16:00.142993 4919 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="28d81dfb-640f-4748-ab70-e0b393e1e595" containerName="barbican-api" Mar 10 22:16:00 crc kubenswrapper[4919]: E0310 22:16:00.143006 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31690f34-6b68-4470-a13e-e16121ec25d2" containerName="barbican-keystone-listener-log" Mar 10 22:16:00 crc kubenswrapper[4919]: I0310 22:16:00.143014 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="31690f34-6b68-4470-a13e-e16121ec25d2" containerName="barbican-keystone-listener-log" Mar 10 22:16:00 crc kubenswrapper[4919]: E0310 22:16:00.143027 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3fe05756-9202-4514-8eea-0c786a2b6d56" containerName="rabbitmq" Mar 10 22:16:00 crc kubenswrapper[4919]: I0310 22:16:00.143036 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="3fe05756-9202-4514-8eea-0c786a2b6d56" containerName="rabbitmq" Mar 10 22:16:00 crc kubenswrapper[4919]: E0310 22:16:00.143046 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="783e3f3a-7a6f-4b95-a7d2-6988c8a6149b" containerName="ovn-controller" Mar 10 22:16:00 crc kubenswrapper[4919]: I0310 22:16:00.143055 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="783e3f3a-7a6f-4b95-a7d2-6988c8a6149b" containerName="ovn-controller" Mar 10 22:16:00 crc kubenswrapper[4919]: E0310 22:16:00.143065 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91a933f1-aa44-4375-8f5c-e5f3567e6c8e" containerName="glance-httpd" Mar 10 22:16:00 crc kubenswrapper[4919]: I0310 22:16:00.143072 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="91a933f1-aa44-4375-8f5c-e5f3567e6c8e" containerName="glance-httpd" Mar 10 22:16:00 crc kubenswrapper[4919]: E0310 22:16:00.143083 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28d81dfb-640f-4748-ab70-e0b393e1e595" containerName="barbican-api-log" Mar 10 22:16:00 crc kubenswrapper[4919]: I0310 22:16:00.143090 4919 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="28d81dfb-640f-4748-ab70-e0b393e1e595" containerName="barbican-api-log" Mar 10 22:16:00 crc kubenswrapper[4919]: E0310 22:16:00.143103 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4865c8ed-670d-41a0-b9fc-ba7697085e6b" containerName="cinder-api" Mar 10 22:16:00 crc kubenswrapper[4919]: I0310 22:16:00.143110 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="4865c8ed-670d-41a0-b9fc-ba7697085e6b" containerName="cinder-api" Mar 10 22:16:00 crc kubenswrapper[4919]: E0310 22:16:00.143124 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9372011b-416f-484d-a873-fdda67baf9fe" containerName="mysql-bootstrap" Mar 10 22:16:00 crc kubenswrapper[4919]: I0310 22:16:00.143131 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="9372011b-416f-484d-a873-fdda67baf9fe" containerName="mysql-bootstrap" Mar 10 22:16:00 crc kubenswrapper[4919]: E0310 22:16:00.143139 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab479995-b87a-46b8-9a4e-d9e95d556775" containerName="glance-httpd" Mar 10 22:16:00 crc kubenswrapper[4919]: I0310 22:16:00.143146 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab479995-b87a-46b8-9a4e-d9e95d556775" containerName="glance-httpd" Mar 10 22:16:00 crc kubenswrapper[4919]: E0310 22:16:00.143158 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4c91bdf-1945-4c8a-ab05-3b619e4bdb2b" containerName="openstack-network-exporter" Mar 10 22:16:00 crc kubenswrapper[4919]: I0310 22:16:00.143165 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4c91bdf-1945-4c8a-ab05-3b619e4bdb2b" containerName="openstack-network-exporter" Mar 10 22:16:00 crc kubenswrapper[4919]: E0310 22:16:00.143174 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4865c8ed-670d-41a0-b9fc-ba7697085e6b" containerName="cinder-api-log" Mar 10 22:16:00 crc kubenswrapper[4919]: I0310 22:16:00.143181 4919 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="4865c8ed-670d-41a0-b9fc-ba7697085e6b" containerName="cinder-api-log" Mar 10 22:16:00 crc kubenswrapper[4919]: E0310 22:16:00.143192 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81489e39-0246-4065-8835-31b1e5da8431" containerName="nova-metadata-metadata" Mar 10 22:16:00 crc kubenswrapper[4919]: I0310 22:16:00.143199 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="81489e39-0246-4065-8835-31b1e5da8431" containerName="nova-metadata-metadata" Mar 10 22:16:00 crc kubenswrapper[4919]: E0310 22:16:00.143213 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4a88061-cba8-4535-bf01-5285d8cbb79f" containerName="memcached" Mar 10 22:16:00 crc kubenswrapper[4919]: I0310 22:16:00.143221 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4a88061-cba8-4535-bf01-5285d8cbb79f" containerName="memcached" Mar 10 22:16:00 crc kubenswrapper[4919]: E0310 22:16:00.143231 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="408722a8-2c8a-4bda-82d5-1d2f58bda7d7" containerName="keystone-api" Mar 10 22:16:00 crc kubenswrapper[4919]: I0310 22:16:00.143238 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="408722a8-2c8a-4bda-82d5-1d2f58bda7d7" containerName="keystone-api" Mar 10 22:16:00 crc kubenswrapper[4919]: E0310 22:16:00.143249 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="515105ef-e538-4276-b682-7e05881dc7e8" containerName="nova-api-log" Mar 10 22:16:00 crc kubenswrapper[4919]: I0310 22:16:00.143257 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="515105ef-e538-4276-b682-7e05881dc7e8" containerName="nova-api-log" Mar 10 22:16:00 crc kubenswrapper[4919]: E0310 22:16:00.143266 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="981bb03c-23be-4bf8-a9f6-cb8a552f66a5" containerName="placement-api" Mar 10 22:16:00 crc kubenswrapper[4919]: I0310 22:16:00.143274 4919 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="981bb03c-23be-4bf8-a9f6-cb8a552f66a5" containerName="placement-api" Mar 10 22:16:00 crc kubenswrapper[4919]: E0310 22:16:00.143288 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31690f34-6b68-4470-a13e-e16121ec25d2" containerName="barbican-keystone-listener" Mar 10 22:16:00 crc kubenswrapper[4919]: I0310 22:16:00.143296 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="31690f34-6b68-4470-a13e-e16121ec25d2" containerName="barbican-keystone-listener" Mar 10 22:16:00 crc kubenswrapper[4919]: E0310 22:16:00.143307 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab479995-b87a-46b8-9a4e-d9e95d556775" containerName="glance-log" Mar 10 22:16:00 crc kubenswrapper[4919]: I0310 22:16:00.143314 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab479995-b87a-46b8-9a4e-d9e95d556775" containerName="glance-log" Mar 10 22:16:00 crc kubenswrapper[4919]: I0310 22:16:00.143488 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a7ad3ed-9144-4a21-808c-23d613354a2f" containerName="nova-cell0-conductor-conductor" Mar 10 22:16:00 crc kubenswrapper[4919]: I0310 22:16:00.143499 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="28d81dfb-640f-4748-ab70-e0b393e1e595" containerName="barbican-api" Mar 10 22:16:00 crc kubenswrapper[4919]: I0310 22:16:00.143510 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0f89c3b-5242-409b-a318-5b69410e9680" containerName="proxy-httpd" Mar 10 22:16:00 crc kubenswrapper[4919]: I0310 22:16:00.143519 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="783e3f3a-7a6f-4b95-a7d2-6988c8a6149b" containerName="ovn-controller" Mar 10 22:16:00 crc kubenswrapper[4919]: I0310 22:16:00.143535 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f324194-64d5-4755-847b-f554b94e652c" containerName="ceilometer-central-agent" Mar 10 22:16:00 crc kubenswrapper[4919]: I0310 22:16:00.143550 4919 
memory_manager.go:354] "RemoveStaleState removing state" podUID="4865c8ed-670d-41a0-b9fc-ba7697085e6b" containerName="cinder-api" Mar 10 22:16:00 crc kubenswrapper[4919]: I0310 22:16:00.143564 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f324194-64d5-4755-847b-f554b94e652c" containerName="proxy-httpd" Mar 10 22:16:00 crc kubenswrapper[4919]: I0310 22:16:00.143573 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa3e6892-7a97-4563-b339-6c3acfd36dd3" containerName="rabbitmq" Mar 10 22:16:00 crc kubenswrapper[4919]: I0310 22:16:00.143583 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f324194-64d5-4755-847b-f554b94e652c" containerName="sg-core" Mar 10 22:16:00 crc kubenswrapper[4919]: I0310 22:16:00.143594 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4a88061-cba8-4535-bf01-5285d8cbb79f" containerName="memcached" Mar 10 22:16:00 crc kubenswrapper[4919]: I0310 22:16:00.143606 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="81489e39-0246-4065-8835-31b1e5da8431" containerName="nova-metadata-log" Mar 10 22:16:00 crc kubenswrapper[4919]: I0310 22:16:00.143614 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="408722a8-2c8a-4bda-82d5-1d2f58bda7d7" containerName="keystone-api" Mar 10 22:16:00 crc kubenswrapper[4919]: I0310 22:16:00.143625 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="76a514a0-0d4c-4f6b-8ba7-cd5b4834d625" containerName="mariadb-account-create-update" Mar 10 22:16:00 crc kubenswrapper[4919]: I0310 22:16:00.143636 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab479995-b87a-46b8-9a4e-d9e95d556775" containerName="glance-httpd" Mar 10 22:16:00 crc kubenswrapper[4919]: I0310 22:16:00.143646 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1f5a3b8-c9ca-403a-aecf-f6fbf286b145" containerName="cinder-scheduler" Mar 10 22:16:00 crc kubenswrapper[4919]: I0310 
22:16:00.143659 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="515105ef-e538-4276-b682-7e05881dc7e8" containerName="nova-api-api" Mar 10 22:16:00 crc kubenswrapper[4919]: I0310 22:16:00.143670 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="91a933f1-aa44-4375-8f5c-e5f3567e6c8e" containerName="glance-log" Mar 10 22:16:00 crc kubenswrapper[4919]: I0310 22:16:00.143680 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab479995-b87a-46b8-9a4e-d9e95d556775" containerName="glance-log" Mar 10 22:16:00 crc kubenswrapper[4919]: I0310 22:16:00.143688 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="4865c8ed-670d-41a0-b9fc-ba7697085e6b" containerName="cinder-api-log" Mar 10 22:16:00 crc kubenswrapper[4919]: I0310 22:16:00.143701 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4c91bdf-1945-4c8a-ab05-3b619e4bdb2b" containerName="openstack-network-exporter" Mar 10 22:16:00 crc kubenswrapper[4919]: I0310 22:16:00.143712 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="31690f34-6b68-4470-a13e-e16121ec25d2" containerName="barbican-keystone-listener-log" Mar 10 22:16:00 crc kubenswrapper[4919]: I0310 22:16:00.143721 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="91a933f1-aa44-4375-8f5c-e5f3567e6c8e" containerName="glance-httpd" Mar 10 22:16:00 crc kubenswrapper[4919]: I0310 22:16:00.143734 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="62814b8d-8679-4350-be7d-5f729f901846" containerName="nova-cell1-conductor-conductor" Mar 10 22:16:00 crc kubenswrapper[4919]: I0310 22:16:00.143746 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="7bfe342f-267a-4239-a9cc-8df0e3d14a92" containerName="kube-state-metrics" Mar 10 22:16:00 crc kubenswrapper[4919]: I0310 22:16:00.143757 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="81489e39-0246-4065-8835-31b1e5da8431" 
containerName="nova-metadata-metadata" Mar 10 22:16:00 crc kubenswrapper[4919]: I0310 22:16:00.143765 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="515105ef-e538-4276-b682-7e05881dc7e8" containerName="nova-api-log" Mar 10 22:16:00 crc kubenswrapper[4919]: I0310 22:16:00.143777 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0f89c3b-5242-409b-a318-5b69410e9680" containerName="proxy-server" Mar 10 22:16:00 crc kubenswrapper[4919]: I0310 22:16:00.143788 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="981bb03c-23be-4bf8-a9f6-cb8a552f66a5" containerName="placement-api" Mar 10 22:16:00 crc kubenswrapper[4919]: I0310 22:16:00.143798 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="9700fb27-6a74-428d-a2e6-71c237b3e054" containerName="nova-scheduler-scheduler" Mar 10 22:16:00 crc kubenswrapper[4919]: I0310 22:16:00.143806 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="3fe05756-9202-4514-8eea-0c786a2b6d56" containerName="rabbitmq" Mar 10 22:16:00 crc kubenswrapper[4919]: I0310 22:16:00.143816 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1f5a3b8-c9ca-403a-aecf-f6fbf286b145" containerName="probe" Mar 10 22:16:00 crc kubenswrapper[4919]: I0310 22:16:00.143825 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="31690f34-6b68-4470-a13e-e16121ec25d2" containerName="barbican-keystone-listener" Mar 10 22:16:00 crc kubenswrapper[4919]: I0310 22:16:00.143833 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4c91bdf-1945-4c8a-ab05-3b619e4bdb2b" containerName="ovn-northd" Mar 10 22:16:00 crc kubenswrapper[4919]: I0310 22:16:00.143844 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="9372011b-416f-484d-a873-fdda67baf9fe" containerName="galera" Mar 10 22:16:00 crc kubenswrapper[4919]: I0310 22:16:00.143855 4919 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="1f324194-64d5-4755-847b-f554b94e652c" containerName="ceilometer-notification-agent" Mar 10 22:16:00 crc kubenswrapper[4919]: I0310 22:16:00.143863 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="981bb03c-23be-4bf8-a9f6-cb8a552f66a5" containerName="placement-log" Mar 10 22:16:00 crc kubenswrapper[4919]: I0310 22:16:00.143874 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="28d81dfb-640f-4748-ab70-e0b393e1e595" containerName="barbican-api-log" Mar 10 22:16:00 crc kubenswrapper[4919]: I0310 22:16:00.144378 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553016-d8blk" Mar 10 22:16:00 crc kubenswrapper[4919]: I0310 22:16:00.148822 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 22:16:00 crc kubenswrapper[4919]: I0310 22:16:00.149774 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 22:16:00 crc kubenswrapper[4919]: I0310 22:16:00.149888 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wbvtv" Mar 10 22:16:00 crc kubenswrapper[4919]: I0310 22:16:00.166535 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553016-d8blk"] Mar 10 22:16:00 crc kubenswrapper[4919]: I0310 22:16:00.255203 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kq8tr\" (UniqueName: \"kubernetes.io/projected/5d8a7c94-9bde-4dfc-9172-9c116e26b70e-kube-api-access-kq8tr\") pod \"auto-csr-approver-29553016-d8blk\" (UID: \"5d8a7c94-9bde-4dfc-9172-9c116e26b70e\") " pod="openshift-infra/auto-csr-approver-29553016-d8blk" Mar 10 22:16:00 crc kubenswrapper[4919]: I0310 22:16:00.358788 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-kq8tr\" (UniqueName: \"kubernetes.io/projected/5d8a7c94-9bde-4dfc-9172-9c116e26b70e-kube-api-access-kq8tr\") pod \"auto-csr-approver-29553016-d8blk\" (UID: \"5d8a7c94-9bde-4dfc-9172-9c116e26b70e\") " pod="openshift-infra/auto-csr-approver-29553016-d8blk" Mar 10 22:16:00 crc kubenswrapper[4919]: E0310 22:16:00.360373 4919 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 85e85aa8a7e78a2f2c6fc7044ebf9c1dcb554abd0c952c366102b9a1f2fa0880 is running failed: container process not found" containerID="85e85aa8a7e78a2f2c6fc7044ebf9c1dcb554abd0c952c366102b9a1f2fa0880" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 10 22:16:00 crc kubenswrapper[4919]: E0310 22:16:00.361385 4919 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e7e7fcd09cc0c969ac9f3c21aebb85f2b23a2c01eb8ef776f788577ffa3c96d5" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 10 22:16:00 crc kubenswrapper[4919]: E0310 22:16:00.361920 4919 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 85e85aa8a7e78a2f2c6fc7044ebf9c1dcb554abd0c952c366102b9a1f2fa0880 is running failed: container process not found" containerID="85e85aa8a7e78a2f2c6fc7044ebf9c1dcb554abd0c952c366102b9a1f2fa0880" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 10 22:16:00 crc kubenswrapper[4919]: E0310 22:16:00.362928 4919 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e7e7fcd09cc0c969ac9f3c21aebb85f2b23a2c01eb8ef776f788577ffa3c96d5" 
cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"]
Mar 10 22:16:00 crc kubenswrapper[4919]: E0310 22:16:00.365772 4919 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 85e85aa8a7e78a2f2c6fc7044ebf9c1dcb554abd0c952c366102b9a1f2fa0880 is running failed: container process not found" containerID="85e85aa8a7e78a2f2c6fc7044ebf9c1dcb554abd0c952c366102b9a1f2fa0880" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"]
Mar 10 22:16:00 crc kubenswrapper[4919]: E0310 22:16:00.366324 4919 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 85e85aa8a7e78a2f2c6fc7044ebf9c1dcb554abd0c952c366102b9a1f2fa0880 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-5wz82" podUID="a525725f-407a-4e99-96a1-a0eaba714487" containerName="ovsdb-server"
Mar 10 22:16:00 crc kubenswrapper[4919]: E0310 22:16:00.385160 4919 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e7e7fcd09cc0c969ac9f3c21aebb85f2b23a2c01eb8ef776f788577ffa3c96d5" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"]
Mar 10 22:16:00 crc kubenswrapper[4919]: E0310 22:16:00.385560 4919 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-5wz82" podUID="a525725f-407a-4e99-96a1-a0eaba714487" containerName="ovs-vswitchd"
Mar 10 22:16:00 crc kubenswrapper[4919]: I0310 22:16:00.391124 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kq8tr\" (UniqueName: \"kubernetes.io/projected/5d8a7c94-9bde-4dfc-9172-9c116e26b70e-kube-api-access-kq8tr\") pod \"auto-csr-approver-29553016-d8blk\" (UID: \"5d8a7c94-9bde-4dfc-9172-9c116e26b70e\") " pod="openshift-infra/auto-csr-approver-29553016-d8blk"
Mar 10 22:16:00 crc kubenswrapper[4919]: I0310 22:16:00.475762 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553016-d8blk"
Mar 10 22:16:00 crc kubenswrapper[4919]: I0310 22:16:00.917293 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553016-d8blk"]
Mar 10 22:16:00 crc kubenswrapper[4919]: I0310 22:16:00.927352 4919 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 10 22:16:01 crc kubenswrapper[4919]: I0310 22:16:01.437329 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553016-d8blk" event={"ID":"5d8a7c94-9bde-4dfc-9172-9c116e26b70e","Type":"ContainerStarted","Data":"d41148a684b150fb82b94d5fb2ec3c7d8a3ece3d9a7d673eff03f6388a1688f9"}
Mar 10 22:16:02 crc kubenswrapper[4919]: I0310 22:16:02.463245 4919 generic.go:334] "Generic (PLEG): container finished" podID="5d8a7c94-9bde-4dfc-9172-9c116e26b70e" containerID="09ab8817bad628ef5a55dd8b4b7607fccf7bbf117c5ce90c546f8cebf64c396d" exitCode=0
Mar 10 22:16:02 crc kubenswrapper[4919]: I0310 22:16:02.463294 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553016-d8blk" event={"ID":"5d8a7c94-9bde-4dfc-9172-9c116e26b70e","Type":"ContainerDied","Data":"09ab8817bad628ef5a55dd8b4b7607fccf7bbf117c5ce90c546f8cebf64c396d"}
Mar 10 22:16:03 crc kubenswrapper[4919]: I0310 22:16:03.814808 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553016-d8blk"
Mar 10 22:16:03 crc kubenswrapper[4919]: I0310 22:16:03.937216 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kq8tr\" (UniqueName: \"kubernetes.io/projected/5d8a7c94-9bde-4dfc-9172-9c116e26b70e-kube-api-access-kq8tr\") pod \"5d8a7c94-9bde-4dfc-9172-9c116e26b70e\" (UID: \"5d8a7c94-9bde-4dfc-9172-9c116e26b70e\") "
Mar 10 22:16:03 crc kubenswrapper[4919]: I0310 22:16:03.942789 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d8a7c94-9bde-4dfc-9172-9c116e26b70e-kube-api-access-kq8tr" (OuterVolumeSpecName: "kube-api-access-kq8tr") pod "5d8a7c94-9bde-4dfc-9172-9c116e26b70e" (UID: "5d8a7c94-9bde-4dfc-9172-9c116e26b70e"). InnerVolumeSpecName "kube-api-access-kq8tr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 22:16:04 crc kubenswrapper[4919]: I0310 22:16:04.039134 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kq8tr\" (UniqueName: \"kubernetes.io/projected/5d8a7c94-9bde-4dfc-9172-9c116e26b70e-kube-api-access-kq8tr\") on node \"crc\" DevicePath \"\""
Mar 10 22:16:04 crc kubenswrapper[4919]: I0310 22:16:04.490931 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553016-d8blk" event={"ID":"5d8a7c94-9bde-4dfc-9172-9c116e26b70e","Type":"ContainerDied","Data":"d41148a684b150fb82b94d5fb2ec3c7d8a3ece3d9a7d673eff03f6388a1688f9"}
Mar 10 22:16:04 crc kubenswrapper[4919]: I0310 22:16:04.491804 4919 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d41148a684b150fb82b94d5fb2ec3c7d8a3ece3d9a7d673eff03f6388a1688f9"
Mar 10 22:16:04 crc kubenswrapper[4919]: I0310 22:16:04.491039 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553016-d8blk"
Mar 10 22:16:04 crc kubenswrapper[4919]: I0310 22:16:04.883836 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553010-qj5k8"]
Mar 10 22:16:04 crc kubenswrapper[4919]: I0310 22:16:04.891609 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553010-qj5k8"]
Mar 10 22:16:05 crc kubenswrapper[4919]: E0310 22:16:05.359724 4919 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 85e85aa8a7e78a2f2c6fc7044ebf9c1dcb554abd0c952c366102b9a1f2fa0880 is running failed: container process not found" containerID="85e85aa8a7e78a2f2c6fc7044ebf9c1dcb554abd0c952c366102b9a1f2fa0880" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"]
Mar 10 22:16:05 crc kubenswrapper[4919]: E0310 22:16:05.360085 4919 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 85e85aa8a7e78a2f2c6fc7044ebf9c1dcb554abd0c952c366102b9a1f2fa0880 is running failed: container process not found" containerID="85e85aa8a7e78a2f2c6fc7044ebf9c1dcb554abd0c952c366102b9a1f2fa0880" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"]
Mar 10 22:16:05 crc kubenswrapper[4919]: E0310 22:16:05.360511 4919 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e7e7fcd09cc0c969ac9f3c21aebb85f2b23a2c01eb8ef776f788577ffa3c96d5" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"]
Mar 10 22:16:05 crc kubenswrapper[4919]: E0310 22:16:05.360527 4919 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 85e85aa8a7e78a2f2c6fc7044ebf9c1dcb554abd0c952c366102b9a1f2fa0880 is running failed: container process not found" containerID="85e85aa8a7e78a2f2c6fc7044ebf9c1dcb554abd0c952c366102b9a1f2fa0880" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"]
Mar 10 22:16:05 crc kubenswrapper[4919]: E0310 22:16:05.360552 4919 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 85e85aa8a7e78a2f2c6fc7044ebf9c1dcb554abd0c952c366102b9a1f2fa0880 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-5wz82" podUID="a525725f-407a-4e99-96a1-a0eaba714487" containerName="ovsdb-server"
Mar 10 22:16:05 crc kubenswrapper[4919]: E0310 22:16:05.361487 4919 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e7e7fcd09cc0c969ac9f3c21aebb85f2b23a2c01eb8ef776f788577ffa3c96d5" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"]
Mar 10 22:16:05 crc kubenswrapper[4919]: E0310 22:16:05.362854 4919 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e7e7fcd09cc0c969ac9f3c21aebb85f2b23a2c01eb8ef776f788577ffa3c96d5" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"]
Mar 10 22:16:05 crc kubenswrapper[4919]: E0310 22:16:05.362891 4919 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-5wz82" podUID="a525725f-407a-4e99-96a1-a0eaba714487" containerName="ovs-vswitchd"
Mar 10 22:16:05 crc kubenswrapper[4919]: I0310 22:16:05.493982 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a8340b4f-e490-40e1-bb64-103f5fe20225" path="/var/lib/kubelet/pods/a8340b4f-e490-40e1-bb64-103f5fe20225/volumes"
Mar 10 22:16:05 crc kubenswrapper[4919]: I0310 22:16:05.505823 4919 generic.go:334] "Generic (PLEG): container finished" podID="0a44bcbb-6e2e-48bb-b7a7-16a4e916001d" containerID="18dd8dafec6aecc8efed736cca0a71f9d1628505bf0335dc2c01e5a11aba23af" exitCode=0
Mar 10 22:16:05 crc kubenswrapper[4919]: I0310 22:16:05.506082 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-846dbc6cd5-kg4kx" event={"ID":"0a44bcbb-6e2e-48bb-b7a7-16a4e916001d","Type":"ContainerDied","Data":"18dd8dafec6aecc8efed736cca0a71f9d1628505bf0335dc2c01e5a11aba23af"}
Mar 10 22:16:05 crc kubenswrapper[4919]: I0310 22:16:05.653452 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-846dbc6cd5-kg4kx"
Mar 10 22:16:05 crc kubenswrapper[4919]: I0310 22:16:05.765783 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0a44bcbb-6e2e-48bb-b7a7-16a4e916001d-public-tls-certs\") pod \"0a44bcbb-6e2e-48bb-b7a7-16a4e916001d\" (UID: \"0a44bcbb-6e2e-48bb-b7a7-16a4e916001d\") "
Mar 10 22:16:05 crc kubenswrapper[4919]: I0310 22:16:05.765851 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wl4jk\" (UniqueName: \"kubernetes.io/projected/0a44bcbb-6e2e-48bb-b7a7-16a4e916001d-kube-api-access-wl4jk\") pod \"0a44bcbb-6e2e-48bb-b7a7-16a4e916001d\" (UID: \"0a44bcbb-6e2e-48bb-b7a7-16a4e916001d\") "
Mar 10 22:16:05 crc kubenswrapper[4919]: I0310 22:16:05.765916 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0a44bcbb-6e2e-48bb-b7a7-16a4e916001d-config\") pod \"0a44bcbb-6e2e-48bb-b7a7-16a4e916001d\" (UID: \"0a44bcbb-6e2e-48bb-b7a7-16a4e916001d\") "
Mar 10 22:16:05 crc kubenswrapper[4919]: I0310 22:16:05.765967 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0a44bcbb-6e2e-48bb-b7a7-16a4e916001d-httpd-config\") pod \"0a44bcbb-6e2e-48bb-b7a7-16a4e916001d\" (UID: \"0a44bcbb-6e2e-48bb-b7a7-16a4e916001d\") "
Mar 10 22:16:05 crc kubenswrapper[4919]: I0310 22:16:05.765995 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a44bcbb-6e2e-48bb-b7a7-16a4e916001d-combined-ca-bundle\") pod \"0a44bcbb-6e2e-48bb-b7a7-16a4e916001d\" (UID: \"0a44bcbb-6e2e-48bb-b7a7-16a4e916001d\") "
Mar 10 22:16:05 crc kubenswrapper[4919]: I0310 22:16:05.766026 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0a44bcbb-6e2e-48bb-b7a7-16a4e916001d-internal-tls-certs\") pod \"0a44bcbb-6e2e-48bb-b7a7-16a4e916001d\" (UID: \"0a44bcbb-6e2e-48bb-b7a7-16a4e916001d\") "
Mar 10 22:16:05 crc kubenswrapper[4919]: I0310 22:16:05.766047 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0a44bcbb-6e2e-48bb-b7a7-16a4e916001d-ovndb-tls-certs\") pod \"0a44bcbb-6e2e-48bb-b7a7-16a4e916001d\" (UID: \"0a44bcbb-6e2e-48bb-b7a7-16a4e916001d\") "
Mar 10 22:16:05 crc kubenswrapper[4919]: I0310 22:16:05.770733 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a44bcbb-6e2e-48bb-b7a7-16a4e916001d-kube-api-access-wl4jk" (OuterVolumeSpecName: "kube-api-access-wl4jk") pod "0a44bcbb-6e2e-48bb-b7a7-16a4e916001d" (UID: "0a44bcbb-6e2e-48bb-b7a7-16a4e916001d"). InnerVolumeSpecName "kube-api-access-wl4jk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 22:16:05 crc kubenswrapper[4919]: I0310 22:16:05.770965 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a44bcbb-6e2e-48bb-b7a7-16a4e916001d-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "0a44bcbb-6e2e-48bb-b7a7-16a4e916001d" (UID: "0a44bcbb-6e2e-48bb-b7a7-16a4e916001d"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 22:16:05 crc kubenswrapper[4919]: I0310 22:16:05.802324 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a44bcbb-6e2e-48bb-b7a7-16a4e916001d-config" (OuterVolumeSpecName: "config") pod "0a44bcbb-6e2e-48bb-b7a7-16a4e916001d" (UID: "0a44bcbb-6e2e-48bb-b7a7-16a4e916001d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 22:16:05 crc kubenswrapper[4919]: I0310 22:16:05.808734 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a44bcbb-6e2e-48bb-b7a7-16a4e916001d-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "0a44bcbb-6e2e-48bb-b7a7-16a4e916001d" (UID: "0a44bcbb-6e2e-48bb-b7a7-16a4e916001d"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 22:16:05 crc kubenswrapper[4919]: I0310 22:16:05.809682 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a44bcbb-6e2e-48bb-b7a7-16a4e916001d-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "0a44bcbb-6e2e-48bb-b7a7-16a4e916001d" (UID: "0a44bcbb-6e2e-48bb-b7a7-16a4e916001d"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 22:16:05 crc kubenswrapper[4919]: I0310 22:16:05.809977 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a44bcbb-6e2e-48bb-b7a7-16a4e916001d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0a44bcbb-6e2e-48bb-b7a7-16a4e916001d" (UID: "0a44bcbb-6e2e-48bb-b7a7-16a4e916001d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 22:16:05 crc kubenswrapper[4919]: I0310 22:16:05.835335 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a44bcbb-6e2e-48bb-b7a7-16a4e916001d-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "0a44bcbb-6e2e-48bb-b7a7-16a4e916001d" (UID: "0a44bcbb-6e2e-48bb-b7a7-16a4e916001d"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 22:16:05 crc kubenswrapper[4919]: I0310 22:16:05.867156 4919 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0a44bcbb-6e2e-48bb-b7a7-16a4e916001d-public-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 10 22:16:05 crc kubenswrapper[4919]: I0310 22:16:05.867194 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wl4jk\" (UniqueName: \"kubernetes.io/projected/0a44bcbb-6e2e-48bb-b7a7-16a4e916001d-kube-api-access-wl4jk\") on node \"crc\" DevicePath \"\""
Mar 10 22:16:05 crc kubenswrapper[4919]: I0310 22:16:05.867205 4919 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/0a44bcbb-6e2e-48bb-b7a7-16a4e916001d-config\") on node \"crc\" DevicePath \"\""
Mar 10 22:16:05 crc kubenswrapper[4919]: I0310 22:16:05.867216 4919 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0a44bcbb-6e2e-48bb-b7a7-16a4e916001d-httpd-config\") on node \"crc\" DevicePath \"\""
Mar 10 22:16:05 crc kubenswrapper[4919]: I0310 22:16:05.867224 4919 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a44bcbb-6e2e-48bb-b7a7-16a4e916001d-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 10 22:16:05 crc kubenswrapper[4919]: I0310 22:16:05.867232 4919 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0a44bcbb-6e2e-48bb-b7a7-16a4e916001d-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 10 22:16:05 crc kubenswrapper[4919]: I0310 22:16:05.867240 4919 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0a44bcbb-6e2e-48bb-b7a7-16a4e916001d-ovndb-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 10 22:16:06 crc kubenswrapper[4919]: I0310 22:16:06.517230 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-846dbc6cd5-kg4kx" event={"ID":"0a44bcbb-6e2e-48bb-b7a7-16a4e916001d","Type":"ContainerDied","Data":"badff415a7c7b021a38389a544f86ea1d866544cb7edbb98599811085fafe6ad"}
Mar 10 22:16:06 crc kubenswrapper[4919]: I0310 22:16:06.517306 4919 scope.go:117] "RemoveContainer" containerID="9cdb7599c01cdc95ab93aaa9cd850cf9d1c5bc23e81939b0310b3ed9a9214bc6"
Mar 10 22:16:06 crc kubenswrapper[4919]: I0310 22:16:06.518220 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-846dbc6cd5-kg4kx"
Mar 10 22:16:06 crc kubenswrapper[4919]: I0310 22:16:06.541225 4919 scope.go:117] "RemoveContainer" containerID="18dd8dafec6aecc8efed736cca0a71f9d1628505bf0335dc2c01e5a11aba23af"
Mar 10 22:16:06 crc kubenswrapper[4919]: I0310 22:16:06.569227 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-846dbc6cd5-kg4kx"]
Mar 10 22:16:06 crc kubenswrapper[4919]: I0310 22:16:06.576056 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-846dbc6cd5-kg4kx"]
Mar 10 22:16:07 crc kubenswrapper[4919]: I0310 22:16:07.496304 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a44bcbb-6e2e-48bb-b7a7-16a4e916001d" path="/var/lib/kubelet/pods/0a44bcbb-6e2e-48bb-b7a7-16a4e916001d/volumes"
Mar 10 22:16:10 crc kubenswrapper[4919]: E0310 22:16:10.358675 4919 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 85e85aa8a7e78a2f2c6fc7044ebf9c1dcb554abd0c952c366102b9a1f2fa0880 is running failed: container process not found" containerID="85e85aa8a7e78a2f2c6fc7044ebf9c1dcb554abd0c952c366102b9a1f2fa0880" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"]
Mar 10 22:16:10 crc kubenswrapper[4919]: E0310 22:16:10.359192 4919 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e7e7fcd09cc0c969ac9f3c21aebb85f2b23a2c01eb8ef776f788577ffa3c96d5" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"]
Mar 10 22:16:10 crc kubenswrapper[4919]: E0310 22:16:10.360529 4919 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 85e85aa8a7e78a2f2c6fc7044ebf9c1dcb554abd0c952c366102b9a1f2fa0880 is running failed: container process not found" containerID="85e85aa8a7e78a2f2c6fc7044ebf9c1dcb554abd0c952c366102b9a1f2fa0880" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"]
Mar 10 22:16:10 crc kubenswrapper[4919]: E0310 22:16:10.361113 4919 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e7e7fcd09cc0c969ac9f3c21aebb85f2b23a2c01eb8ef776f788577ffa3c96d5" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"]
Mar 10 22:16:10 crc kubenswrapper[4919]: E0310 22:16:10.361287 4919 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 85e85aa8a7e78a2f2c6fc7044ebf9c1dcb554abd0c952c366102b9a1f2fa0880 is running failed: container process not found" containerID="85e85aa8a7e78a2f2c6fc7044ebf9c1dcb554abd0c952c366102b9a1f2fa0880" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"]
Mar 10 22:16:10 crc kubenswrapper[4919]: E0310 22:16:10.361384 4919 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 85e85aa8a7e78a2f2c6fc7044ebf9c1dcb554abd0c952c366102b9a1f2fa0880 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-5wz82" podUID="a525725f-407a-4e99-96a1-a0eaba714487" containerName="ovsdb-server"
Mar 10 22:16:10 crc kubenswrapper[4919]: E0310 22:16:10.364961 4919 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e7e7fcd09cc0c969ac9f3c21aebb85f2b23a2c01eb8ef776f788577ffa3c96d5" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"]
Mar 10 22:16:10 crc kubenswrapper[4919]: E0310 22:16:10.365190 4919 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-5wz82" podUID="a525725f-407a-4e99-96a1-a0eaba714487" containerName="ovs-vswitchd"
Mar 10 22:16:12 crc kubenswrapper[4919]: I0310 22:16:12.480459 4919 scope.go:117] "RemoveContainer" containerID="fce2ab31f6ae341422fcdee59d32194b41ef9122dd92f9a1264d329e9e490637"
Mar 10 22:16:12 crc kubenswrapper[4919]: E0310 22:16:12.480867 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc"
Mar 10 22:16:13 crc kubenswrapper[4919]: I0310 22:16:13.604126 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-5wz82_a525725f-407a-4e99-96a1-a0eaba714487/ovs-vswitchd/0.log"
Mar 10 22:16:13 crc kubenswrapper[4919]: I0310 22:16:13.606193 4919 generic.go:334] "Generic (PLEG): container finished" podID="a525725f-407a-4e99-96a1-a0eaba714487" containerID="e7e7fcd09cc0c969ac9f3c21aebb85f2b23a2c01eb8ef776f788577ffa3c96d5" exitCode=137
Mar 10 22:16:13 crc kubenswrapper[4919]: I0310 22:16:13.606276 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-5wz82" event={"ID":"a525725f-407a-4e99-96a1-a0eaba714487","Type":"ContainerDied","Data":"e7e7fcd09cc0c969ac9f3c21aebb85f2b23a2c01eb8ef776f788577ffa3c96d5"}
Mar 10 22:16:13 crc kubenswrapper[4919]: I0310 22:16:13.614487 4919 generic.go:334] "Generic (PLEG): container finished" podID="91c8bbf6-8824-4e21-a491-86f2f657549a" containerID="7d6f077cbbd4ed720f528f14962aa759a22f7956036b9e97a87b6414a73da0ba" exitCode=137
Mar 10 22:16:13 crc kubenswrapper[4919]: I0310 22:16:13.614547 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"91c8bbf6-8824-4e21-a491-86f2f657549a","Type":"ContainerDied","Data":"7d6f077cbbd4ed720f528f14962aa759a22f7956036b9e97a87b6414a73da0ba"}
Mar 10 22:16:13 crc kubenswrapper[4919]: I0310 22:16:13.868834 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-5wz82_a525725f-407a-4e99-96a1-a0eaba714487/ovs-vswitchd/0.log"
Mar 10 22:16:13 crc kubenswrapper[4919]: I0310 22:16:13.870350 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-5wz82"
Mar 10 22:16:13 crc kubenswrapper[4919]: I0310 22:16:13.881018 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0"
Mar 10 22:16:13 crc kubenswrapper[4919]: I0310 22:16:13.977374 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91c8bbf6-8824-4e21-a491-86f2f657549a-combined-ca-bundle\") pod \"91c8bbf6-8824-4e21-a491-86f2f657549a\" (UID: \"91c8bbf6-8824-4e21-a491-86f2f657549a\") "
Mar 10 22:16:13 crc kubenswrapper[4919]: I0310 22:16:13.977457 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4lj4w\" (UniqueName: \"kubernetes.io/projected/91c8bbf6-8824-4e21-a491-86f2f657549a-kube-api-access-4lj4w\") pod \"91c8bbf6-8824-4e21-a491-86f2f657549a\" (UID: \"91c8bbf6-8824-4e21-a491-86f2f657549a\") "
Mar 10 22:16:13 crc kubenswrapper[4919]: I0310 22:16:13.977485 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/a525725f-407a-4e99-96a1-a0eaba714487-var-lib\") pod \"a525725f-407a-4e99-96a1-a0eaba714487\" (UID: \"a525725f-407a-4e99-96a1-a0eaba714487\") "
Mar 10 22:16:13 crc kubenswrapper[4919]: I0310 22:16:13.977519 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a525725f-407a-4e99-96a1-a0eaba714487-scripts\") pod \"a525725f-407a-4e99-96a1-a0eaba714487\" (UID: \"a525725f-407a-4e99-96a1-a0eaba714487\") "
Mar 10 22:16:13 crc kubenswrapper[4919]: I0310 22:16:13.977539 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/a525725f-407a-4e99-96a1-a0eaba714487-var-log\") pod \"a525725f-407a-4e99-96a1-a0eaba714487\" (UID: \"a525725f-407a-4e99-96a1-a0eaba714487\") "
Mar 10 22:16:13 crc kubenswrapper[4919]: I0310 22:16:13.977556 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a525725f-407a-4e99-96a1-a0eaba714487-var-run\") pod \"a525725f-407a-4e99-96a1-a0eaba714487\" (UID: \"a525725f-407a-4e99-96a1-a0eaba714487\") "
Mar 10 22:16:13 crc kubenswrapper[4919]: I0310 22:16:13.977634 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/91c8bbf6-8824-4e21-a491-86f2f657549a-cache\") pod \"91c8bbf6-8824-4e21-a491-86f2f657549a\" (UID: \"91c8bbf6-8824-4e21-a491-86f2f657549a\") "
Mar 10 22:16:13 crc kubenswrapper[4919]: I0310 22:16:13.977655 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/91c8bbf6-8824-4e21-a491-86f2f657549a-lock\") pod \"91c8bbf6-8824-4e21-a491-86f2f657549a\" (UID: \"91c8bbf6-8824-4e21-a491-86f2f657549a\") "
Mar 10 22:16:13 crc kubenswrapper[4919]: I0310 22:16:13.977689 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5tgb2\" (UniqueName: \"kubernetes.io/projected/a525725f-407a-4e99-96a1-a0eaba714487-kube-api-access-5tgb2\") pod \"a525725f-407a-4e99-96a1-a0eaba714487\" (UID: \"a525725f-407a-4e99-96a1-a0eaba714487\") "
Mar 10 22:16:13 crc kubenswrapper[4919]: I0310 22:16:13.977694 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a525725f-407a-4e99-96a1-a0eaba714487-var-run" (OuterVolumeSpecName: "var-run") pod "a525725f-407a-4e99-96a1-a0eaba714487" (UID: "a525725f-407a-4e99-96a1-a0eaba714487"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 10 22:16:13 crc kubenswrapper[4919]: I0310 22:16:13.977720 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swift\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"91c8bbf6-8824-4e21-a491-86f2f657549a\" (UID: \"91c8bbf6-8824-4e21-a491-86f2f657549a\") "
Mar 10 22:16:13 crc kubenswrapper[4919]: I0310 22:16:13.977751 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a525725f-407a-4e99-96a1-a0eaba714487-var-log" (OuterVolumeSpecName: "var-log") pod "a525725f-407a-4e99-96a1-a0eaba714487" (UID: "a525725f-407a-4e99-96a1-a0eaba714487"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 10 22:16:13 crc kubenswrapper[4919]: I0310 22:16:13.977759 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/a525725f-407a-4e99-96a1-a0eaba714487-etc-ovs\") pod \"a525725f-407a-4e99-96a1-a0eaba714487\" (UID: \"a525725f-407a-4e99-96a1-a0eaba714487\") "
Mar 10 22:16:13 crc kubenswrapper[4919]: I0310 22:16:13.977796 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/91c8bbf6-8824-4e21-a491-86f2f657549a-etc-swift\") pod \"91c8bbf6-8824-4e21-a491-86f2f657549a\" (UID: \"91c8bbf6-8824-4e21-a491-86f2f657549a\") "
Mar 10 22:16:13 crc kubenswrapper[4919]: I0310 22:16:13.977990 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a525725f-407a-4e99-96a1-a0eaba714487-var-lib" (OuterVolumeSpecName: "var-lib") pod "a525725f-407a-4e99-96a1-a0eaba714487" (UID: "a525725f-407a-4e99-96a1-a0eaba714487"). InnerVolumeSpecName "var-lib". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 10 22:16:13 crc kubenswrapper[4919]: I0310 22:16:13.978080 4919 reconciler_common.go:293] "Volume detached for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/a525725f-407a-4e99-96a1-a0eaba714487-var-lib\") on node \"crc\" DevicePath \"\""
Mar 10 22:16:13 crc kubenswrapper[4919]: I0310 22:16:13.978092 4919 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/a525725f-407a-4e99-96a1-a0eaba714487-var-log\") on node \"crc\" DevicePath \"\""
Mar 10 22:16:13 crc kubenswrapper[4919]: I0310 22:16:13.978103 4919 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a525725f-407a-4e99-96a1-a0eaba714487-var-run\") on node \"crc\" DevicePath \"\""
Mar 10 22:16:13 crc kubenswrapper[4919]: I0310 22:16:13.978131 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a525725f-407a-4e99-96a1-a0eaba714487-etc-ovs" (OuterVolumeSpecName: "etc-ovs") pod "a525725f-407a-4e99-96a1-a0eaba714487" (UID: "a525725f-407a-4e99-96a1-a0eaba714487"). InnerVolumeSpecName "etc-ovs". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 10 22:16:13 crc kubenswrapper[4919]: I0310 22:16:13.978538 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/91c8bbf6-8824-4e21-a491-86f2f657549a-cache" (OuterVolumeSpecName: "cache") pod "91c8bbf6-8824-4e21-a491-86f2f657549a" (UID: "91c8bbf6-8824-4e21-a491-86f2f657549a"). InnerVolumeSpecName "cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 22:16:13 crc kubenswrapper[4919]: I0310 22:16:13.978594 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/91c8bbf6-8824-4e21-a491-86f2f657549a-lock" (OuterVolumeSpecName: "lock") pod "91c8bbf6-8824-4e21-a491-86f2f657549a" (UID: "91c8bbf6-8824-4e21-a491-86f2f657549a"). InnerVolumeSpecName "lock". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 22:16:13 crc kubenswrapper[4919]: I0310 22:16:13.978746 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a525725f-407a-4e99-96a1-a0eaba714487-scripts" (OuterVolumeSpecName: "scripts") pod "a525725f-407a-4e99-96a1-a0eaba714487" (UID: "a525725f-407a-4e99-96a1-a0eaba714487"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 22:16:13 crc kubenswrapper[4919]: I0310 22:16:13.983096 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a525725f-407a-4e99-96a1-a0eaba714487-kube-api-access-5tgb2" (OuterVolumeSpecName: "kube-api-access-5tgb2") pod "a525725f-407a-4e99-96a1-a0eaba714487" (UID: "a525725f-407a-4e99-96a1-a0eaba714487"). InnerVolumeSpecName "kube-api-access-5tgb2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 22:16:13 crc kubenswrapper[4919]: I0310 22:16:13.983462 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "swift") pod "91c8bbf6-8824-4e21-a491-86f2f657549a" (UID: "91c8bbf6-8824-4e21-a491-86f2f657549a"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Mar 10 22:16:13 crc kubenswrapper[4919]: I0310 22:16:13.983565 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91c8bbf6-8824-4e21-a491-86f2f657549a-kube-api-access-4lj4w" (OuterVolumeSpecName: "kube-api-access-4lj4w") pod "91c8bbf6-8824-4e21-a491-86f2f657549a" (UID: "91c8bbf6-8824-4e21-a491-86f2f657549a"). InnerVolumeSpecName "kube-api-access-4lj4w". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 22:16:13 crc kubenswrapper[4919]: I0310 22:16:13.984567 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91c8bbf6-8824-4e21-a491-86f2f657549a-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "91c8bbf6-8824-4e21-a491-86f2f657549a" (UID: "91c8bbf6-8824-4e21-a491-86f2f657549a"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 22:16:14 crc kubenswrapper[4919]: I0310 22:16:14.079635 4919 reconciler_common.go:293] "Volume detached for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/91c8bbf6-8824-4e21-a491-86f2f657549a-cache\") on node \"crc\" DevicePath \"\""
Mar 10 22:16:14 crc kubenswrapper[4919]: I0310 22:16:14.079700 4919 reconciler_common.go:293] "Volume detached for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/91c8bbf6-8824-4e21-a491-86f2f657549a-lock\") on node \"crc\" DevicePath \"\""
Mar 10 22:16:14 crc kubenswrapper[4919]: I0310 22:16:14.079715 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5tgb2\" (UniqueName: \"kubernetes.io/projected/a525725f-407a-4e99-96a1-a0eaba714487-kube-api-access-5tgb2\") on node \"crc\" DevicePath \"\""
Mar 10 22:16:14 crc kubenswrapper[4919]: I0310 22:16:14.079782 4919 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" "
Mar 10 22:16:14 crc kubenswrapper[4919]: I0310 22:16:14.079798 4919 reconciler_common.go:293] "Volume detached for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/a525725f-407a-4e99-96a1-a0eaba714487-etc-ovs\") on node \"crc\" DevicePath \"\""
Mar 10 22:16:14 crc kubenswrapper[4919]: I0310 22:16:14.079810 4919 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/91c8bbf6-8824-4e21-a491-86f2f657549a-etc-swift\") on node \"crc\" DevicePath \"\""
Mar 10 22:16:14 crc kubenswrapper[4919]: I0310 22:16:14.079822 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4lj4w\" (UniqueName: \"kubernetes.io/projected/91c8bbf6-8824-4e21-a491-86f2f657549a-kube-api-access-4lj4w\") on node \"crc\" DevicePath \"\""
Mar 10 22:16:14 crc kubenswrapper[4919]: I0310 22:16:14.079833 4919 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a525725f-407a-4e99-96a1-a0eaba714487-scripts\") on node \"crc\" DevicePath \"\""
Mar 10 22:16:14 crc kubenswrapper[4919]: I0310 22:16:14.099160 4919 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc"
Mar 10 22:16:14 crc kubenswrapper[4919]: I0310 22:16:14.181198 4919 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\""
Mar 10 22:16:14 crc kubenswrapper[4919]: I0310 22:16:14.251555 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91c8bbf6-8824-4e21-a491-86f2f657549a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "91c8bbf6-8824-4e21-a491-86f2f657549a" (UID: "91c8bbf6-8824-4e21-a491-86f2f657549a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 22:16:14 crc kubenswrapper[4919]: I0310 22:16:14.282379 4919 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91c8bbf6-8824-4e21-a491-86f2f657549a-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 10 22:16:14 crc kubenswrapper[4919]: I0310 22:16:14.624273 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-5wz82_a525725f-407a-4e99-96a1-a0eaba714487/ovs-vswitchd/0.log"
Mar 10 22:16:14 crc kubenswrapper[4919]: I0310 22:16:14.625123 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-5wz82" event={"ID":"a525725f-407a-4e99-96a1-a0eaba714487","Type":"ContainerDied","Data":"8fac25358721d7097b2809907958892a7fbbaccefea7a7762f3c56886df1d01d"}
Mar 10 22:16:14 crc kubenswrapper[4919]: I0310 22:16:14.625166 4919 scope.go:117] "RemoveContainer" containerID="e7e7fcd09cc0c969ac9f3c21aebb85f2b23a2c01eb8ef776f788577ffa3c96d5"
Mar 10 22:16:14 crc kubenswrapper[4919]: I0310 22:16:14.625300 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-5wz82"
Mar 10 22:16:14 crc kubenswrapper[4919]: I0310 22:16:14.654751 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"91c8bbf6-8824-4e21-a491-86f2f657549a","Type":"ContainerDied","Data":"1d4948d46b248f570bd67c036de6ebdf57bd25a57b9b44ba8b1416c577c5779d"}
Mar 10 22:16:14 crc kubenswrapper[4919]: I0310 22:16:14.654858 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Mar 10 22:16:14 crc kubenswrapper[4919]: I0310 22:16:14.664317 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ovs-5wz82"] Mar 10 22:16:14 crc kubenswrapper[4919]: I0310 22:16:14.668474 4919 scope.go:117] "RemoveContainer" containerID="85e85aa8a7e78a2f2c6fc7044ebf9c1dcb554abd0c952c366102b9a1f2fa0880" Mar 10 22:16:14 crc kubenswrapper[4919]: I0310 22:16:14.670893 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-ovs-5wz82"] Mar 10 22:16:14 crc kubenswrapper[4919]: I0310 22:16:14.689579 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-storage-0"] Mar 10 22:16:14 crc kubenswrapper[4919]: I0310 22:16:14.693418 4919 scope.go:117] "RemoveContainer" containerID="d7ed0dd7cc3ffd81be36edf66e7963bea6d12e052bbb3d488461d7172dab3b1a" Mar 10 22:16:14 crc kubenswrapper[4919]: I0310 22:16:14.695217 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-storage-0"] Mar 10 22:16:14 crc kubenswrapper[4919]: I0310 22:16:14.717731 4919 scope.go:117] "RemoveContainer" containerID="7d6f077cbbd4ed720f528f14962aa759a22f7956036b9e97a87b6414a73da0ba" Mar 10 22:16:14 crc kubenswrapper[4919]: I0310 22:16:14.738072 4919 scope.go:117] "RemoveContainer" containerID="d4f089226b859cf9e472a23e4abfff98df12043dbab19a145b4c4fdfb8923fe6" Mar 10 22:16:14 crc kubenswrapper[4919]: I0310 22:16:14.756407 4919 scope.go:117] "RemoveContainer" containerID="b9b5e9d2ec6d050219cb0112cd09d62a15be687c7cb44e610e55e8bc795ce60f" Mar 10 22:16:14 crc kubenswrapper[4919]: I0310 22:16:14.772760 4919 scope.go:117] "RemoveContainer" containerID="ae368876337c9d3b40fae442133c17eb2b857ea33f295ec663500c6b4ebd5fb3" Mar 10 22:16:14 crc kubenswrapper[4919]: I0310 22:16:14.789657 4919 scope.go:117] "RemoveContainer" containerID="48c8ed23f216106829b3f2774da116741d89515f11a103cf91248086130b0d18" Mar 10 22:16:14 crc kubenswrapper[4919]: I0310 22:16:14.806790 4919 
scope.go:117] "RemoveContainer" containerID="fe3a820aaaecc3dbdc2c00e963ea20455282c2cd0ff782ad5b053eaa18c2a728" Mar 10 22:16:14 crc kubenswrapper[4919]: I0310 22:16:14.823715 4919 scope.go:117] "RemoveContainer" containerID="1ab1280d2d068201d8bde5433767f956a9a5d9d033ba36ade9a16c74b928af27" Mar 10 22:16:14 crc kubenswrapper[4919]: I0310 22:16:14.855173 4919 scope.go:117] "RemoveContainer" containerID="ea01dce7b06601c355d1e0c1f5bc23af1e7381afe0f09aa8a98efd1dceeab0e9" Mar 10 22:16:14 crc kubenswrapper[4919]: I0310 22:16:14.873690 4919 scope.go:117] "RemoveContainer" containerID="5f25a2d98dbdc44c683281bef776d43cbc4121e4fa9cb5254edd868686128030" Mar 10 22:16:14 crc kubenswrapper[4919]: I0310 22:16:14.890406 4919 scope.go:117] "RemoveContainer" containerID="8e08fa3055d8aa50b60d141bc84fe0f51f25e9fc45728c0dd1491d1bc7a66860" Mar 10 22:16:14 crc kubenswrapper[4919]: I0310 22:16:14.907227 4919 scope.go:117] "RemoveContainer" containerID="75261d329b8223b1135cf1458a80f97a9d45e26831f1353325eb35731b37f5d2" Mar 10 22:16:14 crc kubenswrapper[4919]: I0310 22:16:14.923793 4919 scope.go:117] "RemoveContainer" containerID="8406c65e24e9f21076831c641cab8e21b69c623c1d92a90d609d4a3c30670852" Mar 10 22:16:14 crc kubenswrapper[4919]: I0310 22:16:14.940710 4919 scope.go:117] "RemoveContainer" containerID="f84605aa41e553170463de5ffc1ffb79d9063aeb307fca1c7396bcab45897e17" Mar 10 22:16:14 crc kubenswrapper[4919]: I0310 22:16:14.963876 4919 scope.go:117] "RemoveContainer" containerID="2c39ba342378f6fa53b3c9079da22731dd39cfe3e751a36987c68151f2ee3a38" Mar 10 22:16:14 crc kubenswrapper[4919]: I0310 22:16:14.990418 4919 scope.go:117] "RemoveContainer" containerID="d87f46352498df2659c2cb9f7933207812d2b038ce489fc1d9131bb625190926" Mar 10 22:16:15 crc kubenswrapper[4919]: I0310 22:16:15.502276 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="91c8bbf6-8824-4e21-a491-86f2f657549a" path="/var/lib/kubelet/pods/91c8bbf6-8824-4e21-a491-86f2f657549a/volumes" Mar 10 22:16:15 crc 
kubenswrapper[4919]: I0310 22:16:15.504000 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a525725f-407a-4e99-96a1-a0eaba714487" path="/var/lib/kubelet/pods/a525725f-407a-4e99-96a1-a0eaba714487/volumes" Mar 10 22:16:18 crc kubenswrapper[4919]: I0310 22:16:18.118429 4919 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","podd0f89c3b-5242-409b-a318-5b69410e9680"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort podd0f89c3b-5242-409b-a318-5b69410e9680] : Timed out while waiting for systemd to remove kubepods-besteffort-podd0f89c3b_5242_409b_a318_5b69410e9680.slice" Mar 10 22:16:18 crc kubenswrapper[4919]: E0310 22:16:18.118816 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort podd0f89c3b-5242-409b-a318-5b69410e9680] : unable to destroy cgroup paths for cgroup [kubepods besteffort podd0f89c3b-5242-409b-a318-5b69410e9680] : Timed out while waiting for systemd to remove kubepods-besteffort-podd0f89c3b_5242_409b_a318_5b69410e9680.slice" pod="openstack/swift-proxy-fbf4c94d9-4mg9b" podUID="d0f89c3b-5242-409b-a318-5b69410e9680" Mar 10 22:16:18 crc kubenswrapper[4919]: I0310 22:16:18.687873 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-fbf4c94d9-4mg9b" Mar 10 22:16:18 crc kubenswrapper[4919]: I0310 22:16:18.711167 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-fbf4c94d9-4mg9b"] Mar 10 22:16:18 crc kubenswrapper[4919]: I0310 22:16:18.716817 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-proxy-fbf4c94d9-4mg9b"] Mar 10 22:16:19 crc kubenswrapper[4919]: I0310 22:16:19.489924 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0f89c3b-5242-409b-a318-5b69410e9680" path="/var/lib/kubelet/pods/d0f89c3b-5242-409b-a318-5b69410e9680/volumes" Mar 10 22:16:27 crc kubenswrapper[4919]: I0310 22:16:27.480839 4919 scope.go:117] "RemoveContainer" containerID="fce2ab31f6ae341422fcdee59d32194b41ef9122dd92f9a1264d329e9e490637" Mar 10 22:16:27 crc kubenswrapper[4919]: E0310 22:16:27.482160 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" Mar 10 22:16:40 crc kubenswrapper[4919]: I0310 22:16:40.480300 4919 scope.go:117] "RemoveContainer" containerID="fce2ab31f6ae341422fcdee59d32194b41ef9122dd92f9a1264d329e9e490637" Mar 10 22:16:40 crc kubenswrapper[4919]: E0310 22:16:40.481418 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" Mar 10 
22:16:40 crc kubenswrapper[4919]: I0310 22:16:40.851771 4919 scope.go:117] "RemoveContainer" containerID="2dd8adb46bd856ff5f07b26aebb46c78dd38855252e1e7b88749967541cb88e9" Mar 10 22:16:54 crc kubenswrapper[4919]: I0310 22:16:54.480199 4919 scope.go:117] "RemoveContainer" containerID="fce2ab31f6ae341422fcdee59d32194b41ef9122dd92f9a1264d329e9e490637" Mar 10 22:16:54 crc kubenswrapper[4919]: E0310 22:16:54.481239 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" Mar 10 22:17:08 crc kubenswrapper[4919]: I0310 22:17:08.480029 4919 scope.go:117] "RemoveContainer" containerID="fce2ab31f6ae341422fcdee59d32194b41ef9122dd92f9a1264d329e9e490637" Mar 10 22:17:08 crc kubenswrapper[4919]: E0310 22:17:08.480965 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" Mar 10 22:17:21 crc kubenswrapper[4919]: I0310 22:17:21.480276 4919 scope.go:117] "RemoveContainer" containerID="fce2ab31f6ae341422fcdee59d32194b41ef9122dd92f9a1264d329e9e490637" Mar 10 22:17:21 crc kubenswrapper[4919]: E0310 22:17:21.481128 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" Mar 10 22:17:32 crc kubenswrapper[4919]: I0310 22:17:32.479526 4919 scope.go:117] "RemoveContainer" containerID="fce2ab31f6ae341422fcdee59d32194b41ef9122dd92f9a1264d329e9e490637" Mar 10 22:17:32 crc kubenswrapper[4919]: E0310 22:17:32.480248 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" Mar 10 22:17:41 crc kubenswrapper[4919]: I0310 22:17:41.384680 4919 scope.go:117] "RemoveContainer" containerID="677037ec6fc3ac6df63ece5407c9f84f545fc7d8e698c763107303f155ca69f6" Mar 10 22:17:41 crc kubenswrapper[4919]: I0310 22:17:41.409691 4919 scope.go:117] "RemoveContainer" containerID="6348edd095c11b2dfbe689495aad97ef5ace707e77fe77df248c5adecd0d8c19" Mar 10 22:17:41 crc kubenswrapper[4919]: I0310 22:17:41.437722 4919 scope.go:117] "RemoveContainer" containerID="38f8de90e88a2382619b837ef63b573edfd9808dacdc190631ae0849effe576d" Mar 10 22:17:41 crc kubenswrapper[4919]: I0310 22:17:41.484645 4919 scope.go:117] "RemoveContainer" containerID="f19fe15f09263bcc48907f378bd5f9fe918a057a8af224365585cbf7e60cb441" Mar 10 22:17:41 crc kubenswrapper[4919]: I0310 22:17:41.508868 4919 scope.go:117] "RemoveContainer" containerID="fd7c8f0b306e73496827ddf94aa00307911195ef6e69e874d9d894cdee7e72a3" Mar 10 22:17:41 crc kubenswrapper[4919]: I0310 22:17:41.530203 4919 scope.go:117] "RemoveContainer" containerID="f8a9ebb35a9684e930bb7e44a0621ffb15e0b669cb105b45857980e222bb8a9c" Mar 10 22:17:41 crc 
kubenswrapper[4919]: I0310 22:17:41.549558 4919 scope.go:117] "RemoveContainer" containerID="f1cfb380cb13d63aaa674f6ff137ef0d4367d24b15d6edf7643da148391493ad" Mar 10 22:17:41 crc kubenswrapper[4919]: I0310 22:17:41.568209 4919 scope.go:117] "RemoveContainer" containerID="d260e7d4813d4ebda28235e9d31249a155784dc638f6003f6270f056f891383a" Mar 10 22:17:41 crc kubenswrapper[4919]: I0310 22:17:41.586106 4919 scope.go:117] "RemoveContainer" containerID="24ec58e2a80c23809125a023c1cce17ddcc1f52fe9b8a224d0e7dd4ad8e312a9" Mar 10 22:17:41 crc kubenswrapper[4919]: I0310 22:17:41.605020 4919 scope.go:117] "RemoveContainer" containerID="0bd792ea437cac42db6d52a44652671ab655e56b2f2eee19cc85a6060157c0a3" Mar 10 22:17:41 crc kubenswrapper[4919]: I0310 22:17:41.621811 4919 scope.go:117] "RemoveContainer" containerID="8e1955911205622f71307953f67465667ebb2fac9ec9d00869e93c81c2720854" Mar 10 22:17:41 crc kubenswrapper[4919]: I0310 22:17:41.651093 4919 scope.go:117] "RemoveContainer" containerID="4acacbb159ddc3d2dcc05f7a2181c4a0198ea30fd738a3acdcd1ed55e18abf6b" Mar 10 22:17:41 crc kubenswrapper[4919]: I0310 22:17:41.679925 4919 scope.go:117] "RemoveContainer" containerID="df3255a2d0759e4981bfd7e03b09952206e74469af65e2de0fe714948a7f652b" Mar 10 22:17:41 crc kubenswrapper[4919]: I0310 22:17:41.720132 4919 scope.go:117] "RemoveContainer" containerID="55306eee2a2abd644ed6e52709eaa16a01bae893d58ffb9f0076970f427c4738" Mar 10 22:17:41 crc kubenswrapper[4919]: I0310 22:17:41.747479 4919 scope.go:117] "RemoveContainer" containerID="99ab815c089630a419af1f8eee02aa01e609ba044e18bf1dea4c234c8e02a57b" Mar 10 22:17:41 crc kubenswrapper[4919]: I0310 22:17:41.767438 4919 scope.go:117] "RemoveContainer" containerID="7f67034bb3fc927d4123b73a239402611e74fab959e166ee008ced152b098c44" Mar 10 22:17:41 crc kubenswrapper[4919]: I0310 22:17:41.782473 4919 scope.go:117] "RemoveContainer" containerID="4f99e6cf36c4ed13e551e93f2fa19e48205056c672df3ef2acaf8d7e18c63129" Mar 10 22:17:41 crc kubenswrapper[4919]: I0310 
22:17:41.797162 4919 scope.go:117] "RemoveContainer" containerID="05a129c14cde6b4a62a3b17eec8fd6a1dbb19c5c9e8518261a398182a085ee06" Mar 10 22:17:41 crc kubenswrapper[4919]: I0310 22:17:41.815538 4919 scope.go:117] "RemoveContainer" containerID="17c37ab7d9a4d15efad770799d8f1b0cb204d67a5cf81f20458d249be5df213d" Mar 10 22:17:41 crc kubenswrapper[4919]: I0310 22:17:41.831656 4919 scope.go:117] "RemoveContainer" containerID="1cbaf6bf606b116c1dcc6c8ceb022ba5bf13acdeef539d78bf1c08e4dba722aa" Mar 10 22:17:45 crc kubenswrapper[4919]: I0310 22:17:45.479931 4919 scope.go:117] "RemoveContainer" containerID="fce2ab31f6ae341422fcdee59d32194b41ef9122dd92f9a1264d329e9e490637" Mar 10 22:17:45 crc kubenswrapper[4919]: E0310 22:17:45.480427 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" Mar 10 22:17:49 crc kubenswrapper[4919]: I0310 22:17:49.000514 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-b28jz"] Mar 10 22:17:49 crc kubenswrapper[4919]: E0310 22:17:49.001254 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91c8bbf6-8824-4e21-a491-86f2f657549a" containerName="account-server" Mar 10 22:17:49 crc kubenswrapper[4919]: I0310 22:17:49.001278 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="91c8bbf6-8824-4e21-a491-86f2f657549a" containerName="account-server" Mar 10 22:17:49 crc kubenswrapper[4919]: E0310 22:17:49.001299 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91c8bbf6-8824-4e21-a491-86f2f657549a" containerName="object-expirer" Mar 10 22:17:49 crc kubenswrapper[4919]: I0310 22:17:49.001309 4919 
state_mem.go:107] "Deleted CPUSet assignment" podUID="91c8bbf6-8824-4e21-a491-86f2f657549a" containerName="object-expirer" Mar 10 22:17:49 crc kubenswrapper[4919]: E0310 22:17:49.001328 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d8a7c94-9bde-4dfc-9172-9c116e26b70e" containerName="oc" Mar 10 22:17:49 crc kubenswrapper[4919]: I0310 22:17:49.001339 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d8a7c94-9bde-4dfc-9172-9c116e26b70e" containerName="oc" Mar 10 22:17:49 crc kubenswrapper[4919]: E0310 22:17:49.001352 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a525725f-407a-4e99-96a1-a0eaba714487" containerName="ovs-vswitchd" Mar 10 22:17:49 crc kubenswrapper[4919]: I0310 22:17:49.001363 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="a525725f-407a-4e99-96a1-a0eaba714487" containerName="ovs-vswitchd" Mar 10 22:17:49 crc kubenswrapper[4919]: E0310 22:17:49.001377 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91c8bbf6-8824-4e21-a491-86f2f657549a" containerName="account-reaper" Mar 10 22:17:49 crc kubenswrapper[4919]: I0310 22:17:49.001386 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="91c8bbf6-8824-4e21-a491-86f2f657549a" containerName="account-reaper" Mar 10 22:17:49 crc kubenswrapper[4919]: E0310 22:17:49.001419 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91c8bbf6-8824-4e21-a491-86f2f657549a" containerName="rsync" Mar 10 22:17:49 crc kubenswrapper[4919]: I0310 22:17:49.001429 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="91c8bbf6-8824-4e21-a491-86f2f657549a" containerName="rsync" Mar 10 22:17:49 crc kubenswrapper[4919]: E0310 22:17:49.001445 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a44bcbb-6e2e-48bb-b7a7-16a4e916001d" containerName="neutron-httpd" Mar 10 22:17:49 crc kubenswrapper[4919]: I0310 22:17:49.001455 4919 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="0a44bcbb-6e2e-48bb-b7a7-16a4e916001d" containerName="neutron-httpd" Mar 10 22:17:49 crc kubenswrapper[4919]: E0310 22:17:49.001474 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91c8bbf6-8824-4e21-a491-86f2f657549a" containerName="container-replicator" Mar 10 22:17:49 crc kubenswrapper[4919]: I0310 22:17:49.001484 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="91c8bbf6-8824-4e21-a491-86f2f657549a" containerName="container-replicator" Mar 10 22:17:49 crc kubenswrapper[4919]: E0310 22:17:49.001508 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91c8bbf6-8824-4e21-a491-86f2f657549a" containerName="object-auditor" Mar 10 22:17:49 crc kubenswrapper[4919]: I0310 22:17:49.001520 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="91c8bbf6-8824-4e21-a491-86f2f657549a" containerName="object-auditor" Mar 10 22:17:49 crc kubenswrapper[4919]: E0310 22:17:49.001538 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a525725f-407a-4e99-96a1-a0eaba714487" containerName="ovsdb-server" Mar 10 22:17:49 crc kubenswrapper[4919]: I0310 22:17:49.001548 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="a525725f-407a-4e99-96a1-a0eaba714487" containerName="ovsdb-server" Mar 10 22:17:49 crc kubenswrapper[4919]: E0310 22:17:49.001559 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91c8bbf6-8824-4e21-a491-86f2f657549a" containerName="container-updater" Mar 10 22:17:49 crc kubenswrapper[4919]: I0310 22:17:49.001569 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="91c8bbf6-8824-4e21-a491-86f2f657549a" containerName="container-updater" Mar 10 22:17:49 crc kubenswrapper[4919]: E0310 22:17:49.001582 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91c8bbf6-8824-4e21-a491-86f2f657549a" containerName="object-server" Mar 10 22:17:49 crc kubenswrapper[4919]: I0310 22:17:49.001592 4919 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="91c8bbf6-8824-4e21-a491-86f2f657549a" containerName="object-server" Mar 10 22:17:49 crc kubenswrapper[4919]: E0310 22:17:49.001606 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91c8bbf6-8824-4e21-a491-86f2f657549a" containerName="account-replicator" Mar 10 22:17:49 crc kubenswrapper[4919]: I0310 22:17:49.001616 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="91c8bbf6-8824-4e21-a491-86f2f657549a" containerName="account-replicator" Mar 10 22:17:49 crc kubenswrapper[4919]: E0310 22:17:49.001631 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a44bcbb-6e2e-48bb-b7a7-16a4e916001d" containerName="neutron-api" Mar 10 22:17:49 crc kubenswrapper[4919]: I0310 22:17:49.001640 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a44bcbb-6e2e-48bb-b7a7-16a4e916001d" containerName="neutron-api" Mar 10 22:17:49 crc kubenswrapper[4919]: E0310 22:17:49.001654 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91c8bbf6-8824-4e21-a491-86f2f657549a" containerName="account-auditor" Mar 10 22:17:49 crc kubenswrapper[4919]: I0310 22:17:49.001665 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="91c8bbf6-8824-4e21-a491-86f2f657549a" containerName="account-auditor" Mar 10 22:17:49 crc kubenswrapper[4919]: E0310 22:17:49.001683 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91c8bbf6-8824-4e21-a491-86f2f657549a" containerName="object-updater" Mar 10 22:17:49 crc kubenswrapper[4919]: I0310 22:17:49.001693 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="91c8bbf6-8824-4e21-a491-86f2f657549a" containerName="object-updater" Mar 10 22:17:49 crc kubenswrapper[4919]: E0310 22:17:49.001708 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91c8bbf6-8824-4e21-a491-86f2f657549a" containerName="swift-recon-cron" Mar 10 22:17:49 crc kubenswrapper[4919]: I0310 22:17:49.001718 4919 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="91c8bbf6-8824-4e21-a491-86f2f657549a" containerName="swift-recon-cron" Mar 10 22:17:49 crc kubenswrapper[4919]: E0310 22:17:49.001739 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91c8bbf6-8824-4e21-a491-86f2f657549a" containerName="object-replicator" Mar 10 22:17:49 crc kubenswrapper[4919]: I0310 22:17:49.001748 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="91c8bbf6-8824-4e21-a491-86f2f657549a" containerName="object-replicator" Mar 10 22:17:49 crc kubenswrapper[4919]: E0310 22:17:49.001764 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91c8bbf6-8824-4e21-a491-86f2f657549a" containerName="container-auditor" Mar 10 22:17:49 crc kubenswrapper[4919]: I0310 22:17:49.001773 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="91c8bbf6-8824-4e21-a491-86f2f657549a" containerName="container-auditor" Mar 10 22:17:49 crc kubenswrapper[4919]: E0310 22:17:49.001789 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a525725f-407a-4e99-96a1-a0eaba714487" containerName="ovsdb-server-init" Mar 10 22:17:49 crc kubenswrapper[4919]: I0310 22:17:49.001799 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="a525725f-407a-4e99-96a1-a0eaba714487" containerName="ovsdb-server-init" Mar 10 22:17:49 crc kubenswrapper[4919]: E0310 22:17:49.001812 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91c8bbf6-8824-4e21-a491-86f2f657549a" containerName="container-server" Mar 10 22:17:49 crc kubenswrapper[4919]: I0310 22:17:49.001822 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="91c8bbf6-8824-4e21-a491-86f2f657549a" containerName="container-server" Mar 10 22:17:49 crc kubenswrapper[4919]: I0310 22:17:49.002057 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a44bcbb-6e2e-48bb-b7a7-16a4e916001d" containerName="neutron-api" Mar 10 22:17:49 crc kubenswrapper[4919]: I0310 22:17:49.002082 4919 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="91c8bbf6-8824-4e21-a491-86f2f657549a" containerName="rsync" Mar 10 22:17:49 crc kubenswrapper[4919]: I0310 22:17:49.002102 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="a525725f-407a-4e99-96a1-a0eaba714487" containerName="ovsdb-server" Mar 10 22:17:49 crc kubenswrapper[4919]: I0310 22:17:49.002116 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="76a514a0-0d4c-4f6b-8ba7-cd5b4834d625" containerName="mariadb-account-create-update" Mar 10 22:17:49 crc kubenswrapper[4919]: I0310 22:17:49.002132 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="91c8bbf6-8824-4e21-a491-86f2f657549a" containerName="account-reaper" Mar 10 22:17:49 crc kubenswrapper[4919]: I0310 22:17:49.002149 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a44bcbb-6e2e-48bb-b7a7-16a4e916001d" containerName="neutron-httpd" Mar 10 22:17:49 crc kubenswrapper[4919]: I0310 22:17:49.002163 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="91c8bbf6-8824-4e21-a491-86f2f657549a" containerName="container-auditor" Mar 10 22:17:49 crc kubenswrapper[4919]: I0310 22:17:49.002181 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="91c8bbf6-8824-4e21-a491-86f2f657549a" containerName="object-replicator" Mar 10 22:17:49 crc kubenswrapper[4919]: I0310 22:17:49.002200 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d8a7c94-9bde-4dfc-9172-9c116e26b70e" containerName="oc" Mar 10 22:17:49 crc kubenswrapper[4919]: I0310 22:17:49.002218 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="91c8bbf6-8824-4e21-a491-86f2f657549a" containerName="container-server" Mar 10 22:17:49 crc kubenswrapper[4919]: I0310 22:17:49.002235 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="91c8bbf6-8824-4e21-a491-86f2f657549a" containerName="account-auditor" Mar 10 22:17:49 crc kubenswrapper[4919]: I0310 22:17:49.002247 4919 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="91c8bbf6-8824-4e21-a491-86f2f657549a" containerName="object-auditor" Mar 10 22:17:49 crc kubenswrapper[4919]: I0310 22:17:49.002262 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="91c8bbf6-8824-4e21-a491-86f2f657549a" containerName="account-server" Mar 10 22:17:49 crc kubenswrapper[4919]: I0310 22:17:49.002274 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="91c8bbf6-8824-4e21-a491-86f2f657549a" containerName="swift-recon-cron" Mar 10 22:17:49 crc kubenswrapper[4919]: I0310 22:17:49.002285 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="91c8bbf6-8824-4e21-a491-86f2f657549a" containerName="account-replicator" Mar 10 22:17:49 crc kubenswrapper[4919]: I0310 22:17:49.002301 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="91c8bbf6-8824-4e21-a491-86f2f657549a" containerName="object-server" Mar 10 22:17:49 crc kubenswrapper[4919]: I0310 22:17:49.002316 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="91c8bbf6-8824-4e21-a491-86f2f657549a" containerName="container-replicator" Mar 10 22:17:49 crc kubenswrapper[4919]: I0310 22:17:49.002331 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="91c8bbf6-8824-4e21-a491-86f2f657549a" containerName="container-updater" Mar 10 22:17:49 crc kubenswrapper[4919]: I0310 22:17:49.002363 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="a525725f-407a-4e99-96a1-a0eaba714487" containerName="ovs-vswitchd" Mar 10 22:17:49 crc kubenswrapper[4919]: I0310 22:17:49.002383 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="91c8bbf6-8824-4e21-a491-86f2f657549a" containerName="object-updater" Mar 10 22:17:49 crc kubenswrapper[4919]: I0310 22:17:49.002415 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="91c8bbf6-8824-4e21-a491-86f2f657549a" containerName="object-expirer" Mar 10 22:17:49 crc kubenswrapper[4919]: I0310 
22:17:49.003916 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-b28jz"
Mar 10 22:17:49 crc kubenswrapper[4919]: I0310 22:17:49.018932 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-b28jz"]
Mar 10 22:17:49 crc kubenswrapper[4919]: I0310 22:17:49.145133 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/221e35cf-a234-41f3-82e6-e9b9c14ea121-catalog-content\") pod \"certified-operators-b28jz\" (UID: \"221e35cf-a234-41f3-82e6-e9b9c14ea121\") " pod="openshift-marketplace/certified-operators-b28jz"
Mar 10 22:17:49 crc kubenswrapper[4919]: I0310 22:17:49.145237 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/221e35cf-a234-41f3-82e6-e9b9c14ea121-utilities\") pod \"certified-operators-b28jz\" (UID: \"221e35cf-a234-41f3-82e6-e9b9c14ea121\") " pod="openshift-marketplace/certified-operators-b28jz"
Mar 10 22:17:49 crc kubenswrapper[4919]: I0310 22:17:49.145311 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lvr6z\" (UniqueName: \"kubernetes.io/projected/221e35cf-a234-41f3-82e6-e9b9c14ea121-kube-api-access-lvr6z\") pod \"certified-operators-b28jz\" (UID: \"221e35cf-a234-41f3-82e6-e9b9c14ea121\") " pod="openshift-marketplace/certified-operators-b28jz"
Mar 10 22:17:49 crc kubenswrapper[4919]: I0310 22:17:49.246381 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/221e35cf-a234-41f3-82e6-e9b9c14ea121-utilities\") pod \"certified-operators-b28jz\" (UID: \"221e35cf-a234-41f3-82e6-e9b9c14ea121\") " pod="openshift-marketplace/certified-operators-b28jz"
Mar 10 22:17:49 crc kubenswrapper[4919]: I0310 22:17:49.246483 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lvr6z\" (UniqueName: \"kubernetes.io/projected/221e35cf-a234-41f3-82e6-e9b9c14ea121-kube-api-access-lvr6z\") pod \"certified-operators-b28jz\" (UID: \"221e35cf-a234-41f3-82e6-e9b9c14ea121\") " pod="openshift-marketplace/certified-operators-b28jz"
Mar 10 22:17:49 crc kubenswrapper[4919]: I0310 22:17:49.246512 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/221e35cf-a234-41f3-82e6-e9b9c14ea121-catalog-content\") pod \"certified-operators-b28jz\" (UID: \"221e35cf-a234-41f3-82e6-e9b9c14ea121\") " pod="openshift-marketplace/certified-operators-b28jz"
Mar 10 22:17:49 crc kubenswrapper[4919]: I0310 22:17:49.246934 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/221e35cf-a234-41f3-82e6-e9b9c14ea121-utilities\") pod \"certified-operators-b28jz\" (UID: \"221e35cf-a234-41f3-82e6-e9b9c14ea121\") " pod="openshift-marketplace/certified-operators-b28jz"
Mar 10 22:17:49 crc kubenswrapper[4919]: I0310 22:17:49.246957 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/221e35cf-a234-41f3-82e6-e9b9c14ea121-catalog-content\") pod \"certified-operators-b28jz\" (UID: \"221e35cf-a234-41f3-82e6-e9b9c14ea121\") " pod="openshift-marketplace/certified-operators-b28jz"
Mar 10 22:17:49 crc kubenswrapper[4919]: I0310 22:17:49.265313 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lvr6z\" (UniqueName: \"kubernetes.io/projected/221e35cf-a234-41f3-82e6-e9b9c14ea121-kube-api-access-lvr6z\") pod \"certified-operators-b28jz\" (UID: \"221e35cf-a234-41f3-82e6-e9b9c14ea121\") " pod="openshift-marketplace/certified-operators-b28jz"
Mar 10 22:17:49 crc kubenswrapper[4919]: I0310 22:17:49.322715 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-b28jz"
Mar 10 22:17:49 crc kubenswrapper[4919]: I0310 22:17:49.661777 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-b28jz"]
Mar 10 22:17:49 crc kubenswrapper[4919]: E0310 22:17:49.986870 4919 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod221e35cf_a234_41f3_82e6_e9b9c14ea121.slice/crio-conmon-bc560d3f72c4f4b13ddb39089e5e3fae23ec07924d7b82695f68c95f3c445d7e.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod221e35cf_a234_41f3_82e6_e9b9c14ea121.slice/crio-bc560d3f72c4f4b13ddb39089e5e3fae23ec07924d7b82695f68c95f3c445d7e.scope\": RecentStats: unable to find data in memory cache]"
Mar 10 22:17:50 crc kubenswrapper[4919]: I0310 22:17:50.544988 4919 generic.go:334] "Generic (PLEG): container finished" podID="221e35cf-a234-41f3-82e6-e9b9c14ea121" containerID="bc560d3f72c4f4b13ddb39089e5e3fae23ec07924d7b82695f68c95f3c445d7e" exitCode=0
Mar 10 22:17:50 crc kubenswrapper[4919]: I0310 22:17:50.545032 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b28jz" event={"ID":"221e35cf-a234-41f3-82e6-e9b9c14ea121","Type":"ContainerDied","Data":"bc560d3f72c4f4b13ddb39089e5e3fae23ec07924d7b82695f68c95f3c445d7e"}
Mar 10 22:17:50 crc kubenswrapper[4919]: I0310 22:17:50.545058 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b28jz" event={"ID":"221e35cf-a234-41f3-82e6-e9b9c14ea121","Type":"ContainerStarted","Data":"125a6579353d02788da99287235485c106cebde00a34d693788dee89ed448ed5"}
Mar 10 22:17:51 crc kubenswrapper[4919]: I0310 22:17:51.556164 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b28jz" event={"ID":"221e35cf-a234-41f3-82e6-e9b9c14ea121","Type":"ContainerStarted","Data":"22e6bbf7fa5d4a5f433bba36c80df2574f4282cd0c1d9bc811b9565509994e5a"}
Mar 10 22:17:52 crc kubenswrapper[4919]: I0310 22:17:52.564035 4919 generic.go:334] "Generic (PLEG): container finished" podID="221e35cf-a234-41f3-82e6-e9b9c14ea121" containerID="22e6bbf7fa5d4a5f433bba36c80df2574f4282cd0c1d9bc811b9565509994e5a" exitCode=0
Mar 10 22:17:52 crc kubenswrapper[4919]: I0310 22:17:52.564077 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b28jz" event={"ID":"221e35cf-a234-41f3-82e6-e9b9c14ea121","Type":"ContainerDied","Data":"22e6bbf7fa5d4a5f433bba36c80df2574f4282cd0c1d9bc811b9565509994e5a"}
Mar 10 22:17:53 crc kubenswrapper[4919]: I0310 22:17:53.574271 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b28jz" event={"ID":"221e35cf-a234-41f3-82e6-e9b9c14ea121","Type":"ContainerStarted","Data":"cb0530c6ad3874fc12c32c37112b6580fd12ccb0ed298bc833079e4b1d67304e"}
Mar 10 22:17:53 crc kubenswrapper[4919]: I0310 22:17:53.592360 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-b28jz" podStartSLOduration=3.043225892 podStartE2EDuration="5.592340767s" podCreationTimestamp="2026-03-10 22:17:48 +0000 UTC" firstStartedPulling="2026-03-10 22:17:50.548140909 +0000 UTC m=+1657.790021517" lastFinishedPulling="2026-03-10 22:17:53.097255764 +0000 UTC m=+1660.339136392" observedRunningTime="2026-03-10 22:17:53.587614939 +0000 UTC m=+1660.829495557" watchObservedRunningTime="2026-03-10 22:17:53.592340767 +0000 UTC m=+1660.834221375"
Mar 10 22:17:58 crc kubenswrapper[4919]: I0310 22:17:58.480523 4919 scope.go:117] "RemoveContainer" containerID="fce2ab31f6ae341422fcdee59d32194b41ef9122dd92f9a1264d329e9e490637"
Mar 10 22:17:58 crc kubenswrapper[4919]: E0310 22:17:58.481066 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc"
Mar 10 22:17:59 crc kubenswrapper[4919]: I0310 22:17:59.323912 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-b28jz"
Mar 10 22:17:59 crc kubenswrapper[4919]: I0310 22:17:59.323959 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-b28jz"
Mar 10 22:17:59 crc kubenswrapper[4919]: I0310 22:17:59.378263 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-b28jz"
Mar 10 22:17:59 crc kubenswrapper[4919]: I0310 22:17:59.685373 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-b28jz"
Mar 10 22:17:59 crc kubenswrapper[4919]: I0310 22:17:59.735051 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-b28jz"]
Mar 10 22:18:00 crc kubenswrapper[4919]: I0310 22:18:00.155323 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553018-9b8dk"]
Mar 10 22:18:00 crc kubenswrapper[4919]: I0310 22:18:00.156752 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553018-9b8dk"
Mar 10 22:18:00 crc kubenswrapper[4919]: I0310 22:18:00.159773 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wbvtv"
Mar 10 22:18:00 crc kubenswrapper[4919]: I0310 22:18:00.160299 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 10 22:18:00 crc kubenswrapper[4919]: I0310 22:18:00.162056 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 10 22:18:00 crc kubenswrapper[4919]: I0310 22:18:00.171552 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553018-9b8dk"]
Mar 10 22:18:00 crc kubenswrapper[4919]: I0310 22:18:00.308959 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d2bfn\" (UniqueName: \"kubernetes.io/projected/83ace4ea-6c97-47a9-bdc2-bf1d4a75d030-kube-api-access-d2bfn\") pod \"auto-csr-approver-29553018-9b8dk\" (UID: \"83ace4ea-6c97-47a9-bdc2-bf1d4a75d030\") " pod="openshift-infra/auto-csr-approver-29553018-9b8dk"
Mar 10 22:18:00 crc kubenswrapper[4919]: I0310 22:18:00.410018 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d2bfn\" (UniqueName: \"kubernetes.io/projected/83ace4ea-6c97-47a9-bdc2-bf1d4a75d030-kube-api-access-d2bfn\") pod \"auto-csr-approver-29553018-9b8dk\" (UID: \"83ace4ea-6c97-47a9-bdc2-bf1d4a75d030\") " pod="openshift-infra/auto-csr-approver-29553018-9b8dk"
Mar 10 22:18:00 crc kubenswrapper[4919]: I0310 22:18:00.432064 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d2bfn\" (UniqueName: \"kubernetes.io/projected/83ace4ea-6c97-47a9-bdc2-bf1d4a75d030-kube-api-access-d2bfn\") pod \"auto-csr-approver-29553018-9b8dk\" (UID: \"83ace4ea-6c97-47a9-bdc2-bf1d4a75d030\") " pod="openshift-infra/auto-csr-approver-29553018-9b8dk"
Mar 10 22:18:00 crc kubenswrapper[4919]: I0310 22:18:00.481381 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553018-9b8dk"
Mar 10 22:18:00 crc kubenswrapper[4919]: I0310 22:18:00.937456 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553018-9b8dk"]
Mar 10 22:18:00 crc kubenswrapper[4919]: W0310 22:18:00.947400 4919 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod83ace4ea_6c97_47a9_bdc2_bf1d4a75d030.slice/crio-9cfd4ac29af8bd6710a59c59a802b85fe855442ff1bb7066984a2ae6ca32ec13 WatchSource:0}: Error finding container 9cfd4ac29af8bd6710a59c59a802b85fe855442ff1bb7066984a2ae6ca32ec13: Status 404 returned error can't find the container with id 9cfd4ac29af8bd6710a59c59a802b85fe855442ff1bb7066984a2ae6ca32ec13
Mar 10 22:18:01 crc kubenswrapper[4919]: I0310 22:18:01.673495 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553018-9b8dk" event={"ID":"83ace4ea-6c97-47a9-bdc2-bf1d4a75d030","Type":"ContainerStarted","Data":"9cfd4ac29af8bd6710a59c59a802b85fe855442ff1bb7066984a2ae6ca32ec13"}
Mar 10 22:18:01 crc kubenswrapper[4919]: I0310 22:18:01.673634 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-b28jz" podUID="221e35cf-a234-41f3-82e6-e9b9c14ea121" containerName="registry-server" containerID="cri-o://cb0530c6ad3874fc12c32c37112b6580fd12ccb0ed298bc833079e4b1d67304e" gracePeriod=2
Mar 10 22:18:02 crc kubenswrapper[4919]: I0310 22:18:02.050767 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-b28jz"
Mar 10 22:18:02 crc kubenswrapper[4919]: I0310 22:18:02.239573 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/221e35cf-a234-41f3-82e6-e9b9c14ea121-utilities\") pod \"221e35cf-a234-41f3-82e6-e9b9c14ea121\" (UID: \"221e35cf-a234-41f3-82e6-e9b9c14ea121\") "
Mar 10 22:18:02 crc kubenswrapper[4919]: I0310 22:18:02.239634 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/221e35cf-a234-41f3-82e6-e9b9c14ea121-catalog-content\") pod \"221e35cf-a234-41f3-82e6-e9b9c14ea121\" (UID: \"221e35cf-a234-41f3-82e6-e9b9c14ea121\") "
Mar 10 22:18:02 crc kubenswrapper[4919]: I0310 22:18:02.239665 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lvr6z\" (UniqueName: \"kubernetes.io/projected/221e35cf-a234-41f3-82e6-e9b9c14ea121-kube-api-access-lvr6z\") pod \"221e35cf-a234-41f3-82e6-e9b9c14ea121\" (UID: \"221e35cf-a234-41f3-82e6-e9b9c14ea121\") "
Mar 10 22:18:02 crc kubenswrapper[4919]: I0310 22:18:02.240782 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/221e35cf-a234-41f3-82e6-e9b9c14ea121-utilities" (OuterVolumeSpecName: "utilities") pod "221e35cf-a234-41f3-82e6-e9b9c14ea121" (UID: "221e35cf-a234-41f3-82e6-e9b9c14ea121"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 22:18:02 crc kubenswrapper[4919]: I0310 22:18:02.251546 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/221e35cf-a234-41f3-82e6-e9b9c14ea121-kube-api-access-lvr6z" (OuterVolumeSpecName: "kube-api-access-lvr6z") pod "221e35cf-a234-41f3-82e6-e9b9c14ea121" (UID: "221e35cf-a234-41f3-82e6-e9b9c14ea121"). InnerVolumeSpecName "kube-api-access-lvr6z". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 22:18:02 crc kubenswrapper[4919]: I0310 22:18:02.305758 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/221e35cf-a234-41f3-82e6-e9b9c14ea121-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "221e35cf-a234-41f3-82e6-e9b9c14ea121" (UID: "221e35cf-a234-41f3-82e6-e9b9c14ea121"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 22:18:02 crc kubenswrapper[4919]: I0310 22:18:02.341047 4919 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/221e35cf-a234-41f3-82e6-e9b9c14ea121-utilities\") on node \"crc\" DevicePath \"\""
Mar 10 22:18:02 crc kubenswrapper[4919]: I0310 22:18:02.341096 4919 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/221e35cf-a234-41f3-82e6-e9b9c14ea121-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 10 22:18:02 crc kubenswrapper[4919]: I0310 22:18:02.341108 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lvr6z\" (UniqueName: \"kubernetes.io/projected/221e35cf-a234-41f3-82e6-e9b9c14ea121-kube-api-access-lvr6z\") on node \"crc\" DevicePath \"\""
Mar 10 22:18:02 crc kubenswrapper[4919]: I0310 22:18:02.686889 4919 generic.go:334] "Generic (PLEG): container finished" podID="221e35cf-a234-41f3-82e6-e9b9c14ea121" containerID="cb0530c6ad3874fc12c32c37112b6580fd12ccb0ed298bc833079e4b1d67304e" exitCode=0
Mar 10 22:18:02 crc kubenswrapper[4919]: I0310 22:18:02.686959 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b28jz" event={"ID":"221e35cf-a234-41f3-82e6-e9b9c14ea121","Type":"ContainerDied","Data":"cb0530c6ad3874fc12c32c37112b6580fd12ccb0ed298bc833079e4b1d67304e"}
Mar 10 22:18:02 crc kubenswrapper[4919]: I0310 22:18:02.686977 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-b28jz"
Mar 10 22:18:02 crc kubenswrapper[4919]: I0310 22:18:02.687029 4919 scope.go:117] "RemoveContainer" containerID="cb0530c6ad3874fc12c32c37112b6580fd12ccb0ed298bc833079e4b1d67304e"
Mar 10 22:18:02 crc kubenswrapper[4919]: I0310 22:18:02.686984 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b28jz" event={"ID":"221e35cf-a234-41f3-82e6-e9b9c14ea121","Type":"ContainerDied","Data":"125a6579353d02788da99287235485c106cebde00a34d693788dee89ed448ed5"}
Mar 10 22:18:02 crc kubenswrapper[4919]: I0310 22:18:02.689887 4919 generic.go:334] "Generic (PLEG): container finished" podID="83ace4ea-6c97-47a9-bdc2-bf1d4a75d030" containerID="cbef5380b0f4b3c785f97e656aac3c8c7514482b21b0c4b038d3103bd0f07797" exitCode=0
Mar 10 22:18:02 crc kubenswrapper[4919]: I0310 22:18:02.689934 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553018-9b8dk" event={"ID":"83ace4ea-6c97-47a9-bdc2-bf1d4a75d030","Type":"ContainerDied","Data":"cbef5380b0f4b3c785f97e656aac3c8c7514482b21b0c4b038d3103bd0f07797"}
Mar 10 22:18:02 crc kubenswrapper[4919]: I0310 22:18:02.710754 4919 scope.go:117] "RemoveContainer" containerID="22e6bbf7fa5d4a5f433bba36c80df2574f4282cd0c1d9bc811b9565509994e5a"
Mar 10 22:18:02 crc kubenswrapper[4919]: I0310 22:18:02.726038 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-b28jz"]
Mar 10 22:18:02 crc kubenswrapper[4919]: I0310 22:18:02.733945 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-b28jz"]
Mar 10 22:18:02 crc kubenswrapper[4919]: I0310 22:18:02.745791 4919 scope.go:117] "RemoveContainer" containerID="bc560d3f72c4f4b13ddb39089e5e3fae23ec07924d7b82695f68c95f3c445d7e"
Mar 10 22:18:02 crc kubenswrapper[4919]: I0310 22:18:02.760423 4919 scope.go:117] "RemoveContainer" containerID="cb0530c6ad3874fc12c32c37112b6580fd12ccb0ed298bc833079e4b1d67304e"
Mar 10 22:18:02 crc kubenswrapper[4919]: E0310 22:18:02.760844 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb0530c6ad3874fc12c32c37112b6580fd12ccb0ed298bc833079e4b1d67304e\": container with ID starting with cb0530c6ad3874fc12c32c37112b6580fd12ccb0ed298bc833079e4b1d67304e not found: ID does not exist" containerID="cb0530c6ad3874fc12c32c37112b6580fd12ccb0ed298bc833079e4b1d67304e"
Mar 10 22:18:02 crc kubenswrapper[4919]: I0310 22:18:02.760871 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb0530c6ad3874fc12c32c37112b6580fd12ccb0ed298bc833079e4b1d67304e"} err="failed to get container status \"cb0530c6ad3874fc12c32c37112b6580fd12ccb0ed298bc833079e4b1d67304e\": rpc error: code = NotFound desc = could not find container \"cb0530c6ad3874fc12c32c37112b6580fd12ccb0ed298bc833079e4b1d67304e\": container with ID starting with cb0530c6ad3874fc12c32c37112b6580fd12ccb0ed298bc833079e4b1d67304e not found: ID does not exist"
Mar 10 22:18:02 crc kubenswrapper[4919]: I0310 22:18:02.760889 4919 scope.go:117] "RemoveContainer" containerID="22e6bbf7fa5d4a5f433bba36c80df2574f4282cd0c1d9bc811b9565509994e5a"
Mar 10 22:18:02 crc kubenswrapper[4919]: E0310 22:18:02.761163 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"22e6bbf7fa5d4a5f433bba36c80df2574f4282cd0c1d9bc811b9565509994e5a\": container with ID starting with 22e6bbf7fa5d4a5f433bba36c80df2574f4282cd0c1d9bc811b9565509994e5a not found: ID does not exist" containerID="22e6bbf7fa5d4a5f433bba36c80df2574f4282cd0c1d9bc811b9565509994e5a"
Mar 10 22:18:02 crc kubenswrapper[4919]: I0310 22:18:02.761187 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22e6bbf7fa5d4a5f433bba36c80df2574f4282cd0c1d9bc811b9565509994e5a"} err="failed to get container status \"22e6bbf7fa5d4a5f433bba36c80df2574f4282cd0c1d9bc811b9565509994e5a\": rpc error: code = NotFound desc = could not find container \"22e6bbf7fa5d4a5f433bba36c80df2574f4282cd0c1d9bc811b9565509994e5a\": container with ID starting with 22e6bbf7fa5d4a5f433bba36c80df2574f4282cd0c1d9bc811b9565509994e5a not found: ID does not exist"
Mar 10 22:18:02 crc kubenswrapper[4919]: I0310 22:18:02.761203 4919 scope.go:117] "RemoveContainer" containerID="bc560d3f72c4f4b13ddb39089e5e3fae23ec07924d7b82695f68c95f3c445d7e"
Mar 10 22:18:02 crc kubenswrapper[4919]: E0310 22:18:02.761494 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc560d3f72c4f4b13ddb39089e5e3fae23ec07924d7b82695f68c95f3c445d7e\": container with ID starting with bc560d3f72c4f4b13ddb39089e5e3fae23ec07924d7b82695f68c95f3c445d7e not found: ID does not exist" containerID="bc560d3f72c4f4b13ddb39089e5e3fae23ec07924d7b82695f68c95f3c445d7e"
Mar 10 22:18:02 crc kubenswrapper[4919]: I0310 22:18:02.761602 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc560d3f72c4f4b13ddb39089e5e3fae23ec07924d7b82695f68c95f3c445d7e"} err="failed to get container status \"bc560d3f72c4f4b13ddb39089e5e3fae23ec07924d7b82695f68c95f3c445d7e\": rpc error: code = NotFound desc = could not find container \"bc560d3f72c4f4b13ddb39089e5e3fae23ec07924d7b82695f68c95f3c445d7e\": container with ID starting with bc560d3f72c4f4b13ddb39089e5e3fae23ec07924d7b82695f68c95f3c445d7e not found: ID does not exist"
Mar 10 22:18:03 crc kubenswrapper[4919]: I0310 22:18:03.489339 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="221e35cf-a234-41f3-82e6-e9b9c14ea121" path="/var/lib/kubelet/pods/221e35cf-a234-41f3-82e6-e9b9c14ea121/volumes"
Mar 10 22:18:03 crc kubenswrapper[4919]: I0310 22:18:03.965120 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553018-9b8dk"
Mar 10 22:18:04 crc kubenswrapper[4919]: I0310 22:18:04.067726 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d2bfn\" (UniqueName: \"kubernetes.io/projected/83ace4ea-6c97-47a9-bdc2-bf1d4a75d030-kube-api-access-d2bfn\") pod \"83ace4ea-6c97-47a9-bdc2-bf1d4a75d030\" (UID: \"83ace4ea-6c97-47a9-bdc2-bf1d4a75d030\") "
Mar 10 22:18:04 crc kubenswrapper[4919]: I0310 22:18:04.075899 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/83ace4ea-6c97-47a9-bdc2-bf1d4a75d030-kube-api-access-d2bfn" (OuterVolumeSpecName: "kube-api-access-d2bfn") pod "83ace4ea-6c97-47a9-bdc2-bf1d4a75d030" (UID: "83ace4ea-6c97-47a9-bdc2-bf1d4a75d030"). InnerVolumeSpecName "kube-api-access-d2bfn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 22:18:04 crc kubenswrapper[4919]: I0310 22:18:04.169268 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d2bfn\" (UniqueName: \"kubernetes.io/projected/83ace4ea-6c97-47a9-bdc2-bf1d4a75d030-kube-api-access-d2bfn\") on node \"crc\" DevicePath \"\""
Mar 10 22:18:04 crc kubenswrapper[4919]: I0310 22:18:04.708075 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553018-9b8dk" event={"ID":"83ace4ea-6c97-47a9-bdc2-bf1d4a75d030","Type":"ContainerDied","Data":"9cfd4ac29af8bd6710a59c59a802b85fe855442ff1bb7066984a2ae6ca32ec13"}
Mar 10 22:18:04 crc kubenswrapper[4919]: I0310 22:18:04.708113 4919 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9cfd4ac29af8bd6710a59c59a802b85fe855442ff1bb7066984a2ae6ca32ec13"
Mar 10 22:18:04 crc kubenswrapper[4919]: I0310 22:18:04.708159 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553018-9b8dk"
Mar 10 22:18:05 crc kubenswrapper[4919]: I0310 22:18:05.047052 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553012-5ftd6"]
Mar 10 22:18:05 crc kubenswrapper[4919]: I0310 22:18:05.055084 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553012-5ftd6"]
Mar 10 22:18:05 crc kubenswrapper[4919]: I0310 22:18:05.493679 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="76635374-85f5-4577-866f-5f561c5223df" path="/var/lib/kubelet/pods/76635374-85f5-4577-866f-5f561c5223df/volumes"
Mar 10 22:18:12 crc kubenswrapper[4919]: I0310 22:18:12.479437 4919 scope.go:117] "RemoveContainer" containerID="fce2ab31f6ae341422fcdee59d32194b41ef9122dd92f9a1264d329e9e490637"
Mar 10 22:18:12 crc kubenswrapper[4919]: E0310 22:18:12.480153 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc"
Mar 10 22:18:27 crc kubenswrapper[4919]: I0310 22:18:27.480341 4919 scope.go:117] "RemoveContainer" containerID="fce2ab31f6ae341422fcdee59d32194b41ef9122dd92f9a1264d329e9e490637"
Mar 10 22:18:27 crc kubenswrapper[4919]: E0310 22:18:27.481305 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc"
Mar 10 22:18:40 crc kubenswrapper[4919]: I0310 22:18:40.479527 4919 scope.go:117] "RemoveContainer" containerID="fce2ab31f6ae341422fcdee59d32194b41ef9122dd92f9a1264d329e9e490637"
Mar 10 22:18:40 crc kubenswrapper[4919]: E0310 22:18:40.480272 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc"
Mar 10 22:18:42 crc kubenswrapper[4919]: I0310 22:18:42.126674 4919 scope.go:117] "RemoveContainer" containerID="aff95dc2fe60966146b7d53443620fb83fd82b6ea8fa70ab79dfd44bdb8d6acd"
Mar 10 22:18:42 crc kubenswrapper[4919]: I0310 22:18:42.166283 4919 scope.go:117] "RemoveContainer" containerID="2b30e15d87c6fddad7f6c0a030db8b5f1462206ab3c95932ffc9a7cef934c3ef"
Mar 10 22:18:42 crc kubenswrapper[4919]: I0310 22:18:42.191289 4919 scope.go:117] "RemoveContainer" containerID="e4d70f78ccff4f0649cd6b2f0b66c626faab3e22bb0695b32e2d8f790b8b831d"
Mar 10 22:18:42 crc kubenswrapper[4919]: I0310 22:18:42.212420 4919 scope.go:117] "RemoveContainer" containerID="bb3bb3d24b551528222b912f6c8926a0f15b9dfebc9af7fbb028f9ffe7e9157b"
Mar 10 22:18:42 crc kubenswrapper[4919]: I0310 22:18:42.247847 4919 scope.go:117] "RemoveContainer" containerID="236e9edb8142b4785375f2f9d21591aeca381142891f25109c78d197b1c4208e"
Mar 10 22:18:42 crc kubenswrapper[4919]: I0310 22:18:42.286976 4919 scope.go:117] "RemoveContainer" containerID="db27c753dfd8df1b990a2d95eeec2891aa5193e1c72a29e062a7115cc6c131ca"
Mar 10 22:18:42 crc kubenswrapper[4919]: I0310 22:18:42.322668 4919 scope.go:117] "RemoveContainer" containerID="77dbfb1aefcdb836ebf686fec318f0978773a19f038987b02ed6672bc1025d92"
Mar 10 22:18:42 crc kubenswrapper[4919]: I0310 22:18:42.359153 4919 scope.go:117] "RemoveContainer" containerID="8115aff6a5ec07bd6d211cbd9c4a87d9e5ee80d167242486efee63b7a58a33d9"
Mar 10 22:18:42 crc kubenswrapper[4919]: I0310 22:18:42.380975 4919 scope.go:117] "RemoveContainer" containerID="e41c9011e8878fcf4a11e3a413f329408bd2c3dead8cb0c09dc6db523f0244ab"
Mar 10 22:18:42 crc kubenswrapper[4919]: I0310 22:18:42.408968 4919 scope.go:117] "RemoveContainer" containerID="aebd460c1524d22d0ce665b0b841469a1c52d0bdede4cadc1c9e4e9f42ec6354"
Mar 10 22:18:42 crc kubenswrapper[4919]: I0310 22:18:42.428858 4919 scope.go:117] "RemoveContainer" containerID="d88c0958bf40600808f9977f231a5fa1e34419ab9909560a692746f53c31f4f0"
Mar 10 22:18:53 crc kubenswrapper[4919]: I0310 22:18:53.485371 4919 scope.go:117] "RemoveContainer" containerID="fce2ab31f6ae341422fcdee59d32194b41ef9122dd92f9a1264d329e9e490637"
Mar 10 22:18:53 crc kubenswrapper[4919]: E0310 22:18:53.488694 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc"
Mar 10 22:19:08 crc kubenswrapper[4919]: I0310 22:19:08.480688 4919 scope.go:117] "RemoveContainer" containerID="fce2ab31f6ae341422fcdee59d32194b41ef9122dd92f9a1264d329e9e490637"
Mar 10 22:19:08 crc kubenswrapper[4919]: E0310 22:19:08.482677 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc"
Mar 10 22:19:22 crc kubenswrapper[4919]: I0310 22:19:22.480505 4919 scope.go:117] "RemoveContainer" containerID="fce2ab31f6ae341422fcdee59d32194b41ef9122dd92f9a1264d329e9e490637"
Mar 10 22:19:22 crc kubenswrapper[4919]: E0310 22:19:22.481475 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc"
Mar 10 22:19:33 crc kubenswrapper[4919]: I0310 22:19:33.485526 4919 scope.go:117] "RemoveContainer" containerID="fce2ab31f6ae341422fcdee59d32194b41ef9122dd92f9a1264d329e9e490637"
Mar 10 22:19:33 crc kubenswrapper[4919]: E0310 22:19:33.486380 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc"
Mar 10 22:19:42 crc kubenswrapper[4919]: I0310 22:19:42.620850 4919 scope.go:117] "RemoveContainer" containerID="942d637771ae9486432f6cce158ed7da90a899d905ff8b6f64d0376dfe1fb5a3"
Mar 10 22:19:42 crc kubenswrapper[4919]: I0310 22:19:42.643514 4919 scope.go:117] "RemoveContainer" containerID="046aa9ef267aea44a0077d8321ee3d8194793ad759fea7abc91c42878cf2fddd"
Mar 10 22:19:42 crc kubenswrapper[4919]: I0310 22:19:42.694722 4919 scope.go:117] "RemoveContainer" containerID="e6abbc8c723ca7696038307a22953b5386b9ff8c8a9d68a9824b94a0392c584f"
Mar 10 22:19:42 crc kubenswrapper[4919]: I0310 22:19:42.717172 4919 scope.go:117] "RemoveContainer" containerID="28558d9c5a4011f4f1726c6720180e234d9f532065398309212635d2b44ca8dc"
Mar 10 22:19:42 crc kubenswrapper[4919]: I0310 22:19:42.742673 4919 scope.go:117] "RemoveContainer" containerID="144ff57e69648189dc612fb89f4a4cc6778d18f6b1badec4db262b0b5f0688da"
Mar 10 22:19:42 crc kubenswrapper[4919]: I0310 22:19:42.766790 4919 scope.go:117] "RemoveContainer" containerID="d6a06432e7a87025091d7e31257f5fa7aa4053b2b47174460ef24368894a7ad6"
Mar 10 22:19:42 crc kubenswrapper[4919]: I0310 22:19:42.805879 4919 scope.go:117] "RemoveContainer" containerID="ed00eadc0e3031dc348aef4bf08402de8320f03c5ea490e9824db3c427f042fe"
Mar 10 22:19:48 crc kubenswrapper[4919]: I0310 22:19:48.480448 4919 scope.go:117] "RemoveContainer" containerID="fce2ab31f6ae341422fcdee59d32194b41ef9122dd92f9a1264d329e9e490637"
Mar 10 22:19:48 crc kubenswrapper[4919]: E0310 22:19:48.480938 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc"
Mar 10 22:20:00 crc kubenswrapper[4919]: I0310 22:20:00.140000 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553020-6rjdv"]
Mar 10 22:20:00 crc kubenswrapper[4919]: E0310 22:20:00.141214 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="221e35cf-a234-41f3-82e6-e9b9c14ea121" containerName="extract-utilities"
Mar 10 22:20:00 crc kubenswrapper[4919]: I0310 22:20:00.141295 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="221e35cf-a234-41f3-82e6-e9b9c14ea121" containerName="extract-utilities"
Mar 10 22:20:00 crc kubenswrapper[4919]: E0310 22:20:00.141309 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="221e35cf-a234-41f3-82e6-e9b9c14ea121" containerName="extract-content"
Mar 10 22:20:00 crc kubenswrapper[4919]: I0310 22:20:00.141320 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="221e35cf-a234-41f3-82e6-e9b9c14ea121" containerName="extract-content"
Mar 10 22:20:00 crc kubenswrapper[4919]: E0310 22:20:00.141405 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83ace4ea-6c97-47a9-bdc2-bf1d4a75d030" containerName="oc"
Mar 10 22:20:00 crc kubenswrapper[4919]: I0310 22:20:00.141420 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="83ace4ea-6c97-47a9-bdc2-bf1d4a75d030" containerName="oc"
Mar 10 22:20:00 crc kubenswrapper[4919]: E0310 22:20:00.141439 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="221e35cf-a234-41f3-82e6-e9b9c14ea121" containerName="registry-server"
Mar 10 22:20:00 crc kubenswrapper[4919]: I0310 22:20:00.141446 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="221e35cf-a234-41f3-82e6-e9b9c14ea121" containerName="registry-server"
Mar 10 22:20:00 crc kubenswrapper[4919]: I0310 22:20:00.141670 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="83ace4ea-6c97-47a9-bdc2-bf1d4a75d030" containerName="oc"
Mar 10 22:20:00 crc kubenswrapper[4919]: I0310 22:20:00.141701 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="221e35cf-a234-41f3-82e6-e9b9c14ea121" containerName="registry-server"
Mar 10 22:20:00 crc kubenswrapper[4919]: I0310 22:20:00.142548 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553020-6rjdv"
Mar 10 22:20:00 crc kubenswrapper[4919]: I0310 22:20:00.144668 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wbvtv"
Mar 10 22:20:00 crc kubenswrapper[4919]: I0310 22:20:00.144981 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 10 22:20:00 crc kubenswrapper[4919]: I0310 22:20:00.145120 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 10 22:20:00 crc kubenswrapper[4919]: I0310 22:20:00.148531 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553020-6rjdv"]
Mar 10 22:20:00 crc kubenswrapper[4919]: I0310 22:20:00.261961 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wfg6t\" (UniqueName: \"kubernetes.io/projected/ada4c843-4330-4fb1-9905-4bf620b86429-kube-api-access-wfg6t\") pod \"auto-csr-approver-29553020-6rjdv\" (UID: \"ada4c843-4330-4fb1-9905-4bf620b86429\") " pod="openshift-infra/auto-csr-approver-29553020-6rjdv"
Mar 10 22:20:00 crc kubenswrapper[4919]: I0310 22:20:00.363553 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wfg6t\" (UniqueName: \"kubernetes.io/projected/ada4c843-4330-4fb1-9905-4bf620b86429-kube-api-access-wfg6t\") pod \"auto-csr-approver-29553020-6rjdv\" (UID: \"ada4c843-4330-4fb1-9905-4bf620b86429\") " pod="openshift-infra/auto-csr-approver-29553020-6rjdv"
Mar 10 22:20:00 crc kubenswrapper[4919]: I0310 22:20:00.385439 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wfg6t\" (UniqueName: \"kubernetes.io/projected/ada4c843-4330-4fb1-9905-4bf620b86429-kube-api-access-wfg6t\") pod \"auto-csr-approver-29553020-6rjdv\" (UID: \"ada4c843-4330-4fb1-9905-4bf620b86429\") "
pod="openshift-infra/auto-csr-approver-29553020-6rjdv" Mar 10 22:20:00 crc kubenswrapper[4919]: I0310 22:20:00.464311 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553020-6rjdv" Mar 10 22:20:00 crc kubenswrapper[4919]: I0310 22:20:00.859865 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553020-6rjdv"] Mar 10 22:20:01 crc kubenswrapper[4919]: I0310 22:20:01.480688 4919 scope.go:117] "RemoveContainer" containerID="fce2ab31f6ae341422fcdee59d32194b41ef9122dd92f9a1264d329e9e490637" Mar 10 22:20:01 crc kubenswrapper[4919]: E0310 22:20:01.481230 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" Mar 10 22:20:01 crc kubenswrapper[4919]: I0310 22:20:01.690942 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553020-6rjdv" event={"ID":"ada4c843-4330-4fb1-9905-4bf620b86429","Type":"ContainerStarted","Data":"acc3fd75b142b6efbeddfcf5f827c2133e6497fb9bf726580c6073dbe965ad26"} Mar 10 22:20:02 crc kubenswrapper[4919]: I0310 22:20:02.699919 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553020-6rjdv" event={"ID":"ada4c843-4330-4fb1-9905-4bf620b86429","Type":"ContainerStarted","Data":"a4378340f4b73f5db606a12c70fcc405834febee26a1309f682de200dbedcae7"} Mar 10 22:20:03 crc kubenswrapper[4919]: I0310 22:20:03.707343 4919 generic.go:334] "Generic (PLEG): container finished" podID="ada4c843-4330-4fb1-9905-4bf620b86429" containerID="a4378340f4b73f5db606a12c70fcc405834febee26a1309f682de200dbedcae7" 
exitCode=0 Mar 10 22:20:03 crc kubenswrapper[4919]: I0310 22:20:03.707406 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553020-6rjdv" event={"ID":"ada4c843-4330-4fb1-9905-4bf620b86429","Type":"ContainerDied","Data":"a4378340f4b73f5db606a12c70fcc405834febee26a1309f682de200dbedcae7"} Mar 10 22:20:04 crc kubenswrapper[4919]: I0310 22:20:04.974274 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553020-6rjdv" Mar 10 22:20:05 crc kubenswrapper[4919]: I0310 22:20:05.129214 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wfg6t\" (UniqueName: \"kubernetes.io/projected/ada4c843-4330-4fb1-9905-4bf620b86429-kube-api-access-wfg6t\") pod \"ada4c843-4330-4fb1-9905-4bf620b86429\" (UID: \"ada4c843-4330-4fb1-9905-4bf620b86429\") " Mar 10 22:20:05 crc kubenswrapper[4919]: I0310 22:20:05.134632 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ada4c843-4330-4fb1-9905-4bf620b86429-kube-api-access-wfg6t" (OuterVolumeSpecName: "kube-api-access-wfg6t") pod "ada4c843-4330-4fb1-9905-4bf620b86429" (UID: "ada4c843-4330-4fb1-9905-4bf620b86429"). InnerVolumeSpecName "kube-api-access-wfg6t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 22:20:05 crc kubenswrapper[4919]: I0310 22:20:05.231025 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wfg6t\" (UniqueName: \"kubernetes.io/projected/ada4c843-4330-4fb1-9905-4bf620b86429-kube-api-access-wfg6t\") on node \"crc\" DevicePath \"\"" Mar 10 22:20:05 crc kubenswrapper[4919]: I0310 22:20:05.727186 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553020-6rjdv" event={"ID":"ada4c843-4330-4fb1-9905-4bf620b86429","Type":"ContainerDied","Data":"acc3fd75b142b6efbeddfcf5f827c2133e6497fb9bf726580c6073dbe965ad26"} Mar 10 22:20:05 crc kubenswrapper[4919]: I0310 22:20:05.727234 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553020-6rjdv" Mar 10 22:20:05 crc kubenswrapper[4919]: I0310 22:20:05.727249 4919 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="acc3fd75b142b6efbeddfcf5f827c2133e6497fb9bf726580c6073dbe965ad26" Mar 10 22:20:05 crc kubenswrapper[4919]: I0310 22:20:05.783944 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553014-d5fgg"] Mar 10 22:20:05 crc kubenswrapper[4919]: I0310 22:20:05.790694 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553014-d5fgg"] Mar 10 22:20:07 crc kubenswrapper[4919]: I0310 22:20:07.489953 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90cad2d4-b151-4000-82ee-fed894ad117a" path="/var/lib/kubelet/pods/90cad2d4-b151-4000-82ee-fed894ad117a/volumes" Mar 10 22:20:15 crc kubenswrapper[4919]: I0310 22:20:15.480003 4919 scope.go:117] "RemoveContainer" containerID="fce2ab31f6ae341422fcdee59d32194b41ef9122dd92f9a1264d329e9e490637" Mar 10 22:20:15 crc kubenswrapper[4919]: E0310 22:20:15.480872 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" Mar 10 22:20:27 crc kubenswrapper[4919]: I0310 22:20:27.480034 4919 scope.go:117] "RemoveContainer" containerID="fce2ab31f6ae341422fcdee59d32194b41ef9122dd92f9a1264d329e9e490637" Mar 10 22:20:27 crc kubenswrapper[4919]: E0310 22:20:27.480877 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" Mar 10 22:20:39 crc kubenswrapper[4919]: I0310 22:20:39.480101 4919 scope.go:117] "RemoveContainer" containerID="fce2ab31f6ae341422fcdee59d32194b41ef9122dd92f9a1264d329e9e490637" Mar 10 22:20:39 crc kubenswrapper[4919]: E0310 22:20:39.480942 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" Mar 10 22:20:42 crc kubenswrapper[4919]: I0310 22:20:42.950517 4919 scope.go:117] "RemoveContainer" containerID="e21cf6248decf29aa992c186881d057311c63ce4f642d28d2d190eaeaf041479" Mar 10 22:20:42 crc kubenswrapper[4919]: I0310 22:20:42.972619 4919 scope.go:117] "RemoveContainer" 
containerID="7d17cb532f3759c11c504765e96e40250ed2fc93714c74cd7c4b67a68350db0b" Mar 10 22:20:42 crc kubenswrapper[4919]: I0310 22:20:42.991077 4919 scope.go:117] "RemoveContainer" containerID="22b51ffd48cd3ea852a7427f1dfc881e19ae51a0a5028680360ef0179dcc54e1" Mar 10 22:20:43 crc kubenswrapper[4919]: I0310 22:20:43.020402 4919 scope.go:117] "RemoveContainer" containerID="fcb0215525c257f40fbb026eda215eee7e909386a54b513f6d4e594c4b7c8077" Mar 10 22:20:43 crc kubenswrapper[4919]: I0310 22:20:43.061933 4919 scope.go:117] "RemoveContainer" containerID="c87e48a8d3a96530e75f3196af093428f39eb925905d84a7251111e71248682b" Mar 10 22:20:43 crc kubenswrapper[4919]: I0310 22:20:43.101205 4919 scope.go:117] "RemoveContainer" containerID="92cf38036c784cc198b471c45b92409cdc090c7ce641ef5b3d72746ab8ddf341" Mar 10 22:20:51 crc kubenswrapper[4919]: I0310 22:20:51.480189 4919 scope.go:117] "RemoveContainer" containerID="fce2ab31f6ae341422fcdee59d32194b41ef9122dd92f9a1264d329e9e490637" Mar 10 22:20:51 crc kubenswrapper[4919]: E0310 22:20:51.481063 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" Mar 10 22:21:03 crc kubenswrapper[4919]: I0310 22:21:03.485946 4919 scope.go:117] "RemoveContainer" containerID="fce2ab31f6ae341422fcdee59d32194b41ef9122dd92f9a1264d329e9e490637" Mar 10 22:21:04 crc kubenswrapper[4919]: I0310 22:21:04.237158 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" event={"ID":"566678d1-f416-4116-ab20-b30dceb86cdc","Type":"ContainerStarted","Data":"140c87708c6fcfea09e12588fc5a3cd15890bdc32f129d66ab0f0f4f6ace9d9e"} Mar 10 22:21:43 
crc kubenswrapper[4919]: I0310 22:21:43.239821 4919 scope.go:117] "RemoveContainer" containerID="c84f2ef693d15394b292c9104df5d25964c0ba501f0d8a5f3f3a4710e2c7b5e1" Mar 10 22:21:43 crc kubenswrapper[4919]: I0310 22:21:43.269661 4919 scope.go:117] "RemoveContainer" containerID="4f27c36666ba7ecf2d24cedee59efe2a08c7b4e6c86f4fe4198918504c6bf578" Mar 10 22:21:43 crc kubenswrapper[4919]: I0310 22:21:43.294373 4919 scope.go:117] "RemoveContainer" containerID="8dd9c6db1ef3f3090c173c377cd48fad2c1b903961ef2c219973ba650ae92aa3" Mar 10 22:21:43 crc kubenswrapper[4919]: I0310 22:21:43.317078 4919 scope.go:117] "RemoveContainer" containerID="77810b20846ff06e6e2286529394046c226fcdd21d8e8615e097fa35b3579513" Mar 10 22:21:43 crc kubenswrapper[4919]: I0310 22:21:43.337234 4919 scope.go:117] "RemoveContainer" containerID="7f26055d95c56893f487c89ea638cb9eb78e36cad1498490bf8ab54a3e7bac8c" Mar 10 22:21:43 crc kubenswrapper[4919]: I0310 22:21:43.369014 4919 scope.go:117] "RemoveContainer" containerID="0d25c07eec1b4976670c75603fd5da5476a97bed8734c71857c7dda9c1fa75bb" Mar 10 22:21:43 crc kubenswrapper[4919]: I0310 22:21:43.389189 4919 scope.go:117] "RemoveContainer" containerID="69fdc3c8e2ab199f6bb93d9e3a1a78edfb949241e41f07473825a631528f1dde" Mar 10 22:22:00 crc kubenswrapper[4919]: I0310 22:22:00.154841 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553022-2hnzl"] Mar 10 22:22:00 crc kubenswrapper[4919]: E0310 22:22:00.156238 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ada4c843-4330-4fb1-9905-4bf620b86429" containerName="oc" Mar 10 22:22:00 crc kubenswrapper[4919]: I0310 22:22:00.156260 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="ada4c843-4330-4fb1-9905-4bf620b86429" containerName="oc" Mar 10 22:22:00 crc kubenswrapper[4919]: I0310 22:22:00.156528 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="ada4c843-4330-4fb1-9905-4bf620b86429" containerName="oc" Mar 10 22:22:00 crc 
kubenswrapper[4919]: I0310 22:22:00.157464 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553022-2hnzl" Mar 10 22:22:00 crc kubenswrapper[4919]: I0310 22:22:00.161421 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wbvtv" Mar 10 22:22:00 crc kubenswrapper[4919]: I0310 22:22:00.161923 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 22:22:00 crc kubenswrapper[4919]: I0310 22:22:00.161933 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 22:22:00 crc kubenswrapper[4919]: I0310 22:22:00.167980 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553022-2hnzl"] Mar 10 22:22:00 crc kubenswrapper[4919]: I0310 22:22:00.206104 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46fgf\" (UniqueName: \"kubernetes.io/projected/1049b5a8-2f4b-4797-b484-7a151abab4bb-kube-api-access-46fgf\") pod \"auto-csr-approver-29553022-2hnzl\" (UID: \"1049b5a8-2f4b-4797-b484-7a151abab4bb\") " pod="openshift-infra/auto-csr-approver-29553022-2hnzl" Mar 10 22:22:00 crc kubenswrapper[4919]: I0310 22:22:00.307895 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-46fgf\" (UniqueName: \"kubernetes.io/projected/1049b5a8-2f4b-4797-b484-7a151abab4bb-kube-api-access-46fgf\") pod \"auto-csr-approver-29553022-2hnzl\" (UID: \"1049b5a8-2f4b-4797-b484-7a151abab4bb\") " pod="openshift-infra/auto-csr-approver-29553022-2hnzl" Mar 10 22:22:00 crc kubenswrapper[4919]: I0310 22:22:00.327050 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-46fgf\" (UniqueName: \"kubernetes.io/projected/1049b5a8-2f4b-4797-b484-7a151abab4bb-kube-api-access-46fgf\") pod 
\"auto-csr-approver-29553022-2hnzl\" (UID: \"1049b5a8-2f4b-4797-b484-7a151abab4bb\") " pod="openshift-infra/auto-csr-approver-29553022-2hnzl" Mar 10 22:22:00 crc kubenswrapper[4919]: I0310 22:22:00.488520 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553022-2hnzl" Mar 10 22:22:00 crc kubenswrapper[4919]: I0310 22:22:00.934466 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553022-2hnzl"] Mar 10 22:22:00 crc kubenswrapper[4919]: W0310 22:22:00.947130 4919 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1049b5a8_2f4b_4797_b484_7a151abab4bb.slice/crio-f5598111cdcda113bcfb3272442eb2a63f969b04d9447d5a90927ee5e828f398 WatchSource:0}: Error finding container f5598111cdcda113bcfb3272442eb2a63f969b04d9447d5a90927ee5e828f398: Status 404 returned error can't find the container with id f5598111cdcda113bcfb3272442eb2a63f969b04d9447d5a90927ee5e828f398 Mar 10 22:22:00 crc kubenswrapper[4919]: I0310 22:22:00.950317 4919 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 10 22:22:01 crc kubenswrapper[4919]: I0310 22:22:01.753412 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553022-2hnzl" event={"ID":"1049b5a8-2f4b-4797-b484-7a151abab4bb","Type":"ContainerStarted","Data":"f5598111cdcda113bcfb3272442eb2a63f969b04d9447d5a90927ee5e828f398"} Mar 10 22:22:02 crc kubenswrapper[4919]: I0310 22:22:02.769671 4919 generic.go:334] "Generic (PLEG): container finished" podID="1049b5a8-2f4b-4797-b484-7a151abab4bb" containerID="a9bd416f7458468cdb52e97ef6ae631a4dad2182192fd7cba36b15232278973d" exitCode=0 Mar 10 22:22:02 crc kubenswrapper[4919]: I0310 22:22:02.769905 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553022-2hnzl" 
event={"ID":"1049b5a8-2f4b-4797-b484-7a151abab4bb","Type":"ContainerDied","Data":"a9bd416f7458468cdb52e97ef6ae631a4dad2182192fd7cba36b15232278973d"} Mar 10 22:22:04 crc kubenswrapper[4919]: I0310 22:22:04.080445 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553022-2hnzl" Mar 10 22:22:04 crc kubenswrapper[4919]: I0310 22:22:04.163153 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-46fgf\" (UniqueName: \"kubernetes.io/projected/1049b5a8-2f4b-4797-b484-7a151abab4bb-kube-api-access-46fgf\") pod \"1049b5a8-2f4b-4797-b484-7a151abab4bb\" (UID: \"1049b5a8-2f4b-4797-b484-7a151abab4bb\") " Mar 10 22:22:04 crc kubenswrapper[4919]: I0310 22:22:04.168767 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1049b5a8-2f4b-4797-b484-7a151abab4bb-kube-api-access-46fgf" (OuterVolumeSpecName: "kube-api-access-46fgf") pod "1049b5a8-2f4b-4797-b484-7a151abab4bb" (UID: "1049b5a8-2f4b-4797-b484-7a151abab4bb"). InnerVolumeSpecName "kube-api-access-46fgf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 22:22:04 crc kubenswrapper[4919]: I0310 22:22:04.265122 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-46fgf\" (UniqueName: \"kubernetes.io/projected/1049b5a8-2f4b-4797-b484-7a151abab4bb-kube-api-access-46fgf\") on node \"crc\" DevicePath \"\"" Mar 10 22:22:04 crc kubenswrapper[4919]: I0310 22:22:04.784727 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553022-2hnzl" event={"ID":"1049b5a8-2f4b-4797-b484-7a151abab4bb","Type":"ContainerDied","Data":"f5598111cdcda113bcfb3272442eb2a63f969b04d9447d5a90927ee5e828f398"} Mar 10 22:22:04 crc kubenswrapper[4919]: I0310 22:22:04.784771 4919 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f5598111cdcda113bcfb3272442eb2a63f969b04d9447d5a90927ee5e828f398" Mar 10 22:22:04 crc kubenswrapper[4919]: I0310 22:22:04.784809 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553022-2hnzl" Mar 10 22:22:05 crc kubenswrapper[4919]: I0310 22:22:05.147201 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553016-d8blk"] Mar 10 22:22:05 crc kubenswrapper[4919]: I0310 22:22:05.151737 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553016-d8blk"] Mar 10 22:22:05 crc kubenswrapper[4919]: I0310 22:22:05.488227 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d8a7c94-9bde-4dfc-9172-9c116e26b70e" path="/var/lib/kubelet/pods/5d8a7c94-9bde-4dfc-9172-9c116e26b70e/volumes" Mar 10 22:22:43 crc kubenswrapper[4919]: I0310 22:22:43.452192 4919 scope.go:117] "RemoveContainer" containerID="09ab8817bad628ef5a55dd8b4b7607fccf7bbf117c5ce90c546f8cebf64c396d" Mar 10 22:23:29 crc kubenswrapper[4919]: I0310 22:23:29.176368 4919 patch_prober.go:28] interesting pod/machine-config-daemon-z7v4t 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 22:23:29 crc kubenswrapper[4919]: I0310 22:23:29.177050 4919 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 22:23:59 crc kubenswrapper[4919]: I0310 22:23:59.176167 4919 patch_prober.go:28] interesting pod/machine-config-daemon-z7v4t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 22:23:59 crc kubenswrapper[4919]: I0310 22:23:59.177125 4919 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 22:24:00 crc kubenswrapper[4919]: I0310 22:24:00.141202 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553024-l2tsb"] Mar 10 22:24:00 crc kubenswrapper[4919]: E0310 22:24:00.141872 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1049b5a8-2f4b-4797-b484-7a151abab4bb" containerName="oc" Mar 10 22:24:00 crc kubenswrapper[4919]: I0310 22:24:00.141883 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="1049b5a8-2f4b-4797-b484-7a151abab4bb" containerName="oc" Mar 10 22:24:00 crc kubenswrapper[4919]: I0310 22:24:00.142029 4919 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="1049b5a8-2f4b-4797-b484-7a151abab4bb" containerName="oc" Mar 10 22:24:00 crc kubenswrapper[4919]: I0310 22:24:00.142574 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553024-l2tsb" Mar 10 22:24:00 crc kubenswrapper[4919]: I0310 22:24:00.146279 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wbvtv" Mar 10 22:24:00 crc kubenswrapper[4919]: I0310 22:24:00.146370 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 22:24:00 crc kubenswrapper[4919]: I0310 22:24:00.146511 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 22:24:00 crc kubenswrapper[4919]: I0310 22:24:00.163542 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553024-l2tsb"] Mar 10 22:24:00 crc kubenswrapper[4919]: I0310 22:24:00.259710 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kv6l8\" (UniqueName: \"kubernetes.io/projected/0840e0fd-6127-44e1-a420-c0a9107ed81a-kube-api-access-kv6l8\") pod \"auto-csr-approver-29553024-l2tsb\" (UID: \"0840e0fd-6127-44e1-a420-c0a9107ed81a\") " pod="openshift-infra/auto-csr-approver-29553024-l2tsb" Mar 10 22:24:00 crc kubenswrapper[4919]: I0310 22:24:00.360820 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kv6l8\" (UniqueName: \"kubernetes.io/projected/0840e0fd-6127-44e1-a420-c0a9107ed81a-kube-api-access-kv6l8\") pod \"auto-csr-approver-29553024-l2tsb\" (UID: \"0840e0fd-6127-44e1-a420-c0a9107ed81a\") " pod="openshift-infra/auto-csr-approver-29553024-l2tsb" Mar 10 22:24:00 crc kubenswrapper[4919]: I0310 22:24:00.398621 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-kv6l8\" (UniqueName: \"kubernetes.io/projected/0840e0fd-6127-44e1-a420-c0a9107ed81a-kube-api-access-kv6l8\") pod \"auto-csr-approver-29553024-l2tsb\" (UID: \"0840e0fd-6127-44e1-a420-c0a9107ed81a\") " pod="openshift-infra/auto-csr-approver-29553024-l2tsb" Mar 10 22:24:00 crc kubenswrapper[4919]: I0310 22:24:00.470755 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553024-l2tsb" Mar 10 22:24:00 crc kubenswrapper[4919]: I0310 22:24:00.897860 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553024-l2tsb"] Mar 10 22:24:00 crc kubenswrapper[4919]: I0310 22:24:00.916478 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553024-l2tsb" event={"ID":"0840e0fd-6127-44e1-a420-c0a9107ed81a","Type":"ContainerStarted","Data":"dbdde52ab7a0064f2ed2feba852c9fb34f5cacef8f73c7c2c3fa839711688b0b"} Mar 10 22:24:02 crc kubenswrapper[4919]: I0310 22:24:02.936530 4919 generic.go:334] "Generic (PLEG): container finished" podID="0840e0fd-6127-44e1-a420-c0a9107ed81a" containerID="250be72ff0c58f3c17e5f2b0b03e8d7d9c00b768fb62e47d91f28752ab449b81" exitCode=0 Mar 10 22:24:02 crc kubenswrapper[4919]: I0310 22:24:02.936640 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553024-l2tsb" event={"ID":"0840e0fd-6127-44e1-a420-c0a9107ed81a","Type":"ContainerDied","Data":"250be72ff0c58f3c17e5f2b0b03e8d7d9c00b768fb62e47d91f28752ab449b81"} Mar 10 22:24:04 crc kubenswrapper[4919]: I0310 22:24:04.342822 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553024-l2tsb" Mar 10 22:24:04 crc kubenswrapper[4919]: I0310 22:24:04.423950 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kv6l8\" (UniqueName: \"kubernetes.io/projected/0840e0fd-6127-44e1-a420-c0a9107ed81a-kube-api-access-kv6l8\") pod \"0840e0fd-6127-44e1-a420-c0a9107ed81a\" (UID: \"0840e0fd-6127-44e1-a420-c0a9107ed81a\") " Mar 10 22:24:04 crc kubenswrapper[4919]: I0310 22:24:04.440550 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0840e0fd-6127-44e1-a420-c0a9107ed81a-kube-api-access-kv6l8" (OuterVolumeSpecName: "kube-api-access-kv6l8") pod "0840e0fd-6127-44e1-a420-c0a9107ed81a" (UID: "0840e0fd-6127-44e1-a420-c0a9107ed81a"). InnerVolumeSpecName "kube-api-access-kv6l8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 22:24:04 crc kubenswrapper[4919]: I0310 22:24:04.532280 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kv6l8\" (UniqueName: \"kubernetes.io/projected/0840e0fd-6127-44e1-a420-c0a9107ed81a-kube-api-access-kv6l8\") on node \"crc\" DevicePath \"\"" Mar 10 22:24:04 crc kubenswrapper[4919]: I0310 22:24:04.955715 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553024-l2tsb" event={"ID":"0840e0fd-6127-44e1-a420-c0a9107ed81a","Type":"ContainerDied","Data":"dbdde52ab7a0064f2ed2feba852c9fb34f5cacef8f73c7c2c3fa839711688b0b"} Mar 10 22:24:04 crc kubenswrapper[4919]: I0310 22:24:04.956286 4919 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dbdde52ab7a0064f2ed2feba852c9fb34f5cacef8f73c7c2c3fa839711688b0b" Mar 10 22:24:04 crc kubenswrapper[4919]: I0310 22:24:04.955773 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553024-l2tsb" Mar 10 22:24:05 crc kubenswrapper[4919]: I0310 22:24:05.412637 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553018-9b8dk"] Mar 10 22:24:05 crc kubenswrapper[4919]: I0310 22:24:05.417322 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553018-9b8dk"] Mar 10 22:24:05 crc kubenswrapper[4919]: I0310 22:24:05.489144 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="83ace4ea-6c97-47a9-bdc2-bf1d4a75d030" path="/var/lib/kubelet/pods/83ace4ea-6c97-47a9-bdc2-bf1d4a75d030/volumes" Mar 10 22:24:29 crc kubenswrapper[4919]: I0310 22:24:29.175767 4919 patch_prober.go:28] interesting pod/machine-config-daemon-z7v4t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 22:24:29 crc kubenswrapper[4919]: I0310 22:24:29.176550 4919 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 22:24:29 crc kubenswrapper[4919]: I0310 22:24:29.176612 4919 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" Mar 10 22:24:29 crc kubenswrapper[4919]: I0310 22:24:29.177515 4919 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"140c87708c6fcfea09e12588fc5a3cd15890bdc32f129d66ab0f0f4f6ace9d9e"} pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" 
containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 10 22:24:29 crc kubenswrapper[4919]: I0310 22:24:29.177606 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" containerName="machine-config-daemon" containerID="cri-o://140c87708c6fcfea09e12588fc5a3cd15890bdc32f129d66ab0f0f4f6ace9d9e" gracePeriod=600 Mar 10 22:24:30 crc kubenswrapper[4919]: I0310 22:24:30.149353 4919 generic.go:334] "Generic (PLEG): container finished" podID="566678d1-f416-4116-ab20-b30dceb86cdc" containerID="140c87708c6fcfea09e12588fc5a3cd15890bdc32f129d66ab0f0f4f6ace9d9e" exitCode=0 Mar 10 22:24:30 crc kubenswrapper[4919]: I0310 22:24:30.149485 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" event={"ID":"566678d1-f416-4116-ab20-b30dceb86cdc","Type":"ContainerDied","Data":"140c87708c6fcfea09e12588fc5a3cd15890bdc32f129d66ab0f0f4f6ace9d9e"} Mar 10 22:24:30 crc kubenswrapper[4919]: I0310 22:24:30.149785 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" event={"ID":"566678d1-f416-4116-ab20-b30dceb86cdc","Type":"ContainerStarted","Data":"6bf3588128e568c16cb871e80818127ef0aaa14fe7758988393e9de44935b23b"} Mar 10 22:24:30 crc kubenswrapper[4919]: I0310 22:24:30.149811 4919 scope.go:117] "RemoveContainer" containerID="fce2ab31f6ae341422fcdee59d32194b41ef9122dd92f9a1264d329e9e490637" Mar 10 22:24:43 crc kubenswrapper[4919]: I0310 22:24:43.555062 4919 scope.go:117] "RemoveContainer" containerID="cbef5380b0f4b3c785f97e656aac3c8c7514482b21b0c4b038d3103bd0f07797" Mar 10 22:25:14 crc kubenswrapper[4919]: I0310 22:25:14.985050 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-w5gcl"] Mar 10 22:25:14 crc kubenswrapper[4919]: E0310 
22:25:14.991165 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0840e0fd-6127-44e1-a420-c0a9107ed81a" containerName="oc" Mar 10 22:25:14 crc kubenswrapper[4919]: I0310 22:25:14.991224 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="0840e0fd-6127-44e1-a420-c0a9107ed81a" containerName="oc" Mar 10 22:25:14 crc kubenswrapper[4919]: I0310 22:25:14.991624 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="0840e0fd-6127-44e1-a420-c0a9107ed81a" containerName="oc" Mar 10 22:25:14 crc kubenswrapper[4919]: I0310 22:25:14.993988 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-w5gcl" Mar 10 22:25:15 crc kubenswrapper[4919]: I0310 22:25:15.003018 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-w5gcl"] Mar 10 22:25:15 crc kubenswrapper[4919]: I0310 22:25:15.137733 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6580e1d-0db8-4e06-9690-4fca67b2604a-utilities\") pod \"redhat-operators-w5gcl\" (UID: \"f6580e1d-0db8-4e06-9690-4fca67b2604a\") " pod="openshift-marketplace/redhat-operators-w5gcl" Mar 10 22:25:15 crc kubenswrapper[4919]: I0310 22:25:15.137794 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4lwg\" (UniqueName: \"kubernetes.io/projected/f6580e1d-0db8-4e06-9690-4fca67b2604a-kube-api-access-m4lwg\") pod \"redhat-operators-w5gcl\" (UID: \"f6580e1d-0db8-4e06-9690-4fca67b2604a\") " pod="openshift-marketplace/redhat-operators-w5gcl" Mar 10 22:25:15 crc kubenswrapper[4919]: I0310 22:25:15.137888 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6580e1d-0db8-4e06-9690-4fca67b2604a-catalog-content\") pod 
\"redhat-operators-w5gcl\" (UID: \"f6580e1d-0db8-4e06-9690-4fca67b2604a\") " pod="openshift-marketplace/redhat-operators-w5gcl" Mar 10 22:25:15 crc kubenswrapper[4919]: I0310 22:25:15.170780 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-zrb9p"] Mar 10 22:25:15 crc kubenswrapper[4919]: I0310 22:25:15.173157 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zrb9p" Mar 10 22:25:15 crc kubenswrapper[4919]: I0310 22:25:15.181663 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zrb9p"] Mar 10 22:25:15 crc kubenswrapper[4919]: I0310 22:25:15.238886 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6580e1d-0db8-4e06-9690-4fca67b2604a-catalog-content\") pod \"redhat-operators-w5gcl\" (UID: \"f6580e1d-0db8-4e06-9690-4fca67b2604a\") " pod="openshift-marketplace/redhat-operators-w5gcl" Mar 10 22:25:15 crc kubenswrapper[4919]: I0310 22:25:15.239042 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6580e1d-0db8-4e06-9690-4fca67b2604a-utilities\") pod \"redhat-operators-w5gcl\" (UID: \"f6580e1d-0db8-4e06-9690-4fca67b2604a\") " pod="openshift-marketplace/redhat-operators-w5gcl" Mar 10 22:25:15 crc kubenswrapper[4919]: I0310 22:25:15.239096 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m4lwg\" (UniqueName: \"kubernetes.io/projected/f6580e1d-0db8-4e06-9690-4fca67b2604a-kube-api-access-m4lwg\") pod \"redhat-operators-w5gcl\" (UID: \"f6580e1d-0db8-4e06-9690-4fca67b2604a\") " pod="openshift-marketplace/redhat-operators-w5gcl" Mar 10 22:25:15 crc kubenswrapper[4919]: I0310 22:25:15.239749 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/f6580e1d-0db8-4e06-9690-4fca67b2604a-utilities\") pod \"redhat-operators-w5gcl\" (UID: \"f6580e1d-0db8-4e06-9690-4fca67b2604a\") " pod="openshift-marketplace/redhat-operators-w5gcl" Mar 10 22:25:15 crc kubenswrapper[4919]: I0310 22:25:15.239869 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6580e1d-0db8-4e06-9690-4fca67b2604a-catalog-content\") pod \"redhat-operators-w5gcl\" (UID: \"f6580e1d-0db8-4e06-9690-4fca67b2604a\") " pod="openshift-marketplace/redhat-operators-w5gcl" Mar 10 22:25:15 crc kubenswrapper[4919]: I0310 22:25:15.268724 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4lwg\" (UniqueName: \"kubernetes.io/projected/f6580e1d-0db8-4e06-9690-4fca67b2604a-kube-api-access-m4lwg\") pod \"redhat-operators-w5gcl\" (UID: \"f6580e1d-0db8-4e06-9690-4fca67b2604a\") " pod="openshift-marketplace/redhat-operators-w5gcl" Mar 10 22:25:15 crc kubenswrapper[4919]: I0310 22:25:15.338065 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-w5gcl" Mar 10 22:25:15 crc kubenswrapper[4919]: I0310 22:25:15.340059 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79b4ee48-1e52-4bef-a298-31cba2ec8539-catalog-content\") pod \"community-operators-zrb9p\" (UID: \"79b4ee48-1e52-4bef-a298-31cba2ec8539\") " pod="openshift-marketplace/community-operators-zrb9p" Mar 10 22:25:15 crc kubenswrapper[4919]: I0310 22:25:15.340133 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tg49k\" (UniqueName: \"kubernetes.io/projected/79b4ee48-1e52-4bef-a298-31cba2ec8539-kube-api-access-tg49k\") pod \"community-operators-zrb9p\" (UID: \"79b4ee48-1e52-4bef-a298-31cba2ec8539\") " pod="openshift-marketplace/community-operators-zrb9p" Mar 10 22:25:15 crc kubenswrapper[4919]: I0310 22:25:15.340227 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79b4ee48-1e52-4bef-a298-31cba2ec8539-utilities\") pod \"community-operators-zrb9p\" (UID: \"79b4ee48-1e52-4bef-a298-31cba2ec8539\") " pod="openshift-marketplace/community-operators-zrb9p" Mar 10 22:25:15 crc kubenswrapper[4919]: I0310 22:25:15.441523 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tg49k\" (UniqueName: \"kubernetes.io/projected/79b4ee48-1e52-4bef-a298-31cba2ec8539-kube-api-access-tg49k\") pod \"community-operators-zrb9p\" (UID: \"79b4ee48-1e52-4bef-a298-31cba2ec8539\") " pod="openshift-marketplace/community-operators-zrb9p" Mar 10 22:25:15 crc kubenswrapper[4919]: I0310 22:25:15.441956 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79b4ee48-1e52-4bef-a298-31cba2ec8539-utilities\") pod 
\"community-operators-zrb9p\" (UID: \"79b4ee48-1e52-4bef-a298-31cba2ec8539\") " pod="openshift-marketplace/community-operators-zrb9p" Mar 10 22:25:15 crc kubenswrapper[4919]: I0310 22:25:15.442042 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79b4ee48-1e52-4bef-a298-31cba2ec8539-catalog-content\") pod \"community-operators-zrb9p\" (UID: \"79b4ee48-1e52-4bef-a298-31cba2ec8539\") " pod="openshift-marketplace/community-operators-zrb9p" Mar 10 22:25:15 crc kubenswrapper[4919]: I0310 22:25:15.442519 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79b4ee48-1e52-4bef-a298-31cba2ec8539-utilities\") pod \"community-operators-zrb9p\" (UID: \"79b4ee48-1e52-4bef-a298-31cba2ec8539\") " pod="openshift-marketplace/community-operators-zrb9p" Mar 10 22:25:15 crc kubenswrapper[4919]: I0310 22:25:15.442621 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79b4ee48-1e52-4bef-a298-31cba2ec8539-catalog-content\") pod \"community-operators-zrb9p\" (UID: \"79b4ee48-1e52-4bef-a298-31cba2ec8539\") " pod="openshift-marketplace/community-operators-zrb9p" Mar 10 22:25:15 crc kubenswrapper[4919]: I0310 22:25:15.480519 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tg49k\" (UniqueName: \"kubernetes.io/projected/79b4ee48-1e52-4bef-a298-31cba2ec8539-kube-api-access-tg49k\") pod \"community-operators-zrb9p\" (UID: \"79b4ee48-1e52-4bef-a298-31cba2ec8539\") " pod="openshift-marketplace/community-operators-zrb9p" Mar 10 22:25:15 crc kubenswrapper[4919]: I0310 22:25:15.496768 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-zrb9p" Mar 10 22:25:15 crc kubenswrapper[4919]: I0310 22:25:15.803473 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-w5gcl"] Mar 10 22:25:16 crc kubenswrapper[4919]: I0310 22:25:16.062008 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zrb9p"] Mar 10 22:25:16 crc kubenswrapper[4919]: W0310 22:25:16.065892 4919 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod79b4ee48_1e52_4bef_a298_31cba2ec8539.slice/crio-162133f8d8c0462447d117d0f7436be94d9e464c9519604f41570c2efff9e870 WatchSource:0}: Error finding container 162133f8d8c0462447d117d0f7436be94d9e464c9519604f41570c2efff9e870: Status 404 returned error can't find the container with id 162133f8d8c0462447d117d0f7436be94d9e464c9519604f41570c2efff9e870 Mar 10 22:25:16 crc kubenswrapper[4919]: I0310 22:25:16.553296 4919 generic.go:334] "Generic (PLEG): container finished" podID="79b4ee48-1e52-4bef-a298-31cba2ec8539" containerID="e48dee694713b16b033070e980d558f914b922ea8481bebb4c389e2877eac812" exitCode=0 Mar 10 22:25:16 crc kubenswrapper[4919]: I0310 22:25:16.553339 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zrb9p" event={"ID":"79b4ee48-1e52-4bef-a298-31cba2ec8539","Type":"ContainerDied","Data":"e48dee694713b16b033070e980d558f914b922ea8481bebb4c389e2877eac812"} Mar 10 22:25:16 crc kubenswrapper[4919]: I0310 22:25:16.553649 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zrb9p" event={"ID":"79b4ee48-1e52-4bef-a298-31cba2ec8539","Type":"ContainerStarted","Data":"162133f8d8c0462447d117d0f7436be94d9e464c9519604f41570c2efff9e870"} Mar 10 22:25:16 crc kubenswrapper[4919]: I0310 22:25:16.557445 4919 generic.go:334] "Generic (PLEG): container finished" 
podID="f6580e1d-0db8-4e06-9690-4fca67b2604a" containerID="74266c7c9778968256a9e50e8136a5d90c86f6849cefb08cbe0caa24c298d63b" exitCode=0 Mar 10 22:25:16 crc kubenswrapper[4919]: I0310 22:25:16.557481 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w5gcl" event={"ID":"f6580e1d-0db8-4e06-9690-4fca67b2604a","Type":"ContainerDied","Data":"74266c7c9778968256a9e50e8136a5d90c86f6849cefb08cbe0caa24c298d63b"} Mar 10 22:25:16 crc kubenswrapper[4919]: I0310 22:25:16.557528 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w5gcl" event={"ID":"f6580e1d-0db8-4e06-9690-4fca67b2604a","Type":"ContainerStarted","Data":"ccb9b4e21b84aaf08b49d06b4b2c968092d62b1d18526e0ea92783ea190fe7f6"} Mar 10 22:25:17 crc kubenswrapper[4919]: I0310 22:25:17.384275 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-wsb7s"] Mar 10 22:25:17 crc kubenswrapper[4919]: I0310 22:25:17.394015 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wsb7s" Mar 10 22:25:17 crc kubenswrapper[4919]: I0310 22:25:17.396071 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wsb7s"] Mar 10 22:25:17 crc kubenswrapper[4919]: I0310 22:25:17.475370 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75735232-0764-4c66-a626-ac25c42dd73d-utilities\") pod \"redhat-marketplace-wsb7s\" (UID: \"75735232-0764-4c66-a626-ac25c42dd73d\") " pod="openshift-marketplace/redhat-marketplace-wsb7s" Mar 10 22:25:17 crc kubenswrapper[4919]: I0310 22:25:17.475689 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75735232-0764-4c66-a626-ac25c42dd73d-catalog-content\") pod \"redhat-marketplace-wsb7s\" (UID: \"75735232-0764-4c66-a626-ac25c42dd73d\") " pod="openshift-marketplace/redhat-marketplace-wsb7s" Mar 10 22:25:17 crc kubenswrapper[4919]: I0310 22:25:17.475851 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6sb9c\" (UniqueName: \"kubernetes.io/projected/75735232-0764-4c66-a626-ac25c42dd73d-kube-api-access-6sb9c\") pod \"redhat-marketplace-wsb7s\" (UID: \"75735232-0764-4c66-a626-ac25c42dd73d\") " pod="openshift-marketplace/redhat-marketplace-wsb7s" Mar 10 22:25:17 crc kubenswrapper[4919]: I0310 22:25:17.565870 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zrb9p" event={"ID":"79b4ee48-1e52-4bef-a298-31cba2ec8539","Type":"ContainerStarted","Data":"a104f724b099ae36b1988c255bc4057bd272b688f0a05bffb17fcae30ed88f46"} Mar 10 22:25:17 crc kubenswrapper[4919]: I0310 22:25:17.578645 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6sb9c\" (UniqueName: 
\"kubernetes.io/projected/75735232-0764-4c66-a626-ac25c42dd73d-kube-api-access-6sb9c\") pod \"redhat-marketplace-wsb7s\" (UID: \"75735232-0764-4c66-a626-ac25c42dd73d\") " pod="openshift-marketplace/redhat-marketplace-wsb7s" Mar 10 22:25:17 crc kubenswrapper[4919]: I0310 22:25:17.578914 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75735232-0764-4c66-a626-ac25c42dd73d-utilities\") pod \"redhat-marketplace-wsb7s\" (UID: \"75735232-0764-4c66-a626-ac25c42dd73d\") " pod="openshift-marketplace/redhat-marketplace-wsb7s" Mar 10 22:25:17 crc kubenswrapper[4919]: I0310 22:25:17.579067 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75735232-0764-4c66-a626-ac25c42dd73d-catalog-content\") pod \"redhat-marketplace-wsb7s\" (UID: \"75735232-0764-4c66-a626-ac25c42dd73d\") " pod="openshift-marketplace/redhat-marketplace-wsb7s" Mar 10 22:25:17 crc kubenswrapper[4919]: I0310 22:25:17.579495 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75735232-0764-4c66-a626-ac25c42dd73d-utilities\") pod \"redhat-marketplace-wsb7s\" (UID: \"75735232-0764-4c66-a626-ac25c42dd73d\") " pod="openshift-marketplace/redhat-marketplace-wsb7s" Mar 10 22:25:17 crc kubenswrapper[4919]: I0310 22:25:17.579723 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75735232-0764-4c66-a626-ac25c42dd73d-catalog-content\") pod \"redhat-marketplace-wsb7s\" (UID: \"75735232-0764-4c66-a626-ac25c42dd73d\") " pod="openshift-marketplace/redhat-marketplace-wsb7s" Mar 10 22:25:17 crc kubenswrapper[4919]: I0310 22:25:17.603417 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6sb9c\" (UniqueName: 
\"kubernetes.io/projected/75735232-0764-4c66-a626-ac25c42dd73d-kube-api-access-6sb9c\") pod \"redhat-marketplace-wsb7s\" (UID: \"75735232-0764-4c66-a626-ac25c42dd73d\") " pod="openshift-marketplace/redhat-marketplace-wsb7s" Mar 10 22:25:17 crc kubenswrapper[4919]: I0310 22:25:17.713368 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wsb7s" Mar 10 22:25:18 crc kubenswrapper[4919]: I0310 22:25:18.178176 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wsb7s"] Mar 10 22:25:18 crc kubenswrapper[4919]: W0310 22:25:18.188316 4919 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod75735232_0764_4c66_a626_ac25c42dd73d.slice/crio-37b8f74071987f5ce3db72fb837769f29355c40cf4b324cd2a7fbd8737a693ac WatchSource:0}: Error finding container 37b8f74071987f5ce3db72fb837769f29355c40cf4b324cd2a7fbd8737a693ac: Status 404 returned error can't find the container with id 37b8f74071987f5ce3db72fb837769f29355c40cf4b324cd2a7fbd8737a693ac Mar 10 22:25:18 crc kubenswrapper[4919]: I0310 22:25:18.573928 4919 generic.go:334] "Generic (PLEG): container finished" podID="79b4ee48-1e52-4bef-a298-31cba2ec8539" containerID="a104f724b099ae36b1988c255bc4057bd272b688f0a05bffb17fcae30ed88f46" exitCode=0 Mar 10 22:25:18 crc kubenswrapper[4919]: I0310 22:25:18.573970 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zrb9p" event={"ID":"79b4ee48-1e52-4bef-a298-31cba2ec8539","Type":"ContainerDied","Data":"a104f724b099ae36b1988c255bc4057bd272b688f0a05bffb17fcae30ed88f46"} Mar 10 22:25:18 crc kubenswrapper[4919]: I0310 22:25:18.577014 4919 generic.go:334] "Generic (PLEG): container finished" podID="75735232-0764-4c66-a626-ac25c42dd73d" containerID="bba37ee15295a26659601587a9867c970e9f0c56006cb51d8711345e289c6956" exitCode=0 Mar 10 22:25:18 crc 
kubenswrapper[4919]: I0310 22:25:18.577053 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wsb7s" event={"ID":"75735232-0764-4c66-a626-ac25c42dd73d","Type":"ContainerDied","Data":"bba37ee15295a26659601587a9867c970e9f0c56006cb51d8711345e289c6956"} Mar 10 22:25:18 crc kubenswrapper[4919]: I0310 22:25:18.577076 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wsb7s" event={"ID":"75735232-0764-4c66-a626-ac25c42dd73d","Type":"ContainerStarted","Data":"37b8f74071987f5ce3db72fb837769f29355c40cf4b324cd2a7fbd8737a693ac"} Mar 10 22:25:19 crc kubenswrapper[4919]: I0310 22:25:19.588584 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zrb9p" event={"ID":"79b4ee48-1e52-4bef-a298-31cba2ec8539","Type":"ContainerStarted","Data":"dad8b8269dd707827e9ee966269f505a04190e0922807e317e98d31f74855e51"} Mar 10 22:25:19 crc kubenswrapper[4919]: I0310 22:25:19.606677 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-zrb9p" podStartSLOduration=2.172784634 podStartE2EDuration="4.606660642s" podCreationTimestamp="2026-03-10 22:25:15 +0000 UTC" firstStartedPulling="2026-03-10 22:25:16.555500015 +0000 UTC m=+2103.797380623" lastFinishedPulling="2026-03-10 22:25:18.989376033 +0000 UTC m=+2106.231256631" observedRunningTime="2026-03-10 22:25:19.604595285 +0000 UTC m=+2106.846475893" watchObservedRunningTime="2026-03-10 22:25:19.606660642 +0000 UTC m=+2106.848541270" Mar 10 22:25:20 crc kubenswrapper[4919]: I0310 22:25:20.601651 4919 generic.go:334] "Generic (PLEG): container finished" podID="75735232-0764-4c66-a626-ac25c42dd73d" containerID="2dad3289e547c85ccd5f601595013d8aac964a96b4395643df43951d0bade8d2" exitCode=0 Mar 10 22:25:20 crc kubenswrapper[4919]: I0310 22:25:20.602795 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-wsb7s" event={"ID":"75735232-0764-4c66-a626-ac25c42dd73d","Type":"ContainerDied","Data":"2dad3289e547c85ccd5f601595013d8aac964a96b4395643df43951d0bade8d2"} Mar 10 22:25:24 crc kubenswrapper[4919]: I0310 22:25:24.630382 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w5gcl" event={"ID":"f6580e1d-0db8-4e06-9690-4fca67b2604a","Type":"ContainerStarted","Data":"175db2e35f01c85f14cc5226036d9547c8175dd4cf21ec532ad4595d83b65859"} Mar 10 22:25:24 crc kubenswrapper[4919]: I0310 22:25:24.632707 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wsb7s" event={"ID":"75735232-0764-4c66-a626-ac25c42dd73d","Type":"ContainerStarted","Data":"972407440e38160c564170265cc138b88e1e3f836a93dbbab439601bf4389add"} Mar 10 22:25:24 crc kubenswrapper[4919]: I0310 22:25:24.671318 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-wsb7s" podStartSLOduration=2.199691212 podStartE2EDuration="7.671300473s" podCreationTimestamp="2026-03-10 22:25:17 +0000 UTC" firstStartedPulling="2026-03-10 22:25:18.579490185 +0000 UTC m=+2105.821370793" lastFinishedPulling="2026-03-10 22:25:24.051099446 +0000 UTC m=+2111.292980054" observedRunningTime="2026-03-10 22:25:24.666697309 +0000 UTC m=+2111.908577927" watchObservedRunningTime="2026-03-10 22:25:24.671300473 +0000 UTC m=+2111.913181081" Mar 10 22:25:25 crc kubenswrapper[4919]: I0310 22:25:25.497267 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-zrb9p" Mar 10 22:25:25 crc kubenswrapper[4919]: I0310 22:25:25.497317 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-zrb9p" Mar 10 22:25:25 crc kubenswrapper[4919]: I0310 22:25:25.550731 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/community-operators-zrb9p" Mar 10 22:25:25 crc kubenswrapper[4919]: I0310 22:25:25.642362 4919 generic.go:334] "Generic (PLEG): container finished" podID="f6580e1d-0db8-4e06-9690-4fca67b2604a" containerID="175db2e35f01c85f14cc5226036d9547c8175dd4cf21ec532ad4595d83b65859" exitCode=0 Mar 10 22:25:25 crc kubenswrapper[4919]: I0310 22:25:25.642481 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w5gcl" event={"ID":"f6580e1d-0db8-4e06-9690-4fca67b2604a","Type":"ContainerDied","Data":"175db2e35f01c85f14cc5226036d9547c8175dd4cf21ec532ad4595d83b65859"} Mar 10 22:25:25 crc kubenswrapper[4919]: I0310 22:25:25.686022 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-zrb9p" Mar 10 22:25:26 crc kubenswrapper[4919]: I0310 22:25:26.159527 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zrb9p"] Mar 10 22:25:27 crc kubenswrapper[4919]: I0310 22:25:27.656353 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-zrb9p" podUID="79b4ee48-1e52-4bef-a298-31cba2ec8539" containerName="registry-server" containerID="cri-o://dad8b8269dd707827e9ee966269f505a04190e0922807e317e98d31f74855e51" gracePeriod=2 Mar 10 22:25:27 crc kubenswrapper[4919]: I0310 22:25:27.713998 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-wsb7s" Mar 10 22:25:27 crc kubenswrapper[4919]: I0310 22:25:27.714080 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-wsb7s" Mar 10 22:25:27 crc kubenswrapper[4919]: I0310 22:25:27.782857 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-wsb7s" Mar 10 22:25:28 crc kubenswrapper[4919]: I0310 22:25:28.664486 4919 
generic.go:334] "Generic (PLEG): container finished" podID="79b4ee48-1e52-4bef-a298-31cba2ec8539" containerID="dad8b8269dd707827e9ee966269f505a04190e0922807e317e98d31f74855e51" exitCode=0 Mar 10 22:25:28 crc kubenswrapper[4919]: I0310 22:25:28.665169 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zrb9p" event={"ID":"79b4ee48-1e52-4bef-a298-31cba2ec8539","Type":"ContainerDied","Data":"dad8b8269dd707827e9ee966269f505a04190e0922807e317e98d31f74855e51"} Mar 10 22:25:29 crc kubenswrapper[4919]: I0310 22:25:29.074075 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zrb9p" Mar 10 22:25:29 crc kubenswrapper[4919]: I0310 22:25:29.247359 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tg49k\" (UniqueName: \"kubernetes.io/projected/79b4ee48-1e52-4bef-a298-31cba2ec8539-kube-api-access-tg49k\") pod \"79b4ee48-1e52-4bef-a298-31cba2ec8539\" (UID: \"79b4ee48-1e52-4bef-a298-31cba2ec8539\") " Mar 10 22:25:29 crc kubenswrapper[4919]: I0310 22:25:29.247730 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79b4ee48-1e52-4bef-a298-31cba2ec8539-utilities\") pod \"79b4ee48-1e52-4bef-a298-31cba2ec8539\" (UID: \"79b4ee48-1e52-4bef-a298-31cba2ec8539\") " Mar 10 22:25:29 crc kubenswrapper[4919]: I0310 22:25:29.247772 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79b4ee48-1e52-4bef-a298-31cba2ec8539-catalog-content\") pod \"79b4ee48-1e52-4bef-a298-31cba2ec8539\" (UID: \"79b4ee48-1e52-4bef-a298-31cba2ec8539\") " Mar 10 22:25:29 crc kubenswrapper[4919]: I0310 22:25:29.248639 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/79b4ee48-1e52-4bef-a298-31cba2ec8539-utilities" 
(OuterVolumeSpecName: "utilities") pod "79b4ee48-1e52-4bef-a298-31cba2ec8539" (UID: "79b4ee48-1e52-4bef-a298-31cba2ec8539"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 22:25:29 crc kubenswrapper[4919]: I0310 22:25:29.255665 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79b4ee48-1e52-4bef-a298-31cba2ec8539-kube-api-access-tg49k" (OuterVolumeSpecName: "kube-api-access-tg49k") pod "79b4ee48-1e52-4bef-a298-31cba2ec8539" (UID: "79b4ee48-1e52-4bef-a298-31cba2ec8539"). InnerVolumeSpecName "kube-api-access-tg49k". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 22:25:29 crc kubenswrapper[4919]: I0310 22:25:29.299454 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/79b4ee48-1e52-4bef-a298-31cba2ec8539-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "79b4ee48-1e52-4bef-a298-31cba2ec8539" (UID: "79b4ee48-1e52-4bef-a298-31cba2ec8539"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 22:25:29 crc kubenswrapper[4919]: I0310 22:25:29.351689 4919 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79b4ee48-1e52-4bef-a298-31cba2ec8539-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 22:25:29 crc kubenswrapper[4919]: I0310 22:25:29.351784 4919 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79b4ee48-1e52-4bef-a298-31cba2ec8539-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 22:25:29 crc kubenswrapper[4919]: I0310 22:25:29.351808 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tg49k\" (UniqueName: \"kubernetes.io/projected/79b4ee48-1e52-4bef-a298-31cba2ec8539-kube-api-access-tg49k\") on node \"crc\" DevicePath \"\"" Mar 10 22:25:29 crc kubenswrapper[4919]: I0310 22:25:29.674080 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zrb9p" event={"ID":"79b4ee48-1e52-4bef-a298-31cba2ec8539","Type":"ContainerDied","Data":"162133f8d8c0462447d117d0f7436be94d9e464c9519604f41570c2efff9e870"} Mar 10 22:25:29 crc kubenswrapper[4919]: I0310 22:25:29.674153 4919 scope.go:117] "RemoveContainer" containerID="dad8b8269dd707827e9ee966269f505a04190e0922807e317e98d31f74855e51" Mar 10 22:25:29 crc kubenswrapper[4919]: I0310 22:25:29.674142 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-zrb9p" Mar 10 22:25:29 crc kubenswrapper[4919]: I0310 22:25:29.676590 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w5gcl" event={"ID":"f6580e1d-0db8-4e06-9690-4fca67b2604a","Type":"ContainerStarted","Data":"938aaf22d2676ef28891bbe9112809b1a1dd8976711a1a1f87d1228f98fad007"} Mar 10 22:25:29 crc kubenswrapper[4919]: I0310 22:25:29.689269 4919 scope.go:117] "RemoveContainer" containerID="a104f724b099ae36b1988c255bc4057bd272b688f0a05bffb17fcae30ed88f46" Mar 10 22:25:29 crc kubenswrapper[4919]: I0310 22:25:29.717454 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-w5gcl" podStartSLOduration=3.44053546 podStartE2EDuration="15.717424633s" podCreationTimestamp="2026-03-10 22:25:14 +0000 UTC" firstStartedPulling="2026-03-10 22:25:16.55900159 +0000 UTC m=+2103.800882198" lastFinishedPulling="2026-03-10 22:25:28.835890723 +0000 UTC m=+2116.077771371" observedRunningTime="2026-03-10 22:25:29.703312631 +0000 UTC m=+2116.945193239" watchObservedRunningTime="2026-03-10 22:25:29.717424633 +0000 UTC m=+2116.959305251" Mar 10 22:25:29 crc kubenswrapper[4919]: I0310 22:25:29.722928 4919 scope.go:117] "RemoveContainer" containerID="e48dee694713b16b033070e980d558f914b922ea8481bebb4c389e2877eac812" Mar 10 22:25:29 crc kubenswrapper[4919]: I0310 22:25:29.725634 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zrb9p"] Mar 10 22:25:29 crc kubenswrapper[4919]: I0310 22:25:29.732945 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-zrb9p"] Mar 10 22:25:31 crc kubenswrapper[4919]: I0310 22:25:31.493719 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79b4ee48-1e52-4bef-a298-31cba2ec8539" path="/var/lib/kubelet/pods/79b4ee48-1e52-4bef-a298-31cba2ec8539/volumes" Mar 10 
22:25:35 crc kubenswrapper[4919]: I0310 22:25:35.338704 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-w5gcl"
Mar 10 22:25:35 crc kubenswrapper[4919]: I0310 22:25:35.339062 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-w5gcl"
Mar 10 22:25:36 crc kubenswrapper[4919]: I0310 22:25:36.378442 4919 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-w5gcl" podUID="f6580e1d-0db8-4e06-9690-4fca67b2604a" containerName="registry-server" probeResult="failure" output=<
Mar 10 22:25:36 crc kubenswrapper[4919]: timeout: failed to connect service ":50051" within 1s
Mar 10 22:25:36 crc kubenswrapper[4919]: >
Mar 10 22:25:37 crc kubenswrapper[4919]: I0310 22:25:37.770016 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-wsb7s"
Mar 10 22:25:37 crc kubenswrapper[4919]: I0310 22:25:37.829256 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wsb7s"]
Mar 10 22:25:38 crc kubenswrapper[4919]: I0310 22:25:38.743879 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-wsb7s" podUID="75735232-0764-4c66-a626-ac25c42dd73d" containerName="registry-server" containerID="cri-o://972407440e38160c564170265cc138b88e1e3f836a93dbbab439601bf4389add" gracePeriod=2
Mar 10 22:25:39 crc kubenswrapper[4919]: I0310 22:25:39.162444 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wsb7s"
Mar 10 22:25:39 crc kubenswrapper[4919]: I0310 22:25:39.293480 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75735232-0764-4c66-a626-ac25c42dd73d-catalog-content\") pod \"75735232-0764-4c66-a626-ac25c42dd73d\" (UID: \"75735232-0764-4c66-a626-ac25c42dd73d\") "
Mar 10 22:25:39 crc kubenswrapper[4919]: I0310 22:25:39.293542 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75735232-0764-4c66-a626-ac25c42dd73d-utilities\") pod \"75735232-0764-4c66-a626-ac25c42dd73d\" (UID: \"75735232-0764-4c66-a626-ac25c42dd73d\") "
Mar 10 22:25:39 crc kubenswrapper[4919]: I0310 22:25:39.293594 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6sb9c\" (UniqueName: \"kubernetes.io/projected/75735232-0764-4c66-a626-ac25c42dd73d-kube-api-access-6sb9c\") pod \"75735232-0764-4c66-a626-ac25c42dd73d\" (UID: \"75735232-0764-4c66-a626-ac25c42dd73d\") "
Mar 10 22:25:39 crc kubenswrapper[4919]: I0310 22:25:39.295151 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/75735232-0764-4c66-a626-ac25c42dd73d-utilities" (OuterVolumeSpecName: "utilities") pod "75735232-0764-4c66-a626-ac25c42dd73d" (UID: "75735232-0764-4c66-a626-ac25c42dd73d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 22:25:39 crc kubenswrapper[4919]: I0310 22:25:39.299996 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75735232-0764-4c66-a626-ac25c42dd73d-kube-api-access-6sb9c" (OuterVolumeSpecName: "kube-api-access-6sb9c") pod "75735232-0764-4c66-a626-ac25c42dd73d" (UID: "75735232-0764-4c66-a626-ac25c42dd73d"). InnerVolumeSpecName "kube-api-access-6sb9c". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 22:25:39 crc kubenswrapper[4919]: I0310 22:25:39.320939 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/75735232-0764-4c66-a626-ac25c42dd73d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "75735232-0764-4c66-a626-ac25c42dd73d" (UID: "75735232-0764-4c66-a626-ac25c42dd73d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 22:25:39 crc kubenswrapper[4919]: I0310 22:25:39.395645 4919 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75735232-0764-4c66-a626-ac25c42dd73d-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 10 22:25:39 crc kubenswrapper[4919]: I0310 22:25:39.395907 4919 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75735232-0764-4c66-a626-ac25c42dd73d-utilities\") on node \"crc\" DevicePath \"\""
Mar 10 22:25:39 crc kubenswrapper[4919]: I0310 22:25:39.395971 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6sb9c\" (UniqueName: \"kubernetes.io/projected/75735232-0764-4c66-a626-ac25c42dd73d-kube-api-access-6sb9c\") on node \"crc\" DevicePath \"\""
Mar 10 22:25:39 crc kubenswrapper[4919]: I0310 22:25:39.753638 4919 generic.go:334] "Generic (PLEG): container finished" podID="75735232-0764-4c66-a626-ac25c42dd73d" containerID="972407440e38160c564170265cc138b88e1e3f836a93dbbab439601bf4389add" exitCode=0
Mar 10 22:25:39 crc kubenswrapper[4919]: I0310 22:25:39.753750 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wsb7s"
Mar 10 22:25:39 crc kubenswrapper[4919]: I0310 22:25:39.753736 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wsb7s" event={"ID":"75735232-0764-4c66-a626-ac25c42dd73d","Type":"ContainerDied","Data":"972407440e38160c564170265cc138b88e1e3f836a93dbbab439601bf4389add"}
Mar 10 22:25:39 crc kubenswrapper[4919]: I0310 22:25:39.754213 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wsb7s" event={"ID":"75735232-0764-4c66-a626-ac25c42dd73d","Type":"ContainerDied","Data":"37b8f74071987f5ce3db72fb837769f29355c40cf4b324cd2a7fbd8737a693ac"}
Mar 10 22:25:39 crc kubenswrapper[4919]: I0310 22:25:39.754281 4919 scope.go:117] "RemoveContainer" containerID="972407440e38160c564170265cc138b88e1e3f836a93dbbab439601bf4389add"
Mar 10 22:25:39 crc kubenswrapper[4919]: I0310 22:25:39.791623 4919 scope.go:117] "RemoveContainer" containerID="2dad3289e547c85ccd5f601595013d8aac964a96b4395643df43951d0bade8d2"
Mar 10 22:25:39 crc kubenswrapper[4919]: I0310 22:25:39.792537 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wsb7s"]
Mar 10 22:25:39 crc kubenswrapper[4919]: I0310 22:25:39.804454 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-wsb7s"]
Mar 10 22:25:39 crc kubenswrapper[4919]: I0310 22:25:39.821866 4919 scope.go:117] "RemoveContainer" containerID="bba37ee15295a26659601587a9867c970e9f0c56006cb51d8711345e289c6956"
Mar 10 22:25:39 crc kubenswrapper[4919]: I0310 22:25:39.859309 4919 scope.go:117] "RemoveContainer" containerID="972407440e38160c564170265cc138b88e1e3f836a93dbbab439601bf4389add"
Mar 10 22:25:39 crc kubenswrapper[4919]: E0310 22:25:39.860462 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"972407440e38160c564170265cc138b88e1e3f836a93dbbab439601bf4389add\": container with ID starting with 972407440e38160c564170265cc138b88e1e3f836a93dbbab439601bf4389add not found: ID does not exist" containerID="972407440e38160c564170265cc138b88e1e3f836a93dbbab439601bf4389add"
Mar 10 22:25:39 crc kubenswrapper[4919]: I0310 22:25:39.860553 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"972407440e38160c564170265cc138b88e1e3f836a93dbbab439601bf4389add"} err="failed to get container status \"972407440e38160c564170265cc138b88e1e3f836a93dbbab439601bf4389add\": rpc error: code = NotFound desc = could not find container \"972407440e38160c564170265cc138b88e1e3f836a93dbbab439601bf4389add\": container with ID starting with 972407440e38160c564170265cc138b88e1e3f836a93dbbab439601bf4389add not found: ID does not exist"
Mar 10 22:25:39 crc kubenswrapper[4919]: I0310 22:25:39.860594 4919 scope.go:117] "RemoveContainer" containerID="2dad3289e547c85ccd5f601595013d8aac964a96b4395643df43951d0bade8d2"
Mar 10 22:25:39 crc kubenswrapper[4919]: E0310 22:25:39.861220 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2dad3289e547c85ccd5f601595013d8aac964a96b4395643df43951d0bade8d2\": container with ID starting with 2dad3289e547c85ccd5f601595013d8aac964a96b4395643df43951d0bade8d2 not found: ID does not exist" containerID="2dad3289e547c85ccd5f601595013d8aac964a96b4395643df43951d0bade8d2"
Mar 10 22:25:39 crc kubenswrapper[4919]: I0310 22:25:39.861266 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2dad3289e547c85ccd5f601595013d8aac964a96b4395643df43951d0bade8d2"} err="failed to get container status \"2dad3289e547c85ccd5f601595013d8aac964a96b4395643df43951d0bade8d2\": rpc error: code = NotFound desc = could not find container \"2dad3289e547c85ccd5f601595013d8aac964a96b4395643df43951d0bade8d2\": container with ID starting with 2dad3289e547c85ccd5f601595013d8aac964a96b4395643df43951d0bade8d2 not found: ID does not exist"
Mar 10 22:25:39 crc kubenswrapper[4919]: I0310 22:25:39.861293 4919 scope.go:117] "RemoveContainer" containerID="bba37ee15295a26659601587a9867c970e9f0c56006cb51d8711345e289c6956"
Mar 10 22:25:39 crc kubenswrapper[4919]: E0310 22:25:39.861764 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bba37ee15295a26659601587a9867c970e9f0c56006cb51d8711345e289c6956\": container with ID starting with bba37ee15295a26659601587a9867c970e9f0c56006cb51d8711345e289c6956 not found: ID does not exist" containerID="bba37ee15295a26659601587a9867c970e9f0c56006cb51d8711345e289c6956"
Mar 10 22:25:39 crc kubenswrapper[4919]: I0310 22:25:39.861809 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bba37ee15295a26659601587a9867c970e9f0c56006cb51d8711345e289c6956"} err="failed to get container status \"bba37ee15295a26659601587a9867c970e9f0c56006cb51d8711345e289c6956\": rpc error: code = NotFound desc = could not find container \"bba37ee15295a26659601587a9867c970e9f0c56006cb51d8711345e289c6956\": container with ID starting with bba37ee15295a26659601587a9867c970e9f0c56006cb51d8711345e289c6956 not found: ID does not exist"
Mar 10 22:25:41 crc kubenswrapper[4919]: I0310 22:25:41.496262 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75735232-0764-4c66-a626-ac25c42dd73d" path="/var/lib/kubelet/pods/75735232-0764-4c66-a626-ac25c42dd73d/volumes"
Mar 10 22:25:45 crc kubenswrapper[4919]: I0310 22:25:45.403506 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-w5gcl"
Mar 10 22:25:45 crc kubenswrapper[4919]: I0310 22:25:45.466266 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-w5gcl"
Mar 10 22:25:47 crc kubenswrapper[4919]: I0310 22:25:47.432351 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-w5gcl"]
Mar 10 22:25:47 crc kubenswrapper[4919]: I0310 22:25:47.785104 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-b68ml"]
Mar 10 22:25:47 crc kubenswrapper[4919]: I0310 22:25:47.824611 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-b68ml" podUID="d9defb14-2db3-40d8-8081-495961bfedf1" containerName="registry-server" containerID="cri-o://b30e9151f5abdf9382dbbd92ae917b170d630e92200e9a940eac39f45077f0e6" gracePeriod=2
Mar 10 22:25:48 crc kubenswrapper[4919]: I0310 22:25:48.314630 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-b68ml"
Mar 10 22:25:48 crc kubenswrapper[4919]: I0310 22:25:48.437427 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v7scp\" (UniqueName: \"kubernetes.io/projected/d9defb14-2db3-40d8-8081-495961bfedf1-kube-api-access-v7scp\") pod \"d9defb14-2db3-40d8-8081-495961bfedf1\" (UID: \"d9defb14-2db3-40d8-8081-495961bfedf1\") "
Mar 10 22:25:48 crc kubenswrapper[4919]: I0310 22:25:48.437575 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9defb14-2db3-40d8-8081-495961bfedf1-catalog-content\") pod \"d9defb14-2db3-40d8-8081-495961bfedf1\" (UID: \"d9defb14-2db3-40d8-8081-495961bfedf1\") "
Mar 10 22:25:48 crc kubenswrapper[4919]: I0310 22:25:48.437605 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9defb14-2db3-40d8-8081-495961bfedf1-utilities\") pod \"d9defb14-2db3-40d8-8081-495961bfedf1\" (UID: \"d9defb14-2db3-40d8-8081-495961bfedf1\") "
Mar 10 22:25:48 crc kubenswrapper[4919]: I0310 22:25:48.438297 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d9defb14-2db3-40d8-8081-495961bfedf1-utilities" (OuterVolumeSpecName: "utilities") pod "d9defb14-2db3-40d8-8081-495961bfedf1" (UID: "d9defb14-2db3-40d8-8081-495961bfedf1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 22:25:48 crc kubenswrapper[4919]: I0310 22:25:48.443060 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9defb14-2db3-40d8-8081-495961bfedf1-kube-api-access-v7scp" (OuterVolumeSpecName: "kube-api-access-v7scp") pod "d9defb14-2db3-40d8-8081-495961bfedf1" (UID: "d9defb14-2db3-40d8-8081-495961bfedf1"). InnerVolumeSpecName "kube-api-access-v7scp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 22:25:48 crc kubenswrapper[4919]: I0310 22:25:48.539460 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v7scp\" (UniqueName: \"kubernetes.io/projected/d9defb14-2db3-40d8-8081-495961bfedf1-kube-api-access-v7scp\") on node \"crc\" DevicePath \"\""
Mar 10 22:25:48 crc kubenswrapper[4919]: I0310 22:25:48.539711 4919 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9defb14-2db3-40d8-8081-495961bfedf1-utilities\") on node \"crc\" DevicePath \"\""
Mar 10 22:25:48 crc kubenswrapper[4919]: I0310 22:25:48.540548 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d9defb14-2db3-40d8-8081-495961bfedf1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d9defb14-2db3-40d8-8081-495961bfedf1" (UID: "d9defb14-2db3-40d8-8081-495961bfedf1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 22:25:48 crc kubenswrapper[4919]: I0310 22:25:48.640825 4919 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9defb14-2db3-40d8-8081-495961bfedf1-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 10 22:25:48 crc kubenswrapper[4919]: I0310 22:25:48.831179 4919 generic.go:334] "Generic (PLEG): container finished" podID="d9defb14-2db3-40d8-8081-495961bfedf1" containerID="b30e9151f5abdf9382dbbd92ae917b170d630e92200e9a940eac39f45077f0e6" exitCode=0
Mar 10 22:25:48 crc kubenswrapper[4919]: I0310 22:25:48.831256 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-b68ml"
Mar 10 22:25:48 crc kubenswrapper[4919]: I0310 22:25:48.831275 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b68ml" event={"ID":"d9defb14-2db3-40d8-8081-495961bfedf1","Type":"ContainerDied","Data":"b30e9151f5abdf9382dbbd92ae917b170d630e92200e9a940eac39f45077f0e6"}
Mar 10 22:25:48 crc kubenswrapper[4919]: I0310 22:25:48.831576 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b68ml" event={"ID":"d9defb14-2db3-40d8-8081-495961bfedf1","Type":"ContainerDied","Data":"2aaa98ac37889704c3ade6e5d1e4c8437eaa1ef52ea4401b9525268307f43b32"}
Mar 10 22:25:48 crc kubenswrapper[4919]: I0310 22:25:48.831599 4919 scope.go:117] "RemoveContainer" containerID="b30e9151f5abdf9382dbbd92ae917b170d630e92200e9a940eac39f45077f0e6"
Mar 10 22:25:48 crc kubenswrapper[4919]: I0310 22:25:48.849355 4919 scope.go:117] "RemoveContainer" containerID="96387a4bff33e40a30091d7e7b86b9d95cab4c4225b836af85d0cbd08c43e370"
Mar 10 22:25:48 crc kubenswrapper[4919]: I0310 22:25:48.870447 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-b68ml"]
Mar 10 22:25:48 crc kubenswrapper[4919]: I0310 22:25:48.884461 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-b68ml"]
Mar 10 22:25:48 crc kubenswrapper[4919]: I0310 22:25:48.885178 4919 scope.go:117] "RemoveContainer" containerID="6bb04e9e8d5f73d5a99b17deef98b937c1efc61790e1ff25d036d5ec5e4193ea"
Mar 10 22:25:48 crc kubenswrapper[4919]: I0310 22:25:48.898969 4919 scope.go:117] "RemoveContainer" containerID="b30e9151f5abdf9382dbbd92ae917b170d630e92200e9a940eac39f45077f0e6"
Mar 10 22:25:48 crc kubenswrapper[4919]: E0310 22:25:48.899367 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b30e9151f5abdf9382dbbd92ae917b170d630e92200e9a940eac39f45077f0e6\": container with ID starting with b30e9151f5abdf9382dbbd92ae917b170d630e92200e9a940eac39f45077f0e6 not found: ID does not exist" containerID="b30e9151f5abdf9382dbbd92ae917b170d630e92200e9a940eac39f45077f0e6"
Mar 10 22:25:48 crc kubenswrapper[4919]: I0310 22:25:48.899420 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b30e9151f5abdf9382dbbd92ae917b170d630e92200e9a940eac39f45077f0e6"} err="failed to get container status \"b30e9151f5abdf9382dbbd92ae917b170d630e92200e9a940eac39f45077f0e6\": rpc error: code = NotFound desc = could not find container \"b30e9151f5abdf9382dbbd92ae917b170d630e92200e9a940eac39f45077f0e6\": container with ID starting with b30e9151f5abdf9382dbbd92ae917b170d630e92200e9a940eac39f45077f0e6 not found: ID does not exist"
Mar 10 22:25:48 crc kubenswrapper[4919]: I0310 22:25:48.899443 4919 scope.go:117] "RemoveContainer" containerID="96387a4bff33e40a30091d7e7b86b9d95cab4c4225b836af85d0cbd08c43e370"
Mar 10 22:25:48 crc kubenswrapper[4919]: E0310 22:25:48.899663 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"96387a4bff33e40a30091d7e7b86b9d95cab4c4225b836af85d0cbd08c43e370\": container with ID starting with 96387a4bff33e40a30091d7e7b86b9d95cab4c4225b836af85d0cbd08c43e370 not found: ID does not exist" containerID="96387a4bff33e40a30091d7e7b86b9d95cab4c4225b836af85d0cbd08c43e370"
Mar 10 22:25:48 crc kubenswrapper[4919]: I0310 22:25:48.899690 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96387a4bff33e40a30091d7e7b86b9d95cab4c4225b836af85d0cbd08c43e370"} err="failed to get container status \"96387a4bff33e40a30091d7e7b86b9d95cab4c4225b836af85d0cbd08c43e370\": rpc error: code = NotFound desc = could not find container \"96387a4bff33e40a30091d7e7b86b9d95cab4c4225b836af85d0cbd08c43e370\": container with ID starting with 96387a4bff33e40a30091d7e7b86b9d95cab4c4225b836af85d0cbd08c43e370 not found: ID does not exist"
Mar 10 22:25:48 crc kubenswrapper[4919]: I0310 22:25:48.899704 4919 scope.go:117] "RemoveContainer" containerID="6bb04e9e8d5f73d5a99b17deef98b937c1efc61790e1ff25d036d5ec5e4193ea"
Mar 10 22:25:48 crc kubenswrapper[4919]: E0310 22:25:48.899917 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6bb04e9e8d5f73d5a99b17deef98b937c1efc61790e1ff25d036d5ec5e4193ea\": container with ID starting with 6bb04e9e8d5f73d5a99b17deef98b937c1efc61790e1ff25d036d5ec5e4193ea not found: ID does not exist" containerID="6bb04e9e8d5f73d5a99b17deef98b937c1efc61790e1ff25d036d5ec5e4193ea"
Mar 10 22:25:48 crc kubenswrapper[4919]: I0310 22:25:48.899937 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6bb04e9e8d5f73d5a99b17deef98b937c1efc61790e1ff25d036d5ec5e4193ea"} err="failed to get container status \"6bb04e9e8d5f73d5a99b17deef98b937c1efc61790e1ff25d036d5ec5e4193ea\": rpc error: code = NotFound desc = could not find container \"6bb04e9e8d5f73d5a99b17deef98b937c1efc61790e1ff25d036d5ec5e4193ea\": container with ID starting with 6bb04e9e8d5f73d5a99b17deef98b937c1efc61790e1ff25d036d5ec5e4193ea not found: ID does not exist"
Mar 10 22:25:49 crc kubenswrapper[4919]: I0310 22:25:49.490430 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d9defb14-2db3-40d8-8081-495961bfedf1" path="/var/lib/kubelet/pods/d9defb14-2db3-40d8-8081-495961bfedf1/volumes"
Mar 10 22:26:00 crc kubenswrapper[4919]: I0310 22:26:00.170614 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553026-w49lv"]
Mar 10 22:26:00 crc kubenswrapper[4919]: E0310 22:26:00.171882 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79b4ee48-1e52-4bef-a298-31cba2ec8539" containerName="extract-utilities"
Mar 10 22:26:00 crc kubenswrapper[4919]: I0310 22:26:00.171909 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="79b4ee48-1e52-4bef-a298-31cba2ec8539" containerName="extract-utilities"
Mar 10 22:26:00 crc kubenswrapper[4919]: E0310 22:26:00.171940 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9defb14-2db3-40d8-8081-495961bfedf1" containerName="extract-content"
Mar 10 22:26:00 crc kubenswrapper[4919]: I0310 22:26:00.171951 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9defb14-2db3-40d8-8081-495961bfedf1" containerName="extract-content"
Mar 10 22:26:00 crc kubenswrapper[4919]: E0310 22:26:00.171973 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9defb14-2db3-40d8-8081-495961bfedf1" containerName="extract-utilities"
Mar 10 22:26:00 crc kubenswrapper[4919]: I0310 22:26:00.171984 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9defb14-2db3-40d8-8081-495961bfedf1" containerName="extract-utilities"
Mar 10 22:26:00 crc kubenswrapper[4919]: E0310 22:26:00.172016 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79b4ee48-1e52-4bef-a298-31cba2ec8539" containerName="registry-server"
Mar 10 22:26:00 crc kubenswrapper[4919]: I0310 22:26:00.172027 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="79b4ee48-1e52-4bef-a298-31cba2ec8539" containerName="registry-server"
Mar 10 22:26:00 crc kubenswrapper[4919]: E0310 22:26:00.172112 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75735232-0764-4c66-a626-ac25c42dd73d" containerName="extract-utilities"
Mar 10 22:26:00 crc kubenswrapper[4919]: I0310 22:26:00.172124 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="75735232-0764-4c66-a626-ac25c42dd73d" containerName="extract-utilities"
Mar 10 22:26:00 crc kubenswrapper[4919]: E0310 22:26:00.172146 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9defb14-2db3-40d8-8081-495961bfedf1" containerName="registry-server"
Mar 10 22:26:00 crc kubenswrapper[4919]: I0310 22:26:00.172156 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9defb14-2db3-40d8-8081-495961bfedf1" containerName="registry-server"
Mar 10 22:26:00 crc kubenswrapper[4919]: E0310 22:26:00.172175 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79b4ee48-1e52-4bef-a298-31cba2ec8539" containerName="extract-content"
Mar 10 22:26:00 crc kubenswrapper[4919]: I0310 22:26:00.172185 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="79b4ee48-1e52-4bef-a298-31cba2ec8539" containerName="extract-content"
Mar 10 22:26:00 crc kubenswrapper[4919]: E0310 22:26:00.172203 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75735232-0764-4c66-a626-ac25c42dd73d" containerName="extract-content"
Mar 10 22:26:00 crc kubenswrapper[4919]: I0310 22:26:00.172213 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="75735232-0764-4c66-a626-ac25c42dd73d" containerName="extract-content"
Mar 10 22:26:00 crc kubenswrapper[4919]: E0310 22:26:00.172225 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75735232-0764-4c66-a626-ac25c42dd73d" containerName="registry-server"
Mar 10 22:26:00 crc kubenswrapper[4919]: I0310 22:26:00.172236 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="75735232-0764-4c66-a626-ac25c42dd73d" containerName="registry-server"
Mar 10 22:26:00 crc kubenswrapper[4919]: I0310 22:26:00.172488 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="75735232-0764-4c66-a626-ac25c42dd73d" containerName="registry-server"
Mar 10 22:26:00 crc kubenswrapper[4919]: I0310 22:26:00.172507 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="79b4ee48-1e52-4bef-a298-31cba2ec8539" containerName="registry-server"
Mar 10 22:26:00 crc kubenswrapper[4919]: I0310 22:26:00.172536 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9defb14-2db3-40d8-8081-495961bfedf1" containerName="registry-server"
Mar 10 22:26:00 crc kubenswrapper[4919]: I0310 22:26:00.173283 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553026-w49lv"
Mar 10 22:26:00 crc kubenswrapper[4919]: I0310 22:26:00.180137 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 10 22:26:00 crc kubenswrapper[4919]: I0310 22:26:00.180190 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wbvtv"
Mar 10 22:26:00 crc kubenswrapper[4919]: I0310 22:26:00.180271 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 10 22:26:00 crc kubenswrapper[4919]: I0310 22:26:00.185065 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553026-w49lv"]
Mar 10 22:26:00 crc kubenswrapper[4919]: I0310 22:26:00.312668 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xlxln\" (UniqueName: \"kubernetes.io/projected/071fb005-dfb2-4126-930a-483536734b4d-kube-api-access-xlxln\") pod \"auto-csr-approver-29553026-w49lv\" (UID: \"071fb005-dfb2-4126-930a-483536734b4d\") " pod="openshift-infra/auto-csr-approver-29553026-w49lv"
Mar 10 22:26:00 crc kubenswrapper[4919]: I0310 22:26:00.414160 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xlxln\" (UniqueName: \"kubernetes.io/projected/071fb005-dfb2-4126-930a-483536734b4d-kube-api-access-xlxln\") pod \"auto-csr-approver-29553026-w49lv\" (UID: \"071fb005-dfb2-4126-930a-483536734b4d\") " pod="openshift-infra/auto-csr-approver-29553026-w49lv"
Mar 10 22:26:00 crc kubenswrapper[4919]: I0310 22:26:00.455083 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xlxln\" (UniqueName: \"kubernetes.io/projected/071fb005-dfb2-4126-930a-483536734b4d-kube-api-access-xlxln\") pod \"auto-csr-approver-29553026-w49lv\" (UID: \"071fb005-dfb2-4126-930a-483536734b4d\") " pod="openshift-infra/auto-csr-approver-29553026-w49lv"
Mar 10 22:26:00 crc kubenswrapper[4919]: I0310 22:26:00.515071 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553026-w49lv"
Mar 10 22:26:00 crc kubenswrapper[4919]: I0310 22:26:00.953443 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553026-w49lv"]
Mar 10 22:26:00 crc kubenswrapper[4919]: W0310 22:26:00.959712 4919 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod071fb005_dfb2_4126_930a_483536734b4d.slice/crio-68951e944d60a8e9b12d185a9bb44b5fc162d603be1ee68a27d47a89ab49cf54 WatchSource:0}: Error finding container 68951e944d60a8e9b12d185a9bb44b5fc162d603be1ee68a27d47a89ab49cf54: Status 404 returned error can't find the container with id 68951e944d60a8e9b12d185a9bb44b5fc162d603be1ee68a27d47a89ab49cf54
Mar 10 22:26:01 crc kubenswrapper[4919]: I0310 22:26:01.970450 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553026-w49lv" event={"ID":"071fb005-dfb2-4126-930a-483536734b4d","Type":"ContainerStarted","Data":"68951e944d60a8e9b12d185a9bb44b5fc162d603be1ee68a27d47a89ab49cf54"}
Mar 10 22:26:02 crc kubenswrapper[4919]: I0310 22:26:02.983120 4919 generic.go:334] "Generic (PLEG): container finished" podID="071fb005-dfb2-4126-930a-483536734b4d" containerID="b457d73d0c8fb90b5df667a6165bfeef8a9e25f1501b5e40b64f4ccf32992080" exitCode=0
Mar 10 22:26:02 crc kubenswrapper[4919]: I0310 22:26:02.983187 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553026-w49lv" event={"ID":"071fb005-dfb2-4126-930a-483536734b4d","Type":"ContainerDied","Data":"b457d73d0c8fb90b5df667a6165bfeef8a9e25f1501b5e40b64f4ccf32992080"}
Mar 10 22:26:04 crc kubenswrapper[4919]: I0310 22:26:04.306326 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553026-w49lv"
Mar 10 22:26:04 crc kubenswrapper[4919]: I0310 22:26:04.398701 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xlxln\" (UniqueName: \"kubernetes.io/projected/071fb005-dfb2-4126-930a-483536734b4d-kube-api-access-xlxln\") pod \"071fb005-dfb2-4126-930a-483536734b4d\" (UID: \"071fb005-dfb2-4126-930a-483536734b4d\") "
Mar 10 22:26:04 crc kubenswrapper[4919]: I0310 22:26:04.404009 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/071fb005-dfb2-4126-930a-483536734b4d-kube-api-access-xlxln" (OuterVolumeSpecName: "kube-api-access-xlxln") pod "071fb005-dfb2-4126-930a-483536734b4d" (UID: "071fb005-dfb2-4126-930a-483536734b4d"). InnerVolumeSpecName "kube-api-access-xlxln". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 22:26:04 crc kubenswrapper[4919]: I0310 22:26:04.500140 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xlxln\" (UniqueName: \"kubernetes.io/projected/071fb005-dfb2-4126-930a-483536734b4d-kube-api-access-xlxln\") on node \"crc\" DevicePath \"\""
Mar 10 22:26:05 crc kubenswrapper[4919]: I0310 22:26:05.000410 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553026-w49lv" event={"ID":"071fb005-dfb2-4126-930a-483536734b4d","Type":"ContainerDied","Data":"68951e944d60a8e9b12d185a9bb44b5fc162d603be1ee68a27d47a89ab49cf54"}
Mar 10 22:26:05 crc kubenswrapper[4919]: I0310 22:26:05.000457 4919 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="68951e944d60a8e9b12d185a9bb44b5fc162d603be1ee68a27d47a89ab49cf54"
Mar 10 22:26:05 crc kubenswrapper[4919]: I0310 22:26:05.000468 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553026-w49lv"
Mar 10 22:26:05 crc kubenswrapper[4919]: I0310 22:26:05.386249 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553020-6rjdv"]
Mar 10 22:26:05 crc kubenswrapper[4919]: I0310 22:26:05.392856 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553020-6rjdv"]
Mar 10 22:26:05 crc kubenswrapper[4919]: I0310 22:26:05.492693 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ada4c843-4330-4fb1-9905-4bf620b86429" path="/var/lib/kubelet/pods/ada4c843-4330-4fb1-9905-4bf620b86429/volumes"
Mar 10 22:26:29 crc kubenswrapper[4919]: I0310 22:26:29.175460 4919 patch_prober.go:28] interesting pod/machine-config-daemon-z7v4t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 10 22:26:29 crc kubenswrapper[4919]: I0310 22:26:29.176034 4919 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 10 22:26:43 crc kubenswrapper[4919]: I0310 22:26:43.670992 4919 scope.go:117] "RemoveContainer" containerID="a4378340f4b73f5db606a12c70fcc405834febee26a1309f682de200dbedcae7"
Mar 10 22:26:59 crc kubenswrapper[4919]: I0310 22:26:59.175271 4919 patch_prober.go:28] interesting pod/machine-config-daemon-z7v4t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 10 22:26:59 crc kubenswrapper[4919]: I0310 22:26:59.176588 4919 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 10 22:27:29 crc kubenswrapper[4919]: I0310 22:27:29.175943 4919 patch_prober.go:28] interesting pod/machine-config-daemon-z7v4t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 10 22:27:29 crc kubenswrapper[4919]: I0310 22:27:29.176623 4919 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 10 22:27:29 crc kubenswrapper[4919]: I0310 22:27:29.176726 4919 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t"
Mar 10 22:27:29 crc kubenswrapper[4919]: I0310 22:27:29.177678 4919 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6bf3588128e568c16cb871e80818127ef0aaa14fe7758988393e9de44935b23b"} pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 10 22:27:29 crc kubenswrapper[4919]: I0310 22:27:29.177773 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" containerName="machine-config-daemon" containerID="cri-o://6bf3588128e568c16cb871e80818127ef0aaa14fe7758988393e9de44935b23b" gracePeriod=600
Mar 10 22:27:29 crc kubenswrapper[4919]: E0310 22:27:29.309594 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc"
Mar 10 22:27:29 crc kubenswrapper[4919]: I0310 22:27:29.631203 4919 generic.go:334] "Generic (PLEG): container finished" podID="566678d1-f416-4116-ab20-b30dceb86cdc" containerID="6bf3588128e568c16cb871e80818127ef0aaa14fe7758988393e9de44935b23b" exitCode=0
Mar 10 22:27:29 crc kubenswrapper[4919]: I0310 22:27:29.631273 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" event={"ID":"566678d1-f416-4116-ab20-b30dceb86cdc","Type":"ContainerDied","Data":"6bf3588128e568c16cb871e80818127ef0aaa14fe7758988393e9de44935b23b"}
Mar 10 22:27:29 crc kubenswrapper[4919]: I0310 22:27:29.631329 4919 scope.go:117] "RemoveContainer" containerID="140c87708c6fcfea09e12588fc5a3cd15890bdc32f129d66ab0f0f4f6ace9d9e"
Mar 10 22:27:29 crc kubenswrapper[4919]: I0310 22:27:29.632481 4919 scope.go:117] "RemoveContainer" containerID="6bf3588128e568c16cb871e80818127ef0aaa14fe7758988393e9de44935b23b"
Mar 10 22:27:29 crc kubenswrapper[4919]: E0310 22:27:29.633136 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc"
Mar 10 22:27:44 crc kubenswrapper[4919]: I0310 22:27:44.480266 4919 scope.go:117] "RemoveContainer" containerID="6bf3588128e568c16cb871e80818127ef0aaa14fe7758988393e9de44935b23b"
Mar 10 22:27:44 crc kubenswrapper[4919]: E0310 22:27:44.481130 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc"
Mar 10 22:27:56 crc kubenswrapper[4919]: I0310 22:27:56.480327 4919 scope.go:117] "RemoveContainer" containerID="6bf3588128e568c16cb871e80818127ef0aaa14fe7758988393e9de44935b23b"
Mar 10 22:27:56 crc kubenswrapper[4919]: E0310 22:27:56.481276 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc"
Mar 10 22:28:00 crc kubenswrapper[4919]: I0310 22:28:00.159617 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553028-c8cbw"]
Mar 10 22:28:00 crc kubenswrapper[4919]: E0310 22:28:00.160294 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="071fb005-dfb2-4126-930a-483536734b4d" containerName="oc"
Mar 10 22:28:00 crc kubenswrapper[4919]: I0310 22:28:00.160308 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="071fb005-dfb2-4126-930a-483536734b4d" containerName="oc"
Mar 10 22:28:00 crc kubenswrapper[4919]: I0310 22:28:00.160488 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="071fb005-dfb2-4126-930a-483536734b4d" containerName="oc"
Mar 10 22:28:00 crc kubenswrapper[4919]: I0310 22:28:00.161080 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553028-c8cbw" Mar 10 22:28:00 crc kubenswrapper[4919]: I0310 22:28:00.169060 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 22:28:00 crc kubenswrapper[4919]: I0310 22:28:00.169101 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 22:28:00 crc kubenswrapper[4919]: I0310 22:28:00.169447 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wbvtv" Mar 10 22:28:00 crc kubenswrapper[4919]: I0310 22:28:00.182113 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553028-c8cbw"] Mar 10 22:28:00 crc kubenswrapper[4919]: I0310 22:28:00.216018 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wbf58\" (UniqueName: \"kubernetes.io/projected/437d2054-282f-4479-9d27-cb2e855e0a0b-kube-api-access-wbf58\") pod \"auto-csr-approver-29553028-c8cbw\" (UID: \"437d2054-282f-4479-9d27-cb2e855e0a0b\") " pod="openshift-infra/auto-csr-approver-29553028-c8cbw" Mar 10 22:28:00 crc kubenswrapper[4919]: I0310 22:28:00.317558 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wbf58\" (UniqueName: \"kubernetes.io/projected/437d2054-282f-4479-9d27-cb2e855e0a0b-kube-api-access-wbf58\") pod \"auto-csr-approver-29553028-c8cbw\" (UID: \"437d2054-282f-4479-9d27-cb2e855e0a0b\") " pod="openshift-infra/auto-csr-approver-29553028-c8cbw" Mar 10 22:28:00 crc kubenswrapper[4919]: I0310 22:28:00.340547 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wbf58\" (UniqueName: \"kubernetes.io/projected/437d2054-282f-4479-9d27-cb2e855e0a0b-kube-api-access-wbf58\") pod \"auto-csr-approver-29553028-c8cbw\" (UID: \"437d2054-282f-4479-9d27-cb2e855e0a0b\") " 
pod="openshift-infra/auto-csr-approver-29553028-c8cbw" Mar 10 22:28:00 crc kubenswrapper[4919]: I0310 22:28:00.485427 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553028-c8cbw" Mar 10 22:28:00 crc kubenswrapper[4919]: I0310 22:28:00.938608 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553028-c8cbw"] Mar 10 22:28:00 crc kubenswrapper[4919]: I0310 22:28:00.948261 4919 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 10 22:28:01 crc kubenswrapper[4919]: I0310 22:28:01.902627 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553028-c8cbw" event={"ID":"437d2054-282f-4479-9d27-cb2e855e0a0b","Type":"ContainerStarted","Data":"265284767c450c09e704790234ec58e3469881b94ba66700e58597d24653d732"} Mar 10 22:28:04 crc kubenswrapper[4919]: I0310 22:28:04.929412 4919 generic.go:334] "Generic (PLEG): container finished" podID="437d2054-282f-4479-9d27-cb2e855e0a0b" containerID="e107ebb513fc3028216e4766060a3facd4b18f5c500d45945c0eefef1126aaa0" exitCode=0 Mar 10 22:28:04 crc kubenswrapper[4919]: I0310 22:28:04.929503 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553028-c8cbw" event={"ID":"437d2054-282f-4479-9d27-cb2e855e0a0b","Type":"ContainerDied","Data":"e107ebb513fc3028216e4766060a3facd4b18f5c500d45945c0eefef1126aaa0"} Mar 10 22:28:06 crc kubenswrapper[4919]: I0310 22:28:06.223376 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553028-c8cbw" Mar 10 22:28:06 crc kubenswrapper[4919]: I0310 22:28:06.413668 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wbf58\" (UniqueName: \"kubernetes.io/projected/437d2054-282f-4479-9d27-cb2e855e0a0b-kube-api-access-wbf58\") pod \"437d2054-282f-4479-9d27-cb2e855e0a0b\" (UID: \"437d2054-282f-4479-9d27-cb2e855e0a0b\") " Mar 10 22:28:06 crc kubenswrapper[4919]: I0310 22:28:06.420764 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/437d2054-282f-4479-9d27-cb2e855e0a0b-kube-api-access-wbf58" (OuterVolumeSpecName: "kube-api-access-wbf58") pod "437d2054-282f-4479-9d27-cb2e855e0a0b" (UID: "437d2054-282f-4479-9d27-cb2e855e0a0b"). InnerVolumeSpecName "kube-api-access-wbf58". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 22:28:06 crc kubenswrapper[4919]: I0310 22:28:06.514928 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wbf58\" (UniqueName: \"kubernetes.io/projected/437d2054-282f-4479-9d27-cb2e855e0a0b-kube-api-access-wbf58\") on node \"crc\" DevicePath \"\"" Mar 10 22:28:06 crc kubenswrapper[4919]: I0310 22:28:06.947636 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553028-c8cbw" event={"ID":"437d2054-282f-4479-9d27-cb2e855e0a0b","Type":"ContainerDied","Data":"265284767c450c09e704790234ec58e3469881b94ba66700e58597d24653d732"} Mar 10 22:28:06 crc kubenswrapper[4919]: I0310 22:28:06.947892 4919 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="265284767c450c09e704790234ec58e3469881b94ba66700e58597d24653d732" Mar 10 22:28:06 crc kubenswrapper[4919]: I0310 22:28:06.947725 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553028-c8cbw" Mar 10 22:28:07 crc kubenswrapper[4919]: I0310 22:28:07.291758 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553022-2hnzl"] Mar 10 22:28:07 crc kubenswrapper[4919]: I0310 22:28:07.298602 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553022-2hnzl"] Mar 10 22:28:07 crc kubenswrapper[4919]: I0310 22:28:07.490944 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1049b5a8-2f4b-4797-b484-7a151abab4bb" path="/var/lib/kubelet/pods/1049b5a8-2f4b-4797-b484-7a151abab4bb/volumes" Mar 10 22:28:08 crc kubenswrapper[4919]: I0310 22:28:08.479985 4919 scope.go:117] "RemoveContainer" containerID="6bf3588128e568c16cb871e80818127ef0aaa14fe7758988393e9de44935b23b" Mar 10 22:28:08 crc kubenswrapper[4919]: E0310 22:28:08.480188 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" Mar 10 22:28:17 crc kubenswrapper[4919]: I0310 22:28:17.517272 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-7htmr"] Mar 10 22:28:17 crc kubenswrapper[4919]: E0310 22:28:17.518448 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="437d2054-282f-4479-9d27-cb2e855e0a0b" containerName="oc" Mar 10 22:28:17 crc kubenswrapper[4919]: I0310 22:28:17.518470 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="437d2054-282f-4479-9d27-cb2e855e0a0b" containerName="oc" Mar 10 22:28:17 crc kubenswrapper[4919]: I0310 22:28:17.518747 4919 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="437d2054-282f-4479-9d27-cb2e855e0a0b" containerName="oc" Mar 10 22:28:17 crc kubenswrapper[4919]: I0310 22:28:17.520523 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7htmr" Mar 10 22:28:17 crc kubenswrapper[4919]: I0310 22:28:17.524247 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7htmr"] Mar 10 22:28:17 crc kubenswrapper[4919]: I0310 22:28:17.663449 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8af1404-7303-42c1-81d2-c98c6014addd-utilities\") pod \"certified-operators-7htmr\" (UID: \"b8af1404-7303-42c1-81d2-c98c6014addd\") " pod="openshift-marketplace/certified-operators-7htmr" Mar 10 22:28:17 crc kubenswrapper[4919]: I0310 22:28:17.663504 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8af1404-7303-42c1-81d2-c98c6014addd-catalog-content\") pod \"certified-operators-7htmr\" (UID: \"b8af1404-7303-42c1-81d2-c98c6014addd\") " pod="openshift-marketplace/certified-operators-7htmr" Mar 10 22:28:17 crc kubenswrapper[4919]: I0310 22:28:17.663730 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fc8tq\" (UniqueName: \"kubernetes.io/projected/b8af1404-7303-42c1-81d2-c98c6014addd-kube-api-access-fc8tq\") pod \"certified-operators-7htmr\" (UID: \"b8af1404-7303-42c1-81d2-c98c6014addd\") " pod="openshift-marketplace/certified-operators-7htmr" Mar 10 22:28:17 crc kubenswrapper[4919]: I0310 22:28:17.764639 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fc8tq\" (UniqueName: \"kubernetes.io/projected/b8af1404-7303-42c1-81d2-c98c6014addd-kube-api-access-fc8tq\") pod \"certified-operators-7htmr\" 
(UID: \"b8af1404-7303-42c1-81d2-c98c6014addd\") " pod="openshift-marketplace/certified-operators-7htmr" Mar 10 22:28:17 crc kubenswrapper[4919]: I0310 22:28:17.764806 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8af1404-7303-42c1-81d2-c98c6014addd-utilities\") pod \"certified-operators-7htmr\" (UID: \"b8af1404-7303-42c1-81d2-c98c6014addd\") " pod="openshift-marketplace/certified-operators-7htmr" Mar 10 22:28:17 crc kubenswrapper[4919]: I0310 22:28:17.764841 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8af1404-7303-42c1-81d2-c98c6014addd-catalog-content\") pod \"certified-operators-7htmr\" (UID: \"b8af1404-7303-42c1-81d2-c98c6014addd\") " pod="openshift-marketplace/certified-operators-7htmr" Mar 10 22:28:17 crc kubenswrapper[4919]: I0310 22:28:17.765486 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8af1404-7303-42c1-81d2-c98c6014addd-utilities\") pod \"certified-operators-7htmr\" (UID: \"b8af1404-7303-42c1-81d2-c98c6014addd\") " pod="openshift-marketplace/certified-operators-7htmr" Mar 10 22:28:17 crc kubenswrapper[4919]: I0310 22:28:17.765524 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8af1404-7303-42c1-81d2-c98c6014addd-catalog-content\") pod \"certified-operators-7htmr\" (UID: \"b8af1404-7303-42c1-81d2-c98c6014addd\") " pod="openshift-marketplace/certified-operators-7htmr" Mar 10 22:28:17 crc kubenswrapper[4919]: I0310 22:28:17.792931 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fc8tq\" (UniqueName: \"kubernetes.io/projected/b8af1404-7303-42c1-81d2-c98c6014addd-kube-api-access-fc8tq\") pod \"certified-operators-7htmr\" (UID: \"b8af1404-7303-42c1-81d2-c98c6014addd\") " 
pod="openshift-marketplace/certified-operators-7htmr" Mar 10 22:28:17 crc kubenswrapper[4919]: I0310 22:28:17.841446 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7htmr" Mar 10 22:28:18 crc kubenswrapper[4919]: I0310 22:28:18.068091 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7htmr"] Mar 10 22:28:19 crc kubenswrapper[4919]: I0310 22:28:19.044915 4919 generic.go:334] "Generic (PLEG): container finished" podID="b8af1404-7303-42c1-81d2-c98c6014addd" containerID="508afb53daeca415f661e2c4f09f697a32270bdb9165dad884a7bc0e858301ec" exitCode=0 Mar 10 22:28:19 crc kubenswrapper[4919]: I0310 22:28:19.045001 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7htmr" event={"ID":"b8af1404-7303-42c1-81d2-c98c6014addd","Type":"ContainerDied","Data":"508afb53daeca415f661e2c4f09f697a32270bdb9165dad884a7bc0e858301ec"} Mar 10 22:28:19 crc kubenswrapper[4919]: I0310 22:28:19.045218 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7htmr" event={"ID":"b8af1404-7303-42c1-81d2-c98c6014addd","Type":"ContainerStarted","Data":"3a5d144222d3148492e39add4e019d975c92c3d6bec69358b4177a69f7473e51"} Mar 10 22:28:20 crc kubenswrapper[4919]: I0310 22:28:20.057067 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7htmr" event={"ID":"b8af1404-7303-42c1-81d2-c98c6014addd","Type":"ContainerStarted","Data":"3a2bf98b54b72b253b69970b6f1f0f69fbac4d5b89a6a7d371f6473386b0e4fd"} Mar 10 22:28:20 crc kubenswrapper[4919]: I0310 22:28:20.479873 4919 scope.go:117] "RemoveContainer" containerID="6bf3588128e568c16cb871e80818127ef0aaa14fe7758988393e9de44935b23b" Mar 10 22:28:20 crc kubenswrapper[4919]: E0310 22:28:20.480140 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" Mar 10 22:28:21 crc kubenswrapper[4919]: I0310 22:28:21.067657 4919 generic.go:334] "Generic (PLEG): container finished" podID="b8af1404-7303-42c1-81d2-c98c6014addd" containerID="3a2bf98b54b72b253b69970b6f1f0f69fbac4d5b89a6a7d371f6473386b0e4fd" exitCode=0 Mar 10 22:28:21 crc kubenswrapper[4919]: I0310 22:28:21.067717 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7htmr" event={"ID":"b8af1404-7303-42c1-81d2-c98c6014addd","Type":"ContainerDied","Data":"3a2bf98b54b72b253b69970b6f1f0f69fbac4d5b89a6a7d371f6473386b0e4fd"} Mar 10 22:28:22 crc kubenswrapper[4919]: I0310 22:28:22.076795 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7htmr" event={"ID":"b8af1404-7303-42c1-81d2-c98c6014addd","Type":"ContainerStarted","Data":"ea4663f00f9f1fe30ece41fef96313c3559a42197de6ab0bc0398e0708cf2a02"} Mar 10 22:28:22 crc kubenswrapper[4919]: I0310 22:28:22.098602 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-7htmr" podStartSLOduration=2.6785220130000003 podStartE2EDuration="5.098581668s" podCreationTimestamp="2026-03-10 22:28:17 +0000 UTC" firstStartedPulling="2026-03-10 22:28:19.048110869 +0000 UTC m=+2286.289991507" lastFinishedPulling="2026-03-10 22:28:21.468170554 +0000 UTC m=+2288.710051162" observedRunningTime="2026-03-10 22:28:22.092209185 +0000 UTC m=+2289.334089793" watchObservedRunningTime="2026-03-10 22:28:22.098581668 +0000 UTC m=+2289.340462276" Mar 10 22:28:27 crc kubenswrapper[4919]: I0310 22:28:27.842577 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/certified-operators-7htmr" Mar 10 22:28:27 crc kubenswrapper[4919]: I0310 22:28:27.842974 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-7htmr" Mar 10 22:28:27 crc kubenswrapper[4919]: I0310 22:28:27.914142 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-7htmr" Mar 10 22:28:28 crc kubenswrapper[4919]: I0310 22:28:28.185416 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-7htmr" Mar 10 22:28:28 crc kubenswrapper[4919]: I0310 22:28:28.238197 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7htmr"] Mar 10 22:28:30 crc kubenswrapper[4919]: I0310 22:28:30.149511 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-7htmr" podUID="b8af1404-7303-42c1-81d2-c98c6014addd" containerName="registry-server" containerID="cri-o://ea4663f00f9f1fe30ece41fef96313c3559a42197de6ab0bc0398e0708cf2a02" gracePeriod=2 Mar 10 22:28:30 crc kubenswrapper[4919]: I0310 22:28:30.668138 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-7htmr" Mar 10 22:28:30 crc kubenswrapper[4919]: I0310 22:28:30.863510 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fc8tq\" (UniqueName: \"kubernetes.io/projected/b8af1404-7303-42c1-81d2-c98c6014addd-kube-api-access-fc8tq\") pod \"b8af1404-7303-42c1-81d2-c98c6014addd\" (UID: \"b8af1404-7303-42c1-81d2-c98c6014addd\") " Mar 10 22:28:30 crc kubenswrapper[4919]: I0310 22:28:30.863555 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8af1404-7303-42c1-81d2-c98c6014addd-utilities\") pod \"b8af1404-7303-42c1-81d2-c98c6014addd\" (UID: \"b8af1404-7303-42c1-81d2-c98c6014addd\") " Mar 10 22:28:30 crc kubenswrapper[4919]: I0310 22:28:30.863681 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8af1404-7303-42c1-81d2-c98c6014addd-catalog-content\") pod \"b8af1404-7303-42c1-81d2-c98c6014addd\" (UID: \"b8af1404-7303-42c1-81d2-c98c6014addd\") " Mar 10 22:28:30 crc kubenswrapper[4919]: I0310 22:28:30.864496 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b8af1404-7303-42c1-81d2-c98c6014addd-utilities" (OuterVolumeSpecName: "utilities") pod "b8af1404-7303-42c1-81d2-c98c6014addd" (UID: "b8af1404-7303-42c1-81d2-c98c6014addd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 22:28:30 crc kubenswrapper[4919]: I0310 22:28:30.873046 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8af1404-7303-42c1-81d2-c98c6014addd-kube-api-access-fc8tq" (OuterVolumeSpecName: "kube-api-access-fc8tq") pod "b8af1404-7303-42c1-81d2-c98c6014addd" (UID: "b8af1404-7303-42c1-81d2-c98c6014addd"). InnerVolumeSpecName "kube-api-access-fc8tq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 22:28:30 crc kubenswrapper[4919]: I0310 22:28:30.936054 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b8af1404-7303-42c1-81d2-c98c6014addd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b8af1404-7303-42c1-81d2-c98c6014addd" (UID: "b8af1404-7303-42c1-81d2-c98c6014addd"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 22:28:30 crc kubenswrapper[4919]: I0310 22:28:30.964784 4919 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8af1404-7303-42c1-81d2-c98c6014addd-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 22:28:30 crc kubenswrapper[4919]: I0310 22:28:30.964815 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fc8tq\" (UniqueName: \"kubernetes.io/projected/b8af1404-7303-42c1-81d2-c98c6014addd-kube-api-access-fc8tq\") on node \"crc\" DevicePath \"\"" Mar 10 22:28:30 crc kubenswrapper[4919]: I0310 22:28:30.964825 4919 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8af1404-7303-42c1-81d2-c98c6014addd-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 22:28:31 crc kubenswrapper[4919]: I0310 22:28:31.160134 4919 generic.go:334] "Generic (PLEG): container finished" podID="b8af1404-7303-42c1-81d2-c98c6014addd" containerID="ea4663f00f9f1fe30ece41fef96313c3559a42197de6ab0bc0398e0708cf2a02" exitCode=0 Mar 10 22:28:31 crc kubenswrapper[4919]: I0310 22:28:31.160227 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7htmr" event={"ID":"b8af1404-7303-42c1-81d2-c98c6014addd","Type":"ContainerDied","Data":"ea4663f00f9f1fe30ece41fef96313c3559a42197de6ab0bc0398e0708cf2a02"} Mar 10 22:28:31 crc kubenswrapper[4919]: I0310 22:28:31.160271 4919 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7htmr" Mar 10 22:28:31 crc kubenswrapper[4919]: I0310 22:28:31.160291 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7htmr" event={"ID":"b8af1404-7303-42c1-81d2-c98c6014addd","Type":"ContainerDied","Data":"3a5d144222d3148492e39add4e019d975c92c3d6bec69358b4177a69f7473e51"} Mar 10 22:28:31 crc kubenswrapper[4919]: I0310 22:28:31.160327 4919 scope.go:117] "RemoveContainer" containerID="ea4663f00f9f1fe30ece41fef96313c3559a42197de6ab0bc0398e0708cf2a02" Mar 10 22:28:31 crc kubenswrapper[4919]: I0310 22:28:31.188581 4919 scope.go:117] "RemoveContainer" containerID="3a2bf98b54b72b253b69970b6f1f0f69fbac4d5b89a6a7d371f6473386b0e4fd" Mar 10 22:28:31 crc kubenswrapper[4919]: I0310 22:28:31.207210 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7htmr"] Mar 10 22:28:31 crc kubenswrapper[4919]: I0310 22:28:31.217217 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-7htmr"] Mar 10 22:28:31 crc kubenswrapper[4919]: I0310 22:28:31.227894 4919 scope.go:117] "RemoveContainer" containerID="508afb53daeca415f661e2c4f09f697a32270bdb9165dad884a7bc0e858301ec" Mar 10 22:28:31 crc kubenswrapper[4919]: I0310 22:28:31.243705 4919 scope.go:117] "RemoveContainer" containerID="ea4663f00f9f1fe30ece41fef96313c3559a42197de6ab0bc0398e0708cf2a02" Mar 10 22:28:31 crc kubenswrapper[4919]: E0310 22:28:31.244186 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea4663f00f9f1fe30ece41fef96313c3559a42197de6ab0bc0398e0708cf2a02\": container with ID starting with ea4663f00f9f1fe30ece41fef96313c3559a42197de6ab0bc0398e0708cf2a02 not found: ID does not exist" containerID="ea4663f00f9f1fe30ece41fef96313c3559a42197de6ab0bc0398e0708cf2a02" Mar 10 22:28:31 crc kubenswrapper[4919]: I0310 22:28:31.244281 
4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea4663f00f9f1fe30ece41fef96313c3559a42197de6ab0bc0398e0708cf2a02"} err="failed to get container status \"ea4663f00f9f1fe30ece41fef96313c3559a42197de6ab0bc0398e0708cf2a02\": rpc error: code = NotFound desc = could not find container \"ea4663f00f9f1fe30ece41fef96313c3559a42197de6ab0bc0398e0708cf2a02\": container with ID starting with ea4663f00f9f1fe30ece41fef96313c3559a42197de6ab0bc0398e0708cf2a02 not found: ID does not exist" Mar 10 22:28:31 crc kubenswrapper[4919]: I0310 22:28:31.244337 4919 scope.go:117] "RemoveContainer" containerID="3a2bf98b54b72b253b69970b6f1f0f69fbac4d5b89a6a7d371f6473386b0e4fd" Mar 10 22:28:31 crc kubenswrapper[4919]: E0310 22:28:31.245009 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a2bf98b54b72b253b69970b6f1f0f69fbac4d5b89a6a7d371f6473386b0e4fd\": container with ID starting with 3a2bf98b54b72b253b69970b6f1f0f69fbac4d5b89a6a7d371f6473386b0e4fd not found: ID does not exist" containerID="3a2bf98b54b72b253b69970b6f1f0f69fbac4d5b89a6a7d371f6473386b0e4fd" Mar 10 22:28:31 crc kubenswrapper[4919]: I0310 22:28:31.245061 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a2bf98b54b72b253b69970b6f1f0f69fbac4d5b89a6a7d371f6473386b0e4fd"} err="failed to get container status \"3a2bf98b54b72b253b69970b6f1f0f69fbac4d5b89a6a7d371f6473386b0e4fd\": rpc error: code = NotFound desc = could not find container \"3a2bf98b54b72b253b69970b6f1f0f69fbac4d5b89a6a7d371f6473386b0e4fd\": container with ID starting with 3a2bf98b54b72b253b69970b6f1f0f69fbac4d5b89a6a7d371f6473386b0e4fd not found: ID does not exist" Mar 10 22:28:31 crc kubenswrapper[4919]: I0310 22:28:31.245091 4919 scope.go:117] "RemoveContainer" containerID="508afb53daeca415f661e2c4f09f697a32270bdb9165dad884a7bc0e858301ec" Mar 10 22:28:31 crc kubenswrapper[4919]: E0310 
22:28:31.245510 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"508afb53daeca415f661e2c4f09f697a32270bdb9165dad884a7bc0e858301ec\": container with ID starting with 508afb53daeca415f661e2c4f09f697a32270bdb9165dad884a7bc0e858301ec not found: ID does not exist" containerID="508afb53daeca415f661e2c4f09f697a32270bdb9165dad884a7bc0e858301ec" Mar 10 22:28:31 crc kubenswrapper[4919]: I0310 22:28:31.245559 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"508afb53daeca415f661e2c4f09f697a32270bdb9165dad884a7bc0e858301ec"} err="failed to get container status \"508afb53daeca415f661e2c4f09f697a32270bdb9165dad884a7bc0e858301ec\": rpc error: code = NotFound desc = could not find container \"508afb53daeca415f661e2c4f09f697a32270bdb9165dad884a7bc0e858301ec\": container with ID starting with 508afb53daeca415f661e2c4f09f697a32270bdb9165dad884a7bc0e858301ec not found: ID does not exist" Mar 10 22:28:31 crc kubenswrapper[4919]: I0310 22:28:31.490186 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8af1404-7303-42c1-81d2-c98c6014addd" path="/var/lib/kubelet/pods/b8af1404-7303-42c1-81d2-c98c6014addd/volumes" Mar 10 22:28:34 crc kubenswrapper[4919]: I0310 22:28:34.480633 4919 scope.go:117] "RemoveContainer" containerID="6bf3588128e568c16cb871e80818127ef0aaa14fe7758988393e9de44935b23b" Mar 10 22:28:34 crc kubenswrapper[4919]: E0310 22:28:34.481508 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" Mar 10 22:28:43 crc kubenswrapper[4919]: I0310 22:28:43.760237 
4919 scope.go:117] "RemoveContainer" containerID="a9bd416f7458468cdb52e97ef6ae631a4dad2182192fd7cba36b15232278973d"
Mar 10 22:28:46 crc kubenswrapper[4919]: I0310 22:28:46.479962 4919 scope.go:117] "RemoveContainer" containerID="6bf3588128e568c16cb871e80818127ef0aaa14fe7758988393e9de44935b23b"
Mar 10 22:28:46 crc kubenswrapper[4919]: E0310 22:28:46.480832 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc"
Mar 10 22:28:58 crc kubenswrapper[4919]: I0310 22:28:58.479510 4919 scope.go:117] "RemoveContainer" containerID="6bf3588128e568c16cb871e80818127ef0aaa14fe7758988393e9de44935b23b"
Mar 10 22:28:58 crc kubenswrapper[4919]: E0310 22:28:58.480353 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc"
Mar 10 22:29:11 crc kubenswrapper[4919]: I0310 22:29:11.480058 4919 scope.go:117] "RemoveContainer" containerID="6bf3588128e568c16cb871e80818127ef0aaa14fe7758988393e9de44935b23b"
Mar 10 22:29:11 crc kubenswrapper[4919]: E0310 22:29:11.480771 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc"
Mar 10 22:29:25 crc kubenswrapper[4919]: I0310 22:29:25.479857 4919 scope.go:117] "RemoveContainer" containerID="6bf3588128e568c16cb871e80818127ef0aaa14fe7758988393e9de44935b23b"
Mar 10 22:29:25 crc kubenswrapper[4919]: E0310 22:29:25.480671 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc"
Mar 10 22:29:38 crc kubenswrapper[4919]: I0310 22:29:38.481568 4919 scope.go:117] "RemoveContainer" containerID="6bf3588128e568c16cb871e80818127ef0aaa14fe7758988393e9de44935b23b"
Mar 10 22:29:38 crc kubenswrapper[4919]: E0310 22:29:38.482639 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc"
Mar 10 22:29:51 crc kubenswrapper[4919]: I0310 22:29:51.479668 4919 scope.go:117] "RemoveContainer" containerID="6bf3588128e568c16cb871e80818127ef0aaa14fe7758988393e9de44935b23b"
Mar 10 22:29:51 crc kubenswrapper[4919]: E0310 22:29:51.480276 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc"
Mar 10 22:30:00 crc kubenswrapper[4919]: I0310 22:30:00.173065 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553030-hrnl5"]
Mar 10 22:30:00 crc kubenswrapper[4919]: E0310 22:30:00.174187 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8af1404-7303-42c1-81d2-c98c6014addd" containerName="registry-server"
Mar 10 22:30:00 crc kubenswrapper[4919]: I0310 22:30:00.174211 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8af1404-7303-42c1-81d2-c98c6014addd" containerName="registry-server"
Mar 10 22:30:00 crc kubenswrapper[4919]: E0310 22:30:00.174253 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8af1404-7303-42c1-81d2-c98c6014addd" containerName="extract-content"
Mar 10 22:30:00 crc kubenswrapper[4919]: I0310 22:30:00.174266 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8af1404-7303-42c1-81d2-c98c6014addd" containerName="extract-content"
Mar 10 22:30:00 crc kubenswrapper[4919]: E0310 22:30:00.174306 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8af1404-7303-42c1-81d2-c98c6014addd" containerName="extract-utilities"
Mar 10 22:30:00 crc kubenswrapper[4919]: I0310 22:30:00.174322 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8af1404-7303-42c1-81d2-c98c6014addd" containerName="extract-utilities"
Mar 10 22:30:00 crc kubenswrapper[4919]: I0310 22:30:00.174698 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8af1404-7303-42c1-81d2-c98c6014addd" containerName="registry-server"
Mar 10 22:30:00 crc kubenswrapper[4919]: I0310 22:30:00.175594 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553030-hrnl5"
Mar 10 22:30:00 crc kubenswrapper[4919]: I0310 22:30:00.179451 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 10 22:30:00 crc kubenswrapper[4919]: I0310 22:30:00.179476 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wbvtv"
Mar 10 22:30:00 crc kubenswrapper[4919]: I0310 22:30:00.180082 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 10 22:30:00 crc kubenswrapper[4919]: I0310 22:30:00.193478 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29553030-z5tsh"]
Mar 10 22:30:00 crc kubenswrapper[4919]: I0310 22:30:00.195328 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29553030-z5tsh"
Mar 10 22:30:00 crc kubenswrapper[4919]: I0310 22:30:00.198872 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553030-hrnl5"]
Mar 10 22:30:00 crc kubenswrapper[4919]: I0310 22:30:00.231618 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Mar 10 22:30:00 crc kubenswrapper[4919]: I0310 22:30:00.247650 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Mar 10 22:30:00 crc kubenswrapper[4919]: I0310 22:30:00.260880 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29553030-z5tsh"]
Mar 10 22:30:00 crc kubenswrapper[4919]: I0310 22:30:00.375479 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h64mb\" (UniqueName: \"kubernetes.io/projected/dd0768ea-0564-4f66-8103-00f1652bab8e-kube-api-access-h64mb\") pod \"collect-profiles-29553030-z5tsh\" (UID: \"dd0768ea-0564-4f66-8103-00f1652bab8e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553030-z5tsh"
Mar 10 22:30:00 crc kubenswrapper[4919]: I0310 22:30:00.375787 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-727mn\" (UniqueName: \"kubernetes.io/projected/12b7bc36-4ac2-4055-b132-76116239f777-kube-api-access-727mn\") pod \"auto-csr-approver-29553030-hrnl5\" (UID: \"12b7bc36-4ac2-4055-b132-76116239f777\") " pod="openshift-infra/auto-csr-approver-29553030-hrnl5"
Mar 10 22:30:00 crc kubenswrapper[4919]: I0310 22:30:00.375836 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/dd0768ea-0564-4f66-8103-00f1652bab8e-secret-volume\") pod \"collect-profiles-29553030-z5tsh\" (UID: \"dd0768ea-0564-4f66-8103-00f1652bab8e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553030-z5tsh"
Mar 10 22:30:00 crc kubenswrapper[4919]: I0310 22:30:00.375859 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dd0768ea-0564-4f66-8103-00f1652bab8e-config-volume\") pod \"collect-profiles-29553030-z5tsh\" (UID: \"dd0768ea-0564-4f66-8103-00f1652bab8e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553030-z5tsh"
Mar 10 22:30:00 crc kubenswrapper[4919]: I0310 22:30:00.477365 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h64mb\" (UniqueName: \"kubernetes.io/projected/dd0768ea-0564-4f66-8103-00f1652bab8e-kube-api-access-h64mb\") pod \"collect-profiles-29553030-z5tsh\" (UID: \"dd0768ea-0564-4f66-8103-00f1652bab8e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553030-z5tsh"
Mar 10 22:30:00 crc kubenswrapper[4919]: I0310 22:30:00.477437 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-727mn\" (UniqueName: \"kubernetes.io/projected/12b7bc36-4ac2-4055-b132-76116239f777-kube-api-access-727mn\") pod \"auto-csr-approver-29553030-hrnl5\" (UID: \"12b7bc36-4ac2-4055-b132-76116239f777\") " pod="openshift-infra/auto-csr-approver-29553030-hrnl5"
Mar 10 22:30:00 crc kubenswrapper[4919]: I0310 22:30:00.477481 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/dd0768ea-0564-4f66-8103-00f1652bab8e-secret-volume\") pod \"collect-profiles-29553030-z5tsh\" (UID: \"dd0768ea-0564-4f66-8103-00f1652bab8e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553030-z5tsh"
Mar 10 22:30:00 crc kubenswrapper[4919]: I0310 22:30:00.477505 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dd0768ea-0564-4f66-8103-00f1652bab8e-config-volume\") pod \"collect-profiles-29553030-z5tsh\" (UID: \"dd0768ea-0564-4f66-8103-00f1652bab8e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553030-z5tsh"
Mar 10 22:30:00 crc kubenswrapper[4919]: I0310 22:30:00.478337 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dd0768ea-0564-4f66-8103-00f1652bab8e-config-volume\") pod \"collect-profiles-29553030-z5tsh\" (UID: \"dd0768ea-0564-4f66-8103-00f1652bab8e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553030-z5tsh"
Mar 10 22:30:00 crc kubenswrapper[4919]: I0310 22:30:00.487721 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/dd0768ea-0564-4f66-8103-00f1652bab8e-secret-volume\") pod \"collect-profiles-29553030-z5tsh\" (UID: \"dd0768ea-0564-4f66-8103-00f1652bab8e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553030-z5tsh"
Mar 10 22:30:00 crc kubenswrapper[4919]: I0310 22:30:00.505165 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h64mb\" (UniqueName: \"kubernetes.io/projected/dd0768ea-0564-4f66-8103-00f1652bab8e-kube-api-access-h64mb\") pod \"collect-profiles-29553030-z5tsh\" (UID: \"dd0768ea-0564-4f66-8103-00f1652bab8e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553030-z5tsh"
Mar 10 22:30:00 crc kubenswrapper[4919]: I0310 22:30:00.509240 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-727mn\" (UniqueName: \"kubernetes.io/projected/12b7bc36-4ac2-4055-b132-76116239f777-kube-api-access-727mn\") pod \"auto-csr-approver-29553030-hrnl5\" (UID: \"12b7bc36-4ac2-4055-b132-76116239f777\") " pod="openshift-infra/auto-csr-approver-29553030-hrnl5"
Mar 10 22:30:00 crc kubenswrapper[4919]: I0310 22:30:00.550256 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553030-hrnl5"
Mar 10 22:30:00 crc kubenswrapper[4919]: I0310 22:30:00.570737 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29553030-z5tsh"
Mar 10 22:30:00 crc kubenswrapper[4919]: I0310 22:30:00.967720 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553030-hrnl5"]
Mar 10 22:30:01 crc kubenswrapper[4919]: I0310 22:30:01.030104 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29553030-z5tsh"]
Mar 10 22:30:01 crc kubenswrapper[4919]: W0310 22:30:01.035194 4919 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddd0768ea_0564_4f66_8103_00f1652bab8e.slice/crio-5965459832df75b2c13c96b210ef145acc5529d7464bf6deb5c8879ae7111f24 WatchSource:0}: Error finding container 5965459832df75b2c13c96b210ef145acc5529d7464bf6deb5c8879ae7111f24: Status 404 returned error can't find the container with id 5965459832df75b2c13c96b210ef145acc5529d7464bf6deb5c8879ae7111f24
Mar 10 22:30:01 crc kubenswrapper[4919]: I0310 22:30:01.863593 4919 generic.go:334] "Generic (PLEG): container finished" podID="dd0768ea-0564-4f66-8103-00f1652bab8e" containerID="6dc61187129bc29ae275b3b18ecf2eb7ebc2a48a7b100292f01257a83ba12918" exitCode=0
Mar 10 22:30:01 crc kubenswrapper[4919]: I0310 22:30:01.863695 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29553030-z5tsh" event={"ID":"dd0768ea-0564-4f66-8103-00f1652bab8e","Type":"ContainerDied","Data":"6dc61187129bc29ae275b3b18ecf2eb7ebc2a48a7b100292f01257a83ba12918"}
Mar 10 22:30:01 crc kubenswrapper[4919]: I0310 22:30:01.863765 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29553030-z5tsh" event={"ID":"dd0768ea-0564-4f66-8103-00f1652bab8e","Type":"ContainerStarted","Data":"5965459832df75b2c13c96b210ef145acc5529d7464bf6deb5c8879ae7111f24"}
Mar 10 22:30:01 crc kubenswrapper[4919]: I0310 22:30:01.864768 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553030-hrnl5" event={"ID":"12b7bc36-4ac2-4055-b132-76116239f777","Type":"ContainerStarted","Data":"b0759c11d2b8f68e2c4be42ce72edd51ec7c27dc073df80d2ea8d99726db3e04"}
Mar 10 22:30:02 crc kubenswrapper[4919]: I0310 22:30:02.873358 4919 generic.go:334] "Generic (PLEG): container finished" podID="12b7bc36-4ac2-4055-b132-76116239f777" containerID="deeb0bd5600e73d2a8e2ad09cbdd5543d4fe9ce0f7a7b45ff032d282e72b755b" exitCode=0
Mar 10 22:30:02 crc kubenswrapper[4919]: I0310 22:30:02.873437 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553030-hrnl5" event={"ID":"12b7bc36-4ac2-4055-b132-76116239f777","Type":"ContainerDied","Data":"deeb0bd5600e73d2a8e2ad09cbdd5543d4fe9ce0f7a7b45ff032d282e72b755b"}
Mar 10 22:30:03 crc kubenswrapper[4919]: I0310 22:30:03.163434 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29553030-z5tsh"
Mar 10 22:30:03 crc kubenswrapper[4919]: I0310 22:30:03.316866 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dd0768ea-0564-4f66-8103-00f1652bab8e-config-volume\") pod \"dd0768ea-0564-4f66-8103-00f1652bab8e\" (UID: \"dd0768ea-0564-4f66-8103-00f1652bab8e\") "
Mar 10 22:30:03 crc kubenswrapper[4919]: I0310 22:30:03.316996 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h64mb\" (UniqueName: \"kubernetes.io/projected/dd0768ea-0564-4f66-8103-00f1652bab8e-kube-api-access-h64mb\") pod \"dd0768ea-0564-4f66-8103-00f1652bab8e\" (UID: \"dd0768ea-0564-4f66-8103-00f1652bab8e\") "
Mar 10 22:30:03 crc kubenswrapper[4919]: I0310 22:30:03.317109 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/dd0768ea-0564-4f66-8103-00f1652bab8e-secret-volume\") pod \"dd0768ea-0564-4f66-8103-00f1652bab8e\" (UID: \"dd0768ea-0564-4f66-8103-00f1652bab8e\") "
Mar 10 22:30:03 crc kubenswrapper[4919]: I0310 22:30:03.318962 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd0768ea-0564-4f66-8103-00f1652bab8e-config-volume" (OuterVolumeSpecName: "config-volume") pod "dd0768ea-0564-4f66-8103-00f1652bab8e" (UID: "dd0768ea-0564-4f66-8103-00f1652bab8e"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 22:30:03 crc kubenswrapper[4919]: I0310 22:30:03.322814 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd0768ea-0564-4f66-8103-00f1652bab8e-kube-api-access-h64mb" (OuterVolumeSpecName: "kube-api-access-h64mb") pod "dd0768ea-0564-4f66-8103-00f1652bab8e" (UID: "dd0768ea-0564-4f66-8103-00f1652bab8e"). InnerVolumeSpecName "kube-api-access-h64mb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 22:30:03 crc kubenswrapper[4919]: I0310 22:30:03.323314 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd0768ea-0564-4f66-8103-00f1652bab8e-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "dd0768ea-0564-4f66-8103-00f1652bab8e" (UID: "dd0768ea-0564-4f66-8103-00f1652bab8e"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 22:30:03 crc kubenswrapper[4919]: I0310 22:30:03.419010 4919 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dd0768ea-0564-4f66-8103-00f1652bab8e-config-volume\") on node \"crc\" DevicePath \"\""
Mar 10 22:30:03 crc kubenswrapper[4919]: I0310 22:30:03.419051 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h64mb\" (UniqueName: \"kubernetes.io/projected/dd0768ea-0564-4f66-8103-00f1652bab8e-kube-api-access-h64mb\") on node \"crc\" DevicePath \"\""
Mar 10 22:30:03 crc kubenswrapper[4919]: I0310 22:30:03.419090 4919 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/dd0768ea-0564-4f66-8103-00f1652bab8e-secret-volume\") on node \"crc\" DevicePath \"\""
Mar 10 22:30:03 crc kubenswrapper[4919]: I0310 22:30:03.882449 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29553030-z5tsh"
Mar 10 22:30:03 crc kubenswrapper[4919]: I0310 22:30:03.882437 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29553030-z5tsh" event={"ID":"dd0768ea-0564-4f66-8103-00f1652bab8e","Type":"ContainerDied","Data":"5965459832df75b2c13c96b210ef145acc5529d7464bf6deb5c8879ae7111f24"}
Mar 10 22:30:03 crc kubenswrapper[4919]: I0310 22:30:03.882521 4919 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5965459832df75b2c13c96b210ef145acc5529d7464bf6deb5c8879ae7111f24"
Mar 10 22:30:04 crc kubenswrapper[4919]: I0310 22:30:04.212406 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553030-hrnl5"
Mar 10 22:30:04 crc kubenswrapper[4919]: I0310 22:30:04.249599 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552985-n8r4j"]
Mar 10 22:30:04 crc kubenswrapper[4919]: I0310 22:30:04.255569 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552985-n8r4j"]
Mar 10 22:30:04 crc kubenswrapper[4919]: I0310 22:30:04.341240 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-727mn\" (UniqueName: \"kubernetes.io/projected/12b7bc36-4ac2-4055-b132-76116239f777-kube-api-access-727mn\") pod \"12b7bc36-4ac2-4055-b132-76116239f777\" (UID: \"12b7bc36-4ac2-4055-b132-76116239f777\") "
Mar 10 22:30:04 crc kubenswrapper[4919]: I0310 22:30:04.346787 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12b7bc36-4ac2-4055-b132-76116239f777-kube-api-access-727mn" (OuterVolumeSpecName: "kube-api-access-727mn") pod "12b7bc36-4ac2-4055-b132-76116239f777" (UID: "12b7bc36-4ac2-4055-b132-76116239f777"). InnerVolumeSpecName "kube-api-access-727mn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 22:30:04 crc kubenswrapper[4919]: I0310 22:30:04.442993 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-727mn\" (UniqueName: \"kubernetes.io/projected/12b7bc36-4ac2-4055-b132-76116239f777-kube-api-access-727mn\") on node \"crc\" DevicePath \"\""
Mar 10 22:30:04 crc kubenswrapper[4919]: I0310 22:30:04.900316 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553030-hrnl5" event={"ID":"12b7bc36-4ac2-4055-b132-76116239f777","Type":"ContainerDied","Data":"b0759c11d2b8f68e2c4be42ce72edd51ec7c27dc073df80d2ea8d99726db3e04"}
Mar 10 22:30:04 crc kubenswrapper[4919]: I0310 22:30:04.900423 4919 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b0759c11d2b8f68e2c4be42ce72edd51ec7c27dc073df80d2ea8d99726db3e04"
Mar 10 22:30:04 crc kubenswrapper[4919]: I0310 22:30:04.900429 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553030-hrnl5"
Mar 10 22:30:05 crc kubenswrapper[4919]: I0310 22:30:05.260017 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553024-l2tsb"]
Mar 10 22:30:05 crc kubenswrapper[4919]: I0310 22:30:05.265986 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553024-l2tsb"]
Mar 10 22:30:05 crc kubenswrapper[4919]: I0310 22:30:05.480224 4919 scope.go:117] "RemoveContainer" containerID="6bf3588128e568c16cb871e80818127ef0aaa14fe7758988393e9de44935b23b"
Mar 10 22:30:05 crc kubenswrapper[4919]: E0310 22:30:05.480489 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc"
Mar 10 22:30:05 crc kubenswrapper[4919]: I0310 22:30:05.488779 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0840e0fd-6127-44e1-a420-c0a9107ed81a" path="/var/lib/kubelet/pods/0840e0fd-6127-44e1-a420-c0a9107ed81a/volumes"
Mar 10 22:30:05 crc kubenswrapper[4919]: I0310 22:30:05.489637 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55ff2223-69b6-4b72-9413-fce0c37ae2b2" path="/var/lib/kubelet/pods/55ff2223-69b6-4b72-9413-fce0c37ae2b2/volumes"
Mar 10 22:30:18 crc kubenswrapper[4919]: I0310 22:30:18.479864 4919 scope.go:117] "RemoveContainer" containerID="6bf3588128e568c16cb871e80818127ef0aaa14fe7758988393e9de44935b23b"
Mar 10 22:30:18 crc kubenswrapper[4919]: E0310 22:30:18.480783 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc"
Mar 10 22:30:29 crc kubenswrapper[4919]: I0310 22:30:29.480742 4919 scope.go:117] "RemoveContainer" containerID="6bf3588128e568c16cb871e80818127ef0aaa14fe7758988393e9de44935b23b"
Mar 10 22:30:29 crc kubenswrapper[4919]: E0310 22:30:29.481601 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc"
Mar 10 22:30:42 crc kubenswrapper[4919]: I0310 22:30:42.480048 4919 scope.go:117] "RemoveContainer" containerID="6bf3588128e568c16cb871e80818127ef0aaa14fe7758988393e9de44935b23b"
Mar 10 22:30:42 crc kubenswrapper[4919]: E0310 22:30:42.481080 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc"
Mar 10 22:30:43 crc kubenswrapper[4919]: I0310 22:30:43.860234 4919 scope.go:117] "RemoveContainer" containerID="250be72ff0c58f3c17e5f2b0b03e8d7d9c00b768fb62e47d91f28752ab449b81"
Mar 10 22:30:43 crc kubenswrapper[4919]: I0310 22:30:43.910977 4919 scope.go:117] "RemoveContainer" containerID="ff6c5d94828d4986d3c2921507606c68566c8b8f1db53d6e9faed67db575e663"
Mar 10 22:30:56 crc kubenswrapper[4919]: I0310 22:30:56.480990 4919 scope.go:117] "RemoveContainer" containerID="6bf3588128e568c16cb871e80818127ef0aaa14fe7758988393e9de44935b23b"
Mar 10 22:30:56 crc kubenswrapper[4919]: E0310 22:30:56.482072 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc"
Mar 10 22:31:11 crc kubenswrapper[4919]: I0310 22:31:11.480256 4919 scope.go:117] "RemoveContainer" containerID="6bf3588128e568c16cb871e80818127ef0aaa14fe7758988393e9de44935b23b"
Mar 10 22:31:11 crc kubenswrapper[4919]: E0310 22:31:11.480974 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc"
Mar 10 22:31:24 crc kubenswrapper[4919]: I0310 22:31:24.480204 4919 scope.go:117] "RemoveContainer" containerID="6bf3588128e568c16cb871e80818127ef0aaa14fe7758988393e9de44935b23b"
Mar 10 22:31:24 crc kubenswrapper[4919]: E0310 22:31:24.481087 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc"
Mar 10 22:31:39 crc kubenswrapper[4919]: I0310 22:31:39.480332 4919 scope.go:117] "RemoveContainer" containerID="6bf3588128e568c16cb871e80818127ef0aaa14fe7758988393e9de44935b23b"
Mar 10 22:31:39 crc kubenswrapper[4919]: E0310 22:31:39.481558 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc"
Mar 10 22:31:52 crc kubenswrapper[4919]: I0310 22:31:52.479565 4919 scope.go:117] "RemoveContainer" containerID="6bf3588128e568c16cb871e80818127ef0aaa14fe7758988393e9de44935b23b"
Mar 10 22:31:52 crc kubenswrapper[4919]: E0310 22:31:52.480289 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc"
Mar 10 22:32:00 crc kubenswrapper[4919]: I0310 22:32:00.160668 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553032-7mzkx"]
Mar 10 22:32:00 crc kubenswrapper[4919]: E0310 22:32:00.163876 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12b7bc36-4ac2-4055-b132-76116239f777" containerName="oc"
Mar 10 22:32:00 crc kubenswrapper[4919]: I0310 22:32:00.163911 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="12b7bc36-4ac2-4055-b132-76116239f777" containerName="oc"
Mar 10 22:32:00 crc kubenswrapper[4919]: E0310 22:32:00.163928 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd0768ea-0564-4f66-8103-00f1652bab8e" containerName="collect-profiles"
Mar 10 22:32:00 crc kubenswrapper[4919]: I0310 22:32:00.163939 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd0768ea-0564-4f66-8103-00f1652bab8e" containerName="collect-profiles"
Mar 10 22:32:00 crc kubenswrapper[4919]: I0310 22:32:00.164153 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd0768ea-0564-4f66-8103-00f1652bab8e" containerName="collect-profiles"
Mar 10 22:32:00 crc kubenswrapper[4919]: I0310 22:32:00.164174 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="12b7bc36-4ac2-4055-b132-76116239f777" containerName="oc"
Mar 10 22:32:00 crc kubenswrapper[4919]: I0310 22:32:00.164940 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553032-7mzkx"
Mar 10 22:32:00 crc kubenswrapper[4919]: I0310 22:32:00.168167 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 10 22:32:00 crc kubenswrapper[4919]: I0310 22:32:00.168363 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wbvtv"
Mar 10 22:32:00 crc kubenswrapper[4919]: I0310 22:32:00.169088 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 10 22:32:00 crc kubenswrapper[4919]: I0310 22:32:00.185358 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553032-7mzkx"]
Mar 10 22:32:00 crc kubenswrapper[4919]: I0310 22:32:00.267512 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lgq8b\" (UniqueName: \"kubernetes.io/projected/c17effc3-e48e-44e1-8748-5984dec80b50-kube-api-access-lgq8b\") pod \"auto-csr-approver-29553032-7mzkx\" (UID: \"c17effc3-e48e-44e1-8748-5984dec80b50\") " pod="openshift-infra/auto-csr-approver-29553032-7mzkx"
Mar 10 22:32:00 crc kubenswrapper[4919]: I0310 22:32:00.369295 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lgq8b\" (UniqueName: \"kubernetes.io/projected/c17effc3-e48e-44e1-8748-5984dec80b50-kube-api-access-lgq8b\") pod \"auto-csr-approver-29553032-7mzkx\" (UID: \"c17effc3-e48e-44e1-8748-5984dec80b50\") " pod="openshift-infra/auto-csr-approver-29553032-7mzkx"
Mar 10 22:32:00 crc kubenswrapper[4919]: I0310 22:32:00.396248 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lgq8b\" (UniqueName: \"kubernetes.io/projected/c17effc3-e48e-44e1-8748-5984dec80b50-kube-api-access-lgq8b\") pod \"auto-csr-approver-29553032-7mzkx\" (UID: \"c17effc3-e48e-44e1-8748-5984dec80b50\") " pod="openshift-infra/auto-csr-approver-29553032-7mzkx"
Mar 10 22:32:00 crc kubenswrapper[4919]: I0310 22:32:00.501650 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553032-7mzkx"
Mar 10 22:32:00 crc kubenswrapper[4919]: I0310 22:32:00.976086 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553032-7mzkx"]
Mar 10 22:32:01 crc kubenswrapper[4919]: I0310 22:32:01.884450 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553032-7mzkx" event={"ID":"c17effc3-e48e-44e1-8748-5984dec80b50","Type":"ContainerStarted","Data":"3716df4989d713dd17c11c45afb0bde50f3ff52f9b519549570024b772ad2dd2"}
Mar 10 22:32:02 crc kubenswrapper[4919]: I0310 22:32:02.891803 4919 generic.go:334] "Generic (PLEG): container finished" podID="c17effc3-e48e-44e1-8748-5984dec80b50" containerID="22e7709b9ca95d99df8dc1a6ba4aa234214cb52be1045044ed044568d0b38cfd" exitCode=0
Mar 10 22:32:02 crc kubenswrapper[4919]: I0310 22:32:02.891904 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553032-7mzkx" event={"ID":"c17effc3-e48e-44e1-8748-5984dec80b50","Type":"ContainerDied","Data":"22e7709b9ca95d99df8dc1a6ba4aa234214cb52be1045044ed044568d0b38cfd"}
Mar 10 22:32:04 crc kubenswrapper[4919]: I0310 22:32:04.184770 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553032-7mzkx"
Mar 10 22:32:04 crc kubenswrapper[4919]: I0310 22:32:04.325571 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lgq8b\" (UniqueName: \"kubernetes.io/projected/c17effc3-e48e-44e1-8748-5984dec80b50-kube-api-access-lgq8b\") pod \"c17effc3-e48e-44e1-8748-5984dec80b50\" (UID: \"c17effc3-e48e-44e1-8748-5984dec80b50\") "
Mar 10 22:32:04 crc kubenswrapper[4919]: I0310 22:32:04.336647 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c17effc3-e48e-44e1-8748-5984dec80b50-kube-api-access-lgq8b" (OuterVolumeSpecName: "kube-api-access-lgq8b") pod "c17effc3-e48e-44e1-8748-5984dec80b50" (UID: "c17effc3-e48e-44e1-8748-5984dec80b50"). InnerVolumeSpecName "kube-api-access-lgq8b". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 22:32:04 crc kubenswrapper[4919]: I0310 22:32:04.427144 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lgq8b\" (UniqueName: \"kubernetes.io/projected/c17effc3-e48e-44e1-8748-5984dec80b50-kube-api-access-lgq8b\") on node \"crc\" DevicePath \"\""
Mar 10 22:32:04 crc kubenswrapper[4919]: I0310 22:32:04.909170 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553032-7mzkx" event={"ID":"c17effc3-e48e-44e1-8748-5984dec80b50","Type":"ContainerDied","Data":"3716df4989d713dd17c11c45afb0bde50f3ff52f9b519549570024b772ad2dd2"}
Mar 10 22:32:04 crc kubenswrapper[4919]: I0310 22:32:04.909205 4919 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3716df4989d713dd17c11c45afb0bde50f3ff52f9b519549570024b772ad2dd2"
Mar 10 22:32:04 crc kubenswrapper[4919]: I0310 22:32:04.909263 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553032-7mzkx"
Mar 10 22:32:05 crc kubenswrapper[4919]: I0310 22:32:05.252139 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553026-w49lv"]
Mar 10 22:32:05 crc kubenswrapper[4919]: I0310 22:32:05.257719 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553026-w49lv"]
Mar 10 22:32:05 crc kubenswrapper[4919]: I0310 22:32:05.488967 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="071fb005-dfb2-4126-930a-483536734b4d" path="/var/lib/kubelet/pods/071fb005-dfb2-4126-930a-483536734b4d/volumes"
Mar 10 22:32:06 crc kubenswrapper[4919]: I0310 22:32:06.480595 4919 scope.go:117] "RemoveContainer" containerID="6bf3588128e568c16cb871e80818127ef0aaa14fe7758988393e9de44935b23b"
Mar 10 22:32:06 crc kubenswrapper[4919]: E0310 22:32:06.481037 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc"
Mar 10 22:32:20 crc kubenswrapper[4919]: I0310 22:32:20.480308 4919 scope.go:117] "RemoveContainer" containerID="6bf3588128e568c16cb871e80818127ef0aaa14fe7758988393e9de44935b23b"
Mar 10 22:32:20 crc kubenswrapper[4919]: E0310 22:32:20.481081 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc"
Mar 10 22:32:32 crc kubenswrapper[4919]: I0310 22:32:32.480170 4919 scope.go:117] "RemoveContainer" containerID="6bf3588128e568c16cb871e80818127ef0aaa14fe7758988393e9de44935b23b"
Mar 10 22:32:33 crc kubenswrapper[4919]: I0310 22:32:33.143556 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" event={"ID":"566678d1-f416-4116-ab20-b30dceb86cdc","Type":"ContainerStarted","Data":"9094d88f1820c509ccc23fbba2a6de71b36165903755515eede39eff8e281d36"}
Mar 10 22:32:43 crc kubenswrapper[4919]: I0310 22:32:43.996544 4919 scope.go:117] "RemoveContainer" containerID="b457d73d0c8fb90b5df667a6165bfeef8a9e25f1501b5e40b64f4ccf32992080"
Mar 10 22:34:00 crc kubenswrapper[4919]: I0310 22:34:00.172319 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553034-l6slc"]
Mar 10 22:34:00 crc kubenswrapper[4919]: E0310 22:34:00.175646 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c17effc3-e48e-44e1-8748-5984dec80b50" containerName="oc"
Mar 10 22:34:00 crc kubenswrapper[4919]: I0310 22:34:00.175684 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="c17effc3-e48e-44e1-8748-5984dec80b50" containerName="oc"
Mar 10 22:34:00 crc kubenswrapper[4919]: I0310 22:34:00.175945 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="c17effc3-e48e-44e1-8748-5984dec80b50" containerName="oc"
Mar 10 22:34:00 crc kubenswrapper[4919]: I0310 22:34:00.176856 4919 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553034-l6slc" Mar 10 22:34:00 crc kubenswrapper[4919]: I0310 22:34:00.180844 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 22:34:00 crc kubenswrapper[4919]: I0310 22:34:00.180941 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 22:34:00 crc kubenswrapper[4919]: I0310 22:34:00.181183 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wbvtv" Mar 10 22:34:00 crc kubenswrapper[4919]: I0310 22:34:00.184296 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553034-l6slc"] Mar 10 22:34:00 crc kubenswrapper[4919]: I0310 22:34:00.193777 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hg2dc\" (UniqueName: \"kubernetes.io/projected/e2821ca8-8e4c-408e-a6ab-81206c355afb-kube-api-access-hg2dc\") pod \"auto-csr-approver-29553034-l6slc\" (UID: \"e2821ca8-8e4c-408e-a6ab-81206c355afb\") " pod="openshift-infra/auto-csr-approver-29553034-l6slc" Mar 10 22:34:00 crc kubenswrapper[4919]: I0310 22:34:00.295196 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hg2dc\" (UniqueName: \"kubernetes.io/projected/e2821ca8-8e4c-408e-a6ab-81206c355afb-kube-api-access-hg2dc\") pod \"auto-csr-approver-29553034-l6slc\" (UID: \"e2821ca8-8e4c-408e-a6ab-81206c355afb\") " pod="openshift-infra/auto-csr-approver-29553034-l6slc" Mar 10 22:34:00 crc kubenswrapper[4919]: I0310 22:34:00.328798 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hg2dc\" (UniqueName: \"kubernetes.io/projected/e2821ca8-8e4c-408e-a6ab-81206c355afb-kube-api-access-hg2dc\") pod \"auto-csr-approver-29553034-l6slc\" (UID: \"e2821ca8-8e4c-408e-a6ab-81206c355afb\") " 
pod="openshift-infra/auto-csr-approver-29553034-l6slc" Mar 10 22:34:00 crc kubenswrapper[4919]: I0310 22:34:00.504293 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553034-l6slc" Mar 10 22:34:00 crc kubenswrapper[4919]: I0310 22:34:00.986927 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553034-l6slc"] Mar 10 22:34:00 crc kubenswrapper[4919]: W0310 22:34:00.993678 4919 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode2821ca8_8e4c_408e_a6ab_81206c355afb.slice/crio-9f9cb4804914d1a4cd51baa568568922b0e750964b0f783d5ffdd510c3672a88 WatchSource:0}: Error finding container 9f9cb4804914d1a4cd51baa568568922b0e750964b0f783d5ffdd510c3672a88: Status 404 returned error can't find the container with id 9f9cb4804914d1a4cd51baa568568922b0e750964b0f783d5ffdd510c3672a88 Mar 10 22:34:00 crc kubenswrapper[4919]: I0310 22:34:00.999572 4919 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 10 22:34:01 crc kubenswrapper[4919]: I0310 22:34:01.943485 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553034-l6slc" event={"ID":"e2821ca8-8e4c-408e-a6ab-81206c355afb","Type":"ContainerStarted","Data":"9f9cb4804914d1a4cd51baa568568922b0e750964b0f783d5ffdd510c3672a88"} Mar 10 22:34:02 crc kubenswrapper[4919]: I0310 22:34:02.950769 4919 generic.go:334] "Generic (PLEG): container finished" podID="e2821ca8-8e4c-408e-a6ab-81206c355afb" containerID="67064bf0f66214658f8b8f245f2943060e5402c2e43126363b040cfce267001e" exitCode=0 Mar 10 22:34:02 crc kubenswrapper[4919]: I0310 22:34:02.950825 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553034-l6slc" 
event={"ID":"e2821ca8-8e4c-408e-a6ab-81206c355afb","Type":"ContainerDied","Data":"67064bf0f66214658f8b8f245f2943060e5402c2e43126363b040cfce267001e"} Mar 10 22:34:04 crc kubenswrapper[4919]: I0310 22:34:04.239368 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553034-l6slc" Mar 10 22:34:04 crc kubenswrapper[4919]: I0310 22:34:04.248774 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hg2dc\" (UniqueName: \"kubernetes.io/projected/e2821ca8-8e4c-408e-a6ab-81206c355afb-kube-api-access-hg2dc\") pod \"e2821ca8-8e4c-408e-a6ab-81206c355afb\" (UID: \"e2821ca8-8e4c-408e-a6ab-81206c355afb\") " Mar 10 22:34:04 crc kubenswrapper[4919]: I0310 22:34:04.259369 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2821ca8-8e4c-408e-a6ab-81206c355afb-kube-api-access-hg2dc" (OuterVolumeSpecName: "kube-api-access-hg2dc") pod "e2821ca8-8e4c-408e-a6ab-81206c355afb" (UID: "e2821ca8-8e4c-408e-a6ab-81206c355afb"). InnerVolumeSpecName "kube-api-access-hg2dc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 22:34:04 crc kubenswrapper[4919]: I0310 22:34:04.350852 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hg2dc\" (UniqueName: \"kubernetes.io/projected/e2821ca8-8e4c-408e-a6ab-81206c355afb-kube-api-access-hg2dc\") on node \"crc\" DevicePath \"\"" Mar 10 22:34:04 crc kubenswrapper[4919]: I0310 22:34:04.970212 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553034-l6slc" event={"ID":"e2821ca8-8e4c-408e-a6ab-81206c355afb","Type":"ContainerDied","Data":"9f9cb4804914d1a4cd51baa568568922b0e750964b0f783d5ffdd510c3672a88"} Mar 10 22:34:04 crc kubenswrapper[4919]: I0310 22:34:04.970254 4919 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9f9cb4804914d1a4cd51baa568568922b0e750964b0f783d5ffdd510c3672a88" Mar 10 22:34:04 crc kubenswrapper[4919]: I0310 22:34:04.970273 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553034-l6slc" Mar 10 22:34:05 crc kubenswrapper[4919]: I0310 22:34:05.327459 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553028-c8cbw"] Mar 10 22:34:05 crc kubenswrapper[4919]: I0310 22:34:05.338972 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553028-c8cbw"] Mar 10 22:34:05 crc kubenswrapper[4919]: I0310 22:34:05.488539 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="437d2054-282f-4479-9d27-cb2e855e0a0b" path="/var/lib/kubelet/pods/437d2054-282f-4479-9d27-cb2e855e0a0b/volumes" Mar 10 22:34:44 crc kubenswrapper[4919]: I0310 22:34:44.090670 4919 scope.go:117] "RemoveContainer" containerID="e107ebb513fc3028216e4766060a3facd4b18f5c500d45945c0eefef1126aaa0" Mar 10 22:34:59 crc kubenswrapper[4919]: I0310 22:34:59.175975 4919 patch_prober.go:28] interesting pod/machine-config-daemon-z7v4t 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 22:34:59 crc kubenswrapper[4919]: I0310 22:34:59.176570 4919 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 22:35:29 crc kubenswrapper[4919]: I0310 22:35:29.175780 4919 patch_prober.go:28] interesting pod/machine-config-daemon-z7v4t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 22:35:29 crc kubenswrapper[4919]: I0310 22:35:29.176232 4919 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 22:35:47 crc kubenswrapper[4919]: I0310 22:35:47.853833 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-27gvj"] Mar 10 22:35:47 crc kubenswrapper[4919]: E0310 22:35:47.855297 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2821ca8-8e4c-408e-a6ab-81206c355afb" containerName="oc" Mar 10 22:35:47 crc kubenswrapper[4919]: I0310 22:35:47.855336 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2821ca8-8e4c-408e-a6ab-81206c355afb" containerName="oc" Mar 10 22:35:47 crc kubenswrapper[4919]: I0310 22:35:47.855735 4919 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="e2821ca8-8e4c-408e-a6ab-81206c355afb" containerName="oc" Mar 10 22:35:47 crc kubenswrapper[4919]: I0310 22:35:47.861196 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-27gvj" Mar 10 22:35:47 crc kubenswrapper[4919]: I0310 22:35:47.881784 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-27gvj"] Mar 10 22:35:47 crc kubenswrapper[4919]: I0310 22:35:47.980835 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2fb660a7-e8d0-40d7-82b0-57a0aeb8bc42-utilities\") pod \"redhat-operators-27gvj\" (UID: \"2fb660a7-e8d0-40d7-82b0-57a0aeb8bc42\") " pod="openshift-marketplace/redhat-operators-27gvj" Mar 10 22:35:47 crc kubenswrapper[4919]: I0310 22:35:47.980891 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tt259\" (UniqueName: \"kubernetes.io/projected/2fb660a7-e8d0-40d7-82b0-57a0aeb8bc42-kube-api-access-tt259\") pod \"redhat-operators-27gvj\" (UID: \"2fb660a7-e8d0-40d7-82b0-57a0aeb8bc42\") " pod="openshift-marketplace/redhat-operators-27gvj" Mar 10 22:35:47 crc kubenswrapper[4919]: I0310 22:35:47.980916 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2fb660a7-e8d0-40d7-82b0-57a0aeb8bc42-catalog-content\") pod \"redhat-operators-27gvj\" (UID: \"2fb660a7-e8d0-40d7-82b0-57a0aeb8bc42\") " pod="openshift-marketplace/redhat-operators-27gvj" Mar 10 22:35:48 crc kubenswrapper[4919]: I0310 22:35:48.082024 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2fb660a7-e8d0-40d7-82b0-57a0aeb8bc42-utilities\") pod \"redhat-operators-27gvj\" (UID: 
\"2fb660a7-e8d0-40d7-82b0-57a0aeb8bc42\") " pod="openshift-marketplace/redhat-operators-27gvj" Mar 10 22:35:48 crc kubenswrapper[4919]: I0310 22:35:48.082291 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tt259\" (UniqueName: \"kubernetes.io/projected/2fb660a7-e8d0-40d7-82b0-57a0aeb8bc42-kube-api-access-tt259\") pod \"redhat-operators-27gvj\" (UID: \"2fb660a7-e8d0-40d7-82b0-57a0aeb8bc42\") " pod="openshift-marketplace/redhat-operators-27gvj" Mar 10 22:35:48 crc kubenswrapper[4919]: I0310 22:35:48.082417 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2fb660a7-e8d0-40d7-82b0-57a0aeb8bc42-catalog-content\") pod \"redhat-operators-27gvj\" (UID: \"2fb660a7-e8d0-40d7-82b0-57a0aeb8bc42\") " pod="openshift-marketplace/redhat-operators-27gvj" Mar 10 22:35:48 crc kubenswrapper[4919]: I0310 22:35:48.082670 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2fb660a7-e8d0-40d7-82b0-57a0aeb8bc42-utilities\") pod \"redhat-operators-27gvj\" (UID: \"2fb660a7-e8d0-40d7-82b0-57a0aeb8bc42\") " pod="openshift-marketplace/redhat-operators-27gvj" Mar 10 22:35:48 crc kubenswrapper[4919]: I0310 22:35:48.082993 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2fb660a7-e8d0-40d7-82b0-57a0aeb8bc42-catalog-content\") pod \"redhat-operators-27gvj\" (UID: \"2fb660a7-e8d0-40d7-82b0-57a0aeb8bc42\") " pod="openshift-marketplace/redhat-operators-27gvj" Mar 10 22:35:48 crc kubenswrapper[4919]: I0310 22:35:48.106465 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tt259\" (UniqueName: \"kubernetes.io/projected/2fb660a7-e8d0-40d7-82b0-57a0aeb8bc42-kube-api-access-tt259\") pod \"redhat-operators-27gvj\" (UID: \"2fb660a7-e8d0-40d7-82b0-57a0aeb8bc42\") " 
pod="openshift-marketplace/redhat-operators-27gvj" Mar 10 22:35:48 crc kubenswrapper[4919]: I0310 22:35:48.184220 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-27gvj" Mar 10 22:35:48 crc kubenswrapper[4919]: I0310 22:35:48.439993 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-27gvj"] Mar 10 22:35:48 crc kubenswrapper[4919]: I0310 22:35:48.823573 4919 generic.go:334] "Generic (PLEG): container finished" podID="2fb660a7-e8d0-40d7-82b0-57a0aeb8bc42" containerID="bc20158b9a8f0cb7097488b7543526223a93800d8622d8d312169c08d7f252aa" exitCode=0 Mar 10 22:35:48 crc kubenswrapper[4919]: I0310 22:35:48.823709 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-27gvj" event={"ID":"2fb660a7-e8d0-40d7-82b0-57a0aeb8bc42","Type":"ContainerDied","Data":"bc20158b9a8f0cb7097488b7543526223a93800d8622d8d312169c08d7f252aa"} Mar 10 22:35:48 crc kubenswrapper[4919]: I0310 22:35:48.824086 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-27gvj" event={"ID":"2fb660a7-e8d0-40d7-82b0-57a0aeb8bc42","Type":"ContainerStarted","Data":"060e16fcd3b8be59dc54b2935d542b1a17095dc3b7509c911db095112d55dd6c"} Mar 10 22:35:49 crc kubenswrapper[4919]: I0310 22:35:49.831344 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-27gvj" event={"ID":"2fb660a7-e8d0-40d7-82b0-57a0aeb8bc42","Type":"ContainerStarted","Data":"680e84f0014a878c0fa6f725751b633a8f8df7c807393bb9b44524238b1fe50e"} Mar 10 22:35:50 crc kubenswrapper[4919]: I0310 22:35:50.842194 4919 generic.go:334] "Generic (PLEG): container finished" podID="2fb660a7-e8d0-40d7-82b0-57a0aeb8bc42" containerID="680e84f0014a878c0fa6f725751b633a8f8df7c807393bb9b44524238b1fe50e" exitCode=0 Mar 10 22:35:50 crc kubenswrapper[4919]: I0310 22:35:50.842479 4919 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-27gvj" event={"ID":"2fb660a7-e8d0-40d7-82b0-57a0aeb8bc42","Type":"ContainerDied","Data":"680e84f0014a878c0fa6f725751b633a8f8df7c807393bb9b44524238b1fe50e"} Mar 10 22:35:51 crc kubenswrapper[4919]: I0310 22:35:51.851727 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-27gvj" event={"ID":"2fb660a7-e8d0-40d7-82b0-57a0aeb8bc42","Type":"ContainerStarted","Data":"70be3d3d9a075985fb7c6f7c272d47c58ec5d23d368fc0f1ca822afa579a9aed"} Mar 10 22:35:51 crc kubenswrapper[4919]: I0310 22:35:51.879208 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-27gvj" podStartSLOduration=2.43484008 podStartE2EDuration="4.879186098s" podCreationTimestamp="2026-03-10 22:35:47 +0000 UTC" firstStartedPulling="2026-03-10 22:35:48.825183573 +0000 UTC m=+2736.067064181" lastFinishedPulling="2026-03-10 22:35:51.269529551 +0000 UTC m=+2738.511410199" observedRunningTime="2026-03-10 22:35:51.871332946 +0000 UTC m=+2739.113213584" watchObservedRunningTime="2026-03-10 22:35:51.879186098 +0000 UTC m=+2739.121066716" Mar 10 22:35:58 crc kubenswrapper[4919]: I0310 22:35:58.185033 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-27gvj" Mar 10 22:35:58 crc kubenswrapper[4919]: I0310 22:35:58.185334 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-27gvj" Mar 10 22:35:58 crc kubenswrapper[4919]: I0310 22:35:58.225754 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-27gvj" Mar 10 22:35:58 crc kubenswrapper[4919]: I0310 22:35:58.966547 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-27gvj" Mar 10 22:35:59 crc kubenswrapper[4919]: I0310 22:35:59.028249 4919 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-27gvj"] Mar 10 22:35:59 crc kubenswrapper[4919]: I0310 22:35:59.175897 4919 patch_prober.go:28] interesting pod/machine-config-daemon-z7v4t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 22:35:59 crc kubenswrapper[4919]: I0310 22:35:59.176009 4919 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 22:35:59 crc kubenswrapper[4919]: I0310 22:35:59.176063 4919 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" Mar 10 22:35:59 crc kubenswrapper[4919]: I0310 22:35:59.176798 4919 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9094d88f1820c509ccc23fbba2a6de71b36165903755515eede39eff8e281d36"} pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 10 22:35:59 crc kubenswrapper[4919]: I0310 22:35:59.176889 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" containerName="machine-config-daemon" containerID="cri-o://9094d88f1820c509ccc23fbba2a6de71b36165903755515eede39eff8e281d36" gracePeriod=600 Mar 10 22:35:59 crc kubenswrapper[4919]: I0310 22:35:59.925849 4919 generic.go:334] "Generic (PLEG): container finished" 
podID="566678d1-f416-4116-ab20-b30dceb86cdc" containerID="9094d88f1820c509ccc23fbba2a6de71b36165903755515eede39eff8e281d36" exitCode=0 Mar 10 22:35:59 crc kubenswrapper[4919]: I0310 22:35:59.925900 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" event={"ID":"566678d1-f416-4116-ab20-b30dceb86cdc","Type":"ContainerDied","Data":"9094d88f1820c509ccc23fbba2a6de71b36165903755515eede39eff8e281d36"} Mar 10 22:35:59 crc kubenswrapper[4919]: I0310 22:35:59.926334 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" event={"ID":"566678d1-f416-4116-ab20-b30dceb86cdc","Type":"ContainerStarted","Data":"cf9c15cde0a4a7044aba38580ba691c219a144d0823bd8e43395fedfbd05ea94"} Mar 10 22:35:59 crc kubenswrapper[4919]: I0310 22:35:59.926386 4919 scope.go:117] "RemoveContainer" containerID="6bf3588128e568c16cb871e80818127ef0aaa14fe7758988393e9de44935b23b" Mar 10 22:36:00 crc kubenswrapper[4919]: I0310 22:36:00.155291 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553036-s9zgb"] Mar 10 22:36:00 crc kubenswrapper[4919]: I0310 22:36:00.157525 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553036-s9zgb" Mar 10 22:36:00 crc kubenswrapper[4919]: I0310 22:36:00.169383 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553036-s9zgb"] Mar 10 22:36:00 crc kubenswrapper[4919]: I0310 22:36:00.202496 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wbvtv" Mar 10 22:36:00 crc kubenswrapper[4919]: I0310 22:36:00.202752 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 22:36:00 crc kubenswrapper[4919]: I0310 22:36:00.203464 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 22:36:00 crc kubenswrapper[4919]: I0310 22:36:00.262226 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9k9zf\" (UniqueName: \"kubernetes.io/projected/77c6040b-e446-44b1-8ef8-05c4987a9371-kube-api-access-9k9zf\") pod \"auto-csr-approver-29553036-s9zgb\" (UID: \"77c6040b-e446-44b1-8ef8-05c4987a9371\") " pod="openshift-infra/auto-csr-approver-29553036-s9zgb" Mar 10 22:36:00 crc kubenswrapper[4919]: I0310 22:36:00.363433 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9k9zf\" (UniqueName: \"kubernetes.io/projected/77c6040b-e446-44b1-8ef8-05c4987a9371-kube-api-access-9k9zf\") pod \"auto-csr-approver-29553036-s9zgb\" (UID: \"77c6040b-e446-44b1-8ef8-05c4987a9371\") " pod="openshift-infra/auto-csr-approver-29553036-s9zgb" Mar 10 22:36:00 crc kubenswrapper[4919]: I0310 22:36:00.391424 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9k9zf\" (UniqueName: \"kubernetes.io/projected/77c6040b-e446-44b1-8ef8-05c4987a9371-kube-api-access-9k9zf\") pod \"auto-csr-approver-29553036-s9zgb\" (UID: \"77c6040b-e446-44b1-8ef8-05c4987a9371\") " 
pod="openshift-infra/auto-csr-approver-29553036-s9zgb" Mar 10 22:36:00 crc kubenswrapper[4919]: I0310 22:36:00.527314 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553036-s9zgb" Mar 10 22:36:00 crc kubenswrapper[4919]: I0310 22:36:00.792838 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553036-s9zgb"] Mar 10 22:36:00 crc kubenswrapper[4919]: I0310 22:36:00.932861 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553036-s9zgb" event={"ID":"77c6040b-e446-44b1-8ef8-05c4987a9371","Type":"ContainerStarted","Data":"c944b8dc8e4eb7a7f1c27687797fd25c848c1006bb25404744cc3893664ed9ef"} Mar 10 22:36:00 crc kubenswrapper[4919]: I0310 22:36:00.935089 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-27gvj" podUID="2fb660a7-e8d0-40d7-82b0-57a0aeb8bc42" containerName="registry-server" containerID="cri-o://70be3d3d9a075985fb7c6f7c272d47c58ec5d23d368fc0f1ca822afa579a9aed" gracePeriod=2 Mar 10 22:36:02 crc kubenswrapper[4919]: I0310 22:36:02.560488 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-27gvj" Mar 10 22:36:02 crc kubenswrapper[4919]: I0310 22:36:02.595067 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tt259\" (UniqueName: \"kubernetes.io/projected/2fb660a7-e8d0-40d7-82b0-57a0aeb8bc42-kube-api-access-tt259\") pod \"2fb660a7-e8d0-40d7-82b0-57a0aeb8bc42\" (UID: \"2fb660a7-e8d0-40d7-82b0-57a0aeb8bc42\") " Mar 10 22:36:02 crc kubenswrapper[4919]: I0310 22:36:02.595134 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2fb660a7-e8d0-40d7-82b0-57a0aeb8bc42-catalog-content\") pod \"2fb660a7-e8d0-40d7-82b0-57a0aeb8bc42\" (UID: \"2fb660a7-e8d0-40d7-82b0-57a0aeb8bc42\") " Mar 10 22:36:02 crc kubenswrapper[4919]: I0310 22:36:02.595462 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2fb660a7-e8d0-40d7-82b0-57a0aeb8bc42-utilities\") pod \"2fb660a7-e8d0-40d7-82b0-57a0aeb8bc42\" (UID: \"2fb660a7-e8d0-40d7-82b0-57a0aeb8bc42\") " Mar 10 22:36:02 crc kubenswrapper[4919]: I0310 22:36:02.596702 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2fb660a7-e8d0-40d7-82b0-57a0aeb8bc42-utilities" (OuterVolumeSpecName: "utilities") pod "2fb660a7-e8d0-40d7-82b0-57a0aeb8bc42" (UID: "2fb660a7-e8d0-40d7-82b0-57a0aeb8bc42"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 22:36:02 crc kubenswrapper[4919]: I0310 22:36:02.608599 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2fb660a7-e8d0-40d7-82b0-57a0aeb8bc42-kube-api-access-tt259" (OuterVolumeSpecName: "kube-api-access-tt259") pod "2fb660a7-e8d0-40d7-82b0-57a0aeb8bc42" (UID: "2fb660a7-e8d0-40d7-82b0-57a0aeb8bc42"). InnerVolumeSpecName "kube-api-access-tt259". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 22:36:02 crc kubenswrapper[4919]: I0310 22:36:02.696776 4919 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2fb660a7-e8d0-40d7-82b0-57a0aeb8bc42-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 22:36:02 crc kubenswrapper[4919]: I0310 22:36:02.696805 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tt259\" (UniqueName: \"kubernetes.io/projected/2fb660a7-e8d0-40d7-82b0-57a0aeb8bc42-kube-api-access-tt259\") on node \"crc\" DevicePath \"\"" Mar 10 22:36:02 crc kubenswrapper[4919]: I0310 22:36:02.737790 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2fb660a7-e8d0-40d7-82b0-57a0aeb8bc42-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2fb660a7-e8d0-40d7-82b0-57a0aeb8bc42" (UID: "2fb660a7-e8d0-40d7-82b0-57a0aeb8bc42"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 22:36:02 crc kubenswrapper[4919]: I0310 22:36:02.798331 4919 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2fb660a7-e8d0-40d7-82b0-57a0aeb8bc42-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 22:36:02 crc kubenswrapper[4919]: I0310 22:36:02.949291 4919 generic.go:334] "Generic (PLEG): container finished" podID="2fb660a7-e8d0-40d7-82b0-57a0aeb8bc42" containerID="70be3d3d9a075985fb7c6f7c272d47c58ec5d23d368fc0f1ca822afa579a9aed" exitCode=0 Mar 10 22:36:02 crc kubenswrapper[4919]: I0310 22:36:02.949419 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-27gvj" Mar 10 22:36:02 crc kubenswrapper[4919]: I0310 22:36:02.949770 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-27gvj" event={"ID":"2fb660a7-e8d0-40d7-82b0-57a0aeb8bc42","Type":"ContainerDied","Data":"70be3d3d9a075985fb7c6f7c272d47c58ec5d23d368fc0f1ca822afa579a9aed"} Mar 10 22:36:02 crc kubenswrapper[4919]: I0310 22:36:02.949844 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-27gvj" event={"ID":"2fb660a7-e8d0-40d7-82b0-57a0aeb8bc42","Type":"ContainerDied","Data":"060e16fcd3b8be59dc54b2935d542b1a17095dc3b7509c911db095112d55dd6c"} Mar 10 22:36:02 crc kubenswrapper[4919]: I0310 22:36:02.949874 4919 scope.go:117] "RemoveContainer" containerID="70be3d3d9a075985fb7c6f7c272d47c58ec5d23d368fc0f1ca822afa579a9aed" Mar 10 22:36:02 crc kubenswrapper[4919]: I0310 22:36:02.951483 4919 generic.go:334] "Generic (PLEG): container finished" podID="77c6040b-e446-44b1-8ef8-05c4987a9371" containerID="4e02ec22750d4cf8cbcdc628e33726d4ef1f4c242fb06990dee53c9b5327e6fb" exitCode=0 Mar 10 22:36:02 crc kubenswrapper[4919]: I0310 22:36:02.951504 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553036-s9zgb" event={"ID":"77c6040b-e446-44b1-8ef8-05c4987a9371","Type":"ContainerDied","Data":"4e02ec22750d4cf8cbcdc628e33726d4ef1f4c242fb06990dee53c9b5327e6fb"} Mar 10 22:36:02 crc kubenswrapper[4919]: I0310 22:36:02.973489 4919 scope.go:117] "RemoveContainer" containerID="680e84f0014a878c0fa6f725751b633a8f8df7c807393bb9b44524238b1fe50e" Mar 10 22:36:03 crc kubenswrapper[4919]: I0310 22:36:03.003589 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-27gvj"] Mar 10 22:36:03 crc kubenswrapper[4919]: I0310 22:36:03.012318 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-27gvj"] Mar 10 
22:36:03 crc kubenswrapper[4919]: I0310 22:36:03.012708 4919 scope.go:117] "RemoveContainer" containerID="bc20158b9a8f0cb7097488b7543526223a93800d8622d8d312169c08d7f252aa" Mar 10 22:36:03 crc kubenswrapper[4919]: I0310 22:36:03.027093 4919 scope.go:117] "RemoveContainer" containerID="70be3d3d9a075985fb7c6f7c272d47c58ec5d23d368fc0f1ca822afa579a9aed" Mar 10 22:36:03 crc kubenswrapper[4919]: E0310 22:36:03.027564 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"70be3d3d9a075985fb7c6f7c272d47c58ec5d23d368fc0f1ca822afa579a9aed\": container with ID starting with 70be3d3d9a075985fb7c6f7c272d47c58ec5d23d368fc0f1ca822afa579a9aed not found: ID does not exist" containerID="70be3d3d9a075985fb7c6f7c272d47c58ec5d23d368fc0f1ca822afa579a9aed" Mar 10 22:36:03 crc kubenswrapper[4919]: I0310 22:36:03.027602 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70be3d3d9a075985fb7c6f7c272d47c58ec5d23d368fc0f1ca822afa579a9aed"} err="failed to get container status \"70be3d3d9a075985fb7c6f7c272d47c58ec5d23d368fc0f1ca822afa579a9aed\": rpc error: code = NotFound desc = could not find container \"70be3d3d9a075985fb7c6f7c272d47c58ec5d23d368fc0f1ca822afa579a9aed\": container with ID starting with 70be3d3d9a075985fb7c6f7c272d47c58ec5d23d368fc0f1ca822afa579a9aed not found: ID does not exist" Mar 10 22:36:03 crc kubenswrapper[4919]: I0310 22:36:03.027627 4919 scope.go:117] "RemoveContainer" containerID="680e84f0014a878c0fa6f725751b633a8f8df7c807393bb9b44524238b1fe50e" Mar 10 22:36:03 crc kubenswrapper[4919]: E0310 22:36:03.027991 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"680e84f0014a878c0fa6f725751b633a8f8df7c807393bb9b44524238b1fe50e\": container with ID starting with 680e84f0014a878c0fa6f725751b633a8f8df7c807393bb9b44524238b1fe50e not found: ID does not exist" 
containerID="680e84f0014a878c0fa6f725751b633a8f8df7c807393bb9b44524238b1fe50e" Mar 10 22:36:03 crc kubenswrapper[4919]: I0310 22:36:03.028043 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"680e84f0014a878c0fa6f725751b633a8f8df7c807393bb9b44524238b1fe50e"} err="failed to get container status \"680e84f0014a878c0fa6f725751b633a8f8df7c807393bb9b44524238b1fe50e\": rpc error: code = NotFound desc = could not find container \"680e84f0014a878c0fa6f725751b633a8f8df7c807393bb9b44524238b1fe50e\": container with ID starting with 680e84f0014a878c0fa6f725751b633a8f8df7c807393bb9b44524238b1fe50e not found: ID does not exist" Mar 10 22:36:03 crc kubenswrapper[4919]: I0310 22:36:03.028088 4919 scope.go:117] "RemoveContainer" containerID="bc20158b9a8f0cb7097488b7543526223a93800d8622d8d312169c08d7f252aa" Mar 10 22:36:03 crc kubenswrapper[4919]: E0310 22:36:03.028460 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc20158b9a8f0cb7097488b7543526223a93800d8622d8d312169c08d7f252aa\": container with ID starting with bc20158b9a8f0cb7097488b7543526223a93800d8622d8d312169c08d7f252aa not found: ID does not exist" containerID="bc20158b9a8f0cb7097488b7543526223a93800d8622d8d312169c08d7f252aa" Mar 10 22:36:03 crc kubenswrapper[4919]: I0310 22:36:03.028496 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc20158b9a8f0cb7097488b7543526223a93800d8622d8d312169c08d7f252aa"} err="failed to get container status \"bc20158b9a8f0cb7097488b7543526223a93800d8622d8d312169c08d7f252aa\": rpc error: code = NotFound desc = could not find container \"bc20158b9a8f0cb7097488b7543526223a93800d8622d8d312169c08d7f252aa\": container with ID starting with bc20158b9a8f0cb7097488b7543526223a93800d8622d8d312169c08d7f252aa not found: ID does not exist" Mar 10 22:36:03 crc kubenswrapper[4919]: I0310 22:36:03.498963 4919 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2fb660a7-e8d0-40d7-82b0-57a0aeb8bc42" path="/var/lib/kubelet/pods/2fb660a7-e8d0-40d7-82b0-57a0aeb8bc42/volumes" Mar 10 22:36:04 crc kubenswrapper[4919]: I0310 22:36:04.248960 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553036-s9zgb" Mar 10 22:36:04 crc kubenswrapper[4919]: I0310 22:36:04.418270 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9k9zf\" (UniqueName: \"kubernetes.io/projected/77c6040b-e446-44b1-8ef8-05c4987a9371-kube-api-access-9k9zf\") pod \"77c6040b-e446-44b1-8ef8-05c4987a9371\" (UID: \"77c6040b-e446-44b1-8ef8-05c4987a9371\") " Mar 10 22:36:04 crc kubenswrapper[4919]: I0310 22:36:04.423606 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77c6040b-e446-44b1-8ef8-05c4987a9371-kube-api-access-9k9zf" (OuterVolumeSpecName: "kube-api-access-9k9zf") pod "77c6040b-e446-44b1-8ef8-05c4987a9371" (UID: "77c6040b-e446-44b1-8ef8-05c4987a9371"). InnerVolumeSpecName "kube-api-access-9k9zf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 22:36:04 crc kubenswrapper[4919]: I0310 22:36:04.519869 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9k9zf\" (UniqueName: \"kubernetes.io/projected/77c6040b-e446-44b1-8ef8-05c4987a9371-kube-api-access-9k9zf\") on node \"crc\" DevicePath \"\"" Mar 10 22:36:04 crc kubenswrapper[4919]: I0310 22:36:04.971293 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553036-s9zgb" event={"ID":"77c6040b-e446-44b1-8ef8-05c4987a9371","Type":"ContainerDied","Data":"c944b8dc8e4eb7a7f1c27687797fd25c848c1006bb25404744cc3893664ed9ef"} Mar 10 22:36:04 crc kubenswrapper[4919]: I0310 22:36:04.971699 4919 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c944b8dc8e4eb7a7f1c27687797fd25c848c1006bb25404744cc3893664ed9ef" Mar 10 22:36:04 crc kubenswrapper[4919]: I0310 22:36:04.971354 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553036-s9zgb" Mar 10 22:36:05 crc kubenswrapper[4919]: I0310 22:36:05.313981 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553030-hrnl5"] Mar 10 22:36:05 crc kubenswrapper[4919]: I0310 22:36:05.322423 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553030-hrnl5"] Mar 10 22:36:05 crc kubenswrapper[4919]: I0310 22:36:05.492729 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12b7bc36-4ac2-4055-b132-76116239f777" path="/var/lib/kubelet/pods/12b7bc36-4ac2-4055-b132-76116239f777/volumes" Mar 10 22:36:44 crc kubenswrapper[4919]: I0310 22:36:44.221998 4919 scope.go:117] "RemoveContainer" containerID="deeb0bd5600e73d2a8e2ad09cbdd5543d4fe9ce0f7a7b45ff032d282e72b755b" Mar 10 22:37:59 crc kubenswrapper[4919]: I0310 22:37:59.175666 4919 patch_prober.go:28] interesting pod/machine-config-daemon-z7v4t 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 22:37:59 crc kubenswrapper[4919]: I0310 22:37:59.176453 4919 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 22:38:00 crc kubenswrapper[4919]: I0310 22:38:00.173760 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553038-vlnw9"] Mar 10 22:38:00 crc kubenswrapper[4919]: E0310 22:38:00.175100 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2fb660a7-e8d0-40d7-82b0-57a0aeb8bc42" containerName="extract-utilities" Mar 10 22:38:00 crc kubenswrapper[4919]: I0310 22:38:00.175317 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fb660a7-e8d0-40d7-82b0-57a0aeb8bc42" containerName="extract-utilities" Mar 10 22:38:00 crc kubenswrapper[4919]: E0310 22:38:00.175581 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77c6040b-e446-44b1-8ef8-05c4987a9371" containerName="oc" Mar 10 22:38:00 crc kubenswrapper[4919]: I0310 22:38:00.175766 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="77c6040b-e446-44b1-8ef8-05c4987a9371" containerName="oc" Mar 10 22:38:00 crc kubenswrapper[4919]: E0310 22:38:00.175965 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2fb660a7-e8d0-40d7-82b0-57a0aeb8bc42" containerName="registry-server" Mar 10 22:38:00 crc kubenswrapper[4919]: I0310 22:38:00.176187 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fb660a7-e8d0-40d7-82b0-57a0aeb8bc42" containerName="registry-server" Mar 10 22:38:00 crc kubenswrapper[4919]: 
E0310 22:38:00.176434 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2fb660a7-e8d0-40d7-82b0-57a0aeb8bc42" containerName="extract-content" Mar 10 22:38:00 crc kubenswrapper[4919]: I0310 22:38:00.177703 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fb660a7-e8d0-40d7-82b0-57a0aeb8bc42" containerName="extract-content" Mar 10 22:38:00 crc kubenswrapper[4919]: I0310 22:38:00.178202 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="77c6040b-e446-44b1-8ef8-05c4987a9371" containerName="oc" Mar 10 22:38:00 crc kubenswrapper[4919]: I0310 22:38:00.178464 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="2fb660a7-e8d0-40d7-82b0-57a0aeb8bc42" containerName="registry-server" Mar 10 22:38:00 crc kubenswrapper[4919]: I0310 22:38:00.179629 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553038-vlnw9" Mar 10 22:38:00 crc kubenswrapper[4919]: I0310 22:38:00.182412 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 22:38:00 crc kubenswrapper[4919]: I0310 22:38:00.182380 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wbvtv" Mar 10 22:38:00 crc kubenswrapper[4919]: I0310 22:38:00.182699 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 22:38:00 crc kubenswrapper[4919]: I0310 22:38:00.183985 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553038-vlnw9"] Mar 10 22:38:00 crc kubenswrapper[4919]: I0310 22:38:00.288142 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7zddg\" (UniqueName: \"kubernetes.io/projected/329ba3ec-70e3-4c27-b210-43be1d6594e4-kube-api-access-7zddg\") pod \"auto-csr-approver-29553038-vlnw9\" (UID: 
\"329ba3ec-70e3-4c27-b210-43be1d6594e4\") " pod="openshift-infra/auto-csr-approver-29553038-vlnw9" Mar 10 22:38:00 crc kubenswrapper[4919]: I0310 22:38:00.389919 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7zddg\" (UniqueName: \"kubernetes.io/projected/329ba3ec-70e3-4c27-b210-43be1d6594e4-kube-api-access-7zddg\") pod \"auto-csr-approver-29553038-vlnw9\" (UID: \"329ba3ec-70e3-4c27-b210-43be1d6594e4\") " pod="openshift-infra/auto-csr-approver-29553038-vlnw9" Mar 10 22:38:00 crc kubenswrapper[4919]: I0310 22:38:00.415564 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7zddg\" (UniqueName: \"kubernetes.io/projected/329ba3ec-70e3-4c27-b210-43be1d6594e4-kube-api-access-7zddg\") pod \"auto-csr-approver-29553038-vlnw9\" (UID: \"329ba3ec-70e3-4c27-b210-43be1d6594e4\") " pod="openshift-infra/auto-csr-approver-29553038-vlnw9" Mar 10 22:38:00 crc kubenswrapper[4919]: I0310 22:38:00.507687 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553038-vlnw9" Mar 10 22:38:00 crc kubenswrapper[4919]: I0310 22:38:00.972816 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553038-vlnw9"] Mar 10 22:38:00 crc kubenswrapper[4919]: W0310 22:38:00.976942 4919 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod329ba3ec_70e3_4c27_b210_43be1d6594e4.slice/crio-d39c320cfd281b93fc9ae22c7c2d344319b50c1ace79fabddb522fa963ad5c0d WatchSource:0}: Error finding container d39c320cfd281b93fc9ae22c7c2d344319b50c1ace79fabddb522fa963ad5c0d: Status 404 returned error can't find the container with id d39c320cfd281b93fc9ae22c7c2d344319b50c1ace79fabddb522fa963ad5c0d Mar 10 22:38:01 crc kubenswrapper[4919]: I0310 22:38:01.955854 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553038-vlnw9" event={"ID":"329ba3ec-70e3-4c27-b210-43be1d6594e4","Type":"ContainerStarted","Data":"d39c320cfd281b93fc9ae22c7c2d344319b50c1ace79fabddb522fa963ad5c0d"} Mar 10 22:38:02 crc kubenswrapper[4919]: I0310 22:38:02.965032 4919 generic.go:334] "Generic (PLEG): container finished" podID="329ba3ec-70e3-4c27-b210-43be1d6594e4" containerID="a867dc8d5248f236679e1bdcb4654a3e59458a158c24367f6b4ed55a071345d9" exitCode=0 Mar 10 22:38:02 crc kubenswrapper[4919]: I0310 22:38:02.965088 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553038-vlnw9" event={"ID":"329ba3ec-70e3-4c27-b210-43be1d6594e4","Type":"ContainerDied","Data":"a867dc8d5248f236679e1bdcb4654a3e59458a158c24367f6b4ed55a071345d9"} Mar 10 22:38:04 crc kubenswrapper[4919]: I0310 22:38:04.230302 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553038-vlnw9" Mar 10 22:38:04 crc kubenswrapper[4919]: I0310 22:38:04.344755 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7zddg\" (UniqueName: \"kubernetes.io/projected/329ba3ec-70e3-4c27-b210-43be1d6594e4-kube-api-access-7zddg\") pod \"329ba3ec-70e3-4c27-b210-43be1d6594e4\" (UID: \"329ba3ec-70e3-4c27-b210-43be1d6594e4\") " Mar 10 22:38:04 crc kubenswrapper[4919]: I0310 22:38:04.350047 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/329ba3ec-70e3-4c27-b210-43be1d6594e4-kube-api-access-7zddg" (OuterVolumeSpecName: "kube-api-access-7zddg") pod "329ba3ec-70e3-4c27-b210-43be1d6594e4" (UID: "329ba3ec-70e3-4c27-b210-43be1d6594e4"). InnerVolumeSpecName "kube-api-access-7zddg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 22:38:04 crc kubenswrapper[4919]: I0310 22:38:04.446114 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7zddg\" (UniqueName: \"kubernetes.io/projected/329ba3ec-70e3-4c27-b210-43be1d6594e4-kube-api-access-7zddg\") on node \"crc\" DevicePath \"\"" Mar 10 22:38:04 crc kubenswrapper[4919]: I0310 22:38:04.979425 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553038-vlnw9" event={"ID":"329ba3ec-70e3-4c27-b210-43be1d6594e4","Type":"ContainerDied","Data":"d39c320cfd281b93fc9ae22c7c2d344319b50c1ace79fabddb522fa963ad5c0d"} Mar 10 22:38:04 crc kubenswrapper[4919]: I0310 22:38:04.979456 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553038-vlnw9" Mar 10 22:38:04 crc kubenswrapper[4919]: I0310 22:38:04.979475 4919 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d39c320cfd281b93fc9ae22c7c2d344319b50c1ace79fabddb522fa963ad5c0d" Mar 10 22:38:05 crc kubenswrapper[4919]: I0310 22:38:05.310020 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553032-7mzkx"] Mar 10 22:38:05 crc kubenswrapper[4919]: I0310 22:38:05.315870 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553032-7mzkx"] Mar 10 22:38:05 crc kubenswrapper[4919]: I0310 22:38:05.491564 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c17effc3-e48e-44e1-8748-5984dec80b50" path="/var/lib/kubelet/pods/c17effc3-e48e-44e1-8748-5984dec80b50/volumes" Mar 10 22:38:29 crc kubenswrapper[4919]: I0310 22:38:29.175640 4919 patch_prober.go:28] interesting pod/machine-config-daemon-z7v4t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 22:38:29 crc kubenswrapper[4919]: I0310 22:38:29.176557 4919 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 22:38:44 crc kubenswrapper[4919]: I0310 22:38:44.355420 4919 scope.go:117] "RemoveContainer" containerID="22e7709b9ca95d99df8dc1a6ba4aa234214cb52be1045044ed044568d0b38cfd" Mar 10 22:38:59 crc kubenswrapper[4919]: I0310 22:38:59.175607 4919 patch_prober.go:28] interesting pod/machine-config-daemon-z7v4t container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 22:38:59 crc kubenswrapper[4919]: I0310 22:38:59.176200 4919 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 22:38:59 crc kubenswrapper[4919]: I0310 22:38:59.176266 4919 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" Mar 10 22:38:59 crc kubenswrapper[4919]: I0310 22:38:59.177187 4919 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"cf9c15cde0a4a7044aba38580ba691c219a144d0823bd8e43395fedfbd05ea94"} pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 10 22:38:59 crc kubenswrapper[4919]: I0310 22:38:59.177287 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" containerName="machine-config-daemon" containerID="cri-o://cf9c15cde0a4a7044aba38580ba691c219a144d0823bd8e43395fedfbd05ea94" gracePeriod=600 Mar 10 22:38:59 crc kubenswrapper[4919]: E0310 22:38:59.304704 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" Mar 10 22:38:59 crc kubenswrapper[4919]: I0310 22:38:59.420123 4919 generic.go:334] "Generic (PLEG): container finished" podID="566678d1-f416-4116-ab20-b30dceb86cdc" containerID="cf9c15cde0a4a7044aba38580ba691c219a144d0823bd8e43395fedfbd05ea94" exitCode=0 Mar 10 22:38:59 crc kubenswrapper[4919]: I0310 22:38:59.420195 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" event={"ID":"566678d1-f416-4116-ab20-b30dceb86cdc","Type":"ContainerDied","Data":"cf9c15cde0a4a7044aba38580ba691c219a144d0823bd8e43395fedfbd05ea94"} Mar 10 22:38:59 crc kubenswrapper[4919]: I0310 22:38:59.420308 4919 scope.go:117] "RemoveContainer" containerID="9094d88f1820c509ccc23fbba2a6de71b36165903755515eede39eff8e281d36" Mar 10 22:38:59 crc kubenswrapper[4919]: I0310 22:38:59.420862 4919 scope.go:117] "RemoveContainer" containerID="cf9c15cde0a4a7044aba38580ba691c219a144d0823bd8e43395fedfbd05ea94" Mar 10 22:38:59 crc kubenswrapper[4919]: E0310 22:38:59.421090 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" Mar 10 22:39:08 crc kubenswrapper[4919]: I0310 22:39:08.686707 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-mw4c5"] Mar 10 22:39:08 crc kubenswrapper[4919]: E0310 22:39:08.687971 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="329ba3ec-70e3-4c27-b210-43be1d6594e4" containerName="oc" Mar 10 22:39:08 crc kubenswrapper[4919]: I0310 22:39:08.687999 4919 
state_mem.go:107] "Deleted CPUSet assignment" podUID="329ba3ec-70e3-4c27-b210-43be1d6594e4" containerName="oc" Mar 10 22:39:08 crc kubenswrapper[4919]: I0310 22:39:08.688347 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="329ba3ec-70e3-4c27-b210-43be1d6594e4" containerName="oc" Mar 10 22:39:08 crc kubenswrapper[4919]: I0310 22:39:08.694338 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mw4c5" Mar 10 22:39:08 crc kubenswrapper[4919]: I0310 22:39:08.700450 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mw4c5"] Mar 10 22:39:08 crc kubenswrapper[4919]: I0310 22:39:08.845001 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/78c8a45f-ef42-4aa1-926c-28b52fdf34c7-utilities\") pod \"certified-operators-mw4c5\" (UID: \"78c8a45f-ef42-4aa1-926c-28b52fdf34c7\") " pod="openshift-marketplace/certified-operators-mw4c5" Mar 10 22:39:08 crc kubenswrapper[4919]: I0310 22:39:08.845079 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t92jg\" (UniqueName: \"kubernetes.io/projected/78c8a45f-ef42-4aa1-926c-28b52fdf34c7-kube-api-access-t92jg\") pod \"certified-operators-mw4c5\" (UID: \"78c8a45f-ef42-4aa1-926c-28b52fdf34c7\") " pod="openshift-marketplace/certified-operators-mw4c5" Mar 10 22:39:08 crc kubenswrapper[4919]: I0310 22:39:08.845117 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/78c8a45f-ef42-4aa1-926c-28b52fdf34c7-catalog-content\") pod \"certified-operators-mw4c5\" (UID: \"78c8a45f-ef42-4aa1-926c-28b52fdf34c7\") " pod="openshift-marketplace/certified-operators-mw4c5" Mar 10 22:39:08 crc kubenswrapper[4919]: I0310 22:39:08.946405 4919 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/78c8a45f-ef42-4aa1-926c-28b52fdf34c7-catalog-content\") pod \"certified-operators-mw4c5\" (UID: \"78c8a45f-ef42-4aa1-926c-28b52fdf34c7\") " pod="openshift-marketplace/certified-operators-mw4c5" Mar 10 22:39:08 crc kubenswrapper[4919]: I0310 22:39:08.946560 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/78c8a45f-ef42-4aa1-926c-28b52fdf34c7-utilities\") pod \"certified-operators-mw4c5\" (UID: \"78c8a45f-ef42-4aa1-926c-28b52fdf34c7\") " pod="openshift-marketplace/certified-operators-mw4c5" Mar 10 22:39:08 crc kubenswrapper[4919]: I0310 22:39:08.946603 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t92jg\" (UniqueName: \"kubernetes.io/projected/78c8a45f-ef42-4aa1-926c-28b52fdf34c7-kube-api-access-t92jg\") pod \"certified-operators-mw4c5\" (UID: \"78c8a45f-ef42-4aa1-926c-28b52fdf34c7\") " pod="openshift-marketplace/certified-operators-mw4c5" Mar 10 22:39:08 crc kubenswrapper[4919]: I0310 22:39:08.946944 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/78c8a45f-ef42-4aa1-926c-28b52fdf34c7-catalog-content\") pod \"certified-operators-mw4c5\" (UID: \"78c8a45f-ef42-4aa1-926c-28b52fdf34c7\") " pod="openshift-marketplace/certified-operators-mw4c5" Mar 10 22:39:08 crc kubenswrapper[4919]: I0310 22:39:08.947535 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/78c8a45f-ef42-4aa1-926c-28b52fdf34c7-utilities\") pod \"certified-operators-mw4c5\" (UID: \"78c8a45f-ef42-4aa1-926c-28b52fdf34c7\") " pod="openshift-marketplace/certified-operators-mw4c5" Mar 10 22:39:08 crc kubenswrapper[4919]: I0310 22:39:08.971692 4919 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-t92jg\" (UniqueName: \"kubernetes.io/projected/78c8a45f-ef42-4aa1-926c-28b52fdf34c7-kube-api-access-t92jg\") pod \"certified-operators-mw4c5\" (UID: \"78c8a45f-ef42-4aa1-926c-28b52fdf34c7\") " pod="openshift-marketplace/certified-operators-mw4c5" Mar 10 22:39:09 crc kubenswrapper[4919]: I0310 22:39:09.029569 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mw4c5" Mar 10 22:39:09 crc kubenswrapper[4919]: I0310 22:39:09.284252 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mw4c5"] Mar 10 22:39:09 crc kubenswrapper[4919]: I0310 22:39:09.512983 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mw4c5" event={"ID":"78c8a45f-ef42-4aa1-926c-28b52fdf34c7","Type":"ContainerStarted","Data":"46404a3f13bb1447a6fb8e95d85766dd84e38401aee086fe91d22f21374226ef"} Mar 10 22:39:09 crc kubenswrapper[4919]: I0310 22:39:09.513380 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mw4c5" event={"ID":"78c8a45f-ef42-4aa1-926c-28b52fdf34c7","Type":"ContainerStarted","Data":"8574c4cdc8071e47e78a9a59c74e65de2590d9f1de199fc20d862ab8d1638efb"} Mar 10 22:39:10 crc kubenswrapper[4919]: I0310 22:39:10.522738 4919 generic.go:334] "Generic (PLEG): container finished" podID="78c8a45f-ef42-4aa1-926c-28b52fdf34c7" containerID="46404a3f13bb1447a6fb8e95d85766dd84e38401aee086fe91d22f21374226ef" exitCode=0 Mar 10 22:39:10 crc kubenswrapper[4919]: I0310 22:39:10.522793 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mw4c5" event={"ID":"78c8a45f-ef42-4aa1-926c-28b52fdf34c7","Type":"ContainerDied","Data":"46404a3f13bb1447a6fb8e95d85766dd84e38401aee086fe91d22f21374226ef"} Mar 10 22:39:10 crc kubenswrapper[4919]: I0310 22:39:10.524249 4919 provider.go:102] Refreshing cache 
for provider: *credentialprovider.defaultDockerConfigProvider Mar 10 22:39:11 crc kubenswrapper[4919]: I0310 22:39:11.532973 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mw4c5" event={"ID":"78c8a45f-ef42-4aa1-926c-28b52fdf34c7","Type":"ContainerStarted","Data":"8ad9813ce2fe76959f29741b8c37e4c8377f56424e8c9bd6bd6da8d6f3f25dd8"} Mar 10 22:39:12 crc kubenswrapper[4919]: I0310 22:39:12.479846 4919 scope.go:117] "RemoveContainer" containerID="cf9c15cde0a4a7044aba38580ba691c219a144d0823bd8e43395fedfbd05ea94" Mar 10 22:39:12 crc kubenswrapper[4919]: E0310 22:39:12.480166 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" Mar 10 22:39:12 crc kubenswrapper[4919]: I0310 22:39:12.542328 4919 generic.go:334] "Generic (PLEG): container finished" podID="78c8a45f-ef42-4aa1-926c-28b52fdf34c7" containerID="8ad9813ce2fe76959f29741b8c37e4c8377f56424e8c9bd6bd6da8d6f3f25dd8" exitCode=0 Mar 10 22:39:12 crc kubenswrapper[4919]: I0310 22:39:12.542431 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mw4c5" event={"ID":"78c8a45f-ef42-4aa1-926c-28b52fdf34c7","Type":"ContainerDied","Data":"8ad9813ce2fe76959f29741b8c37e4c8377f56424e8c9bd6bd6da8d6f3f25dd8"} Mar 10 22:39:13 crc kubenswrapper[4919]: I0310 22:39:13.554490 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mw4c5" event={"ID":"78c8a45f-ef42-4aa1-926c-28b52fdf34c7","Type":"ContainerStarted","Data":"fc17af62d1ce09663344d576cd1439a4a374ceab9d4795010a925e948702de32"} Mar 10 22:39:13 crc kubenswrapper[4919]: 
I0310 22:39:13.577235 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-mw4c5" podStartSLOduration=3.111416632 podStartE2EDuration="5.577194862s" podCreationTimestamp="2026-03-10 22:39:08 +0000 UTC" firstStartedPulling="2026-03-10 22:39:10.524051279 +0000 UTC m=+2937.765931887" lastFinishedPulling="2026-03-10 22:39:12.989829489 +0000 UTC m=+2940.231710117" observedRunningTime="2026-03-10 22:39:13.573901261 +0000 UTC m=+2940.815781879" watchObservedRunningTime="2026-03-10 22:39:13.577194862 +0000 UTC m=+2940.819075480" Mar 10 22:39:19 crc kubenswrapper[4919]: I0310 22:39:19.029720 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-mw4c5" Mar 10 22:39:19 crc kubenswrapper[4919]: I0310 22:39:19.030006 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-mw4c5" Mar 10 22:39:19 crc kubenswrapper[4919]: I0310 22:39:19.082897 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-mw4c5" Mar 10 22:39:19 crc kubenswrapper[4919]: I0310 22:39:19.677600 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-mw4c5" Mar 10 22:39:19 crc kubenswrapper[4919]: I0310 22:39:19.736871 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mw4c5"] Mar 10 22:39:21 crc kubenswrapper[4919]: I0310 22:39:21.623121 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-mw4c5" podUID="78c8a45f-ef42-4aa1-926c-28b52fdf34c7" containerName="registry-server" containerID="cri-o://fc17af62d1ce09663344d576cd1439a4a374ceab9d4795010a925e948702de32" gracePeriod=2 Mar 10 22:39:22 crc kubenswrapper[4919]: I0310 22:39:22.087340 4919 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mw4c5" Mar 10 22:39:22 crc kubenswrapper[4919]: I0310 22:39:22.255734 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/78c8a45f-ef42-4aa1-926c-28b52fdf34c7-catalog-content\") pod \"78c8a45f-ef42-4aa1-926c-28b52fdf34c7\" (UID: \"78c8a45f-ef42-4aa1-926c-28b52fdf34c7\") " Mar 10 22:39:22 crc kubenswrapper[4919]: I0310 22:39:22.255887 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t92jg\" (UniqueName: \"kubernetes.io/projected/78c8a45f-ef42-4aa1-926c-28b52fdf34c7-kube-api-access-t92jg\") pod \"78c8a45f-ef42-4aa1-926c-28b52fdf34c7\" (UID: \"78c8a45f-ef42-4aa1-926c-28b52fdf34c7\") " Mar 10 22:39:22 crc kubenswrapper[4919]: I0310 22:39:22.255915 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/78c8a45f-ef42-4aa1-926c-28b52fdf34c7-utilities\") pod \"78c8a45f-ef42-4aa1-926c-28b52fdf34c7\" (UID: \"78c8a45f-ef42-4aa1-926c-28b52fdf34c7\") " Mar 10 22:39:22 crc kubenswrapper[4919]: I0310 22:39:22.257080 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/78c8a45f-ef42-4aa1-926c-28b52fdf34c7-utilities" (OuterVolumeSpecName: "utilities") pod "78c8a45f-ef42-4aa1-926c-28b52fdf34c7" (UID: "78c8a45f-ef42-4aa1-926c-28b52fdf34c7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 22:39:22 crc kubenswrapper[4919]: I0310 22:39:22.261846 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78c8a45f-ef42-4aa1-926c-28b52fdf34c7-kube-api-access-t92jg" (OuterVolumeSpecName: "kube-api-access-t92jg") pod "78c8a45f-ef42-4aa1-926c-28b52fdf34c7" (UID: "78c8a45f-ef42-4aa1-926c-28b52fdf34c7"). 
InnerVolumeSpecName "kube-api-access-t92jg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 22:39:22 crc kubenswrapper[4919]: I0310 22:39:22.357882 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t92jg\" (UniqueName: \"kubernetes.io/projected/78c8a45f-ef42-4aa1-926c-28b52fdf34c7-kube-api-access-t92jg\") on node \"crc\" DevicePath \"\""
Mar 10 22:39:22 crc kubenswrapper[4919]: I0310 22:39:22.357911 4919 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/78c8a45f-ef42-4aa1-926c-28b52fdf34c7-utilities\") on node \"crc\" DevicePath \"\""
Mar 10 22:39:22 crc kubenswrapper[4919]: I0310 22:39:22.636866 4919 generic.go:334] "Generic (PLEG): container finished" podID="78c8a45f-ef42-4aa1-926c-28b52fdf34c7" containerID="fc17af62d1ce09663344d576cd1439a4a374ceab9d4795010a925e948702de32" exitCode=0
Mar 10 22:39:22 crc kubenswrapper[4919]: I0310 22:39:22.636917 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mw4c5" event={"ID":"78c8a45f-ef42-4aa1-926c-28b52fdf34c7","Type":"ContainerDied","Data":"fc17af62d1ce09663344d576cd1439a4a374ceab9d4795010a925e948702de32"}
Mar 10 22:39:22 crc kubenswrapper[4919]: I0310 22:39:22.636973 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mw4c5" event={"ID":"78c8a45f-ef42-4aa1-926c-28b52fdf34c7","Type":"ContainerDied","Data":"8574c4cdc8071e47e78a9a59c74e65de2590d9f1de199fc20d862ab8d1638efb"}
Mar 10 22:39:22 crc kubenswrapper[4919]: I0310 22:39:22.636980 4919 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/certified-operators-mw4c5"
Mar 10 22:39:22 crc kubenswrapper[4919]: I0310 22:39:22.636994 4919 scope.go:117] "RemoveContainer" containerID="fc17af62d1ce09663344d576cd1439a4a374ceab9d4795010a925e948702de32"
Mar 10 22:39:22 crc kubenswrapper[4919]: I0310 22:39:22.656842 4919 scope.go:117] "RemoveContainer" containerID="8ad9813ce2fe76959f29741b8c37e4c8377f56424e8c9bd6bd6da8d6f3f25dd8"
Mar 10 22:39:22 crc kubenswrapper[4919]: I0310 22:39:22.671483 4919 scope.go:117] "RemoveContainer" containerID="46404a3f13bb1447a6fb8e95d85766dd84e38401aee086fe91d22f21374226ef"
Mar 10 22:39:22 crc kubenswrapper[4919]: I0310 22:39:22.693672 4919 scope.go:117] "RemoveContainer" containerID="fc17af62d1ce09663344d576cd1439a4a374ceab9d4795010a925e948702de32"
Mar 10 22:39:22 crc kubenswrapper[4919]: E0310 22:39:22.694175 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc17af62d1ce09663344d576cd1439a4a374ceab9d4795010a925e948702de32\": container with ID starting with fc17af62d1ce09663344d576cd1439a4a374ceab9d4795010a925e948702de32 not found: ID does not exist" containerID="fc17af62d1ce09663344d576cd1439a4a374ceab9d4795010a925e948702de32"
Mar 10 22:39:22 crc kubenswrapper[4919]: I0310 22:39:22.694210 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc17af62d1ce09663344d576cd1439a4a374ceab9d4795010a925e948702de32"} err="failed to get container status \"fc17af62d1ce09663344d576cd1439a4a374ceab9d4795010a925e948702de32\": rpc error: code = NotFound desc = could not find container \"fc17af62d1ce09663344d576cd1439a4a374ceab9d4795010a925e948702de32\": container with ID starting with fc17af62d1ce09663344d576cd1439a4a374ceab9d4795010a925e948702de32 not found: ID does not exist"
Mar 10 22:39:22 crc kubenswrapper[4919]: I0310 22:39:22.694228 4919 scope.go:117] "RemoveContainer"
containerID="8ad9813ce2fe76959f29741b8c37e4c8377f56424e8c9bd6bd6da8d6f3f25dd8"
Mar 10 22:39:22 crc kubenswrapper[4919]: E0310 22:39:22.694911 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ad9813ce2fe76959f29741b8c37e4c8377f56424e8c9bd6bd6da8d6f3f25dd8\": container with ID starting with 8ad9813ce2fe76959f29741b8c37e4c8377f56424e8c9bd6bd6da8d6f3f25dd8 not found: ID does not exist" containerID="8ad9813ce2fe76959f29741b8c37e4c8377f56424e8c9bd6bd6da8d6f3f25dd8"
Mar 10 22:39:22 crc kubenswrapper[4919]: I0310 22:39:22.694963 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ad9813ce2fe76959f29741b8c37e4c8377f56424e8c9bd6bd6da8d6f3f25dd8"} err="failed to get container status \"8ad9813ce2fe76959f29741b8c37e4c8377f56424e8c9bd6bd6da8d6f3f25dd8\": rpc error: code = NotFound desc = could not find container \"8ad9813ce2fe76959f29741b8c37e4c8377f56424e8c9bd6bd6da8d6f3f25dd8\": container with ID starting with 8ad9813ce2fe76959f29741b8c37e4c8377f56424e8c9bd6bd6da8d6f3f25dd8 not found: ID does not exist"
Mar 10 22:39:22 crc kubenswrapper[4919]: I0310 22:39:22.694981 4919 scope.go:117] "RemoveContainer" containerID="46404a3f13bb1447a6fb8e95d85766dd84e38401aee086fe91d22f21374226ef"
Mar 10 22:39:22 crc kubenswrapper[4919]: E0310 22:39:22.695878 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"46404a3f13bb1447a6fb8e95d85766dd84e38401aee086fe91d22f21374226ef\": container with ID starting with 46404a3f13bb1447a6fb8e95d85766dd84e38401aee086fe91d22f21374226ef not found: ID does not exist" containerID="46404a3f13bb1447a6fb8e95d85766dd84e38401aee086fe91d22f21374226ef"
Mar 10 22:39:22 crc kubenswrapper[4919]: I0310 22:39:22.695929 4919 pod_container_deletor.go:53] "DeleteContainer returned error"
containerID={"Type":"cri-o","ID":"46404a3f13bb1447a6fb8e95d85766dd84e38401aee086fe91d22f21374226ef"} err="failed to get container status \"46404a3f13bb1447a6fb8e95d85766dd84e38401aee086fe91d22f21374226ef\": rpc error: code = NotFound desc = could not find container \"46404a3f13bb1447a6fb8e95d85766dd84e38401aee086fe91d22f21374226ef\": container with ID starting with 46404a3f13bb1447a6fb8e95d85766dd84e38401aee086fe91d22f21374226ef not found: ID does not exist"
Mar 10 22:39:22 crc kubenswrapper[4919]: I0310 22:39:22.766816 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/78c8a45f-ef42-4aa1-926c-28b52fdf34c7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "78c8a45f-ef42-4aa1-926c-28b52fdf34c7" (UID: "78c8a45f-ef42-4aa1-926c-28b52fdf34c7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 22:39:22 crc kubenswrapper[4919]: I0310 22:39:22.864114 4919 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/78c8a45f-ef42-4aa1-926c-28b52fdf34c7-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 10 22:39:22 crc kubenswrapper[4919]: I0310 22:39:22.994323 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mw4c5"]
Mar 10 22:39:23 crc kubenswrapper[4919]: I0310 22:39:23.010860 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-mw4c5"]
Mar 10 22:39:23 crc kubenswrapper[4919]: I0310 22:39:23.490027 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="78c8a45f-ef42-4aa1-926c-28b52fdf34c7" path="/var/lib/kubelet/pods/78c8a45f-ef42-4aa1-926c-28b52fdf34c7/volumes"
Mar 10 22:39:26 crc kubenswrapper[4919]: I0310 22:39:26.480319 4919 scope.go:117] "RemoveContainer" containerID="cf9c15cde0a4a7044aba38580ba691c219a144d0823bd8e43395fedfbd05ea94"
Mar 10 22:39:26 crc kubenswrapper[4919]: E0310
22:39:26.480787 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc"
Mar 10 22:39:27 crc kubenswrapper[4919]: I0310 22:39:27.206144 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-nbn2x"]
Mar 10 22:39:27 crc kubenswrapper[4919]: E0310 22:39:27.207235 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78c8a45f-ef42-4aa1-926c-28b52fdf34c7" containerName="extract-utilities"
Mar 10 22:39:27 crc kubenswrapper[4919]: I0310 22:39:27.207376 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="78c8a45f-ef42-4aa1-926c-28b52fdf34c7" containerName="extract-utilities"
Mar 10 22:39:27 crc kubenswrapper[4919]: E0310 22:39:27.207546 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78c8a45f-ef42-4aa1-926c-28b52fdf34c7" containerName="registry-server"
Mar 10 22:39:27 crc kubenswrapper[4919]: I0310 22:39:27.207645 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="78c8a45f-ef42-4aa1-926c-28b52fdf34c7" containerName="registry-server"
Mar 10 22:39:27 crc kubenswrapper[4919]: E0310 22:39:27.207775 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78c8a45f-ef42-4aa1-926c-28b52fdf34c7" containerName="extract-content"
Mar 10 22:39:27 crc kubenswrapper[4919]: I0310 22:39:27.207897 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="78c8a45f-ef42-4aa1-926c-28b52fdf34c7" containerName="extract-content"
Mar 10 22:39:27 crc kubenswrapper[4919]: I0310 22:39:27.208245 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="78c8a45f-ef42-4aa1-926c-28b52fdf34c7"
containerName="registry-server"
Mar 10 22:39:27 crc kubenswrapper[4919]: I0310 22:39:27.210059 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nbn2x"
Mar 10 22:39:27 crc kubenswrapper[4919]: I0310 22:39:27.217868 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nbn2x"]
Mar 10 22:39:27 crc kubenswrapper[4919]: I0310 22:39:27.273486 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01b99c13-5819-4333-8fa8-04b30c1551ad-utilities\") pod \"community-operators-nbn2x\" (UID: \"01b99c13-5819-4333-8fa8-04b30c1551ad\") " pod="openshift-marketplace/community-operators-nbn2x"
Mar 10 22:39:27 crc kubenswrapper[4919]: I0310 22:39:27.273540 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01b99c13-5819-4333-8fa8-04b30c1551ad-catalog-content\") pod \"community-operators-nbn2x\" (UID: \"01b99c13-5819-4333-8fa8-04b30c1551ad\") " pod="openshift-marketplace/community-operators-nbn2x"
Mar 10 22:39:27 crc kubenswrapper[4919]: I0310 22:39:27.273619 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxlrr\" (UniqueName: \"kubernetes.io/projected/01b99c13-5819-4333-8fa8-04b30c1551ad-kube-api-access-xxlrr\") pod \"community-operators-nbn2x\" (UID: \"01b99c13-5819-4333-8fa8-04b30c1551ad\") " pod="openshift-marketplace/community-operators-nbn2x"
Mar 10 22:39:27 crc kubenswrapper[4919]: I0310 22:39:27.375288 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01b99c13-5819-4333-8fa8-04b30c1551ad-utilities\") pod \"community-operators-nbn2x\" (UID: \"01b99c13-5819-4333-8fa8-04b30c1551ad\") "
pod="openshift-marketplace/community-operators-nbn2x"
Mar 10 22:39:27 crc kubenswrapper[4919]: I0310 22:39:27.375336 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01b99c13-5819-4333-8fa8-04b30c1551ad-catalog-content\") pod \"community-operators-nbn2x\" (UID: \"01b99c13-5819-4333-8fa8-04b30c1551ad\") " pod="openshift-marketplace/community-operators-nbn2x"
Mar 10 22:39:27 crc kubenswrapper[4919]: I0310 22:39:27.375433 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xxlrr\" (UniqueName: \"kubernetes.io/projected/01b99c13-5819-4333-8fa8-04b30c1551ad-kube-api-access-xxlrr\") pod \"community-operators-nbn2x\" (UID: \"01b99c13-5819-4333-8fa8-04b30c1551ad\") " pod="openshift-marketplace/community-operators-nbn2x"
Mar 10 22:39:27 crc kubenswrapper[4919]: I0310 22:39:27.375869 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01b99c13-5819-4333-8fa8-04b30c1551ad-utilities\") pod \"community-operators-nbn2x\" (UID: \"01b99c13-5819-4333-8fa8-04b30c1551ad\") " pod="openshift-marketplace/community-operators-nbn2x"
Mar 10 22:39:27 crc kubenswrapper[4919]: I0310 22:39:27.376060 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01b99c13-5819-4333-8fa8-04b30c1551ad-catalog-content\") pod \"community-operators-nbn2x\" (UID: \"01b99c13-5819-4333-8fa8-04b30c1551ad\") " pod="openshift-marketplace/community-operators-nbn2x"
Mar 10 22:39:27 crc kubenswrapper[4919]: I0310 22:39:27.397591 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxlrr\" (UniqueName: \"kubernetes.io/projected/01b99c13-5819-4333-8fa8-04b30c1551ad-kube-api-access-xxlrr\") pod \"community-operators-nbn2x\" (UID: \"01b99c13-5819-4333-8fa8-04b30c1551ad\") "
pod="openshift-marketplace/community-operators-nbn2x"
Mar 10 22:39:27 crc kubenswrapper[4919]: I0310 22:39:27.593173 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nbn2x"
Mar 10 22:39:28 crc kubenswrapper[4919]: I0310 22:39:28.090578 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nbn2x"]
Mar 10 22:39:28 crc kubenswrapper[4919]: I0310 22:39:28.683655 4919 generic.go:334] "Generic (PLEG): container finished" podID="01b99c13-5819-4333-8fa8-04b30c1551ad" containerID="b962371667d6a7b38675fa2a16e1b7625960a4f4f309e8088c23d7ef89700136" exitCode=0
Mar 10 22:39:28 crc kubenswrapper[4919]: I0310 22:39:28.683701 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nbn2x" event={"ID":"01b99c13-5819-4333-8fa8-04b30c1551ad","Type":"ContainerDied","Data":"b962371667d6a7b38675fa2a16e1b7625960a4f4f309e8088c23d7ef89700136"}
Mar 10 22:39:28 crc kubenswrapper[4919]: I0310 22:39:28.683926 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nbn2x" event={"ID":"01b99c13-5819-4333-8fa8-04b30c1551ad","Type":"ContainerStarted","Data":"dd170e238fc85a901442fba687f201b3ce9dfcb3a444a3d0a5c4825513c6c669"}
Mar 10 22:39:29 crc kubenswrapper[4919]: I0310 22:39:29.692776 4919 generic.go:334] "Generic (PLEG): container finished" podID="01b99c13-5819-4333-8fa8-04b30c1551ad" containerID="6bb858139000563b5a0554a9bfff4227c5b82df2c929343b719a84bad66b9e00" exitCode=0
Mar 10 22:39:29 crc kubenswrapper[4919]: I0310 22:39:29.692821 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nbn2x" event={"ID":"01b99c13-5819-4333-8fa8-04b30c1551ad","Type":"ContainerDied","Data":"6bb858139000563b5a0554a9bfff4227c5b82df2c929343b719a84bad66b9e00"}
Mar 10 22:39:30 crc kubenswrapper[4919]: I0310 22:39:30.203377 4919 kubelet.go:2421] "SyncLoop ADD"
source="api" pods=["openshift-marketplace/redhat-marketplace-k7k2l"]
Mar 10 22:39:30 crc kubenswrapper[4919]: I0310 22:39:30.205744 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k7k2l"
Mar 10 22:39:30 crc kubenswrapper[4919]: I0310 22:39:30.220922 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-k7k2l"]
Mar 10 22:39:30 crc kubenswrapper[4919]: I0310 22:39:30.312379 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1407476a-29ad-4d26-9fdc-511738296858-catalog-content\") pod \"redhat-marketplace-k7k2l\" (UID: \"1407476a-29ad-4d26-9fdc-511738296858\") " pod="openshift-marketplace/redhat-marketplace-k7k2l"
Mar 10 22:39:30 crc kubenswrapper[4919]: I0310 22:39:30.312585 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1407476a-29ad-4d26-9fdc-511738296858-utilities\") pod \"redhat-marketplace-k7k2l\" (UID: \"1407476a-29ad-4d26-9fdc-511738296858\") " pod="openshift-marketplace/redhat-marketplace-k7k2l"
Mar 10 22:39:30 crc kubenswrapper[4919]: I0310 22:39:30.312670 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6pmg\" (UniqueName: \"kubernetes.io/projected/1407476a-29ad-4d26-9fdc-511738296858-kube-api-access-x6pmg\") pod \"redhat-marketplace-k7k2l\" (UID: \"1407476a-29ad-4d26-9fdc-511738296858\") " pod="openshift-marketplace/redhat-marketplace-k7k2l"
Mar 10 22:39:30 crc kubenswrapper[4919]: I0310 22:39:30.413498 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x6pmg\" (UniqueName: \"kubernetes.io/projected/1407476a-29ad-4d26-9fdc-511738296858-kube-api-access-x6pmg\") pod \"redhat-marketplace-k7k2l\" (UID:
\"1407476a-29ad-4d26-9fdc-511738296858\") " pod="openshift-marketplace/redhat-marketplace-k7k2l"
Mar 10 22:39:30 crc kubenswrapper[4919]: I0310 22:39:30.413598 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1407476a-29ad-4d26-9fdc-511738296858-catalog-content\") pod \"redhat-marketplace-k7k2l\" (UID: \"1407476a-29ad-4d26-9fdc-511738296858\") " pod="openshift-marketplace/redhat-marketplace-k7k2l"
Mar 10 22:39:30 crc kubenswrapper[4919]: I0310 22:39:30.413672 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1407476a-29ad-4d26-9fdc-511738296858-utilities\") pod \"redhat-marketplace-k7k2l\" (UID: \"1407476a-29ad-4d26-9fdc-511738296858\") " pod="openshift-marketplace/redhat-marketplace-k7k2l"
Mar 10 22:39:30 crc kubenswrapper[4919]: I0310 22:39:30.414198 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1407476a-29ad-4d26-9fdc-511738296858-catalog-content\") pod \"redhat-marketplace-k7k2l\" (UID: \"1407476a-29ad-4d26-9fdc-511738296858\") " pod="openshift-marketplace/redhat-marketplace-k7k2l"
Mar 10 22:39:30 crc kubenswrapper[4919]: I0310 22:39:30.414229 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1407476a-29ad-4d26-9fdc-511738296858-utilities\") pod \"redhat-marketplace-k7k2l\" (UID: \"1407476a-29ad-4d26-9fdc-511738296858\") " pod="openshift-marketplace/redhat-marketplace-k7k2l"
Mar 10 22:39:30 crc kubenswrapper[4919]: I0310 22:39:30.432092 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x6pmg\" (UniqueName: \"kubernetes.io/projected/1407476a-29ad-4d26-9fdc-511738296858-kube-api-access-x6pmg\") pod \"redhat-marketplace-k7k2l\" (UID: \"1407476a-29ad-4d26-9fdc-511738296858\") "
pod="openshift-marketplace/redhat-marketplace-k7k2l"
Mar 10 22:39:30 crc kubenswrapper[4919]: I0310 22:39:30.552799 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k7k2l"
Mar 10 22:39:30 crc kubenswrapper[4919]: I0310 22:39:30.713426 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nbn2x" event={"ID":"01b99c13-5819-4333-8fa8-04b30c1551ad","Type":"ContainerStarted","Data":"60c877ac44a19b282d22263500f7f19e9764077eb15b153f152571a9b6801de5"}
Mar 10 22:39:30 crc kubenswrapper[4919]: I0310 22:39:30.749482 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-nbn2x" podStartSLOduration=2.3500176010000002 podStartE2EDuration="3.7494616s" podCreationTimestamp="2026-03-10 22:39:27 +0000 UTC" firstStartedPulling="2026-03-10 22:39:28.685682179 +0000 UTC m=+2955.927562787" lastFinishedPulling="2026-03-10 22:39:30.085126168 +0000 UTC m=+2957.327006786" observedRunningTime="2026-03-10 22:39:30.741648139 +0000 UTC m=+2957.983528747" watchObservedRunningTime="2026-03-10 22:39:30.7494616 +0000 UTC m=+2957.991342208"
Mar 10 22:39:31 crc kubenswrapper[4919]: I0310 22:39:31.055204 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-k7k2l"]
Mar 10 22:39:31 crc kubenswrapper[4919]: W0310 22:39:31.060544 4919 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1407476a_29ad_4d26_9fdc_511738296858.slice/crio-96a5b743ebab82c26d499f75777fd6b035800e5e8dc204f22c1515cee5d3103a WatchSource:0}: Error finding container 96a5b743ebab82c26d499f75777fd6b035800e5e8dc204f22c1515cee5d3103a: Status 404 returned error can't find the container with id 96a5b743ebab82c26d499f75777fd6b035800e5e8dc204f22c1515cee5d3103a
Mar 10 22:39:31 crc kubenswrapper[4919]: I0310 22:39:31.721482 4919
generic.go:334] "Generic (PLEG): container finished" podID="1407476a-29ad-4d26-9fdc-511738296858" containerID="9b5126891c08f787be1814b304883bb41278d7eeeee812c0d89024815871dc1c" exitCode=0
Mar 10 22:39:31 crc kubenswrapper[4919]: I0310 22:39:31.721636 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k7k2l" event={"ID":"1407476a-29ad-4d26-9fdc-511738296858","Type":"ContainerDied","Data":"9b5126891c08f787be1814b304883bb41278d7eeeee812c0d89024815871dc1c"}
Mar 10 22:39:31 crc kubenswrapper[4919]: I0310 22:39:31.721898 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k7k2l" event={"ID":"1407476a-29ad-4d26-9fdc-511738296858","Type":"ContainerStarted","Data":"96a5b743ebab82c26d499f75777fd6b035800e5e8dc204f22c1515cee5d3103a"}
Mar 10 22:39:32 crc kubenswrapper[4919]: I0310 22:39:32.729054 4919 generic.go:334] "Generic (PLEG): container finished" podID="1407476a-29ad-4d26-9fdc-511738296858" containerID="12d8a03510f4eafe96b010935e1e0642db4ed1357d73c670a8260c63f58e1d59" exitCode=0
Mar 10 22:39:32 crc kubenswrapper[4919]: I0310 22:39:32.729142 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k7k2l" event={"ID":"1407476a-29ad-4d26-9fdc-511738296858","Type":"ContainerDied","Data":"12d8a03510f4eafe96b010935e1e0642db4ed1357d73c670a8260c63f58e1d59"}
Mar 10 22:39:33 crc kubenswrapper[4919]: I0310 22:39:33.736296 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k7k2l" event={"ID":"1407476a-29ad-4d26-9fdc-511738296858","Type":"ContainerStarted","Data":"a648e3fb19bd716022d3de433c51f444d0bf872f02a6ec7613ef257eda54350d"}
Mar 10 22:39:33 crc kubenswrapper[4919]: I0310 22:39:33.757644 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-k7k2l" podStartSLOduration=2.3293711200000002 podStartE2EDuration="3.75760139s"
podCreationTimestamp="2026-03-10 22:39:30 +0000 UTC" firstStartedPulling="2026-03-10 22:39:31.723361671 +0000 UTC m=+2958.965242269" lastFinishedPulling="2026-03-10 22:39:33.151591931 +0000 UTC m=+2960.393472539" observedRunningTime="2026-03-10 22:39:33.755573434 +0000 UTC m=+2960.997454072" watchObservedRunningTime="2026-03-10 22:39:33.75760139 +0000 UTC m=+2960.999481998"
Mar 10 22:39:37 crc kubenswrapper[4919]: I0310 22:39:37.480675 4919 scope.go:117] "RemoveContainer" containerID="cf9c15cde0a4a7044aba38580ba691c219a144d0823bd8e43395fedfbd05ea94"
Mar 10 22:39:37 crc kubenswrapper[4919]: E0310 22:39:37.481373 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc"
Mar 10 22:39:37 crc kubenswrapper[4919]: I0310 22:39:37.594625 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-nbn2x"
Mar 10 22:39:37 crc kubenswrapper[4919]: I0310 22:39:37.594712 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-nbn2x"
Mar 10 22:39:37 crc kubenswrapper[4919]: I0310 22:39:37.653211 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-nbn2x"
Mar 10 22:39:37 crc kubenswrapper[4919]: I0310 22:39:37.805804 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-nbn2x"
Mar 10 22:39:37 crc kubenswrapper[4919]: I0310 22:39:37.889188 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-nbn2x"]
Mar 10 22:39:39
crc kubenswrapper[4919]: I0310 22:39:39.778145 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-nbn2x" podUID="01b99c13-5819-4333-8fa8-04b30c1551ad" containerName="registry-server" containerID="cri-o://60c877ac44a19b282d22263500f7f19e9764077eb15b153f152571a9b6801de5" gracePeriod=2
Mar 10 22:39:40 crc kubenswrapper[4919]: I0310 22:39:40.145976 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nbn2x"
Mar 10 22:39:40 crc kubenswrapper[4919]: I0310 22:39:40.278579 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xxlrr\" (UniqueName: \"kubernetes.io/projected/01b99c13-5819-4333-8fa8-04b30c1551ad-kube-api-access-xxlrr\") pod \"01b99c13-5819-4333-8fa8-04b30c1551ad\" (UID: \"01b99c13-5819-4333-8fa8-04b30c1551ad\") "
Mar 10 22:39:40 crc kubenswrapper[4919]: I0310 22:39:40.278633 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01b99c13-5819-4333-8fa8-04b30c1551ad-utilities\") pod \"01b99c13-5819-4333-8fa8-04b30c1551ad\" (UID: \"01b99c13-5819-4333-8fa8-04b30c1551ad\") "
Mar 10 22:39:40 crc kubenswrapper[4919]: I0310 22:39:40.278711 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01b99c13-5819-4333-8fa8-04b30c1551ad-catalog-content\") pod \"01b99c13-5819-4333-8fa8-04b30c1551ad\" (UID: \"01b99c13-5819-4333-8fa8-04b30c1551ad\") "
Mar 10 22:39:40 crc kubenswrapper[4919]: I0310 22:39:40.280146 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01b99c13-5819-4333-8fa8-04b30c1551ad-utilities" (OuterVolumeSpecName: "utilities") pod "01b99c13-5819-4333-8fa8-04b30c1551ad" (UID: "01b99c13-5819-4333-8fa8-04b30c1551ad"). InnerVolumeSpecName "utilities".
PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 22:39:40 crc kubenswrapper[4919]: I0310 22:39:40.290317 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01b99c13-5819-4333-8fa8-04b30c1551ad-kube-api-access-xxlrr" (OuterVolumeSpecName: "kube-api-access-xxlrr") pod "01b99c13-5819-4333-8fa8-04b30c1551ad" (UID: "01b99c13-5819-4333-8fa8-04b30c1551ad"). InnerVolumeSpecName "kube-api-access-xxlrr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 22:39:40 crc kubenswrapper[4919]: I0310 22:39:40.345812 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01b99c13-5819-4333-8fa8-04b30c1551ad-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "01b99c13-5819-4333-8fa8-04b30c1551ad" (UID: "01b99c13-5819-4333-8fa8-04b30c1551ad"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 22:39:40 crc kubenswrapper[4919]: I0310 22:39:40.380051 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xxlrr\" (UniqueName: \"kubernetes.io/projected/01b99c13-5819-4333-8fa8-04b30c1551ad-kube-api-access-xxlrr\") on node \"crc\" DevicePath \"\""
Mar 10 22:39:40 crc kubenswrapper[4919]: I0310 22:39:40.380308 4919 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01b99c13-5819-4333-8fa8-04b30c1551ad-utilities\") on node \"crc\" DevicePath \"\""
Mar 10 22:39:40 crc kubenswrapper[4919]: I0310 22:39:40.380383 4919 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01b99c13-5819-4333-8fa8-04b30c1551ad-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 10 22:39:40 crc kubenswrapper[4919]: I0310 22:39:40.574717 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-k7k2l"
Mar 10 22:39:40 crc kubenswrapper[4919]:
I0310 22:39:40.574794 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-k7k2l"
Mar 10 22:39:40 crc kubenswrapper[4919]: I0310 22:39:40.613607 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-k7k2l"
Mar 10 22:39:40 crc kubenswrapper[4919]: I0310 22:39:40.787639 4919 generic.go:334] "Generic (PLEG): container finished" podID="01b99c13-5819-4333-8fa8-04b30c1551ad" containerID="60c877ac44a19b282d22263500f7f19e9764077eb15b153f152571a9b6801de5" exitCode=0
Mar 10 22:39:40 crc kubenswrapper[4919]: I0310 22:39:40.787707 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nbn2x" event={"ID":"01b99c13-5819-4333-8fa8-04b30c1551ad","Type":"ContainerDied","Data":"60c877ac44a19b282d22263500f7f19e9764077eb15b153f152571a9b6801de5"}
Mar 10 22:39:40 crc kubenswrapper[4919]: I0310 22:39:40.787755 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nbn2x" event={"ID":"01b99c13-5819-4333-8fa8-04b30c1551ad","Type":"ContainerDied","Data":"dd170e238fc85a901442fba687f201b3ce9dfcb3a444a3d0a5c4825513c6c669"}
Mar 10 22:39:40 crc kubenswrapper[4919]: I0310 22:39:40.787751 4919 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/community-operators-nbn2x"
Mar 10 22:39:40 crc kubenswrapper[4919]: I0310 22:39:40.787770 4919 scope.go:117] "RemoveContainer" containerID="60c877ac44a19b282d22263500f7f19e9764077eb15b153f152571a9b6801de5"
Mar 10 22:39:40 crc kubenswrapper[4919]: I0310 22:39:40.827007 4919 scope.go:117] "RemoveContainer" containerID="6bb858139000563b5a0554a9bfff4227c5b82df2c929343b719a84bad66b9e00"
Mar 10 22:39:40 crc kubenswrapper[4919]: I0310 22:39:40.830346 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-nbn2x"]
Mar 10 22:39:40 crc kubenswrapper[4919]: I0310 22:39:40.834950 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-k7k2l"
Mar 10 22:39:40 crc kubenswrapper[4919]: I0310 22:39:40.836473 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-nbn2x"]
Mar 10 22:39:40 crc kubenswrapper[4919]: I0310 22:39:40.846615 4919 scope.go:117] "RemoveContainer" containerID="b962371667d6a7b38675fa2a16e1b7625960a4f4f309e8088c23d7ef89700136"
Mar 10 22:39:40 crc kubenswrapper[4919]: I0310 22:39:40.878597 4919 scope.go:117] "RemoveContainer" containerID="60c877ac44a19b282d22263500f7f19e9764077eb15b153f152571a9b6801de5"
Mar 10 22:39:40 crc kubenswrapper[4919]: E0310 22:39:40.879277 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"60c877ac44a19b282d22263500f7f19e9764077eb15b153f152571a9b6801de5\": container with ID starting with 60c877ac44a19b282d22263500f7f19e9764077eb15b153f152571a9b6801de5 not found: ID does not exist" containerID="60c877ac44a19b282d22263500f7f19e9764077eb15b153f152571a9b6801de5"
Mar 10 22:39:40 crc kubenswrapper[4919]: I0310 22:39:40.879323 4919 pod_container_deletor.go:53] "DeleteContainer returned error"
containerID={"Type":"cri-o","ID":"60c877ac44a19b282d22263500f7f19e9764077eb15b153f152571a9b6801de5"} err="failed to get container status \"60c877ac44a19b282d22263500f7f19e9764077eb15b153f152571a9b6801de5\": rpc error: code = NotFound desc = could not find container \"60c877ac44a19b282d22263500f7f19e9764077eb15b153f152571a9b6801de5\": container with ID starting with 60c877ac44a19b282d22263500f7f19e9764077eb15b153f152571a9b6801de5 not found: ID does not exist" Mar 10 22:39:40 crc kubenswrapper[4919]: I0310 22:39:40.879353 4919 scope.go:117] "RemoveContainer" containerID="6bb858139000563b5a0554a9bfff4227c5b82df2c929343b719a84bad66b9e00" Mar 10 22:39:40 crc kubenswrapper[4919]: E0310 22:39:40.879703 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6bb858139000563b5a0554a9bfff4227c5b82df2c929343b719a84bad66b9e00\": container with ID starting with 6bb858139000563b5a0554a9bfff4227c5b82df2c929343b719a84bad66b9e00 not found: ID does not exist" containerID="6bb858139000563b5a0554a9bfff4227c5b82df2c929343b719a84bad66b9e00" Mar 10 22:39:40 crc kubenswrapper[4919]: I0310 22:39:40.879755 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6bb858139000563b5a0554a9bfff4227c5b82df2c929343b719a84bad66b9e00"} err="failed to get container status \"6bb858139000563b5a0554a9bfff4227c5b82df2c929343b719a84bad66b9e00\": rpc error: code = NotFound desc = could not find container \"6bb858139000563b5a0554a9bfff4227c5b82df2c929343b719a84bad66b9e00\": container with ID starting with 6bb858139000563b5a0554a9bfff4227c5b82df2c929343b719a84bad66b9e00 not found: ID does not exist" Mar 10 22:39:40 crc kubenswrapper[4919]: I0310 22:39:40.879787 4919 scope.go:117] "RemoveContainer" containerID="b962371667d6a7b38675fa2a16e1b7625960a4f4f309e8088c23d7ef89700136" Mar 10 22:39:40 crc kubenswrapper[4919]: E0310 22:39:40.880102 4919 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"b962371667d6a7b38675fa2a16e1b7625960a4f4f309e8088c23d7ef89700136\": container with ID starting with b962371667d6a7b38675fa2a16e1b7625960a4f4f309e8088c23d7ef89700136 not found: ID does not exist" containerID="b962371667d6a7b38675fa2a16e1b7625960a4f4f309e8088c23d7ef89700136" Mar 10 22:39:40 crc kubenswrapper[4919]: I0310 22:39:40.880135 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b962371667d6a7b38675fa2a16e1b7625960a4f4f309e8088c23d7ef89700136"} err="failed to get container status \"b962371667d6a7b38675fa2a16e1b7625960a4f4f309e8088c23d7ef89700136\": rpc error: code = NotFound desc = could not find container \"b962371667d6a7b38675fa2a16e1b7625960a4f4f309e8088c23d7ef89700136\": container with ID starting with b962371667d6a7b38675fa2a16e1b7625960a4f4f309e8088c23d7ef89700136 not found: ID does not exist" Mar 10 22:39:41 crc kubenswrapper[4919]: I0310 22:39:41.488701 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01b99c13-5819-4333-8fa8-04b30c1551ad" path="/var/lib/kubelet/pods/01b99c13-5819-4333-8fa8-04b30c1551ad/volumes" Mar 10 22:39:42 crc kubenswrapper[4919]: I0310 22:39:42.895804 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-k7k2l"] Mar 10 22:39:42 crc kubenswrapper[4919]: I0310 22:39:42.896203 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-k7k2l" podUID="1407476a-29ad-4d26-9fdc-511738296858" containerName="registry-server" containerID="cri-o://a648e3fb19bd716022d3de433c51f444d0bf872f02a6ec7613ef257eda54350d" gracePeriod=2 Mar 10 22:39:43 crc kubenswrapper[4919]: I0310 22:39:43.374939 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k7k2l" Mar 10 22:39:43 crc kubenswrapper[4919]: I0310 22:39:43.526056 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1407476a-29ad-4d26-9fdc-511738296858-catalog-content\") pod \"1407476a-29ad-4d26-9fdc-511738296858\" (UID: \"1407476a-29ad-4d26-9fdc-511738296858\") " Mar 10 22:39:43 crc kubenswrapper[4919]: I0310 22:39:43.526124 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1407476a-29ad-4d26-9fdc-511738296858-utilities\") pod \"1407476a-29ad-4d26-9fdc-511738296858\" (UID: \"1407476a-29ad-4d26-9fdc-511738296858\") " Mar 10 22:39:43 crc kubenswrapper[4919]: I0310 22:39:43.526209 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x6pmg\" (UniqueName: \"kubernetes.io/projected/1407476a-29ad-4d26-9fdc-511738296858-kube-api-access-x6pmg\") pod \"1407476a-29ad-4d26-9fdc-511738296858\" (UID: \"1407476a-29ad-4d26-9fdc-511738296858\") " Mar 10 22:39:43 crc kubenswrapper[4919]: I0310 22:39:43.527154 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1407476a-29ad-4d26-9fdc-511738296858-utilities" (OuterVolumeSpecName: "utilities") pod "1407476a-29ad-4d26-9fdc-511738296858" (UID: "1407476a-29ad-4d26-9fdc-511738296858"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 22:39:43 crc kubenswrapper[4919]: I0310 22:39:43.535588 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1407476a-29ad-4d26-9fdc-511738296858-kube-api-access-x6pmg" (OuterVolumeSpecName: "kube-api-access-x6pmg") pod "1407476a-29ad-4d26-9fdc-511738296858" (UID: "1407476a-29ad-4d26-9fdc-511738296858"). InnerVolumeSpecName "kube-api-access-x6pmg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 22:39:43 crc kubenswrapper[4919]: I0310 22:39:43.550847 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1407476a-29ad-4d26-9fdc-511738296858-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1407476a-29ad-4d26-9fdc-511738296858" (UID: "1407476a-29ad-4d26-9fdc-511738296858"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 22:39:43 crc kubenswrapper[4919]: I0310 22:39:43.628403 4919 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1407476a-29ad-4d26-9fdc-511738296858-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 22:39:43 crc kubenswrapper[4919]: I0310 22:39:43.628435 4919 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1407476a-29ad-4d26-9fdc-511738296858-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 22:39:43 crc kubenswrapper[4919]: I0310 22:39:43.628445 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x6pmg\" (UniqueName: \"kubernetes.io/projected/1407476a-29ad-4d26-9fdc-511738296858-kube-api-access-x6pmg\") on node \"crc\" DevicePath \"\"" Mar 10 22:39:43 crc kubenswrapper[4919]: I0310 22:39:43.813160 4919 generic.go:334] "Generic (PLEG): container finished" podID="1407476a-29ad-4d26-9fdc-511738296858" containerID="a648e3fb19bd716022d3de433c51f444d0bf872f02a6ec7613ef257eda54350d" exitCode=0 Mar 10 22:39:43 crc kubenswrapper[4919]: I0310 22:39:43.813209 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k7k2l" event={"ID":"1407476a-29ad-4d26-9fdc-511738296858","Type":"ContainerDied","Data":"a648e3fb19bd716022d3de433c51f444d0bf872f02a6ec7613ef257eda54350d"} Mar 10 22:39:43 crc kubenswrapper[4919]: I0310 22:39:43.813241 4919 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-k7k2l" event={"ID":"1407476a-29ad-4d26-9fdc-511738296858","Type":"ContainerDied","Data":"96a5b743ebab82c26d499f75777fd6b035800e5e8dc204f22c1515cee5d3103a"} Mar 10 22:39:43 crc kubenswrapper[4919]: I0310 22:39:43.813263 4919 scope.go:117] "RemoveContainer" containerID="a648e3fb19bd716022d3de433c51f444d0bf872f02a6ec7613ef257eda54350d" Mar 10 22:39:43 crc kubenswrapper[4919]: I0310 22:39:43.813368 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k7k2l" Mar 10 22:39:43 crc kubenswrapper[4919]: I0310 22:39:43.830570 4919 scope.go:117] "RemoveContainer" containerID="12d8a03510f4eafe96b010935e1e0642db4ed1357d73c670a8260c63f58e1d59" Mar 10 22:39:43 crc kubenswrapper[4919]: I0310 22:39:43.848258 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-k7k2l"] Mar 10 22:39:43 crc kubenswrapper[4919]: I0310 22:39:43.853215 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-k7k2l"] Mar 10 22:39:43 crc kubenswrapper[4919]: I0310 22:39:43.855912 4919 scope.go:117] "RemoveContainer" containerID="9b5126891c08f787be1814b304883bb41278d7eeeee812c0d89024815871dc1c" Mar 10 22:39:43 crc kubenswrapper[4919]: I0310 22:39:43.873122 4919 scope.go:117] "RemoveContainer" containerID="a648e3fb19bd716022d3de433c51f444d0bf872f02a6ec7613ef257eda54350d" Mar 10 22:39:43 crc kubenswrapper[4919]: E0310 22:39:43.873584 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a648e3fb19bd716022d3de433c51f444d0bf872f02a6ec7613ef257eda54350d\": container with ID starting with a648e3fb19bd716022d3de433c51f444d0bf872f02a6ec7613ef257eda54350d not found: ID does not exist" containerID="a648e3fb19bd716022d3de433c51f444d0bf872f02a6ec7613ef257eda54350d" Mar 10 22:39:43 crc kubenswrapper[4919]: I0310 22:39:43.873629 4919 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a648e3fb19bd716022d3de433c51f444d0bf872f02a6ec7613ef257eda54350d"} err="failed to get container status \"a648e3fb19bd716022d3de433c51f444d0bf872f02a6ec7613ef257eda54350d\": rpc error: code = NotFound desc = could not find container \"a648e3fb19bd716022d3de433c51f444d0bf872f02a6ec7613ef257eda54350d\": container with ID starting with a648e3fb19bd716022d3de433c51f444d0bf872f02a6ec7613ef257eda54350d not found: ID does not exist" Mar 10 22:39:43 crc kubenswrapper[4919]: I0310 22:39:43.873660 4919 scope.go:117] "RemoveContainer" containerID="12d8a03510f4eafe96b010935e1e0642db4ed1357d73c670a8260c63f58e1d59" Mar 10 22:39:43 crc kubenswrapper[4919]: E0310 22:39:43.874215 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"12d8a03510f4eafe96b010935e1e0642db4ed1357d73c670a8260c63f58e1d59\": container with ID starting with 12d8a03510f4eafe96b010935e1e0642db4ed1357d73c670a8260c63f58e1d59 not found: ID does not exist" containerID="12d8a03510f4eafe96b010935e1e0642db4ed1357d73c670a8260c63f58e1d59" Mar 10 22:39:43 crc kubenswrapper[4919]: I0310 22:39:43.874243 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12d8a03510f4eafe96b010935e1e0642db4ed1357d73c670a8260c63f58e1d59"} err="failed to get container status \"12d8a03510f4eafe96b010935e1e0642db4ed1357d73c670a8260c63f58e1d59\": rpc error: code = NotFound desc = could not find container \"12d8a03510f4eafe96b010935e1e0642db4ed1357d73c670a8260c63f58e1d59\": container with ID starting with 12d8a03510f4eafe96b010935e1e0642db4ed1357d73c670a8260c63f58e1d59 not found: ID does not exist" Mar 10 22:39:43 crc kubenswrapper[4919]: I0310 22:39:43.874261 4919 scope.go:117] "RemoveContainer" containerID="9b5126891c08f787be1814b304883bb41278d7eeeee812c0d89024815871dc1c" Mar 10 22:39:43 crc kubenswrapper[4919]: E0310 
22:39:43.874606 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b5126891c08f787be1814b304883bb41278d7eeeee812c0d89024815871dc1c\": container with ID starting with 9b5126891c08f787be1814b304883bb41278d7eeeee812c0d89024815871dc1c not found: ID does not exist" containerID="9b5126891c08f787be1814b304883bb41278d7eeeee812c0d89024815871dc1c" Mar 10 22:39:43 crc kubenswrapper[4919]: I0310 22:39:43.874627 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b5126891c08f787be1814b304883bb41278d7eeeee812c0d89024815871dc1c"} err="failed to get container status \"9b5126891c08f787be1814b304883bb41278d7eeeee812c0d89024815871dc1c\": rpc error: code = NotFound desc = could not find container \"9b5126891c08f787be1814b304883bb41278d7eeeee812c0d89024815871dc1c\": container with ID starting with 9b5126891c08f787be1814b304883bb41278d7eeeee812c0d89024815871dc1c not found: ID does not exist" Mar 10 22:39:45 crc kubenswrapper[4919]: I0310 22:39:45.488873 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1407476a-29ad-4d26-9fdc-511738296858" path="/var/lib/kubelet/pods/1407476a-29ad-4d26-9fdc-511738296858/volumes" Mar 10 22:39:50 crc kubenswrapper[4919]: I0310 22:39:50.479900 4919 scope.go:117] "RemoveContainer" containerID="cf9c15cde0a4a7044aba38580ba691c219a144d0823bd8e43395fedfbd05ea94" Mar 10 22:39:50 crc kubenswrapper[4919]: E0310 22:39:50.480520 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" Mar 10 22:40:00 crc kubenswrapper[4919]: I0310 22:40:00.162850 
4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553040-25bgh"] Mar 10 22:40:00 crc kubenswrapper[4919]: E0310 22:40:00.163583 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01b99c13-5819-4333-8fa8-04b30c1551ad" containerName="extract-content" Mar 10 22:40:00 crc kubenswrapper[4919]: I0310 22:40:00.163603 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="01b99c13-5819-4333-8fa8-04b30c1551ad" containerName="extract-content" Mar 10 22:40:00 crc kubenswrapper[4919]: E0310 22:40:00.163616 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01b99c13-5819-4333-8fa8-04b30c1551ad" containerName="registry-server" Mar 10 22:40:00 crc kubenswrapper[4919]: I0310 22:40:00.163626 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="01b99c13-5819-4333-8fa8-04b30c1551ad" containerName="registry-server" Mar 10 22:40:00 crc kubenswrapper[4919]: E0310 22:40:00.163645 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1407476a-29ad-4d26-9fdc-511738296858" containerName="registry-server" Mar 10 22:40:00 crc kubenswrapper[4919]: I0310 22:40:00.163657 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="1407476a-29ad-4d26-9fdc-511738296858" containerName="registry-server" Mar 10 22:40:00 crc kubenswrapper[4919]: E0310 22:40:00.163674 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1407476a-29ad-4d26-9fdc-511738296858" containerName="extract-utilities" Mar 10 22:40:00 crc kubenswrapper[4919]: I0310 22:40:00.163684 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="1407476a-29ad-4d26-9fdc-511738296858" containerName="extract-utilities" Mar 10 22:40:00 crc kubenswrapper[4919]: E0310 22:40:00.163704 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1407476a-29ad-4d26-9fdc-511738296858" containerName="extract-content" Mar 10 22:40:00 crc kubenswrapper[4919]: I0310 22:40:00.163713 4919 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="1407476a-29ad-4d26-9fdc-511738296858" containerName="extract-content" Mar 10 22:40:00 crc kubenswrapper[4919]: E0310 22:40:00.163735 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01b99c13-5819-4333-8fa8-04b30c1551ad" containerName="extract-utilities" Mar 10 22:40:00 crc kubenswrapper[4919]: I0310 22:40:00.163746 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="01b99c13-5819-4333-8fa8-04b30c1551ad" containerName="extract-utilities" Mar 10 22:40:00 crc kubenswrapper[4919]: I0310 22:40:00.163960 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="1407476a-29ad-4d26-9fdc-511738296858" containerName="registry-server" Mar 10 22:40:00 crc kubenswrapper[4919]: I0310 22:40:00.163983 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="01b99c13-5819-4333-8fa8-04b30c1551ad" containerName="registry-server" Mar 10 22:40:00 crc kubenswrapper[4919]: I0310 22:40:00.164685 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553040-25bgh" Mar 10 22:40:00 crc kubenswrapper[4919]: I0310 22:40:00.167609 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 22:40:00 crc kubenswrapper[4919]: I0310 22:40:00.167893 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 22:40:00 crc kubenswrapper[4919]: I0310 22:40:00.168561 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wbvtv" Mar 10 22:40:00 crc kubenswrapper[4919]: I0310 22:40:00.187134 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553040-25bgh"] Mar 10 22:40:00 crc kubenswrapper[4919]: I0310 22:40:00.261110 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sfrjg\" (UniqueName: \"kubernetes.io/projected/9db28fa3-0ef9-40b9-ba85-aa65b911d979-kube-api-access-sfrjg\") pod \"auto-csr-approver-29553040-25bgh\" (UID: \"9db28fa3-0ef9-40b9-ba85-aa65b911d979\") " pod="openshift-infra/auto-csr-approver-29553040-25bgh" Mar 10 22:40:00 crc kubenswrapper[4919]: I0310 22:40:00.362414 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sfrjg\" (UniqueName: \"kubernetes.io/projected/9db28fa3-0ef9-40b9-ba85-aa65b911d979-kube-api-access-sfrjg\") pod \"auto-csr-approver-29553040-25bgh\" (UID: \"9db28fa3-0ef9-40b9-ba85-aa65b911d979\") " pod="openshift-infra/auto-csr-approver-29553040-25bgh" Mar 10 22:40:00 crc kubenswrapper[4919]: I0310 22:40:00.379301 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sfrjg\" (UniqueName: \"kubernetes.io/projected/9db28fa3-0ef9-40b9-ba85-aa65b911d979-kube-api-access-sfrjg\") pod \"auto-csr-approver-29553040-25bgh\" (UID: \"9db28fa3-0ef9-40b9-ba85-aa65b911d979\") " 
pod="openshift-infra/auto-csr-approver-29553040-25bgh" Mar 10 22:40:00 crc kubenswrapper[4919]: I0310 22:40:00.489352 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553040-25bgh" Mar 10 22:40:00 crc kubenswrapper[4919]: I0310 22:40:00.950588 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553040-25bgh"] Mar 10 22:40:01 crc kubenswrapper[4919]: I0310 22:40:01.946035 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553040-25bgh" event={"ID":"9db28fa3-0ef9-40b9-ba85-aa65b911d979","Type":"ContainerStarted","Data":"54a910f47be534d94cc01712c0c7aa802ec21fcaef3564cdfc603e35606208fd"} Mar 10 22:40:02 crc kubenswrapper[4919]: I0310 22:40:02.953429 4919 generic.go:334] "Generic (PLEG): container finished" podID="9db28fa3-0ef9-40b9-ba85-aa65b911d979" containerID="1462698ac8917a6a6ccf8b94b11c5d10de658a9672ad20caeb438544c249ab38" exitCode=0 Mar 10 22:40:02 crc kubenswrapper[4919]: I0310 22:40:02.953593 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553040-25bgh" event={"ID":"9db28fa3-0ef9-40b9-ba85-aa65b911d979","Type":"ContainerDied","Data":"1462698ac8917a6a6ccf8b94b11c5d10de658a9672ad20caeb438544c249ab38"} Mar 10 22:40:03 crc kubenswrapper[4919]: I0310 22:40:03.485089 4919 scope.go:117] "RemoveContainer" containerID="cf9c15cde0a4a7044aba38580ba691c219a144d0823bd8e43395fedfbd05ea94" Mar 10 22:40:03 crc kubenswrapper[4919]: E0310 22:40:03.485335 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" 
Mar 10 22:40:04 crc kubenswrapper[4919]: I0310 22:40:04.324915 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553040-25bgh"
Mar 10 22:40:04 crc kubenswrapper[4919]: I0310 22:40:04.424076 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sfrjg\" (UniqueName: \"kubernetes.io/projected/9db28fa3-0ef9-40b9-ba85-aa65b911d979-kube-api-access-sfrjg\") pod \"9db28fa3-0ef9-40b9-ba85-aa65b911d979\" (UID: \"9db28fa3-0ef9-40b9-ba85-aa65b911d979\") "
Mar 10 22:40:04 crc kubenswrapper[4919]: I0310 22:40:04.436747 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9db28fa3-0ef9-40b9-ba85-aa65b911d979-kube-api-access-sfrjg" (OuterVolumeSpecName: "kube-api-access-sfrjg") pod "9db28fa3-0ef9-40b9-ba85-aa65b911d979" (UID: "9db28fa3-0ef9-40b9-ba85-aa65b911d979"). InnerVolumeSpecName "kube-api-access-sfrjg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 22:40:04 crc kubenswrapper[4919]: I0310 22:40:04.525227 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sfrjg\" (UniqueName: \"kubernetes.io/projected/9db28fa3-0ef9-40b9-ba85-aa65b911d979-kube-api-access-sfrjg\") on node \"crc\" DevicePath \"\""
Mar 10 22:40:04 crc kubenswrapper[4919]: I0310 22:40:04.970375 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553040-25bgh" event={"ID":"9db28fa3-0ef9-40b9-ba85-aa65b911d979","Type":"ContainerDied","Data":"54a910f47be534d94cc01712c0c7aa802ec21fcaef3564cdfc603e35606208fd"}
Mar 10 22:40:04 crc kubenswrapper[4919]: I0310 22:40:04.971022 4919 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="54a910f47be534d94cc01712c0c7aa802ec21fcaef3564cdfc603e35606208fd"
Mar 10 22:40:04 crc kubenswrapper[4919]: I0310 22:40:04.970435 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553040-25bgh"
Mar 10 22:40:05 crc kubenswrapper[4919]: I0310 22:40:05.401107 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553034-l6slc"]
Mar 10 22:40:05 crc kubenswrapper[4919]: I0310 22:40:05.405947 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553034-l6slc"]
Mar 10 22:40:05 crc kubenswrapper[4919]: I0310 22:40:05.488220 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e2821ca8-8e4c-408e-a6ab-81206c355afb" path="/var/lib/kubelet/pods/e2821ca8-8e4c-408e-a6ab-81206c355afb/volumes"
Mar 10 22:40:16 crc kubenswrapper[4919]: I0310 22:40:16.479615 4919 scope.go:117] "RemoveContainer" containerID="cf9c15cde0a4a7044aba38580ba691c219a144d0823bd8e43395fedfbd05ea94"
Mar 10 22:40:16 crc kubenswrapper[4919]: E0310 22:40:16.480317 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc"
Mar 10 22:40:29 crc kubenswrapper[4919]: I0310 22:40:29.479988 4919 scope.go:117] "RemoveContainer" containerID="cf9c15cde0a4a7044aba38580ba691c219a144d0823bd8e43395fedfbd05ea94"
Mar 10 22:40:29 crc kubenswrapper[4919]: E0310 22:40:29.480795 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc"
Mar 10 22:40:44 crc kubenswrapper[4919]: I0310 22:40:44.479955 4919 scope.go:117] "RemoveContainer" containerID="cf9c15cde0a4a7044aba38580ba691c219a144d0823bd8e43395fedfbd05ea94"
Mar 10 22:40:44 crc kubenswrapper[4919]: E0310 22:40:44.481649 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc"
Mar 10 22:40:44 crc kubenswrapper[4919]: I0310 22:40:44.497747 4919 scope.go:117] "RemoveContainer" containerID="67064bf0f66214658f8b8f245f2943060e5402c2e43126363b040cfce267001e"
Mar 10 22:40:56 crc kubenswrapper[4919]: I0310 22:40:56.484010 4919 scope.go:117] "RemoveContainer" containerID="cf9c15cde0a4a7044aba38580ba691c219a144d0823bd8e43395fedfbd05ea94"
Mar 10 22:40:56 crc kubenswrapper[4919]: E0310 22:40:56.484824 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc"
Mar 10 22:41:07 crc kubenswrapper[4919]: I0310 22:41:07.480119 4919 scope.go:117] "RemoveContainer" containerID="cf9c15cde0a4a7044aba38580ba691c219a144d0823bd8e43395fedfbd05ea94"
Mar 10 22:41:07 crc kubenswrapper[4919]: E0310 22:41:07.481354 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc"
Mar 10 22:41:20 crc kubenswrapper[4919]: I0310 22:41:20.479804 4919 scope.go:117] "RemoveContainer" containerID="cf9c15cde0a4a7044aba38580ba691c219a144d0823bd8e43395fedfbd05ea94"
Mar 10 22:41:20 crc kubenswrapper[4919]: E0310 22:41:20.480615 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc"
Mar 10 22:41:31 crc kubenswrapper[4919]: I0310 22:41:31.480555 4919 scope.go:117] "RemoveContainer" containerID="cf9c15cde0a4a7044aba38580ba691c219a144d0823bd8e43395fedfbd05ea94"
Mar 10 22:41:31 crc kubenswrapper[4919]: E0310 22:41:31.481212 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc"
Mar 10 22:41:45 crc kubenswrapper[4919]: I0310 22:41:45.480571 4919 scope.go:117] "RemoveContainer" containerID="cf9c15cde0a4a7044aba38580ba691c219a144d0823bd8e43395fedfbd05ea94"
Mar 10 22:41:45 crc kubenswrapper[4919]: E0310 22:41:45.481448 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc"
Mar 10 22:41:58 crc kubenswrapper[4919]: I0310 22:41:58.480663 4919 scope.go:117] "RemoveContainer" containerID="cf9c15cde0a4a7044aba38580ba691c219a144d0823bd8e43395fedfbd05ea94"
Mar 10 22:41:58 crc kubenswrapper[4919]: E0310 22:41:58.481737 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc"
Mar 10 22:42:00 crc kubenswrapper[4919]: I0310 22:42:00.153462 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553042-zfjds"]
Mar 10 22:42:00 crc kubenswrapper[4919]: E0310 22:42:00.165267 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9db28fa3-0ef9-40b9-ba85-aa65b911d979" containerName="oc"
Mar 10 22:42:00 crc kubenswrapper[4919]: I0310 22:42:00.165313 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="9db28fa3-0ef9-40b9-ba85-aa65b911d979" containerName="oc"
Mar 10 22:42:00 crc kubenswrapper[4919]: I0310 22:42:00.165754 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="9db28fa3-0ef9-40b9-ba85-aa65b911d979" containerName="oc"
Mar 10 22:42:00 crc kubenswrapper[4919]: I0310 22:42:00.168849 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553042-zfjds"]
Mar 10 22:42:00 crc kubenswrapper[4919]: I0310 22:42:00.168956 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553042-zfjds"
Mar 10 22:42:00 crc kubenswrapper[4919]: I0310 22:42:00.172499 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wbvtv"
Mar 10 22:42:00 crc kubenswrapper[4919]: I0310 22:42:00.172682 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 10 22:42:00 crc kubenswrapper[4919]: I0310 22:42:00.174087 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 10 22:42:00 crc kubenswrapper[4919]: I0310 22:42:00.297481 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ltvrt\" (UniqueName: \"kubernetes.io/projected/38361e86-2631-471b-aee4-b0b0b8da613b-kube-api-access-ltvrt\") pod \"auto-csr-approver-29553042-zfjds\" (UID: \"38361e86-2631-471b-aee4-b0b0b8da613b\") " pod="openshift-infra/auto-csr-approver-29553042-zfjds"
Mar 10 22:42:00 crc kubenswrapper[4919]: I0310 22:42:00.398534 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ltvrt\" (UniqueName: \"kubernetes.io/projected/38361e86-2631-471b-aee4-b0b0b8da613b-kube-api-access-ltvrt\") pod \"auto-csr-approver-29553042-zfjds\" (UID: \"38361e86-2631-471b-aee4-b0b0b8da613b\") " pod="openshift-infra/auto-csr-approver-29553042-zfjds"
Mar 10 22:42:00 crc kubenswrapper[4919]: I0310 22:42:00.423853 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ltvrt\" (UniqueName: \"kubernetes.io/projected/38361e86-2631-471b-aee4-b0b0b8da613b-kube-api-access-ltvrt\") pod \"auto-csr-approver-29553042-zfjds\" (UID: \"38361e86-2631-471b-aee4-b0b0b8da613b\") " pod="openshift-infra/auto-csr-approver-29553042-zfjds"
Mar 10 22:42:00 crc kubenswrapper[4919]: I0310 22:42:00.502504 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553042-zfjds"
Mar 10 22:42:00 crc kubenswrapper[4919]: I0310 22:42:00.968330 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553042-zfjds"]
Mar 10 22:42:01 crc kubenswrapper[4919]: I0310 22:42:01.919926 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553042-zfjds" event={"ID":"38361e86-2631-471b-aee4-b0b0b8da613b","Type":"ContainerStarted","Data":"799a44de74f9063b4b5f2a028d23bc0c1646fe707e53a23cec074e603e6e3cfd"}
Mar 10 22:42:02 crc kubenswrapper[4919]: I0310 22:42:02.929822 4919 generic.go:334] "Generic (PLEG): container finished" podID="38361e86-2631-471b-aee4-b0b0b8da613b" containerID="e42bba54a0285355ef71ccbd62ef4211da3ffda3eab8d1dc9eee80d63c3e59ea" exitCode=0
Mar 10 22:42:02 crc kubenswrapper[4919]: I0310 22:42:02.929887 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553042-zfjds" event={"ID":"38361e86-2631-471b-aee4-b0b0b8da613b","Type":"ContainerDied","Data":"e42bba54a0285355ef71ccbd62ef4211da3ffda3eab8d1dc9eee80d63c3e59ea"}
Mar 10 22:42:04 crc kubenswrapper[4919]: I0310 22:42:04.169595 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553042-zfjds"
Mar 10 22:42:04 crc kubenswrapper[4919]: I0310 22:42:04.363657 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ltvrt\" (UniqueName: \"kubernetes.io/projected/38361e86-2631-471b-aee4-b0b0b8da613b-kube-api-access-ltvrt\") pod \"38361e86-2631-471b-aee4-b0b0b8da613b\" (UID: \"38361e86-2631-471b-aee4-b0b0b8da613b\") "
Mar 10 22:42:04 crc kubenswrapper[4919]: I0310 22:42:04.370880 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38361e86-2631-471b-aee4-b0b0b8da613b-kube-api-access-ltvrt" (OuterVolumeSpecName: "kube-api-access-ltvrt") pod "38361e86-2631-471b-aee4-b0b0b8da613b" (UID: "38361e86-2631-471b-aee4-b0b0b8da613b"). InnerVolumeSpecName "kube-api-access-ltvrt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 22:42:04 crc kubenswrapper[4919]: I0310 22:42:04.465286 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ltvrt\" (UniqueName: \"kubernetes.io/projected/38361e86-2631-471b-aee4-b0b0b8da613b-kube-api-access-ltvrt\") on node \"crc\" DevicePath \"\""
Mar 10 22:42:04 crc kubenswrapper[4919]: I0310 22:42:04.944741 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553042-zfjds" event={"ID":"38361e86-2631-471b-aee4-b0b0b8da613b","Type":"ContainerDied","Data":"799a44de74f9063b4b5f2a028d23bc0c1646fe707e53a23cec074e603e6e3cfd"}
Mar 10 22:42:04 crc kubenswrapper[4919]: I0310 22:42:04.944778 4919 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="799a44de74f9063b4b5f2a028d23bc0c1646fe707e53a23cec074e603e6e3cfd"
Mar 10 22:42:04 crc kubenswrapper[4919]: I0310 22:42:04.945074 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553042-zfjds" Mar 10 22:42:05 crc kubenswrapper[4919]: I0310 22:42:05.238377 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553036-s9zgb"] Mar 10 22:42:05 crc kubenswrapper[4919]: I0310 22:42:05.248713 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553036-s9zgb"] Mar 10 22:42:05 crc kubenswrapper[4919]: I0310 22:42:05.492065 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="77c6040b-e446-44b1-8ef8-05c4987a9371" path="/var/lib/kubelet/pods/77c6040b-e446-44b1-8ef8-05c4987a9371/volumes" Mar 10 22:42:11 crc kubenswrapper[4919]: I0310 22:42:11.480815 4919 scope.go:117] "RemoveContainer" containerID="cf9c15cde0a4a7044aba38580ba691c219a144d0823bd8e43395fedfbd05ea94" Mar 10 22:42:11 crc kubenswrapper[4919]: E0310 22:42:11.482181 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" Mar 10 22:42:26 crc kubenswrapper[4919]: I0310 22:42:26.479904 4919 scope.go:117] "RemoveContainer" containerID="cf9c15cde0a4a7044aba38580ba691c219a144d0823bd8e43395fedfbd05ea94" Mar 10 22:42:26 crc kubenswrapper[4919]: E0310 22:42:26.480859 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" 
podUID="566678d1-f416-4116-ab20-b30dceb86cdc" Mar 10 22:42:39 crc kubenswrapper[4919]: I0310 22:42:39.479937 4919 scope.go:117] "RemoveContainer" containerID="cf9c15cde0a4a7044aba38580ba691c219a144d0823bd8e43395fedfbd05ea94" Mar 10 22:42:39 crc kubenswrapper[4919]: E0310 22:42:39.480732 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" Mar 10 22:42:44 crc kubenswrapper[4919]: I0310 22:42:44.583724 4919 scope.go:117] "RemoveContainer" containerID="4e02ec22750d4cf8cbcdc628e33726d4ef1f4c242fb06990dee53c9b5327e6fb" Mar 10 22:42:53 crc kubenswrapper[4919]: I0310 22:42:53.496828 4919 scope.go:117] "RemoveContainer" containerID="cf9c15cde0a4a7044aba38580ba691c219a144d0823bd8e43395fedfbd05ea94" Mar 10 22:42:53 crc kubenswrapper[4919]: E0310 22:42:53.497871 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" Mar 10 22:43:08 crc kubenswrapper[4919]: I0310 22:43:08.479942 4919 scope.go:117] "RemoveContainer" containerID="cf9c15cde0a4a7044aba38580ba691c219a144d0823bd8e43395fedfbd05ea94" Mar 10 22:43:08 crc kubenswrapper[4919]: E0310 22:43:08.480855 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" Mar 10 22:43:19 crc kubenswrapper[4919]: I0310 22:43:19.481145 4919 scope.go:117] "RemoveContainer" containerID="cf9c15cde0a4a7044aba38580ba691c219a144d0823bd8e43395fedfbd05ea94" Mar 10 22:43:19 crc kubenswrapper[4919]: E0310 22:43:19.482268 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" Mar 10 22:43:31 crc kubenswrapper[4919]: I0310 22:43:31.480300 4919 scope.go:117] "RemoveContainer" containerID="cf9c15cde0a4a7044aba38580ba691c219a144d0823bd8e43395fedfbd05ea94" Mar 10 22:43:31 crc kubenswrapper[4919]: E0310 22:43:31.482094 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" Mar 10 22:43:42 crc kubenswrapper[4919]: I0310 22:43:42.480481 4919 scope.go:117] "RemoveContainer" containerID="cf9c15cde0a4a7044aba38580ba691c219a144d0823bd8e43395fedfbd05ea94" Mar 10 22:43:42 crc kubenswrapper[4919]: E0310 22:43:42.481757 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" Mar 10 22:43:57 crc kubenswrapper[4919]: I0310 22:43:57.480533 4919 scope.go:117] "RemoveContainer" containerID="cf9c15cde0a4a7044aba38580ba691c219a144d0823bd8e43395fedfbd05ea94" Mar 10 22:43:57 crc kubenswrapper[4919]: E0310 22:43:57.481267 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" Mar 10 22:44:00 crc kubenswrapper[4919]: I0310 22:44:00.139632 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553044-rddsc"] Mar 10 22:44:00 crc kubenswrapper[4919]: E0310 22:44:00.140223 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38361e86-2631-471b-aee4-b0b0b8da613b" containerName="oc" Mar 10 22:44:00 crc kubenswrapper[4919]: I0310 22:44:00.140236 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="38361e86-2631-471b-aee4-b0b0b8da613b" containerName="oc" Mar 10 22:44:00 crc kubenswrapper[4919]: I0310 22:44:00.140366 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="38361e86-2631-471b-aee4-b0b0b8da613b" containerName="oc" Mar 10 22:44:00 crc kubenswrapper[4919]: I0310 22:44:00.140787 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553044-rddsc" Mar 10 22:44:00 crc kubenswrapper[4919]: I0310 22:44:00.142759 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 22:44:00 crc kubenswrapper[4919]: I0310 22:44:00.143629 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wbvtv" Mar 10 22:44:00 crc kubenswrapper[4919]: I0310 22:44:00.143614 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 22:44:00 crc kubenswrapper[4919]: I0310 22:44:00.156916 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553044-rddsc"] Mar 10 22:44:00 crc kubenswrapper[4919]: I0310 22:44:00.215553 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88mlb\" (UniqueName: \"kubernetes.io/projected/69bff7cf-85ff-450d-a7e4-5b2236343394-kube-api-access-88mlb\") pod \"auto-csr-approver-29553044-rddsc\" (UID: \"69bff7cf-85ff-450d-a7e4-5b2236343394\") " pod="openshift-infra/auto-csr-approver-29553044-rddsc" Mar 10 22:44:00 crc kubenswrapper[4919]: I0310 22:44:00.317153 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-88mlb\" (UniqueName: \"kubernetes.io/projected/69bff7cf-85ff-450d-a7e4-5b2236343394-kube-api-access-88mlb\") pod \"auto-csr-approver-29553044-rddsc\" (UID: \"69bff7cf-85ff-450d-a7e4-5b2236343394\") " pod="openshift-infra/auto-csr-approver-29553044-rddsc" Mar 10 22:44:00 crc kubenswrapper[4919]: I0310 22:44:00.338495 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-88mlb\" (UniqueName: \"kubernetes.io/projected/69bff7cf-85ff-450d-a7e4-5b2236343394-kube-api-access-88mlb\") pod \"auto-csr-approver-29553044-rddsc\" (UID: \"69bff7cf-85ff-450d-a7e4-5b2236343394\") " 
pod="openshift-infra/auto-csr-approver-29553044-rddsc" Mar 10 22:44:00 crc kubenswrapper[4919]: I0310 22:44:00.474677 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553044-rddsc" Mar 10 22:44:00 crc kubenswrapper[4919]: I0310 22:44:00.733550 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553044-rddsc"] Mar 10 22:44:00 crc kubenswrapper[4919]: I0310 22:44:00.903263 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553044-rddsc" event={"ID":"69bff7cf-85ff-450d-a7e4-5b2236343394","Type":"ContainerStarted","Data":"c20e2f2225d4464bd2d73bc8db1ceee47ca87f583c900b12c97a338ef75b3168"} Mar 10 22:44:02 crc kubenswrapper[4919]: I0310 22:44:02.921069 4919 generic.go:334] "Generic (PLEG): container finished" podID="69bff7cf-85ff-450d-a7e4-5b2236343394" containerID="24b071d9c6afe96c7c4104a8e4eb680b62e1194e450f831421ac6d143cc7dca4" exitCode=0 Mar 10 22:44:02 crc kubenswrapper[4919]: I0310 22:44:02.921178 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553044-rddsc" event={"ID":"69bff7cf-85ff-450d-a7e4-5b2236343394","Type":"ContainerDied","Data":"24b071d9c6afe96c7c4104a8e4eb680b62e1194e450f831421ac6d143cc7dca4"} Mar 10 22:44:04 crc kubenswrapper[4919]: I0310 22:44:04.202476 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553044-rddsc" Mar 10 22:44:04 crc kubenswrapper[4919]: I0310 22:44:04.276209 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-88mlb\" (UniqueName: \"kubernetes.io/projected/69bff7cf-85ff-450d-a7e4-5b2236343394-kube-api-access-88mlb\") pod \"69bff7cf-85ff-450d-a7e4-5b2236343394\" (UID: \"69bff7cf-85ff-450d-a7e4-5b2236343394\") " Mar 10 22:44:04 crc kubenswrapper[4919]: I0310 22:44:04.282498 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69bff7cf-85ff-450d-a7e4-5b2236343394-kube-api-access-88mlb" (OuterVolumeSpecName: "kube-api-access-88mlb") pod "69bff7cf-85ff-450d-a7e4-5b2236343394" (UID: "69bff7cf-85ff-450d-a7e4-5b2236343394"). InnerVolumeSpecName "kube-api-access-88mlb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 22:44:04 crc kubenswrapper[4919]: I0310 22:44:04.377379 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-88mlb\" (UniqueName: \"kubernetes.io/projected/69bff7cf-85ff-450d-a7e4-5b2236343394-kube-api-access-88mlb\") on node \"crc\" DevicePath \"\"" Mar 10 22:44:04 crc kubenswrapper[4919]: I0310 22:44:04.936652 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553044-rddsc" event={"ID":"69bff7cf-85ff-450d-a7e4-5b2236343394","Type":"ContainerDied","Data":"c20e2f2225d4464bd2d73bc8db1ceee47ca87f583c900b12c97a338ef75b3168"} Mar 10 22:44:04 crc kubenswrapper[4919]: I0310 22:44:04.936706 4919 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c20e2f2225d4464bd2d73bc8db1ceee47ca87f583c900b12c97a338ef75b3168" Mar 10 22:44:04 crc kubenswrapper[4919]: I0310 22:44:04.936741 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553044-rddsc" Mar 10 22:44:05 crc kubenswrapper[4919]: I0310 22:44:05.312589 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553038-vlnw9"] Mar 10 22:44:05 crc kubenswrapper[4919]: I0310 22:44:05.319126 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553038-vlnw9"] Mar 10 22:44:05 crc kubenswrapper[4919]: I0310 22:44:05.487599 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="329ba3ec-70e3-4c27-b210-43be1d6594e4" path="/var/lib/kubelet/pods/329ba3ec-70e3-4c27-b210-43be1d6594e4/volumes" Mar 10 22:44:09 crc kubenswrapper[4919]: I0310 22:44:09.482178 4919 scope.go:117] "RemoveContainer" containerID="cf9c15cde0a4a7044aba38580ba691c219a144d0823bd8e43395fedfbd05ea94" Mar 10 22:44:09 crc kubenswrapper[4919]: I0310 22:44:09.979062 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" event={"ID":"566678d1-f416-4116-ab20-b30dceb86cdc","Type":"ContainerStarted","Data":"372471d7ebc7335110031f4e58477a54f513fd65aed93d36e38c44ab11d01d23"} Mar 10 22:44:44 crc kubenswrapper[4919]: I0310 22:44:44.696223 4919 scope.go:117] "RemoveContainer" containerID="a867dc8d5248f236679e1bdcb4654a3e59458a158c24367f6b4ed55a071345d9" Mar 10 22:45:00 crc kubenswrapper[4919]: I0310 22:45:00.173929 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29553045-qjtdg"] Mar 10 22:45:00 crc kubenswrapper[4919]: E0310 22:45:00.174900 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69bff7cf-85ff-450d-a7e4-5b2236343394" containerName="oc" Mar 10 22:45:00 crc kubenswrapper[4919]: I0310 22:45:00.174916 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="69bff7cf-85ff-450d-a7e4-5b2236343394" containerName="oc" Mar 10 22:45:00 crc kubenswrapper[4919]: I0310 22:45:00.175114 
4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="69bff7cf-85ff-450d-a7e4-5b2236343394" containerName="oc" Mar 10 22:45:00 crc kubenswrapper[4919]: I0310 22:45:00.175785 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29553045-qjtdg" Mar 10 22:45:00 crc kubenswrapper[4919]: I0310 22:45:00.179537 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29553045-qjtdg"] Mar 10 22:45:00 crc kubenswrapper[4919]: I0310 22:45:00.201868 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 10 22:45:00 crc kubenswrapper[4919]: I0310 22:45:00.204613 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 10 22:45:00 crc kubenswrapper[4919]: I0310 22:45:00.221803 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ae254e6e-926c-44a5-b39e-c8cebf45b5a6-config-volume\") pod \"collect-profiles-29553045-qjtdg\" (UID: \"ae254e6e-926c-44a5-b39e-c8cebf45b5a6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553045-qjtdg" Mar 10 22:45:00 crc kubenswrapper[4919]: I0310 22:45:00.222252 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ae254e6e-926c-44a5-b39e-c8cebf45b5a6-secret-volume\") pod \"collect-profiles-29553045-qjtdg\" (UID: \"ae254e6e-926c-44a5-b39e-c8cebf45b5a6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553045-qjtdg" Mar 10 22:45:00 crc kubenswrapper[4919]: I0310 22:45:00.222301 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-tmmdq\" (UniqueName: \"kubernetes.io/projected/ae254e6e-926c-44a5-b39e-c8cebf45b5a6-kube-api-access-tmmdq\") pod \"collect-profiles-29553045-qjtdg\" (UID: \"ae254e6e-926c-44a5-b39e-c8cebf45b5a6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553045-qjtdg" Mar 10 22:45:00 crc kubenswrapper[4919]: I0310 22:45:00.323415 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ae254e6e-926c-44a5-b39e-c8cebf45b5a6-secret-volume\") pod \"collect-profiles-29553045-qjtdg\" (UID: \"ae254e6e-926c-44a5-b39e-c8cebf45b5a6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553045-qjtdg" Mar 10 22:45:00 crc kubenswrapper[4919]: I0310 22:45:00.323479 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tmmdq\" (UniqueName: \"kubernetes.io/projected/ae254e6e-926c-44a5-b39e-c8cebf45b5a6-kube-api-access-tmmdq\") pod \"collect-profiles-29553045-qjtdg\" (UID: \"ae254e6e-926c-44a5-b39e-c8cebf45b5a6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553045-qjtdg" Mar 10 22:45:00 crc kubenswrapper[4919]: I0310 22:45:00.323547 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ae254e6e-926c-44a5-b39e-c8cebf45b5a6-config-volume\") pod \"collect-profiles-29553045-qjtdg\" (UID: \"ae254e6e-926c-44a5-b39e-c8cebf45b5a6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553045-qjtdg" Mar 10 22:45:00 crc kubenswrapper[4919]: I0310 22:45:00.324527 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ae254e6e-926c-44a5-b39e-c8cebf45b5a6-config-volume\") pod \"collect-profiles-29553045-qjtdg\" (UID: \"ae254e6e-926c-44a5-b39e-c8cebf45b5a6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553045-qjtdg" Mar 10 22:45:00 crc 
kubenswrapper[4919]: I0310 22:45:00.330980 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ae254e6e-926c-44a5-b39e-c8cebf45b5a6-secret-volume\") pod \"collect-profiles-29553045-qjtdg\" (UID: \"ae254e6e-926c-44a5-b39e-c8cebf45b5a6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553045-qjtdg" Mar 10 22:45:00 crc kubenswrapper[4919]: I0310 22:45:00.343582 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tmmdq\" (UniqueName: \"kubernetes.io/projected/ae254e6e-926c-44a5-b39e-c8cebf45b5a6-kube-api-access-tmmdq\") pod \"collect-profiles-29553045-qjtdg\" (UID: \"ae254e6e-926c-44a5-b39e-c8cebf45b5a6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553045-qjtdg" Mar 10 22:45:00 crc kubenswrapper[4919]: I0310 22:45:00.525068 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29553045-qjtdg" Mar 10 22:45:00 crc kubenswrapper[4919]: I0310 22:45:00.967443 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29553045-qjtdg"] Mar 10 22:45:01 crc kubenswrapper[4919]: I0310 22:45:01.433167 4919 generic.go:334] "Generic (PLEG): container finished" podID="ae254e6e-926c-44a5-b39e-c8cebf45b5a6" containerID="bfeb4f6e055eb98f397864a2be27b7b069aae2fe12f7686eaf704a8d64d702cb" exitCode=0 Mar 10 22:45:01 crc kubenswrapper[4919]: I0310 22:45:01.433209 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29553045-qjtdg" event={"ID":"ae254e6e-926c-44a5-b39e-c8cebf45b5a6","Type":"ContainerDied","Data":"bfeb4f6e055eb98f397864a2be27b7b069aae2fe12f7686eaf704a8d64d702cb"} Mar 10 22:45:01 crc kubenswrapper[4919]: I0310 22:45:01.433235 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29553045-qjtdg" event={"ID":"ae254e6e-926c-44a5-b39e-c8cebf45b5a6","Type":"ContainerStarted","Data":"ca1b3eece52c2799989ccc7420a0b0c729391014c5032fc59d7fdfb2903f1643"} Mar 10 22:45:02 crc kubenswrapper[4919]: I0310 22:45:02.768492 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29553045-qjtdg" Mar 10 22:45:02 crc kubenswrapper[4919]: I0310 22:45:02.858702 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ae254e6e-926c-44a5-b39e-c8cebf45b5a6-secret-volume\") pod \"ae254e6e-926c-44a5-b39e-c8cebf45b5a6\" (UID: \"ae254e6e-926c-44a5-b39e-c8cebf45b5a6\") " Mar 10 22:45:02 crc kubenswrapper[4919]: I0310 22:45:02.858811 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tmmdq\" (UniqueName: \"kubernetes.io/projected/ae254e6e-926c-44a5-b39e-c8cebf45b5a6-kube-api-access-tmmdq\") pod \"ae254e6e-926c-44a5-b39e-c8cebf45b5a6\" (UID: \"ae254e6e-926c-44a5-b39e-c8cebf45b5a6\") " Mar 10 22:45:02 crc kubenswrapper[4919]: I0310 22:45:02.858844 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ae254e6e-926c-44a5-b39e-c8cebf45b5a6-config-volume\") pod \"ae254e6e-926c-44a5-b39e-c8cebf45b5a6\" (UID: \"ae254e6e-926c-44a5-b39e-c8cebf45b5a6\") " Mar 10 22:45:02 crc kubenswrapper[4919]: I0310 22:45:02.859507 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ae254e6e-926c-44a5-b39e-c8cebf45b5a6-config-volume" (OuterVolumeSpecName: "config-volume") pod "ae254e6e-926c-44a5-b39e-c8cebf45b5a6" (UID: "ae254e6e-926c-44a5-b39e-c8cebf45b5a6"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 22:45:02 crc kubenswrapper[4919]: I0310 22:45:02.864106 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae254e6e-926c-44a5-b39e-c8cebf45b5a6-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "ae254e6e-926c-44a5-b39e-c8cebf45b5a6" (UID: "ae254e6e-926c-44a5-b39e-c8cebf45b5a6"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 22:45:02 crc kubenswrapper[4919]: I0310 22:45:02.864293 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae254e6e-926c-44a5-b39e-c8cebf45b5a6-kube-api-access-tmmdq" (OuterVolumeSpecName: "kube-api-access-tmmdq") pod "ae254e6e-926c-44a5-b39e-c8cebf45b5a6" (UID: "ae254e6e-926c-44a5-b39e-c8cebf45b5a6"). InnerVolumeSpecName "kube-api-access-tmmdq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 22:45:02 crc kubenswrapper[4919]: I0310 22:45:02.961006 4919 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ae254e6e-926c-44a5-b39e-c8cebf45b5a6-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 10 22:45:02 crc kubenswrapper[4919]: I0310 22:45:02.961066 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tmmdq\" (UniqueName: \"kubernetes.io/projected/ae254e6e-926c-44a5-b39e-c8cebf45b5a6-kube-api-access-tmmdq\") on node \"crc\" DevicePath \"\"" Mar 10 22:45:02 crc kubenswrapper[4919]: I0310 22:45:02.961076 4919 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ae254e6e-926c-44a5-b39e-c8cebf45b5a6-config-volume\") on node \"crc\" DevicePath \"\"" Mar 10 22:45:03 crc kubenswrapper[4919]: I0310 22:45:03.448685 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29553045-qjtdg" 
event={"ID":"ae254e6e-926c-44a5-b39e-c8cebf45b5a6","Type":"ContainerDied","Data":"ca1b3eece52c2799989ccc7420a0b0c729391014c5032fc59d7fdfb2903f1643"} Mar 10 22:45:03 crc kubenswrapper[4919]: I0310 22:45:03.448720 4919 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ca1b3eece52c2799989ccc7420a0b0c729391014c5032fc59d7fdfb2903f1643" Mar 10 22:45:03 crc kubenswrapper[4919]: I0310 22:45:03.448736 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29553045-qjtdg" Mar 10 22:45:03 crc kubenswrapper[4919]: I0310 22:45:03.875618 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29553000-wwwdk"] Mar 10 22:45:03 crc kubenswrapper[4919]: I0310 22:45:03.887231 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29553000-wwwdk"] Mar 10 22:45:05 crc kubenswrapper[4919]: I0310 22:45:05.487881 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f69f107-0134-45bd-b8f5-c272fd4b8fdc" path="/var/lib/kubelet/pods/5f69f107-0134-45bd-b8f5-c272fd4b8fdc/volumes" Mar 10 22:45:44 crc kubenswrapper[4919]: I0310 22:45:44.786070 4919 scope.go:117] "RemoveContainer" containerID="67082a6347561a14fbe1f05df26dd9199717d40bdfd24a90a1441c085344f072" Mar 10 22:46:00 crc kubenswrapper[4919]: I0310 22:46:00.161832 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553046-kwmz8"] Mar 10 22:46:00 crc kubenswrapper[4919]: E0310 22:46:00.163018 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae254e6e-926c-44a5-b39e-c8cebf45b5a6" containerName="collect-profiles" Mar 10 22:46:00 crc kubenswrapper[4919]: I0310 22:46:00.163040 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae254e6e-926c-44a5-b39e-c8cebf45b5a6" containerName="collect-profiles" Mar 10 22:46:00 crc 
kubenswrapper[4919]: I0310 22:46:00.163274 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae254e6e-926c-44a5-b39e-c8cebf45b5a6" containerName="collect-profiles" Mar 10 22:46:00 crc kubenswrapper[4919]: I0310 22:46:00.163988 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553046-kwmz8" Mar 10 22:46:00 crc kubenswrapper[4919]: I0310 22:46:00.167378 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wbvtv" Mar 10 22:46:00 crc kubenswrapper[4919]: I0310 22:46:00.167528 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 22:46:00 crc kubenswrapper[4919]: I0310 22:46:00.167579 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 22:46:00 crc kubenswrapper[4919]: I0310 22:46:00.185612 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553046-kwmz8"] Mar 10 22:46:00 crc kubenswrapper[4919]: I0310 22:46:00.328276 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wk2r\" (UniqueName: \"kubernetes.io/projected/c85c10bd-7458-4c76-8e95-8100a55aa96e-kube-api-access-8wk2r\") pod \"auto-csr-approver-29553046-kwmz8\" (UID: \"c85c10bd-7458-4c76-8e95-8100a55aa96e\") " pod="openshift-infra/auto-csr-approver-29553046-kwmz8" Mar 10 22:46:00 crc kubenswrapper[4919]: I0310 22:46:00.429599 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8wk2r\" (UniqueName: \"kubernetes.io/projected/c85c10bd-7458-4c76-8e95-8100a55aa96e-kube-api-access-8wk2r\") pod \"auto-csr-approver-29553046-kwmz8\" (UID: \"c85c10bd-7458-4c76-8e95-8100a55aa96e\") " pod="openshift-infra/auto-csr-approver-29553046-kwmz8" Mar 10 22:46:00 crc kubenswrapper[4919]: I0310 22:46:00.451000 
4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wk2r\" (UniqueName: \"kubernetes.io/projected/c85c10bd-7458-4c76-8e95-8100a55aa96e-kube-api-access-8wk2r\") pod \"auto-csr-approver-29553046-kwmz8\" (UID: \"c85c10bd-7458-4c76-8e95-8100a55aa96e\") " pod="openshift-infra/auto-csr-approver-29553046-kwmz8" Mar 10 22:46:00 crc kubenswrapper[4919]: I0310 22:46:00.502362 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553046-kwmz8" Mar 10 22:46:00 crc kubenswrapper[4919]: I0310 22:46:00.941619 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553046-kwmz8"] Mar 10 22:46:00 crc kubenswrapper[4919]: I0310 22:46:00.951812 4919 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 10 22:46:00 crc kubenswrapper[4919]: I0310 22:46:00.971149 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553046-kwmz8" event={"ID":"c85c10bd-7458-4c76-8e95-8100a55aa96e","Type":"ContainerStarted","Data":"e450f013e031649e35404bfb83e7cf915f2b6a2886ba80a3f53f4c61b8e1d7ea"} Mar 10 22:46:02 crc kubenswrapper[4919]: I0310 22:46:02.990230 4919 generic.go:334] "Generic (PLEG): container finished" podID="c85c10bd-7458-4c76-8e95-8100a55aa96e" containerID="07663a94e640ea3eaacb412322f9944eda606dc071acd3577c5e49e7d975048e" exitCode=0 Mar 10 22:46:02 crc kubenswrapper[4919]: I0310 22:46:02.990347 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553046-kwmz8" event={"ID":"c85c10bd-7458-4c76-8e95-8100a55aa96e","Type":"ContainerDied","Data":"07663a94e640ea3eaacb412322f9944eda606dc071acd3577c5e49e7d975048e"} Mar 10 22:46:04 crc kubenswrapper[4919]: I0310 22:46:04.319857 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553046-kwmz8" Mar 10 22:46:04 crc kubenswrapper[4919]: I0310 22:46:04.391892 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8wk2r\" (UniqueName: \"kubernetes.io/projected/c85c10bd-7458-4c76-8e95-8100a55aa96e-kube-api-access-8wk2r\") pod \"c85c10bd-7458-4c76-8e95-8100a55aa96e\" (UID: \"c85c10bd-7458-4c76-8e95-8100a55aa96e\") " Mar 10 22:46:04 crc kubenswrapper[4919]: I0310 22:46:04.397553 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c85c10bd-7458-4c76-8e95-8100a55aa96e-kube-api-access-8wk2r" (OuterVolumeSpecName: "kube-api-access-8wk2r") pod "c85c10bd-7458-4c76-8e95-8100a55aa96e" (UID: "c85c10bd-7458-4c76-8e95-8100a55aa96e"). InnerVolumeSpecName "kube-api-access-8wk2r". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 22:46:04 crc kubenswrapper[4919]: I0310 22:46:04.493499 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8wk2r\" (UniqueName: \"kubernetes.io/projected/c85c10bd-7458-4c76-8e95-8100a55aa96e-kube-api-access-8wk2r\") on node \"crc\" DevicePath \"\"" Mar 10 22:46:05 crc kubenswrapper[4919]: I0310 22:46:05.003479 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553046-kwmz8" event={"ID":"c85c10bd-7458-4c76-8e95-8100a55aa96e","Type":"ContainerDied","Data":"e450f013e031649e35404bfb83e7cf915f2b6a2886ba80a3f53f4c61b8e1d7ea"} Mar 10 22:46:05 crc kubenswrapper[4919]: I0310 22:46:05.003523 4919 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e450f013e031649e35404bfb83e7cf915f2b6a2886ba80a3f53f4c61b8e1d7ea" Mar 10 22:46:05 crc kubenswrapper[4919]: I0310 22:46:05.003548 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553046-kwmz8" Mar 10 22:46:05 crc kubenswrapper[4919]: I0310 22:46:05.393826 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553040-25bgh"] Mar 10 22:46:05 crc kubenswrapper[4919]: I0310 22:46:05.402264 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553040-25bgh"] Mar 10 22:46:05 crc kubenswrapper[4919]: I0310 22:46:05.487905 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9db28fa3-0ef9-40b9-ba85-aa65b911d979" path="/var/lib/kubelet/pods/9db28fa3-0ef9-40b9-ba85-aa65b911d979/volumes" Mar 10 22:46:16 crc kubenswrapper[4919]: I0310 22:46:16.675254 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-j5ntx"] Mar 10 22:46:16 crc kubenswrapper[4919]: E0310 22:46:16.676113 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c85c10bd-7458-4c76-8e95-8100a55aa96e" containerName="oc" Mar 10 22:46:16 crc kubenswrapper[4919]: I0310 22:46:16.676129 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="c85c10bd-7458-4c76-8e95-8100a55aa96e" containerName="oc" Mar 10 22:46:16 crc kubenswrapper[4919]: I0310 22:46:16.676293 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="c85c10bd-7458-4c76-8e95-8100a55aa96e" containerName="oc" Mar 10 22:46:16 crc kubenswrapper[4919]: I0310 22:46:16.677486 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-j5ntx" Mar 10 22:46:16 crc kubenswrapper[4919]: I0310 22:46:16.685339 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-j5ntx"] Mar 10 22:46:16 crc kubenswrapper[4919]: I0310 22:46:16.777351 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd3ff50a-8fdb-4dd6-9628-ec3fababbf12-catalog-content\") pod \"redhat-operators-j5ntx\" (UID: \"dd3ff50a-8fdb-4dd6-9628-ec3fababbf12\") " pod="openshift-marketplace/redhat-operators-j5ntx" Mar 10 22:46:16 crc kubenswrapper[4919]: I0310 22:46:16.777480 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-524fm\" (UniqueName: \"kubernetes.io/projected/dd3ff50a-8fdb-4dd6-9628-ec3fababbf12-kube-api-access-524fm\") pod \"redhat-operators-j5ntx\" (UID: \"dd3ff50a-8fdb-4dd6-9628-ec3fababbf12\") " pod="openshift-marketplace/redhat-operators-j5ntx" Mar 10 22:46:16 crc kubenswrapper[4919]: I0310 22:46:16.777506 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd3ff50a-8fdb-4dd6-9628-ec3fababbf12-utilities\") pod \"redhat-operators-j5ntx\" (UID: \"dd3ff50a-8fdb-4dd6-9628-ec3fababbf12\") " pod="openshift-marketplace/redhat-operators-j5ntx" Mar 10 22:46:16 crc kubenswrapper[4919]: I0310 22:46:16.879263 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd3ff50a-8fdb-4dd6-9628-ec3fababbf12-catalog-content\") pod \"redhat-operators-j5ntx\" (UID: \"dd3ff50a-8fdb-4dd6-9628-ec3fababbf12\") " pod="openshift-marketplace/redhat-operators-j5ntx" Mar 10 22:46:16 crc kubenswrapper[4919]: I0310 22:46:16.879361 4919 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-524fm\" (UniqueName: \"kubernetes.io/projected/dd3ff50a-8fdb-4dd6-9628-ec3fababbf12-kube-api-access-524fm\") pod \"redhat-operators-j5ntx\" (UID: \"dd3ff50a-8fdb-4dd6-9628-ec3fababbf12\") " pod="openshift-marketplace/redhat-operators-j5ntx" Mar 10 22:46:16 crc kubenswrapper[4919]: I0310 22:46:16.879404 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd3ff50a-8fdb-4dd6-9628-ec3fababbf12-utilities\") pod \"redhat-operators-j5ntx\" (UID: \"dd3ff50a-8fdb-4dd6-9628-ec3fababbf12\") " pod="openshift-marketplace/redhat-operators-j5ntx" Mar 10 22:46:16 crc kubenswrapper[4919]: I0310 22:46:16.880420 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd3ff50a-8fdb-4dd6-9628-ec3fababbf12-utilities\") pod \"redhat-operators-j5ntx\" (UID: \"dd3ff50a-8fdb-4dd6-9628-ec3fababbf12\") " pod="openshift-marketplace/redhat-operators-j5ntx" Mar 10 22:46:16 crc kubenswrapper[4919]: I0310 22:46:16.880642 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd3ff50a-8fdb-4dd6-9628-ec3fababbf12-catalog-content\") pod \"redhat-operators-j5ntx\" (UID: \"dd3ff50a-8fdb-4dd6-9628-ec3fababbf12\") " pod="openshift-marketplace/redhat-operators-j5ntx" Mar 10 22:46:16 crc kubenswrapper[4919]: I0310 22:46:16.903520 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-524fm\" (UniqueName: \"kubernetes.io/projected/dd3ff50a-8fdb-4dd6-9628-ec3fababbf12-kube-api-access-524fm\") pod \"redhat-operators-j5ntx\" (UID: \"dd3ff50a-8fdb-4dd6-9628-ec3fababbf12\") " pod="openshift-marketplace/redhat-operators-j5ntx" Mar 10 22:46:17 crc kubenswrapper[4919]: I0310 22:46:17.011820 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-j5ntx" Mar 10 22:46:17 crc kubenswrapper[4919]: I0310 22:46:17.254561 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-j5ntx"] Mar 10 22:46:18 crc kubenswrapper[4919]: I0310 22:46:18.134749 4919 generic.go:334] "Generic (PLEG): container finished" podID="dd3ff50a-8fdb-4dd6-9628-ec3fababbf12" containerID="a037844f7e2683c2f7bc3eefdb7c8f4b6ce3725e962770ce4887ae0424745f3c" exitCode=0 Mar 10 22:46:18 crc kubenswrapper[4919]: I0310 22:46:18.134855 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j5ntx" event={"ID":"dd3ff50a-8fdb-4dd6-9628-ec3fababbf12","Type":"ContainerDied","Data":"a037844f7e2683c2f7bc3eefdb7c8f4b6ce3725e962770ce4887ae0424745f3c"} Mar 10 22:46:18 crc kubenswrapper[4919]: I0310 22:46:18.135178 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j5ntx" event={"ID":"dd3ff50a-8fdb-4dd6-9628-ec3fababbf12","Type":"ContainerStarted","Data":"2922f3010d53b8276cd0acbf5474f41ef95277b56e51014570b310a065808d65"} Mar 10 22:46:19 crc kubenswrapper[4919]: I0310 22:46:19.144566 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j5ntx" event={"ID":"dd3ff50a-8fdb-4dd6-9628-ec3fababbf12","Type":"ContainerStarted","Data":"5c5ab6a17fcc897e4409f5250efa4684a7350ac7f0b5ccd7062465d5dd8caedb"} Mar 10 22:46:20 crc kubenswrapper[4919]: I0310 22:46:20.157328 4919 generic.go:334] "Generic (PLEG): container finished" podID="dd3ff50a-8fdb-4dd6-9628-ec3fababbf12" containerID="5c5ab6a17fcc897e4409f5250efa4684a7350ac7f0b5ccd7062465d5dd8caedb" exitCode=0 Mar 10 22:46:20 crc kubenswrapper[4919]: I0310 22:46:20.157394 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j5ntx" 
event={"ID":"dd3ff50a-8fdb-4dd6-9628-ec3fababbf12","Type":"ContainerDied","Data":"5c5ab6a17fcc897e4409f5250efa4684a7350ac7f0b5ccd7062465d5dd8caedb"} Mar 10 22:46:21 crc kubenswrapper[4919]: I0310 22:46:21.167123 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j5ntx" event={"ID":"dd3ff50a-8fdb-4dd6-9628-ec3fababbf12","Type":"ContainerStarted","Data":"b1d25b89650396f07e3218762cb5c44eb7cd45dbfe7ba129e1e9ba18ee9dfffb"} Mar 10 22:46:21 crc kubenswrapper[4919]: I0310 22:46:21.185811 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-j5ntx" podStartSLOduration=2.658197613 podStartE2EDuration="5.185791239s" podCreationTimestamp="2026-03-10 22:46:16 +0000 UTC" firstStartedPulling="2026-03-10 22:46:18.13683803 +0000 UTC m=+3365.378718628" lastFinishedPulling="2026-03-10 22:46:20.664431646 +0000 UTC m=+3367.906312254" observedRunningTime="2026-03-10 22:46:21.183351842 +0000 UTC m=+3368.425232470" watchObservedRunningTime="2026-03-10 22:46:21.185791239 +0000 UTC m=+3368.427671857" Mar 10 22:46:27 crc kubenswrapper[4919]: I0310 22:46:27.012058 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-j5ntx" Mar 10 22:46:27 crc kubenswrapper[4919]: I0310 22:46:27.012681 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-j5ntx" Mar 10 22:46:27 crc kubenswrapper[4919]: I0310 22:46:27.055496 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-j5ntx" Mar 10 22:46:27 crc kubenswrapper[4919]: I0310 22:46:27.272843 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-j5ntx" Mar 10 22:46:27 crc kubenswrapper[4919]: I0310 22:46:27.319833 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-operators-j5ntx"] Mar 10 22:46:29 crc kubenswrapper[4919]: I0310 22:46:29.175363 4919 patch_prober.go:28] interesting pod/machine-config-daemon-z7v4t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 22:46:29 crc kubenswrapper[4919]: I0310 22:46:29.175808 4919 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 22:46:29 crc kubenswrapper[4919]: I0310 22:46:29.226773 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-j5ntx" podUID="dd3ff50a-8fdb-4dd6-9628-ec3fababbf12" containerName="registry-server" containerID="cri-o://b1d25b89650396f07e3218762cb5c44eb7cd45dbfe7ba129e1e9ba18ee9dfffb" gracePeriod=2 Mar 10 22:46:29 crc kubenswrapper[4919]: I0310 22:46:29.643816 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-j5ntx" Mar 10 22:46:29 crc kubenswrapper[4919]: I0310 22:46:29.775730 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-524fm\" (UniqueName: \"kubernetes.io/projected/dd3ff50a-8fdb-4dd6-9628-ec3fababbf12-kube-api-access-524fm\") pod \"dd3ff50a-8fdb-4dd6-9628-ec3fababbf12\" (UID: \"dd3ff50a-8fdb-4dd6-9628-ec3fababbf12\") " Mar 10 22:46:29 crc kubenswrapper[4919]: I0310 22:46:29.775795 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd3ff50a-8fdb-4dd6-9628-ec3fababbf12-utilities\") pod \"dd3ff50a-8fdb-4dd6-9628-ec3fababbf12\" (UID: \"dd3ff50a-8fdb-4dd6-9628-ec3fababbf12\") " Mar 10 22:46:29 crc kubenswrapper[4919]: I0310 22:46:29.775940 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd3ff50a-8fdb-4dd6-9628-ec3fababbf12-catalog-content\") pod \"dd3ff50a-8fdb-4dd6-9628-ec3fababbf12\" (UID: \"dd3ff50a-8fdb-4dd6-9628-ec3fababbf12\") " Mar 10 22:46:29 crc kubenswrapper[4919]: I0310 22:46:29.777599 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd3ff50a-8fdb-4dd6-9628-ec3fababbf12-utilities" (OuterVolumeSpecName: "utilities") pod "dd3ff50a-8fdb-4dd6-9628-ec3fababbf12" (UID: "dd3ff50a-8fdb-4dd6-9628-ec3fababbf12"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 22:46:29 crc kubenswrapper[4919]: I0310 22:46:29.781302 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd3ff50a-8fdb-4dd6-9628-ec3fababbf12-kube-api-access-524fm" (OuterVolumeSpecName: "kube-api-access-524fm") pod "dd3ff50a-8fdb-4dd6-9628-ec3fababbf12" (UID: "dd3ff50a-8fdb-4dd6-9628-ec3fababbf12"). InnerVolumeSpecName "kube-api-access-524fm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 22:46:29 crc kubenswrapper[4919]: I0310 22:46:29.878209 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-524fm\" (UniqueName: \"kubernetes.io/projected/dd3ff50a-8fdb-4dd6-9628-ec3fababbf12-kube-api-access-524fm\") on node \"crc\" DevicePath \"\"" Mar 10 22:46:29 crc kubenswrapper[4919]: I0310 22:46:29.878257 4919 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd3ff50a-8fdb-4dd6-9628-ec3fababbf12-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 22:46:30 crc kubenswrapper[4919]: I0310 22:46:30.239826 4919 generic.go:334] "Generic (PLEG): container finished" podID="dd3ff50a-8fdb-4dd6-9628-ec3fababbf12" containerID="b1d25b89650396f07e3218762cb5c44eb7cd45dbfe7ba129e1e9ba18ee9dfffb" exitCode=0 Mar 10 22:46:30 crc kubenswrapper[4919]: I0310 22:46:30.239904 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j5ntx" event={"ID":"dd3ff50a-8fdb-4dd6-9628-ec3fababbf12","Type":"ContainerDied","Data":"b1d25b89650396f07e3218762cb5c44eb7cd45dbfe7ba129e1e9ba18ee9dfffb"} Mar 10 22:46:30 crc kubenswrapper[4919]: I0310 22:46:30.239967 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j5ntx" event={"ID":"dd3ff50a-8fdb-4dd6-9628-ec3fababbf12","Type":"ContainerDied","Data":"2922f3010d53b8276cd0acbf5474f41ef95277b56e51014570b310a065808d65"} Mar 10 22:46:30 crc kubenswrapper[4919]: I0310 22:46:30.239994 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-j5ntx" Mar 10 22:46:30 crc kubenswrapper[4919]: I0310 22:46:30.240005 4919 scope.go:117] "RemoveContainer" containerID="b1d25b89650396f07e3218762cb5c44eb7cd45dbfe7ba129e1e9ba18ee9dfffb" Mar 10 22:46:30 crc kubenswrapper[4919]: I0310 22:46:30.266988 4919 scope.go:117] "RemoveContainer" containerID="5c5ab6a17fcc897e4409f5250efa4684a7350ac7f0b5ccd7062465d5dd8caedb" Mar 10 22:46:30 crc kubenswrapper[4919]: I0310 22:46:30.290190 4919 scope.go:117] "RemoveContainer" containerID="a037844f7e2683c2f7bc3eefdb7c8f4b6ce3725e962770ce4887ae0424745f3c" Mar 10 22:46:30 crc kubenswrapper[4919]: I0310 22:46:30.332628 4919 scope.go:117] "RemoveContainer" containerID="b1d25b89650396f07e3218762cb5c44eb7cd45dbfe7ba129e1e9ba18ee9dfffb" Mar 10 22:46:30 crc kubenswrapper[4919]: E0310 22:46:30.333194 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b1d25b89650396f07e3218762cb5c44eb7cd45dbfe7ba129e1e9ba18ee9dfffb\": container with ID starting with b1d25b89650396f07e3218762cb5c44eb7cd45dbfe7ba129e1e9ba18ee9dfffb not found: ID does not exist" containerID="b1d25b89650396f07e3218762cb5c44eb7cd45dbfe7ba129e1e9ba18ee9dfffb" Mar 10 22:46:30 crc kubenswrapper[4919]: I0310 22:46:30.333258 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1d25b89650396f07e3218762cb5c44eb7cd45dbfe7ba129e1e9ba18ee9dfffb"} err="failed to get container status \"b1d25b89650396f07e3218762cb5c44eb7cd45dbfe7ba129e1e9ba18ee9dfffb\": rpc error: code = NotFound desc = could not find container \"b1d25b89650396f07e3218762cb5c44eb7cd45dbfe7ba129e1e9ba18ee9dfffb\": container with ID starting with b1d25b89650396f07e3218762cb5c44eb7cd45dbfe7ba129e1e9ba18ee9dfffb not found: ID does not exist" Mar 10 22:46:30 crc kubenswrapper[4919]: I0310 22:46:30.333300 4919 scope.go:117] "RemoveContainer" 
containerID="5c5ab6a17fcc897e4409f5250efa4684a7350ac7f0b5ccd7062465d5dd8caedb" Mar 10 22:46:30 crc kubenswrapper[4919]: E0310 22:46:30.333946 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c5ab6a17fcc897e4409f5250efa4684a7350ac7f0b5ccd7062465d5dd8caedb\": container with ID starting with 5c5ab6a17fcc897e4409f5250efa4684a7350ac7f0b5ccd7062465d5dd8caedb not found: ID does not exist" containerID="5c5ab6a17fcc897e4409f5250efa4684a7350ac7f0b5ccd7062465d5dd8caedb" Mar 10 22:46:30 crc kubenswrapper[4919]: I0310 22:46:30.333989 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c5ab6a17fcc897e4409f5250efa4684a7350ac7f0b5ccd7062465d5dd8caedb"} err="failed to get container status \"5c5ab6a17fcc897e4409f5250efa4684a7350ac7f0b5ccd7062465d5dd8caedb\": rpc error: code = NotFound desc = could not find container \"5c5ab6a17fcc897e4409f5250efa4684a7350ac7f0b5ccd7062465d5dd8caedb\": container with ID starting with 5c5ab6a17fcc897e4409f5250efa4684a7350ac7f0b5ccd7062465d5dd8caedb not found: ID does not exist" Mar 10 22:46:30 crc kubenswrapper[4919]: I0310 22:46:30.334050 4919 scope.go:117] "RemoveContainer" containerID="a037844f7e2683c2f7bc3eefdb7c8f4b6ce3725e962770ce4887ae0424745f3c" Mar 10 22:46:30 crc kubenswrapper[4919]: E0310 22:46:30.334437 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a037844f7e2683c2f7bc3eefdb7c8f4b6ce3725e962770ce4887ae0424745f3c\": container with ID starting with a037844f7e2683c2f7bc3eefdb7c8f4b6ce3725e962770ce4887ae0424745f3c not found: ID does not exist" containerID="a037844f7e2683c2f7bc3eefdb7c8f4b6ce3725e962770ce4887ae0424745f3c" Mar 10 22:46:30 crc kubenswrapper[4919]: I0310 22:46:30.334492 4919 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"a037844f7e2683c2f7bc3eefdb7c8f4b6ce3725e962770ce4887ae0424745f3c"} err="failed to get container status \"a037844f7e2683c2f7bc3eefdb7c8f4b6ce3725e962770ce4887ae0424745f3c\": rpc error: code = NotFound desc = could not find container \"a037844f7e2683c2f7bc3eefdb7c8f4b6ce3725e962770ce4887ae0424745f3c\": container with ID starting with a037844f7e2683c2f7bc3eefdb7c8f4b6ce3725e962770ce4887ae0424745f3c not found: ID does not exist" Mar 10 22:46:31 crc kubenswrapper[4919]: I0310 22:46:31.419330 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd3ff50a-8fdb-4dd6-9628-ec3fababbf12-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "dd3ff50a-8fdb-4dd6-9628-ec3fababbf12" (UID: "dd3ff50a-8fdb-4dd6-9628-ec3fababbf12"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 22:46:31 crc kubenswrapper[4919]: I0310 22:46:31.474184 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-j5ntx"] Mar 10 22:46:31 crc kubenswrapper[4919]: I0310 22:46:31.489490 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-j5ntx"] Mar 10 22:46:31 crc kubenswrapper[4919]: I0310 22:46:31.503424 4919 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd3ff50a-8fdb-4dd6-9628-ec3fababbf12-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 22:46:33 crc kubenswrapper[4919]: I0310 22:46:33.487681 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd3ff50a-8fdb-4dd6-9628-ec3fababbf12" path="/var/lib/kubelet/pods/dd3ff50a-8fdb-4dd6-9628-ec3fababbf12/volumes" Mar 10 22:46:44 crc kubenswrapper[4919]: I0310 22:46:44.862965 4919 scope.go:117] "RemoveContainer" containerID="1462698ac8917a6a6ccf8b94b11c5d10de658a9672ad20caeb438544c249ab38" Mar 10 22:46:59 crc kubenswrapper[4919]: I0310 
22:46:59.175874 4919 patch_prober.go:28] interesting pod/machine-config-daemon-z7v4t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 22:46:59 crc kubenswrapper[4919]: I0310 22:46:59.176422 4919 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 22:47:29 crc kubenswrapper[4919]: I0310 22:47:29.176560 4919 patch_prober.go:28] interesting pod/machine-config-daemon-z7v4t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 22:47:29 crc kubenswrapper[4919]: I0310 22:47:29.177105 4919 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 22:47:29 crc kubenswrapper[4919]: I0310 22:47:29.177193 4919 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" Mar 10 22:47:29 crc kubenswrapper[4919]: I0310 22:47:29.177839 4919 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"372471d7ebc7335110031f4e58477a54f513fd65aed93d36e38c44ab11d01d23"} pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" 
containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 10 22:47:29 crc kubenswrapper[4919]: I0310 22:47:29.177905 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" containerName="machine-config-daemon" containerID="cri-o://372471d7ebc7335110031f4e58477a54f513fd65aed93d36e38c44ab11d01d23" gracePeriod=600 Mar 10 22:47:29 crc kubenswrapper[4919]: I0310 22:47:29.696568 4919 generic.go:334] "Generic (PLEG): container finished" podID="566678d1-f416-4116-ab20-b30dceb86cdc" containerID="372471d7ebc7335110031f4e58477a54f513fd65aed93d36e38c44ab11d01d23" exitCode=0 Mar 10 22:47:29 crc kubenswrapper[4919]: I0310 22:47:29.696741 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" event={"ID":"566678d1-f416-4116-ab20-b30dceb86cdc","Type":"ContainerDied","Data":"372471d7ebc7335110031f4e58477a54f513fd65aed93d36e38c44ab11d01d23"} Mar 10 22:47:29 crc kubenswrapper[4919]: I0310 22:47:29.697157 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" event={"ID":"566678d1-f416-4116-ab20-b30dceb86cdc","Type":"ContainerStarted","Data":"04b44634b44ad034f1ca0fde2bacc28827ffaa56935af87501bd193efff921b2"} Mar 10 22:47:29 crc kubenswrapper[4919]: I0310 22:47:29.697186 4919 scope.go:117] "RemoveContainer" containerID="cf9c15cde0a4a7044aba38580ba691c219a144d0823bd8e43395fedfbd05ea94" Mar 10 22:48:00 crc kubenswrapper[4919]: I0310 22:48:00.148225 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553048-f57pj"] Mar 10 22:48:00 crc kubenswrapper[4919]: E0310 22:48:00.149050 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd3ff50a-8fdb-4dd6-9628-ec3fababbf12" containerName="extract-utilities" Mar 10 22:48:00 crc 
kubenswrapper[4919]: I0310 22:48:00.149065 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd3ff50a-8fdb-4dd6-9628-ec3fababbf12" containerName="extract-utilities" Mar 10 22:48:00 crc kubenswrapper[4919]: E0310 22:48:00.149082 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd3ff50a-8fdb-4dd6-9628-ec3fababbf12" containerName="registry-server" Mar 10 22:48:00 crc kubenswrapper[4919]: I0310 22:48:00.149090 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd3ff50a-8fdb-4dd6-9628-ec3fababbf12" containerName="registry-server" Mar 10 22:48:00 crc kubenswrapper[4919]: E0310 22:48:00.149104 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd3ff50a-8fdb-4dd6-9628-ec3fababbf12" containerName="extract-content" Mar 10 22:48:00 crc kubenswrapper[4919]: I0310 22:48:00.149111 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd3ff50a-8fdb-4dd6-9628-ec3fababbf12" containerName="extract-content" Mar 10 22:48:00 crc kubenswrapper[4919]: I0310 22:48:00.149305 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd3ff50a-8fdb-4dd6-9628-ec3fababbf12" containerName="registry-server" Mar 10 22:48:00 crc kubenswrapper[4919]: I0310 22:48:00.149850 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553048-f57pj" Mar 10 22:48:00 crc kubenswrapper[4919]: I0310 22:48:00.151909 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wbvtv" Mar 10 22:48:00 crc kubenswrapper[4919]: I0310 22:48:00.152468 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 22:48:00 crc kubenswrapper[4919]: I0310 22:48:00.153476 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 22:48:00 crc kubenswrapper[4919]: I0310 22:48:00.155649 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553048-f57pj"] Mar 10 22:48:00 crc kubenswrapper[4919]: I0310 22:48:00.317127 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6p6b5\" (UniqueName: \"kubernetes.io/projected/e83298b5-dee4-4e12-aed2-011c8a6e2b30-kube-api-access-6p6b5\") pod \"auto-csr-approver-29553048-f57pj\" (UID: \"e83298b5-dee4-4e12-aed2-011c8a6e2b30\") " pod="openshift-infra/auto-csr-approver-29553048-f57pj" Mar 10 22:48:00 crc kubenswrapper[4919]: I0310 22:48:00.418286 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6p6b5\" (UniqueName: \"kubernetes.io/projected/e83298b5-dee4-4e12-aed2-011c8a6e2b30-kube-api-access-6p6b5\") pod \"auto-csr-approver-29553048-f57pj\" (UID: \"e83298b5-dee4-4e12-aed2-011c8a6e2b30\") " pod="openshift-infra/auto-csr-approver-29553048-f57pj" Mar 10 22:48:00 crc kubenswrapper[4919]: I0310 22:48:00.437229 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6p6b5\" (UniqueName: \"kubernetes.io/projected/e83298b5-dee4-4e12-aed2-011c8a6e2b30-kube-api-access-6p6b5\") pod \"auto-csr-approver-29553048-f57pj\" (UID: \"e83298b5-dee4-4e12-aed2-011c8a6e2b30\") " 
pod="openshift-infra/auto-csr-approver-29553048-f57pj" Mar 10 22:48:00 crc kubenswrapper[4919]: I0310 22:48:00.473029 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553048-f57pj" Mar 10 22:48:00 crc kubenswrapper[4919]: I0310 22:48:00.899213 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553048-f57pj"] Mar 10 22:48:00 crc kubenswrapper[4919]: I0310 22:48:00.960440 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553048-f57pj" event={"ID":"e83298b5-dee4-4e12-aed2-011c8a6e2b30","Type":"ContainerStarted","Data":"ab04d59641a8eed0c14b46141e146d3cde6732e247d27be55733c4dea7cb95aa"} Mar 10 22:48:02 crc kubenswrapper[4919]: I0310 22:48:02.974160 4919 generic.go:334] "Generic (PLEG): container finished" podID="e83298b5-dee4-4e12-aed2-011c8a6e2b30" containerID="af9836be971ca09598235169839f6617eb1d2ad9f6d9f052df699df6ef0bcfe0" exitCode=0 Mar 10 22:48:02 crc kubenswrapper[4919]: I0310 22:48:02.974245 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553048-f57pj" event={"ID":"e83298b5-dee4-4e12-aed2-011c8a6e2b30","Type":"ContainerDied","Data":"af9836be971ca09598235169839f6617eb1d2ad9f6d9f052df699df6ef0bcfe0"} Mar 10 22:48:04 crc kubenswrapper[4919]: I0310 22:48:04.287518 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553048-f57pj" Mar 10 22:48:04 crc kubenswrapper[4919]: I0310 22:48:04.373112 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6p6b5\" (UniqueName: \"kubernetes.io/projected/e83298b5-dee4-4e12-aed2-011c8a6e2b30-kube-api-access-6p6b5\") pod \"e83298b5-dee4-4e12-aed2-011c8a6e2b30\" (UID: \"e83298b5-dee4-4e12-aed2-011c8a6e2b30\") " Mar 10 22:48:04 crc kubenswrapper[4919]: I0310 22:48:04.380258 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e83298b5-dee4-4e12-aed2-011c8a6e2b30-kube-api-access-6p6b5" (OuterVolumeSpecName: "kube-api-access-6p6b5") pod "e83298b5-dee4-4e12-aed2-011c8a6e2b30" (UID: "e83298b5-dee4-4e12-aed2-011c8a6e2b30"). InnerVolumeSpecName "kube-api-access-6p6b5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 22:48:04 crc kubenswrapper[4919]: I0310 22:48:04.475040 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6p6b5\" (UniqueName: \"kubernetes.io/projected/e83298b5-dee4-4e12-aed2-011c8a6e2b30-kube-api-access-6p6b5\") on node \"crc\" DevicePath \"\"" Mar 10 22:48:04 crc kubenswrapper[4919]: I0310 22:48:04.989257 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553048-f57pj" event={"ID":"e83298b5-dee4-4e12-aed2-011c8a6e2b30","Type":"ContainerDied","Data":"ab04d59641a8eed0c14b46141e146d3cde6732e247d27be55733c4dea7cb95aa"} Mar 10 22:48:04 crc kubenswrapper[4919]: I0310 22:48:04.989918 4919 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ab04d59641a8eed0c14b46141e146d3cde6732e247d27be55733c4dea7cb95aa" Mar 10 22:48:04 crc kubenswrapper[4919]: I0310 22:48:04.989307 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553048-f57pj" Mar 10 22:48:05 crc kubenswrapper[4919]: I0310 22:48:05.362813 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553042-zfjds"] Mar 10 22:48:05 crc kubenswrapper[4919]: I0310 22:48:05.370051 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553042-zfjds"] Mar 10 22:48:05 crc kubenswrapper[4919]: I0310 22:48:05.487663 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38361e86-2631-471b-aee4-b0b0b8da613b" path="/var/lib/kubelet/pods/38361e86-2631-471b-aee4-b0b0b8da613b/volumes" Mar 10 22:48:44 crc kubenswrapper[4919]: I0310 22:48:44.964194 4919 scope.go:117] "RemoveContainer" containerID="e42bba54a0285355ef71ccbd62ef4211da3ffda3eab8d1dc9eee80d63c3e59ea" Mar 10 22:49:09 crc kubenswrapper[4919]: I0310 22:49:09.969329 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-88ptd"] Mar 10 22:49:09 crc kubenswrapper[4919]: E0310 22:49:09.970340 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e83298b5-dee4-4e12-aed2-011c8a6e2b30" containerName="oc" Mar 10 22:49:09 crc kubenswrapper[4919]: I0310 22:49:09.970356 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="e83298b5-dee4-4e12-aed2-011c8a6e2b30" containerName="oc" Mar 10 22:49:09 crc kubenswrapper[4919]: I0310 22:49:09.970608 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="e83298b5-dee4-4e12-aed2-011c8a6e2b30" containerName="oc" Mar 10 22:49:09 crc kubenswrapper[4919]: I0310 22:49:09.973017 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-88ptd" Mar 10 22:49:09 crc kubenswrapper[4919]: I0310 22:49:09.983060 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-88ptd"] Mar 10 22:49:10 crc kubenswrapper[4919]: I0310 22:49:10.026046 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/818a58d1-fa9a-4924-a367-fd82018ca7b9-utilities\") pod \"certified-operators-88ptd\" (UID: \"818a58d1-fa9a-4924-a367-fd82018ca7b9\") " pod="openshift-marketplace/certified-operators-88ptd" Mar 10 22:49:10 crc kubenswrapper[4919]: I0310 22:49:10.026095 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/818a58d1-fa9a-4924-a367-fd82018ca7b9-catalog-content\") pod \"certified-operators-88ptd\" (UID: \"818a58d1-fa9a-4924-a367-fd82018ca7b9\") " pod="openshift-marketplace/certified-operators-88ptd" Mar 10 22:49:10 crc kubenswrapper[4919]: I0310 22:49:10.026126 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8q4jn\" (UniqueName: \"kubernetes.io/projected/818a58d1-fa9a-4924-a367-fd82018ca7b9-kube-api-access-8q4jn\") pod \"certified-operators-88ptd\" (UID: \"818a58d1-fa9a-4924-a367-fd82018ca7b9\") " pod="openshift-marketplace/certified-operators-88ptd" Mar 10 22:49:10 crc kubenswrapper[4919]: I0310 22:49:10.128068 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8q4jn\" (UniqueName: \"kubernetes.io/projected/818a58d1-fa9a-4924-a367-fd82018ca7b9-kube-api-access-8q4jn\") pod \"certified-operators-88ptd\" (UID: \"818a58d1-fa9a-4924-a367-fd82018ca7b9\") " pod="openshift-marketplace/certified-operators-88ptd" Mar 10 22:49:10 crc kubenswrapper[4919]: I0310 22:49:10.128488 4919 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/818a58d1-fa9a-4924-a367-fd82018ca7b9-utilities\") pod \"certified-operators-88ptd\" (UID: \"818a58d1-fa9a-4924-a367-fd82018ca7b9\") " pod="openshift-marketplace/certified-operators-88ptd" Mar 10 22:49:10 crc kubenswrapper[4919]: I0310 22:49:10.128520 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/818a58d1-fa9a-4924-a367-fd82018ca7b9-catalog-content\") pod \"certified-operators-88ptd\" (UID: \"818a58d1-fa9a-4924-a367-fd82018ca7b9\") " pod="openshift-marketplace/certified-operators-88ptd" Mar 10 22:49:10 crc kubenswrapper[4919]: I0310 22:49:10.128952 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/818a58d1-fa9a-4924-a367-fd82018ca7b9-catalog-content\") pod \"certified-operators-88ptd\" (UID: \"818a58d1-fa9a-4924-a367-fd82018ca7b9\") " pod="openshift-marketplace/certified-operators-88ptd" Mar 10 22:49:10 crc kubenswrapper[4919]: I0310 22:49:10.129018 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/818a58d1-fa9a-4924-a367-fd82018ca7b9-utilities\") pod \"certified-operators-88ptd\" (UID: \"818a58d1-fa9a-4924-a367-fd82018ca7b9\") " pod="openshift-marketplace/certified-operators-88ptd" Mar 10 22:49:10 crc kubenswrapper[4919]: I0310 22:49:10.148293 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8q4jn\" (UniqueName: \"kubernetes.io/projected/818a58d1-fa9a-4924-a367-fd82018ca7b9-kube-api-access-8q4jn\") pod \"certified-operators-88ptd\" (UID: \"818a58d1-fa9a-4924-a367-fd82018ca7b9\") " pod="openshift-marketplace/certified-operators-88ptd" Mar 10 22:49:10 crc kubenswrapper[4919]: I0310 22:49:10.306707 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-88ptd" Mar 10 22:49:10 crc kubenswrapper[4919]: I0310 22:49:10.555617 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-88ptd"] Mar 10 22:49:11 crc kubenswrapper[4919]: I0310 22:49:11.505805 4919 generic.go:334] "Generic (PLEG): container finished" podID="818a58d1-fa9a-4924-a367-fd82018ca7b9" containerID="8fcf2302e4ce0db9889f5ae2974289301a54f8377ab257e4af8480e4527d05f7" exitCode=0 Mar 10 22:49:11 crc kubenswrapper[4919]: I0310 22:49:11.505848 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-88ptd" event={"ID":"818a58d1-fa9a-4924-a367-fd82018ca7b9","Type":"ContainerDied","Data":"8fcf2302e4ce0db9889f5ae2974289301a54f8377ab257e4af8480e4527d05f7"} Mar 10 22:49:11 crc kubenswrapper[4919]: I0310 22:49:11.505875 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-88ptd" event={"ID":"818a58d1-fa9a-4924-a367-fd82018ca7b9","Type":"ContainerStarted","Data":"b781297cd65549ce20720f5bf9aa4855a9bc13d758aa9df75bc7ac3193b7bf4b"} Mar 10 22:49:12 crc kubenswrapper[4919]: I0310 22:49:12.521283 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-88ptd" event={"ID":"818a58d1-fa9a-4924-a367-fd82018ca7b9","Type":"ContainerStarted","Data":"78bd7679f063f1f17c86e7fb5cf8bf67f28a1cfb5fd0ade77b0616c7a98fa878"} Mar 10 22:49:13 crc kubenswrapper[4919]: I0310 22:49:13.529718 4919 generic.go:334] "Generic (PLEG): container finished" podID="818a58d1-fa9a-4924-a367-fd82018ca7b9" containerID="78bd7679f063f1f17c86e7fb5cf8bf67f28a1cfb5fd0ade77b0616c7a98fa878" exitCode=0 Mar 10 22:49:13 crc kubenswrapper[4919]: I0310 22:49:13.529778 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-88ptd" 
event={"ID":"818a58d1-fa9a-4924-a367-fd82018ca7b9","Type":"ContainerDied","Data":"78bd7679f063f1f17c86e7fb5cf8bf67f28a1cfb5fd0ade77b0616c7a98fa878"} Mar 10 22:49:14 crc kubenswrapper[4919]: I0310 22:49:14.540148 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-88ptd" event={"ID":"818a58d1-fa9a-4924-a367-fd82018ca7b9","Type":"ContainerStarted","Data":"b3c89deae5bafd3224e3e41f2a8bb1eeae6273c35bc55f0a8d80ccb1f2650689"} Mar 10 22:49:14 crc kubenswrapper[4919]: I0310 22:49:14.563133 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-88ptd" podStartSLOduration=3.012058471 podStartE2EDuration="5.563115478s" podCreationTimestamp="2026-03-10 22:49:09 +0000 UTC" firstStartedPulling="2026-03-10 22:49:11.507749935 +0000 UTC m=+3538.749630543" lastFinishedPulling="2026-03-10 22:49:14.058806942 +0000 UTC m=+3541.300687550" observedRunningTime="2026-03-10 22:49:14.560223389 +0000 UTC m=+3541.802104027" watchObservedRunningTime="2026-03-10 22:49:14.563115478 +0000 UTC m=+3541.804996096" Mar 10 22:49:20 crc kubenswrapper[4919]: I0310 22:49:20.307345 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-88ptd" Mar 10 22:49:20 crc kubenswrapper[4919]: I0310 22:49:20.307924 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-88ptd" Mar 10 22:49:20 crc kubenswrapper[4919]: I0310 22:49:20.352503 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-88ptd" Mar 10 22:49:20 crc kubenswrapper[4919]: I0310 22:49:20.635442 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-88ptd" Mar 10 22:49:20 crc kubenswrapper[4919]: I0310 22:49:20.685319 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/certified-operators-88ptd"] Mar 10 22:49:22 crc kubenswrapper[4919]: I0310 22:49:22.605820 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-88ptd" podUID="818a58d1-fa9a-4924-a367-fd82018ca7b9" containerName="registry-server" containerID="cri-o://b3c89deae5bafd3224e3e41f2a8bb1eeae6273c35bc55f0a8d80ccb1f2650689" gracePeriod=2 Mar 10 22:49:22 crc kubenswrapper[4919]: I0310 22:49:22.996504 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-88ptd" Mar 10 22:49:23 crc kubenswrapper[4919]: I0310 22:49:23.116020 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/818a58d1-fa9a-4924-a367-fd82018ca7b9-catalog-content\") pod \"818a58d1-fa9a-4924-a367-fd82018ca7b9\" (UID: \"818a58d1-fa9a-4924-a367-fd82018ca7b9\") " Mar 10 22:49:23 crc kubenswrapper[4919]: I0310 22:49:23.116174 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/818a58d1-fa9a-4924-a367-fd82018ca7b9-utilities\") pod \"818a58d1-fa9a-4924-a367-fd82018ca7b9\" (UID: \"818a58d1-fa9a-4924-a367-fd82018ca7b9\") " Mar 10 22:49:23 crc kubenswrapper[4919]: I0310 22:49:23.116205 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8q4jn\" (UniqueName: \"kubernetes.io/projected/818a58d1-fa9a-4924-a367-fd82018ca7b9-kube-api-access-8q4jn\") pod \"818a58d1-fa9a-4924-a367-fd82018ca7b9\" (UID: \"818a58d1-fa9a-4924-a367-fd82018ca7b9\") " Mar 10 22:49:23 crc kubenswrapper[4919]: I0310 22:49:23.117129 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/818a58d1-fa9a-4924-a367-fd82018ca7b9-utilities" (OuterVolumeSpecName: "utilities") pod "818a58d1-fa9a-4924-a367-fd82018ca7b9" (UID: 
"818a58d1-fa9a-4924-a367-fd82018ca7b9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 22:49:23 crc kubenswrapper[4919]: I0310 22:49:23.125721 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/818a58d1-fa9a-4924-a367-fd82018ca7b9-kube-api-access-8q4jn" (OuterVolumeSpecName: "kube-api-access-8q4jn") pod "818a58d1-fa9a-4924-a367-fd82018ca7b9" (UID: "818a58d1-fa9a-4924-a367-fd82018ca7b9"). InnerVolumeSpecName "kube-api-access-8q4jn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 22:49:23 crc kubenswrapper[4919]: I0310 22:49:23.217807 4919 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/818a58d1-fa9a-4924-a367-fd82018ca7b9-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 22:49:23 crc kubenswrapper[4919]: I0310 22:49:23.217849 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8q4jn\" (UniqueName: \"kubernetes.io/projected/818a58d1-fa9a-4924-a367-fd82018ca7b9-kube-api-access-8q4jn\") on node \"crc\" DevicePath \"\"" Mar 10 22:49:23 crc kubenswrapper[4919]: I0310 22:49:23.384866 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/818a58d1-fa9a-4924-a367-fd82018ca7b9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "818a58d1-fa9a-4924-a367-fd82018ca7b9" (UID: "818a58d1-fa9a-4924-a367-fd82018ca7b9"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 22:49:23 crc kubenswrapper[4919]: I0310 22:49:23.421099 4919 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/818a58d1-fa9a-4924-a367-fd82018ca7b9-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 22:49:23 crc kubenswrapper[4919]: I0310 22:49:23.619704 4919 generic.go:334] "Generic (PLEG): container finished" podID="818a58d1-fa9a-4924-a367-fd82018ca7b9" containerID="b3c89deae5bafd3224e3e41f2a8bb1eeae6273c35bc55f0a8d80ccb1f2650689" exitCode=0 Mar 10 22:49:23 crc kubenswrapper[4919]: I0310 22:49:23.619770 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-88ptd" event={"ID":"818a58d1-fa9a-4924-a367-fd82018ca7b9","Type":"ContainerDied","Data":"b3c89deae5bafd3224e3e41f2a8bb1eeae6273c35bc55f0a8d80ccb1f2650689"} Mar 10 22:49:23 crc kubenswrapper[4919]: I0310 22:49:23.619850 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-88ptd" event={"ID":"818a58d1-fa9a-4924-a367-fd82018ca7b9","Type":"ContainerDied","Data":"b781297cd65549ce20720f5bf9aa4855a9bc13d758aa9df75bc7ac3193b7bf4b"} Mar 10 22:49:23 crc kubenswrapper[4919]: I0310 22:49:23.619885 4919 scope.go:117] "RemoveContainer" containerID="b3c89deae5bafd3224e3e41f2a8bb1eeae6273c35bc55f0a8d80ccb1f2650689" Mar 10 22:49:23 crc kubenswrapper[4919]: I0310 22:49:23.619788 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-88ptd" Mar 10 22:49:23 crc kubenswrapper[4919]: I0310 22:49:23.644548 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-88ptd"] Mar 10 22:49:23 crc kubenswrapper[4919]: I0310 22:49:23.646112 4919 scope.go:117] "RemoveContainer" containerID="78bd7679f063f1f17c86e7fb5cf8bf67f28a1cfb5fd0ade77b0616c7a98fa878" Mar 10 22:49:23 crc kubenswrapper[4919]: I0310 22:49:23.650478 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-88ptd"] Mar 10 22:49:23 crc kubenswrapper[4919]: I0310 22:49:23.664319 4919 scope.go:117] "RemoveContainer" containerID="8fcf2302e4ce0db9889f5ae2974289301a54f8377ab257e4af8480e4527d05f7" Mar 10 22:49:23 crc kubenswrapper[4919]: I0310 22:49:23.689050 4919 scope.go:117] "RemoveContainer" containerID="b3c89deae5bafd3224e3e41f2a8bb1eeae6273c35bc55f0a8d80ccb1f2650689" Mar 10 22:49:23 crc kubenswrapper[4919]: E0310 22:49:23.689490 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b3c89deae5bafd3224e3e41f2a8bb1eeae6273c35bc55f0a8d80ccb1f2650689\": container with ID starting with b3c89deae5bafd3224e3e41f2a8bb1eeae6273c35bc55f0a8d80ccb1f2650689 not found: ID does not exist" containerID="b3c89deae5bafd3224e3e41f2a8bb1eeae6273c35bc55f0a8d80ccb1f2650689" Mar 10 22:49:23 crc kubenswrapper[4919]: I0310 22:49:23.689535 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3c89deae5bafd3224e3e41f2a8bb1eeae6273c35bc55f0a8d80ccb1f2650689"} err="failed to get container status \"b3c89deae5bafd3224e3e41f2a8bb1eeae6273c35bc55f0a8d80ccb1f2650689\": rpc error: code = NotFound desc = could not find container \"b3c89deae5bafd3224e3e41f2a8bb1eeae6273c35bc55f0a8d80ccb1f2650689\": container with ID starting with b3c89deae5bafd3224e3e41f2a8bb1eeae6273c35bc55f0a8d80ccb1f2650689 not 
found: ID does not exist" Mar 10 22:49:23 crc kubenswrapper[4919]: I0310 22:49:23.689564 4919 scope.go:117] "RemoveContainer" containerID="78bd7679f063f1f17c86e7fb5cf8bf67f28a1cfb5fd0ade77b0616c7a98fa878" Mar 10 22:49:23 crc kubenswrapper[4919]: E0310 22:49:23.690097 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"78bd7679f063f1f17c86e7fb5cf8bf67f28a1cfb5fd0ade77b0616c7a98fa878\": container with ID starting with 78bd7679f063f1f17c86e7fb5cf8bf67f28a1cfb5fd0ade77b0616c7a98fa878 not found: ID does not exist" containerID="78bd7679f063f1f17c86e7fb5cf8bf67f28a1cfb5fd0ade77b0616c7a98fa878" Mar 10 22:49:23 crc kubenswrapper[4919]: I0310 22:49:23.690164 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78bd7679f063f1f17c86e7fb5cf8bf67f28a1cfb5fd0ade77b0616c7a98fa878"} err="failed to get container status \"78bd7679f063f1f17c86e7fb5cf8bf67f28a1cfb5fd0ade77b0616c7a98fa878\": rpc error: code = NotFound desc = could not find container \"78bd7679f063f1f17c86e7fb5cf8bf67f28a1cfb5fd0ade77b0616c7a98fa878\": container with ID starting with 78bd7679f063f1f17c86e7fb5cf8bf67f28a1cfb5fd0ade77b0616c7a98fa878 not found: ID does not exist" Mar 10 22:49:23 crc kubenswrapper[4919]: I0310 22:49:23.690192 4919 scope.go:117] "RemoveContainer" containerID="8fcf2302e4ce0db9889f5ae2974289301a54f8377ab257e4af8480e4527d05f7" Mar 10 22:49:23 crc kubenswrapper[4919]: E0310 22:49:23.690612 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8fcf2302e4ce0db9889f5ae2974289301a54f8377ab257e4af8480e4527d05f7\": container with ID starting with 8fcf2302e4ce0db9889f5ae2974289301a54f8377ab257e4af8480e4527d05f7 not found: ID does not exist" containerID="8fcf2302e4ce0db9889f5ae2974289301a54f8377ab257e4af8480e4527d05f7" Mar 10 22:49:23 crc kubenswrapper[4919]: I0310 22:49:23.690642 4919 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8fcf2302e4ce0db9889f5ae2974289301a54f8377ab257e4af8480e4527d05f7"} err="failed to get container status \"8fcf2302e4ce0db9889f5ae2974289301a54f8377ab257e4af8480e4527d05f7\": rpc error: code = NotFound desc = could not find container \"8fcf2302e4ce0db9889f5ae2974289301a54f8377ab257e4af8480e4527d05f7\": container with ID starting with 8fcf2302e4ce0db9889f5ae2974289301a54f8377ab257e4af8480e4527d05f7 not found: ID does not exist" Mar 10 22:49:25 crc kubenswrapper[4919]: I0310 22:49:25.491123 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="818a58d1-fa9a-4924-a367-fd82018ca7b9" path="/var/lib/kubelet/pods/818a58d1-fa9a-4924-a367-fd82018ca7b9/volumes" Mar 10 22:49:29 crc kubenswrapper[4919]: I0310 22:49:29.175981 4919 patch_prober.go:28] interesting pod/machine-config-daemon-z7v4t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 22:49:29 crc kubenswrapper[4919]: I0310 22:49:29.176490 4919 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 22:49:59 crc kubenswrapper[4919]: I0310 22:49:59.175757 4919 patch_prober.go:28] interesting pod/machine-config-daemon-z7v4t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 22:49:59 crc kubenswrapper[4919]: I0310 22:49:59.176688 4919 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 22:50:00 crc kubenswrapper[4919]: I0310 22:50:00.148899 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553050-mq9qr"] Mar 10 22:50:00 crc kubenswrapper[4919]: E0310 22:50:00.149708 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="818a58d1-fa9a-4924-a367-fd82018ca7b9" containerName="registry-server" Mar 10 22:50:00 crc kubenswrapper[4919]: I0310 22:50:00.149732 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="818a58d1-fa9a-4924-a367-fd82018ca7b9" containerName="registry-server" Mar 10 22:50:00 crc kubenswrapper[4919]: E0310 22:50:00.149752 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="818a58d1-fa9a-4924-a367-fd82018ca7b9" containerName="extract-utilities" Mar 10 22:50:00 crc kubenswrapper[4919]: I0310 22:50:00.149761 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="818a58d1-fa9a-4924-a367-fd82018ca7b9" containerName="extract-utilities" Mar 10 22:50:00 crc kubenswrapper[4919]: E0310 22:50:00.149779 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="818a58d1-fa9a-4924-a367-fd82018ca7b9" containerName="extract-content" Mar 10 22:50:00 crc kubenswrapper[4919]: I0310 22:50:00.149786 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="818a58d1-fa9a-4924-a367-fd82018ca7b9" containerName="extract-content" Mar 10 22:50:00 crc kubenswrapper[4919]: I0310 22:50:00.150022 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="818a58d1-fa9a-4924-a367-fd82018ca7b9" containerName="registry-server" Mar 10 22:50:00 crc kubenswrapper[4919]: I0310 22:50:00.150708 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553050-mq9qr" Mar 10 22:50:00 crc kubenswrapper[4919]: I0310 22:50:00.153459 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 22:50:00 crc kubenswrapper[4919]: I0310 22:50:00.153698 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 22:50:00 crc kubenswrapper[4919]: I0310 22:50:00.153624 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wbvtv" Mar 10 22:50:00 crc kubenswrapper[4919]: I0310 22:50:00.155070 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553050-mq9qr"] Mar 10 22:50:00 crc kubenswrapper[4919]: I0310 22:50:00.232924 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j7hr6\" (UniqueName: \"kubernetes.io/projected/eb6d7798-e079-471b-92ee-b895caeff582-kube-api-access-j7hr6\") pod \"auto-csr-approver-29553050-mq9qr\" (UID: \"eb6d7798-e079-471b-92ee-b895caeff582\") " pod="openshift-infra/auto-csr-approver-29553050-mq9qr" Mar 10 22:50:00 crc kubenswrapper[4919]: I0310 22:50:00.334703 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j7hr6\" (UniqueName: \"kubernetes.io/projected/eb6d7798-e079-471b-92ee-b895caeff582-kube-api-access-j7hr6\") pod \"auto-csr-approver-29553050-mq9qr\" (UID: \"eb6d7798-e079-471b-92ee-b895caeff582\") " pod="openshift-infra/auto-csr-approver-29553050-mq9qr" Mar 10 22:50:00 crc kubenswrapper[4919]: I0310 22:50:00.354201 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j7hr6\" (UniqueName: \"kubernetes.io/projected/eb6d7798-e079-471b-92ee-b895caeff582-kube-api-access-j7hr6\") pod \"auto-csr-approver-29553050-mq9qr\" (UID: \"eb6d7798-e079-471b-92ee-b895caeff582\") " 
pod="openshift-infra/auto-csr-approver-29553050-mq9qr" Mar 10 22:50:00 crc kubenswrapper[4919]: I0310 22:50:00.470273 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553050-mq9qr" Mar 10 22:50:01 crc kubenswrapper[4919]: I0310 22:50:01.514998 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553050-mq9qr"] Mar 10 22:50:01 crc kubenswrapper[4919]: I0310 22:50:01.909094 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553050-mq9qr" event={"ID":"eb6d7798-e079-471b-92ee-b895caeff582","Type":"ContainerStarted","Data":"2d1ac71c581851ac98682f9848b7a49b32be9cc2e02d4fbe21dba749036308b8"} Mar 10 22:50:03 crc kubenswrapper[4919]: I0310 22:50:03.925709 4919 generic.go:334] "Generic (PLEG): container finished" podID="eb6d7798-e079-471b-92ee-b895caeff582" containerID="5604fcdf3e68725135d51ad90f25ecd86367e7b42b3f0ff7c4087345c87db9d7" exitCode=0 Mar 10 22:50:03 crc kubenswrapper[4919]: I0310 22:50:03.925804 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553050-mq9qr" event={"ID":"eb6d7798-e079-471b-92ee-b895caeff582","Type":"ContainerDied","Data":"5604fcdf3e68725135d51ad90f25ecd86367e7b42b3f0ff7c4087345c87db9d7"} Mar 10 22:50:05 crc kubenswrapper[4919]: I0310 22:50:05.198957 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553050-mq9qr" Mar 10 22:50:05 crc kubenswrapper[4919]: I0310 22:50:05.304044 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j7hr6\" (UniqueName: \"kubernetes.io/projected/eb6d7798-e079-471b-92ee-b895caeff582-kube-api-access-j7hr6\") pod \"eb6d7798-e079-471b-92ee-b895caeff582\" (UID: \"eb6d7798-e079-471b-92ee-b895caeff582\") " Mar 10 22:50:05 crc kubenswrapper[4919]: I0310 22:50:05.311405 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb6d7798-e079-471b-92ee-b895caeff582-kube-api-access-j7hr6" (OuterVolumeSpecName: "kube-api-access-j7hr6") pod "eb6d7798-e079-471b-92ee-b895caeff582" (UID: "eb6d7798-e079-471b-92ee-b895caeff582"). InnerVolumeSpecName "kube-api-access-j7hr6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 22:50:05 crc kubenswrapper[4919]: I0310 22:50:05.405899 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j7hr6\" (UniqueName: \"kubernetes.io/projected/eb6d7798-e079-471b-92ee-b895caeff582-kube-api-access-j7hr6\") on node \"crc\" DevicePath \"\"" Mar 10 22:50:05 crc kubenswrapper[4919]: I0310 22:50:05.959422 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553050-mq9qr" event={"ID":"eb6d7798-e079-471b-92ee-b895caeff582","Type":"ContainerDied","Data":"2d1ac71c581851ac98682f9848b7a49b32be9cc2e02d4fbe21dba749036308b8"} Mar 10 22:50:05 crc kubenswrapper[4919]: I0310 22:50:05.959462 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553050-mq9qr" Mar 10 22:50:05 crc kubenswrapper[4919]: I0310 22:50:05.959472 4919 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2d1ac71c581851ac98682f9848b7a49b32be9cc2e02d4fbe21dba749036308b8" Mar 10 22:50:06 crc kubenswrapper[4919]: I0310 22:50:06.272139 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553044-rddsc"] Mar 10 22:50:06 crc kubenswrapper[4919]: I0310 22:50:06.276978 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553044-rddsc"] Mar 10 22:50:07 crc kubenswrapper[4919]: I0310 22:50:07.494738 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69bff7cf-85ff-450d-a7e4-5b2236343394" path="/var/lib/kubelet/pods/69bff7cf-85ff-450d-a7e4-5b2236343394/volumes" Mar 10 22:50:19 crc kubenswrapper[4919]: I0310 22:50:19.775803 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-h8hgh"] Mar 10 22:50:19 crc kubenswrapper[4919]: E0310 22:50:19.776743 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb6d7798-e079-471b-92ee-b895caeff582" containerName="oc" Mar 10 22:50:19 crc kubenswrapper[4919]: I0310 22:50:19.776760 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb6d7798-e079-471b-92ee-b895caeff582" containerName="oc" Mar 10 22:50:19 crc kubenswrapper[4919]: I0310 22:50:19.777013 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb6d7798-e079-471b-92ee-b895caeff582" containerName="oc" Mar 10 22:50:19 crc kubenswrapper[4919]: I0310 22:50:19.778354 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h8hgh"
Mar 10 22:50:19 crc kubenswrapper[4919]: I0310 22:50:19.793192 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-h8hgh"]
Mar 10 22:50:19 crc kubenswrapper[4919]: I0310 22:50:19.810480 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kh9lg\" (UniqueName: \"kubernetes.io/projected/82748538-5bfe-448f-ba19-c9b8d61fe7bf-kube-api-access-kh9lg\") pod \"redhat-marketplace-h8hgh\" (UID: \"82748538-5bfe-448f-ba19-c9b8d61fe7bf\") " pod="openshift-marketplace/redhat-marketplace-h8hgh"
Mar 10 22:50:19 crc kubenswrapper[4919]: I0310 22:50:19.810649 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/82748538-5bfe-448f-ba19-c9b8d61fe7bf-catalog-content\") pod \"redhat-marketplace-h8hgh\" (UID: \"82748538-5bfe-448f-ba19-c9b8d61fe7bf\") " pod="openshift-marketplace/redhat-marketplace-h8hgh"
Mar 10 22:50:19 crc kubenswrapper[4919]: I0310 22:50:19.810820 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/82748538-5bfe-448f-ba19-c9b8d61fe7bf-utilities\") pod \"redhat-marketplace-h8hgh\" (UID: \"82748538-5bfe-448f-ba19-c9b8d61fe7bf\") " pod="openshift-marketplace/redhat-marketplace-h8hgh"
Mar 10 22:50:19 crc kubenswrapper[4919]: I0310 22:50:19.912555 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/82748538-5bfe-448f-ba19-c9b8d61fe7bf-utilities\") pod \"redhat-marketplace-h8hgh\" (UID: \"82748538-5bfe-448f-ba19-c9b8d61fe7bf\") " pod="openshift-marketplace/redhat-marketplace-h8hgh"
Mar 10 22:50:19 crc kubenswrapper[4919]: I0310 22:50:19.912626 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kh9lg\" (UniqueName: \"kubernetes.io/projected/82748538-5bfe-448f-ba19-c9b8d61fe7bf-kube-api-access-kh9lg\") pod \"redhat-marketplace-h8hgh\" (UID: \"82748538-5bfe-448f-ba19-c9b8d61fe7bf\") " pod="openshift-marketplace/redhat-marketplace-h8hgh"
Mar 10 22:50:19 crc kubenswrapper[4919]: I0310 22:50:19.912669 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/82748538-5bfe-448f-ba19-c9b8d61fe7bf-catalog-content\") pod \"redhat-marketplace-h8hgh\" (UID: \"82748538-5bfe-448f-ba19-c9b8d61fe7bf\") " pod="openshift-marketplace/redhat-marketplace-h8hgh"
Mar 10 22:50:19 crc kubenswrapper[4919]: I0310 22:50:19.913083 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/82748538-5bfe-448f-ba19-c9b8d61fe7bf-catalog-content\") pod \"redhat-marketplace-h8hgh\" (UID: \"82748538-5bfe-448f-ba19-c9b8d61fe7bf\") " pod="openshift-marketplace/redhat-marketplace-h8hgh"
Mar 10 22:50:19 crc kubenswrapper[4919]: I0310 22:50:19.913477 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/82748538-5bfe-448f-ba19-c9b8d61fe7bf-utilities\") pod \"redhat-marketplace-h8hgh\" (UID: \"82748538-5bfe-448f-ba19-c9b8d61fe7bf\") " pod="openshift-marketplace/redhat-marketplace-h8hgh"
Mar 10 22:50:19 crc kubenswrapper[4919]: I0310 22:50:19.933285 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kh9lg\" (UniqueName: \"kubernetes.io/projected/82748538-5bfe-448f-ba19-c9b8d61fe7bf-kube-api-access-kh9lg\") pod \"redhat-marketplace-h8hgh\" (UID: \"82748538-5bfe-448f-ba19-c9b8d61fe7bf\") " pod="openshift-marketplace/redhat-marketplace-h8hgh"
Mar 10 22:50:20 crc kubenswrapper[4919]: I0310 22:50:20.103774 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h8hgh"
Mar 10 22:50:20 crc kubenswrapper[4919]: I0310 22:50:20.596236 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-h8hgh"]
Mar 10 22:50:21 crc kubenswrapper[4919]: I0310 22:50:21.082264 4919 generic.go:334] "Generic (PLEG): container finished" podID="82748538-5bfe-448f-ba19-c9b8d61fe7bf" containerID="77f6db02adae7da1163c09eee1e5e2bffc8d17c48bc728ad0c779fd27cf98d01" exitCode=0
Mar 10 22:50:21 crc kubenswrapper[4919]: I0310 22:50:21.082309 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h8hgh" event={"ID":"82748538-5bfe-448f-ba19-c9b8d61fe7bf","Type":"ContainerDied","Data":"77f6db02adae7da1163c09eee1e5e2bffc8d17c48bc728ad0c779fd27cf98d01"}
Mar 10 22:50:21 crc kubenswrapper[4919]: I0310 22:50:21.082606 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h8hgh" event={"ID":"82748538-5bfe-448f-ba19-c9b8d61fe7bf","Type":"ContainerStarted","Data":"d54f13ec900b5620f6b7c70f960f53bb2f6356846ba88c790cfb1b047e47cebb"}
Mar 10 22:50:22 crc kubenswrapper[4919]: I0310 22:50:22.092492 4919 generic.go:334] "Generic (PLEG): container finished" podID="82748538-5bfe-448f-ba19-c9b8d61fe7bf" containerID="35993f1adaed83dc4461c2332b113f5df8df76025a4b3c72ed52f6871b9b5b36" exitCode=0
Mar 10 22:50:22 crc kubenswrapper[4919]: I0310 22:50:22.092549 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h8hgh" event={"ID":"82748538-5bfe-448f-ba19-c9b8d61fe7bf","Type":"ContainerDied","Data":"35993f1adaed83dc4461c2332b113f5df8df76025a4b3c72ed52f6871b9b5b36"}
Mar 10 22:50:23 crc kubenswrapper[4919]: I0310 22:50:23.107660 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h8hgh" event={"ID":"82748538-5bfe-448f-ba19-c9b8d61fe7bf","Type":"ContainerStarted","Data":"c075aa4dbdd2556625e3ae3450e119becd3f6543914652ecd588824ad9abbcde"}
Mar 10 22:50:23 crc kubenswrapper[4919]: I0310 22:50:23.139126 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-h8hgh" podStartSLOduration=2.722563878 podStartE2EDuration="4.139100437s" podCreationTimestamp="2026-03-10 22:50:19 +0000 UTC" firstStartedPulling="2026-03-10 22:50:21.084008787 +0000 UTC m=+3608.325889395" lastFinishedPulling="2026-03-10 22:50:22.500545316 +0000 UTC m=+3609.742425954" observedRunningTime="2026-03-10 22:50:23.131010185 +0000 UTC m=+3610.372890813" watchObservedRunningTime="2026-03-10 22:50:23.139100437 +0000 UTC m=+3610.380981085"
Mar 10 22:50:25 crc kubenswrapper[4919]: I0310 22:50:25.956016 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-xml7f"]
Mar 10 22:50:25 crc kubenswrapper[4919]: I0310 22:50:25.958982 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xml7f"
Mar 10 22:50:25 crc kubenswrapper[4919]: I0310 22:50:25.975933 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xml7f"]
Mar 10 22:50:26 crc kubenswrapper[4919]: I0310 22:50:26.090781 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1c0a2f43-f824-41dd-a701-71426bb01d90-utilities\") pod \"community-operators-xml7f\" (UID: \"1c0a2f43-f824-41dd-a701-71426bb01d90\") " pod="openshift-marketplace/community-operators-xml7f"
Mar 10 22:50:26 crc kubenswrapper[4919]: I0310 22:50:26.090846 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5p777\" (UniqueName: \"kubernetes.io/projected/1c0a2f43-f824-41dd-a701-71426bb01d90-kube-api-access-5p777\") pod \"community-operators-xml7f\" (UID: \"1c0a2f43-f824-41dd-a701-71426bb01d90\") " pod="openshift-marketplace/community-operators-xml7f"
Mar 10 22:50:26 crc kubenswrapper[4919]: I0310 22:50:26.090891 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1c0a2f43-f824-41dd-a701-71426bb01d90-catalog-content\") pod \"community-operators-xml7f\" (UID: \"1c0a2f43-f824-41dd-a701-71426bb01d90\") " pod="openshift-marketplace/community-operators-xml7f"
Mar 10 22:50:26 crc kubenswrapper[4919]: I0310 22:50:26.193288 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1c0a2f43-f824-41dd-a701-71426bb01d90-utilities\") pod \"community-operators-xml7f\" (UID: \"1c0a2f43-f824-41dd-a701-71426bb01d90\") " pod="openshift-marketplace/community-operators-xml7f"
Mar 10 22:50:26 crc kubenswrapper[4919]: I0310 22:50:26.193366 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5p777\" (UniqueName: \"kubernetes.io/projected/1c0a2f43-f824-41dd-a701-71426bb01d90-kube-api-access-5p777\") pod \"community-operators-xml7f\" (UID: \"1c0a2f43-f824-41dd-a701-71426bb01d90\") " pod="openshift-marketplace/community-operators-xml7f"
Mar 10 22:50:26 crc kubenswrapper[4919]: I0310 22:50:26.193433 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1c0a2f43-f824-41dd-a701-71426bb01d90-catalog-content\") pod \"community-operators-xml7f\" (UID: \"1c0a2f43-f824-41dd-a701-71426bb01d90\") " pod="openshift-marketplace/community-operators-xml7f"
Mar 10 22:50:26 crc kubenswrapper[4919]: I0310 22:50:26.193791 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1c0a2f43-f824-41dd-a701-71426bb01d90-utilities\") pod \"community-operators-xml7f\" (UID: \"1c0a2f43-f824-41dd-a701-71426bb01d90\") " pod="openshift-marketplace/community-operators-xml7f"
Mar 10 22:50:26 crc kubenswrapper[4919]: I0310 22:50:26.193901 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1c0a2f43-f824-41dd-a701-71426bb01d90-catalog-content\") pod \"community-operators-xml7f\" (UID: \"1c0a2f43-f824-41dd-a701-71426bb01d90\") " pod="openshift-marketplace/community-operators-xml7f"
Mar 10 22:50:26 crc kubenswrapper[4919]: I0310 22:50:26.218695 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5p777\" (UniqueName: \"kubernetes.io/projected/1c0a2f43-f824-41dd-a701-71426bb01d90-kube-api-access-5p777\") pod \"community-operators-xml7f\" (UID: \"1c0a2f43-f824-41dd-a701-71426bb01d90\") " pod="openshift-marketplace/community-operators-xml7f"
Mar 10 22:50:26 crc kubenswrapper[4919]: I0310 22:50:26.302458 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xml7f"
Mar 10 22:50:26 crc kubenswrapper[4919]: I0310 22:50:26.733232 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xml7f"]
Mar 10 22:50:26 crc kubenswrapper[4919]: W0310 22:50:26.735697 4919 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1c0a2f43_f824_41dd_a701_71426bb01d90.slice/crio-b676745e46b648beb58ee91d2cf3927cb37d2532040058da9dd67a33ad82e2d5 WatchSource:0}: Error finding container b676745e46b648beb58ee91d2cf3927cb37d2532040058da9dd67a33ad82e2d5: Status 404 returned error can't find the container with id b676745e46b648beb58ee91d2cf3927cb37d2532040058da9dd67a33ad82e2d5
Mar 10 22:50:27 crc kubenswrapper[4919]: I0310 22:50:27.141581 4919 generic.go:334] "Generic (PLEG): container finished" podID="1c0a2f43-f824-41dd-a701-71426bb01d90" containerID="df87257e4dfe48ce1ff7064f87cf7e4f58420f79eaec78d9b23c30b8c67a2e2d" exitCode=0
Mar 10 22:50:27 crc kubenswrapper[4919]: I0310 22:50:27.141622 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xml7f" event={"ID":"1c0a2f43-f824-41dd-a701-71426bb01d90","Type":"ContainerDied","Data":"df87257e4dfe48ce1ff7064f87cf7e4f58420f79eaec78d9b23c30b8c67a2e2d"}
Mar 10 22:50:27 crc kubenswrapper[4919]: I0310 22:50:27.141645 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xml7f" event={"ID":"1c0a2f43-f824-41dd-a701-71426bb01d90","Type":"ContainerStarted","Data":"b676745e46b648beb58ee91d2cf3927cb37d2532040058da9dd67a33ad82e2d5"}
Mar 10 22:50:28 crc kubenswrapper[4919]: I0310 22:50:28.150132 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xml7f" event={"ID":"1c0a2f43-f824-41dd-a701-71426bb01d90","Type":"ContainerStarted","Data":"0e70790c76e904502434b7c488e0ac80e11ca7b0936ad8641bbe89dfb8636667"}
Mar 10 22:50:29 crc kubenswrapper[4919]: I0310 22:50:29.160891 4919 generic.go:334] "Generic (PLEG): container finished" podID="1c0a2f43-f824-41dd-a701-71426bb01d90" containerID="0e70790c76e904502434b7c488e0ac80e11ca7b0936ad8641bbe89dfb8636667" exitCode=0
Mar 10 22:50:29 crc kubenswrapper[4919]: I0310 22:50:29.160935 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xml7f" event={"ID":"1c0a2f43-f824-41dd-a701-71426bb01d90","Type":"ContainerDied","Data":"0e70790c76e904502434b7c488e0ac80e11ca7b0936ad8641bbe89dfb8636667"}
Mar 10 22:50:29 crc kubenswrapper[4919]: I0310 22:50:29.175663 4919 patch_prober.go:28] interesting pod/machine-config-daemon-z7v4t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 10 22:50:29 crc kubenswrapper[4919]: I0310 22:50:29.175717 4919 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 10 22:50:29 crc kubenswrapper[4919]: I0310 22:50:29.175762 4919 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t"
Mar 10 22:50:29 crc kubenswrapper[4919]: I0310 22:50:29.176448 4919 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"04b44634b44ad034f1ca0fde2bacc28827ffaa56935af87501bd193efff921b2"} pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 10 22:50:29 crc kubenswrapper[4919]: I0310 22:50:29.176511 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" containerName="machine-config-daemon" containerID="cri-o://04b44634b44ad034f1ca0fde2bacc28827ffaa56935af87501bd193efff921b2" gracePeriod=600
Mar 10 22:50:29 crc kubenswrapper[4919]: E0310 22:50:29.298571 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc"
Mar 10 22:50:30 crc kubenswrapper[4919]: I0310 22:50:30.104018 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-h8hgh"
Mar 10 22:50:30 crc kubenswrapper[4919]: I0310 22:50:30.104274 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-h8hgh"
Mar 10 22:50:30 crc kubenswrapper[4919]: I0310 22:50:30.146296 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-h8hgh"
Mar 10 22:50:30 crc kubenswrapper[4919]: I0310 22:50:30.181817 4919 generic.go:334] "Generic (PLEG): container finished" podID="566678d1-f416-4116-ab20-b30dceb86cdc" containerID="04b44634b44ad034f1ca0fde2bacc28827ffaa56935af87501bd193efff921b2" exitCode=0
Mar 10 22:50:30 crc kubenswrapper[4919]: I0310 22:50:30.181904 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" event={"ID":"566678d1-f416-4116-ab20-b30dceb86cdc","Type":"ContainerDied","Data":"04b44634b44ad034f1ca0fde2bacc28827ffaa56935af87501bd193efff921b2"}
Mar 10 22:50:30 crc kubenswrapper[4919]: I0310 22:50:30.181960 4919 scope.go:117] "RemoveContainer" containerID="372471d7ebc7335110031f4e58477a54f513fd65aed93d36e38c44ab11d01d23"
Mar 10 22:50:30 crc kubenswrapper[4919]: I0310 22:50:30.182599 4919 scope.go:117] "RemoveContainer" containerID="04b44634b44ad034f1ca0fde2bacc28827ffaa56935af87501bd193efff921b2"
Mar 10 22:50:30 crc kubenswrapper[4919]: E0310 22:50:30.182935 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc"
Mar 10 22:50:30 crc kubenswrapper[4919]: I0310 22:50:30.185363 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xml7f" event={"ID":"1c0a2f43-f824-41dd-a701-71426bb01d90","Type":"ContainerStarted","Data":"59a115be5e81dc1c7e8dc7770b64bfdc2e1bdc8e04fc2cbd4ed801dca742e4ea"}
Mar 10 22:50:30 crc kubenswrapper[4919]: I0310 22:50:30.237889 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-xml7f" podStartSLOduration=2.710749511 podStartE2EDuration="5.237868706s" podCreationTimestamp="2026-03-10 22:50:25 +0000 UTC" firstStartedPulling="2026-03-10 22:50:27.142805047 +0000 UTC m=+3614.384685655" lastFinishedPulling="2026-03-10 22:50:29.669924242 +0000 UTC m=+3616.911804850" observedRunningTime="2026-03-10 22:50:30.234048021 +0000 UTC m=+3617.475928629" watchObservedRunningTime="2026-03-10 22:50:30.237868706 +0000 UTC m=+3617.479749314"
Mar 10 22:50:30 crc kubenswrapper[4919]: I0310 22:50:30.247811 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-h8hgh"
Mar 10 22:50:31 crc kubenswrapper[4919]: I0310 22:50:31.937109 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-h8hgh"]
Mar 10 22:50:32 crc kubenswrapper[4919]: I0310 22:50:32.201116 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-h8hgh" podUID="82748538-5bfe-448f-ba19-c9b8d61fe7bf" containerName="registry-server" containerID="cri-o://c075aa4dbdd2556625e3ae3450e119becd3f6543914652ecd588824ad9abbcde" gracePeriod=2
Mar 10 22:50:32 crc kubenswrapper[4919]: I0310 22:50:32.602173 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h8hgh"
Mar 10 22:50:32 crc kubenswrapper[4919]: I0310 22:50:32.781963 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/82748538-5bfe-448f-ba19-c9b8d61fe7bf-utilities\") pod \"82748538-5bfe-448f-ba19-c9b8d61fe7bf\" (UID: \"82748538-5bfe-448f-ba19-c9b8d61fe7bf\") "
Mar 10 22:50:32 crc kubenswrapper[4919]: I0310 22:50:32.782193 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/82748538-5bfe-448f-ba19-c9b8d61fe7bf-catalog-content\") pod \"82748538-5bfe-448f-ba19-c9b8d61fe7bf\" (UID: \"82748538-5bfe-448f-ba19-c9b8d61fe7bf\") "
Mar 10 22:50:32 crc kubenswrapper[4919]: I0310 22:50:32.782280 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kh9lg\" (UniqueName: \"kubernetes.io/projected/82748538-5bfe-448f-ba19-c9b8d61fe7bf-kube-api-access-kh9lg\") pod \"82748538-5bfe-448f-ba19-c9b8d61fe7bf\" (UID: \"82748538-5bfe-448f-ba19-c9b8d61fe7bf\") "
Mar 10 22:50:32 crc kubenswrapper[4919]: I0310 22:50:32.783213 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/82748538-5bfe-448f-ba19-c9b8d61fe7bf-utilities" (OuterVolumeSpecName: "utilities") pod "82748538-5bfe-448f-ba19-c9b8d61fe7bf" (UID: "82748538-5bfe-448f-ba19-c9b8d61fe7bf"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 22:50:32 crc kubenswrapper[4919]: I0310 22:50:32.787453 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82748538-5bfe-448f-ba19-c9b8d61fe7bf-kube-api-access-kh9lg" (OuterVolumeSpecName: "kube-api-access-kh9lg") pod "82748538-5bfe-448f-ba19-c9b8d61fe7bf" (UID: "82748538-5bfe-448f-ba19-c9b8d61fe7bf"). InnerVolumeSpecName "kube-api-access-kh9lg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 22:50:32 crc kubenswrapper[4919]: I0310 22:50:32.810621 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/82748538-5bfe-448f-ba19-c9b8d61fe7bf-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "82748538-5bfe-448f-ba19-c9b8d61fe7bf" (UID: "82748538-5bfe-448f-ba19-c9b8d61fe7bf"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 22:50:32 crc kubenswrapper[4919]: I0310 22:50:32.883688 4919 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/82748538-5bfe-448f-ba19-c9b8d61fe7bf-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 10 22:50:32 crc kubenswrapper[4919]: I0310 22:50:32.883726 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kh9lg\" (UniqueName: \"kubernetes.io/projected/82748538-5bfe-448f-ba19-c9b8d61fe7bf-kube-api-access-kh9lg\") on node \"crc\" DevicePath \"\""
Mar 10 22:50:32 crc kubenswrapper[4919]: I0310 22:50:32.883740 4919 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/82748538-5bfe-448f-ba19-c9b8d61fe7bf-utilities\") on node \"crc\" DevicePath \"\""
Mar 10 22:50:33 crc kubenswrapper[4919]: I0310 22:50:33.212982 4919 generic.go:334] "Generic (PLEG): container finished" podID="82748538-5bfe-448f-ba19-c9b8d61fe7bf" containerID="c075aa4dbdd2556625e3ae3450e119becd3f6543914652ecd588824ad9abbcde" exitCode=0
Mar 10 22:50:33 crc kubenswrapper[4919]: I0310 22:50:33.213029 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h8hgh" event={"ID":"82748538-5bfe-448f-ba19-c9b8d61fe7bf","Type":"ContainerDied","Data":"c075aa4dbdd2556625e3ae3450e119becd3f6543914652ecd588824ad9abbcde"}
Mar 10 22:50:33 crc kubenswrapper[4919]: I0310 22:50:33.213064 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h8hgh" event={"ID":"82748538-5bfe-448f-ba19-c9b8d61fe7bf","Type":"ContainerDied","Data":"d54f13ec900b5620f6b7c70f960f53bb2f6356846ba88c790cfb1b047e47cebb"}
Mar 10 22:50:33 crc kubenswrapper[4919]: I0310 22:50:33.213076 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h8hgh"
Mar 10 22:50:33 crc kubenswrapper[4919]: I0310 22:50:33.213082 4919 scope.go:117] "RemoveContainer" containerID="c075aa4dbdd2556625e3ae3450e119becd3f6543914652ecd588824ad9abbcde"
Mar 10 22:50:33 crc kubenswrapper[4919]: I0310 22:50:33.231590 4919 scope.go:117] "RemoveContainer" containerID="35993f1adaed83dc4461c2332b113f5df8df76025a4b3c72ed52f6871b9b5b36"
Mar 10 22:50:33 crc kubenswrapper[4919]: I0310 22:50:33.261470 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-h8hgh"]
Mar 10 22:50:33 crc kubenswrapper[4919]: I0310 22:50:33.266686 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-h8hgh"]
Mar 10 22:50:33 crc kubenswrapper[4919]: I0310 22:50:33.276384 4919 scope.go:117] "RemoveContainer" containerID="77f6db02adae7da1163c09eee1e5e2bffc8d17c48bc728ad0c779fd27cf98d01"
Mar 10 22:50:33 crc kubenswrapper[4919]: I0310 22:50:33.294133 4919 scope.go:117] "RemoveContainer" containerID="c075aa4dbdd2556625e3ae3450e119becd3f6543914652ecd588824ad9abbcde"
Mar 10 22:50:33 crc kubenswrapper[4919]: E0310 22:50:33.294662 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c075aa4dbdd2556625e3ae3450e119becd3f6543914652ecd588824ad9abbcde\": container with ID starting with c075aa4dbdd2556625e3ae3450e119becd3f6543914652ecd588824ad9abbcde not found: ID does not exist" containerID="c075aa4dbdd2556625e3ae3450e119becd3f6543914652ecd588824ad9abbcde"
Mar 10 22:50:33 crc kubenswrapper[4919]: I0310 22:50:33.294718 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c075aa4dbdd2556625e3ae3450e119becd3f6543914652ecd588824ad9abbcde"} err="failed to get container status \"c075aa4dbdd2556625e3ae3450e119becd3f6543914652ecd588824ad9abbcde\": rpc error: code = NotFound desc = could not find container \"c075aa4dbdd2556625e3ae3450e119becd3f6543914652ecd588824ad9abbcde\": container with ID starting with c075aa4dbdd2556625e3ae3450e119becd3f6543914652ecd588824ad9abbcde not found: ID does not exist"
Mar 10 22:50:33 crc kubenswrapper[4919]: I0310 22:50:33.294752 4919 scope.go:117] "RemoveContainer" containerID="35993f1adaed83dc4461c2332b113f5df8df76025a4b3c72ed52f6871b9b5b36"
Mar 10 22:50:33 crc kubenswrapper[4919]: E0310 22:50:33.295136 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"35993f1adaed83dc4461c2332b113f5df8df76025a4b3c72ed52f6871b9b5b36\": container with ID starting with 35993f1adaed83dc4461c2332b113f5df8df76025a4b3c72ed52f6871b9b5b36 not found: ID does not exist" containerID="35993f1adaed83dc4461c2332b113f5df8df76025a4b3c72ed52f6871b9b5b36"
Mar 10 22:50:33 crc kubenswrapper[4919]: I0310 22:50:33.295173 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35993f1adaed83dc4461c2332b113f5df8df76025a4b3c72ed52f6871b9b5b36"} err="failed to get container status \"35993f1adaed83dc4461c2332b113f5df8df76025a4b3c72ed52f6871b9b5b36\": rpc error: code = NotFound desc = could not find container \"35993f1adaed83dc4461c2332b113f5df8df76025a4b3c72ed52f6871b9b5b36\": container with ID starting with 35993f1adaed83dc4461c2332b113f5df8df76025a4b3c72ed52f6871b9b5b36 not found: ID does not exist"
Mar 10 22:50:33 crc kubenswrapper[4919]: I0310 22:50:33.295193 4919 scope.go:117] "RemoveContainer" containerID="77f6db02adae7da1163c09eee1e5e2bffc8d17c48bc728ad0c779fd27cf98d01"
Mar 10 22:50:33 crc kubenswrapper[4919]: E0310 22:50:33.295580 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"77f6db02adae7da1163c09eee1e5e2bffc8d17c48bc728ad0c779fd27cf98d01\": container with ID starting with 77f6db02adae7da1163c09eee1e5e2bffc8d17c48bc728ad0c779fd27cf98d01 not found: ID does not exist" containerID="77f6db02adae7da1163c09eee1e5e2bffc8d17c48bc728ad0c779fd27cf98d01"
Mar 10 22:50:33 crc kubenswrapper[4919]: I0310 22:50:33.295606 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77f6db02adae7da1163c09eee1e5e2bffc8d17c48bc728ad0c779fd27cf98d01"} err="failed to get container status \"77f6db02adae7da1163c09eee1e5e2bffc8d17c48bc728ad0c779fd27cf98d01\": rpc error: code = NotFound desc = could not find container \"77f6db02adae7da1163c09eee1e5e2bffc8d17c48bc728ad0c779fd27cf98d01\": container with ID starting with 77f6db02adae7da1163c09eee1e5e2bffc8d17c48bc728ad0c779fd27cf98d01 not found: ID does not exist"
Mar 10 22:50:33 crc kubenswrapper[4919]: I0310 22:50:33.493591 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="82748538-5bfe-448f-ba19-c9b8d61fe7bf" path="/var/lib/kubelet/pods/82748538-5bfe-448f-ba19-c9b8d61fe7bf/volumes"
Mar 10 22:50:36 crc kubenswrapper[4919]: I0310 22:50:36.302867 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-xml7f"
Mar 10 22:50:36 crc kubenswrapper[4919]: I0310 22:50:36.303169 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-xml7f"
Mar 10 22:50:36 crc kubenswrapper[4919]: I0310 22:50:36.342775 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-xml7f"
Mar 10 22:50:37 crc kubenswrapper[4919]: I0310 22:50:37.300488 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-xml7f"
Mar 10 22:50:39 crc kubenswrapper[4919]: I0310 22:50:39.546077 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xml7f"]
Mar 10 22:50:39 crc kubenswrapper[4919]: I0310 22:50:39.546499 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-xml7f" podUID="1c0a2f43-f824-41dd-a701-71426bb01d90" containerName="registry-server" containerID="cri-o://59a115be5e81dc1c7e8dc7770b64bfdc2e1bdc8e04fc2cbd4ed801dca742e4ea" gracePeriod=2
Mar 10 22:50:39 crc kubenswrapper[4919]: I0310 22:50:39.939967 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xml7f"
Mar 10 22:50:40 crc kubenswrapper[4919]: I0310 22:50:40.084963 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1c0a2f43-f824-41dd-a701-71426bb01d90-catalog-content\") pod \"1c0a2f43-f824-41dd-a701-71426bb01d90\" (UID: \"1c0a2f43-f824-41dd-a701-71426bb01d90\") "
Mar 10 22:50:40 crc kubenswrapper[4919]: I0310 22:50:40.085084 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5p777\" (UniqueName: \"kubernetes.io/projected/1c0a2f43-f824-41dd-a701-71426bb01d90-kube-api-access-5p777\") pod \"1c0a2f43-f824-41dd-a701-71426bb01d90\" (UID: \"1c0a2f43-f824-41dd-a701-71426bb01d90\") "
Mar 10 22:50:40 crc kubenswrapper[4919]: I0310 22:50:40.085118 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1c0a2f43-f824-41dd-a701-71426bb01d90-utilities\") pod \"1c0a2f43-f824-41dd-a701-71426bb01d90\" (UID: \"1c0a2f43-f824-41dd-a701-71426bb01d90\") "
Mar 10 22:50:40 crc kubenswrapper[4919]: I0310 22:50:40.086298 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1c0a2f43-f824-41dd-a701-71426bb01d90-utilities" (OuterVolumeSpecName: "utilities") pod "1c0a2f43-f824-41dd-a701-71426bb01d90" (UID: "1c0a2f43-f824-41dd-a701-71426bb01d90"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 22:50:40 crc kubenswrapper[4919]: I0310 22:50:40.096756 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c0a2f43-f824-41dd-a701-71426bb01d90-kube-api-access-5p777" (OuterVolumeSpecName: "kube-api-access-5p777") pod "1c0a2f43-f824-41dd-a701-71426bb01d90" (UID: "1c0a2f43-f824-41dd-a701-71426bb01d90"). InnerVolumeSpecName "kube-api-access-5p777". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 22:50:40 crc kubenswrapper[4919]: I0310 22:50:40.150305 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1c0a2f43-f824-41dd-a701-71426bb01d90-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1c0a2f43-f824-41dd-a701-71426bb01d90" (UID: "1c0a2f43-f824-41dd-a701-71426bb01d90"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 22:50:40 crc kubenswrapper[4919]: I0310 22:50:40.186751 4919 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1c0a2f43-f824-41dd-a701-71426bb01d90-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 10 22:50:40 crc kubenswrapper[4919]: I0310 22:50:40.186809 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5p777\" (UniqueName: \"kubernetes.io/projected/1c0a2f43-f824-41dd-a701-71426bb01d90-kube-api-access-5p777\") on node \"crc\" DevicePath \"\""
Mar 10 22:50:40 crc kubenswrapper[4919]: I0310 22:50:40.186830 4919 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1c0a2f43-f824-41dd-a701-71426bb01d90-utilities\") on node \"crc\" DevicePath \"\""
Mar 10 22:50:40 crc kubenswrapper[4919]: I0310 22:50:40.269914 4919 generic.go:334] "Generic (PLEG): container finished" podID="1c0a2f43-f824-41dd-a701-71426bb01d90" containerID="59a115be5e81dc1c7e8dc7770b64bfdc2e1bdc8e04fc2cbd4ed801dca742e4ea" exitCode=0
Mar 10 22:50:40 crc kubenswrapper[4919]: I0310 22:50:40.269984 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xml7f" event={"ID":"1c0a2f43-f824-41dd-a701-71426bb01d90","Type":"ContainerDied","Data":"59a115be5e81dc1c7e8dc7770b64bfdc2e1bdc8e04fc2cbd4ed801dca742e4ea"}
Mar 10 22:50:40 crc kubenswrapper[4919]: I0310 22:50:40.270245 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xml7f" event={"ID":"1c0a2f43-f824-41dd-a701-71426bb01d90","Type":"ContainerDied","Data":"b676745e46b648beb58ee91d2cf3927cb37d2532040058da9dd67a33ad82e2d5"}
Mar 10 22:50:40 crc kubenswrapper[4919]: I0310 22:50:40.270005 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xml7f"
Mar 10 22:50:40 crc kubenswrapper[4919]: I0310 22:50:40.270323 4919 scope.go:117] "RemoveContainer" containerID="59a115be5e81dc1c7e8dc7770b64bfdc2e1bdc8e04fc2cbd4ed801dca742e4ea"
Mar 10 22:50:40 crc kubenswrapper[4919]: I0310 22:50:40.307924 4919 scope.go:117] "RemoveContainer" containerID="0e70790c76e904502434b7c488e0ac80e11ca7b0936ad8641bbe89dfb8636667"
Mar 10 22:50:40 crc kubenswrapper[4919]: I0310 22:50:40.308945 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xml7f"]
Mar 10 22:50:40 crc kubenswrapper[4919]: I0310 22:50:40.316541 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-xml7f"]
Mar 10 22:50:40 crc kubenswrapper[4919]: I0310 22:50:40.325480 4919 scope.go:117] "RemoveContainer" containerID="df87257e4dfe48ce1ff7064f87cf7e4f58420f79eaec78d9b23c30b8c67a2e2d"
Mar 10 22:50:40 crc kubenswrapper[4919]: I0310 22:50:40.354090 4919 scope.go:117] "RemoveContainer" containerID="59a115be5e81dc1c7e8dc7770b64bfdc2e1bdc8e04fc2cbd4ed801dca742e4ea"
Mar 10 22:50:40 crc kubenswrapper[4919]: E0310 22:50:40.355056 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59a115be5e81dc1c7e8dc7770b64bfdc2e1bdc8e04fc2cbd4ed801dca742e4ea\": container with ID starting with 59a115be5e81dc1c7e8dc7770b64bfdc2e1bdc8e04fc2cbd4ed801dca742e4ea not found: ID does not exist" containerID="59a115be5e81dc1c7e8dc7770b64bfdc2e1bdc8e04fc2cbd4ed801dca742e4ea"
Mar 10 22:50:40 crc kubenswrapper[4919]: I0310 22:50:40.355104 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59a115be5e81dc1c7e8dc7770b64bfdc2e1bdc8e04fc2cbd4ed801dca742e4ea"} err="failed to get container status \"59a115be5e81dc1c7e8dc7770b64bfdc2e1bdc8e04fc2cbd4ed801dca742e4ea\": rpc error: code = NotFound desc = could not find container \"59a115be5e81dc1c7e8dc7770b64bfdc2e1bdc8e04fc2cbd4ed801dca742e4ea\": container with ID starting with 59a115be5e81dc1c7e8dc7770b64bfdc2e1bdc8e04fc2cbd4ed801dca742e4ea not found: ID does not exist"
Mar 10 22:50:40 crc kubenswrapper[4919]: I0310 22:50:40.355134 4919 scope.go:117] "RemoveContainer" containerID="0e70790c76e904502434b7c488e0ac80e11ca7b0936ad8641bbe89dfb8636667"
Mar 10 22:50:40 crc kubenswrapper[4919]: E0310 22:50:40.355567 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e70790c76e904502434b7c488e0ac80e11ca7b0936ad8641bbe89dfb8636667\": container with ID starting with 0e70790c76e904502434b7c488e0ac80e11ca7b0936ad8641bbe89dfb8636667 not found: ID does not exist" containerID="0e70790c76e904502434b7c488e0ac80e11ca7b0936ad8641bbe89dfb8636667"
Mar 10 22:50:40 crc kubenswrapper[4919]: I0310 22:50:40.355608 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e70790c76e904502434b7c488e0ac80e11ca7b0936ad8641bbe89dfb8636667"} err="failed to get container status \"0e70790c76e904502434b7c488e0ac80e11ca7b0936ad8641bbe89dfb8636667\": rpc error: code = NotFound desc = could not find container \"0e70790c76e904502434b7c488e0ac80e11ca7b0936ad8641bbe89dfb8636667\": container with ID starting with 0e70790c76e904502434b7c488e0ac80e11ca7b0936ad8641bbe89dfb8636667 not found: ID does not exist"
Mar 10 22:50:40 crc kubenswrapper[4919]: I0310 22:50:40.355634 4919 scope.go:117] "RemoveContainer" containerID="df87257e4dfe48ce1ff7064f87cf7e4f58420f79eaec78d9b23c30b8c67a2e2d"
Mar 10 22:50:40 crc kubenswrapper[4919]: E0310 22:50:40.356034 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df87257e4dfe48ce1ff7064f87cf7e4f58420f79eaec78d9b23c30b8c67a2e2d\": container with ID starting with df87257e4dfe48ce1ff7064f87cf7e4f58420f79eaec78d9b23c30b8c67a2e2d not found: ID does not exist" containerID="df87257e4dfe48ce1ff7064f87cf7e4f58420f79eaec78d9b23c30b8c67a2e2d"
Mar 10 22:50:40 crc kubenswrapper[4919]: I0310 22:50:40.356055 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df87257e4dfe48ce1ff7064f87cf7e4f58420f79eaec78d9b23c30b8c67a2e2d"} err="failed to get container status \"df87257e4dfe48ce1ff7064f87cf7e4f58420f79eaec78d9b23c30b8c67a2e2d\": rpc error: code = NotFound desc = could not find container \"df87257e4dfe48ce1ff7064f87cf7e4f58420f79eaec78d9b23c30b8c67a2e2d\": container with ID starting with df87257e4dfe48ce1ff7064f87cf7e4f58420f79eaec78d9b23c30b8c67a2e2d not found: ID does not exist"
Mar 10 22:50:41 crc kubenswrapper[4919]: I0310 22:50:41.489564 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c0a2f43-f824-41dd-a701-71426bb01d90" path="/var/lib/kubelet/pods/1c0a2f43-f824-41dd-a701-71426bb01d90/volumes"
Mar 10 22:50:42 crc kubenswrapper[4919]: I0310 22:50:42.480467 4919 scope.go:117] "RemoveContainer" containerID="04b44634b44ad034f1ca0fde2bacc28827ffaa56935af87501bd193efff921b2"
Mar 10
22:50:42 crc kubenswrapper[4919]: E0310 22:50:42.481158 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" Mar 10 22:50:45 crc kubenswrapper[4919]: I0310 22:50:45.079121 4919 scope.go:117] "RemoveContainer" containerID="24b071d9c6afe96c7c4104a8e4eb680b62e1194e450f831421ac6d143cc7dca4" Mar 10 22:50:54 crc kubenswrapper[4919]: I0310 22:50:54.480123 4919 scope.go:117] "RemoveContainer" containerID="04b44634b44ad034f1ca0fde2bacc28827ffaa56935af87501bd193efff921b2" Mar 10 22:50:54 crc kubenswrapper[4919]: E0310 22:50:54.481038 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" Mar 10 22:51:08 crc kubenswrapper[4919]: I0310 22:51:08.480216 4919 scope.go:117] "RemoveContainer" containerID="04b44634b44ad034f1ca0fde2bacc28827ffaa56935af87501bd193efff921b2" Mar 10 22:51:08 crc kubenswrapper[4919]: E0310 22:51:08.481429 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" 
podUID="566678d1-f416-4116-ab20-b30dceb86cdc" Mar 10 22:51:19 crc kubenswrapper[4919]: I0310 22:51:19.479851 4919 scope.go:117] "RemoveContainer" containerID="04b44634b44ad034f1ca0fde2bacc28827ffaa56935af87501bd193efff921b2" Mar 10 22:51:19 crc kubenswrapper[4919]: E0310 22:51:19.480611 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" Mar 10 22:51:34 crc kubenswrapper[4919]: I0310 22:51:34.483165 4919 scope.go:117] "RemoveContainer" containerID="04b44634b44ad034f1ca0fde2bacc28827ffaa56935af87501bd193efff921b2" Mar 10 22:51:34 crc kubenswrapper[4919]: E0310 22:51:34.484878 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" Mar 10 22:51:47 crc kubenswrapper[4919]: I0310 22:51:47.479610 4919 scope.go:117] "RemoveContainer" containerID="04b44634b44ad034f1ca0fde2bacc28827ffaa56935af87501bd193efff921b2" Mar 10 22:51:47 crc kubenswrapper[4919]: E0310 22:51:47.480369 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" Mar 10 22:52:00 crc kubenswrapper[4919]: I0310 22:52:00.164546 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553052-flhp6"] Mar 10 22:52:00 crc kubenswrapper[4919]: E0310 22:52:00.165854 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c0a2f43-f824-41dd-a701-71426bb01d90" containerName="extract-utilities" Mar 10 22:52:00 crc kubenswrapper[4919]: I0310 22:52:00.165885 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c0a2f43-f824-41dd-a701-71426bb01d90" containerName="extract-utilities" Mar 10 22:52:00 crc kubenswrapper[4919]: E0310 22:52:00.165922 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82748538-5bfe-448f-ba19-c9b8d61fe7bf" containerName="extract-content" Mar 10 22:52:00 crc kubenswrapper[4919]: I0310 22:52:00.165939 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="82748538-5bfe-448f-ba19-c9b8d61fe7bf" containerName="extract-content" Mar 10 22:52:00 crc kubenswrapper[4919]: E0310 22:52:00.165959 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c0a2f43-f824-41dd-a701-71426bb01d90" containerName="extract-content" Mar 10 22:52:00 crc kubenswrapper[4919]: I0310 22:52:00.165978 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c0a2f43-f824-41dd-a701-71426bb01d90" containerName="extract-content" Mar 10 22:52:00 crc kubenswrapper[4919]: E0310 22:52:00.166018 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c0a2f43-f824-41dd-a701-71426bb01d90" containerName="registry-server" Mar 10 22:52:00 crc kubenswrapper[4919]: I0310 22:52:00.166034 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c0a2f43-f824-41dd-a701-71426bb01d90" containerName="registry-server" Mar 10 22:52:00 crc kubenswrapper[4919]: E0310 22:52:00.166062 4919 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="82748538-5bfe-448f-ba19-c9b8d61fe7bf" containerName="extract-utilities" Mar 10 22:52:00 crc kubenswrapper[4919]: I0310 22:52:00.166078 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="82748538-5bfe-448f-ba19-c9b8d61fe7bf" containerName="extract-utilities" Mar 10 22:52:00 crc kubenswrapper[4919]: E0310 22:52:00.166105 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82748538-5bfe-448f-ba19-c9b8d61fe7bf" containerName="registry-server" Mar 10 22:52:00 crc kubenswrapper[4919]: I0310 22:52:00.166124 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="82748538-5bfe-448f-ba19-c9b8d61fe7bf" containerName="registry-server" Mar 10 22:52:00 crc kubenswrapper[4919]: I0310 22:52:00.166463 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c0a2f43-f824-41dd-a701-71426bb01d90" containerName="registry-server" Mar 10 22:52:00 crc kubenswrapper[4919]: I0310 22:52:00.178727 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="82748538-5bfe-448f-ba19-c9b8d61fe7bf" containerName="registry-server" Mar 10 22:52:00 crc kubenswrapper[4919]: I0310 22:52:00.179387 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553052-flhp6"] Mar 10 22:52:00 crc kubenswrapper[4919]: I0310 22:52:00.179602 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553052-flhp6" Mar 10 22:52:00 crc kubenswrapper[4919]: I0310 22:52:00.183531 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 22:52:00 crc kubenswrapper[4919]: I0310 22:52:00.183550 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 22:52:00 crc kubenswrapper[4919]: I0310 22:52:00.184166 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wbvtv" Mar 10 22:52:00 crc kubenswrapper[4919]: I0310 22:52:00.208865 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-866ld\" (UniqueName: \"kubernetes.io/projected/beaa083c-7b7d-4e9f-b29b-5b3d8fa1265e-kube-api-access-866ld\") pod \"auto-csr-approver-29553052-flhp6\" (UID: \"beaa083c-7b7d-4e9f-b29b-5b3d8fa1265e\") " pod="openshift-infra/auto-csr-approver-29553052-flhp6" Mar 10 22:52:00 crc kubenswrapper[4919]: I0310 22:52:00.310054 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-866ld\" (UniqueName: \"kubernetes.io/projected/beaa083c-7b7d-4e9f-b29b-5b3d8fa1265e-kube-api-access-866ld\") pod \"auto-csr-approver-29553052-flhp6\" (UID: \"beaa083c-7b7d-4e9f-b29b-5b3d8fa1265e\") " pod="openshift-infra/auto-csr-approver-29553052-flhp6" Mar 10 22:52:00 crc kubenswrapper[4919]: I0310 22:52:00.331883 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-866ld\" (UniqueName: \"kubernetes.io/projected/beaa083c-7b7d-4e9f-b29b-5b3d8fa1265e-kube-api-access-866ld\") pod \"auto-csr-approver-29553052-flhp6\" (UID: \"beaa083c-7b7d-4e9f-b29b-5b3d8fa1265e\") " pod="openshift-infra/auto-csr-approver-29553052-flhp6" Mar 10 22:52:00 crc kubenswrapper[4919]: I0310 22:52:00.506563 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553052-flhp6" Mar 10 22:52:00 crc kubenswrapper[4919]: I0310 22:52:00.952794 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553052-flhp6"] Mar 10 22:52:00 crc kubenswrapper[4919]: I0310 22:52:00.964928 4919 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 10 22:52:01 crc kubenswrapper[4919]: I0310 22:52:01.894970 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553052-flhp6" event={"ID":"beaa083c-7b7d-4e9f-b29b-5b3d8fa1265e","Type":"ContainerStarted","Data":"a876fa90b1efbf76464547c07f845df0779015f70d87e1d2731ba69cfd38e430"} Mar 10 22:52:02 crc kubenswrapper[4919]: I0310 22:52:02.479769 4919 scope.go:117] "RemoveContainer" containerID="04b44634b44ad034f1ca0fde2bacc28827ffaa56935af87501bd193efff921b2" Mar 10 22:52:02 crc kubenswrapper[4919]: E0310 22:52:02.480236 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" Mar 10 22:52:02 crc kubenswrapper[4919]: I0310 22:52:02.903374 4919 generic.go:334] "Generic (PLEG): container finished" podID="beaa083c-7b7d-4e9f-b29b-5b3d8fa1265e" containerID="861d1215ae17cb620d6330ad57394075c429bcc79c18127c063d700cc5a158f6" exitCode=0 Mar 10 22:52:02 crc kubenswrapper[4919]: I0310 22:52:02.903484 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553052-flhp6" event={"ID":"beaa083c-7b7d-4e9f-b29b-5b3d8fa1265e","Type":"ContainerDied","Data":"861d1215ae17cb620d6330ad57394075c429bcc79c18127c063d700cc5a158f6"} Mar 10 
22:52:04 crc kubenswrapper[4919]: I0310 22:52:04.210194 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553052-flhp6" Mar 10 22:52:04 crc kubenswrapper[4919]: I0310 22:52:04.363190 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-866ld\" (UniqueName: \"kubernetes.io/projected/beaa083c-7b7d-4e9f-b29b-5b3d8fa1265e-kube-api-access-866ld\") pod \"beaa083c-7b7d-4e9f-b29b-5b3d8fa1265e\" (UID: \"beaa083c-7b7d-4e9f-b29b-5b3d8fa1265e\") " Mar 10 22:52:04 crc kubenswrapper[4919]: I0310 22:52:04.368592 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/beaa083c-7b7d-4e9f-b29b-5b3d8fa1265e-kube-api-access-866ld" (OuterVolumeSpecName: "kube-api-access-866ld") pod "beaa083c-7b7d-4e9f-b29b-5b3d8fa1265e" (UID: "beaa083c-7b7d-4e9f-b29b-5b3d8fa1265e"). InnerVolumeSpecName "kube-api-access-866ld". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 22:52:04 crc kubenswrapper[4919]: I0310 22:52:04.465456 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-866ld\" (UniqueName: \"kubernetes.io/projected/beaa083c-7b7d-4e9f-b29b-5b3d8fa1265e-kube-api-access-866ld\") on node \"crc\" DevicePath \"\"" Mar 10 22:52:04 crc kubenswrapper[4919]: I0310 22:52:04.921070 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553052-flhp6" event={"ID":"beaa083c-7b7d-4e9f-b29b-5b3d8fa1265e","Type":"ContainerDied","Data":"a876fa90b1efbf76464547c07f845df0779015f70d87e1d2731ba69cfd38e430"} Mar 10 22:52:04 crc kubenswrapper[4919]: I0310 22:52:04.921376 4919 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a876fa90b1efbf76464547c07f845df0779015f70d87e1d2731ba69cfd38e430" Mar 10 22:52:04 crc kubenswrapper[4919]: I0310 22:52:04.921169 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553052-flhp6" Mar 10 22:52:05 crc kubenswrapper[4919]: I0310 22:52:05.279596 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553046-kwmz8"] Mar 10 22:52:05 crc kubenswrapper[4919]: I0310 22:52:05.284810 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553046-kwmz8"] Mar 10 22:52:05 crc kubenswrapper[4919]: I0310 22:52:05.489165 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c85c10bd-7458-4c76-8e95-8100a55aa96e" path="/var/lib/kubelet/pods/c85c10bd-7458-4c76-8e95-8100a55aa96e/volumes" Mar 10 22:52:15 crc kubenswrapper[4919]: I0310 22:52:15.479837 4919 scope.go:117] "RemoveContainer" containerID="04b44634b44ad034f1ca0fde2bacc28827ffaa56935af87501bd193efff921b2" Mar 10 22:52:15 crc kubenswrapper[4919]: E0310 22:52:15.481955 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" Mar 10 22:52:29 crc kubenswrapper[4919]: I0310 22:52:29.480476 4919 scope.go:117] "RemoveContainer" containerID="04b44634b44ad034f1ca0fde2bacc28827ffaa56935af87501bd193efff921b2" Mar 10 22:52:29 crc kubenswrapper[4919]: E0310 22:52:29.481320 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" 
podUID="566678d1-f416-4116-ab20-b30dceb86cdc" Mar 10 22:52:41 crc kubenswrapper[4919]: I0310 22:52:41.481075 4919 scope.go:117] "RemoveContainer" containerID="04b44634b44ad034f1ca0fde2bacc28827ffaa56935af87501bd193efff921b2" Mar 10 22:52:41 crc kubenswrapper[4919]: E0310 22:52:41.481759 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" Mar 10 22:52:45 crc kubenswrapper[4919]: I0310 22:52:45.175562 4919 scope.go:117] "RemoveContainer" containerID="07663a94e640ea3eaacb412322f9944eda606dc071acd3577c5e49e7d975048e" Mar 10 22:52:55 crc kubenswrapper[4919]: I0310 22:52:55.479876 4919 scope.go:117] "RemoveContainer" containerID="04b44634b44ad034f1ca0fde2bacc28827ffaa56935af87501bd193efff921b2" Mar 10 22:52:55 crc kubenswrapper[4919]: E0310 22:52:55.480703 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" Mar 10 22:53:07 crc kubenswrapper[4919]: I0310 22:53:07.480064 4919 scope.go:117] "RemoveContainer" containerID="04b44634b44ad034f1ca0fde2bacc28827ffaa56935af87501bd193efff921b2" Mar 10 22:53:07 crc kubenswrapper[4919]: E0310 22:53:07.481105 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" Mar 10 22:53:22 crc kubenswrapper[4919]: I0310 22:53:22.480074 4919 scope.go:117] "RemoveContainer" containerID="04b44634b44ad034f1ca0fde2bacc28827ffaa56935af87501bd193efff921b2" Mar 10 22:53:22 crc kubenswrapper[4919]: E0310 22:53:22.480754 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" Mar 10 22:53:33 crc kubenswrapper[4919]: I0310 22:53:33.480500 4919 scope.go:117] "RemoveContainer" containerID="04b44634b44ad034f1ca0fde2bacc28827ffaa56935af87501bd193efff921b2" Mar 10 22:53:33 crc kubenswrapper[4919]: E0310 22:53:33.481373 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" Mar 10 22:53:48 crc kubenswrapper[4919]: I0310 22:53:48.480817 4919 scope.go:117] "RemoveContainer" containerID="04b44634b44ad034f1ca0fde2bacc28827ffaa56935af87501bd193efff921b2" Mar 10 22:53:48 crc kubenswrapper[4919]: E0310 22:53:48.482121 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" Mar 10 22:54:00 crc kubenswrapper[4919]: I0310 22:54:00.159780 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553054-wlswh"] Mar 10 22:54:00 crc kubenswrapper[4919]: E0310 22:54:00.160566 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="beaa083c-7b7d-4e9f-b29b-5b3d8fa1265e" containerName="oc" Mar 10 22:54:00 crc kubenswrapper[4919]: I0310 22:54:00.160582 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="beaa083c-7b7d-4e9f-b29b-5b3d8fa1265e" containerName="oc" Mar 10 22:54:00 crc kubenswrapper[4919]: I0310 22:54:00.160757 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="beaa083c-7b7d-4e9f-b29b-5b3d8fa1265e" containerName="oc" Mar 10 22:54:00 crc kubenswrapper[4919]: I0310 22:54:00.161270 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553054-wlswh" Mar 10 22:54:00 crc kubenswrapper[4919]: I0310 22:54:00.164356 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 22:54:00 crc kubenswrapper[4919]: I0310 22:54:00.165614 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wbvtv" Mar 10 22:54:00 crc kubenswrapper[4919]: I0310 22:54:00.166284 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 22:54:00 crc kubenswrapper[4919]: I0310 22:54:00.174530 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553054-wlswh"] Mar 10 22:54:00 crc kubenswrapper[4919]: I0310 22:54:00.228222 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56w9j\" (UniqueName: \"kubernetes.io/projected/464ad8aa-5fb0-4bc1-b1a6-4359ebefdd94-kube-api-access-56w9j\") pod \"auto-csr-approver-29553054-wlswh\" (UID: \"464ad8aa-5fb0-4bc1-b1a6-4359ebefdd94\") " pod="openshift-infra/auto-csr-approver-29553054-wlswh" Mar 10 22:54:00 crc kubenswrapper[4919]: I0310 22:54:00.328811 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-56w9j\" (UniqueName: \"kubernetes.io/projected/464ad8aa-5fb0-4bc1-b1a6-4359ebefdd94-kube-api-access-56w9j\") pod \"auto-csr-approver-29553054-wlswh\" (UID: \"464ad8aa-5fb0-4bc1-b1a6-4359ebefdd94\") " pod="openshift-infra/auto-csr-approver-29553054-wlswh" Mar 10 22:54:00 crc kubenswrapper[4919]: I0310 22:54:00.361464 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-56w9j\" (UniqueName: \"kubernetes.io/projected/464ad8aa-5fb0-4bc1-b1a6-4359ebefdd94-kube-api-access-56w9j\") pod \"auto-csr-approver-29553054-wlswh\" (UID: \"464ad8aa-5fb0-4bc1-b1a6-4359ebefdd94\") " 
pod="openshift-infra/auto-csr-approver-29553054-wlswh" Mar 10 22:54:00 crc kubenswrapper[4919]: I0310 22:54:00.498497 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553054-wlswh" Mar 10 22:54:00 crc kubenswrapper[4919]: I0310 22:54:00.923667 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553054-wlswh"] Mar 10 22:54:01 crc kubenswrapper[4919]: I0310 22:54:01.821228 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553054-wlswh" event={"ID":"464ad8aa-5fb0-4bc1-b1a6-4359ebefdd94","Type":"ContainerStarted","Data":"fa38939e72104a40072d29de95a7f31eec9b798a81299fe60954ea9bd2ce1dcf"} Mar 10 22:54:02 crc kubenswrapper[4919]: I0310 22:54:02.479783 4919 scope.go:117] "RemoveContainer" containerID="04b44634b44ad034f1ca0fde2bacc28827ffaa56935af87501bd193efff921b2" Mar 10 22:54:02 crc kubenswrapper[4919]: E0310 22:54:02.480339 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" Mar 10 22:54:02 crc kubenswrapper[4919]: I0310 22:54:02.830200 4919 generic.go:334] "Generic (PLEG): container finished" podID="464ad8aa-5fb0-4bc1-b1a6-4359ebefdd94" containerID="a101c51ebe071dbd405d100b6da60a94012163e27c4b1d8bedd2ce8853daf0f1" exitCode=0 Mar 10 22:54:02 crc kubenswrapper[4919]: I0310 22:54:02.830241 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553054-wlswh" event={"ID":"464ad8aa-5fb0-4bc1-b1a6-4359ebefdd94","Type":"ContainerDied","Data":"a101c51ebe071dbd405d100b6da60a94012163e27c4b1d8bedd2ce8853daf0f1"} 
Mar 10 22:54:04 crc kubenswrapper[4919]: I0310 22:54:04.167017 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553054-wlswh" Mar 10 22:54:04 crc kubenswrapper[4919]: I0310 22:54:04.281417 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-56w9j\" (UniqueName: \"kubernetes.io/projected/464ad8aa-5fb0-4bc1-b1a6-4359ebefdd94-kube-api-access-56w9j\") pod \"464ad8aa-5fb0-4bc1-b1a6-4359ebefdd94\" (UID: \"464ad8aa-5fb0-4bc1-b1a6-4359ebefdd94\") " Mar 10 22:54:04 crc kubenswrapper[4919]: I0310 22:54:04.286934 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/464ad8aa-5fb0-4bc1-b1a6-4359ebefdd94-kube-api-access-56w9j" (OuterVolumeSpecName: "kube-api-access-56w9j") pod "464ad8aa-5fb0-4bc1-b1a6-4359ebefdd94" (UID: "464ad8aa-5fb0-4bc1-b1a6-4359ebefdd94"). InnerVolumeSpecName "kube-api-access-56w9j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 22:54:04 crc kubenswrapper[4919]: I0310 22:54:04.383304 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-56w9j\" (UniqueName: \"kubernetes.io/projected/464ad8aa-5fb0-4bc1-b1a6-4359ebefdd94-kube-api-access-56w9j\") on node \"crc\" DevicePath \"\"" Mar 10 22:54:04 crc kubenswrapper[4919]: I0310 22:54:04.843485 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553054-wlswh" event={"ID":"464ad8aa-5fb0-4bc1-b1a6-4359ebefdd94","Type":"ContainerDied","Data":"fa38939e72104a40072d29de95a7f31eec9b798a81299fe60954ea9bd2ce1dcf"} Mar 10 22:54:04 crc kubenswrapper[4919]: I0310 22:54:04.843528 4919 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fa38939e72104a40072d29de95a7f31eec9b798a81299fe60954ea9bd2ce1dcf" Mar 10 22:54:04 crc kubenswrapper[4919]: I0310 22:54:04.843862 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553054-wlswh"
Mar 10 22:54:05 crc kubenswrapper[4919]: I0310 22:54:05.245749 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553048-f57pj"]
Mar 10 22:54:05 crc kubenswrapper[4919]: I0310 22:54:05.250968 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553048-f57pj"]
Mar 10 22:54:05 crc kubenswrapper[4919]: I0310 22:54:05.494960 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e83298b5-dee4-4e12-aed2-011c8a6e2b30" path="/var/lib/kubelet/pods/e83298b5-dee4-4e12-aed2-011c8a6e2b30/volumes"
Mar 10 22:54:14 crc kubenswrapper[4919]: I0310 22:54:14.480192 4919 scope.go:117] "RemoveContainer" containerID="04b44634b44ad034f1ca0fde2bacc28827ffaa56935af87501bd193efff921b2"
Mar 10 22:54:14 crc kubenswrapper[4919]: E0310 22:54:14.481210 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc"
Mar 10 22:54:25 crc kubenswrapper[4919]: I0310 22:54:25.481091 4919 scope.go:117] "RemoveContainer" containerID="04b44634b44ad034f1ca0fde2bacc28827ffaa56935af87501bd193efff921b2"
Mar 10 22:54:25 crc kubenswrapper[4919]: E0310 22:54:25.482362 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc"
Mar 10 22:54:38 crc kubenswrapper[4919]: I0310 22:54:38.480920 4919 scope.go:117] "RemoveContainer" containerID="04b44634b44ad034f1ca0fde2bacc28827ffaa56935af87501bd193efff921b2"
Mar 10 22:54:38 crc kubenswrapper[4919]: E0310 22:54:38.481729 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc"
Mar 10 22:54:45 crc kubenswrapper[4919]: I0310 22:54:45.262381 4919 scope.go:117] "RemoveContainer" containerID="af9836be971ca09598235169839f6617eb1d2ad9f6d9f052df699df6ef0bcfe0"
Mar 10 22:54:51 crc kubenswrapper[4919]: I0310 22:54:51.481779 4919 scope.go:117] "RemoveContainer" containerID="04b44634b44ad034f1ca0fde2bacc28827ffaa56935af87501bd193efff921b2"
Mar 10 22:54:51 crc kubenswrapper[4919]: E0310 22:54:51.483111 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc"
Mar 10 22:55:03 crc kubenswrapper[4919]: I0310 22:55:03.490629 4919 scope.go:117] "RemoveContainer" containerID="04b44634b44ad034f1ca0fde2bacc28827ffaa56935af87501bd193efff921b2"
Mar 10 22:55:03 crc kubenswrapper[4919]: E0310 22:55:03.491492 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc"
Mar 10 22:55:14 crc kubenswrapper[4919]: I0310 22:55:14.480588 4919 scope.go:117] "RemoveContainer" containerID="04b44634b44ad034f1ca0fde2bacc28827ffaa56935af87501bd193efff921b2"
Mar 10 22:55:14 crc kubenswrapper[4919]: E0310 22:55:14.481203 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc"
Mar 10 22:55:29 crc kubenswrapper[4919]: I0310 22:55:29.479855 4919 scope.go:117] "RemoveContainer" containerID="04b44634b44ad034f1ca0fde2bacc28827ffaa56935af87501bd193efff921b2"
Mar 10 22:55:30 crc kubenswrapper[4919]: I0310 22:55:30.571082 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" event={"ID":"566678d1-f416-4116-ab20-b30dceb86cdc","Type":"ContainerStarted","Data":"53e8434ecb83f239bb7e4814454c11ba1c0448d2d6ba0068e5ca799c07cd5409"}
Mar 10 22:56:00 crc kubenswrapper[4919]: I0310 22:56:00.161475 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553056-mxpkl"]
Mar 10 22:56:00 crc kubenswrapper[4919]: E0310 22:56:00.162749 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="464ad8aa-5fb0-4bc1-b1a6-4359ebefdd94" containerName="oc"
Mar 10 22:56:00 crc kubenswrapper[4919]: I0310 22:56:00.162784 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="464ad8aa-5fb0-4bc1-b1a6-4359ebefdd94" containerName="oc"
Mar 10 22:56:00 crc kubenswrapper[4919]: I0310 22:56:00.163177 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="464ad8aa-5fb0-4bc1-b1a6-4359ebefdd94" containerName="oc"
Mar 10 22:56:00 crc kubenswrapper[4919]: I0310 22:56:00.164158 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553056-mxpkl"
Mar 10 22:56:00 crc kubenswrapper[4919]: I0310 22:56:00.173501 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 10 22:56:00 crc kubenswrapper[4919]: I0310 22:56:00.173876 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wbvtv"
Mar 10 22:56:00 crc kubenswrapper[4919]: I0310 22:56:00.175071 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 10 22:56:00 crc kubenswrapper[4919]: I0310 22:56:00.175527 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553056-mxpkl"]
Mar 10 22:56:00 crc kubenswrapper[4919]: I0310 22:56:00.246530 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cs5xg\" (UniqueName: \"kubernetes.io/projected/9275d8f0-0612-4db0-8167-d891ba7efb59-kube-api-access-cs5xg\") pod \"auto-csr-approver-29553056-mxpkl\" (UID: \"9275d8f0-0612-4db0-8167-d891ba7efb59\") " pod="openshift-infra/auto-csr-approver-29553056-mxpkl"
Mar 10 22:56:00 crc kubenswrapper[4919]: I0310 22:56:00.348899 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cs5xg\" (UniqueName: \"kubernetes.io/projected/9275d8f0-0612-4db0-8167-d891ba7efb59-kube-api-access-cs5xg\") pod \"auto-csr-approver-29553056-mxpkl\" (UID: \"9275d8f0-0612-4db0-8167-d891ba7efb59\") " pod="openshift-infra/auto-csr-approver-29553056-mxpkl"
Mar 10 22:56:00 crc kubenswrapper[4919]: I0310 22:56:00.386776 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cs5xg\" (UniqueName: \"kubernetes.io/projected/9275d8f0-0612-4db0-8167-d891ba7efb59-kube-api-access-cs5xg\") pod \"auto-csr-approver-29553056-mxpkl\" (UID: \"9275d8f0-0612-4db0-8167-d891ba7efb59\") " pod="openshift-infra/auto-csr-approver-29553056-mxpkl"
Mar 10 22:56:00 crc kubenswrapper[4919]: I0310 22:56:00.484485 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553056-mxpkl"
Mar 10 22:56:00 crc kubenswrapper[4919]: I0310 22:56:00.920126 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553056-mxpkl"]
Mar 10 22:56:01 crc kubenswrapper[4919]: I0310 22:56:01.840779 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553056-mxpkl" event={"ID":"9275d8f0-0612-4db0-8167-d891ba7efb59","Type":"ContainerStarted","Data":"f43882c0fa9f110e4c209facd6f04b282ee1bddc2503aef4364553420e7ac27d"}
Mar 10 22:56:02 crc kubenswrapper[4919]: I0310 22:56:02.850326 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553056-mxpkl" event={"ID":"9275d8f0-0612-4db0-8167-d891ba7efb59","Type":"ContainerStarted","Data":"710ded4218c98dbee0a107c0bad53217eb0af727bc1e9d9b06f510dbc2f03318"}
Mar 10 22:56:02 crc kubenswrapper[4919]: I0310 22:56:02.866565 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29553056-mxpkl" podStartSLOduration=1.7788737989999999 podStartE2EDuration="2.866543809s" podCreationTimestamp="2026-03-10 22:56:00 +0000 UTC" firstStartedPulling="2026-03-10 22:56:00.932416712 +0000 UTC m=+3948.174297320" lastFinishedPulling="2026-03-10 22:56:02.020086722 +0000 UTC m=+3949.261967330" observedRunningTime="2026-03-10 22:56:02.864162593 +0000 UTC m=+3950.106043221" watchObservedRunningTime="2026-03-10 22:56:02.866543809 +0000 UTC m=+3950.108424407"
Mar 10 22:56:03 crc kubenswrapper[4919]: I0310 22:56:03.860400 4919 generic.go:334] "Generic (PLEG): container finished" podID="9275d8f0-0612-4db0-8167-d891ba7efb59" containerID="710ded4218c98dbee0a107c0bad53217eb0af727bc1e9d9b06f510dbc2f03318" exitCode=0
Mar 10 22:56:03 crc kubenswrapper[4919]: I0310 22:56:03.860480 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553056-mxpkl" event={"ID":"9275d8f0-0612-4db0-8167-d891ba7efb59","Type":"ContainerDied","Data":"710ded4218c98dbee0a107c0bad53217eb0af727bc1e9d9b06f510dbc2f03318"}
Mar 10 22:56:05 crc kubenswrapper[4919]: I0310 22:56:05.154762 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553056-mxpkl"
Mar 10 22:56:05 crc kubenswrapper[4919]: I0310 22:56:05.319755 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cs5xg\" (UniqueName: \"kubernetes.io/projected/9275d8f0-0612-4db0-8167-d891ba7efb59-kube-api-access-cs5xg\") pod \"9275d8f0-0612-4db0-8167-d891ba7efb59\" (UID: \"9275d8f0-0612-4db0-8167-d891ba7efb59\") "
Mar 10 22:56:05 crc kubenswrapper[4919]: I0310 22:56:05.324855 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9275d8f0-0612-4db0-8167-d891ba7efb59-kube-api-access-cs5xg" (OuterVolumeSpecName: "kube-api-access-cs5xg") pod "9275d8f0-0612-4db0-8167-d891ba7efb59" (UID: "9275d8f0-0612-4db0-8167-d891ba7efb59"). InnerVolumeSpecName "kube-api-access-cs5xg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 22:56:05 crc kubenswrapper[4919]: I0310 22:56:05.421626 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cs5xg\" (UniqueName: \"kubernetes.io/projected/9275d8f0-0612-4db0-8167-d891ba7efb59-kube-api-access-cs5xg\") on node \"crc\" DevicePath \"\""
Mar 10 22:56:05 crc kubenswrapper[4919]: I0310 22:56:05.881425 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553056-mxpkl" event={"ID":"9275d8f0-0612-4db0-8167-d891ba7efb59","Type":"ContainerDied","Data":"f43882c0fa9f110e4c209facd6f04b282ee1bddc2503aef4364553420e7ac27d"}
Mar 10 22:56:05 crc kubenswrapper[4919]: I0310 22:56:05.881467 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553056-mxpkl"
Mar 10 22:56:05 crc kubenswrapper[4919]: I0310 22:56:05.881482 4919 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f43882c0fa9f110e4c209facd6f04b282ee1bddc2503aef4364553420e7ac27d"
Mar 10 22:56:05 crc kubenswrapper[4919]: I0310 22:56:05.950007 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553050-mq9qr"]
Mar 10 22:56:05 crc kubenswrapper[4919]: I0310 22:56:05.957512 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553050-mq9qr"]
Mar 10 22:56:07 crc kubenswrapper[4919]: I0310 22:56:07.515886 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb6d7798-e079-471b-92ee-b895caeff582" path="/var/lib/kubelet/pods/eb6d7798-e079-471b-92ee-b895caeff582/volumes"
Mar 10 22:56:45 crc kubenswrapper[4919]: I0310 22:56:45.375166 4919 scope.go:117] "RemoveContainer" containerID="5604fcdf3e68725135d51ad90f25ecd86367e7b42b3f0ff7c4087345c87db9d7"
Mar 10 22:57:29 crc kubenswrapper[4919]: I0310 22:57:29.175559 4919 patch_prober.go:28] interesting pod/machine-config-daemon-z7v4t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 10 22:57:29 crc kubenswrapper[4919]: I0310 22:57:29.176634 4919 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 10 22:57:59 crc kubenswrapper[4919]: I0310 22:57:59.175489 4919 patch_prober.go:28] interesting pod/machine-config-daemon-z7v4t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 10 22:57:59 crc kubenswrapper[4919]: I0310 22:57:59.176200 4919 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 10 22:58:00 crc kubenswrapper[4919]: I0310 22:58:00.161953 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553058-f7p8l"]
Mar 10 22:58:00 crc kubenswrapper[4919]: E0310 22:58:00.162977 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9275d8f0-0612-4db0-8167-d891ba7efb59" containerName="oc"
Mar 10 22:58:00 crc kubenswrapper[4919]: I0310 22:58:00.163005 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="9275d8f0-0612-4db0-8167-d891ba7efb59" containerName="oc"
Mar 10 22:58:00 crc kubenswrapper[4919]: I0310 22:58:00.163228 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="9275d8f0-0612-4db0-8167-d891ba7efb59" containerName="oc"
Mar 10 22:58:00 crc kubenswrapper[4919]: I0310 22:58:00.164033 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553058-f7p8l"
Mar 10 22:58:00 crc kubenswrapper[4919]: I0310 22:58:00.167557 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 10 22:58:00 crc kubenswrapper[4919]: I0310 22:58:00.167586 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 10 22:58:00 crc kubenswrapper[4919]: I0310 22:58:00.167737 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wbvtv"
Mar 10 22:58:00 crc kubenswrapper[4919]: I0310 22:58:00.190527 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553058-f7p8l"]
Mar 10 22:58:00 crc kubenswrapper[4919]: I0310 22:58:00.201407 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ppvjl\" (UniqueName: \"kubernetes.io/projected/365d2b3f-3786-49df-937f-b35e69aca426-kube-api-access-ppvjl\") pod \"auto-csr-approver-29553058-f7p8l\" (UID: \"365d2b3f-3786-49df-937f-b35e69aca426\") " pod="openshift-infra/auto-csr-approver-29553058-f7p8l"
Mar 10 22:58:00 crc kubenswrapper[4919]: I0310 22:58:00.302788 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ppvjl\" (UniqueName: \"kubernetes.io/projected/365d2b3f-3786-49df-937f-b35e69aca426-kube-api-access-ppvjl\") pod \"auto-csr-approver-29553058-f7p8l\" (UID: \"365d2b3f-3786-49df-937f-b35e69aca426\") " pod="openshift-infra/auto-csr-approver-29553058-f7p8l"
Mar 10 22:58:00 crc kubenswrapper[4919]: I0310 22:58:00.321185 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ppvjl\" (UniqueName: \"kubernetes.io/projected/365d2b3f-3786-49df-937f-b35e69aca426-kube-api-access-ppvjl\") pod \"auto-csr-approver-29553058-f7p8l\" (UID: \"365d2b3f-3786-49df-937f-b35e69aca426\") " pod="openshift-infra/auto-csr-approver-29553058-f7p8l"
Mar 10 22:58:00 crc kubenswrapper[4919]: I0310 22:58:00.504803 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553058-f7p8l"
Mar 10 22:58:00 crc kubenswrapper[4919]: I0310 22:58:00.947922 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553058-f7p8l"]
Mar 10 22:58:00 crc kubenswrapper[4919]: I0310 22:58:00.969386 4919 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 10 22:58:01 crc kubenswrapper[4919]: I0310 22:58:01.247947 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553058-f7p8l" event={"ID":"365d2b3f-3786-49df-937f-b35e69aca426","Type":"ContainerStarted","Data":"6d644e5c59ea297d1816480e5825f7e1001d05005269334cdc508bf50b0b09d8"}
Mar 10 22:58:02 crc kubenswrapper[4919]: I0310 22:58:02.257661 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553058-f7p8l" event={"ID":"365d2b3f-3786-49df-937f-b35e69aca426","Type":"ContainerStarted","Data":"11256ff6e4405d720c8472d5ed25a2f1c54da1f60ee12ae4580e7dc869b895c7"}
Mar 10 22:58:02 crc kubenswrapper[4919]: I0310 22:58:02.274913 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29553058-f7p8l" podStartSLOduration=1.391411349 podStartE2EDuration="2.274895486s" podCreationTimestamp="2026-03-10 22:58:00 +0000 UTC" firstStartedPulling="2026-03-10 22:58:00.969097881 +0000 UTC m=+4068.210978499" lastFinishedPulling="2026-03-10 22:58:01.852582028 +0000 UTC m=+4069.094462636" observedRunningTime="2026-03-10 22:58:02.269336334 +0000 UTC m=+4069.511216942" watchObservedRunningTime="2026-03-10 22:58:02.274895486 +0000 UTC m=+4069.516776094"
Mar 10 22:58:03 crc kubenswrapper[4919]: I0310 22:58:03.267254 4919 generic.go:334] "Generic (PLEG): container finished" podID="365d2b3f-3786-49df-937f-b35e69aca426" containerID="11256ff6e4405d720c8472d5ed25a2f1c54da1f60ee12ae4580e7dc869b895c7" exitCode=0
Mar 10 22:58:03 crc kubenswrapper[4919]: I0310 22:58:03.267461 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553058-f7p8l" event={"ID":"365d2b3f-3786-49df-937f-b35e69aca426","Type":"ContainerDied","Data":"11256ff6e4405d720c8472d5ed25a2f1c54da1f60ee12ae4580e7dc869b895c7"}
Mar 10 22:58:04 crc kubenswrapper[4919]: I0310 22:58:04.583753 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553058-f7p8l"
Mar 10 22:58:04 crc kubenswrapper[4919]: I0310 22:58:04.761722 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ppvjl\" (UniqueName: \"kubernetes.io/projected/365d2b3f-3786-49df-937f-b35e69aca426-kube-api-access-ppvjl\") pod \"365d2b3f-3786-49df-937f-b35e69aca426\" (UID: \"365d2b3f-3786-49df-937f-b35e69aca426\") "
Mar 10 22:58:04 crc kubenswrapper[4919]: I0310 22:58:04.768238 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/365d2b3f-3786-49df-937f-b35e69aca426-kube-api-access-ppvjl" (OuterVolumeSpecName: "kube-api-access-ppvjl") pod "365d2b3f-3786-49df-937f-b35e69aca426" (UID: "365d2b3f-3786-49df-937f-b35e69aca426"). InnerVolumeSpecName "kube-api-access-ppvjl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 22:58:04 crc kubenswrapper[4919]: I0310 22:58:04.863600 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ppvjl\" (UniqueName: \"kubernetes.io/projected/365d2b3f-3786-49df-937f-b35e69aca426-kube-api-access-ppvjl\") on node \"crc\" DevicePath \"\""
Mar 10 22:58:05 crc kubenswrapper[4919]: I0310 22:58:05.287520 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553058-f7p8l" event={"ID":"365d2b3f-3786-49df-937f-b35e69aca426","Type":"ContainerDied","Data":"6d644e5c59ea297d1816480e5825f7e1001d05005269334cdc508bf50b0b09d8"}
Mar 10 22:58:05 crc kubenswrapper[4919]: I0310 22:58:05.287559 4919 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6d644e5c59ea297d1816480e5825f7e1001d05005269334cdc508bf50b0b09d8"
Mar 10 22:58:05 crc kubenswrapper[4919]: I0310 22:58:05.287617 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553058-f7p8l"
Mar 10 22:58:05 crc kubenswrapper[4919]: I0310 22:58:05.350694 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553052-flhp6"]
Mar 10 22:58:05 crc kubenswrapper[4919]: I0310 22:58:05.355922 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553052-flhp6"]
Mar 10 22:58:05 crc kubenswrapper[4919]: I0310 22:58:05.488374 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="beaa083c-7b7d-4e9f-b29b-5b3d8fa1265e" path="/var/lib/kubelet/pods/beaa083c-7b7d-4e9f-b29b-5b3d8fa1265e/volumes"
Mar 10 22:58:29 crc kubenswrapper[4919]: I0310 22:58:29.175749 4919 patch_prober.go:28] interesting pod/machine-config-daemon-z7v4t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 10 22:58:29 crc kubenswrapper[4919]: I0310 22:58:29.176476 4919 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 10 22:58:29 crc kubenswrapper[4919]: I0310 22:58:29.176530 4919 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t"
Mar 10 22:58:29 crc kubenswrapper[4919]: I0310 22:58:29.177193 4919 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"53e8434ecb83f239bb7e4814454c11ba1c0448d2d6ba0068e5ca799c07cd5409"} pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 10 22:58:29 crc kubenswrapper[4919]: I0310 22:58:29.177259 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" containerName="machine-config-daemon" containerID="cri-o://53e8434ecb83f239bb7e4814454c11ba1c0448d2d6ba0068e5ca799c07cd5409" gracePeriod=600
Mar 10 22:58:29 crc kubenswrapper[4919]: I0310 22:58:29.508732 4919 generic.go:334] "Generic (PLEG): container finished" podID="566678d1-f416-4116-ab20-b30dceb86cdc" containerID="53e8434ecb83f239bb7e4814454c11ba1c0448d2d6ba0068e5ca799c07cd5409" exitCode=0
Mar 10 22:58:29 crc kubenswrapper[4919]: I0310 22:58:29.509147 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" event={"ID":"566678d1-f416-4116-ab20-b30dceb86cdc","Type":"ContainerDied","Data":"53e8434ecb83f239bb7e4814454c11ba1c0448d2d6ba0068e5ca799c07cd5409"}
Mar 10 22:58:29 crc kubenswrapper[4919]: I0310 22:58:29.509181 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" event={"ID":"566678d1-f416-4116-ab20-b30dceb86cdc","Type":"ContainerStarted","Data":"46d1affa4c8b3ba64d85713f9411fabf43bf277ed9af6d487847e1f0fe2d5cc6"}
Mar 10 22:58:29 crc kubenswrapper[4919]: I0310 22:58:29.509202 4919 scope.go:117] "RemoveContainer" containerID="04b44634b44ad034f1ca0fde2bacc28827ffaa56935af87501bd193efff921b2"
Mar 10 22:58:45 crc kubenswrapper[4919]: I0310 22:58:45.558502 4919 scope.go:117] "RemoveContainer" containerID="861d1215ae17cb620d6330ad57394075c429bcc79c18127c063d700cc5a158f6"
Mar 10 22:59:17 crc kubenswrapper[4919]: I0310 22:59:17.238041 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-shxst"]
Mar 10 22:59:17 crc kubenswrapper[4919]: E0310 22:59:17.240015 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="365d2b3f-3786-49df-937f-b35e69aca426" containerName="oc"
Mar 10 22:59:17 crc kubenswrapper[4919]: I0310 22:59:17.240094 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="365d2b3f-3786-49df-937f-b35e69aca426" containerName="oc"
Mar 10 22:59:17 crc kubenswrapper[4919]: I0310 22:59:17.240282 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="365d2b3f-3786-49df-937f-b35e69aca426" containerName="oc"
Mar 10 22:59:17 crc kubenswrapper[4919]: I0310 22:59:17.241345 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-shxst"
Mar 10 22:59:17 crc kubenswrapper[4919]: I0310 22:59:17.262285 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-shxst"]
Mar 10 22:59:17 crc kubenswrapper[4919]: I0310 22:59:17.399561 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3cd4076-5a60-4652-91b0-ebc4451494e3-catalog-content\") pod \"certified-operators-shxst\" (UID: \"a3cd4076-5a60-4652-91b0-ebc4451494e3\") " pod="openshift-marketplace/certified-operators-shxst"
Mar 10 22:59:17 crc kubenswrapper[4919]: I0310 22:59:17.399702 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-64m6j\" (UniqueName: \"kubernetes.io/projected/a3cd4076-5a60-4652-91b0-ebc4451494e3-kube-api-access-64m6j\") pod \"certified-operators-shxst\" (UID: \"a3cd4076-5a60-4652-91b0-ebc4451494e3\") " pod="openshift-marketplace/certified-operators-shxst"
Mar 10 22:59:17 crc kubenswrapper[4919]: I0310 22:59:17.399743 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3cd4076-5a60-4652-91b0-ebc4451494e3-utilities\") pod \"certified-operators-shxst\" (UID: \"a3cd4076-5a60-4652-91b0-ebc4451494e3\") " pod="openshift-marketplace/certified-operators-shxst"
Mar 10 22:59:17 crc kubenswrapper[4919]: I0310 22:59:17.501176 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-64m6j\" (UniqueName: \"kubernetes.io/projected/a3cd4076-5a60-4652-91b0-ebc4451494e3-kube-api-access-64m6j\") pod \"certified-operators-shxst\" (UID: \"a3cd4076-5a60-4652-91b0-ebc4451494e3\") " pod="openshift-marketplace/certified-operators-shxst"
Mar 10 22:59:17 crc kubenswrapper[4919]: I0310 22:59:17.501246 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3cd4076-5a60-4652-91b0-ebc4451494e3-utilities\") pod \"certified-operators-shxst\" (UID: \"a3cd4076-5a60-4652-91b0-ebc4451494e3\") " pod="openshift-marketplace/certified-operators-shxst"
Mar 10 22:59:17 crc kubenswrapper[4919]: I0310 22:59:17.501290 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3cd4076-5a60-4652-91b0-ebc4451494e3-catalog-content\") pod \"certified-operators-shxst\" (UID: \"a3cd4076-5a60-4652-91b0-ebc4451494e3\") " pod="openshift-marketplace/certified-operators-shxst"
Mar 10 22:59:17 crc kubenswrapper[4919]: I0310 22:59:17.501891 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3cd4076-5a60-4652-91b0-ebc4451494e3-catalog-content\") pod \"certified-operators-shxst\" (UID: \"a3cd4076-5a60-4652-91b0-ebc4451494e3\") " pod="openshift-marketplace/certified-operators-shxst"
Mar 10 22:59:17 crc kubenswrapper[4919]: I0310 22:59:17.502205 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3cd4076-5a60-4652-91b0-ebc4451494e3-utilities\") pod \"certified-operators-shxst\" (UID: \"a3cd4076-5a60-4652-91b0-ebc4451494e3\") " pod="openshift-marketplace/certified-operators-shxst"
Mar 10 22:59:17 crc kubenswrapper[4919]: I0310 22:59:17.522480 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-64m6j\" (UniqueName: \"kubernetes.io/projected/a3cd4076-5a60-4652-91b0-ebc4451494e3-kube-api-access-64m6j\") pod \"certified-operators-shxst\" (UID: \"a3cd4076-5a60-4652-91b0-ebc4451494e3\") " pod="openshift-marketplace/certified-operators-shxst"
Mar 10 22:59:17 crc kubenswrapper[4919]: I0310 22:59:17.564470 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-shxst"
Mar 10 22:59:18 crc kubenswrapper[4919]: I0310 22:59:18.059580 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-shxst"]
Mar 10 22:59:18 crc kubenswrapper[4919]: I0310 22:59:18.848490 4919 generic.go:334] "Generic (PLEG): container finished" podID="a3cd4076-5a60-4652-91b0-ebc4451494e3" containerID="bde82462d9f09f0c91dfbfbf691c89b055d18de9a39fdbf147064efdf601a6ee" exitCode=0
Mar 10 22:59:18 crc kubenswrapper[4919]: I0310 22:59:18.848544 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-shxst" event={"ID":"a3cd4076-5a60-4652-91b0-ebc4451494e3","Type":"ContainerDied","Data":"bde82462d9f09f0c91dfbfbf691c89b055d18de9a39fdbf147064efdf601a6ee"}
Mar 10 22:59:18 crc kubenswrapper[4919]: I0310 22:59:18.848790 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-shxst" event={"ID":"a3cd4076-5a60-4652-91b0-ebc4451494e3","Type":"ContainerStarted","Data":"a4c6c25392e7f198bd2f7c38808794d269049cd246a4cb4d99456d9334dfb6d5"}
Mar 10 22:59:20 crc kubenswrapper[4919]: I0310 22:59:20.864969 4919 generic.go:334] "Generic (PLEG): container finished" podID="a3cd4076-5a60-4652-91b0-ebc4451494e3" containerID="d3d30df339eeaee0685f608b393d127abbcec7de47f1256e7702398383600e86" exitCode=0
Mar 10 22:59:20 crc kubenswrapper[4919]: I0310 22:59:20.865157 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-shxst" event={"ID":"a3cd4076-5a60-4652-91b0-ebc4451494e3","Type":"ContainerDied","Data":"d3d30df339eeaee0685f608b393d127abbcec7de47f1256e7702398383600e86"}
Mar 10 22:59:22 crc kubenswrapper[4919]: I0310 22:59:22.881905 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-shxst" event={"ID":"a3cd4076-5a60-4652-91b0-ebc4451494e3","Type":"ContainerStarted","Data":"6a1f8f065338485758fd632db00bcf065542e111cdb1c42c4813480a01eec2b2"}
Mar 10 22:59:22 crc kubenswrapper[4919]: I0310 22:59:22.905655 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-shxst" podStartSLOduration=3.049492574 podStartE2EDuration="5.905631059s" podCreationTimestamp="2026-03-10 22:59:17 +0000 UTC" firstStartedPulling="2026-03-10 22:59:18.850913755 +0000 UTC m=+4146.092794363" lastFinishedPulling="2026-03-10 22:59:21.70705224 +0000 UTC m=+4148.948932848" observedRunningTime="2026-03-10 22:59:22.903228003 +0000 UTC m=+4150.145108611" watchObservedRunningTime="2026-03-10 22:59:22.905631059 +0000 UTC m=+4150.147511687"
Mar 10 22:59:27 crc kubenswrapper[4919]: I0310 22:59:27.565499 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-shxst"
Mar 10 22:59:27 crc kubenswrapper[4919]: I0310 22:59:27.566826 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-shxst"
Mar 10 22:59:27 crc kubenswrapper[4919]: I0310 22:59:27.620705 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-shxst"
Mar 10 22:59:27 crc kubenswrapper[4919]: I0310 22:59:27.955656 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-shxst"
Mar 10 22:59:27 crc kubenswrapper[4919]: I0310 22:59:27.998984 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-shxst"]
Mar 10 22:59:29 crc kubenswrapper[4919]: I0310 22:59:29.934447 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-shxst" podUID="a3cd4076-5a60-4652-91b0-ebc4451494e3" containerName="registry-server" containerID="cri-o://6a1f8f065338485758fd632db00bcf065542e111cdb1c42c4813480a01eec2b2" gracePeriod=2
Mar 10 22:59:30 crc kubenswrapper[4919]: I0310 22:59:30.349277 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-shxst"
Mar 10 22:59:30 crc kubenswrapper[4919]: I0310 22:59:30.479422 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-64m6j\" (UniqueName: \"kubernetes.io/projected/a3cd4076-5a60-4652-91b0-ebc4451494e3-kube-api-access-64m6j\") pod \"a3cd4076-5a60-4652-91b0-ebc4451494e3\" (UID: \"a3cd4076-5a60-4652-91b0-ebc4451494e3\") "
Mar 10 22:59:30 crc kubenswrapper[4919]: I0310 22:59:30.480517 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3cd4076-5a60-4652-91b0-ebc4451494e3-catalog-content\") pod \"a3cd4076-5a60-4652-91b0-ebc4451494e3\" (UID: \"a3cd4076-5a60-4652-91b0-ebc4451494e3\") "
Mar 10 22:59:30 crc kubenswrapper[4919]: I0310 22:59:30.480670 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3cd4076-5a60-4652-91b0-ebc4451494e3-utilities\") pod \"a3cd4076-5a60-4652-91b0-ebc4451494e3\" (UID: \"a3cd4076-5a60-4652-91b0-ebc4451494e3\") "
Mar 10 22:59:30 crc kubenswrapper[4919]: I0310 22:59:30.481523 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a3cd4076-5a60-4652-91b0-ebc4451494e3-utilities" (OuterVolumeSpecName: "utilities") pod "a3cd4076-5a60-4652-91b0-ebc4451494e3" (UID: "a3cd4076-5a60-4652-91b0-ebc4451494e3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 22:59:30 crc kubenswrapper[4919]: I0310 22:59:30.486131 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3cd4076-5a60-4652-91b0-ebc4451494e3-kube-api-access-64m6j" (OuterVolumeSpecName: "kube-api-access-64m6j") pod "a3cd4076-5a60-4652-91b0-ebc4451494e3" (UID: "a3cd4076-5a60-4652-91b0-ebc4451494e3"). InnerVolumeSpecName "kube-api-access-64m6j". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 22:59:30 crc kubenswrapper[4919]: I0310 22:59:30.553733 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a3cd4076-5a60-4652-91b0-ebc4451494e3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a3cd4076-5a60-4652-91b0-ebc4451494e3" (UID: "a3cd4076-5a60-4652-91b0-ebc4451494e3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 22:59:30 crc kubenswrapper[4919]: I0310 22:59:30.582194 4919 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3cd4076-5a60-4652-91b0-ebc4451494e3-utilities\") on node \"crc\" DevicePath \"\""
Mar 10 22:59:30 crc kubenswrapper[4919]: I0310 22:59:30.582222 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-64m6j\" (UniqueName: \"kubernetes.io/projected/a3cd4076-5a60-4652-91b0-ebc4451494e3-kube-api-access-64m6j\") on node \"crc\" DevicePath \"\""
Mar 10 22:59:30 crc kubenswrapper[4919]: I0310 22:59:30.582234 4919 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3cd4076-5a60-4652-91b0-ebc4451494e3-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 10 22:59:30 crc kubenswrapper[4919]: I0310 22:59:30.944240 4919 generic.go:334] "Generic (PLEG): container finished" podID="a3cd4076-5a60-4652-91b0-ebc4451494e3" containerID="6a1f8f065338485758fd632db00bcf065542e111cdb1c42c4813480a01eec2b2" exitCode=0
Mar 10 22:59:30 crc kubenswrapper[4919]: I0310 22:59:30.944280 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-shxst"
Mar 10 22:59:30 crc kubenswrapper[4919]: I0310 22:59:30.944299 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-shxst" event={"ID":"a3cd4076-5a60-4652-91b0-ebc4451494e3","Type":"ContainerDied","Data":"6a1f8f065338485758fd632db00bcf065542e111cdb1c42c4813480a01eec2b2"}
Mar 10 22:59:30 crc kubenswrapper[4919]: I0310 22:59:30.944339 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-shxst" event={"ID":"a3cd4076-5a60-4652-91b0-ebc4451494e3","Type":"ContainerDied","Data":"a4c6c25392e7f198bd2f7c38808794d269049cd246a4cb4d99456d9334dfb6d5"}
Mar 10 22:59:30 crc kubenswrapper[4919]: I0310 22:59:30.944358 4919 scope.go:117] "RemoveContainer" containerID="6a1f8f065338485758fd632db00bcf065542e111cdb1c42c4813480a01eec2b2"
Mar 10 22:59:30 crc kubenswrapper[4919]: I0310 22:59:30.962069 4919 scope.go:117] "RemoveContainer" containerID="d3d30df339eeaee0685f608b393d127abbcec7de47f1256e7702398383600e86"
Mar 10 22:59:30 crc kubenswrapper[4919]: I0310 22:59:30.980972 4919 scope.go:117] "RemoveContainer" containerID="bde82462d9f09f0c91dfbfbf691c89b055d18de9a39fdbf147064efdf601a6ee"
Mar 10 22:59:30 crc kubenswrapper[4919]: I0310 22:59:30.984230 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-shxst"]
Mar 10 22:59:30 crc kubenswrapper[4919]: I0310 22:59:30.992737 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-shxst"]
Mar 10 22:59:31 crc kubenswrapper[4919]: I0310 22:59:31.009291 4919 scope.go:117] "RemoveContainer" containerID="6a1f8f065338485758fd632db00bcf065542e111cdb1c42c4813480a01eec2b2"
Mar 10
22:59:31 crc kubenswrapper[4919]: E0310 22:59:31.009732 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a1f8f065338485758fd632db00bcf065542e111cdb1c42c4813480a01eec2b2\": container with ID starting with 6a1f8f065338485758fd632db00bcf065542e111cdb1c42c4813480a01eec2b2 not found: ID does not exist" containerID="6a1f8f065338485758fd632db00bcf065542e111cdb1c42c4813480a01eec2b2" Mar 10 22:59:31 crc kubenswrapper[4919]: I0310 22:59:31.009776 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a1f8f065338485758fd632db00bcf065542e111cdb1c42c4813480a01eec2b2"} err="failed to get container status \"6a1f8f065338485758fd632db00bcf065542e111cdb1c42c4813480a01eec2b2\": rpc error: code = NotFound desc = could not find container \"6a1f8f065338485758fd632db00bcf065542e111cdb1c42c4813480a01eec2b2\": container with ID starting with 6a1f8f065338485758fd632db00bcf065542e111cdb1c42c4813480a01eec2b2 not found: ID does not exist" Mar 10 22:59:31 crc kubenswrapper[4919]: I0310 22:59:31.009801 4919 scope.go:117] "RemoveContainer" containerID="d3d30df339eeaee0685f608b393d127abbcec7de47f1256e7702398383600e86" Mar 10 22:59:31 crc kubenswrapper[4919]: E0310 22:59:31.010187 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d3d30df339eeaee0685f608b393d127abbcec7de47f1256e7702398383600e86\": container with ID starting with d3d30df339eeaee0685f608b393d127abbcec7de47f1256e7702398383600e86 not found: ID does not exist" containerID="d3d30df339eeaee0685f608b393d127abbcec7de47f1256e7702398383600e86" Mar 10 22:59:31 crc kubenswrapper[4919]: I0310 22:59:31.010301 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3d30df339eeaee0685f608b393d127abbcec7de47f1256e7702398383600e86"} err="failed to get container status 
\"d3d30df339eeaee0685f608b393d127abbcec7de47f1256e7702398383600e86\": rpc error: code = NotFound desc = could not find container \"d3d30df339eeaee0685f608b393d127abbcec7de47f1256e7702398383600e86\": container with ID starting with d3d30df339eeaee0685f608b393d127abbcec7de47f1256e7702398383600e86 not found: ID does not exist" Mar 10 22:59:31 crc kubenswrapper[4919]: I0310 22:59:31.010409 4919 scope.go:117] "RemoveContainer" containerID="bde82462d9f09f0c91dfbfbf691c89b055d18de9a39fdbf147064efdf601a6ee" Mar 10 22:59:31 crc kubenswrapper[4919]: E0310 22:59:31.010782 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bde82462d9f09f0c91dfbfbf691c89b055d18de9a39fdbf147064efdf601a6ee\": container with ID starting with bde82462d9f09f0c91dfbfbf691c89b055d18de9a39fdbf147064efdf601a6ee not found: ID does not exist" containerID="bde82462d9f09f0c91dfbfbf691c89b055d18de9a39fdbf147064efdf601a6ee" Mar 10 22:59:31 crc kubenswrapper[4919]: I0310 22:59:31.010881 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bde82462d9f09f0c91dfbfbf691c89b055d18de9a39fdbf147064efdf601a6ee"} err="failed to get container status \"bde82462d9f09f0c91dfbfbf691c89b055d18de9a39fdbf147064efdf601a6ee\": rpc error: code = NotFound desc = could not find container \"bde82462d9f09f0c91dfbfbf691c89b055d18de9a39fdbf147064efdf601a6ee\": container with ID starting with bde82462d9f09f0c91dfbfbf691c89b055d18de9a39fdbf147064efdf601a6ee not found: ID does not exist" Mar 10 22:59:31 crc kubenswrapper[4919]: I0310 22:59:31.489025 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3cd4076-5a60-4652-91b0-ebc4451494e3" path="/var/lib/kubelet/pods/a3cd4076-5a60-4652-91b0-ebc4451494e3/volumes" Mar 10 23:00:00 crc kubenswrapper[4919]: I0310 23:00:00.146265 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553060-jbjdx"] Mar 10 23:00:00 
crc kubenswrapper[4919]: E0310 23:00:00.147669 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3cd4076-5a60-4652-91b0-ebc4451494e3" containerName="extract-content" Mar 10 23:00:00 crc kubenswrapper[4919]: I0310 23:00:00.147694 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3cd4076-5a60-4652-91b0-ebc4451494e3" containerName="extract-content" Mar 10 23:00:00 crc kubenswrapper[4919]: E0310 23:00:00.147770 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3cd4076-5a60-4652-91b0-ebc4451494e3" containerName="extract-utilities" Mar 10 23:00:00 crc kubenswrapper[4919]: I0310 23:00:00.147786 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3cd4076-5a60-4652-91b0-ebc4451494e3" containerName="extract-utilities" Mar 10 23:00:00 crc kubenswrapper[4919]: E0310 23:00:00.147841 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3cd4076-5a60-4652-91b0-ebc4451494e3" containerName="registry-server" Mar 10 23:00:00 crc kubenswrapper[4919]: I0310 23:00:00.147855 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3cd4076-5a60-4652-91b0-ebc4451494e3" containerName="registry-server" Mar 10 23:00:00 crc kubenswrapper[4919]: I0310 23:00:00.148219 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3cd4076-5a60-4652-91b0-ebc4451494e3" containerName="registry-server" Mar 10 23:00:00 crc kubenswrapper[4919]: I0310 23:00:00.149274 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553060-jbjdx" Mar 10 23:00:00 crc kubenswrapper[4919]: I0310 23:00:00.149568 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29553060-x2l67"] Mar 10 23:00:00 crc kubenswrapper[4919]: I0310 23:00:00.151584 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29553060-x2l67" Mar 10 23:00:00 crc kubenswrapper[4919]: I0310 23:00:00.152108 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 23:00:00 crc kubenswrapper[4919]: I0310 23:00:00.152816 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wbvtv" Mar 10 23:00:00 crc kubenswrapper[4919]: I0310 23:00:00.154249 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 10 23:00:00 crc kubenswrapper[4919]: I0310 23:00:00.158080 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 10 23:00:00 crc kubenswrapper[4919]: I0310 23:00:00.158358 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 23:00:00 crc kubenswrapper[4919]: I0310 23:00:00.158553 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553060-jbjdx"] Mar 10 23:00:00 crc kubenswrapper[4919]: I0310 23:00:00.170618 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29553060-x2l67"] Mar 10 23:00:00 crc kubenswrapper[4919]: I0310 23:00:00.206253 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c31be40f-53ab-4748-b8df-3aa93593e3b5-config-volume\") pod \"collect-profiles-29553060-x2l67\" (UID: \"c31be40f-53ab-4748-b8df-3aa93593e3b5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553060-x2l67" Mar 10 23:00:00 crc kubenswrapper[4919]: I0310 23:00:00.206305 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c31be40f-53ab-4748-b8df-3aa93593e3b5-secret-volume\") pod \"collect-profiles-29553060-x2l67\" (UID: \"c31be40f-53ab-4748-b8df-3aa93593e3b5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553060-x2l67" Mar 10 23:00:00 crc kubenswrapper[4919]: I0310 23:00:00.206331 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c9vc4\" (UniqueName: \"kubernetes.io/projected/c31be40f-53ab-4748-b8df-3aa93593e3b5-kube-api-access-c9vc4\") pod \"collect-profiles-29553060-x2l67\" (UID: \"c31be40f-53ab-4748-b8df-3aa93593e3b5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553060-x2l67" Mar 10 23:00:00 crc kubenswrapper[4919]: I0310 23:00:00.206354 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qb7ll\" (UniqueName: \"kubernetes.io/projected/53394bb5-86c2-4638-9926-0f1418a4e741-kube-api-access-qb7ll\") pod \"auto-csr-approver-29553060-jbjdx\" (UID: \"53394bb5-86c2-4638-9926-0f1418a4e741\") " pod="openshift-infra/auto-csr-approver-29553060-jbjdx" Mar 10 23:00:00 crc kubenswrapper[4919]: I0310 23:00:00.308050 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c9vc4\" (UniqueName: \"kubernetes.io/projected/c31be40f-53ab-4748-b8df-3aa93593e3b5-kube-api-access-c9vc4\") pod \"collect-profiles-29553060-x2l67\" (UID: \"c31be40f-53ab-4748-b8df-3aa93593e3b5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553060-x2l67" Mar 10 23:00:00 crc kubenswrapper[4919]: I0310 23:00:00.308131 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qb7ll\" (UniqueName: \"kubernetes.io/projected/53394bb5-86c2-4638-9926-0f1418a4e741-kube-api-access-qb7ll\") pod \"auto-csr-approver-29553060-jbjdx\" (UID: \"53394bb5-86c2-4638-9926-0f1418a4e741\") " 
pod="openshift-infra/auto-csr-approver-29553060-jbjdx" Mar 10 23:00:00 crc kubenswrapper[4919]: I0310 23:00:00.308298 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c31be40f-53ab-4748-b8df-3aa93593e3b5-config-volume\") pod \"collect-profiles-29553060-x2l67\" (UID: \"c31be40f-53ab-4748-b8df-3aa93593e3b5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553060-x2l67" Mar 10 23:00:00 crc kubenswrapper[4919]: I0310 23:00:00.308372 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c31be40f-53ab-4748-b8df-3aa93593e3b5-secret-volume\") pod \"collect-profiles-29553060-x2l67\" (UID: \"c31be40f-53ab-4748-b8df-3aa93593e3b5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553060-x2l67" Mar 10 23:00:00 crc kubenswrapper[4919]: I0310 23:00:00.309377 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c31be40f-53ab-4748-b8df-3aa93593e3b5-config-volume\") pod \"collect-profiles-29553060-x2l67\" (UID: \"c31be40f-53ab-4748-b8df-3aa93593e3b5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553060-x2l67" Mar 10 23:00:00 crc kubenswrapper[4919]: I0310 23:00:00.314705 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c31be40f-53ab-4748-b8df-3aa93593e3b5-secret-volume\") pod \"collect-profiles-29553060-x2l67\" (UID: \"c31be40f-53ab-4748-b8df-3aa93593e3b5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553060-x2l67" Mar 10 23:00:00 crc kubenswrapper[4919]: I0310 23:00:00.324157 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qb7ll\" (UniqueName: \"kubernetes.io/projected/53394bb5-86c2-4638-9926-0f1418a4e741-kube-api-access-qb7ll\") pod 
\"auto-csr-approver-29553060-jbjdx\" (UID: \"53394bb5-86c2-4638-9926-0f1418a4e741\") " pod="openshift-infra/auto-csr-approver-29553060-jbjdx" Mar 10 23:00:00 crc kubenswrapper[4919]: I0310 23:00:00.326926 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c9vc4\" (UniqueName: \"kubernetes.io/projected/c31be40f-53ab-4748-b8df-3aa93593e3b5-kube-api-access-c9vc4\") pod \"collect-profiles-29553060-x2l67\" (UID: \"c31be40f-53ab-4748-b8df-3aa93593e3b5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553060-x2l67" Mar 10 23:00:00 crc kubenswrapper[4919]: I0310 23:00:00.482098 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553060-jbjdx" Mar 10 23:00:00 crc kubenswrapper[4919]: I0310 23:00:00.506178 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29553060-x2l67" Mar 10 23:00:00 crc kubenswrapper[4919]: I0310 23:00:00.903283 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553060-jbjdx"] Mar 10 23:00:00 crc kubenswrapper[4919]: I0310 23:00:00.958121 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29553060-x2l67"] Mar 10 23:00:00 crc kubenswrapper[4919]: W0310 23:00:00.962369 4919 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc31be40f_53ab_4748_b8df_3aa93593e3b5.slice/crio-0b97efb6904a30012dda49568321d6c99a9ff7dbbb54993684df5f8758431b7e WatchSource:0}: Error finding container 0b97efb6904a30012dda49568321d6c99a9ff7dbbb54993684df5f8758431b7e: Status 404 returned error can't find the container with id 0b97efb6904a30012dda49568321d6c99a9ff7dbbb54993684df5f8758431b7e Mar 10 23:00:01 crc kubenswrapper[4919]: I0310 23:00:01.182595 4919 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29553060-x2l67" event={"ID":"c31be40f-53ab-4748-b8df-3aa93593e3b5","Type":"ContainerStarted","Data":"99a8804d05f7fc8e266f03dedff28752026dcc94407b9d1241858632c5581e1c"} Mar 10 23:00:01 crc kubenswrapper[4919]: I0310 23:00:01.182636 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29553060-x2l67" event={"ID":"c31be40f-53ab-4748-b8df-3aa93593e3b5","Type":"ContainerStarted","Data":"0b97efb6904a30012dda49568321d6c99a9ff7dbbb54993684df5f8758431b7e"} Mar 10 23:00:01 crc kubenswrapper[4919]: I0310 23:00:01.185941 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553060-jbjdx" event={"ID":"53394bb5-86c2-4638-9926-0f1418a4e741","Type":"ContainerStarted","Data":"5b67e79a9a53bf4180d743932f3af9c20f29afd76ccdb010d81fbc5a24bedb51"} Mar 10 23:00:01 crc kubenswrapper[4919]: I0310 23:00:01.202741 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29553060-x2l67" podStartSLOduration=1.202720378 podStartE2EDuration="1.202720378s" podCreationTimestamp="2026-03-10 23:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 23:00:01.202675946 +0000 UTC m=+4188.444556564" watchObservedRunningTime="2026-03-10 23:00:01.202720378 +0000 UTC m=+4188.444600996" Mar 10 23:00:02 crc kubenswrapper[4919]: I0310 23:00:02.195151 4919 generic.go:334] "Generic (PLEG): container finished" podID="c31be40f-53ab-4748-b8df-3aa93593e3b5" containerID="99a8804d05f7fc8e266f03dedff28752026dcc94407b9d1241858632c5581e1c" exitCode=0 Mar 10 23:00:02 crc kubenswrapper[4919]: I0310 23:00:02.195200 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29553060-x2l67" 
event={"ID":"c31be40f-53ab-4748-b8df-3aa93593e3b5","Type":"ContainerDied","Data":"99a8804d05f7fc8e266f03dedff28752026dcc94407b9d1241858632c5581e1c"} Mar 10 23:00:03 crc kubenswrapper[4919]: I0310 23:00:03.539316 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29553060-x2l67" Mar 10 23:00:03 crc kubenswrapper[4919]: I0310 23:00:03.653736 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c9vc4\" (UniqueName: \"kubernetes.io/projected/c31be40f-53ab-4748-b8df-3aa93593e3b5-kube-api-access-c9vc4\") pod \"c31be40f-53ab-4748-b8df-3aa93593e3b5\" (UID: \"c31be40f-53ab-4748-b8df-3aa93593e3b5\") " Mar 10 23:00:03 crc kubenswrapper[4919]: I0310 23:00:03.653888 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c31be40f-53ab-4748-b8df-3aa93593e3b5-config-volume\") pod \"c31be40f-53ab-4748-b8df-3aa93593e3b5\" (UID: \"c31be40f-53ab-4748-b8df-3aa93593e3b5\") " Mar 10 23:00:03 crc kubenswrapper[4919]: I0310 23:00:03.653987 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c31be40f-53ab-4748-b8df-3aa93593e3b5-secret-volume\") pod \"c31be40f-53ab-4748-b8df-3aa93593e3b5\" (UID: \"c31be40f-53ab-4748-b8df-3aa93593e3b5\") " Mar 10 23:00:03 crc kubenswrapper[4919]: I0310 23:00:03.654745 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c31be40f-53ab-4748-b8df-3aa93593e3b5-config-volume" (OuterVolumeSpecName: "config-volume") pod "c31be40f-53ab-4748-b8df-3aa93593e3b5" (UID: "c31be40f-53ab-4748-b8df-3aa93593e3b5"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 23:00:03 crc kubenswrapper[4919]: I0310 23:00:03.658812 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c31be40f-53ab-4748-b8df-3aa93593e3b5-kube-api-access-c9vc4" (OuterVolumeSpecName: "kube-api-access-c9vc4") pod "c31be40f-53ab-4748-b8df-3aa93593e3b5" (UID: "c31be40f-53ab-4748-b8df-3aa93593e3b5"). InnerVolumeSpecName "kube-api-access-c9vc4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 23:00:03 crc kubenswrapper[4919]: I0310 23:00:03.659364 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c31be40f-53ab-4748-b8df-3aa93593e3b5-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "c31be40f-53ab-4748-b8df-3aa93593e3b5" (UID: "c31be40f-53ab-4748-b8df-3aa93593e3b5"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 23:00:03 crc kubenswrapper[4919]: I0310 23:00:03.756206 4919 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c31be40f-53ab-4748-b8df-3aa93593e3b5-config-volume\") on node \"crc\" DevicePath \"\"" Mar 10 23:00:03 crc kubenswrapper[4919]: I0310 23:00:03.756243 4919 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c31be40f-53ab-4748-b8df-3aa93593e3b5-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 10 23:00:03 crc kubenswrapper[4919]: I0310 23:00:03.756255 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c9vc4\" (UniqueName: \"kubernetes.io/projected/c31be40f-53ab-4748-b8df-3aa93593e3b5-kube-api-access-c9vc4\") on node \"crc\" DevicePath \"\"" Mar 10 23:00:04 crc kubenswrapper[4919]: I0310 23:00:04.235429 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29553060-x2l67" 
event={"ID":"c31be40f-53ab-4748-b8df-3aa93593e3b5","Type":"ContainerDied","Data":"0b97efb6904a30012dda49568321d6c99a9ff7dbbb54993684df5f8758431b7e"} Mar 10 23:00:04 crc kubenswrapper[4919]: I0310 23:00:04.235488 4919 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0b97efb6904a30012dda49568321d6c99a9ff7dbbb54993684df5f8758431b7e" Mar 10 23:00:04 crc kubenswrapper[4919]: I0310 23:00:04.235503 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29553060-x2l67" Mar 10 23:00:04 crc kubenswrapper[4919]: I0310 23:00:04.297636 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29553015-mmvrg"] Mar 10 23:00:04 crc kubenswrapper[4919]: I0310 23:00:04.305121 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29553015-mmvrg"] Mar 10 23:00:05 crc kubenswrapper[4919]: I0310 23:00:05.245792 4919 generic.go:334] "Generic (PLEG): container finished" podID="53394bb5-86c2-4638-9926-0f1418a4e741" containerID="b220d44277abe0c9192aa1f257399bda3df2fe47eed1000ca768116e09c8cfcd" exitCode=0 Mar 10 23:00:05 crc kubenswrapper[4919]: I0310 23:00:05.245870 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553060-jbjdx" event={"ID":"53394bb5-86c2-4638-9926-0f1418a4e741","Type":"ContainerDied","Data":"b220d44277abe0c9192aa1f257399bda3df2fe47eed1000ca768116e09c8cfcd"} Mar 10 23:00:05 crc kubenswrapper[4919]: I0310 23:00:05.488281 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7179315d-d730-4a2f-8ed2-7b06ff2fd2ff" path="/var/lib/kubelet/pods/7179315d-d730-4a2f-8ed2-7b06ff2fd2ff/volumes" Mar 10 23:00:06 crc kubenswrapper[4919]: I0310 23:00:06.539627 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553060-jbjdx" Mar 10 23:00:06 crc kubenswrapper[4919]: I0310 23:00:06.698150 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qb7ll\" (UniqueName: \"kubernetes.io/projected/53394bb5-86c2-4638-9926-0f1418a4e741-kube-api-access-qb7ll\") pod \"53394bb5-86c2-4638-9926-0f1418a4e741\" (UID: \"53394bb5-86c2-4638-9926-0f1418a4e741\") " Mar 10 23:00:06 crc kubenswrapper[4919]: I0310 23:00:06.703189 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53394bb5-86c2-4638-9926-0f1418a4e741-kube-api-access-qb7ll" (OuterVolumeSpecName: "kube-api-access-qb7ll") pod "53394bb5-86c2-4638-9926-0f1418a4e741" (UID: "53394bb5-86c2-4638-9926-0f1418a4e741"). InnerVolumeSpecName "kube-api-access-qb7ll". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 23:00:06 crc kubenswrapper[4919]: I0310 23:00:06.799634 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qb7ll\" (UniqueName: \"kubernetes.io/projected/53394bb5-86c2-4638-9926-0f1418a4e741-kube-api-access-qb7ll\") on node \"crc\" DevicePath \"\"" Mar 10 23:00:07 crc kubenswrapper[4919]: I0310 23:00:07.261562 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553060-jbjdx" event={"ID":"53394bb5-86c2-4638-9926-0f1418a4e741","Type":"ContainerDied","Data":"5b67e79a9a53bf4180d743932f3af9c20f29afd76ccdb010d81fbc5a24bedb51"} Mar 10 23:00:07 crc kubenswrapper[4919]: I0310 23:00:07.261606 4919 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5b67e79a9a53bf4180d743932f3af9c20f29afd76ccdb010d81fbc5a24bedb51" Mar 10 23:00:07 crc kubenswrapper[4919]: I0310 23:00:07.261614 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553060-jbjdx" Mar 10 23:00:07 crc kubenswrapper[4919]: I0310 23:00:07.597543 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553054-wlswh"] Mar 10 23:00:07 crc kubenswrapper[4919]: I0310 23:00:07.602095 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553054-wlswh"] Mar 10 23:00:09 crc kubenswrapper[4919]: I0310 23:00:09.491453 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="464ad8aa-5fb0-4bc1-b1a6-4359ebefdd94" path="/var/lib/kubelet/pods/464ad8aa-5fb0-4bc1-b1a6-4359ebefdd94/volumes" Mar 10 23:00:29 crc kubenswrapper[4919]: I0310 23:00:29.176035 4919 patch_prober.go:28] interesting pod/machine-config-daemon-z7v4t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 23:00:29 crc kubenswrapper[4919]: I0310 23:00:29.176714 4919 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 23:00:45 crc kubenswrapper[4919]: I0310 23:00:45.690671 4919 scope.go:117] "RemoveContainer" containerID="a101c51ebe071dbd405d100b6da60a94012163e27c4b1d8bedd2ce8853daf0f1" Mar 10 23:00:45 crc kubenswrapper[4919]: I0310 23:00:45.757180 4919 scope.go:117] "RemoveContainer" containerID="3b8bb041798f51d982f4bcd80e3c1947b5868b1f2dc277d0ff8d6ecfa13d1c9c" Mar 10 23:00:50 crc kubenswrapper[4919]: I0310 23:00:50.117162 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-dwd8l"] Mar 10 23:00:50 crc kubenswrapper[4919]: E0310 
23:00:50.117991 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53394bb5-86c2-4638-9926-0f1418a4e741" containerName="oc"
Mar 10 23:00:50 crc kubenswrapper[4919]: I0310 23:00:50.118002 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="53394bb5-86c2-4638-9926-0f1418a4e741" containerName="oc"
Mar 10 23:00:50 crc kubenswrapper[4919]: E0310 23:00:50.118022 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c31be40f-53ab-4748-b8df-3aa93593e3b5" containerName="collect-profiles"
Mar 10 23:00:50 crc kubenswrapper[4919]: I0310 23:00:50.118029 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="c31be40f-53ab-4748-b8df-3aa93593e3b5" containerName="collect-profiles"
Mar 10 23:00:50 crc kubenswrapper[4919]: I0310 23:00:50.118156 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="53394bb5-86c2-4638-9926-0f1418a4e741" containerName="oc"
Mar 10 23:00:50 crc kubenswrapper[4919]: I0310 23:00:50.118178 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="c31be40f-53ab-4748-b8df-3aa93593e3b5" containerName="collect-profiles"
Mar 10 23:00:50 crc kubenswrapper[4919]: I0310 23:00:50.119071 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-dwd8l"
Mar 10 23:00:50 crc kubenswrapper[4919]: I0310 23:00:50.135196 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-dwd8l"]
Mar 10 23:00:50 crc kubenswrapper[4919]: I0310 23:00:50.172288 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/123e1079-cdc1-44f9-9174-853c14d0f9bf-utilities\") pod \"community-operators-dwd8l\" (UID: \"123e1079-cdc1-44f9-9174-853c14d0f9bf\") " pod="openshift-marketplace/community-operators-dwd8l"
Mar 10 23:00:50 crc kubenswrapper[4919]: I0310 23:00:50.172589 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kf89q\" (UniqueName: \"kubernetes.io/projected/123e1079-cdc1-44f9-9174-853c14d0f9bf-kube-api-access-kf89q\") pod \"community-operators-dwd8l\" (UID: \"123e1079-cdc1-44f9-9174-853c14d0f9bf\") " pod="openshift-marketplace/community-operators-dwd8l"
Mar 10 23:00:50 crc kubenswrapper[4919]: I0310 23:00:50.172746 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/123e1079-cdc1-44f9-9174-853c14d0f9bf-catalog-content\") pod \"community-operators-dwd8l\" (UID: \"123e1079-cdc1-44f9-9174-853c14d0f9bf\") " pod="openshift-marketplace/community-operators-dwd8l"
Mar 10 23:00:50 crc kubenswrapper[4919]: I0310 23:00:50.273674 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/123e1079-cdc1-44f9-9174-853c14d0f9bf-catalog-content\") pod \"community-operators-dwd8l\" (UID: \"123e1079-cdc1-44f9-9174-853c14d0f9bf\") " pod="openshift-marketplace/community-operators-dwd8l"
Mar 10 23:00:50 crc kubenswrapper[4919]: I0310 23:00:50.273781 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/123e1079-cdc1-44f9-9174-853c14d0f9bf-utilities\") pod \"community-operators-dwd8l\" (UID: \"123e1079-cdc1-44f9-9174-853c14d0f9bf\") " pod="openshift-marketplace/community-operators-dwd8l"
Mar 10 23:00:50 crc kubenswrapper[4919]: I0310 23:00:50.273802 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kf89q\" (UniqueName: \"kubernetes.io/projected/123e1079-cdc1-44f9-9174-853c14d0f9bf-kube-api-access-kf89q\") pod \"community-operators-dwd8l\" (UID: \"123e1079-cdc1-44f9-9174-853c14d0f9bf\") " pod="openshift-marketplace/community-operators-dwd8l"
Mar 10 23:00:50 crc kubenswrapper[4919]: I0310 23:00:50.274289 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/123e1079-cdc1-44f9-9174-853c14d0f9bf-catalog-content\") pod \"community-operators-dwd8l\" (UID: \"123e1079-cdc1-44f9-9174-853c14d0f9bf\") " pod="openshift-marketplace/community-operators-dwd8l"
Mar 10 23:00:50 crc kubenswrapper[4919]: I0310 23:00:50.274311 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/123e1079-cdc1-44f9-9174-853c14d0f9bf-utilities\") pod \"community-operators-dwd8l\" (UID: \"123e1079-cdc1-44f9-9174-853c14d0f9bf\") " pod="openshift-marketplace/community-operators-dwd8l"
Mar 10 23:00:50 crc kubenswrapper[4919]: I0310 23:00:50.296197 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kf89q\" (UniqueName: \"kubernetes.io/projected/123e1079-cdc1-44f9-9174-853c14d0f9bf-kube-api-access-kf89q\") pod \"community-operators-dwd8l\" (UID: \"123e1079-cdc1-44f9-9174-853c14d0f9bf\") " pod="openshift-marketplace/community-operators-dwd8l"
Mar 10 23:00:50 crc kubenswrapper[4919]: I0310 23:00:50.440073 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-dwd8l"
Mar 10 23:00:50 crc kubenswrapper[4919]: I0310 23:00:50.734190 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-dwd8l"]
Mar 10 23:00:51 crc kubenswrapper[4919]: I0310 23:00:51.610610 4919 generic.go:334] "Generic (PLEG): container finished" podID="123e1079-cdc1-44f9-9174-853c14d0f9bf" containerID="424aa46b25d88c206a0b7f259188d8c70bb4a22603092403984c99b9506dd996" exitCode=0
Mar 10 23:00:51 crc kubenswrapper[4919]: I0310 23:00:51.610724 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dwd8l" event={"ID":"123e1079-cdc1-44f9-9174-853c14d0f9bf","Type":"ContainerDied","Data":"424aa46b25d88c206a0b7f259188d8c70bb4a22603092403984c99b9506dd996"}
Mar 10 23:00:51 crc kubenswrapper[4919]: I0310 23:00:51.611050 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dwd8l" event={"ID":"123e1079-cdc1-44f9-9174-853c14d0f9bf","Type":"ContainerStarted","Data":"27907d6d53659124f59b9680dcfbf72f36aa7c3f8ad38b76d887f81ec00b1257"}
Mar 10 23:00:52 crc kubenswrapper[4919]: I0310 23:00:52.619090 4919 generic.go:334] "Generic (PLEG): container finished" podID="123e1079-cdc1-44f9-9174-853c14d0f9bf" containerID="ea5b6279bc9d81bc970ecf9c22bfcbd8b0ccfb19019d6a962634fcf3c700a3ea" exitCode=0
Mar 10 23:00:52 crc kubenswrapper[4919]: I0310 23:00:52.619210 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dwd8l" event={"ID":"123e1079-cdc1-44f9-9174-853c14d0f9bf","Type":"ContainerDied","Data":"ea5b6279bc9d81bc970ecf9c22bfcbd8b0ccfb19019d6a962634fcf3c700a3ea"}
Mar 10 23:00:53 crc kubenswrapper[4919]: I0310 23:00:53.627881 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dwd8l" event={"ID":"123e1079-cdc1-44f9-9174-853c14d0f9bf","Type":"ContainerStarted","Data":"e367d22e113885c2a3f7b735e1ecef6d2c97c502ff79b5ad639118d7561438e4"}
Mar 10 23:00:53 crc kubenswrapper[4919]: I0310 23:00:53.649495 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-dwd8l" podStartSLOduration=2.221607931 podStartE2EDuration="3.649472235s" podCreationTimestamp="2026-03-10 23:00:50 +0000 UTC" firstStartedPulling="2026-03-10 23:00:51.612266188 +0000 UTC m=+4238.854146806" lastFinishedPulling="2026-03-10 23:00:53.040130482 +0000 UTC m=+4240.282011110" observedRunningTime="2026-03-10 23:00:53.642953878 +0000 UTC m=+4240.884834486" watchObservedRunningTime="2026-03-10 23:00:53.649472235 +0000 UTC m=+4240.891352833"
Mar 10 23:00:59 crc kubenswrapper[4919]: I0310 23:00:59.175655 4919 patch_prober.go:28] interesting pod/machine-config-daemon-z7v4t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 10 23:00:59 crc kubenswrapper[4919]: I0310 23:00:59.176080 4919 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 10 23:00:59 crc kubenswrapper[4919]: I0310 23:00:59.497027 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-vjncc"]
Mar 10 23:00:59 crc kubenswrapper[4919]: I0310 23:00:59.498757 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vjncc"
Mar 10 23:00:59 crc kubenswrapper[4919]: I0310 23:00:59.514533 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vjncc"]
Mar 10 23:00:59 crc kubenswrapper[4919]: I0310 23:00:59.601827 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea6fa48c-782d-4415-9e53-2c01a949fb77-catalog-content\") pod \"redhat-marketplace-vjncc\" (UID: \"ea6fa48c-782d-4415-9e53-2c01a949fb77\") " pod="openshift-marketplace/redhat-marketplace-vjncc"
Mar 10 23:00:59 crc kubenswrapper[4919]: I0310 23:00:59.602291 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea6fa48c-782d-4415-9e53-2c01a949fb77-utilities\") pod \"redhat-marketplace-vjncc\" (UID: \"ea6fa48c-782d-4415-9e53-2c01a949fb77\") " pod="openshift-marketplace/redhat-marketplace-vjncc"
Mar 10 23:00:59 crc kubenswrapper[4919]: I0310 23:00:59.602354 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q9kr5\" (UniqueName: \"kubernetes.io/projected/ea6fa48c-782d-4415-9e53-2c01a949fb77-kube-api-access-q9kr5\") pod \"redhat-marketplace-vjncc\" (UID: \"ea6fa48c-782d-4415-9e53-2c01a949fb77\") " pod="openshift-marketplace/redhat-marketplace-vjncc"
Mar 10 23:00:59 crc kubenswrapper[4919]: I0310 23:00:59.703554 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea6fa48c-782d-4415-9e53-2c01a949fb77-catalog-content\") pod \"redhat-marketplace-vjncc\" (UID: \"ea6fa48c-782d-4415-9e53-2c01a949fb77\") " pod="openshift-marketplace/redhat-marketplace-vjncc"
Mar 10 23:00:59 crc kubenswrapper[4919]: I0310 23:00:59.703637 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea6fa48c-782d-4415-9e53-2c01a949fb77-utilities\") pod \"redhat-marketplace-vjncc\" (UID: \"ea6fa48c-782d-4415-9e53-2c01a949fb77\") " pod="openshift-marketplace/redhat-marketplace-vjncc"
Mar 10 23:00:59 crc kubenswrapper[4919]: I0310 23:00:59.703658 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q9kr5\" (UniqueName: \"kubernetes.io/projected/ea6fa48c-782d-4415-9e53-2c01a949fb77-kube-api-access-q9kr5\") pod \"redhat-marketplace-vjncc\" (UID: \"ea6fa48c-782d-4415-9e53-2c01a949fb77\") " pod="openshift-marketplace/redhat-marketplace-vjncc"
Mar 10 23:00:59 crc kubenswrapper[4919]: I0310 23:00:59.704135 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea6fa48c-782d-4415-9e53-2c01a949fb77-catalog-content\") pod \"redhat-marketplace-vjncc\" (UID: \"ea6fa48c-782d-4415-9e53-2c01a949fb77\") " pod="openshift-marketplace/redhat-marketplace-vjncc"
Mar 10 23:00:59 crc kubenswrapper[4919]: I0310 23:00:59.704414 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea6fa48c-782d-4415-9e53-2c01a949fb77-utilities\") pod \"redhat-marketplace-vjncc\" (UID: \"ea6fa48c-782d-4415-9e53-2c01a949fb77\") " pod="openshift-marketplace/redhat-marketplace-vjncc"
Mar 10 23:00:59 crc kubenswrapper[4919]: I0310 23:00:59.725603 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q9kr5\" (UniqueName: \"kubernetes.io/projected/ea6fa48c-782d-4415-9e53-2c01a949fb77-kube-api-access-q9kr5\") pod \"redhat-marketplace-vjncc\" (UID: \"ea6fa48c-782d-4415-9e53-2c01a949fb77\") " pod="openshift-marketplace/redhat-marketplace-vjncc"
Mar 10 23:00:59 crc kubenswrapper[4919]: I0310 23:00:59.816659 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vjncc"
Mar 10 23:01:00 crc kubenswrapper[4919]: I0310 23:01:00.258063 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vjncc"]
Mar 10 23:01:00 crc kubenswrapper[4919]: I0310 23:01:00.440980 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-dwd8l"
Mar 10 23:01:00 crc kubenswrapper[4919]: I0310 23:01:00.441090 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-dwd8l"
Mar 10 23:01:00 crc kubenswrapper[4919]: I0310 23:01:00.487937 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-dwd8l"
Mar 10 23:01:00 crc kubenswrapper[4919]: I0310 23:01:00.672934 4919 generic.go:334] "Generic (PLEG): container finished" podID="ea6fa48c-782d-4415-9e53-2c01a949fb77" containerID="019a56367c96a0feceeda1893292ff029b3a42c96d76e108aeaea58b340abd0e" exitCode=0
Mar 10 23:01:00 crc kubenswrapper[4919]: I0310 23:01:00.673040 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vjncc" event={"ID":"ea6fa48c-782d-4415-9e53-2c01a949fb77","Type":"ContainerDied","Data":"019a56367c96a0feceeda1893292ff029b3a42c96d76e108aeaea58b340abd0e"}
Mar 10 23:01:00 crc kubenswrapper[4919]: I0310 23:01:00.673313 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vjncc" event={"ID":"ea6fa48c-782d-4415-9e53-2c01a949fb77","Type":"ContainerStarted","Data":"76fc93100d2b5665a63c9b3c437144513efee082c5e6c2469bc4f2f0e8ed3702"}
Mar 10 23:01:00 crc kubenswrapper[4919]: I0310 23:01:00.736755 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-dwd8l"
Mar 10 23:01:01 crc kubenswrapper[4919]: I0310 23:01:01.691279 4919 generic.go:334] "Generic (PLEG): container finished" podID="ea6fa48c-782d-4415-9e53-2c01a949fb77" containerID="3706206afb9da575ea85547aa701b125d197b199deb501f7108d95a21865a29f" exitCode=0
Mar 10 23:01:01 crc kubenswrapper[4919]: I0310 23:01:01.691361 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vjncc" event={"ID":"ea6fa48c-782d-4415-9e53-2c01a949fb77","Type":"ContainerDied","Data":"3706206afb9da575ea85547aa701b125d197b199deb501f7108d95a21865a29f"}
Mar 10 23:01:02 crc kubenswrapper[4919]: I0310 23:01:02.705228 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vjncc" event={"ID":"ea6fa48c-782d-4415-9e53-2c01a949fb77","Type":"ContainerStarted","Data":"b0af8ca14b3837be7bce4b84c58d2151925162b389ae71ff88b1fcb0af680ac0"}
Mar 10 23:01:02 crc kubenswrapper[4919]: I0310 23:01:02.726891 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-vjncc" podStartSLOduration=2.283095044 podStartE2EDuration="3.7268686s" podCreationTimestamp="2026-03-10 23:00:59 +0000 UTC" firstStartedPulling="2026-03-10 23:01:00.674864751 +0000 UTC m=+4247.916745359" lastFinishedPulling="2026-03-10 23:01:02.118638307 +0000 UTC m=+4249.360518915" observedRunningTime="2026-03-10 23:01:02.723797197 +0000 UTC m=+4249.965677845" watchObservedRunningTime="2026-03-10 23:01:02.7268686 +0000 UTC m=+4249.968749218"
Mar 10 23:01:02 crc kubenswrapper[4919]: I0310 23:01:02.876112 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-dwd8l"]
Mar 10 23:01:02 crc kubenswrapper[4919]: I0310 23:01:02.876657 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-dwd8l" podUID="123e1079-cdc1-44f9-9174-853c14d0f9bf" containerName="registry-server" containerID="cri-o://e367d22e113885c2a3f7b735e1ecef6d2c97c502ff79b5ad639118d7561438e4" gracePeriod=2
Mar 10 23:01:03 crc kubenswrapper[4919]: I0310 23:01:03.245657 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-dwd8l"
Mar 10 23:01:03 crc kubenswrapper[4919]: I0310 23:01:03.353780 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kf89q\" (UniqueName: \"kubernetes.io/projected/123e1079-cdc1-44f9-9174-853c14d0f9bf-kube-api-access-kf89q\") pod \"123e1079-cdc1-44f9-9174-853c14d0f9bf\" (UID: \"123e1079-cdc1-44f9-9174-853c14d0f9bf\") "
Mar 10 23:01:03 crc kubenswrapper[4919]: I0310 23:01:03.353857 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/123e1079-cdc1-44f9-9174-853c14d0f9bf-catalog-content\") pod \"123e1079-cdc1-44f9-9174-853c14d0f9bf\" (UID: \"123e1079-cdc1-44f9-9174-853c14d0f9bf\") "
Mar 10 23:01:03 crc kubenswrapper[4919]: I0310 23:01:03.353900 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/123e1079-cdc1-44f9-9174-853c14d0f9bf-utilities\") pod \"123e1079-cdc1-44f9-9174-853c14d0f9bf\" (UID: \"123e1079-cdc1-44f9-9174-853c14d0f9bf\") "
Mar 10 23:01:03 crc kubenswrapper[4919]: I0310 23:01:03.354835 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/123e1079-cdc1-44f9-9174-853c14d0f9bf-utilities" (OuterVolumeSpecName: "utilities") pod "123e1079-cdc1-44f9-9174-853c14d0f9bf" (UID: "123e1079-cdc1-44f9-9174-853c14d0f9bf"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 23:01:03 crc kubenswrapper[4919]: I0310 23:01:03.358625 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/123e1079-cdc1-44f9-9174-853c14d0f9bf-kube-api-access-kf89q" (OuterVolumeSpecName: "kube-api-access-kf89q") pod "123e1079-cdc1-44f9-9174-853c14d0f9bf" (UID: "123e1079-cdc1-44f9-9174-853c14d0f9bf"). InnerVolumeSpecName "kube-api-access-kf89q". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 23:01:03 crc kubenswrapper[4919]: I0310 23:01:03.419419 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/123e1079-cdc1-44f9-9174-853c14d0f9bf-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "123e1079-cdc1-44f9-9174-853c14d0f9bf" (UID: "123e1079-cdc1-44f9-9174-853c14d0f9bf"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 23:01:03 crc kubenswrapper[4919]: I0310 23:01:03.455464 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kf89q\" (UniqueName: \"kubernetes.io/projected/123e1079-cdc1-44f9-9174-853c14d0f9bf-kube-api-access-kf89q\") on node \"crc\" DevicePath \"\""
Mar 10 23:01:03 crc kubenswrapper[4919]: I0310 23:01:03.455740 4919 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/123e1079-cdc1-44f9-9174-853c14d0f9bf-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 10 23:01:03 crc kubenswrapper[4919]: I0310 23:01:03.455753 4919 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/123e1079-cdc1-44f9-9174-853c14d0f9bf-utilities\") on node \"crc\" DevicePath \"\""
Mar 10 23:01:03 crc kubenswrapper[4919]: I0310 23:01:03.713118 4919 generic.go:334] "Generic (PLEG): container finished" podID="123e1079-cdc1-44f9-9174-853c14d0f9bf" containerID="e367d22e113885c2a3f7b735e1ecef6d2c97c502ff79b5ad639118d7561438e4" exitCode=0
Mar 10 23:01:03 crc kubenswrapper[4919]: I0310 23:01:03.713295 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dwd8l" event={"ID":"123e1079-cdc1-44f9-9174-853c14d0f9bf","Type":"ContainerDied","Data":"e367d22e113885c2a3f7b735e1ecef6d2c97c502ff79b5ad639118d7561438e4"}
Mar 10 23:01:03 crc kubenswrapper[4919]: I0310 23:01:03.713341 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dwd8l" event={"ID":"123e1079-cdc1-44f9-9174-853c14d0f9bf","Type":"ContainerDied","Data":"27907d6d53659124f59b9680dcfbf72f36aa7c3f8ad38b76d887f81ec00b1257"}
Mar 10 23:01:03 crc kubenswrapper[4919]: I0310 23:01:03.713359 4919 scope.go:117] "RemoveContainer" containerID="e367d22e113885c2a3f7b735e1ecef6d2c97c502ff79b5ad639118d7561438e4"
Mar 10 23:01:03 crc kubenswrapper[4919]: I0310 23:01:03.714414 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-dwd8l"
Mar 10 23:01:03 crc kubenswrapper[4919]: I0310 23:01:03.731534 4919 scope.go:117] "RemoveContainer" containerID="ea5b6279bc9d81bc970ecf9c22bfcbd8b0ccfb19019d6a962634fcf3c700a3ea"
Mar 10 23:01:03 crc kubenswrapper[4919]: I0310 23:01:03.739379 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-dwd8l"]
Mar 10 23:01:03 crc kubenswrapper[4919]: I0310 23:01:03.747051 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-dwd8l"]
Mar 10 23:01:03 crc kubenswrapper[4919]: I0310 23:01:03.789515 4919 scope.go:117] "RemoveContainer" containerID="424aa46b25d88c206a0b7f259188d8c70bb4a22603092403984c99b9506dd996"
Mar 10 23:01:03 crc kubenswrapper[4919]: I0310 23:01:03.816024 4919 scope.go:117] "RemoveContainer" containerID="e367d22e113885c2a3f7b735e1ecef6d2c97c502ff79b5ad639118d7561438e4"
Mar 10 23:01:03 crc kubenswrapper[4919]: E0310 23:01:03.816535 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e367d22e113885c2a3f7b735e1ecef6d2c97c502ff79b5ad639118d7561438e4\": container with ID starting with e367d22e113885c2a3f7b735e1ecef6d2c97c502ff79b5ad639118d7561438e4 not found: ID does not exist" containerID="e367d22e113885c2a3f7b735e1ecef6d2c97c502ff79b5ad639118d7561438e4"
Mar 10 23:01:03 crc kubenswrapper[4919]: I0310 23:01:03.816585 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e367d22e113885c2a3f7b735e1ecef6d2c97c502ff79b5ad639118d7561438e4"} err="failed to get container status \"e367d22e113885c2a3f7b735e1ecef6d2c97c502ff79b5ad639118d7561438e4\": rpc error: code = NotFound desc = could not find container \"e367d22e113885c2a3f7b735e1ecef6d2c97c502ff79b5ad639118d7561438e4\": container with ID starting with e367d22e113885c2a3f7b735e1ecef6d2c97c502ff79b5ad639118d7561438e4 not found: ID does not exist"
Mar 10 23:01:03 crc kubenswrapper[4919]: I0310 23:01:03.816616 4919 scope.go:117] "RemoveContainer" containerID="ea5b6279bc9d81bc970ecf9c22bfcbd8b0ccfb19019d6a962634fcf3c700a3ea"
Mar 10 23:01:03 crc kubenswrapper[4919]: E0310 23:01:03.817067 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea5b6279bc9d81bc970ecf9c22bfcbd8b0ccfb19019d6a962634fcf3c700a3ea\": container with ID starting with ea5b6279bc9d81bc970ecf9c22bfcbd8b0ccfb19019d6a962634fcf3c700a3ea not found: ID does not exist" containerID="ea5b6279bc9d81bc970ecf9c22bfcbd8b0ccfb19019d6a962634fcf3c700a3ea"
Mar 10 23:01:03 crc kubenswrapper[4919]: I0310 23:01:03.817190 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea5b6279bc9d81bc970ecf9c22bfcbd8b0ccfb19019d6a962634fcf3c700a3ea"} err="failed to get container status \"ea5b6279bc9d81bc970ecf9c22bfcbd8b0ccfb19019d6a962634fcf3c700a3ea\": rpc error: code = NotFound desc = could not find container \"ea5b6279bc9d81bc970ecf9c22bfcbd8b0ccfb19019d6a962634fcf3c700a3ea\": container with ID starting with ea5b6279bc9d81bc970ecf9c22bfcbd8b0ccfb19019d6a962634fcf3c700a3ea not found: ID does not exist"
Mar 10 23:01:03 crc kubenswrapper[4919]: I0310 23:01:03.817278 4919 scope.go:117] "RemoveContainer" containerID="424aa46b25d88c206a0b7f259188d8c70bb4a22603092403984c99b9506dd996"
Mar 10 23:01:03 crc kubenswrapper[4919]: E0310 23:01:03.817745 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"424aa46b25d88c206a0b7f259188d8c70bb4a22603092403984c99b9506dd996\": container with ID starting with 424aa46b25d88c206a0b7f259188d8c70bb4a22603092403984c99b9506dd996 not found: ID does not exist" containerID="424aa46b25d88c206a0b7f259188d8c70bb4a22603092403984c99b9506dd996"
Mar 10 23:01:03 crc kubenswrapper[4919]: I0310 23:01:03.817850 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"424aa46b25d88c206a0b7f259188d8c70bb4a22603092403984c99b9506dd996"} err="failed to get container status \"424aa46b25d88c206a0b7f259188d8c70bb4a22603092403984c99b9506dd996\": rpc error: code = NotFound desc = could not find container \"424aa46b25d88c206a0b7f259188d8c70bb4a22603092403984c99b9506dd996\": container with ID starting with 424aa46b25d88c206a0b7f259188d8c70bb4a22603092403984c99b9506dd996 not found: ID does not exist"
Mar 10 23:01:05 crc kubenswrapper[4919]: I0310 23:01:05.492033 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="123e1079-cdc1-44f9-9174-853c14d0f9bf" path="/var/lib/kubelet/pods/123e1079-cdc1-44f9-9174-853c14d0f9bf/volumes"
Mar 10 23:01:09 crc kubenswrapper[4919]: I0310 23:01:09.816895 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-vjncc"
Mar 10 23:01:09 crc kubenswrapper[4919]: I0310 23:01:09.817446 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-vjncc"
Mar 10 23:01:09 crc kubenswrapper[4919]: I0310 23:01:09.905898 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-vjncc"
Mar 10 23:01:10 crc kubenswrapper[4919]: I0310 23:01:10.803015 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-vjncc"
Mar 10 23:01:10 crc kubenswrapper[4919]: I0310 23:01:10.850846 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vjncc"]
Mar 10 23:01:12 crc kubenswrapper[4919]: I0310 23:01:12.780267 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-vjncc" podUID="ea6fa48c-782d-4415-9e53-2c01a949fb77" containerName="registry-server" containerID="cri-o://b0af8ca14b3837be7bce4b84c58d2151925162b389ae71ff88b1fcb0af680ac0" gracePeriod=2
Mar 10 23:01:13 crc kubenswrapper[4919]: I0310 23:01:13.230201 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vjncc"
Mar 10 23:01:13 crc kubenswrapper[4919]: I0310 23:01:13.301884 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea6fa48c-782d-4415-9e53-2c01a949fb77-catalog-content\") pod \"ea6fa48c-782d-4415-9e53-2c01a949fb77\" (UID: \"ea6fa48c-782d-4415-9e53-2c01a949fb77\") "
Mar 10 23:01:13 crc kubenswrapper[4919]: I0310 23:01:13.302062 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q9kr5\" (UniqueName: \"kubernetes.io/projected/ea6fa48c-782d-4415-9e53-2c01a949fb77-kube-api-access-q9kr5\") pod \"ea6fa48c-782d-4415-9e53-2c01a949fb77\" (UID: \"ea6fa48c-782d-4415-9e53-2c01a949fb77\") "
Mar 10 23:01:13 crc kubenswrapper[4919]: I0310 23:01:13.302085 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea6fa48c-782d-4415-9e53-2c01a949fb77-utilities\") pod \"ea6fa48c-782d-4415-9e53-2c01a949fb77\" (UID: \"ea6fa48c-782d-4415-9e53-2c01a949fb77\") "
Mar 10 23:01:13 crc kubenswrapper[4919]: I0310 23:01:13.302946 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ea6fa48c-782d-4415-9e53-2c01a949fb77-utilities" (OuterVolumeSpecName: "utilities") pod "ea6fa48c-782d-4415-9e53-2c01a949fb77" (UID: "ea6fa48c-782d-4415-9e53-2c01a949fb77"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 23:01:13 crc kubenswrapper[4919]: I0310 23:01:13.309023 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea6fa48c-782d-4415-9e53-2c01a949fb77-kube-api-access-q9kr5" (OuterVolumeSpecName: "kube-api-access-q9kr5") pod "ea6fa48c-782d-4415-9e53-2c01a949fb77" (UID: "ea6fa48c-782d-4415-9e53-2c01a949fb77"). InnerVolumeSpecName "kube-api-access-q9kr5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 23:01:13 crc kubenswrapper[4919]: I0310 23:01:13.330135 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ea6fa48c-782d-4415-9e53-2c01a949fb77-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ea6fa48c-782d-4415-9e53-2c01a949fb77" (UID: "ea6fa48c-782d-4415-9e53-2c01a949fb77"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 23:01:13 crc kubenswrapper[4919]: I0310 23:01:13.403912 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q9kr5\" (UniqueName: \"kubernetes.io/projected/ea6fa48c-782d-4415-9e53-2c01a949fb77-kube-api-access-q9kr5\") on node \"crc\" DevicePath \"\""
Mar 10 23:01:13 crc kubenswrapper[4919]: I0310 23:01:13.403958 4919 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea6fa48c-782d-4415-9e53-2c01a949fb77-utilities\") on node \"crc\" DevicePath \"\""
Mar 10 23:01:13 crc kubenswrapper[4919]: I0310 23:01:13.403971 4919 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea6fa48c-782d-4415-9e53-2c01a949fb77-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 10 23:01:13 crc kubenswrapper[4919]: I0310 23:01:13.787736 4919 generic.go:334] "Generic (PLEG): container finished" podID="ea6fa48c-782d-4415-9e53-2c01a949fb77" containerID="b0af8ca14b3837be7bce4b84c58d2151925162b389ae71ff88b1fcb0af680ac0" exitCode=0
Mar 10 23:01:13 crc kubenswrapper[4919]: I0310 23:01:13.787776 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vjncc" event={"ID":"ea6fa48c-782d-4415-9e53-2c01a949fb77","Type":"ContainerDied","Data":"b0af8ca14b3837be7bce4b84c58d2151925162b389ae71ff88b1fcb0af680ac0"}
Mar 10 23:01:13 crc kubenswrapper[4919]: I0310 23:01:13.787802 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vjncc" event={"ID":"ea6fa48c-782d-4415-9e53-2c01a949fb77","Type":"ContainerDied","Data":"76fc93100d2b5665a63c9b3c437144513efee082c5e6c2469bc4f2f0e8ed3702"}
Mar 10 23:01:13 crc kubenswrapper[4919]: I0310 23:01:13.787819 4919 scope.go:117] "RemoveContainer" containerID="b0af8ca14b3837be7bce4b84c58d2151925162b389ae71ff88b1fcb0af680ac0"
Mar 10 23:01:13 crc kubenswrapper[4919]: I0310 23:01:13.787813 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vjncc"
Mar 10 23:01:13 crc kubenswrapper[4919]: I0310 23:01:13.818251 4919 scope.go:117] "RemoveContainer" containerID="3706206afb9da575ea85547aa701b125d197b199deb501f7108d95a21865a29f"
Mar 10 23:01:13 crc kubenswrapper[4919]: I0310 23:01:13.819077 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vjncc"]
Mar 10 23:01:13 crc kubenswrapper[4919]: I0310 23:01:13.824332 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-vjncc"]
Mar 10 23:01:13 crc kubenswrapper[4919]: I0310 23:01:13.839083 4919 scope.go:117] "RemoveContainer" containerID="019a56367c96a0feceeda1893292ff029b3a42c96d76e108aeaea58b340abd0e"
Mar 10 23:01:13 crc kubenswrapper[4919]: I0310 23:01:13.881618 4919 scope.go:117] "RemoveContainer" containerID="b0af8ca14b3837be7bce4b84c58d2151925162b389ae71ff88b1fcb0af680ac0"
Mar 10 23:01:13 crc kubenswrapper[4919]: E0310 23:01:13.881982 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b0af8ca14b3837be7bce4b84c58d2151925162b389ae71ff88b1fcb0af680ac0\": container with ID starting with b0af8ca14b3837be7bce4b84c58d2151925162b389ae71ff88b1fcb0af680ac0 not found: ID does not exist" containerID="b0af8ca14b3837be7bce4b84c58d2151925162b389ae71ff88b1fcb0af680ac0"
Mar 10 23:01:13 crc kubenswrapper[4919]: I0310 23:01:13.882022 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0af8ca14b3837be7bce4b84c58d2151925162b389ae71ff88b1fcb0af680ac0"} err="failed to get container status \"b0af8ca14b3837be7bce4b84c58d2151925162b389ae71ff88b1fcb0af680ac0\": rpc error: code = NotFound desc = could not find container \"b0af8ca14b3837be7bce4b84c58d2151925162b389ae71ff88b1fcb0af680ac0\": container with ID starting with b0af8ca14b3837be7bce4b84c58d2151925162b389ae71ff88b1fcb0af680ac0 not found: ID does not exist"
Mar 10 23:01:13 crc kubenswrapper[4919]: I0310 23:01:13.882047 4919 scope.go:117] "RemoveContainer" containerID="3706206afb9da575ea85547aa701b125d197b199deb501f7108d95a21865a29f"
Mar 10 23:01:13 crc kubenswrapper[4919]: E0310 23:01:13.882590 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3706206afb9da575ea85547aa701b125d197b199deb501f7108d95a21865a29f\": container with ID starting with 3706206afb9da575ea85547aa701b125d197b199deb501f7108d95a21865a29f not found: ID does not exist" containerID="3706206afb9da575ea85547aa701b125d197b199deb501f7108d95a21865a29f"
Mar 10 23:01:13 crc kubenswrapper[4919]: I0310 23:01:13.882620 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3706206afb9da575ea85547aa701b125d197b199deb501f7108d95a21865a29f"} err="failed to get container status \"3706206afb9da575ea85547aa701b125d197b199deb501f7108d95a21865a29f\": rpc error: code = NotFound desc = could not find container \"3706206afb9da575ea85547aa701b125d197b199deb501f7108d95a21865a29f\": container with ID starting with 3706206afb9da575ea85547aa701b125d197b199deb501f7108d95a21865a29f not found: ID does not exist"
Mar 10 23:01:13 crc kubenswrapper[4919]: I0310 23:01:13.882637 4919 scope.go:117] "RemoveContainer" containerID="019a56367c96a0feceeda1893292ff029b3a42c96d76e108aeaea58b340abd0e"
Mar 10 23:01:13 crc kubenswrapper[4919]: E0310 23:01:13.882873 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"019a56367c96a0feceeda1893292ff029b3a42c96d76e108aeaea58b340abd0e\": container with ID starting with 019a56367c96a0feceeda1893292ff029b3a42c96d76e108aeaea58b340abd0e not found: ID does not exist" containerID="019a56367c96a0feceeda1893292ff029b3a42c96d76e108aeaea58b340abd0e"
Mar 10 23:01:13 crc kubenswrapper[4919]: I0310 23:01:13.882907 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"019a56367c96a0feceeda1893292ff029b3a42c96d76e108aeaea58b340abd0e"} err="failed to get container status \"019a56367c96a0feceeda1893292ff029b3a42c96d76e108aeaea58b340abd0e\": rpc error: code = NotFound desc = could not find container \"019a56367c96a0feceeda1893292ff029b3a42c96d76e108aeaea58b340abd0e\": container with ID starting with 019a56367c96a0feceeda1893292ff029b3a42c96d76e108aeaea58b340abd0e not found: ID does not exist"
Mar 10 23:01:15 crc kubenswrapper[4919]: I0310 23:01:15.487450 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea6fa48c-782d-4415-9e53-2c01a949fb77" path="/var/lib/kubelet/pods/ea6fa48c-782d-4415-9e53-2c01a949fb77/volumes"
Mar 10 23:01:29 crc kubenswrapper[4919]: I0310 23:01:29.175486 4919 patch_prober.go:28] interesting pod/machine-config-daemon-z7v4t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 10 23:01:29 crc kubenswrapper[4919]: I0310 23:01:29.176177 4919 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 10 23:01:29 crc kubenswrapper[4919]: I0310 23:01:29.176253 4919 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t"
Mar 10 23:01:29 crc kubenswrapper[4919]: I0310 23:01:29.177659 4919 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"46d1affa4c8b3ba64d85713f9411fabf43bf277ed9af6d487847e1f0fe2d5cc6"} pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 10 23:01:29 crc kubenswrapper[4919]: I0310 23:01:29.177854 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" containerName="machine-config-daemon" containerID="cri-o://46d1affa4c8b3ba64d85713f9411fabf43bf277ed9af6d487847e1f0fe2d5cc6" gracePeriod=600
Mar 10 23:01:29 crc kubenswrapper[4919]: E0310 23:01:29.316120 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc"
Mar 10 23:01:29 crc kubenswrapper[4919]: I0310 23:01:29.904901 4919 generic.go:334] "Generic (PLEG): container finished" podID="566678d1-f416-4116-ab20-b30dceb86cdc" containerID="46d1affa4c8b3ba64d85713f9411fabf43bf277ed9af6d487847e1f0fe2d5cc6" exitCode=0
Mar 10 23:01:29 crc kubenswrapper[4919]: I0310 23:01:29.905141 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" event={"ID":"566678d1-f416-4116-ab20-b30dceb86cdc","Type":"ContainerDied","Data":"46d1affa4c8b3ba64d85713f9411fabf43bf277ed9af6d487847e1f0fe2d5cc6"}
Mar 10 23:01:29 crc kubenswrapper[4919]: I0310 23:01:29.905726 4919 scope.go:117] "RemoveContainer" containerID="53e8434ecb83f239bb7e4814454c11ba1c0448d2d6ba0068e5ca799c07cd5409"
Mar 10 23:01:29 crc kubenswrapper[4919]: I0310 23:01:29.906770 4919 scope.go:117] "RemoveContainer" containerID="46d1affa4c8b3ba64d85713f9411fabf43bf277ed9af6d487847e1f0fe2d5cc6"
Mar
10 23:01:29 crc kubenswrapper[4919]: E0310 23:01:29.907276 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" Mar 10 23:01:44 crc kubenswrapper[4919]: I0310 23:01:44.481109 4919 scope.go:117] "RemoveContainer" containerID="46d1affa4c8b3ba64d85713f9411fabf43bf277ed9af6d487847e1f0fe2d5cc6" Mar 10 23:01:44 crc kubenswrapper[4919]: E0310 23:01:44.481583 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" Mar 10 23:01:57 crc kubenswrapper[4919]: I0310 23:01:57.480655 4919 scope.go:117] "RemoveContainer" containerID="46d1affa4c8b3ba64d85713f9411fabf43bf277ed9af6d487847e1f0fe2d5cc6" Mar 10 23:01:57 crc kubenswrapper[4919]: E0310 23:01:57.481717 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" Mar 10 23:02:00 crc kubenswrapper[4919]: I0310 23:02:00.144653 4919 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-infra/auto-csr-approver-29553062-nvgz2"] Mar 10 23:02:00 crc kubenswrapper[4919]: E0310 23:02:00.145331 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea6fa48c-782d-4415-9e53-2c01a949fb77" containerName="registry-server" Mar 10 23:02:00 crc kubenswrapper[4919]: I0310 23:02:00.145700 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea6fa48c-782d-4415-9e53-2c01a949fb77" containerName="registry-server" Mar 10 23:02:00 crc kubenswrapper[4919]: E0310 23:02:00.145727 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea6fa48c-782d-4415-9e53-2c01a949fb77" containerName="extract-content" Mar 10 23:02:00 crc kubenswrapper[4919]: I0310 23:02:00.145738 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea6fa48c-782d-4415-9e53-2c01a949fb77" containerName="extract-content" Mar 10 23:02:00 crc kubenswrapper[4919]: E0310 23:02:00.145752 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea6fa48c-782d-4415-9e53-2c01a949fb77" containerName="extract-utilities" Mar 10 23:02:00 crc kubenswrapper[4919]: I0310 23:02:00.145764 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea6fa48c-782d-4415-9e53-2c01a949fb77" containerName="extract-utilities" Mar 10 23:02:00 crc kubenswrapper[4919]: E0310 23:02:00.145791 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="123e1079-cdc1-44f9-9174-853c14d0f9bf" containerName="extract-utilities" Mar 10 23:02:00 crc kubenswrapper[4919]: I0310 23:02:00.145802 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="123e1079-cdc1-44f9-9174-853c14d0f9bf" containerName="extract-utilities" Mar 10 23:02:00 crc kubenswrapper[4919]: E0310 23:02:00.145835 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="123e1079-cdc1-44f9-9174-853c14d0f9bf" containerName="extract-content" Mar 10 23:02:00 crc kubenswrapper[4919]: I0310 23:02:00.145845 4919 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="123e1079-cdc1-44f9-9174-853c14d0f9bf" containerName="extract-content" Mar 10 23:02:00 crc kubenswrapper[4919]: E0310 23:02:00.145867 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="123e1079-cdc1-44f9-9174-853c14d0f9bf" containerName="registry-server" Mar 10 23:02:00 crc kubenswrapper[4919]: I0310 23:02:00.145878 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="123e1079-cdc1-44f9-9174-853c14d0f9bf" containerName="registry-server" Mar 10 23:02:00 crc kubenswrapper[4919]: I0310 23:02:00.146070 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea6fa48c-782d-4415-9e53-2c01a949fb77" containerName="registry-server" Mar 10 23:02:00 crc kubenswrapper[4919]: I0310 23:02:00.146093 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="123e1079-cdc1-44f9-9174-853c14d0f9bf" containerName="registry-server" Mar 10 23:02:00 crc kubenswrapper[4919]: I0310 23:02:00.146704 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553062-nvgz2" Mar 10 23:02:00 crc kubenswrapper[4919]: I0310 23:02:00.149228 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 23:02:00 crc kubenswrapper[4919]: I0310 23:02:00.149656 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wbvtv" Mar 10 23:02:00 crc kubenswrapper[4919]: I0310 23:02:00.149882 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 23:02:00 crc kubenswrapper[4919]: I0310 23:02:00.152820 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553062-nvgz2"] Mar 10 23:02:00 crc kubenswrapper[4919]: I0310 23:02:00.227534 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b2kr6\" (UniqueName: 
\"kubernetes.io/projected/69874376-749b-4152-ac36-9ea6f1aba654-kube-api-access-b2kr6\") pod \"auto-csr-approver-29553062-nvgz2\" (UID: \"69874376-749b-4152-ac36-9ea6f1aba654\") " pod="openshift-infra/auto-csr-approver-29553062-nvgz2" Mar 10 23:02:00 crc kubenswrapper[4919]: I0310 23:02:00.329131 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b2kr6\" (UniqueName: \"kubernetes.io/projected/69874376-749b-4152-ac36-9ea6f1aba654-kube-api-access-b2kr6\") pod \"auto-csr-approver-29553062-nvgz2\" (UID: \"69874376-749b-4152-ac36-9ea6f1aba654\") " pod="openshift-infra/auto-csr-approver-29553062-nvgz2" Mar 10 23:02:00 crc kubenswrapper[4919]: I0310 23:02:00.347259 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b2kr6\" (UniqueName: \"kubernetes.io/projected/69874376-749b-4152-ac36-9ea6f1aba654-kube-api-access-b2kr6\") pod \"auto-csr-approver-29553062-nvgz2\" (UID: \"69874376-749b-4152-ac36-9ea6f1aba654\") " pod="openshift-infra/auto-csr-approver-29553062-nvgz2" Mar 10 23:02:00 crc kubenswrapper[4919]: I0310 23:02:00.466889 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553062-nvgz2" Mar 10 23:02:00 crc kubenswrapper[4919]: I0310 23:02:00.875622 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553062-nvgz2"] Mar 10 23:02:01 crc kubenswrapper[4919]: I0310 23:02:01.189667 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553062-nvgz2" event={"ID":"69874376-749b-4152-ac36-9ea6f1aba654","Type":"ContainerStarted","Data":"190c4f3b92244da96d1ff4e0bd24089fca8d48b9916ae9442c4ec6861098b65a"} Mar 10 23:02:02 crc kubenswrapper[4919]: I0310 23:02:02.195965 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553062-nvgz2" event={"ID":"69874376-749b-4152-ac36-9ea6f1aba654","Type":"ContainerStarted","Data":"922c427fdbc7bb4c150df35b50714bcd3ce5ec47fb82b25c0ddabe636da29b5f"} Mar 10 23:02:02 crc kubenswrapper[4919]: I0310 23:02:02.209778 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29553062-nvgz2" podStartSLOduration=1.258279946 podStartE2EDuration="2.209763837s" podCreationTimestamp="2026-03-10 23:02:00 +0000 UTC" firstStartedPulling="2026-03-10 23:02:00.884484878 +0000 UTC m=+4308.126365486" lastFinishedPulling="2026-03-10 23:02:01.835968769 +0000 UTC m=+4309.077849377" observedRunningTime="2026-03-10 23:02:02.206686434 +0000 UTC m=+4309.448567042" watchObservedRunningTime="2026-03-10 23:02:02.209763837 +0000 UTC m=+4309.451644445" Mar 10 23:02:03 crc kubenswrapper[4919]: I0310 23:02:03.206102 4919 generic.go:334] "Generic (PLEG): container finished" podID="69874376-749b-4152-ac36-9ea6f1aba654" containerID="922c427fdbc7bb4c150df35b50714bcd3ce5ec47fb82b25c0ddabe636da29b5f" exitCode=0 Mar 10 23:02:03 crc kubenswrapper[4919]: I0310 23:02:03.206636 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553062-nvgz2" 
event={"ID":"69874376-749b-4152-ac36-9ea6f1aba654","Type":"ContainerDied","Data":"922c427fdbc7bb4c150df35b50714bcd3ce5ec47fb82b25c0ddabe636da29b5f"} Mar 10 23:02:04 crc kubenswrapper[4919]: I0310 23:02:04.556525 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553062-nvgz2" Mar 10 23:02:04 crc kubenswrapper[4919]: I0310 23:02:04.592249 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b2kr6\" (UniqueName: \"kubernetes.io/projected/69874376-749b-4152-ac36-9ea6f1aba654-kube-api-access-b2kr6\") pod \"69874376-749b-4152-ac36-9ea6f1aba654\" (UID: \"69874376-749b-4152-ac36-9ea6f1aba654\") " Mar 10 23:02:04 crc kubenswrapper[4919]: I0310 23:02:04.597088 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69874376-749b-4152-ac36-9ea6f1aba654-kube-api-access-b2kr6" (OuterVolumeSpecName: "kube-api-access-b2kr6") pod "69874376-749b-4152-ac36-9ea6f1aba654" (UID: "69874376-749b-4152-ac36-9ea6f1aba654"). InnerVolumeSpecName "kube-api-access-b2kr6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 23:02:04 crc kubenswrapper[4919]: I0310 23:02:04.694793 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b2kr6\" (UniqueName: \"kubernetes.io/projected/69874376-749b-4152-ac36-9ea6f1aba654-kube-api-access-b2kr6\") on node \"crc\" DevicePath \"\"" Mar 10 23:02:05 crc kubenswrapper[4919]: I0310 23:02:05.228374 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553062-nvgz2" event={"ID":"69874376-749b-4152-ac36-9ea6f1aba654","Type":"ContainerDied","Data":"190c4f3b92244da96d1ff4e0bd24089fca8d48b9916ae9442c4ec6861098b65a"} Mar 10 23:02:05 crc kubenswrapper[4919]: I0310 23:02:05.228797 4919 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="190c4f3b92244da96d1ff4e0bd24089fca8d48b9916ae9442c4ec6861098b65a" Mar 10 23:02:05 crc kubenswrapper[4919]: I0310 23:02:05.228527 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553062-nvgz2" Mar 10 23:02:05 crc kubenswrapper[4919]: I0310 23:02:05.300221 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553056-mxpkl"] Mar 10 23:02:05 crc kubenswrapper[4919]: I0310 23:02:05.309595 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553056-mxpkl"] Mar 10 23:02:05 crc kubenswrapper[4919]: I0310 23:02:05.495863 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9275d8f0-0612-4db0-8167-d891ba7efb59" path="/var/lib/kubelet/pods/9275d8f0-0612-4db0-8167-d891ba7efb59/volumes" Mar 10 23:02:08 crc kubenswrapper[4919]: I0310 23:02:08.481537 4919 scope.go:117] "RemoveContainer" containerID="46d1affa4c8b3ba64d85713f9411fabf43bf277ed9af6d487847e1f0fe2d5cc6" Mar 10 23:02:08 crc kubenswrapper[4919]: E0310 23:02:08.482450 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" Mar 10 23:02:19 crc kubenswrapper[4919]: I0310 23:02:19.481145 4919 scope.go:117] "RemoveContainer" containerID="46d1affa4c8b3ba64d85713f9411fabf43bf277ed9af6d487847e1f0fe2d5cc6" Mar 10 23:02:19 crc kubenswrapper[4919]: E0310 23:02:19.482023 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" Mar 10 23:02:31 crc kubenswrapper[4919]: I0310 23:02:31.480265 4919 scope.go:117] "RemoveContainer" containerID="46d1affa4c8b3ba64d85713f9411fabf43bf277ed9af6d487847e1f0fe2d5cc6" Mar 10 23:02:31 crc kubenswrapper[4919]: E0310 23:02:31.481183 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" Mar 10 23:02:43 crc kubenswrapper[4919]: I0310 23:02:43.487233 4919 scope.go:117] "RemoveContainer" containerID="46d1affa4c8b3ba64d85713f9411fabf43bf277ed9af6d487847e1f0fe2d5cc6" Mar 10 23:02:43 crc kubenswrapper[4919]: E0310 23:02:43.488351 4919 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" Mar 10 23:02:45 crc kubenswrapper[4919]: I0310 23:02:45.844356 4919 scope.go:117] "RemoveContainer" containerID="710ded4218c98dbee0a107c0bad53217eb0af727bc1e9d9b06f510dbc2f03318" Mar 10 23:02:54 crc kubenswrapper[4919]: I0310 23:02:54.481600 4919 scope.go:117] "RemoveContainer" containerID="46d1affa4c8b3ba64d85713f9411fabf43bf277ed9af6d487847e1f0fe2d5cc6" Mar 10 23:02:54 crc kubenswrapper[4919]: E0310 23:02:54.482646 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" Mar 10 23:03:09 crc kubenswrapper[4919]: I0310 23:03:09.480518 4919 scope.go:117] "RemoveContainer" containerID="46d1affa4c8b3ba64d85713f9411fabf43bf277ed9af6d487847e1f0fe2d5cc6" Mar 10 23:03:09 crc kubenswrapper[4919]: E0310 23:03:09.481431 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" Mar 10 23:03:22 crc kubenswrapper[4919]: I0310 23:03:22.480194 4919 scope.go:117] 
"RemoveContainer" containerID="46d1affa4c8b3ba64d85713f9411fabf43bf277ed9af6d487847e1f0fe2d5cc6" Mar 10 23:03:22 crc kubenswrapper[4919]: E0310 23:03:22.482347 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" Mar 10 23:03:33 crc kubenswrapper[4919]: I0310 23:03:33.490227 4919 scope.go:117] "RemoveContainer" containerID="46d1affa4c8b3ba64d85713f9411fabf43bf277ed9af6d487847e1f0fe2d5cc6" Mar 10 23:03:33 crc kubenswrapper[4919]: E0310 23:03:33.491756 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" Mar 10 23:03:46 crc kubenswrapper[4919]: I0310 23:03:46.480899 4919 scope.go:117] "RemoveContainer" containerID="46d1affa4c8b3ba64d85713f9411fabf43bf277ed9af6d487847e1f0fe2d5cc6" Mar 10 23:03:46 crc kubenswrapper[4919]: E0310 23:03:46.481716 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" Mar 10 23:04:00 crc kubenswrapper[4919]: I0310 23:04:00.141368 
4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553064-mshxx"] Mar 10 23:04:00 crc kubenswrapper[4919]: E0310 23:04:00.142305 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69874376-749b-4152-ac36-9ea6f1aba654" containerName="oc" Mar 10 23:04:00 crc kubenswrapper[4919]: I0310 23:04:00.142325 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="69874376-749b-4152-ac36-9ea6f1aba654" containerName="oc" Mar 10 23:04:00 crc kubenswrapper[4919]: I0310 23:04:00.142693 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="69874376-749b-4152-ac36-9ea6f1aba654" containerName="oc" Mar 10 23:04:00 crc kubenswrapper[4919]: I0310 23:04:00.143374 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553064-mshxx" Mar 10 23:04:00 crc kubenswrapper[4919]: I0310 23:04:00.145774 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 23:04:00 crc kubenswrapper[4919]: I0310 23:04:00.146456 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wbvtv" Mar 10 23:04:00 crc kubenswrapper[4919]: I0310 23:04:00.146794 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 23:04:00 crc kubenswrapper[4919]: I0310 23:04:00.154258 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553064-mshxx"] Mar 10 23:04:00 crc kubenswrapper[4919]: I0310 23:04:00.320876 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r22l5\" (UniqueName: \"kubernetes.io/projected/e9f277f7-0813-4203-bf70-9fb67673689e-kube-api-access-r22l5\") pod \"auto-csr-approver-29553064-mshxx\" (UID: \"e9f277f7-0813-4203-bf70-9fb67673689e\") " 
pod="openshift-infra/auto-csr-approver-29553064-mshxx" Mar 10 23:04:00 crc kubenswrapper[4919]: I0310 23:04:00.422485 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r22l5\" (UniqueName: \"kubernetes.io/projected/e9f277f7-0813-4203-bf70-9fb67673689e-kube-api-access-r22l5\") pod \"auto-csr-approver-29553064-mshxx\" (UID: \"e9f277f7-0813-4203-bf70-9fb67673689e\") " pod="openshift-infra/auto-csr-approver-29553064-mshxx" Mar 10 23:04:00 crc kubenswrapper[4919]: I0310 23:04:00.457337 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r22l5\" (UniqueName: \"kubernetes.io/projected/e9f277f7-0813-4203-bf70-9fb67673689e-kube-api-access-r22l5\") pod \"auto-csr-approver-29553064-mshxx\" (UID: \"e9f277f7-0813-4203-bf70-9fb67673689e\") " pod="openshift-infra/auto-csr-approver-29553064-mshxx" Mar 10 23:04:00 crc kubenswrapper[4919]: I0310 23:04:00.474721 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553064-mshxx" Mar 10 23:04:00 crc kubenswrapper[4919]: I0310 23:04:00.481081 4919 scope.go:117] "RemoveContainer" containerID="46d1affa4c8b3ba64d85713f9411fabf43bf277ed9af6d487847e1f0fe2d5cc6" Mar 10 23:04:00 crc kubenswrapper[4919]: E0310 23:04:00.481603 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" Mar 10 23:04:00 crc kubenswrapper[4919]: I0310 23:04:00.986647 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553064-mshxx"] Mar 10 23:04:01 crc kubenswrapper[4919]: W0310 23:04:01.001609 4919 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode9f277f7_0813_4203_bf70_9fb67673689e.slice/crio-bd986fe795d3b030b04c341dd069901bdca00f2189baac7c0c3d2b603c3c316a WatchSource:0}: Error finding container bd986fe795d3b030b04c341dd069901bdca00f2189baac7c0c3d2b603c3c316a: Status 404 returned error can't find the container with id bd986fe795d3b030b04c341dd069901bdca00f2189baac7c0c3d2b603c3c316a Mar 10 23:04:01 crc kubenswrapper[4919]: I0310 23:04:01.005897 4919 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 10 23:04:01 crc kubenswrapper[4919]: I0310 23:04:01.363751 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553064-mshxx" event={"ID":"e9f277f7-0813-4203-bf70-9fb67673689e","Type":"ContainerStarted","Data":"bd986fe795d3b030b04c341dd069901bdca00f2189baac7c0c3d2b603c3c316a"} Mar 10 23:04:02 crc kubenswrapper[4919]: I0310 23:04:02.370942 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553064-mshxx" event={"ID":"e9f277f7-0813-4203-bf70-9fb67673689e","Type":"ContainerStarted","Data":"3a3a38403cb29e1ea22e4e0acac1bc16bacebcf8057d3c9122ea156f5aac2452"} Mar 10 23:04:02 crc kubenswrapper[4919]: I0310 23:04:02.387620 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29553064-mshxx" podStartSLOduration=1.388172016 podStartE2EDuration="2.387597358s" podCreationTimestamp="2026-03-10 23:04:00 +0000 UTC" firstStartedPulling="2026-03-10 23:04:01.005716912 +0000 UTC m=+4428.247597520" lastFinishedPulling="2026-03-10 23:04:02.005142254 +0000 UTC m=+4429.247022862" observedRunningTime="2026-03-10 23:04:02.384202226 +0000 UTC m=+4429.626082834" watchObservedRunningTime="2026-03-10 23:04:02.387597358 +0000 UTC m=+4429.629477966" Mar 10 23:04:03 crc kubenswrapper[4919]: I0310 23:04:03.378261 4919 
generic.go:334] "Generic (PLEG): container finished" podID="e9f277f7-0813-4203-bf70-9fb67673689e" containerID="3a3a38403cb29e1ea22e4e0acac1bc16bacebcf8057d3c9122ea156f5aac2452" exitCode=0 Mar 10 23:04:03 crc kubenswrapper[4919]: I0310 23:04:03.378306 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553064-mshxx" event={"ID":"e9f277f7-0813-4203-bf70-9fb67673689e","Type":"ContainerDied","Data":"3a3a38403cb29e1ea22e4e0acac1bc16bacebcf8057d3c9122ea156f5aac2452"} Mar 10 23:04:04 crc kubenswrapper[4919]: I0310 23:04:04.650124 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553064-mshxx" Mar 10 23:04:04 crc kubenswrapper[4919]: I0310 23:04:04.781664 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r22l5\" (UniqueName: \"kubernetes.io/projected/e9f277f7-0813-4203-bf70-9fb67673689e-kube-api-access-r22l5\") pod \"e9f277f7-0813-4203-bf70-9fb67673689e\" (UID: \"e9f277f7-0813-4203-bf70-9fb67673689e\") " Mar 10 23:04:04 crc kubenswrapper[4919]: I0310 23:04:04.787129 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9f277f7-0813-4203-bf70-9fb67673689e-kube-api-access-r22l5" (OuterVolumeSpecName: "kube-api-access-r22l5") pod "e9f277f7-0813-4203-bf70-9fb67673689e" (UID: "e9f277f7-0813-4203-bf70-9fb67673689e"). InnerVolumeSpecName "kube-api-access-r22l5". 
PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 23:04:04 crc kubenswrapper[4919]: I0310 23:04:04.883081 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r22l5\" (UniqueName: \"kubernetes.io/projected/e9f277f7-0813-4203-bf70-9fb67673689e-kube-api-access-r22l5\") on node \"crc\" DevicePath \"\""
Mar 10 23:04:05 crc kubenswrapper[4919]: I0310 23:04:05.390644 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553064-mshxx" event={"ID":"e9f277f7-0813-4203-bf70-9fb67673689e","Type":"ContainerDied","Data":"bd986fe795d3b030b04c341dd069901bdca00f2189baac7c0c3d2b603c3c316a"}
Mar 10 23:04:05 crc kubenswrapper[4919]: I0310 23:04:05.390687 4919 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bd986fe795d3b030b04c341dd069901bdca00f2189baac7c0c3d2b603c3c316a"
Mar 10 23:04:05 crc kubenswrapper[4919]: I0310 23:04:05.390667 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553064-mshxx"
Mar 10 23:04:05 crc kubenswrapper[4919]: I0310 23:04:05.449858 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553058-f7p8l"]
Mar 10 23:04:05 crc kubenswrapper[4919]: I0310 23:04:05.454918 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553058-f7p8l"]
Mar 10 23:04:05 crc kubenswrapper[4919]: I0310 23:04:05.487815 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="365d2b3f-3786-49df-937f-b35e69aca426" path="/var/lib/kubelet/pods/365d2b3f-3786-49df-937f-b35e69aca426/volumes"
Mar 10 23:04:11 crc kubenswrapper[4919]: I0310 23:04:11.479409 4919 scope.go:117] "RemoveContainer" containerID="46d1affa4c8b3ba64d85713f9411fabf43bf277ed9af6d487847e1f0fe2d5cc6"
Mar 10 23:04:11 crc kubenswrapper[4919]: E0310 23:04:11.480161 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc"
Mar 10 23:04:26 crc kubenswrapper[4919]: I0310 23:04:26.480408 4919 scope.go:117] "RemoveContainer" containerID="46d1affa4c8b3ba64d85713f9411fabf43bf277ed9af6d487847e1f0fe2d5cc6"
Mar 10 23:04:26 crc kubenswrapper[4919]: E0310 23:04:26.481598 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc"
Mar 10 23:04:40 crc kubenswrapper[4919]: I0310 23:04:40.480681 4919 scope.go:117] "RemoveContainer" containerID="46d1affa4c8b3ba64d85713f9411fabf43bf277ed9af6d487847e1f0fe2d5cc6"
Mar 10 23:04:40 crc kubenswrapper[4919]: E0310 23:04:40.482574 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc"
Mar 10 23:04:45 crc kubenswrapper[4919]: I0310 23:04:45.960492 4919 scope.go:117] "RemoveContainer" containerID="11256ff6e4405d720c8472d5ed25a2f1c54da1f60ee12ae4580e7dc869b895c7"
Mar 10 23:04:55 crc kubenswrapper[4919]: I0310 23:04:55.480553 4919 scope.go:117] "RemoveContainer" containerID="46d1affa4c8b3ba64d85713f9411fabf43bf277ed9af6d487847e1f0fe2d5cc6"
Mar 10 23:04:55 crc kubenswrapper[4919]: E0310 23:04:55.483330 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc"
Mar 10 23:05:08 crc kubenswrapper[4919]: I0310 23:05:08.480481 4919 scope.go:117] "RemoveContainer" containerID="46d1affa4c8b3ba64d85713f9411fabf43bf277ed9af6d487847e1f0fe2d5cc6"
Mar 10 23:05:08 crc kubenswrapper[4919]: E0310 23:05:08.481464 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc"
Mar 10 23:05:20 crc kubenswrapper[4919]: I0310 23:05:20.480299 4919 scope.go:117] "RemoveContainer" containerID="46d1affa4c8b3ba64d85713f9411fabf43bf277ed9af6d487847e1f0fe2d5cc6"
Mar 10 23:05:20 crc kubenswrapper[4919]: E0310 23:05:20.481430 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc"
Mar 10 23:05:34 crc kubenswrapper[4919]: I0310 23:05:34.479952 4919 scope.go:117] "RemoveContainer" containerID="46d1affa4c8b3ba64d85713f9411fabf43bf277ed9af6d487847e1f0fe2d5cc6"
Mar 10 23:05:34 crc kubenswrapper[4919]: E0310 23:05:34.480691 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc"
Mar 10 23:05:49 crc kubenswrapper[4919]: I0310 23:05:49.480198 4919 scope.go:117] "RemoveContainer" containerID="46d1affa4c8b3ba64d85713f9411fabf43bf277ed9af6d487847e1f0fe2d5cc6"
Mar 10 23:05:49 crc kubenswrapper[4919]: E0310 23:05:49.481174 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc"
Mar 10 23:06:00 crc kubenswrapper[4919]: I0310 23:06:00.143327 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553066-6ks5t"]
Mar 10 23:06:00 crc kubenswrapper[4919]: E0310 23:06:00.144177 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9f277f7-0813-4203-bf70-9fb67673689e" containerName="oc"
Mar 10 23:06:00 crc kubenswrapper[4919]: I0310 23:06:00.144191 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9f277f7-0813-4203-bf70-9fb67673689e" containerName="oc"
Mar 10 23:06:00 crc kubenswrapper[4919]: I0310 23:06:00.144371 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9f277f7-0813-4203-bf70-9fb67673689e" containerName="oc"
Mar 10 23:06:00 crc kubenswrapper[4919]: I0310 23:06:00.144898 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553066-6ks5t"
Mar 10 23:06:00 crc kubenswrapper[4919]: I0310 23:06:00.149475 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wbvtv"
Mar 10 23:06:00 crc kubenswrapper[4919]: I0310 23:06:00.149687 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 10 23:06:00 crc kubenswrapper[4919]: I0310 23:06:00.149713 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 10 23:06:00 crc kubenswrapper[4919]: I0310 23:06:00.159784 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553066-6ks5t"]
Mar 10 23:06:00 crc kubenswrapper[4919]: I0310 23:06:00.163182 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dg9st\" (UniqueName: \"kubernetes.io/projected/e3d2345c-1f62-493a-80e7-3d6db5051332-kube-api-access-dg9st\") pod \"auto-csr-approver-29553066-6ks5t\" (UID: \"e3d2345c-1f62-493a-80e7-3d6db5051332\") " pod="openshift-infra/auto-csr-approver-29553066-6ks5t"
Mar 10 23:06:00 crc kubenswrapper[4919]: I0310 23:06:00.263752 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dg9st\" (UniqueName: \"kubernetes.io/projected/e3d2345c-1f62-493a-80e7-3d6db5051332-kube-api-access-dg9st\") pod \"auto-csr-approver-29553066-6ks5t\" (UID: \"e3d2345c-1f62-493a-80e7-3d6db5051332\") " pod="openshift-infra/auto-csr-approver-29553066-6ks5t"
Mar 10 23:06:00 crc kubenswrapper[4919]: I0310 23:06:00.292166 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dg9st\" (UniqueName: \"kubernetes.io/projected/e3d2345c-1f62-493a-80e7-3d6db5051332-kube-api-access-dg9st\") pod \"auto-csr-approver-29553066-6ks5t\" (UID: \"e3d2345c-1f62-493a-80e7-3d6db5051332\") " pod="openshift-infra/auto-csr-approver-29553066-6ks5t"
Mar 10 23:06:00 crc kubenswrapper[4919]: I0310 23:06:00.477962 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553066-6ks5t"
Mar 10 23:06:00 crc kubenswrapper[4919]: I0310 23:06:00.947470 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553066-6ks5t"]
Mar 10 23:06:01 crc kubenswrapper[4919]: I0310 23:06:01.298072 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553066-6ks5t" event={"ID":"e3d2345c-1f62-493a-80e7-3d6db5051332","Type":"ContainerStarted","Data":"2eae5feb457d72e90db5771ccb416d3daa53463cdb017ad80150ce5acbc5132d"}
Mar 10 23:06:03 crc kubenswrapper[4919]: I0310 23:06:03.324432 4919 generic.go:334] "Generic (PLEG): container finished" podID="e3d2345c-1f62-493a-80e7-3d6db5051332" containerID="87bb54e220785a904603c96f9e6b9e0a04753b2a85ea87b9d6040acf8421b5b9" exitCode=0
Mar 10 23:06:03 crc kubenswrapper[4919]: I0310 23:06:03.324484 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553066-6ks5t" event={"ID":"e3d2345c-1f62-493a-80e7-3d6db5051332","Type":"ContainerDied","Data":"87bb54e220785a904603c96f9e6b9e0a04753b2a85ea87b9d6040acf8421b5b9"}
Mar 10 23:06:03 crc kubenswrapper[4919]: I0310 23:06:03.486070 4919 scope.go:117] "RemoveContainer" containerID="46d1affa4c8b3ba64d85713f9411fabf43bf277ed9af6d487847e1f0fe2d5cc6"
Mar 10 23:06:03 crc kubenswrapper[4919]: E0310 23:06:03.486493 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc"
Mar 10 23:06:04 crc kubenswrapper[4919]: I0310 23:06:04.626961 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553066-6ks5t"
Mar 10 23:06:04 crc kubenswrapper[4919]: I0310 23:06:04.824681 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dg9st\" (UniqueName: \"kubernetes.io/projected/e3d2345c-1f62-493a-80e7-3d6db5051332-kube-api-access-dg9st\") pod \"e3d2345c-1f62-493a-80e7-3d6db5051332\" (UID: \"e3d2345c-1f62-493a-80e7-3d6db5051332\") "
Mar 10 23:06:04 crc kubenswrapper[4919]: I0310 23:06:04.829918 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3d2345c-1f62-493a-80e7-3d6db5051332-kube-api-access-dg9st" (OuterVolumeSpecName: "kube-api-access-dg9st") pod "e3d2345c-1f62-493a-80e7-3d6db5051332" (UID: "e3d2345c-1f62-493a-80e7-3d6db5051332"). InnerVolumeSpecName "kube-api-access-dg9st". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 23:06:04 crc kubenswrapper[4919]: I0310 23:06:04.927490 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dg9st\" (UniqueName: \"kubernetes.io/projected/e3d2345c-1f62-493a-80e7-3d6db5051332-kube-api-access-dg9st\") on node \"crc\" DevicePath \"\""
Mar 10 23:06:05 crc kubenswrapper[4919]: I0310 23:06:05.340668 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553066-6ks5t" event={"ID":"e3d2345c-1f62-493a-80e7-3d6db5051332","Type":"ContainerDied","Data":"2eae5feb457d72e90db5771ccb416d3daa53463cdb017ad80150ce5acbc5132d"}
Mar 10 23:06:05 crc kubenswrapper[4919]: I0310 23:06:05.341009 4919 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2eae5feb457d72e90db5771ccb416d3daa53463cdb017ad80150ce5acbc5132d"
Mar 10 23:06:05 crc kubenswrapper[4919]: I0310 23:06:05.340767 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553066-6ks5t"
Mar 10 23:06:05 crc kubenswrapper[4919]: I0310 23:06:05.445584 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553060-jbjdx"]
Mar 10 23:06:05 crc kubenswrapper[4919]: I0310 23:06:05.449993 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553060-jbjdx"]
Mar 10 23:06:05 crc kubenswrapper[4919]: I0310 23:06:05.487264 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53394bb5-86c2-4638-9926-0f1418a4e741" path="/var/lib/kubelet/pods/53394bb5-86c2-4638-9926-0f1418a4e741/volumes"
Mar 10 23:06:17 crc kubenswrapper[4919]: I0310 23:06:17.480106 4919 scope.go:117] "RemoveContainer" containerID="46d1affa4c8b3ba64d85713f9411fabf43bf277ed9af6d487847e1f0fe2d5cc6"
Mar 10 23:06:17 crc kubenswrapper[4919]: E0310 23:06:17.482001 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc"
Mar 10 23:06:29 crc kubenswrapper[4919]: I0310 23:06:29.480138 4919 scope.go:117] "RemoveContainer" containerID="46d1affa4c8b3ba64d85713f9411fabf43bf277ed9af6d487847e1f0fe2d5cc6"
Mar 10 23:06:30 crc kubenswrapper[4919]: I0310 23:06:30.570113 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" event={"ID":"566678d1-f416-4116-ab20-b30dceb86cdc","Type":"ContainerStarted","Data":"c77b50dfa88f3a5900cce9090fa8f1f418ddf2416d171d225eb1d604a613c491"}
Mar 10 23:06:39 crc kubenswrapper[4919]: I0310 23:06:39.510486 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-pvldf"]
Mar 10 23:06:39 crc kubenswrapper[4919]: E0310 23:06:39.514808 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3d2345c-1f62-493a-80e7-3d6db5051332" containerName="oc"
Mar 10 23:06:39 crc kubenswrapper[4919]: I0310 23:06:39.514876 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3d2345c-1f62-493a-80e7-3d6db5051332" containerName="oc"
Mar 10 23:06:39 crc kubenswrapper[4919]: I0310 23:06:39.515230 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3d2345c-1f62-493a-80e7-3d6db5051332" containerName="oc"
Mar 10 23:06:39 crc kubenswrapper[4919]: I0310 23:06:39.517014 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pvldf"
Mar 10 23:06:39 crc kubenswrapper[4919]: I0310 23:06:39.547182 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-pvldf"]
Mar 10 23:06:39 crc kubenswrapper[4919]: I0310 23:06:39.608497 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae79d332-92d8-44cc-a5b3-2d8d31e37728-catalog-content\") pod \"redhat-operators-pvldf\" (UID: \"ae79d332-92d8-44cc-a5b3-2d8d31e37728\") " pod="openshift-marketplace/redhat-operators-pvldf"
Mar 10 23:06:39 crc kubenswrapper[4919]: I0310 23:06:39.608585 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae79d332-92d8-44cc-a5b3-2d8d31e37728-utilities\") pod \"redhat-operators-pvldf\" (UID: \"ae79d332-92d8-44cc-a5b3-2d8d31e37728\") " pod="openshift-marketplace/redhat-operators-pvldf"
Mar 10 23:06:39 crc kubenswrapper[4919]: I0310 23:06:39.608602 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5mw8\" (UniqueName: \"kubernetes.io/projected/ae79d332-92d8-44cc-a5b3-2d8d31e37728-kube-api-access-f5mw8\") pod \"redhat-operators-pvldf\" (UID: \"ae79d332-92d8-44cc-a5b3-2d8d31e37728\") " pod="openshift-marketplace/redhat-operators-pvldf"
Mar 10 23:06:39 crc kubenswrapper[4919]: I0310 23:06:39.710280 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae79d332-92d8-44cc-a5b3-2d8d31e37728-catalog-content\") pod \"redhat-operators-pvldf\" (UID: \"ae79d332-92d8-44cc-a5b3-2d8d31e37728\") " pod="openshift-marketplace/redhat-operators-pvldf"
Mar 10 23:06:39 crc kubenswrapper[4919]: I0310 23:06:39.710379 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae79d332-92d8-44cc-a5b3-2d8d31e37728-utilities\") pod \"redhat-operators-pvldf\" (UID: \"ae79d332-92d8-44cc-a5b3-2d8d31e37728\") " pod="openshift-marketplace/redhat-operators-pvldf"
Mar 10 23:06:39 crc kubenswrapper[4919]: I0310 23:06:39.710426 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f5mw8\" (UniqueName: \"kubernetes.io/projected/ae79d332-92d8-44cc-a5b3-2d8d31e37728-kube-api-access-f5mw8\") pod \"redhat-operators-pvldf\" (UID: \"ae79d332-92d8-44cc-a5b3-2d8d31e37728\") " pod="openshift-marketplace/redhat-operators-pvldf"
Mar 10 23:06:39 crc kubenswrapper[4919]: I0310 23:06:39.710814 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae79d332-92d8-44cc-a5b3-2d8d31e37728-catalog-content\") pod \"redhat-operators-pvldf\" (UID: \"ae79d332-92d8-44cc-a5b3-2d8d31e37728\") " pod="openshift-marketplace/redhat-operators-pvldf"
Mar 10 23:06:39 crc kubenswrapper[4919]: I0310 23:06:39.710966 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae79d332-92d8-44cc-a5b3-2d8d31e37728-utilities\") pod \"redhat-operators-pvldf\" (UID: \"ae79d332-92d8-44cc-a5b3-2d8d31e37728\") " pod="openshift-marketplace/redhat-operators-pvldf"
Mar 10 23:06:39 crc kubenswrapper[4919]: I0310 23:06:39.742809 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f5mw8\" (UniqueName: \"kubernetes.io/projected/ae79d332-92d8-44cc-a5b3-2d8d31e37728-kube-api-access-f5mw8\") pod \"redhat-operators-pvldf\" (UID: \"ae79d332-92d8-44cc-a5b3-2d8d31e37728\") " pod="openshift-marketplace/redhat-operators-pvldf"
Mar 10 23:06:39 crc kubenswrapper[4919]: I0310 23:06:39.888661 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pvldf"
Mar 10 23:06:40 crc kubenswrapper[4919]: I0310 23:06:40.623943 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-pvldf"]
Mar 10 23:06:40 crc kubenswrapper[4919]: I0310 23:06:40.645003 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pvldf" event={"ID":"ae79d332-92d8-44cc-a5b3-2d8d31e37728","Type":"ContainerStarted","Data":"0a7adff7d50d698f22798d23a66cae0f5fcade23f99fc0f119d69db841d8277e"}
Mar 10 23:06:41 crc kubenswrapper[4919]: I0310 23:06:41.659151 4919 generic.go:334] "Generic (PLEG): container finished" podID="ae79d332-92d8-44cc-a5b3-2d8d31e37728" containerID="7e25fa2a7b953b9c38cfe9eb08e123fcb32cf0bf718d3b8867f5df49fa95eaa2" exitCode=0
Mar 10 23:06:41 crc kubenswrapper[4919]: I0310 23:06:41.659219 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pvldf" event={"ID":"ae79d332-92d8-44cc-a5b3-2d8d31e37728","Type":"ContainerDied","Data":"7e25fa2a7b953b9c38cfe9eb08e123fcb32cf0bf718d3b8867f5df49fa95eaa2"}
Mar 10 23:06:42 crc kubenswrapper[4919]: I0310 23:06:42.667265 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pvldf" event={"ID":"ae79d332-92d8-44cc-a5b3-2d8d31e37728","Type":"ContainerStarted","Data":"65353021a14cfebc8596e8c787effb77fe916a3d46e6a83b02e726a25882f054"}
Mar 10 23:06:43 crc kubenswrapper[4919]: I0310 23:06:43.679553 4919 generic.go:334] "Generic (PLEG): container finished" podID="ae79d332-92d8-44cc-a5b3-2d8d31e37728" containerID="65353021a14cfebc8596e8c787effb77fe916a3d46e6a83b02e726a25882f054" exitCode=0
Mar 10 23:06:43 crc kubenswrapper[4919]: I0310 23:06:43.679600 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pvldf" event={"ID":"ae79d332-92d8-44cc-a5b3-2d8d31e37728","Type":"ContainerDied","Data":"65353021a14cfebc8596e8c787effb77fe916a3d46e6a83b02e726a25882f054"}
Mar 10 23:06:44 crc kubenswrapper[4919]: I0310 23:06:44.691522 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pvldf" event={"ID":"ae79d332-92d8-44cc-a5b3-2d8d31e37728","Type":"ContainerStarted","Data":"b829a75023e63c772a3d4e6e588cdd7014337f5685c1a60051c3612d1d28d9ec"}
Mar 10 23:06:44 crc kubenswrapper[4919]: I0310 23:06:44.716066 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-pvldf" podStartSLOduration=3.304663008 podStartE2EDuration="5.716050357s" podCreationTimestamp="2026-03-10 23:06:39 +0000 UTC" firstStartedPulling="2026-03-10 23:06:41.662028647 +0000 UTC m=+4588.903909255" lastFinishedPulling="2026-03-10 23:06:44.073415976 +0000 UTC m=+4591.315296604" observedRunningTime="2026-03-10 23:06:44.715610386 +0000 UTC m=+4591.957490994" watchObservedRunningTime="2026-03-10 23:06:44.716050357 +0000 UTC m=+4591.957930965"
Mar 10 23:06:46 crc kubenswrapper[4919]: I0310 23:06:46.075809 4919 scope.go:117] "RemoveContainer" containerID="b220d44277abe0c9192aa1f257399bda3df2fe47eed1000ca768116e09c8cfcd"
Mar 10 23:06:49 crc kubenswrapper[4919]: I0310 23:06:49.889361 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-pvldf"
Mar 10 23:06:49 crc kubenswrapper[4919]: I0310 23:06:49.889969 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-pvldf"
Mar 10 23:06:50 crc kubenswrapper[4919]: I0310 23:06:50.956539 4919 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-pvldf" podUID="ae79d332-92d8-44cc-a5b3-2d8d31e37728" containerName="registry-server" probeResult="failure" output=<
Mar 10 23:06:50 crc kubenswrapper[4919]: 	timeout: failed to connect service ":50051" within 1s
Mar 10 23:06:50 crc kubenswrapper[4919]: >
Mar 10 23:07:00 crc kubenswrapper[4919]: I0310 23:07:00.345856 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-pvldf"
Mar 10 23:07:00 crc kubenswrapper[4919]: I0310 23:07:00.422000 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-pvldf"
Mar 10 23:07:00 crc kubenswrapper[4919]: I0310 23:07:00.592212 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-pvldf"]
Mar 10 23:07:01 crc kubenswrapper[4919]: I0310 23:07:01.820211 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-pvldf" podUID="ae79d332-92d8-44cc-a5b3-2d8d31e37728" containerName="registry-server" containerID="cri-o://b829a75023e63c772a3d4e6e588cdd7014337f5685c1a60051c3612d1d28d9ec" gracePeriod=2
Mar 10 23:07:02 crc kubenswrapper[4919]: I0310 23:07:02.830044 4919 generic.go:334] "Generic (PLEG): container finished" podID="ae79d332-92d8-44cc-a5b3-2d8d31e37728" containerID="b829a75023e63c772a3d4e6e588cdd7014337f5685c1a60051c3612d1d28d9ec" exitCode=0
Mar 10 23:07:02 crc kubenswrapper[4919]: I0310 23:07:02.830235 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pvldf" event={"ID":"ae79d332-92d8-44cc-a5b3-2d8d31e37728","Type":"ContainerDied","Data":"b829a75023e63c772a3d4e6e588cdd7014337f5685c1a60051c3612d1d28d9ec"}
Mar 10 23:07:02 crc kubenswrapper[4919]: I0310 23:07:02.903146 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pvldf"
Mar 10 23:07:03 crc kubenswrapper[4919]: I0310 23:07:03.043525 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f5mw8\" (UniqueName: \"kubernetes.io/projected/ae79d332-92d8-44cc-a5b3-2d8d31e37728-kube-api-access-f5mw8\") pod \"ae79d332-92d8-44cc-a5b3-2d8d31e37728\" (UID: \"ae79d332-92d8-44cc-a5b3-2d8d31e37728\") "
Mar 10 23:07:03 crc kubenswrapper[4919]: I0310 23:07:03.043662 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae79d332-92d8-44cc-a5b3-2d8d31e37728-catalog-content\") pod \"ae79d332-92d8-44cc-a5b3-2d8d31e37728\" (UID: \"ae79d332-92d8-44cc-a5b3-2d8d31e37728\") "
Mar 10 23:07:03 crc kubenswrapper[4919]: I0310 23:07:03.044671 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae79d332-92d8-44cc-a5b3-2d8d31e37728-utilities\") pod \"ae79d332-92d8-44cc-a5b3-2d8d31e37728\" (UID: \"ae79d332-92d8-44cc-a5b3-2d8d31e37728\") "
Mar 10 23:07:03 crc kubenswrapper[4919]: I0310 23:07:03.045344 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ae79d332-92d8-44cc-a5b3-2d8d31e37728-utilities" (OuterVolumeSpecName: "utilities") pod "ae79d332-92d8-44cc-a5b3-2d8d31e37728" (UID: "ae79d332-92d8-44cc-a5b3-2d8d31e37728"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 23:07:03 crc kubenswrapper[4919]: I0310 23:07:03.048448 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae79d332-92d8-44cc-a5b3-2d8d31e37728-kube-api-access-f5mw8" (OuterVolumeSpecName: "kube-api-access-f5mw8") pod "ae79d332-92d8-44cc-a5b3-2d8d31e37728" (UID: "ae79d332-92d8-44cc-a5b3-2d8d31e37728"). InnerVolumeSpecName "kube-api-access-f5mw8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 23:07:03 crc kubenswrapper[4919]: I0310 23:07:03.145893 4919 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae79d332-92d8-44cc-a5b3-2d8d31e37728-utilities\") on node \"crc\" DevicePath \"\""
Mar 10 23:07:03 crc kubenswrapper[4919]: I0310 23:07:03.145925 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f5mw8\" (UniqueName: \"kubernetes.io/projected/ae79d332-92d8-44cc-a5b3-2d8d31e37728-kube-api-access-f5mw8\") on node \"crc\" DevicePath \"\""
Mar 10 23:07:03 crc kubenswrapper[4919]: I0310 23:07:03.187810 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ae79d332-92d8-44cc-a5b3-2d8d31e37728-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ae79d332-92d8-44cc-a5b3-2d8d31e37728" (UID: "ae79d332-92d8-44cc-a5b3-2d8d31e37728"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 23:07:03 crc kubenswrapper[4919]: I0310 23:07:03.246802 4919 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae79d332-92d8-44cc-a5b3-2d8d31e37728-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 10 23:07:03 crc kubenswrapper[4919]: I0310 23:07:03.842120 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pvldf" event={"ID":"ae79d332-92d8-44cc-a5b3-2d8d31e37728","Type":"ContainerDied","Data":"0a7adff7d50d698f22798d23a66cae0f5fcade23f99fc0f119d69db841d8277e"}
Mar 10 23:07:03 crc kubenswrapper[4919]: I0310 23:07:03.842185 4919 scope.go:117] "RemoveContainer" containerID="b829a75023e63c772a3d4e6e588cdd7014337f5685c1a60051c3612d1d28d9ec"
Mar 10 23:07:03 crc kubenswrapper[4919]: I0310 23:07:03.842219 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pvldf"
Mar 10 23:07:03 crc kubenswrapper[4919]: I0310 23:07:03.872595 4919 scope.go:117] "RemoveContainer" containerID="65353021a14cfebc8596e8c787effb77fe916a3d46e6a83b02e726a25882f054"
Mar 10 23:07:03 crc kubenswrapper[4919]: I0310 23:07:03.879746 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-pvldf"]
Mar 10 23:07:03 crc kubenswrapper[4919]: I0310 23:07:03.888938 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-pvldf"]
Mar 10 23:07:03 crc kubenswrapper[4919]: I0310 23:07:03.906956 4919 scope.go:117] "RemoveContainer" containerID="7e25fa2a7b953b9c38cfe9eb08e123fcb32cf0bf718d3b8867f5df49fa95eaa2"
Mar 10 23:07:05 crc kubenswrapper[4919]: I0310 23:07:05.491447 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae79d332-92d8-44cc-a5b3-2d8d31e37728" path="/var/lib/kubelet/pods/ae79d332-92d8-44cc-a5b3-2d8d31e37728/volumes"
Mar 10 23:08:00 crc kubenswrapper[4919]: I0310 23:08:00.160019 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553068-kxqfj"]
Mar 10 23:08:00 crc kubenswrapper[4919]: E0310 23:08:00.161069 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae79d332-92d8-44cc-a5b3-2d8d31e37728" containerName="extract-content"
Mar 10 23:08:00 crc kubenswrapper[4919]: I0310 23:08:00.161089 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae79d332-92d8-44cc-a5b3-2d8d31e37728" containerName="extract-content"
Mar 10 23:08:00 crc kubenswrapper[4919]: E0310 23:08:00.161141 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae79d332-92d8-44cc-a5b3-2d8d31e37728" containerName="registry-server"
Mar 10 23:08:00 crc kubenswrapper[4919]: I0310 23:08:00.161152 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae79d332-92d8-44cc-a5b3-2d8d31e37728" containerName="registry-server"
Mar 10 23:08:00 crc kubenswrapper[4919]: E0310 23:08:00.161170 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae79d332-92d8-44cc-a5b3-2d8d31e37728" containerName="extract-utilities"
Mar 10 23:08:00 crc kubenswrapper[4919]: I0310 23:08:00.161180 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae79d332-92d8-44cc-a5b3-2d8d31e37728" containerName="extract-utilities"
Mar 10 23:08:00 crc kubenswrapper[4919]: I0310 23:08:00.161394 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae79d332-92d8-44cc-a5b3-2d8d31e37728" containerName="registry-server"
Mar 10 23:08:00 crc kubenswrapper[4919]: I0310 23:08:00.162148 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553068-kxqfj"
Mar 10 23:08:00 crc kubenswrapper[4919]: I0310 23:08:00.167585 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wbvtv"
Mar 10 23:08:00 crc kubenswrapper[4919]: I0310 23:08:00.167494 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 10 23:08:00 crc kubenswrapper[4919]: I0310 23:08:00.168589 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 10 23:08:00 crc kubenswrapper[4919]: I0310 23:08:00.209810 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qw9l4\" (UniqueName: \"kubernetes.io/projected/5104d1b8-7a21-4c03-9fe5-0c1603184314-kube-api-access-qw9l4\") pod \"auto-csr-approver-29553068-kxqfj\" (UID: \"5104d1b8-7a21-4c03-9fe5-0c1603184314\") " pod="openshift-infra/auto-csr-approver-29553068-kxqfj"
Mar 10 23:08:00 crc kubenswrapper[4919]: I0310 23:08:00.210332 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553068-kxqfj"]
Mar 10 23:08:00 crc kubenswrapper[4919]: I0310 23:08:00.311372 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qw9l4\" (UniqueName: \"kubernetes.io/projected/5104d1b8-7a21-4c03-9fe5-0c1603184314-kube-api-access-qw9l4\") pod \"auto-csr-approver-29553068-kxqfj\" (UID: \"5104d1b8-7a21-4c03-9fe5-0c1603184314\") " pod="openshift-infra/auto-csr-approver-29553068-kxqfj"
Mar 10 23:08:00 crc kubenswrapper[4919]: I0310 23:08:00.335433 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qw9l4\" (UniqueName: \"kubernetes.io/projected/5104d1b8-7a21-4c03-9fe5-0c1603184314-kube-api-access-qw9l4\") pod \"auto-csr-approver-29553068-kxqfj\" (UID: \"5104d1b8-7a21-4c03-9fe5-0c1603184314\") " pod="openshift-infra/auto-csr-approver-29553068-kxqfj"
Mar 10 23:08:00 crc kubenswrapper[4919]: I0310 23:08:00.520054 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553068-kxqfj"
Mar 10 23:08:00 crc kubenswrapper[4919]: I0310 23:08:00.947308 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553068-kxqfj"]
Mar 10 23:08:01 crc kubenswrapper[4919]: I0310 23:08:01.286020 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553068-kxqfj" event={"ID":"5104d1b8-7a21-4c03-9fe5-0c1603184314","Type":"ContainerStarted","Data":"46f45a7a3c3ae90cb54201938493b604094aac57def4960ffabd02e74e5fc28b"}
Mar 10 23:08:02 crc kubenswrapper[4919]: I0310 23:08:02.294224 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553068-kxqfj" event={"ID":"5104d1b8-7a21-4c03-9fe5-0c1603184314","Type":"ContainerStarted","Data":"eeca2a7898f2ae11e8e319b59e6ff80256559b0394751354bd8f5e008da20524"}
Mar 10 23:08:02 crc kubenswrapper[4919]: I0310 23:08:02.312908 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29553068-kxqfj" podStartSLOduration=1.359190382 podStartE2EDuration="2.31289258s" podCreationTimestamp="2026-03-10 23:08:00 +0000 UTC" firstStartedPulling="2026-03-10 23:08:00.957913517 +0000 UTC m=+4668.199794145" lastFinishedPulling="2026-03-10 23:08:01.911615725 +0000 UTC m=+4669.153496343" observedRunningTime="2026-03-10 23:08:02.310065023 +0000 UTC m=+4669.551945631" watchObservedRunningTime="2026-03-10 23:08:02.31289258 +0000 UTC m=+4669.554773188"
Mar 10 23:08:03 crc kubenswrapper[4919]: I0310 23:08:03.332183 4919 generic.go:334] "Generic (PLEG): container finished" podID="5104d1b8-7a21-4c03-9fe5-0c1603184314" containerID="eeca2a7898f2ae11e8e319b59e6ff80256559b0394751354bd8f5e008da20524" exitCode=0
Mar 10 23:08:03 crc kubenswrapper[4919]: I0310 23:08:03.332241 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553068-kxqfj" event={"ID":"5104d1b8-7a21-4c03-9fe5-0c1603184314","Type":"ContainerDied","Data":"eeca2a7898f2ae11e8e319b59e6ff80256559b0394751354bd8f5e008da20524"}
Mar 10 23:08:04 crc kubenswrapper[4919]: I0310 23:08:04.623313 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553068-kxqfj"
Mar 10 23:08:04 crc kubenswrapper[4919]: I0310 23:08:04.671816 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qw9l4\" (UniqueName: \"kubernetes.io/projected/5104d1b8-7a21-4c03-9fe5-0c1603184314-kube-api-access-qw9l4\") pod \"5104d1b8-7a21-4c03-9fe5-0c1603184314\" (UID: \"5104d1b8-7a21-4c03-9fe5-0c1603184314\") "
Mar 10 23:08:04 crc kubenswrapper[4919]: I0310 23:08:04.677050 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5104d1b8-7a21-4c03-9fe5-0c1603184314-kube-api-access-qw9l4" (OuterVolumeSpecName: "kube-api-access-qw9l4") pod "5104d1b8-7a21-4c03-9fe5-0c1603184314" (UID: "5104d1b8-7a21-4c03-9fe5-0c1603184314"). InnerVolumeSpecName "kube-api-access-qw9l4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 23:08:04 crc kubenswrapper[4919]: I0310 23:08:04.773939 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qw9l4\" (UniqueName: \"kubernetes.io/projected/5104d1b8-7a21-4c03-9fe5-0c1603184314-kube-api-access-qw9l4\") on node \"crc\" DevicePath \"\""
Mar 10 23:08:05 crc kubenswrapper[4919]: I0310 23:08:05.354197 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553068-kxqfj" event={"ID":"5104d1b8-7a21-4c03-9fe5-0c1603184314","Type":"ContainerDied","Data":"46f45a7a3c3ae90cb54201938493b604094aac57def4960ffabd02e74e5fc28b"}
Mar 10 23:08:05 crc kubenswrapper[4919]: I0310 23:08:05.354253 4919 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="46f45a7a3c3ae90cb54201938493b604094aac57def4960ffabd02e74e5fc28b"
Mar 10 23:08:05 crc kubenswrapper[4919]: I0310 23:08:05.354304 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553068-kxqfj"
Mar 10 23:08:05 crc kubenswrapper[4919]: I0310 23:08:05.384343 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553062-nvgz2"]
Mar 10 23:08:05 crc kubenswrapper[4919]: I0310 23:08:05.391530 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553062-nvgz2"]
Mar 10 23:08:05 crc kubenswrapper[4919]: I0310 23:08:05.491786 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69874376-749b-4152-ac36-9ea6f1aba654" path="/var/lib/kubelet/pods/69874376-749b-4152-ac36-9ea6f1aba654/volumes"
Mar 10 23:08:29 crc kubenswrapper[4919]: I0310 23:08:29.176061 4919 patch_prober.go:28] interesting pod/machine-config-daemon-z7v4t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect:
connection refused" start-of-body= Mar 10 23:08:29 crc kubenswrapper[4919]: I0310 23:08:29.176826 4919 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 23:08:46 crc kubenswrapper[4919]: I0310 23:08:46.169331 4919 scope.go:117] "RemoveContainer" containerID="922c427fdbc7bb4c150df35b50714bcd3ce5ec47fb82b25c0ddabe636da29b5f" Mar 10 23:08:59 crc kubenswrapper[4919]: I0310 23:08:59.175608 4919 patch_prober.go:28] interesting pod/machine-config-daemon-z7v4t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 23:08:59 crc kubenswrapper[4919]: I0310 23:08:59.176163 4919 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 23:09:29 crc kubenswrapper[4919]: I0310 23:09:29.175799 4919 patch_prober.go:28] interesting pod/machine-config-daemon-z7v4t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 23:09:29 crc kubenswrapper[4919]: I0310 23:09:29.176596 4919 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" containerName="machine-config-daemon" 
probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 23:09:29 crc kubenswrapper[4919]: I0310 23:09:29.176676 4919 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" Mar 10 23:09:29 crc kubenswrapper[4919]: I0310 23:09:29.177583 4919 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c77b50dfa88f3a5900cce9090fa8f1f418ddf2416d171d225eb1d604a613c491"} pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 10 23:09:29 crc kubenswrapper[4919]: I0310 23:09:29.177683 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" containerName="machine-config-daemon" containerID="cri-o://c77b50dfa88f3a5900cce9090fa8f1f418ddf2416d171d225eb1d604a613c491" gracePeriod=600 Mar 10 23:09:30 crc kubenswrapper[4919]: I0310 23:09:30.005216 4919 generic.go:334] "Generic (PLEG): container finished" podID="566678d1-f416-4116-ab20-b30dceb86cdc" containerID="c77b50dfa88f3a5900cce9090fa8f1f418ddf2416d171d225eb1d604a613c491" exitCode=0 Mar 10 23:09:30 crc kubenswrapper[4919]: I0310 23:09:30.005326 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" event={"ID":"566678d1-f416-4116-ab20-b30dceb86cdc","Type":"ContainerDied","Data":"c77b50dfa88f3a5900cce9090fa8f1f418ddf2416d171d225eb1d604a613c491"} Mar 10 23:09:30 crc kubenswrapper[4919]: I0310 23:09:30.005559 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" 
event={"ID":"566678d1-f416-4116-ab20-b30dceb86cdc","Type":"ContainerStarted","Data":"db23846019f61e1fa4301ac7d8406453060a07c19e1dddc6fc1714cf596701e8"} Mar 10 23:09:30 crc kubenswrapper[4919]: I0310 23:09:30.005613 4919 scope.go:117] "RemoveContainer" containerID="46d1affa4c8b3ba64d85713f9411fabf43bf277ed9af6d487847e1f0fe2d5cc6" Mar 10 23:10:00 crc kubenswrapper[4919]: I0310 23:10:00.145284 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553070-zsclc"] Mar 10 23:10:00 crc kubenswrapper[4919]: E0310 23:10:00.146180 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5104d1b8-7a21-4c03-9fe5-0c1603184314" containerName="oc" Mar 10 23:10:00 crc kubenswrapper[4919]: I0310 23:10:00.146201 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="5104d1b8-7a21-4c03-9fe5-0c1603184314" containerName="oc" Mar 10 23:10:00 crc kubenswrapper[4919]: I0310 23:10:00.146528 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="5104d1b8-7a21-4c03-9fe5-0c1603184314" containerName="oc" Mar 10 23:10:00 crc kubenswrapper[4919]: I0310 23:10:00.147179 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553070-zsclc" Mar 10 23:10:00 crc kubenswrapper[4919]: I0310 23:10:00.150643 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 23:10:00 crc kubenswrapper[4919]: I0310 23:10:00.150653 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wbvtv" Mar 10 23:10:00 crc kubenswrapper[4919]: I0310 23:10:00.150876 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 23:10:00 crc kubenswrapper[4919]: I0310 23:10:00.184575 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553070-zsclc"] Mar 10 23:10:00 crc kubenswrapper[4919]: I0310 23:10:00.251783 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c7vqv\" (UniqueName: \"kubernetes.io/projected/9bb1dc8b-5874-454e-b465-9b65f7510343-kube-api-access-c7vqv\") pod \"auto-csr-approver-29553070-zsclc\" (UID: \"9bb1dc8b-5874-454e-b465-9b65f7510343\") " pod="openshift-infra/auto-csr-approver-29553070-zsclc" Mar 10 23:10:00 crc kubenswrapper[4919]: I0310 23:10:00.353814 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c7vqv\" (UniqueName: \"kubernetes.io/projected/9bb1dc8b-5874-454e-b465-9b65f7510343-kube-api-access-c7vqv\") pod \"auto-csr-approver-29553070-zsclc\" (UID: \"9bb1dc8b-5874-454e-b465-9b65f7510343\") " pod="openshift-infra/auto-csr-approver-29553070-zsclc" Mar 10 23:10:00 crc kubenswrapper[4919]: I0310 23:10:00.376111 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c7vqv\" (UniqueName: \"kubernetes.io/projected/9bb1dc8b-5874-454e-b465-9b65f7510343-kube-api-access-c7vqv\") pod \"auto-csr-approver-29553070-zsclc\" (UID: \"9bb1dc8b-5874-454e-b465-9b65f7510343\") " 
pod="openshift-infra/auto-csr-approver-29553070-zsclc" Mar 10 23:10:00 crc kubenswrapper[4919]: I0310 23:10:00.488099 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553070-zsclc" Mar 10 23:10:01 crc kubenswrapper[4919]: I0310 23:10:01.057737 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553070-zsclc"] Mar 10 23:10:01 crc kubenswrapper[4919]: I0310 23:10:01.066468 4919 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 10 23:10:01 crc kubenswrapper[4919]: I0310 23:10:01.600337 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553070-zsclc" event={"ID":"9bb1dc8b-5874-454e-b465-9b65f7510343","Type":"ContainerStarted","Data":"7d91aa0392a88dfcf6500635d1cdfcd71a735c3054f87fe08e524a528ba9cc28"} Mar 10 23:10:02 crc kubenswrapper[4919]: I0310 23:10:02.612630 4919 generic.go:334] "Generic (PLEG): container finished" podID="9bb1dc8b-5874-454e-b465-9b65f7510343" containerID="8916826393f6f2a4e50449a00083bea961de7dafcc8cdd0c59556731ca7a7f5b" exitCode=0 Mar 10 23:10:02 crc kubenswrapper[4919]: I0310 23:10:02.612688 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553070-zsclc" event={"ID":"9bb1dc8b-5874-454e-b465-9b65f7510343","Type":"ContainerDied","Data":"8916826393f6f2a4e50449a00083bea961de7dafcc8cdd0c59556731ca7a7f5b"} Mar 10 23:10:03 crc kubenswrapper[4919]: I0310 23:10:03.869366 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553070-zsclc" Mar 10 23:10:03 crc kubenswrapper[4919]: I0310 23:10:03.901714 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c7vqv\" (UniqueName: \"kubernetes.io/projected/9bb1dc8b-5874-454e-b465-9b65f7510343-kube-api-access-c7vqv\") pod \"9bb1dc8b-5874-454e-b465-9b65f7510343\" (UID: \"9bb1dc8b-5874-454e-b465-9b65f7510343\") " Mar 10 23:10:03 crc kubenswrapper[4919]: I0310 23:10:03.908003 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9bb1dc8b-5874-454e-b465-9b65f7510343-kube-api-access-c7vqv" (OuterVolumeSpecName: "kube-api-access-c7vqv") pod "9bb1dc8b-5874-454e-b465-9b65f7510343" (UID: "9bb1dc8b-5874-454e-b465-9b65f7510343"). InnerVolumeSpecName "kube-api-access-c7vqv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 23:10:04 crc kubenswrapper[4919]: I0310 23:10:04.002987 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c7vqv\" (UniqueName: \"kubernetes.io/projected/9bb1dc8b-5874-454e-b465-9b65f7510343-kube-api-access-c7vqv\") on node \"crc\" DevicePath \"\"" Mar 10 23:10:04 crc kubenswrapper[4919]: I0310 23:10:04.625932 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553070-zsclc" event={"ID":"9bb1dc8b-5874-454e-b465-9b65f7510343","Type":"ContainerDied","Data":"7d91aa0392a88dfcf6500635d1cdfcd71a735c3054f87fe08e524a528ba9cc28"} Mar 10 23:10:04 crc kubenswrapper[4919]: I0310 23:10:04.625971 4919 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7d91aa0392a88dfcf6500635d1cdfcd71a735c3054f87fe08e524a528ba9cc28" Mar 10 23:10:04 crc kubenswrapper[4919]: I0310 23:10:04.626044 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553070-zsclc" Mar 10 23:10:04 crc kubenswrapper[4919]: I0310 23:10:04.932475 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553064-mshxx"] Mar 10 23:10:04 crc kubenswrapper[4919]: I0310 23:10:04.937432 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553064-mshxx"] Mar 10 23:10:05 crc kubenswrapper[4919]: I0310 23:10:05.487925 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e9f277f7-0813-4203-bf70-9fb67673689e" path="/var/lib/kubelet/pods/e9f277f7-0813-4203-bf70-9fb67673689e/volumes" Mar 10 23:10:30 crc kubenswrapper[4919]: I0310 23:10:30.105670 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-nvgms"] Mar 10 23:10:30 crc kubenswrapper[4919]: E0310 23:10:30.106771 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9bb1dc8b-5874-454e-b465-9b65f7510343" containerName="oc" Mar 10 23:10:30 crc kubenswrapper[4919]: I0310 23:10:30.106792 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="9bb1dc8b-5874-454e-b465-9b65f7510343" containerName="oc" Mar 10 23:10:30 crc kubenswrapper[4919]: I0310 23:10:30.107002 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="9bb1dc8b-5874-454e-b465-9b65f7510343" containerName="oc" Mar 10 23:10:30 crc kubenswrapper[4919]: I0310 23:10:30.108571 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-nvgms" Mar 10 23:10:30 crc kubenswrapper[4919]: I0310 23:10:30.119995 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-nvgms"] Mar 10 23:10:30 crc kubenswrapper[4919]: I0310 23:10:30.278073 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fvg6j\" (UniqueName: \"kubernetes.io/projected/397eb671-4b29-4a10-8ce3-7f45025b5ccf-kube-api-access-fvg6j\") pod \"certified-operators-nvgms\" (UID: \"397eb671-4b29-4a10-8ce3-7f45025b5ccf\") " pod="openshift-marketplace/certified-operators-nvgms" Mar 10 23:10:30 crc kubenswrapper[4919]: I0310 23:10:30.278126 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/397eb671-4b29-4a10-8ce3-7f45025b5ccf-utilities\") pod \"certified-operators-nvgms\" (UID: \"397eb671-4b29-4a10-8ce3-7f45025b5ccf\") " pod="openshift-marketplace/certified-operators-nvgms" Mar 10 23:10:30 crc kubenswrapper[4919]: I0310 23:10:30.278342 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/397eb671-4b29-4a10-8ce3-7f45025b5ccf-catalog-content\") pod \"certified-operators-nvgms\" (UID: \"397eb671-4b29-4a10-8ce3-7f45025b5ccf\") " pod="openshift-marketplace/certified-operators-nvgms" Mar 10 23:10:30 crc kubenswrapper[4919]: I0310 23:10:30.379971 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/397eb671-4b29-4a10-8ce3-7f45025b5ccf-catalog-content\") pod \"certified-operators-nvgms\" (UID: \"397eb671-4b29-4a10-8ce3-7f45025b5ccf\") " pod="openshift-marketplace/certified-operators-nvgms" Mar 10 23:10:30 crc kubenswrapper[4919]: I0310 23:10:30.380100 4919 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-fvg6j\" (UniqueName: \"kubernetes.io/projected/397eb671-4b29-4a10-8ce3-7f45025b5ccf-kube-api-access-fvg6j\") pod \"certified-operators-nvgms\" (UID: \"397eb671-4b29-4a10-8ce3-7f45025b5ccf\") " pod="openshift-marketplace/certified-operators-nvgms" Mar 10 23:10:30 crc kubenswrapper[4919]: I0310 23:10:30.380127 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/397eb671-4b29-4a10-8ce3-7f45025b5ccf-utilities\") pod \"certified-operators-nvgms\" (UID: \"397eb671-4b29-4a10-8ce3-7f45025b5ccf\") " pod="openshift-marketplace/certified-operators-nvgms" Mar 10 23:10:30 crc kubenswrapper[4919]: I0310 23:10:30.380652 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/397eb671-4b29-4a10-8ce3-7f45025b5ccf-catalog-content\") pod \"certified-operators-nvgms\" (UID: \"397eb671-4b29-4a10-8ce3-7f45025b5ccf\") " pod="openshift-marketplace/certified-operators-nvgms" Mar 10 23:10:30 crc kubenswrapper[4919]: I0310 23:10:30.380894 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/397eb671-4b29-4a10-8ce3-7f45025b5ccf-utilities\") pod \"certified-operators-nvgms\" (UID: \"397eb671-4b29-4a10-8ce3-7f45025b5ccf\") " pod="openshift-marketplace/certified-operators-nvgms" Mar 10 23:10:30 crc kubenswrapper[4919]: I0310 23:10:30.405561 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fvg6j\" (UniqueName: \"kubernetes.io/projected/397eb671-4b29-4a10-8ce3-7f45025b5ccf-kube-api-access-fvg6j\") pod \"certified-operators-nvgms\" (UID: \"397eb671-4b29-4a10-8ce3-7f45025b5ccf\") " pod="openshift-marketplace/certified-operators-nvgms" Mar 10 23:10:30 crc kubenswrapper[4919]: I0310 23:10:30.442854 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-nvgms" Mar 10 23:10:30 crc kubenswrapper[4919]: I0310 23:10:30.913848 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-nvgms"] Mar 10 23:10:31 crc kubenswrapper[4919]: I0310 23:10:31.853129 4919 generic.go:334] "Generic (PLEG): container finished" podID="397eb671-4b29-4a10-8ce3-7f45025b5ccf" containerID="97933a4b1f61830d8d8ff0698c6412a036efbf45427313c6ac992c6670098b17" exitCode=0 Mar 10 23:10:31 crc kubenswrapper[4919]: I0310 23:10:31.853193 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nvgms" event={"ID":"397eb671-4b29-4a10-8ce3-7f45025b5ccf","Type":"ContainerDied","Data":"97933a4b1f61830d8d8ff0698c6412a036efbf45427313c6ac992c6670098b17"} Mar 10 23:10:31 crc kubenswrapper[4919]: I0310 23:10:31.853226 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nvgms" event={"ID":"397eb671-4b29-4a10-8ce3-7f45025b5ccf","Type":"ContainerStarted","Data":"d8d775a706890374aaa6f2bf146aba2246fcdebfdd2166bdc39bf3e684a1f981"} Mar 10 23:10:32 crc kubenswrapper[4919]: I0310 23:10:32.861963 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nvgms" event={"ID":"397eb671-4b29-4a10-8ce3-7f45025b5ccf","Type":"ContainerStarted","Data":"ea14123a2ac64601b2c5de5e1d2de2eb58059ad349a797030ca5270520608af9"} Mar 10 23:10:33 crc kubenswrapper[4919]: I0310 23:10:33.872487 4919 generic.go:334] "Generic (PLEG): container finished" podID="397eb671-4b29-4a10-8ce3-7f45025b5ccf" containerID="ea14123a2ac64601b2c5de5e1d2de2eb58059ad349a797030ca5270520608af9" exitCode=0 Mar 10 23:10:33 crc kubenswrapper[4919]: I0310 23:10:33.872604 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nvgms" 
event={"ID":"397eb671-4b29-4a10-8ce3-7f45025b5ccf","Type":"ContainerDied","Data":"ea14123a2ac64601b2c5de5e1d2de2eb58059ad349a797030ca5270520608af9"} Mar 10 23:10:34 crc kubenswrapper[4919]: I0310 23:10:34.882330 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nvgms" event={"ID":"397eb671-4b29-4a10-8ce3-7f45025b5ccf","Type":"ContainerStarted","Data":"0b39ae9f2f60960c4c9e6918aede95cf8385ef509ec4d6cb64791bf907c2c521"} Mar 10 23:10:34 crc kubenswrapper[4919]: I0310 23:10:34.903772 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-nvgms" podStartSLOduration=2.221198822 podStartE2EDuration="4.903748359s" podCreationTimestamp="2026-03-10 23:10:30 +0000 UTC" firstStartedPulling="2026-03-10 23:10:31.856089852 +0000 UTC m=+4819.097970460" lastFinishedPulling="2026-03-10 23:10:34.538639389 +0000 UTC m=+4821.780519997" observedRunningTime="2026-03-10 23:10:34.900548592 +0000 UTC m=+4822.142429200" watchObservedRunningTime="2026-03-10 23:10:34.903748359 +0000 UTC m=+4822.145628967" Mar 10 23:10:40 crc kubenswrapper[4919]: I0310 23:10:40.443677 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-nvgms" Mar 10 23:10:40 crc kubenswrapper[4919]: I0310 23:10:40.444367 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-nvgms" Mar 10 23:10:40 crc kubenswrapper[4919]: I0310 23:10:40.508081 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-nvgms" Mar 10 23:10:40 crc kubenswrapper[4919]: I0310 23:10:40.978714 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-nvgms" Mar 10 23:10:41 crc kubenswrapper[4919]: I0310 23:10:41.021886 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/certified-operators-nvgms"] Mar 10 23:10:42 crc kubenswrapper[4919]: I0310 23:10:42.943368 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-nvgms" podUID="397eb671-4b29-4a10-8ce3-7f45025b5ccf" containerName="registry-server" containerID="cri-o://0b39ae9f2f60960c4c9e6918aede95cf8385ef509ec4d6cb64791bf907c2c521" gracePeriod=2 Mar 10 23:10:43 crc kubenswrapper[4919]: I0310 23:10:43.410375 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nvgms" Mar 10 23:10:43 crc kubenswrapper[4919]: I0310 23:10:43.466944 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/397eb671-4b29-4a10-8ce3-7f45025b5ccf-utilities\") pod \"397eb671-4b29-4a10-8ce3-7f45025b5ccf\" (UID: \"397eb671-4b29-4a10-8ce3-7f45025b5ccf\") " Mar 10 23:10:43 crc kubenswrapper[4919]: I0310 23:10:43.466997 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/397eb671-4b29-4a10-8ce3-7f45025b5ccf-catalog-content\") pod \"397eb671-4b29-4a10-8ce3-7f45025b5ccf\" (UID: \"397eb671-4b29-4a10-8ce3-7f45025b5ccf\") " Mar 10 23:10:43 crc kubenswrapper[4919]: I0310 23:10:43.467077 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fvg6j\" (UniqueName: \"kubernetes.io/projected/397eb671-4b29-4a10-8ce3-7f45025b5ccf-kube-api-access-fvg6j\") pod \"397eb671-4b29-4a10-8ce3-7f45025b5ccf\" (UID: \"397eb671-4b29-4a10-8ce3-7f45025b5ccf\") " Mar 10 23:10:43 crc kubenswrapper[4919]: I0310 23:10:43.468285 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/397eb671-4b29-4a10-8ce3-7f45025b5ccf-utilities" (OuterVolumeSpecName: "utilities") pod "397eb671-4b29-4a10-8ce3-7f45025b5ccf" (UID: 
"397eb671-4b29-4a10-8ce3-7f45025b5ccf"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 23:10:43 crc kubenswrapper[4919]: I0310 23:10:43.475642 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/397eb671-4b29-4a10-8ce3-7f45025b5ccf-kube-api-access-fvg6j" (OuterVolumeSpecName: "kube-api-access-fvg6j") pod "397eb671-4b29-4a10-8ce3-7f45025b5ccf" (UID: "397eb671-4b29-4a10-8ce3-7f45025b5ccf"). InnerVolumeSpecName "kube-api-access-fvg6j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 23:10:43 crc kubenswrapper[4919]: I0310 23:10:43.568673 4919 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/397eb671-4b29-4a10-8ce3-7f45025b5ccf-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 23:10:43 crc kubenswrapper[4919]: I0310 23:10:43.568711 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fvg6j\" (UniqueName: \"kubernetes.io/projected/397eb671-4b29-4a10-8ce3-7f45025b5ccf-kube-api-access-fvg6j\") on node \"crc\" DevicePath \"\"" Mar 10 23:10:43 crc kubenswrapper[4919]: I0310 23:10:43.710117 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/397eb671-4b29-4a10-8ce3-7f45025b5ccf-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "397eb671-4b29-4a10-8ce3-7f45025b5ccf" (UID: "397eb671-4b29-4a10-8ce3-7f45025b5ccf"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 23:10:43 crc kubenswrapper[4919]: I0310 23:10:43.770668 4919 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/397eb671-4b29-4a10-8ce3-7f45025b5ccf-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 23:10:43 crc kubenswrapper[4919]: I0310 23:10:43.951820 4919 generic.go:334] "Generic (PLEG): container finished" podID="397eb671-4b29-4a10-8ce3-7f45025b5ccf" containerID="0b39ae9f2f60960c4c9e6918aede95cf8385ef509ec4d6cb64791bf907c2c521" exitCode=0 Mar 10 23:10:43 crc kubenswrapper[4919]: I0310 23:10:43.951863 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nvgms" event={"ID":"397eb671-4b29-4a10-8ce3-7f45025b5ccf","Type":"ContainerDied","Data":"0b39ae9f2f60960c4c9e6918aede95cf8385ef509ec4d6cb64791bf907c2c521"} Mar 10 23:10:43 crc kubenswrapper[4919]: I0310 23:10:43.951889 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nvgms" event={"ID":"397eb671-4b29-4a10-8ce3-7f45025b5ccf","Type":"ContainerDied","Data":"d8d775a706890374aaa6f2bf146aba2246fcdebfdd2166bdc39bf3e684a1f981"} Mar 10 23:10:43 crc kubenswrapper[4919]: I0310 23:10:43.951894 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-nvgms"
Mar 10 23:10:43 crc kubenswrapper[4919]: I0310 23:10:43.951907 4919 scope.go:117] "RemoveContainer" containerID="0b39ae9f2f60960c4c9e6918aede95cf8385ef509ec4d6cb64791bf907c2c521"
Mar 10 23:10:43 crc kubenswrapper[4919]: I0310 23:10:43.976752 4919 scope.go:117] "RemoveContainer" containerID="ea14123a2ac64601b2c5de5e1d2de2eb58059ad349a797030ca5270520608af9"
Mar 10 23:10:43 crc kubenswrapper[4919]: I0310 23:10:43.993872 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-nvgms"]
Mar 10 23:10:44 crc kubenswrapper[4919]: I0310 23:10:44.000717 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-nvgms"]
Mar 10 23:10:44 crc kubenswrapper[4919]: I0310 23:10:44.005967 4919 scope.go:117] "RemoveContainer" containerID="97933a4b1f61830d8d8ff0698c6412a036efbf45427313c6ac992c6670098b17"
Mar 10 23:10:44 crc kubenswrapper[4919]: I0310 23:10:44.024877 4919 scope.go:117] "RemoveContainer" containerID="0b39ae9f2f60960c4c9e6918aede95cf8385ef509ec4d6cb64791bf907c2c521"
Mar 10 23:10:44 crc kubenswrapper[4919]: E0310 23:10:44.025571 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b39ae9f2f60960c4c9e6918aede95cf8385ef509ec4d6cb64791bf907c2c521\": container with ID starting with 0b39ae9f2f60960c4c9e6918aede95cf8385ef509ec4d6cb64791bf907c2c521 not found: ID does not exist" containerID="0b39ae9f2f60960c4c9e6918aede95cf8385ef509ec4d6cb64791bf907c2c521"
Mar 10 23:10:44 crc kubenswrapper[4919]: I0310 23:10:44.025617 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b39ae9f2f60960c4c9e6918aede95cf8385ef509ec4d6cb64791bf907c2c521"} err="failed to get container status \"0b39ae9f2f60960c4c9e6918aede95cf8385ef509ec4d6cb64791bf907c2c521\": rpc error: code = NotFound desc = could not find container \"0b39ae9f2f60960c4c9e6918aede95cf8385ef509ec4d6cb64791bf907c2c521\": container with ID starting with 0b39ae9f2f60960c4c9e6918aede95cf8385ef509ec4d6cb64791bf907c2c521 not found: ID does not exist"
Mar 10 23:10:44 crc kubenswrapper[4919]: I0310 23:10:44.025641 4919 scope.go:117] "RemoveContainer" containerID="ea14123a2ac64601b2c5de5e1d2de2eb58059ad349a797030ca5270520608af9"
Mar 10 23:10:44 crc kubenswrapper[4919]: E0310 23:10:44.026066 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea14123a2ac64601b2c5de5e1d2de2eb58059ad349a797030ca5270520608af9\": container with ID starting with ea14123a2ac64601b2c5de5e1d2de2eb58059ad349a797030ca5270520608af9 not found: ID does not exist" containerID="ea14123a2ac64601b2c5de5e1d2de2eb58059ad349a797030ca5270520608af9"
Mar 10 23:10:44 crc kubenswrapper[4919]: I0310 23:10:44.026089 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea14123a2ac64601b2c5de5e1d2de2eb58059ad349a797030ca5270520608af9"} err="failed to get container status \"ea14123a2ac64601b2c5de5e1d2de2eb58059ad349a797030ca5270520608af9\": rpc error: code = NotFound desc = could not find container \"ea14123a2ac64601b2c5de5e1d2de2eb58059ad349a797030ca5270520608af9\": container with ID starting with ea14123a2ac64601b2c5de5e1d2de2eb58059ad349a797030ca5270520608af9 not found: ID does not exist"
Mar 10 23:10:44 crc kubenswrapper[4919]: I0310 23:10:44.026100 4919 scope.go:117] "RemoveContainer" containerID="97933a4b1f61830d8d8ff0698c6412a036efbf45427313c6ac992c6670098b17"
Mar 10 23:10:44 crc kubenswrapper[4919]: E0310 23:10:44.026307 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"97933a4b1f61830d8d8ff0698c6412a036efbf45427313c6ac992c6670098b17\": container with ID starting with 97933a4b1f61830d8d8ff0698c6412a036efbf45427313c6ac992c6670098b17 not found: ID does not exist" containerID="97933a4b1f61830d8d8ff0698c6412a036efbf45427313c6ac992c6670098b17"
Mar 10 23:10:44 crc kubenswrapper[4919]: I0310 23:10:44.026325 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"97933a4b1f61830d8d8ff0698c6412a036efbf45427313c6ac992c6670098b17"} err="failed to get container status \"97933a4b1f61830d8d8ff0698c6412a036efbf45427313c6ac992c6670098b17\": rpc error: code = NotFound desc = could not find container \"97933a4b1f61830d8d8ff0698c6412a036efbf45427313c6ac992c6670098b17\": container with ID starting with 97933a4b1f61830d8d8ff0698c6412a036efbf45427313c6ac992c6670098b17 not found: ID does not exist"
Mar 10 23:10:45 crc kubenswrapper[4919]: I0310 23:10:45.489604 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="397eb671-4b29-4a10-8ce3-7f45025b5ccf" path="/var/lib/kubelet/pods/397eb671-4b29-4a10-8ce3-7f45025b5ccf/volumes"
Mar 10 23:10:46 crc kubenswrapper[4919]: I0310 23:10:46.247448 4919 scope.go:117] "RemoveContainer" containerID="3a3a38403cb29e1ea22e4e0acac1bc16bacebcf8057d3c9122ea156f5aac2452"
Mar 10 23:10:58 crc kubenswrapper[4919]: I0310 23:10:58.124542 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-4trq2"]
Mar 10 23:10:58 crc kubenswrapper[4919]: E0310 23:10:58.125623 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="397eb671-4b29-4a10-8ce3-7f45025b5ccf" containerName="extract-content"
Mar 10 23:10:58 crc kubenswrapper[4919]: I0310 23:10:58.125644 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="397eb671-4b29-4a10-8ce3-7f45025b5ccf" containerName="extract-content"
Mar 10 23:10:58 crc kubenswrapper[4919]: E0310 23:10:58.125674 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="397eb671-4b29-4a10-8ce3-7f45025b5ccf" containerName="registry-server"
Mar 10 23:10:58 crc kubenswrapper[4919]: I0310 23:10:58.125687 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="397eb671-4b29-4a10-8ce3-7f45025b5ccf" containerName="registry-server"
Mar 10 23:10:58 crc kubenswrapper[4919]: E0310 23:10:58.125724 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="397eb671-4b29-4a10-8ce3-7f45025b5ccf" containerName="extract-utilities"
Mar 10 23:10:58 crc kubenswrapper[4919]: I0310 23:10:58.125738 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="397eb671-4b29-4a10-8ce3-7f45025b5ccf" containerName="extract-utilities"
Mar 10 23:10:58 crc kubenswrapper[4919]: I0310 23:10:58.125964 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="397eb671-4b29-4a10-8ce3-7f45025b5ccf" containerName="registry-server"
Mar 10 23:10:58 crc kubenswrapper[4919]: I0310 23:10:58.129568 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4trq2"
Mar 10 23:10:58 crc kubenswrapper[4919]: I0310 23:10:58.134631 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4trq2"]
Mar 10 23:10:58 crc kubenswrapper[4919]: I0310 23:10:58.185118 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/60b16375-8e31-406f-acd7-5db7a331352e-utilities\") pod \"community-operators-4trq2\" (UID: \"60b16375-8e31-406f-acd7-5db7a331352e\") " pod="openshift-marketplace/community-operators-4trq2"
Mar 10 23:10:58 crc kubenswrapper[4919]: I0310 23:10:58.185225 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/60b16375-8e31-406f-acd7-5db7a331352e-catalog-content\") pod \"community-operators-4trq2\" (UID: \"60b16375-8e31-406f-acd7-5db7a331352e\") " pod="openshift-marketplace/community-operators-4trq2"
Mar 10 23:10:58 crc kubenswrapper[4919]: I0310 23:10:58.185301 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-scwxz\" (UniqueName: \"kubernetes.io/projected/60b16375-8e31-406f-acd7-5db7a331352e-kube-api-access-scwxz\") pod \"community-operators-4trq2\" (UID: \"60b16375-8e31-406f-acd7-5db7a331352e\") " pod="openshift-marketplace/community-operators-4trq2"
Mar 10 23:10:58 crc kubenswrapper[4919]: I0310 23:10:58.286065 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/60b16375-8e31-406f-acd7-5db7a331352e-utilities\") pod \"community-operators-4trq2\" (UID: \"60b16375-8e31-406f-acd7-5db7a331352e\") " pod="openshift-marketplace/community-operators-4trq2"
Mar 10 23:10:58 crc kubenswrapper[4919]: I0310 23:10:58.286137 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/60b16375-8e31-406f-acd7-5db7a331352e-catalog-content\") pod \"community-operators-4trq2\" (UID: \"60b16375-8e31-406f-acd7-5db7a331352e\") " pod="openshift-marketplace/community-operators-4trq2"
Mar 10 23:10:58 crc kubenswrapper[4919]: I0310 23:10:58.286209 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-scwxz\" (UniqueName: \"kubernetes.io/projected/60b16375-8e31-406f-acd7-5db7a331352e-kube-api-access-scwxz\") pod \"community-operators-4trq2\" (UID: \"60b16375-8e31-406f-acd7-5db7a331352e\") " pod="openshift-marketplace/community-operators-4trq2"
Mar 10 23:10:58 crc kubenswrapper[4919]: I0310 23:10:58.286904 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/60b16375-8e31-406f-acd7-5db7a331352e-utilities\") pod \"community-operators-4trq2\" (UID: \"60b16375-8e31-406f-acd7-5db7a331352e\") " pod="openshift-marketplace/community-operators-4trq2"
Mar 10 23:10:58 crc kubenswrapper[4919]: I0310 23:10:58.290821 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/60b16375-8e31-406f-acd7-5db7a331352e-catalog-content\") pod \"community-operators-4trq2\" (UID: \"60b16375-8e31-406f-acd7-5db7a331352e\") " pod="openshift-marketplace/community-operators-4trq2"
Mar 10 23:10:58 crc kubenswrapper[4919]: I0310 23:10:58.330385 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-scwxz\" (UniqueName: \"kubernetes.io/projected/60b16375-8e31-406f-acd7-5db7a331352e-kube-api-access-scwxz\") pod \"community-operators-4trq2\" (UID: \"60b16375-8e31-406f-acd7-5db7a331352e\") " pod="openshift-marketplace/community-operators-4trq2"
Mar 10 23:10:58 crc kubenswrapper[4919]: I0310 23:10:58.467792 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4trq2"
Mar 10 23:10:59 crc kubenswrapper[4919]: I0310 23:10:59.077268 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4trq2"]
Mar 10 23:11:00 crc kubenswrapper[4919]: I0310 23:11:00.096438 4919 generic.go:334] "Generic (PLEG): container finished" podID="60b16375-8e31-406f-acd7-5db7a331352e" containerID="ab56340798ad1357fa00759aa5cf98ba40676710be66a48730813fd392b74c4d" exitCode=0
Mar 10 23:11:00 crc kubenswrapper[4919]: I0310 23:11:00.096568 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4trq2" event={"ID":"60b16375-8e31-406f-acd7-5db7a331352e","Type":"ContainerDied","Data":"ab56340798ad1357fa00759aa5cf98ba40676710be66a48730813fd392b74c4d"}
Mar 10 23:11:00 crc kubenswrapper[4919]: I0310 23:11:00.096837 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4trq2" event={"ID":"60b16375-8e31-406f-acd7-5db7a331352e","Type":"ContainerStarted","Data":"c8bf38e75d9f9e0730ca1b9fdaf6ad2c73ab10d19d3a5efb5699ac880c1ef9c5"}
Mar 10 23:11:02 crc kubenswrapper[4919]: I0310 23:11:02.113617 4919 generic.go:334] "Generic (PLEG): container finished" podID="60b16375-8e31-406f-acd7-5db7a331352e" containerID="141a7ff55d5fc927439d75fd58a3adc5a5e19eefe2c7e4a592ac5c4f74f48342" exitCode=0
Mar 10 23:11:02 crc kubenswrapper[4919]: I0310 23:11:02.113672 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4trq2" event={"ID":"60b16375-8e31-406f-acd7-5db7a331352e","Type":"ContainerDied","Data":"141a7ff55d5fc927439d75fd58a3adc5a5e19eefe2c7e4a592ac5c4f74f48342"}
Mar 10 23:11:03 crc kubenswrapper[4919]: I0310 23:11:03.121441 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4trq2" event={"ID":"60b16375-8e31-406f-acd7-5db7a331352e","Type":"ContainerStarted","Data":"0b58ba3c7286d57f3ed75fce8208a120842ee22bf0df28d947f27e38ddab15a5"}
Mar 10 23:11:03 crc kubenswrapper[4919]: I0310 23:11:03.137310 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-4trq2" podStartSLOduration=2.650836274 podStartE2EDuration="5.137287206s" podCreationTimestamp="2026-03-10 23:10:58 +0000 UTC" firstStartedPulling="2026-03-10 23:11:00.098535893 +0000 UTC m=+4847.340416531" lastFinishedPulling="2026-03-10 23:11:02.584986855 +0000 UTC m=+4849.826867463" observedRunningTime="2026-03-10 23:11:03.136737241 +0000 UTC m=+4850.378617869" watchObservedRunningTime="2026-03-10 23:11:03.137287206 +0000 UTC m=+4850.379167834"
Mar 10 23:11:05 crc kubenswrapper[4919]: I0310 23:11:05.095863 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-jjtm5"]
Mar 10 23:11:05 crc kubenswrapper[4919]: I0310 23:11:05.099568 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jjtm5"
Mar 10 23:11:05 crc kubenswrapper[4919]: I0310 23:11:05.120801 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jjtm5"]
Mar 10 23:11:05 crc kubenswrapper[4919]: I0310 23:11:05.188032 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/26a4c034-130a-45d2-93f5-581fb0eb09c1-utilities\") pod \"redhat-marketplace-jjtm5\" (UID: \"26a4c034-130a-45d2-93f5-581fb0eb09c1\") " pod="openshift-marketplace/redhat-marketplace-jjtm5"
Mar 10 23:11:05 crc kubenswrapper[4919]: I0310 23:11:05.188111 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-427lj\" (UniqueName: \"kubernetes.io/projected/26a4c034-130a-45d2-93f5-581fb0eb09c1-kube-api-access-427lj\") pod \"redhat-marketplace-jjtm5\" (UID: \"26a4c034-130a-45d2-93f5-581fb0eb09c1\") " pod="openshift-marketplace/redhat-marketplace-jjtm5"
Mar 10 23:11:05 crc kubenswrapper[4919]: I0310 23:11:05.188144 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/26a4c034-130a-45d2-93f5-581fb0eb09c1-catalog-content\") pod \"redhat-marketplace-jjtm5\" (UID: \"26a4c034-130a-45d2-93f5-581fb0eb09c1\") " pod="openshift-marketplace/redhat-marketplace-jjtm5"
Mar 10 23:11:05 crc kubenswrapper[4919]: I0310 23:11:05.288937 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/26a4c034-130a-45d2-93f5-581fb0eb09c1-utilities\") pod \"redhat-marketplace-jjtm5\" (UID: \"26a4c034-130a-45d2-93f5-581fb0eb09c1\") " pod="openshift-marketplace/redhat-marketplace-jjtm5"
Mar 10 23:11:05 crc kubenswrapper[4919]: I0310 23:11:05.289017 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-427lj\" (UniqueName: \"kubernetes.io/projected/26a4c034-130a-45d2-93f5-581fb0eb09c1-kube-api-access-427lj\") pod \"redhat-marketplace-jjtm5\" (UID: \"26a4c034-130a-45d2-93f5-581fb0eb09c1\") " pod="openshift-marketplace/redhat-marketplace-jjtm5"
Mar 10 23:11:05 crc kubenswrapper[4919]: I0310 23:11:05.289066 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/26a4c034-130a-45d2-93f5-581fb0eb09c1-catalog-content\") pod \"redhat-marketplace-jjtm5\" (UID: \"26a4c034-130a-45d2-93f5-581fb0eb09c1\") " pod="openshift-marketplace/redhat-marketplace-jjtm5"
Mar 10 23:11:05 crc kubenswrapper[4919]: I0310 23:11:05.289474 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/26a4c034-130a-45d2-93f5-581fb0eb09c1-utilities\") pod \"redhat-marketplace-jjtm5\" (UID: \"26a4c034-130a-45d2-93f5-581fb0eb09c1\") " pod="openshift-marketplace/redhat-marketplace-jjtm5"
Mar 10 23:11:05 crc kubenswrapper[4919]: I0310 23:11:05.289598 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/26a4c034-130a-45d2-93f5-581fb0eb09c1-catalog-content\") pod \"redhat-marketplace-jjtm5\" (UID: \"26a4c034-130a-45d2-93f5-581fb0eb09c1\") " pod="openshift-marketplace/redhat-marketplace-jjtm5"
Mar 10 23:11:05 crc kubenswrapper[4919]: I0310 23:11:05.319196 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-427lj\" (UniqueName: \"kubernetes.io/projected/26a4c034-130a-45d2-93f5-581fb0eb09c1-kube-api-access-427lj\") pod \"redhat-marketplace-jjtm5\" (UID: \"26a4c034-130a-45d2-93f5-581fb0eb09c1\") " pod="openshift-marketplace/redhat-marketplace-jjtm5"
Mar 10 23:11:05 crc kubenswrapper[4919]: I0310 23:11:05.425070 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jjtm5"
Mar 10 23:11:05 crc kubenswrapper[4919]: I0310 23:11:05.844121 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jjtm5"]
Mar 10 23:11:05 crc kubenswrapper[4919]: W0310 23:11:05.854560 4919 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod26a4c034_130a_45d2_93f5_581fb0eb09c1.slice/crio-a59a29466d735575a64ca2f840e483f6361ec538741a177e2d32afb5c047d1dd WatchSource:0}: Error finding container a59a29466d735575a64ca2f840e483f6361ec538741a177e2d32afb5c047d1dd: Status 404 returned error can't find the container with id a59a29466d735575a64ca2f840e483f6361ec538741a177e2d32afb5c047d1dd
Mar 10 23:11:06 crc kubenswrapper[4919]: I0310 23:11:06.155741 4919 generic.go:334] "Generic (PLEG): container finished" podID="26a4c034-130a-45d2-93f5-581fb0eb09c1" containerID="48f982315dac96bd677bb4cc2458c0463c55a7fb8399664654899ab3dc0431f6" exitCode=0
Mar 10 23:11:06 crc kubenswrapper[4919]: I0310 23:11:06.155791 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jjtm5" event={"ID":"26a4c034-130a-45d2-93f5-581fb0eb09c1","Type":"ContainerDied","Data":"48f982315dac96bd677bb4cc2458c0463c55a7fb8399664654899ab3dc0431f6"}
Mar 10 23:11:06 crc kubenswrapper[4919]: I0310 23:11:06.156118 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jjtm5" event={"ID":"26a4c034-130a-45d2-93f5-581fb0eb09c1","Type":"ContainerStarted","Data":"a59a29466d735575a64ca2f840e483f6361ec538741a177e2d32afb5c047d1dd"}
Mar 10 23:11:07 crc kubenswrapper[4919]: I0310 23:11:07.164978 4919 generic.go:334] "Generic (PLEG): container finished" podID="26a4c034-130a-45d2-93f5-581fb0eb09c1" containerID="ad22d79a380932e1616bcf2e530dff07f56c8eae8671aefefb4354f68821310a" exitCode=0
Mar 10 23:11:07 crc kubenswrapper[4919]: I0310 23:11:07.165054 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jjtm5" event={"ID":"26a4c034-130a-45d2-93f5-581fb0eb09c1","Type":"ContainerDied","Data":"ad22d79a380932e1616bcf2e530dff07f56c8eae8671aefefb4354f68821310a"}
Mar 10 23:11:08 crc kubenswrapper[4919]: I0310 23:11:08.178679 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jjtm5" event={"ID":"26a4c034-130a-45d2-93f5-581fb0eb09c1","Type":"ContainerStarted","Data":"e70c58d088f892413bc7b1d8c99c9e0a14122569cd0a49f7204393164d5a9b27"}
Mar 10 23:11:08 crc kubenswrapper[4919]: I0310 23:11:08.207787 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-jjtm5" podStartSLOduration=1.7406828060000001 podStartE2EDuration="3.207765893s" podCreationTimestamp="2026-03-10 23:11:05 +0000 UTC" firstStartedPulling="2026-03-10 23:11:06.157100025 +0000 UTC m=+4853.398980653" lastFinishedPulling="2026-03-10 23:11:07.624183132 +0000 UTC m=+4854.866063740" observedRunningTime="2026-03-10 23:11:08.202914392 +0000 UTC m=+4855.444795000" watchObservedRunningTime="2026-03-10 23:11:08.207765893 +0000 UTC m=+4855.449646511"
Mar 10 23:11:08 crc kubenswrapper[4919]: I0310 23:11:08.468612 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-4trq2"
Mar 10 23:11:08 crc kubenswrapper[4919]: I0310 23:11:08.468649 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-4trq2"
Mar 10 23:11:08 crc kubenswrapper[4919]: I0310 23:11:08.505091 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-4trq2"
Mar 10 23:11:09 crc kubenswrapper[4919]: I0310 23:11:09.236788 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-4trq2"
Mar 10 23:11:10 crc kubenswrapper[4919]: I0310 23:11:10.688546 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4trq2"]
Mar 10 23:11:11 crc kubenswrapper[4919]: I0310 23:11:11.201665 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-4trq2" podUID="60b16375-8e31-406f-acd7-5db7a331352e" containerName="registry-server" containerID="cri-o://0b58ba3c7286d57f3ed75fce8208a120842ee22bf0df28d947f27e38ddab15a5" gracePeriod=2
Mar 10 23:11:11 crc kubenswrapper[4919]: I0310 23:11:11.598130 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4trq2"
Mar 10 23:11:11 crc kubenswrapper[4919]: I0310 23:11:11.684969 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/60b16375-8e31-406f-acd7-5db7a331352e-catalog-content\") pod \"60b16375-8e31-406f-acd7-5db7a331352e\" (UID: \"60b16375-8e31-406f-acd7-5db7a331352e\") "
Mar 10 23:11:11 crc kubenswrapper[4919]: I0310 23:11:11.685041 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/60b16375-8e31-406f-acd7-5db7a331352e-utilities\") pod \"60b16375-8e31-406f-acd7-5db7a331352e\" (UID: \"60b16375-8e31-406f-acd7-5db7a331352e\") "
Mar 10 23:11:11 crc kubenswrapper[4919]: I0310 23:11:11.685062 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-scwxz\" (UniqueName: \"kubernetes.io/projected/60b16375-8e31-406f-acd7-5db7a331352e-kube-api-access-scwxz\") pod \"60b16375-8e31-406f-acd7-5db7a331352e\" (UID: \"60b16375-8e31-406f-acd7-5db7a331352e\") "
Mar 10 23:11:11 crc kubenswrapper[4919]: I0310 23:11:11.686565 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/60b16375-8e31-406f-acd7-5db7a331352e-utilities" (OuterVolumeSpecName: "utilities") pod "60b16375-8e31-406f-acd7-5db7a331352e" (UID: "60b16375-8e31-406f-acd7-5db7a331352e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 23:11:11 crc kubenswrapper[4919]: I0310 23:11:11.691338 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60b16375-8e31-406f-acd7-5db7a331352e-kube-api-access-scwxz" (OuterVolumeSpecName: "kube-api-access-scwxz") pod "60b16375-8e31-406f-acd7-5db7a331352e" (UID: "60b16375-8e31-406f-acd7-5db7a331352e"). InnerVolumeSpecName "kube-api-access-scwxz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 23:11:11 crc kubenswrapper[4919]: I0310 23:11:11.746294 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/60b16375-8e31-406f-acd7-5db7a331352e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "60b16375-8e31-406f-acd7-5db7a331352e" (UID: "60b16375-8e31-406f-acd7-5db7a331352e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 23:11:11 crc kubenswrapper[4919]: I0310 23:11:11.787199 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-scwxz\" (UniqueName: \"kubernetes.io/projected/60b16375-8e31-406f-acd7-5db7a331352e-kube-api-access-scwxz\") on node \"crc\" DevicePath \"\""
Mar 10 23:11:11 crc kubenswrapper[4919]: I0310 23:11:11.787234 4919 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/60b16375-8e31-406f-acd7-5db7a331352e-utilities\") on node \"crc\" DevicePath \"\""
Mar 10 23:11:11 crc kubenswrapper[4919]: I0310 23:11:11.787245 4919 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/60b16375-8e31-406f-acd7-5db7a331352e-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 10 23:11:12 crc kubenswrapper[4919]: I0310 23:11:12.209684 4919 generic.go:334] "Generic (PLEG): container finished" podID="60b16375-8e31-406f-acd7-5db7a331352e" containerID="0b58ba3c7286d57f3ed75fce8208a120842ee22bf0df28d947f27e38ddab15a5" exitCode=0
Mar 10 23:11:12 crc kubenswrapper[4919]: I0310 23:11:12.209736 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4trq2" event={"ID":"60b16375-8e31-406f-acd7-5db7a331352e","Type":"ContainerDied","Data":"0b58ba3c7286d57f3ed75fce8208a120842ee22bf0df28d947f27e38ddab15a5"}
Mar 10 23:11:12 crc kubenswrapper[4919]: I0310 23:11:12.209806 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4trq2" event={"ID":"60b16375-8e31-406f-acd7-5db7a331352e","Type":"ContainerDied","Data":"c8bf38e75d9f9e0730ca1b9fdaf6ad2c73ab10d19d3a5efb5699ac880c1ef9c5"}
Mar 10 23:11:12 crc kubenswrapper[4919]: I0310 23:11:12.209805 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4trq2"
Mar 10 23:11:12 crc kubenswrapper[4919]: I0310 23:11:12.209827 4919 scope.go:117] "RemoveContainer" containerID="0b58ba3c7286d57f3ed75fce8208a120842ee22bf0df28d947f27e38ddab15a5"
Mar 10 23:11:12 crc kubenswrapper[4919]: I0310 23:11:12.235984 4919 scope.go:117] "RemoveContainer" containerID="141a7ff55d5fc927439d75fd58a3adc5a5e19eefe2c7e4a592ac5c4f74f48342"
Mar 10 23:11:12 crc kubenswrapper[4919]: I0310 23:11:12.247163 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4trq2"]
Mar 10 23:11:12 crc kubenswrapper[4919]: I0310 23:11:12.252522 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-4trq2"]
Mar 10 23:11:12 crc kubenswrapper[4919]: I0310 23:11:12.269737 4919 scope.go:117] "RemoveContainer" containerID="ab56340798ad1357fa00759aa5cf98ba40676710be66a48730813fd392b74c4d"
Mar 10 23:11:12 crc kubenswrapper[4919]: I0310 23:11:12.290843 4919 scope.go:117] "RemoveContainer" containerID="0b58ba3c7286d57f3ed75fce8208a120842ee22bf0df28d947f27e38ddab15a5"
Mar 10 23:11:12 crc kubenswrapper[4919]: E0310 23:11:12.291456 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b58ba3c7286d57f3ed75fce8208a120842ee22bf0df28d947f27e38ddab15a5\": container with ID starting with 0b58ba3c7286d57f3ed75fce8208a120842ee22bf0df28d947f27e38ddab15a5 not found: ID does not exist" containerID="0b58ba3c7286d57f3ed75fce8208a120842ee22bf0df28d947f27e38ddab15a5"
Mar 10 23:11:12 crc kubenswrapper[4919]: I0310 23:11:12.291499 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b58ba3c7286d57f3ed75fce8208a120842ee22bf0df28d947f27e38ddab15a5"} err="failed to get container status \"0b58ba3c7286d57f3ed75fce8208a120842ee22bf0df28d947f27e38ddab15a5\": rpc error: code = NotFound desc = could not find container \"0b58ba3c7286d57f3ed75fce8208a120842ee22bf0df28d947f27e38ddab15a5\": container with ID starting with 0b58ba3c7286d57f3ed75fce8208a120842ee22bf0df28d947f27e38ddab15a5 not found: ID does not exist"
Mar 10 23:11:12 crc kubenswrapper[4919]: I0310 23:11:12.291525 4919 scope.go:117] "RemoveContainer" containerID="141a7ff55d5fc927439d75fd58a3adc5a5e19eefe2c7e4a592ac5c4f74f48342"
Mar 10 23:11:12 crc kubenswrapper[4919]: E0310 23:11:12.292003 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"141a7ff55d5fc927439d75fd58a3adc5a5e19eefe2c7e4a592ac5c4f74f48342\": container with ID starting with 141a7ff55d5fc927439d75fd58a3adc5a5e19eefe2c7e4a592ac5c4f74f48342 not found: ID does not exist" containerID="141a7ff55d5fc927439d75fd58a3adc5a5e19eefe2c7e4a592ac5c4f74f48342"
Mar 10 23:11:12 crc kubenswrapper[4919]: I0310 23:11:12.292044 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"141a7ff55d5fc927439d75fd58a3adc5a5e19eefe2c7e4a592ac5c4f74f48342"} err="failed to get container status \"141a7ff55d5fc927439d75fd58a3adc5a5e19eefe2c7e4a592ac5c4f74f48342\": rpc error: code = NotFound desc = could not find container \"141a7ff55d5fc927439d75fd58a3adc5a5e19eefe2c7e4a592ac5c4f74f48342\": container with ID starting with 141a7ff55d5fc927439d75fd58a3adc5a5e19eefe2c7e4a592ac5c4f74f48342 not found: ID does not exist"
Mar 10 23:11:12 crc kubenswrapper[4919]: I0310 23:11:12.292079 4919 scope.go:117] "RemoveContainer" containerID="ab56340798ad1357fa00759aa5cf98ba40676710be66a48730813fd392b74c4d"
Mar 10 23:11:12 crc kubenswrapper[4919]: E0310 23:11:12.292371 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab56340798ad1357fa00759aa5cf98ba40676710be66a48730813fd392b74c4d\": container with ID starting with ab56340798ad1357fa00759aa5cf98ba40676710be66a48730813fd392b74c4d not found: ID does not exist" containerID="ab56340798ad1357fa00759aa5cf98ba40676710be66a48730813fd392b74c4d"
Mar 10 23:11:12 crc kubenswrapper[4919]: I0310 23:11:12.292413 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab56340798ad1357fa00759aa5cf98ba40676710be66a48730813fd392b74c4d"} err="failed to get container status \"ab56340798ad1357fa00759aa5cf98ba40676710be66a48730813fd392b74c4d\": rpc error: code = NotFound desc = could not find container \"ab56340798ad1357fa00759aa5cf98ba40676710be66a48730813fd392b74c4d\": container with ID starting with ab56340798ad1357fa00759aa5cf98ba40676710be66a48730813fd392b74c4d not found: ID does not exist"
Mar 10 23:11:13 crc kubenswrapper[4919]: I0310 23:11:13.495204 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60b16375-8e31-406f-acd7-5db7a331352e" path="/var/lib/kubelet/pods/60b16375-8e31-406f-acd7-5db7a331352e/volumes"
Mar 10 23:11:15 crc kubenswrapper[4919]: I0310 23:11:15.426039 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-jjtm5"
Mar 10 23:11:15 crc kubenswrapper[4919]: I0310 23:11:15.426606 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-jjtm5"
Mar 10 23:11:15 crc kubenswrapper[4919]: I0310 23:11:15.476759 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-jjtm5"
Mar 10 23:11:16 crc kubenswrapper[4919]: I0310 23:11:16.282552 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-jjtm5"
Mar 10 23:11:16 crc kubenswrapper[4919]: I0310 23:11:16.324602 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jjtm5"]
Mar 10 23:11:18 crc kubenswrapper[4919]: I0310 23:11:18.256002 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-jjtm5" podUID="26a4c034-130a-45d2-93f5-581fb0eb09c1" containerName="registry-server" containerID="cri-o://e70c58d088f892413bc7b1d8c99c9e0a14122569cd0a49f7204393164d5a9b27" gracePeriod=2
Mar 10 23:11:18 crc kubenswrapper[4919]: I0310 23:11:18.740858 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jjtm5"
Mar 10 23:11:18 crc kubenswrapper[4919]: I0310 23:11:18.779607 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/26a4c034-130a-45d2-93f5-581fb0eb09c1-catalog-content\") pod \"26a4c034-130a-45d2-93f5-581fb0eb09c1\" (UID: \"26a4c034-130a-45d2-93f5-581fb0eb09c1\") "
Mar 10 23:11:18 crc kubenswrapper[4919]: I0310 23:11:18.779654 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-427lj\" (UniqueName: \"kubernetes.io/projected/26a4c034-130a-45d2-93f5-581fb0eb09c1-kube-api-access-427lj\") pod \"26a4c034-130a-45d2-93f5-581fb0eb09c1\" (UID: \"26a4c034-130a-45d2-93f5-581fb0eb09c1\") "
Mar 10 23:11:18 crc kubenswrapper[4919]: I0310 23:11:18.780734 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/26a4c034-130a-45d2-93f5-581fb0eb09c1-utilities\") pod \"26a4c034-130a-45d2-93f5-581fb0eb09c1\" (UID: \"26a4c034-130a-45d2-93f5-581fb0eb09c1\") "
Mar 10 23:11:18 crc kubenswrapper[4919]: I0310 23:11:18.782036 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/26a4c034-130a-45d2-93f5-581fb0eb09c1-utilities" (OuterVolumeSpecName: "utilities") pod "26a4c034-130a-45d2-93f5-581fb0eb09c1" (UID: "26a4c034-130a-45d2-93f5-581fb0eb09c1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 23:11:18 crc kubenswrapper[4919]: I0310 23:11:18.785055 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26a4c034-130a-45d2-93f5-581fb0eb09c1-kube-api-access-427lj" (OuterVolumeSpecName: "kube-api-access-427lj") pod "26a4c034-130a-45d2-93f5-581fb0eb09c1" (UID: "26a4c034-130a-45d2-93f5-581fb0eb09c1"). InnerVolumeSpecName "kube-api-access-427lj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 23:11:18 crc kubenswrapper[4919]: I0310 23:11:18.813053 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/26a4c034-130a-45d2-93f5-581fb0eb09c1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "26a4c034-130a-45d2-93f5-581fb0eb09c1" (UID: "26a4c034-130a-45d2-93f5-581fb0eb09c1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 23:11:18 crc kubenswrapper[4919]: I0310 23:11:18.881990 4919 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/26a4c034-130a-45d2-93f5-581fb0eb09c1-utilities\") on node \"crc\" DevicePath \"\""
Mar 10 23:11:18 crc kubenswrapper[4919]: I0310 23:11:18.882032 4919 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/26a4c034-130a-45d2-93f5-581fb0eb09c1-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 10 23:11:18 crc kubenswrapper[4919]: I0310 23:11:18.882049 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-427lj\" (UniqueName: \"kubernetes.io/projected/26a4c034-130a-45d2-93f5-581fb0eb09c1-kube-api-access-427lj\") on node \"crc\" DevicePath \"\""
Mar 10 23:11:19 crc kubenswrapper[4919]: I0310 23:11:19.266974 4919 generic.go:334] "Generic (PLEG): container finished" podID="26a4c034-130a-45d2-93f5-581fb0eb09c1" containerID="e70c58d088f892413bc7b1d8c99c9e0a14122569cd0a49f7204393164d5a9b27" exitCode=0
Mar 10 23:11:19 crc kubenswrapper[4919]: I0310 23:11:19.267065 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jjtm5"
Mar 10 23:11:19 crc kubenswrapper[4919]: I0310 23:11:19.267079 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jjtm5" event={"ID":"26a4c034-130a-45d2-93f5-581fb0eb09c1","Type":"ContainerDied","Data":"e70c58d088f892413bc7b1d8c99c9e0a14122569cd0a49f7204393164d5a9b27"}
Mar 10 23:11:19 crc kubenswrapper[4919]: I0310 23:11:19.267517 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jjtm5" event={"ID":"26a4c034-130a-45d2-93f5-581fb0eb09c1","Type":"ContainerDied","Data":"a59a29466d735575a64ca2f840e483f6361ec538741a177e2d32afb5c047d1dd"}
Mar 10 23:11:19 crc kubenswrapper[4919]: I0310 23:11:19.267553 4919 scope.go:117] "RemoveContainer" containerID="e70c58d088f892413bc7b1d8c99c9e0a14122569cd0a49f7204393164d5a9b27"
Mar 10 23:11:19 crc kubenswrapper[4919]: I0310 23:11:19.294564 4919 scope.go:117] "RemoveContainer" containerID="ad22d79a380932e1616bcf2e530dff07f56c8eae8671aefefb4354f68821310a"
Mar 10 23:11:19 crc kubenswrapper[4919]: I0310 23:11:19.315728 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jjtm5"]
Mar 10 23:11:19 crc kubenswrapper[4919]: I0310 23:11:19.321125 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-jjtm5"]
Mar 10 23:11:19 crc kubenswrapper[4919]: I0310 23:11:19.491339 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="26a4c034-130a-45d2-93f5-581fb0eb09c1" path="/var/lib/kubelet/pods/26a4c034-130a-45d2-93f5-581fb0eb09c1/volumes"
Mar 10 23:11:19 crc kubenswrapper[4919]: I0310 23:11:19.597894 4919 scope.go:117] "RemoveContainer" containerID="48f982315dac96bd677bb4cc2458c0463c55a7fb8399664654899ab3dc0431f6"
Mar 10 23:11:19 crc kubenswrapper[4919]: I0310 23:11:19.650858 4919 scope.go:117] "RemoveContainer" containerID="e70c58d088f892413bc7b1d8c99c9e0a14122569cd0a49f7204393164d5a9b27"
Mar 10 23:11:19 crc kubenswrapper[4919]: E0310 23:11:19.651402 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e70c58d088f892413bc7b1d8c99c9e0a14122569cd0a49f7204393164d5a9b27\": container with ID starting with e70c58d088f892413bc7b1d8c99c9e0a14122569cd0a49f7204393164d5a9b27 not found: ID does not exist" containerID="e70c58d088f892413bc7b1d8c99c9e0a14122569cd0a49f7204393164d5a9b27"
Mar 10 23:11:19 crc kubenswrapper[4919]: I0310 23:11:19.651566 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e70c58d088f892413bc7b1d8c99c9e0a14122569cd0a49f7204393164d5a9b27"} err="failed to get container status \"e70c58d088f892413bc7b1d8c99c9e0a14122569cd0a49f7204393164d5a9b27\": rpc error: code = NotFound desc = could not find container \"e70c58d088f892413bc7b1d8c99c9e0a14122569cd0a49f7204393164d5a9b27\": container with ID starting with e70c58d088f892413bc7b1d8c99c9e0a14122569cd0a49f7204393164d5a9b27 not found: ID does not exist"
Mar 10 23:11:19 crc kubenswrapper[4919]: I0310 23:11:19.651678 4919 scope.go:117] "RemoveContainer" containerID="ad22d79a380932e1616bcf2e530dff07f56c8eae8671aefefb4354f68821310a"
Mar 10 23:11:19 crc kubenswrapper[4919]: E0310 23:11:19.652378 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad22d79a380932e1616bcf2e530dff07f56c8eae8671aefefb4354f68821310a\": container with ID starting with ad22d79a380932e1616bcf2e530dff07f56c8eae8671aefefb4354f68821310a not found: ID does not exist" containerID="ad22d79a380932e1616bcf2e530dff07f56c8eae8671aefefb4354f68821310a"
Mar 10 23:11:19 crc
kubenswrapper[4919]: I0310 23:11:19.652467 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad22d79a380932e1616bcf2e530dff07f56c8eae8671aefefb4354f68821310a"} err="failed to get container status \"ad22d79a380932e1616bcf2e530dff07f56c8eae8671aefefb4354f68821310a\": rpc error: code = NotFound desc = could not find container \"ad22d79a380932e1616bcf2e530dff07f56c8eae8671aefefb4354f68821310a\": container with ID starting with ad22d79a380932e1616bcf2e530dff07f56c8eae8671aefefb4354f68821310a not found: ID does not exist" Mar 10 23:11:19 crc kubenswrapper[4919]: I0310 23:11:19.652515 4919 scope.go:117] "RemoveContainer" containerID="48f982315dac96bd677bb4cc2458c0463c55a7fb8399664654899ab3dc0431f6" Mar 10 23:11:19 crc kubenswrapper[4919]: E0310 23:11:19.652886 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"48f982315dac96bd677bb4cc2458c0463c55a7fb8399664654899ab3dc0431f6\": container with ID starting with 48f982315dac96bd677bb4cc2458c0463c55a7fb8399664654899ab3dc0431f6 not found: ID does not exist" containerID="48f982315dac96bd677bb4cc2458c0463c55a7fb8399664654899ab3dc0431f6" Mar 10 23:11:19 crc kubenswrapper[4919]: I0310 23:11:19.652929 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"48f982315dac96bd677bb4cc2458c0463c55a7fb8399664654899ab3dc0431f6"} err="failed to get container status \"48f982315dac96bd677bb4cc2458c0463c55a7fb8399664654899ab3dc0431f6\": rpc error: code = NotFound desc = could not find container \"48f982315dac96bd677bb4cc2458c0463c55a7fb8399664654899ab3dc0431f6\": container with ID starting with 48f982315dac96bd677bb4cc2458c0463c55a7fb8399664654899ab3dc0431f6 not found: ID does not exist" Mar 10 23:11:29 crc kubenswrapper[4919]: I0310 23:11:29.175369 4919 patch_prober.go:28] interesting pod/machine-config-daemon-z7v4t container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 23:11:29 crc kubenswrapper[4919]: I0310 23:11:29.176046 4919 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 23:11:59 crc kubenswrapper[4919]: I0310 23:11:59.176262 4919 patch_prober.go:28] interesting pod/machine-config-daemon-z7v4t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 23:11:59 crc kubenswrapper[4919]: I0310 23:11:59.177379 4919 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 23:12:00 crc kubenswrapper[4919]: I0310 23:12:00.159211 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553072-zq5sf"] Mar 10 23:12:00 crc kubenswrapper[4919]: E0310 23:12:00.159704 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60b16375-8e31-406f-acd7-5db7a331352e" containerName="extract-content" Mar 10 23:12:00 crc kubenswrapper[4919]: I0310 23:12:00.159733 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="60b16375-8e31-406f-acd7-5db7a331352e" containerName="extract-content" Mar 10 23:12:00 crc kubenswrapper[4919]: E0310 23:12:00.159774 4919 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="60b16375-8e31-406f-acd7-5db7a331352e" containerName="registry-server" Mar 10 23:12:00 crc kubenswrapper[4919]: I0310 23:12:00.159788 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="60b16375-8e31-406f-acd7-5db7a331352e" containerName="registry-server" Mar 10 23:12:00 crc kubenswrapper[4919]: E0310 23:12:00.159804 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26a4c034-130a-45d2-93f5-581fb0eb09c1" containerName="extract-utilities" Mar 10 23:12:00 crc kubenswrapper[4919]: I0310 23:12:00.159816 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="26a4c034-130a-45d2-93f5-581fb0eb09c1" containerName="extract-utilities" Mar 10 23:12:00 crc kubenswrapper[4919]: E0310 23:12:00.159837 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60b16375-8e31-406f-acd7-5db7a331352e" containerName="extract-utilities" Mar 10 23:12:00 crc kubenswrapper[4919]: I0310 23:12:00.159849 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="60b16375-8e31-406f-acd7-5db7a331352e" containerName="extract-utilities" Mar 10 23:12:00 crc kubenswrapper[4919]: E0310 23:12:00.159875 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26a4c034-130a-45d2-93f5-581fb0eb09c1" containerName="registry-server" Mar 10 23:12:00 crc kubenswrapper[4919]: I0310 23:12:00.159886 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="26a4c034-130a-45d2-93f5-581fb0eb09c1" containerName="registry-server" Mar 10 23:12:00 crc kubenswrapper[4919]: E0310 23:12:00.159900 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26a4c034-130a-45d2-93f5-581fb0eb09c1" containerName="extract-content" Mar 10 23:12:00 crc kubenswrapper[4919]: I0310 23:12:00.159910 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="26a4c034-130a-45d2-93f5-581fb0eb09c1" containerName="extract-content" Mar 10 23:12:00 crc kubenswrapper[4919]: I0310 23:12:00.160153 4919 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="60b16375-8e31-406f-acd7-5db7a331352e" containerName="registry-server" Mar 10 23:12:00 crc kubenswrapper[4919]: I0310 23:12:00.160174 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="26a4c034-130a-45d2-93f5-581fb0eb09c1" containerName="registry-server" Mar 10 23:12:00 crc kubenswrapper[4919]: I0310 23:12:00.160948 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553072-zq5sf" Mar 10 23:12:00 crc kubenswrapper[4919]: I0310 23:12:00.163131 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 23:12:00 crc kubenswrapper[4919]: I0310 23:12:00.163601 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 23:12:00 crc kubenswrapper[4919]: I0310 23:12:00.163650 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wbvtv" Mar 10 23:12:00 crc kubenswrapper[4919]: I0310 23:12:00.166748 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553072-zq5sf"] Mar 10 23:12:00 crc kubenswrapper[4919]: I0310 23:12:00.243057 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mnkn2\" (UniqueName: \"kubernetes.io/projected/a67e1f39-f53e-4fdc-a7df-612e0f16da3d-kube-api-access-mnkn2\") pod \"auto-csr-approver-29553072-zq5sf\" (UID: \"a67e1f39-f53e-4fdc-a7df-612e0f16da3d\") " pod="openshift-infra/auto-csr-approver-29553072-zq5sf" Mar 10 23:12:00 crc kubenswrapper[4919]: I0310 23:12:00.344689 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mnkn2\" (UniqueName: \"kubernetes.io/projected/a67e1f39-f53e-4fdc-a7df-612e0f16da3d-kube-api-access-mnkn2\") pod \"auto-csr-approver-29553072-zq5sf\" (UID: 
\"a67e1f39-f53e-4fdc-a7df-612e0f16da3d\") " pod="openshift-infra/auto-csr-approver-29553072-zq5sf" Mar 10 23:12:00 crc kubenswrapper[4919]: I0310 23:12:00.364913 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mnkn2\" (UniqueName: \"kubernetes.io/projected/a67e1f39-f53e-4fdc-a7df-612e0f16da3d-kube-api-access-mnkn2\") pod \"auto-csr-approver-29553072-zq5sf\" (UID: \"a67e1f39-f53e-4fdc-a7df-612e0f16da3d\") " pod="openshift-infra/auto-csr-approver-29553072-zq5sf" Mar 10 23:12:00 crc kubenswrapper[4919]: I0310 23:12:00.484330 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553072-zq5sf" Mar 10 23:12:00 crc kubenswrapper[4919]: I0310 23:12:00.995734 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553072-zq5sf"] Mar 10 23:12:01 crc kubenswrapper[4919]: W0310 23:12:01.000730 4919 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda67e1f39_f53e_4fdc_a7df_612e0f16da3d.slice/crio-176f6fd0521e1ffa90baab1bfbf85d53cb581b9961f974a2deb4c41b8149d022 WatchSource:0}: Error finding container 176f6fd0521e1ffa90baab1bfbf85d53cb581b9961f974a2deb4c41b8149d022: Status 404 returned error can't find the container with id 176f6fd0521e1ffa90baab1bfbf85d53cb581b9961f974a2deb4c41b8149d022 Mar 10 23:12:01 crc kubenswrapper[4919]: I0310 23:12:01.629457 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553072-zq5sf" event={"ID":"a67e1f39-f53e-4fdc-a7df-612e0f16da3d","Type":"ContainerStarted","Data":"176f6fd0521e1ffa90baab1bfbf85d53cb581b9961f974a2deb4c41b8149d022"} Mar 10 23:12:02 crc kubenswrapper[4919]: I0310 23:12:02.642865 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553072-zq5sf" 
event={"ID":"a67e1f39-f53e-4fdc-a7df-612e0f16da3d","Type":"ContainerStarted","Data":"66e3d69b634938b442f517d1b95772cfc4d8e1093abee852a01f740f4406a872"} Mar 10 23:12:02 crc kubenswrapper[4919]: I0310 23:12:02.660382 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29553072-zq5sf" podStartSLOduration=1.7187240670000001 podStartE2EDuration="2.660358735s" podCreationTimestamp="2026-03-10 23:12:00 +0000 UTC" firstStartedPulling="2026-03-10 23:12:01.004681519 +0000 UTC m=+4908.246562137" lastFinishedPulling="2026-03-10 23:12:01.946316197 +0000 UTC m=+4909.188196805" observedRunningTime="2026-03-10 23:12:02.659198393 +0000 UTC m=+4909.901079011" watchObservedRunningTime="2026-03-10 23:12:02.660358735 +0000 UTC m=+4909.902239343" Mar 10 23:12:03 crc kubenswrapper[4919]: I0310 23:12:03.652604 4919 generic.go:334] "Generic (PLEG): container finished" podID="a67e1f39-f53e-4fdc-a7df-612e0f16da3d" containerID="66e3d69b634938b442f517d1b95772cfc4d8e1093abee852a01f740f4406a872" exitCode=0 Mar 10 23:12:03 crc kubenswrapper[4919]: I0310 23:12:03.652658 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553072-zq5sf" event={"ID":"a67e1f39-f53e-4fdc-a7df-612e0f16da3d","Type":"ContainerDied","Data":"66e3d69b634938b442f517d1b95772cfc4d8e1093abee852a01f740f4406a872"} Mar 10 23:12:05 crc kubenswrapper[4919]: I0310 23:12:05.012137 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553072-zq5sf" Mar 10 23:12:05 crc kubenswrapper[4919]: I0310 23:12:05.122103 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnkn2\" (UniqueName: \"kubernetes.io/projected/a67e1f39-f53e-4fdc-a7df-612e0f16da3d-kube-api-access-mnkn2\") pod \"a67e1f39-f53e-4fdc-a7df-612e0f16da3d\" (UID: \"a67e1f39-f53e-4fdc-a7df-612e0f16da3d\") " Mar 10 23:12:05 crc kubenswrapper[4919]: I0310 23:12:05.131282 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a67e1f39-f53e-4fdc-a7df-612e0f16da3d-kube-api-access-mnkn2" (OuterVolumeSpecName: "kube-api-access-mnkn2") pod "a67e1f39-f53e-4fdc-a7df-612e0f16da3d" (UID: "a67e1f39-f53e-4fdc-a7df-612e0f16da3d"). InnerVolumeSpecName "kube-api-access-mnkn2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 23:12:05 crc kubenswrapper[4919]: I0310 23:12:05.223437 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnkn2\" (UniqueName: \"kubernetes.io/projected/a67e1f39-f53e-4fdc-a7df-612e0f16da3d-kube-api-access-mnkn2\") on node \"crc\" DevicePath \"\"" Mar 10 23:12:05 crc kubenswrapper[4919]: I0310 23:12:05.672872 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553072-zq5sf" event={"ID":"a67e1f39-f53e-4fdc-a7df-612e0f16da3d","Type":"ContainerDied","Data":"176f6fd0521e1ffa90baab1bfbf85d53cb581b9961f974a2deb4c41b8149d022"} Mar 10 23:12:05 crc kubenswrapper[4919]: I0310 23:12:05.672917 4919 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="176f6fd0521e1ffa90baab1bfbf85d53cb581b9961f974a2deb4c41b8149d022" Mar 10 23:12:05 crc kubenswrapper[4919]: I0310 23:12:05.672961 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553072-zq5sf" Mar 10 23:12:05 crc kubenswrapper[4919]: I0310 23:12:05.746921 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553066-6ks5t"] Mar 10 23:12:05 crc kubenswrapper[4919]: I0310 23:12:05.755930 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553066-6ks5t"] Mar 10 23:12:07 crc kubenswrapper[4919]: I0310 23:12:07.496816 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3d2345c-1f62-493a-80e7-3d6db5051332" path="/var/lib/kubelet/pods/e3d2345c-1f62-493a-80e7-3d6db5051332/volumes" Mar 10 23:12:13 crc kubenswrapper[4919]: I0310 23:12:13.119863 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-k4lxh"] Mar 10 23:12:13 crc kubenswrapper[4919]: I0310 23:12:13.130228 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-k4lxh"] Mar 10 23:12:13 crc kubenswrapper[4919]: I0310 23:12:13.269490 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-6rrv5"] Mar 10 23:12:13 crc kubenswrapper[4919]: E0310 23:12:13.269982 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a67e1f39-f53e-4fdc-a7df-612e0f16da3d" containerName="oc" Mar 10 23:12:13 crc kubenswrapper[4919]: I0310 23:12:13.270021 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="a67e1f39-f53e-4fdc-a7df-612e0f16da3d" containerName="oc" Mar 10 23:12:13 crc kubenswrapper[4919]: I0310 23:12:13.270307 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="a67e1f39-f53e-4fdc-a7df-612e0f16da3d" containerName="oc" Mar 10 23:12:13 crc kubenswrapper[4919]: I0310 23:12:13.271111 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-6rrv5" Mar 10 23:12:13 crc kubenswrapper[4919]: I0310 23:12:13.280065 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Mar 10 23:12:13 crc kubenswrapper[4919]: I0310 23:12:13.280594 4919 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-st8sh" Mar 10 23:12:13 crc kubenswrapper[4919]: I0310 23:12:13.281258 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Mar 10 23:12:13 crc kubenswrapper[4919]: I0310 23:12:13.282121 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Mar 10 23:12:13 crc kubenswrapper[4919]: I0310 23:12:13.300367 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-6rrv5"] Mar 10 23:12:13 crc kubenswrapper[4919]: I0310 23:12:13.372050 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/b5fd796c-2324-4457-a4c6-1c202001d84a-crc-storage\") pod \"crc-storage-crc-6rrv5\" (UID: \"b5fd796c-2324-4457-a4c6-1c202001d84a\") " pod="crc-storage/crc-storage-crc-6rrv5" Mar 10 23:12:13 crc kubenswrapper[4919]: I0310 23:12:13.372232 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/b5fd796c-2324-4457-a4c6-1c202001d84a-node-mnt\") pod \"crc-storage-crc-6rrv5\" (UID: \"b5fd796c-2324-4457-a4c6-1c202001d84a\") " pod="crc-storage/crc-storage-crc-6rrv5" Mar 10 23:12:13 crc kubenswrapper[4919]: I0310 23:12:13.372503 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sb4zl\" (UniqueName: \"kubernetes.io/projected/b5fd796c-2324-4457-a4c6-1c202001d84a-kube-api-access-sb4zl\") pod \"crc-storage-crc-6rrv5\" (UID: 
\"b5fd796c-2324-4457-a4c6-1c202001d84a\") " pod="crc-storage/crc-storage-crc-6rrv5" Mar 10 23:12:13 crc kubenswrapper[4919]: I0310 23:12:13.474127 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sb4zl\" (UniqueName: \"kubernetes.io/projected/b5fd796c-2324-4457-a4c6-1c202001d84a-kube-api-access-sb4zl\") pod \"crc-storage-crc-6rrv5\" (UID: \"b5fd796c-2324-4457-a4c6-1c202001d84a\") " pod="crc-storage/crc-storage-crc-6rrv5" Mar 10 23:12:13 crc kubenswrapper[4919]: I0310 23:12:13.474260 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/b5fd796c-2324-4457-a4c6-1c202001d84a-crc-storage\") pod \"crc-storage-crc-6rrv5\" (UID: \"b5fd796c-2324-4457-a4c6-1c202001d84a\") " pod="crc-storage/crc-storage-crc-6rrv5" Mar 10 23:12:13 crc kubenswrapper[4919]: I0310 23:12:13.474344 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/b5fd796c-2324-4457-a4c6-1c202001d84a-node-mnt\") pod \"crc-storage-crc-6rrv5\" (UID: \"b5fd796c-2324-4457-a4c6-1c202001d84a\") " pod="crc-storage/crc-storage-crc-6rrv5" Mar 10 23:12:13 crc kubenswrapper[4919]: I0310 23:12:13.474675 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/b5fd796c-2324-4457-a4c6-1c202001d84a-node-mnt\") pod \"crc-storage-crc-6rrv5\" (UID: \"b5fd796c-2324-4457-a4c6-1c202001d84a\") " pod="crc-storage/crc-storage-crc-6rrv5" Mar 10 23:12:13 crc kubenswrapper[4919]: I0310 23:12:13.476266 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Mar 10 23:12:13 crc kubenswrapper[4919]: I0310 23:12:13.486286 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/b5fd796c-2324-4457-a4c6-1c202001d84a-crc-storage\") pod 
\"crc-storage-crc-6rrv5\" (UID: \"b5fd796c-2324-4457-a4c6-1c202001d84a\") " pod="crc-storage/crc-storage-crc-6rrv5" Mar 10 23:12:13 crc kubenswrapper[4919]: I0310 23:12:13.489245 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Mar 10 23:12:13 crc kubenswrapper[4919]: I0310 23:12:13.490629 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31e35b2a-3e5e-44a4-a9e4-fbe83610e813" path="/var/lib/kubelet/pods/31e35b2a-3e5e-44a4-a9e4-fbe83610e813/volumes" Mar 10 23:12:13 crc kubenswrapper[4919]: I0310 23:12:13.499516 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Mar 10 23:12:13 crc kubenswrapper[4919]: I0310 23:12:13.517423 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sb4zl\" (UniqueName: \"kubernetes.io/projected/b5fd796c-2324-4457-a4c6-1c202001d84a-kube-api-access-sb4zl\") pod \"crc-storage-crc-6rrv5\" (UID: \"b5fd796c-2324-4457-a4c6-1c202001d84a\") " pod="crc-storage/crc-storage-crc-6rrv5" Mar 10 23:12:13 crc kubenswrapper[4919]: I0310 23:12:13.640739 4919 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-st8sh" Mar 10 23:12:13 crc kubenswrapper[4919]: I0310 23:12:13.649519 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-6rrv5" Mar 10 23:12:14 crc kubenswrapper[4919]: I0310 23:12:14.102560 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-6rrv5"] Mar 10 23:12:14 crc kubenswrapper[4919]: I0310 23:12:14.772716 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-6rrv5" event={"ID":"b5fd796c-2324-4457-a4c6-1c202001d84a","Type":"ContainerStarted","Data":"9b6cf3fc45d9272b53b66767ea258bcfba152658b6a1554ce4186fe1765f443b"} Mar 10 23:12:15 crc kubenswrapper[4919]: I0310 23:12:15.779109 4919 generic.go:334] "Generic (PLEG): container finished" podID="b5fd796c-2324-4457-a4c6-1c202001d84a" containerID="a60687c7d2ead121ac4cc1ccf2d34efe00ca55b38145c8900e1ae0d4742a56ce" exitCode=0 Mar 10 23:12:15 crc kubenswrapper[4919]: I0310 23:12:15.779291 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-6rrv5" event={"ID":"b5fd796c-2324-4457-a4c6-1c202001d84a","Type":"ContainerDied","Data":"a60687c7d2ead121ac4cc1ccf2d34efe00ca55b38145c8900e1ae0d4742a56ce"} Mar 10 23:12:17 crc kubenswrapper[4919]: I0310 23:12:17.150668 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-6rrv5" Mar 10 23:12:17 crc kubenswrapper[4919]: I0310 23:12:17.228627 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb4zl\" (UniqueName: \"kubernetes.io/projected/b5fd796c-2324-4457-a4c6-1c202001d84a-kube-api-access-sb4zl\") pod \"b5fd796c-2324-4457-a4c6-1c202001d84a\" (UID: \"b5fd796c-2324-4457-a4c6-1c202001d84a\") " Mar 10 23:12:17 crc kubenswrapper[4919]: I0310 23:12:17.228832 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/b5fd796c-2324-4457-a4c6-1c202001d84a-crc-storage\") pod \"b5fd796c-2324-4457-a4c6-1c202001d84a\" (UID: \"b5fd796c-2324-4457-a4c6-1c202001d84a\") " Mar 10 23:12:17 crc kubenswrapper[4919]: I0310 23:12:17.228865 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/b5fd796c-2324-4457-a4c6-1c202001d84a-node-mnt\") pod \"b5fd796c-2324-4457-a4c6-1c202001d84a\" (UID: \"b5fd796c-2324-4457-a4c6-1c202001d84a\") " Mar 10 23:12:17 crc kubenswrapper[4919]: I0310 23:12:17.228911 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b5fd796c-2324-4457-a4c6-1c202001d84a-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "b5fd796c-2324-4457-a4c6-1c202001d84a" (UID: "b5fd796c-2324-4457-a4c6-1c202001d84a"). InnerVolumeSpecName "node-mnt". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 23:12:17 crc kubenswrapper[4919]: I0310 23:12:17.229139 4919 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/b5fd796c-2324-4457-a4c6-1c202001d84a-node-mnt\") on node \"crc\" DevicePath \"\"" Mar 10 23:12:17 crc kubenswrapper[4919]: I0310 23:12:17.235430 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5fd796c-2324-4457-a4c6-1c202001d84a-kube-api-access-sb4zl" (OuterVolumeSpecName: "kube-api-access-sb4zl") pod "b5fd796c-2324-4457-a4c6-1c202001d84a" (UID: "b5fd796c-2324-4457-a4c6-1c202001d84a"). InnerVolumeSpecName "kube-api-access-sb4zl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 23:12:17 crc kubenswrapper[4919]: I0310 23:12:17.260045 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5fd796c-2324-4457-a4c6-1c202001d84a-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "b5fd796c-2324-4457-a4c6-1c202001d84a" (UID: "b5fd796c-2324-4457-a4c6-1c202001d84a"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 23:12:17 crc kubenswrapper[4919]: I0310 23:12:17.331066 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb4zl\" (UniqueName: \"kubernetes.io/projected/b5fd796c-2324-4457-a4c6-1c202001d84a-kube-api-access-sb4zl\") on node \"crc\" DevicePath \"\"" Mar 10 23:12:17 crc kubenswrapper[4919]: I0310 23:12:17.331121 4919 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/b5fd796c-2324-4457-a4c6-1c202001d84a-crc-storage\") on node \"crc\" DevicePath \"\"" Mar 10 23:12:17 crc kubenswrapper[4919]: I0310 23:12:17.802457 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-6rrv5" event={"ID":"b5fd796c-2324-4457-a4c6-1c202001d84a","Type":"ContainerDied","Data":"9b6cf3fc45d9272b53b66767ea258bcfba152658b6a1554ce4186fe1765f443b"} Mar 10 23:12:17 crc kubenswrapper[4919]: I0310 23:12:17.802544 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-6rrv5" Mar 10 23:12:17 crc kubenswrapper[4919]: I0310 23:12:17.802974 4919 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9b6cf3fc45d9272b53b66767ea258bcfba152658b6a1554ce4186fe1765f443b" Mar 10 23:12:19 crc kubenswrapper[4919]: I0310 23:12:19.567489 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-6rrv5"] Mar 10 23:12:19 crc kubenswrapper[4919]: I0310 23:12:19.572291 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-6rrv5"] Mar 10 23:12:19 crc kubenswrapper[4919]: I0310 23:12:19.748299 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-bxhv8"] Mar 10 23:12:19 crc kubenswrapper[4919]: E0310 23:12:19.749259 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5fd796c-2324-4457-a4c6-1c202001d84a" containerName="storage" Mar 10 23:12:19 crc kubenswrapper[4919]: I0310 23:12:19.749301 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5fd796c-2324-4457-a4c6-1c202001d84a" containerName="storage" Mar 10 23:12:19 crc kubenswrapper[4919]: I0310 23:12:19.749649 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5fd796c-2324-4457-a4c6-1c202001d84a" containerName="storage" Mar 10 23:12:19 crc kubenswrapper[4919]: I0310 23:12:19.750503 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-bxhv8" Mar 10 23:12:19 crc kubenswrapper[4919]: I0310 23:12:19.753966 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Mar 10 23:12:19 crc kubenswrapper[4919]: I0310 23:12:19.754270 4919 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-st8sh" Mar 10 23:12:19 crc kubenswrapper[4919]: I0310 23:12:19.754434 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Mar 10 23:12:19 crc kubenswrapper[4919]: I0310 23:12:19.754543 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Mar 10 23:12:19 crc kubenswrapper[4919]: I0310 23:12:19.756074 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-bxhv8"] Mar 10 23:12:19 crc kubenswrapper[4919]: I0310 23:12:19.889157 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dl8th\" (UniqueName: \"kubernetes.io/projected/fa1616ed-c076-4097-8235-1a846626399b-kube-api-access-dl8th\") pod \"crc-storage-crc-bxhv8\" (UID: \"fa1616ed-c076-4097-8235-1a846626399b\") " pod="crc-storage/crc-storage-crc-bxhv8" Mar 10 23:12:19 crc kubenswrapper[4919]: I0310 23:12:19.889236 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/fa1616ed-c076-4097-8235-1a846626399b-node-mnt\") pod \"crc-storage-crc-bxhv8\" (UID: \"fa1616ed-c076-4097-8235-1a846626399b\") " pod="crc-storage/crc-storage-crc-bxhv8" Mar 10 23:12:19 crc kubenswrapper[4919]: I0310 23:12:19.889265 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/fa1616ed-c076-4097-8235-1a846626399b-crc-storage\") pod \"crc-storage-crc-bxhv8\" (UID: 
\"fa1616ed-c076-4097-8235-1a846626399b\") " pod="crc-storage/crc-storage-crc-bxhv8" Mar 10 23:12:19 crc kubenswrapper[4919]: I0310 23:12:19.990368 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/fa1616ed-c076-4097-8235-1a846626399b-node-mnt\") pod \"crc-storage-crc-bxhv8\" (UID: \"fa1616ed-c076-4097-8235-1a846626399b\") " pod="crc-storage/crc-storage-crc-bxhv8" Mar 10 23:12:19 crc kubenswrapper[4919]: I0310 23:12:19.990470 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/fa1616ed-c076-4097-8235-1a846626399b-crc-storage\") pod \"crc-storage-crc-bxhv8\" (UID: \"fa1616ed-c076-4097-8235-1a846626399b\") " pod="crc-storage/crc-storage-crc-bxhv8" Mar 10 23:12:19 crc kubenswrapper[4919]: I0310 23:12:19.990624 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dl8th\" (UniqueName: \"kubernetes.io/projected/fa1616ed-c076-4097-8235-1a846626399b-kube-api-access-dl8th\") pod \"crc-storage-crc-bxhv8\" (UID: \"fa1616ed-c076-4097-8235-1a846626399b\") " pod="crc-storage/crc-storage-crc-bxhv8" Mar 10 23:12:19 crc kubenswrapper[4919]: I0310 23:12:19.990714 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/fa1616ed-c076-4097-8235-1a846626399b-node-mnt\") pod \"crc-storage-crc-bxhv8\" (UID: \"fa1616ed-c076-4097-8235-1a846626399b\") " pod="crc-storage/crc-storage-crc-bxhv8" Mar 10 23:12:19 crc kubenswrapper[4919]: I0310 23:12:19.991483 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/fa1616ed-c076-4097-8235-1a846626399b-crc-storage\") pod \"crc-storage-crc-bxhv8\" (UID: \"fa1616ed-c076-4097-8235-1a846626399b\") " pod="crc-storage/crc-storage-crc-bxhv8" Mar 10 23:12:20 crc kubenswrapper[4919]: I0310 23:12:20.013379 4919 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dl8th\" (UniqueName: \"kubernetes.io/projected/fa1616ed-c076-4097-8235-1a846626399b-kube-api-access-dl8th\") pod \"crc-storage-crc-bxhv8\" (UID: \"fa1616ed-c076-4097-8235-1a846626399b\") " pod="crc-storage/crc-storage-crc-bxhv8" Mar 10 23:12:20 crc kubenswrapper[4919]: I0310 23:12:20.068370 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-bxhv8" Mar 10 23:12:20 crc kubenswrapper[4919]: I0310 23:12:20.487041 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-bxhv8"] Mar 10 23:12:20 crc kubenswrapper[4919]: I0310 23:12:20.828615 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-bxhv8" event={"ID":"fa1616ed-c076-4097-8235-1a846626399b","Type":"ContainerStarted","Data":"92035724275b3d390b199123f412b9e3e90578a9c4b6a5b93fe4b249b690108a"} Mar 10 23:12:21 crc kubenswrapper[4919]: I0310 23:12:21.487896 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5fd796c-2324-4457-a4c6-1c202001d84a" path="/var/lib/kubelet/pods/b5fd796c-2324-4457-a4c6-1c202001d84a/volumes" Mar 10 23:12:21 crc kubenswrapper[4919]: I0310 23:12:21.836750 4919 generic.go:334] "Generic (PLEG): container finished" podID="fa1616ed-c076-4097-8235-1a846626399b" containerID="dee16c3ed936a6ccc4296002000a3ec6d27131cd568379d200876eb3b7b617ac" exitCode=0 Mar 10 23:12:21 crc kubenswrapper[4919]: I0310 23:12:21.836805 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-bxhv8" event={"ID":"fa1616ed-c076-4097-8235-1a846626399b","Type":"ContainerDied","Data":"dee16c3ed936a6ccc4296002000a3ec6d27131cd568379d200876eb3b7b617ac"} Mar 10 23:12:23 crc kubenswrapper[4919]: I0310 23:12:23.090766 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-bxhv8" Mar 10 23:12:23 crc kubenswrapper[4919]: I0310 23:12:23.236931 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dl8th\" (UniqueName: \"kubernetes.io/projected/fa1616ed-c076-4097-8235-1a846626399b-kube-api-access-dl8th\") pod \"fa1616ed-c076-4097-8235-1a846626399b\" (UID: \"fa1616ed-c076-4097-8235-1a846626399b\") " Mar 10 23:12:23 crc kubenswrapper[4919]: I0310 23:12:23.236994 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/fa1616ed-c076-4097-8235-1a846626399b-crc-storage\") pod \"fa1616ed-c076-4097-8235-1a846626399b\" (UID: \"fa1616ed-c076-4097-8235-1a846626399b\") " Mar 10 23:12:23 crc kubenswrapper[4919]: I0310 23:12:23.237111 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/fa1616ed-c076-4097-8235-1a846626399b-node-mnt\") pod \"fa1616ed-c076-4097-8235-1a846626399b\" (UID: \"fa1616ed-c076-4097-8235-1a846626399b\") " Mar 10 23:12:23 crc kubenswrapper[4919]: I0310 23:12:23.237381 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fa1616ed-c076-4097-8235-1a846626399b-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "fa1616ed-c076-4097-8235-1a846626399b" (UID: "fa1616ed-c076-4097-8235-1a846626399b"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 23:12:23 crc kubenswrapper[4919]: I0310 23:12:23.243155 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa1616ed-c076-4097-8235-1a846626399b-kube-api-access-dl8th" (OuterVolumeSpecName: "kube-api-access-dl8th") pod "fa1616ed-c076-4097-8235-1a846626399b" (UID: "fa1616ed-c076-4097-8235-1a846626399b"). InnerVolumeSpecName "kube-api-access-dl8th". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 23:12:23 crc kubenswrapper[4919]: I0310 23:12:23.261120 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa1616ed-c076-4097-8235-1a846626399b-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "fa1616ed-c076-4097-8235-1a846626399b" (UID: "fa1616ed-c076-4097-8235-1a846626399b"). InnerVolumeSpecName "crc-storage". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 23:12:23 crc kubenswrapper[4919]: I0310 23:12:23.338471 4919 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/fa1616ed-c076-4097-8235-1a846626399b-node-mnt\") on node \"crc\" DevicePath \"\"" Mar 10 23:12:23 crc kubenswrapper[4919]: I0310 23:12:23.338506 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dl8th\" (UniqueName: \"kubernetes.io/projected/fa1616ed-c076-4097-8235-1a846626399b-kube-api-access-dl8th\") on node \"crc\" DevicePath \"\"" Mar 10 23:12:23 crc kubenswrapper[4919]: I0310 23:12:23.338521 4919 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/fa1616ed-c076-4097-8235-1a846626399b-crc-storage\") on node \"crc\" DevicePath \"\"" Mar 10 23:12:23 crc kubenswrapper[4919]: I0310 23:12:23.854082 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-bxhv8" event={"ID":"fa1616ed-c076-4097-8235-1a846626399b","Type":"ContainerDied","Data":"92035724275b3d390b199123f412b9e3e90578a9c4b6a5b93fe4b249b690108a"} Mar 10 23:12:23 crc kubenswrapper[4919]: I0310 23:12:23.854142 4919 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="92035724275b3d390b199123f412b9e3e90578a9c4b6a5b93fe4b249b690108a" Mar 10 23:12:23 crc kubenswrapper[4919]: I0310 23:12:23.854144 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-bxhv8" Mar 10 23:12:29 crc kubenswrapper[4919]: I0310 23:12:29.175467 4919 patch_prober.go:28] interesting pod/machine-config-daemon-z7v4t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 23:12:29 crc kubenswrapper[4919]: I0310 23:12:29.176235 4919 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 23:12:29 crc kubenswrapper[4919]: I0310 23:12:29.176340 4919 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" Mar 10 23:12:29 crc kubenswrapper[4919]: I0310 23:12:29.177204 4919 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"db23846019f61e1fa4301ac7d8406453060a07c19e1dddc6fc1714cf596701e8"} pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 10 23:12:29 crc kubenswrapper[4919]: I0310 23:12:29.177349 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" containerName="machine-config-daemon" containerID="cri-o://db23846019f61e1fa4301ac7d8406453060a07c19e1dddc6fc1714cf596701e8" gracePeriod=600 Mar 10 23:12:29 crc kubenswrapper[4919]: E0310 23:12:29.307452 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" 
for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" Mar 10 23:12:29 crc kubenswrapper[4919]: I0310 23:12:29.906787 4919 generic.go:334] "Generic (PLEG): container finished" podID="566678d1-f416-4116-ab20-b30dceb86cdc" containerID="db23846019f61e1fa4301ac7d8406453060a07c19e1dddc6fc1714cf596701e8" exitCode=0 Mar 10 23:12:29 crc kubenswrapper[4919]: I0310 23:12:29.906849 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" event={"ID":"566678d1-f416-4116-ab20-b30dceb86cdc","Type":"ContainerDied","Data":"db23846019f61e1fa4301ac7d8406453060a07c19e1dddc6fc1714cf596701e8"} Mar 10 23:12:29 crc kubenswrapper[4919]: I0310 23:12:29.906887 4919 scope.go:117] "RemoveContainer" containerID="c77b50dfa88f3a5900cce9090fa8f1f418ddf2416d171d225eb1d604a613c491" Mar 10 23:12:29 crc kubenswrapper[4919]: I0310 23:12:29.907542 4919 scope.go:117] "RemoveContainer" containerID="db23846019f61e1fa4301ac7d8406453060a07c19e1dddc6fc1714cf596701e8" Mar 10 23:12:29 crc kubenswrapper[4919]: E0310 23:12:29.907931 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" Mar 10 23:12:43 crc kubenswrapper[4919]: I0310 23:12:43.488205 4919 scope.go:117] "RemoveContainer" containerID="db23846019f61e1fa4301ac7d8406453060a07c19e1dddc6fc1714cf596701e8" Mar 10 23:12:43 crc 
kubenswrapper[4919]: E0310 23:12:43.489173 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" Mar 10 23:12:46 crc kubenswrapper[4919]: I0310 23:12:46.379446 4919 scope.go:117] "RemoveContainer" containerID="de91a2f2adf04a60bdb3422734b5a3bd8eb19086cb6b976c1920bf52ac6f84c4" Mar 10 23:12:46 crc kubenswrapper[4919]: I0310 23:12:46.422930 4919 scope.go:117] "RemoveContainer" containerID="87bb54e220785a904603c96f9e6b9e0a04753b2a85ea87b9d6040acf8421b5b9" Mar 10 23:12:57 crc kubenswrapper[4919]: I0310 23:12:57.480814 4919 scope.go:117] "RemoveContainer" containerID="db23846019f61e1fa4301ac7d8406453060a07c19e1dddc6fc1714cf596701e8" Mar 10 23:12:57 crc kubenswrapper[4919]: E0310 23:12:57.483534 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" Mar 10 23:13:09 crc kubenswrapper[4919]: I0310 23:13:09.480619 4919 scope.go:117] "RemoveContainer" containerID="db23846019f61e1fa4301ac7d8406453060a07c19e1dddc6fc1714cf596701e8" Mar 10 23:13:09 crc kubenswrapper[4919]: E0310 23:13:09.481647 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" Mar 10 23:13:23 crc kubenswrapper[4919]: I0310 23:13:23.484736 4919 scope.go:117] "RemoveContainer" containerID="db23846019f61e1fa4301ac7d8406453060a07c19e1dddc6fc1714cf596701e8" Mar 10 23:13:23 crc kubenswrapper[4919]: E0310 23:13:23.485540 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" Mar 10 23:13:36 crc kubenswrapper[4919]: I0310 23:13:36.480033 4919 scope.go:117] "RemoveContainer" containerID="db23846019f61e1fa4301ac7d8406453060a07c19e1dddc6fc1714cf596701e8" Mar 10 23:13:36 crc kubenswrapper[4919]: E0310 23:13:36.480947 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" Mar 10 23:13:47 crc kubenswrapper[4919]: I0310 23:13:47.480732 4919 scope.go:117] "RemoveContainer" containerID="db23846019f61e1fa4301ac7d8406453060a07c19e1dddc6fc1714cf596701e8" Mar 10 23:13:47 crc kubenswrapper[4919]: E0310 23:13:47.482000 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" Mar 10 23:13:59 crc kubenswrapper[4919]: I0310 23:13:59.480812 4919 scope.go:117] "RemoveContainer" containerID="db23846019f61e1fa4301ac7d8406453060a07c19e1dddc6fc1714cf596701e8" Mar 10 23:13:59 crc kubenswrapper[4919]: E0310 23:13:59.481536 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" Mar 10 23:14:00 crc kubenswrapper[4919]: I0310 23:14:00.161635 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553074-w4sgs"] Mar 10 23:14:00 crc kubenswrapper[4919]: E0310 23:14:00.162106 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa1616ed-c076-4097-8235-1a846626399b" containerName="storage" Mar 10 23:14:00 crc kubenswrapper[4919]: I0310 23:14:00.162125 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa1616ed-c076-4097-8235-1a846626399b" containerName="storage" Mar 10 23:14:00 crc kubenswrapper[4919]: I0310 23:14:00.162372 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa1616ed-c076-4097-8235-1a846626399b" containerName="storage" Mar 10 23:14:00 crc kubenswrapper[4919]: I0310 23:14:00.163160 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553074-w4sgs" Mar 10 23:14:00 crc kubenswrapper[4919]: I0310 23:14:00.171119 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 23:14:00 crc kubenswrapper[4919]: I0310 23:14:00.173704 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553074-w4sgs"] Mar 10 23:14:00 crc kubenswrapper[4919]: I0310 23:14:00.174077 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wbvtv" Mar 10 23:14:00 crc kubenswrapper[4919]: I0310 23:14:00.174486 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 23:14:00 crc kubenswrapper[4919]: I0310 23:14:00.259955 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5ngb\" (UniqueName: \"kubernetes.io/projected/15dd46e4-0f3d-43d2-9c42-f2538297c74f-kube-api-access-p5ngb\") pod \"auto-csr-approver-29553074-w4sgs\" (UID: \"15dd46e4-0f3d-43d2-9c42-f2538297c74f\") " pod="openshift-infra/auto-csr-approver-29553074-w4sgs" Mar 10 23:14:00 crc kubenswrapper[4919]: I0310 23:14:00.361170 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p5ngb\" (UniqueName: \"kubernetes.io/projected/15dd46e4-0f3d-43d2-9c42-f2538297c74f-kube-api-access-p5ngb\") pod \"auto-csr-approver-29553074-w4sgs\" (UID: \"15dd46e4-0f3d-43d2-9c42-f2538297c74f\") " pod="openshift-infra/auto-csr-approver-29553074-w4sgs" Mar 10 23:14:00 crc kubenswrapper[4919]: I0310 23:14:00.397662 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5ngb\" (UniqueName: \"kubernetes.io/projected/15dd46e4-0f3d-43d2-9c42-f2538297c74f-kube-api-access-p5ngb\") pod \"auto-csr-approver-29553074-w4sgs\" (UID: \"15dd46e4-0f3d-43d2-9c42-f2538297c74f\") " 
pod="openshift-infra/auto-csr-approver-29553074-w4sgs" Mar 10 23:14:00 crc kubenswrapper[4919]: I0310 23:14:00.496773 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553074-w4sgs" Mar 10 23:14:00 crc kubenswrapper[4919]: I0310 23:14:00.904373 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553074-w4sgs"] Mar 10 23:14:01 crc kubenswrapper[4919]: I0310 23:14:01.744273 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553074-w4sgs" event={"ID":"15dd46e4-0f3d-43d2-9c42-f2538297c74f","Type":"ContainerStarted","Data":"3e7cda7f482e33c6f931baf80d6bc672e767d0bee7cdb9046d5cb8d709b63d65"} Mar 10 23:14:02 crc kubenswrapper[4919]: I0310 23:14:02.754819 4919 generic.go:334] "Generic (PLEG): container finished" podID="15dd46e4-0f3d-43d2-9c42-f2538297c74f" containerID="aa5de3339c47524494914d6e6d673b75c0f7cc4913f4875021340bef57256bb5" exitCode=0 Mar 10 23:14:02 crc kubenswrapper[4919]: I0310 23:14:02.754917 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553074-w4sgs" event={"ID":"15dd46e4-0f3d-43d2-9c42-f2538297c74f","Type":"ContainerDied","Data":"aa5de3339c47524494914d6e6d673b75c0f7cc4913f4875021340bef57256bb5"} Mar 10 23:14:04 crc kubenswrapper[4919]: I0310 23:14:04.117216 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553074-w4sgs" Mar 10 23:14:04 crc kubenswrapper[4919]: I0310 23:14:04.223380 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p5ngb\" (UniqueName: \"kubernetes.io/projected/15dd46e4-0f3d-43d2-9c42-f2538297c74f-kube-api-access-p5ngb\") pod \"15dd46e4-0f3d-43d2-9c42-f2538297c74f\" (UID: \"15dd46e4-0f3d-43d2-9c42-f2538297c74f\") " Mar 10 23:14:04 crc kubenswrapper[4919]: I0310 23:14:04.228883 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15dd46e4-0f3d-43d2-9c42-f2538297c74f-kube-api-access-p5ngb" (OuterVolumeSpecName: "kube-api-access-p5ngb") pod "15dd46e4-0f3d-43d2-9c42-f2538297c74f" (UID: "15dd46e4-0f3d-43d2-9c42-f2538297c74f"). InnerVolumeSpecName "kube-api-access-p5ngb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 23:14:04 crc kubenswrapper[4919]: I0310 23:14:04.325364 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p5ngb\" (UniqueName: \"kubernetes.io/projected/15dd46e4-0f3d-43d2-9c42-f2538297c74f-kube-api-access-p5ngb\") on node \"crc\" DevicePath \"\"" Mar 10 23:14:04 crc kubenswrapper[4919]: I0310 23:14:04.775840 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553074-w4sgs" event={"ID":"15dd46e4-0f3d-43d2-9c42-f2538297c74f","Type":"ContainerDied","Data":"3e7cda7f482e33c6f931baf80d6bc672e767d0bee7cdb9046d5cb8d709b63d65"} Mar 10 23:14:04 crc kubenswrapper[4919]: I0310 23:14:04.775900 4919 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3e7cda7f482e33c6f931baf80d6bc672e767d0bee7cdb9046d5cb8d709b63d65" Mar 10 23:14:04 crc kubenswrapper[4919]: I0310 23:14:04.775968 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553074-w4sgs" Mar 10 23:14:05 crc kubenswrapper[4919]: I0310 23:14:05.222077 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553068-kxqfj"] Mar 10 23:14:05 crc kubenswrapper[4919]: I0310 23:14:05.226784 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553068-kxqfj"] Mar 10 23:14:05 crc kubenswrapper[4919]: I0310 23:14:05.490083 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5104d1b8-7a21-4c03-9fe5-0c1603184314" path="/var/lib/kubelet/pods/5104d1b8-7a21-4c03-9fe5-0c1603184314/volumes" Mar 10 23:14:11 crc kubenswrapper[4919]: I0310 23:14:11.480928 4919 scope.go:117] "RemoveContainer" containerID="db23846019f61e1fa4301ac7d8406453060a07c19e1dddc6fc1714cf596701e8" Mar 10 23:14:11 crc kubenswrapper[4919]: E0310 23:14:11.481763 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" Mar 10 23:14:24 crc kubenswrapper[4919]: I0310 23:14:24.480811 4919 scope.go:117] "RemoveContainer" containerID="db23846019f61e1fa4301ac7d8406453060a07c19e1dddc6fc1714cf596701e8" Mar 10 23:14:24 crc kubenswrapper[4919]: E0310 23:14:24.481964 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" 
podUID="566678d1-f416-4116-ab20-b30dceb86cdc" Mar 10 23:14:25 crc kubenswrapper[4919]: I0310 23:14:25.873807 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-c44667757-vpzm4"] Mar 10 23:14:25 crc kubenswrapper[4919]: E0310 23:14:25.874375 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15dd46e4-0f3d-43d2-9c42-f2538297c74f" containerName="oc" Mar 10 23:14:25 crc kubenswrapper[4919]: I0310 23:14:25.874400 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="15dd46e4-0f3d-43d2-9c42-f2538297c74f" containerName="oc" Mar 10 23:14:25 crc kubenswrapper[4919]: I0310 23:14:25.874528 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="15dd46e4-0f3d-43d2-9c42-f2538297c74f" containerName="oc" Mar 10 23:14:25 crc kubenswrapper[4919]: I0310 23:14:25.875221 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-c44667757-vpzm4" Mar 10 23:14:25 crc kubenswrapper[4919]: I0310 23:14:25.876982 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Mar 10 23:14:25 crc kubenswrapper[4919]: I0310 23:14:25.877585 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-74n96" Mar 10 23:14:25 crc kubenswrapper[4919]: I0310 23:14:25.877798 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Mar 10 23:14:25 crc kubenswrapper[4919]: I0310 23:14:25.879338 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Mar 10 23:14:25 crc kubenswrapper[4919]: I0310 23:14:25.887311 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-55c76fd6b7-kw4sz"] Mar 10 23:14:25 crc kubenswrapper[4919]: I0310 23:14:25.888922 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55c76fd6b7-kw4sz" Mar 10 23:14:25 crc kubenswrapper[4919]: I0310 23:14:25.901700 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Mar 10 23:14:25 crc kubenswrapper[4919]: I0310 23:14:25.902475 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-c44667757-vpzm4"] Mar 10 23:14:25 crc kubenswrapper[4919]: I0310 23:14:25.911967 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55c76fd6b7-kw4sz"] Mar 10 23:14:25 crc kubenswrapper[4919]: I0310 23:14:25.952083 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-znqgd\" (UniqueName: \"kubernetes.io/projected/ed45484d-165d-4b13-8093-5a0ea05d4502-kube-api-access-znqgd\") pod \"dnsmasq-dns-c44667757-vpzm4\" (UID: \"ed45484d-165d-4b13-8093-5a0ea05d4502\") " pod="openstack/dnsmasq-dns-c44667757-vpzm4" Mar 10 23:14:25 crc kubenswrapper[4919]: I0310 23:14:25.952160 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/74bed246-fedc-4d7b-a81e-9217d69613fd-config\") pod \"dnsmasq-dns-55c76fd6b7-kw4sz\" (UID: \"74bed246-fedc-4d7b-a81e-9217d69613fd\") " pod="openstack/dnsmasq-dns-55c76fd6b7-kw4sz" Mar 10 23:14:25 crc kubenswrapper[4919]: I0310 23:14:25.952196 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44xxc\" (UniqueName: \"kubernetes.io/projected/74bed246-fedc-4d7b-a81e-9217d69613fd-kube-api-access-44xxc\") pod \"dnsmasq-dns-55c76fd6b7-kw4sz\" (UID: \"74bed246-fedc-4d7b-a81e-9217d69613fd\") " pod="openstack/dnsmasq-dns-55c76fd6b7-kw4sz" Mar 10 23:14:25 crc kubenswrapper[4919]: I0310 23:14:25.952221 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/74bed246-fedc-4d7b-a81e-9217d69613fd-dns-svc\") pod \"dnsmasq-dns-55c76fd6b7-kw4sz\" (UID: \"74bed246-fedc-4d7b-a81e-9217d69613fd\") " pod="openstack/dnsmasq-dns-55c76fd6b7-kw4sz" Mar 10 23:14:25 crc kubenswrapper[4919]: I0310 23:14:25.952264 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed45484d-165d-4b13-8093-5a0ea05d4502-config\") pod \"dnsmasq-dns-c44667757-vpzm4\" (UID: \"ed45484d-165d-4b13-8093-5a0ea05d4502\") " pod="openstack/dnsmasq-dns-c44667757-vpzm4" Mar 10 23:14:26 crc kubenswrapper[4919]: I0310 23:14:26.053664 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/74bed246-fedc-4d7b-a81e-9217d69613fd-config\") pod \"dnsmasq-dns-55c76fd6b7-kw4sz\" (UID: \"74bed246-fedc-4d7b-a81e-9217d69613fd\") " pod="openstack/dnsmasq-dns-55c76fd6b7-kw4sz" Mar 10 23:14:26 crc kubenswrapper[4919]: I0310 23:14:26.053720 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-44xxc\" (UniqueName: \"kubernetes.io/projected/74bed246-fedc-4d7b-a81e-9217d69613fd-kube-api-access-44xxc\") pod \"dnsmasq-dns-55c76fd6b7-kw4sz\" (UID: \"74bed246-fedc-4d7b-a81e-9217d69613fd\") " pod="openstack/dnsmasq-dns-55c76fd6b7-kw4sz" Mar 10 23:14:26 crc kubenswrapper[4919]: I0310 23:14:26.053752 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/74bed246-fedc-4d7b-a81e-9217d69613fd-dns-svc\") pod \"dnsmasq-dns-55c76fd6b7-kw4sz\" (UID: \"74bed246-fedc-4d7b-a81e-9217d69613fd\") " pod="openstack/dnsmasq-dns-55c76fd6b7-kw4sz" Mar 10 23:14:26 crc kubenswrapper[4919]: I0310 23:14:26.053805 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed45484d-165d-4b13-8093-5a0ea05d4502-config\") pod 
\"dnsmasq-dns-c44667757-vpzm4\" (UID: \"ed45484d-165d-4b13-8093-5a0ea05d4502\") " pod="openstack/dnsmasq-dns-c44667757-vpzm4" Mar 10 23:14:26 crc kubenswrapper[4919]: I0310 23:14:26.053894 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-znqgd\" (UniqueName: \"kubernetes.io/projected/ed45484d-165d-4b13-8093-5a0ea05d4502-kube-api-access-znqgd\") pod \"dnsmasq-dns-c44667757-vpzm4\" (UID: \"ed45484d-165d-4b13-8093-5a0ea05d4502\") " pod="openstack/dnsmasq-dns-c44667757-vpzm4" Mar 10 23:14:26 crc kubenswrapper[4919]: I0310 23:14:26.054514 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/74bed246-fedc-4d7b-a81e-9217d69613fd-config\") pod \"dnsmasq-dns-55c76fd6b7-kw4sz\" (UID: \"74bed246-fedc-4d7b-a81e-9217d69613fd\") " pod="openstack/dnsmasq-dns-55c76fd6b7-kw4sz" Mar 10 23:14:26 crc kubenswrapper[4919]: I0310 23:14:26.054694 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/74bed246-fedc-4d7b-a81e-9217d69613fd-dns-svc\") pod \"dnsmasq-dns-55c76fd6b7-kw4sz\" (UID: \"74bed246-fedc-4d7b-a81e-9217d69613fd\") " pod="openstack/dnsmasq-dns-55c76fd6b7-kw4sz" Mar 10 23:14:26 crc kubenswrapper[4919]: I0310 23:14:26.054765 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed45484d-165d-4b13-8093-5a0ea05d4502-config\") pod \"dnsmasq-dns-c44667757-vpzm4\" (UID: \"ed45484d-165d-4b13-8093-5a0ea05d4502\") " pod="openstack/dnsmasq-dns-c44667757-vpzm4" Mar 10 23:14:26 crc kubenswrapper[4919]: I0310 23:14:26.076365 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-44xxc\" (UniqueName: \"kubernetes.io/projected/74bed246-fedc-4d7b-a81e-9217d69613fd-kube-api-access-44xxc\") pod \"dnsmasq-dns-55c76fd6b7-kw4sz\" (UID: \"74bed246-fedc-4d7b-a81e-9217d69613fd\") " 
pod="openstack/dnsmasq-dns-55c76fd6b7-kw4sz" Mar 10 23:14:26 crc kubenswrapper[4919]: I0310 23:14:26.078986 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-znqgd\" (UniqueName: \"kubernetes.io/projected/ed45484d-165d-4b13-8093-5a0ea05d4502-kube-api-access-znqgd\") pod \"dnsmasq-dns-c44667757-vpzm4\" (UID: \"ed45484d-165d-4b13-8093-5a0ea05d4502\") " pod="openstack/dnsmasq-dns-c44667757-vpzm4" Mar 10 23:14:26 crc kubenswrapper[4919]: I0310 23:14:26.089828 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55c76fd6b7-kw4sz"] Mar 10 23:14:26 crc kubenswrapper[4919]: I0310 23:14:26.090727 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55c76fd6b7-kw4sz" Mar 10 23:14:26 crc kubenswrapper[4919]: I0310 23:14:26.118538 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5fb77f9685-gcrhn"] Mar 10 23:14:26 crc kubenswrapper[4919]: I0310 23:14:26.121407 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5fb77f9685-gcrhn" Mar 10 23:14:26 crc kubenswrapper[4919]: I0310 23:14:26.138207 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5fb77f9685-gcrhn"] Mar 10 23:14:26 crc kubenswrapper[4919]: I0310 23:14:26.154812 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l78x5\" (UniqueName: \"kubernetes.io/projected/221bd2a2-46fa-44a6-a49d-4d21f0dc396e-kube-api-access-l78x5\") pod \"dnsmasq-dns-5fb77f9685-gcrhn\" (UID: \"221bd2a2-46fa-44a6-a49d-4d21f0dc396e\") " pod="openstack/dnsmasq-dns-5fb77f9685-gcrhn" Mar 10 23:14:26 crc kubenswrapper[4919]: I0310 23:14:26.154881 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/221bd2a2-46fa-44a6-a49d-4d21f0dc396e-config\") pod \"dnsmasq-dns-5fb77f9685-gcrhn\" (UID: \"221bd2a2-46fa-44a6-a49d-4d21f0dc396e\") " pod="openstack/dnsmasq-dns-5fb77f9685-gcrhn" Mar 10 23:14:26 crc kubenswrapper[4919]: I0310 23:14:26.155163 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/221bd2a2-46fa-44a6-a49d-4d21f0dc396e-dns-svc\") pod \"dnsmasq-dns-5fb77f9685-gcrhn\" (UID: \"221bd2a2-46fa-44a6-a49d-4d21f0dc396e\") " pod="openstack/dnsmasq-dns-5fb77f9685-gcrhn" Mar 10 23:14:26 crc kubenswrapper[4919]: I0310 23:14:26.203161 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-c44667757-vpzm4" Mar 10 23:14:26 crc kubenswrapper[4919]: I0310 23:14:26.256070 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/221bd2a2-46fa-44a6-a49d-4d21f0dc396e-config\") pod \"dnsmasq-dns-5fb77f9685-gcrhn\" (UID: \"221bd2a2-46fa-44a6-a49d-4d21f0dc396e\") " pod="openstack/dnsmasq-dns-5fb77f9685-gcrhn" Mar 10 23:14:26 crc kubenswrapper[4919]: I0310 23:14:26.256171 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/221bd2a2-46fa-44a6-a49d-4d21f0dc396e-dns-svc\") pod \"dnsmasq-dns-5fb77f9685-gcrhn\" (UID: \"221bd2a2-46fa-44a6-a49d-4d21f0dc396e\") " pod="openstack/dnsmasq-dns-5fb77f9685-gcrhn" Mar 10 23:14:26 crc kubenswrapper[4919]: I0310 23:14:26.256201 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l78x5\" (UniqueName: \"kubernetes.io/projected/221bd2a2-46fa-44a6-a49d-4d21f0dc396e-kube-api-access-l78x5\") pod \"dnsmasq-dns-5fb77f9685-gcrhn\" (UID: \"221bd2a2-46fa-44a6-a49d-4d21f0dc396e\") " pod="openstack/dnsmasq-dns-5fb77f9685-gcrhn" Mar 10 23:14:26 crc kubenswrapper[4919]: I0310 23:14:26.257226 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/221bd2a2-46fa-44a6-a49d-4d21f0dc396e-config\") pod \"dnsmasq-dns-5fb77f9685-gcrhn\" (UID: \"221bd2a2-46fa-44a6-a49d-4d21f0dc396e\") " pod="openstack/dnsmasq-dns-5fb77f9685-gcrhn" Mar 10 23:14:26 crc kubenswrapper[4919]: I0310 23:14:26.257226 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/221bd2a2-46fa-44a6-a49d-4d21f0dc396e-dns-svc\") pod \"dnsmasq-dns-5fb77f9685-gcrhn\" (UID: \"221bd2a2-46fa-44a6-a49d-4d21f0dc396e\") " pod="openstack/dnsmasq-dns-5fb77f9685-gcrhn" Mar 10 23:14:26 crc kubenswrapper[4919]: I0310 
23:14:26.276340 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l78x5\" (UniqueName: \"kubernetes.io/projected/221bd2a2-46fa-44a6-a49d-4d21f0dc396e-kube-api-access-l78x5\") pod \"dnsmasq-dns-5fb77f9685-gcrhn\" (UID: \"221bd2a2-46fa-44a6-a49d-4d21f0dc396e\") " pod="openstack/dnsmasq-dns-5fb77f9685-gcrhn" Mar 10 23:14:26 crc kubenswrapper[4919]: I0310 23:14:26.431194 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-c44667757-vpzm4"] Mar 10 23:14:26 crc kubenswrapper[4919]: I0310 23:14:26.459243 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-ff89b6977-mvfqs"] Mar 10 23:14:26 crc kubenswrapper[4919]: I0310 23:14:26.461553 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-ff89b6977-mvfqs" Mar 10 23:14:26 crc kubenswrapper[4919]: I0310 23:14:26.471704 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5fb77f9685-gcrhn" Mar 10 23:14:26 crc kubenswrapper[4919]: I0310 23:14:26.480291 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-ff89b6977-mvfqs"] Mar 10 23:14:26 crc kubenswrapper[4919]: W0310 23:14:26.526976 4919 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod74bed246_fedc_4d7b_a81e_9217d69613fd.slice/crio-52ce147d6e9bd12f9196abf54fb267bca726937a71864e457514130ba97cd077 WatchSource:0}: Error finding container 52ce147d6e9bd12f9196abf54fb267bca726937a71864e457514130ba97cd077: Status 404 returned error can't find the container with id 52ce147d6e9bd12f9196abf54fb267bca726937a71864e457514130ba97cd077 Mar 10 23:14:26 crc kubenswrapper[4919]: I0310 23:14:26.527590 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55c76fd6b7-kw4sz"] Mar 10 23:14:26 crc kubenswrapper[4919]: I0310 23:14:26.560061 4919 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9080a849-0d02-42dd-9cbb-4b6f29055ad1-dns-svc\") pod \"dnsmasq-dns-ff89b6977-mvfqs\" (UID: \"9080a849-0d02-42dd-9cbb-4b6f29055ad1\") " pod="openstack/dnsmasq-dns-ff89b6977-mvfqs" Mar 10 23:14:26 crc kubenswrapper[4919]: I0310 23:14:26.560161 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8pgl4\" (UniqueName: \"kubernetes.io/projected/9080a849-0d02-42dd-9cbb-4b6f29055ad1-kube-api-access-8pgl4\") pod \"dnsmasq-dns-ff89b6977-mvfqs\" (UID: \"9080a849-0d02-42dd-9cbb-4b6f29055ad1\") " pod="openstack/dnsmasq-dns-ff89b6977-mvfqs" Mar 10 23:14:26 crc kubenswrapper[4919]: I0310 23:14:26.560191 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9080a849-0d02-42dd-9cbb-4b6f29055ad1-config\") pod \"dnsmasq-dns-ff89b6977-mvfqs\" (UID: \"9080a849-0d02-42dd-9cbb-4b6f29055ad1\") " pod="openstack/dnsmasq-dns-ff89b6977-mvfqs" Mar 10 23:14:26 crc kubenswrapper[4919]: I0310 23:14:26.661163 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8pgl4\" (UniqueName: \"kubernetes.io/projected/9080a849-0d02-42dd-9cbb-4b6f29055ad1-kube-api-access-8pgl4\") pod \"dnsmasq-dns-ff89b6977-mvfqs\" (UID: \"9080a849-0d02-42dd-9cbb-4b6f29055ad1\") " pod="openstack/dnsmasq-dns-ff89b6977-mvfqs" Mar 10 23:14:26 crc kubenswrapper[4919]: I0310 23:14:26.661229 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9080a849-0d02-42dd-9cbb-4b6f29055ad1-config\") pod \"dnsmasq-dns-ff89b6977-mvfqs\" (UID: \"9080a849-0d02-42dd-9cbb-4b6f29055ad1\") " pod="openstack/dnsmasq-dns-ff89b6977-mvfqs" Mar 10 23:14:26 crc kubenswrapper[4919]: I0310 23:14:26.661278 4919 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9080a849-0d02-42dd-9cbb-4b6f29055ad1-dns-svc\") pod \"dnsmasq-dns-ff89b6977-mvfqs\" (UID: \"9080a849-0d02-42dd-9cbb-4b6f29055ad1\") " pod="openstack/dnsmasq-dns-ff89b6977-mvfqs" Mar 10 23:14:26 crc kubenswrapper[4919]: I0310 23:14:26.661989 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9080a849-0d02-42dd-9cbb-4b6f29055ad1-dns-svc\") pod \"dnsmasq-dns-ff89b6977-mvfqs\" (UID: \"9080a849-0d02-42dd-9cbb-4b6f29055ad1\") " pod="openstack/dnsmasq-dns-ff89b6977-mvfqs" Mar 10 23:14:26 crc kubenswrapper[4919]: I0310 23:14:26.662716 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9080a849-0d02-42dd-9cbb-4b6f29055ad1-config\") pod \"dnsmasq-dns-ff89b6977-mvfqs\" (UID: \"9080a849-0d02-42dd-9cbb-4b6f29055ad1\") " pod="openstack/dnsmasq-dns-ff89b6977-mvfqs" Mar 10 23:14:26 crc kubenswrapper[4919]: I0310 23:14:26.689110 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8pgl4\" (UniqueName: \"kubernetes.io/projected/9080a849-0d02-42dd-9cbb-4b6f29055ad1-kube-api-access-8pgl4\") pod \"dnsmasq-dns-ff89b6977-mvfqs\" (UID: \"9080a849-0d02-42dd-9cbb-4b6f29055ad1\") " pod="openstack/dnsmasq-dns-ff89b6977-mvfqs" Mar 10 23:14:26 crc kubenswrapper[4919]: I0310 23:14:26.794785 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-ff89b6977-mvfqs" Mar 10 23:14:26 crc kubenswrapper[4919]: I0310 23:14:26.839637 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-c44667757-vpzm4"] Mar 10 23:14:26 crc kubenswrapper[4919]: I0310 23:14:26.956864 4919 generic.go:334] "Generic (PLEG): container finished" podID="74bed246-fedc-4d7b-a81e-9217d69613fd" containerID="15e8fc5a830419e87e0f45e5b0b27bff91ef51272397644486a349139afe6da3" exitCode=0 Mar 10 23:14:26 crc kubenswrapper[4919]: I0310 23:14:26.956938 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55c76fd6b7-kw4sz" event={"ID":"74bed246-fedc-4d7b-a81e-9217d69613fd","Type":"ContainerDied","Data":"15e8fc5a830419e87e0f45e5b0b27bff91ef51272397644486a349139afe6da3"} Mar 10 23:14:26 crc kubenswrapper[4919]: I0310 23:14:26.957256 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55c76fd6b7-kw4sz" event={"ID":"74bed246-fedc-4d7b-a81e-9217d69613fd","Type":"ContainerStarted","Data":"52ce147d6e9bd12f9196abf54fb267bca726937a71864e457514130ba97cd077"} Mar 10 23:14:26 crc kubenswrapper[4919]: I0310 23:14:26.962951 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c44667757-vpzm4" event={"ID":"ed45484d-165d-4b13-8093-5a0ea05d4502","Type":"ContainerStarted","Data":"9e154b38a54d9a6e99e657452ac54b0230a5113cec5bce08523fbd4b61a92861"} Mar 10 23:14:27 crc kubenswrapper[4919]: I0310 23:14:27.185962 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5fb77f9685-gcrhn"] Mar 10 23:14:27 crc kubenswrapper[4919]: I0310 23:14:27.245353 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55c76fd6b7-kw4sz" Mar 10 23:14:27 crc kubenswrapper[4919]: I0310 23:14:27.263157 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 10 23:14:27 crc kubenswrapper[4919]: E0310 23:14:27.263991 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74bed246-fedc-4d7b-a81e-9217d69613fd" containerName="init" Mar 10 23:14:27 crc kubenswrapper[4919]: I0310 23:14:27.264012 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="74bed246-fedc-4d7b-a81e-9217d69613fd" containerName="init" Mar 10 23:14:27 crc kubenswrapper[4919]: I0310 23:14:27.264153 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="74bed246-fedc-4d7b-a81e-9217d69613fd" containerName="init" Mar 10 23:14:27 crc kubenswrapper[4919]: I0310 23:14:27.264976 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 10 23:14:27 crc kubenswrapper[4919]: I0310 23:14:27.269972 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Mar 10 23:14:27 crc kubenswrapper[4919]: I0310 23:14:27.270216 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Mar 10 23:14:27 crc kubenswrapper[4919]: I0310 23:14:27.270342 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Mar 10 23:14:27 crc kubenswrapper[4919]: I0310 23:14:27.270465 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Mar 10 23:14:27 crc kubenswrapper[4919]: I0310 23:14:27.270571 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-2px2p" Mar 10 23:14:27 crc kubenswrapper[4919]: I0310 23:14:27.270653 4919 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openstack"/"rabbitmq-cell1-plugins-conf" Mar 10 23:14:27 crc kubenswrapper[4919]: I0310 23:14:27.270803 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Mar 10 23:14:27 crc kubenswrapper[4919]: I0310 23:14:27.278781 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/74bed246-fedc-4d7b-a81e-9217d69613fd-dns-svc\") pod \"74bed246-fedc-4d7b-a81e-9217d69613fd\" (UID: \"74bed246-fedc-4d7b-a81e-9217d69613fd\") " Mar 10 23:14:27 crc kubenswrapper[4919]: I0310 23:14:27.278861 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-44xxc\" (UniqueName: \"kubernetes.io/projected/74bed246-fedc-4d7b-a81e-9217d69613fd-kube-api-access-44xxc\") pod \"74bed246-fedc-4d7b-a81e-9217d69613fd\" (UID: \"74bed246-fedc-4d7b-a81e-9217d69613fd\") " Mar 10 23:14:27 crc kubenswrapper[4919]: I0310 23:14:27.278970 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/74bed246-fedc-4d7b-a81e-9217d69613fd-config\") pod \"74bed246-fedc-4d7b-a81e-9217d69613fd\" (UID: \"74bed246-fedc-4d7b-a81e-9217d69613fd\") " Mar 10 23:14:27 crc kubenswrapper[4919]: I0310 23:14:27.292702 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74bed246-fedc-4d7b-a81e-9217d69613fd-kube-api-access-44xxc" (OuterVolumeSpecName: "kube-api-access-44xxc") pod "74bed246-fedc-4d7b-a81e-9217d69613fd" (UID: "74bed246-fedc-4d7b-a81e-9217d69613fd"). InnerVolumeSpecName "kube-api-access-44xxc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 23:14:27 crc kubenswrapper[4919]: I0310 23:14:27.298435 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 10 23:14:27 crc kubenswrapper[4919]: I0310 23:14:27.305992 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74bed246-fedc-4d7b-a81e-9217d69613fd-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "74bed246-fedc-4d7b-a81e-9217d69613fd" (UID: "74bed246-fedc-4d7b-a81e-9217d69613fd"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 23:14:27 crc kubenswrapper[4919]: I0310 23:14:27.307717 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74bed246-fedc-4d7b-a81e-9217d69613fd-config" (OuterVolumeSpecName: "config") pod "74bed246-fedc-4d7b-a81e-9217d69613fd" (UID: "74bed246-fedc-4d7b-a81e-9217d69613fd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 23:14:27 crc kubenswrapper[4919]: I0310 23:14:27.316789 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-ff89b6977-mvfqs"] Mar 10 23:14:27 crc kubenswrapper[4919]: I0310 23:14:27.380482 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/08f1d385-c9ed-4616-b201-0234049fa538-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"08f1d385-c9ed-4616-b201-0234049fa538\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 23:14:27 crc kubenswrapper[4919]: I0310 23:14:27.380876 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/08f1d385-c9ed-4616-b201-0234049fa538-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"08f1d385-c9ed-4616-b201-0234049fa538\") " 
pod="openstack/rabbitmq-cell1-server-0" Mar 10 23:14:27 crc kubenswrapper[4919]: I0310 23:14:27.381017 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lsb7s\" (UniqueName: \"kubernetes.io/projected/08f1d385-c9ed-4616-b201-0234049fa538-kube-api-access-lsb7s\") pod \"rabbitmq-cell1-server-0\" (UID: \"08f1d385-c9ed-4616-b201-0234049fa538\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 23:14:27 crc kubenswrapper[4919]: I0310 23:14:27.381052 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-e9f5ddd6-654a-48f8-a8b1-ea03e8e2c7e2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e9f5ddd6-654a-48f8-a8b1-ea03e8e2c7e2\") pod \"rabbitmq-cell1-server-0\" (UID: \"08f1d385-c9ed-4616-b201-0234049fa538\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 23:14:27 crc kubenswrapper[4919]: I0310 23:14:27.381080 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/08f1d385-c9ed-4616-b201-0234049fa538-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"08f1d385-c9ed-4616-b201-0234049fa538\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 23:14:27 crc kubenswrapper[4919]: I0310 23:14:27.381106 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/08f1d385-c9ed-4616-b201-0234049fa538-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"08f1d385-c9ed-4616-b201-0234049fa538\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 23:14:27 crc kubenswrapper[4919]: I0310 23:14:27.381128 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/08f1d385-c9ed-4616-b201-0234049fa538-rabbitmq-erlang-cookie\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"08f1d385-c9ed-4616-b201-0234049fa538\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 23:14:27 crc kubenswrapper[4919]: I0310 23:14:27.381157 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/08f1d385-c9ed-4616-b201-0234049fa538-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"08f1d385-c9ed-4616-b201-0234049fa538\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 23:14:27 crc kubenswrapper[4919]: I0310 23:14:27.381214 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/08f1d385-c9ed-4616-b201-0234049fa538-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"08f1d385-c9ed-4616-b201-0234049fa538\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 23:14:27 crc kubenswrapper[4919]: I0310 23:14:27.381260 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/08f1d385-c9ed-4616-b201-0234049fa538-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"08f1d385-c9ed-4616-b201-0234049fa538\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 23:14:27 crc kubenswrapper[4919]: I0310 23:14:27.381294 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/08f1d385-c9ed-4616-b201-0234049fa538-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"08f1d385-c9ed-4616-b201-0234049fa538\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 23:14:27 crc kubenswrapper[4919]: I0310 23:14:27.385890 4919 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/74bed246-fedc-4d7b-a81e-9217d69613fd-config\") on node \"crc\" DevicePath \"\"" Mar 10 23:14:27 crc 
kubenswrapper[4919]: I0310 23:14:27.385933 4919 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/74bed246-fedc-4d7b-a81e-9217d69613fd-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 10 23:14:27 crc kubenswrapper[4919]: I0310 23:14:27.385947 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-44xxc\" (UniqueName: \"kubernetes.io/projected/74bed246-fedc-4d7b-a81e-9217d69613fd-kube-api-access-44xxc\") on node \"crc\" DevicePath \"\"" Mar 10 23:14:27 crc kubenswrapper[4919]: I0310 23:14:27.486696 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/08f1d385-c9ed-4616-b201-0234049fa538-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"08f1d385-c9ed-4616-b201-0234049fa538\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 23:14:27 crc kubenswrapper[4919]: I0310 23:14:27.486749 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/08f1d385-c9ed-4616-b201-0234049fa538-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"08f1d385-c9ed-4616-b201-0234049fa538\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 23:14:27 crc kubenswrapper[4919]: I0310 23:14:27.486772 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/08f1d385-c9ed-4616-b201-0234049fa538-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"08f1d385-c9ed-4616-b201-0234049fa538\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 23:14:27 crc kubenswrapper[4919]: I0310 23:14:27.486791 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/08f1d385-c9ed-4616-b201-0234049fa538-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"08f1d385-c9ed-4616-b201-0234049fa538\") " 
pod="openstack/rabbitmq-cell1-server-0" Mar 10 23:14:27 crc kubenswrapper[4919]: I0310 23:14:27.486846 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lsb7s\" (UniqueName: \"kubernetes.io/projected/08f1d385-c9ed-4616-b201-0234049fa538-kube-api-access-lsb7s\") pod \"rabbitmq-cell1-server-0\" (UID: \"08f1d385-c9ed-4616-b201-0234049fa538\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 23:14:27 crc kubenswrapper[4919]: I0310 23:14:27.486872 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-e9f5ddd6-654a-48f8-a8b1-ea03e8e2c7e2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e9f5ddd6-654a-48f8-a8b1-ea03e8e2c7e2\") pod \"rabbitmq-cell1-server-0\" (UID: \"08f1d385-c9ed-4616-b201-0234049fa538\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 23:14:27 crc kubenswrapper[4919]: I0310 23:14:27.486894 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/08f1d385-c9ed-4616-b201-0234049fa538-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"08f1d385-c9ed-4616-b201-0234049fa538\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 23:14:27 crc kubenswrapper[4919]: I0310 23:14:27.486909 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/08f1d385-c9ed-4616-b201-0234049fa538-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"08f1d385-c9ed-4616-b201-0234049fa538\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 23:14:27 crc kubenswrapper[4919]: I0310 23:14:27.486932 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/08f1d385-c9ed-4616-b201-0234049fa538-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"08f1d385-c9ed-4616-b201-0234049fa538\") " pod="openstack/rabbitmq-cell1-server-0" 
Mar 10 23:14:27 crc kubenswrapper[4919]: I0310 23:14:27.486955 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/08f1d385-c9ed-4616-b201-0234049fa538-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"08f1d385-c9ed-4616-b201-0234049fa538\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 23:14:27 crc kubenswrapper[4919]: I0310 23:14:27.486996 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/08f1d385-c9ed-4616-b201-0234049fa538-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"08f1d385-c9ed-4616-b201-0234049fa538\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 23:14:27 crc kubenswrapper[4919]: I0310 23:14:27.488284 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/08f1d385-c9ed-4616-b201-0234049fa538-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"08f1d385-c9ed-4616-b201-0234049fa538\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 23:14:27 crc kubenswrapper[4919]: I0310 23:14:27.488612 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/08f1d385-c9ed-4616-b201-0234049fa538-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"08f1d385-c9ed-4616-b201-0234049fa538\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 23:14:27 crc kubenswrapper[4919]: I0310 23:14:27.489027 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/08f1d385-c9ed-4616-b201-0234049fa538-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"08f1d385-c9ed-4616-b201-0234049fa538\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 23:14:27 crc kubenswrapper[4919]: I0310 23:14:27.489285 4919 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/08f1d385-c9ed-4616-b201-0234049fa538-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"08f1d385-c9ed-4616-b201-0234049fa538\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 23:14:27 crc kubenswrapper[4919]: I0310 23:14:27.489817 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/08f1d385-c9ed-4616-b201-0234049fa538-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"08f1d385-c9ed-4616-b201-0234049fa538\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 23:14:27 crc kubenswrapper[4919]: I0310 23:14:27.490994 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/08f1d385-c9ed-4616-b201-0234049fa538-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"08f1d385-c9ed-4616-b201-0234049fa538\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 23:14:27 crc kubenswrapper[4919]: I0310 23:14:27.491587 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/08f1d385-c9ed-4616-b201-0234049fa538-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"08f1d385-c9ed-4616-b201-0234049fa538\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 23:14:27 crc kubenswrapper[4919]: I0310 23:14:27.492504 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/08f1d385-c9ed-4616-b201-0234049fa538-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"08f1d385-c9ed-4616-b201-0234049fa538\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 23:14:27 crc kubenswrapper[4919]: I0310 23:14:27.493411 4919 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 10 23:14:27 crc kubenswrapper[4919]: I0310 23:14:27.493451 4919 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-e9f5ddd6-654a-48f8-a8b1-ea03e8e2c7e2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e9f5ddd6-654a-48f8-a8b1-ea03e8e2c7e2\") pod \"rabbitmq-cell1-server-0\" (UID: \"08f1d385-c9ed-4616-b201-0234049fa538\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/3abe716c7a40e2495d9b84996c37ea2d6664c72a27ff15e380d09b03097a86e7/globalmount\"" pod="openstack/rabbitmq-cell1-server-0" Mar 10 23:14:27 crc kubenswrapper[4919]: I0310 23:14:27.494835 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/08f1d385-c9ed-4616-b201-0234049fa538-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"08f1d385-c9ed-4616-b201-0234049fa538\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 23:14:27 crc kubenswrapper[4919]: I0310 23:14:27.512057 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lsb7s\" (UniqueName: \"kubernetes.io/projected/08f1d385-c9ed-4616-b201-0234049fa538-kube-api-access-lsb7s\") pod \"rabbitmq-cell1-server-0\" (UID: \"08f1d385-c9ed-4616-b201-0234049fa538\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 23:14:27 crc kubenswrapper[4919]: I0310 23:14:27.539482 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-e9f5ddd6-654a-48f8-a8b1-ea03e8e2c7e2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e9f5ddd6-654a-48f8-a8b1-ea03e8e2c7e2\") pod \"rabbitmq-cell1-server-0\" (UID: \"08f1d385-c9ed-4616-b201-0234049fa538\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 23:14:27 crc kubenswrapper[4919]: I0310 23:14:27.601570 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Mar 10 23:14:27 crc kubenswrapper[4919]: I0310 23:14:27.602776 4919 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 10 23:14:27 crc kubenswrapper[4919]: I0310 23:14:27.604744 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Mar 10 23:14:27 crc kubenswrapper[4919]: I0310 23:14:27.605254 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Mar 10 23:14:27 crc kubenswrapper[4919]: I0310 23:14:27.605429 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Mar 10 23:14:27 crc kubenswrapper[4919]: I0310 23:14:27.605566 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Mar 10 23:14:27 crc kubenswrapper[4919]: I0310 23:14:27.605676 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Mar 10 23:14:27 crc kubenswrapper[4919]: I0310 23:14:27.605784 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-f9tl2" Mar 10 23:14:27 crc kubenswrapper[4919]: I0310 23:14:27.607667 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Mar 10 23:14:27 crc kubenswrapper[4919]: I0310 23:14:27.636355 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 10 23:14:27 crc kubenswrapper[4919]: I0310 23:14:27.659857 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 10 23:14:27 crc kubenswrapper[4919]: I0310 23:14:27.689732 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d5b3447d-7136-4c2b-bd66-18e26e7a157e-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"d5b3447d-7136-4c2b-bd66-18e26e7a157e\") " pod="openstack/rabbitmq-server-0" Mar 10 23:14:27 crc kubenswrapper[4919]: I0310 23:14:27.689783 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d5b3447d-7136-4c2b-bd66-18e26e7a157e-config-data\") pod \"rabbitmq-server-0\" (UID: \"d5b3447d-7136-4c2b-bd66-18e26e7a157e\") " pod="openstack/rabbitmq-server-0" Mar 10 23:14:27 crc kubenswrapper[4919]: I0310 23:14:27.689803 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d5b3447d-7136-4c2b-bd66-18e26e7a157e-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"d5b3447d-7136-4c2b-bd66-18e26e7a157e\") " pod="openstack/rabbitmq-server-0" Mar 10 23:14:27 crc kubenswrapper[4919]: I0310 23:14:27.689835 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d5b3447d-7136-4c2b-bd66-18e26e7a157e-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"d5b3447d-7136-4c2b-bd66-18e26e7a157e\") " pod="openstack/rabbitmq-server-0" Mar 10 23:14:27 crc kubenswrapper[4919]: I0310 23:14:27.689874 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d5b3447d-7136-4c2b-bd66-18e26e7a157e-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"d5b3447d-7136-4c2b-bd66-18e26e7a157e\") " pod="openstack/rabbitmq-server-0" 
Mar 10 23:14:27 crc kubenswrapper[4919]: I0310 23:14:27.689904 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d5b3447d-7136-4c2b-bd66-18e26e7a157e-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"d5b3447d-7136-4c2b-bd66-18e26e7a157e\") " pod="openstack/rabbitmq-server-0" Mar 10 23:14:27 crc kubenswrapper[4919]: I0310 23:14:27.689922 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d5b3447d-7136-4c2b-bd66-18e26e7a157e-pod-info\") pod \"rabbitmq-server-0\" (UID: \"d5b3447d-7136-4c2b-bd66-18e26e7a157e\") " pod="openstack/rabbitmq-server-0" Mar 10 23:14:27 crc kubenswrapper[4919]: I0310 23:14:27.689944 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-5858d03f-30af-4b3b-8ca1-f1d04993b325\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5858d03f-30af-4b3b-8ca1-f1d04993b325\") pod \"rabbitmq-server-0\" (UID: \"d5b3447d-7136-4c2b-bd66-18e26e7a157e\") " pod="openstack/rabbitmq-server-0" Mar 10 23:14:27 crc kubenswrapper[4919]: I0310 23:14:27.689971 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d5b3447d-7136-4c2b-bd66-18e26e7a157e-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"d5b3447d-7136-4c2b-bd66-18e26e7a157e\") " pod="openstack/rabbitmq-server-0" Mar 10 23:14:27 crc kubenswrapper[4919]: I0310 23:14:27.689998 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f7t24\" (UniqueName: \"kubernetes.io/projected/d5b3447d-7136-4c2b-bd66-18e26e7a157e-kube-api-access-f7t24\") pod \"rabbitmq-server-0\" (UID: \"d5b3447d-7136-4c2b-bd66-18e26e7a157e\") " pod="openstack/rabbitmq-server-0" Mar 10 
23:14:27 crc kubenswrapper[4919]: I0310 23:14:27.690015 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d5b3447d-7136-4c2b-bd66-18e26e7a157e-server-conf\") pod \"rabbitmq-server-0\" (UID: \"d5b3447d-7136-4c2b-bd66-18e26e7a157e\") " pod="openstack/rabbitmq-server-0" Mar 10 23:14:27 crc kubenswrapper[4919]: I0310 23:14:27.791905 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d5b3447d-7136-4c2b-bd66-18e26e7a157e-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"d5b3447d-7136-4c2b-bd66-18e26e7a157e\") " pod="openstack/rabbitmq-server-0" Mar 10 23:14:27 crc kubenswrapper[4919]: I0310 23:14:27.791953 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f7t24\" (UniqueName: \"kubernetes.io/projected/d5b3447d-7136-4c2b-bd66-18e26e7a157e-kube-api-access-f7t24\") pod \"rabbitmq-server-0\" (UID: \"d5b3447d-7136-4c2b-bd66-18e26e7a157e\") " pod="openstack/rabbitmq-server-0" Mar 10 23:14:27 crc kubenswrapper[4919]: I0310 23:14:27.791976 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d5b3447d-7136-4c2b-bd66-18e26e7a157e-server-conf\") pod \"rabbitmq-server-0\" (UID: \"d5b3447d-7136-4c2b-bd66-18e26e7a157e\") " pod="openstack/rabbitmq-server-0" Mar 10 23:14:27 crc kubenswrapper[4919]: I0310 23:14:27.791994 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d5b3447d-7136-4c2b-bd66-18e26e7a157e-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"d5b3447d-7136-4c2b-bd66-18e26e7a157e\") " pod="openstack/rabbitmq-server-0" Mar 10 23:14:27 crc kubenswrapper[4919]: I0310 23:14:27.792025 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d5b3447d-7136-4c2b-bd66-18e26e7a157e-config-data\") pod \"rabbitmq-server-0\" (UID: \"d5b3447d-7136-4c2b-bd66-18e26e7a157e\") " pod="openstack/rabbitmq-server-0" Mar 10 23:14:27 crc kubenswrapper[4919]: I0310 23:14:27.792045 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d5b3447d-7136-4c2b-bd66-18e26e7a157e-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"d5b3447d-7136-4c2b-bd66-18e26e7a157e\") " pod="openstack/rabbitmq-server-0" Mar 10 23:14:27 crc kubenswrapper[4919]: I0310 23:14:27.792064 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d5b3447d-7136-4c2b-bd66-18e26e7a157e-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"d5b3447d-7136-4c2b-bd66-18e26e7a157e\") " pod="openstack/rabbitmq-server-0" Mar 10 23:14:27 crc kubenswrapper[4919]: I0310 23:14:27.792097 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d5b3447d-7136-4c2b-bd66-18e26e7a157e-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"d5b3447d-7136-4c2b-bd66-18e26e7a157e\") " pod="openstack/rabbitmq-server-0" Mar 10 23:14:27 crc kubenswrapper[4919]: I0310 23:14:27.792128 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d5b3447d-7136-4c2b-bd66-18e26e7a157e-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"d5b3447d-7136-4c2b-bd66-18e26e7a157e\") " pod="openstack/rabbitmq-server-0" Mar 10 23:14:27 crc kubenswrapper[4919]: I0310 23:14:27.792146 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d5b3447d-7136-4c2b-bd66-18e26e7a157e-pod-info\") pod \"rabbitmq-server-0\" (UID: 
\"d5b3447d-7136-4c2b-bd66-18e26e7a157e\") " pod="openstack/rabbitmq-server-0" Mar 10 23:14:27 crc kubenswrapper[4919]: I0310 23:14:27.792165 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-5858d03f-30af-4b3b-8ca1-f1d04993b325\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5858d03f-30af-4b3b-8ca1-f1d04993b325\") pod \"rabbitmq-server-0\" (UID: \"d5b3447d-7136-4c2b-bd66-18e26e7a157e\") " pod="openstack/rabbitmq-server-0" Mar 10 23:14:27 crc kubenswrapper[4919]: I0310 23:14:27.793122 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d5b3447d-7136-4c2b-bd66-18e26e7a157e-config-data\") pod \"rabbitmq-server-0\" (UID: \"d5b3447d-7136-4c2b-bd66-18e26e7a157e\") " pod="openstack/rabbitmq-server-0" Mar 10 23:14:27 crc kubenswrapper[4919]: I0310 23:14:27.793614 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d5b3447d-7136-4c2b-bd66-18e26e7a157e-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"d5b3447d-7136-4c2b-bd66-18e26e7a157e\") " pod="openstack/rabbitmq-server-0" Mar 10 23:14:27 crc kubenswrapper[4919]: I0310 23:14:27.794663 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d5b3447d-7136-4c2b-bd66-18e26e7a157e-server-conf\") pod \"rabbitmq-server-0\" (UID: \"d5b3447d-7136-4c2b-bd66-18e26e7a157e\") " pod="openstack/rabbitmq-server-0" Mar 10 23:14:27 crc kubenswrapper[4919]: I0310 23:14:27.795475 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d5b3447d-7136-4c2b-bd66-18e26e7a157e-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"d5b3447d-7136-4c2b-bd66-18e26e7a157e\") " pod="openstack/rabbitmq-server-0" Mar 10 23:14:27 crc kubenswrapper[4919]: I0310 23:14:27.795684 4919 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d5b3447d-7136-4c2b-bd66-18e26e7a157e-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"d5b3447d-7136-4c2b-bd66-18e26e7a157e\") " pod="openstack/rabbitmq-server-0" Mar 10 23:14:27 crc kubenswrapper[4919]: I0310 23:14:27.797058 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d5b3447d-7136-4c2b-bd66-18e26e7a157e-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"d5b3447d-7136-4c2b-bd66-18e26e7a157e\") " pod="openstack/rabbitmq-server-0" Mar 10 23:14:27 crc kubenswrapper[4919]: I0310 23:14:27.797185 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d5b3447d-7136-4c2b-bd66-18e26e7a157e-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"d5b3447d-7136-4c2b-bd66-18e26e7a157e\") " pod="openstack/rabbitmq-server-0" Mar 10 23:14:27 crc kubenswrapper[4919]: I0310 23:14:27.798957 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d5b3447d-7136-4c2b-bd66-18e26e7a157e-pod-info\") pod \"rabbitmq-server-0\" (UID: \"d5b3447d-7136-4c2b-bd66-18e26e7a157e\") " pod="openstack/rabbitmq-server-0" Mar 10 23:14:27 crc kubenswrapper[4919]: I0310 23:14:27.799272 4919 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 10 23:14:27 crc kubenswrapper[4919]: I0310 23:14:27.799353 4919 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-5858d03f-30af-4b3b-8ca1-f1d04993b325\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5858d03f-30af-4b3b-8ca1-f1d04993b325\") pod \"rabbitmq-server-0\" (UID: \"d5b3447d-7136-4c2b-bd66-18e26e7a157e\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/931f8fb7a5c01cf1cb7d2760c8569b187e78d733dd4740fa54b0a4a2b50fa59c/globalmount\"" pod="openstack/rabbitmq-server-0" Mar 10 23:14:27 crc kubenswrapper[4919]: I0310 23:14:27.803017 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d5b3447d-7136-4c2b-bd66-18e26e7a157e-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"d5b3447d-7136-4c2b-bd66-18e26e7a157e\") " pod="openstack/rabbitmq-server-0" Mar 10 23:14:27 crc kubenswrapper[4919]: I0310 23:14:27.813496 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f7t24\" (UniqueName: \"kubernetes.io/projected/d5b3447d-7136-4c2b-bd66-18e26e7a157e-kube-api-access-f7t24\") pod \"rabbitmq-server-0\" (UID: \"d5b3447d-7136-4c2b-bd66-18e26e7a157e\") " pod="openstack/rabbitmq-server-0" Mar 10 23:14:27 crc kubenswrapper[4919]: I0310 23:14:27.845329 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-5858d03f-30af-4b3b-8ca1-f1d04993b325\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5858d03f-30af-4b3b-8ca1-f1d04993b325\") pod \"rabbitmq-server-0\" (UID: \"d5b3447d-7136-4c2b-bd66-18e26e7a157e\") " pod="openstack/rabbitmq-server-0" Mar 10 23:14:27 crc kubenswrapper[4919]: I0310 23:14:27.919826 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 10 23:14:27 crc kubenswrapper[4919]: I0310 23:14:27.971538 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55c76fd6b7-kw4sz" event={"ID":"74bed246-fedc-4d7b-a81e-9217d69613fd","Type":"ContainerDied","Data":"52ce147d6e9bd12f9196abf54fb267bca726937a71864e457514130ba97cd077"} Mar 10 23:14:27 crc kubenswrapper[4919]: I0310 23:14:27.971571 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55c76fd6b7-kw4sz" Mar 10 23:14:27 crc kubenswrapper[4919]: I0310 23:14:27.971597 4919 scope.go:117] "RemoveContainer" containerID="15e8fc5a830419e87e0f45e5b0b27bff91ef51272397644486a349139afe6da3" Mar 10 23:14:27 crc kubenswrapper[4919]: I0310 23:14:27.975074 4919 generic.go:334] "Generic (PLEG): container finished" podID="221bd2a2-46fa-44a6-a49d-4d21f0dc396e" containerID="d18f9a95648798331fdfa0323a47aa45d422f4d48a27ddbf4733664c79edd1c0" exitCode=0 Mar 10 23:14:27 crc kubenswrapper[4919]: I0310 23:14:27.975169 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fb77f9685-gcrhn" event={"ID":"221bd2a2-46fa-44a6-a49d-4d21f0dc396e","Type":"ContainerDied","Data":"d18f9a95648798331fdfa0323a47aa45d422f4d48a27ddbf4733664c79edd1c0"} Mar 10 23:14:27 crc kubenswrapper[4919]: I0310 23:14:27.975204 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fb77f9685-gcrhn" event={"ID":"221bd2a2-46fa-44a6-a49d-4d21f0dc396e","Type":"ContainerStarted","Data":"0d654d33522af7b60ed768d2d2e709eca4a41ad463987801c17877f4339e7f15"} Mar 10 23:14:27 crc kubenswrapper[4919]: I0310 23:14:27.976836 4919 generic.go:334] "Generic (PLEG): container finished" podID="ed45484d-165d-4b13-8093-5a0ea05d4502" containerID="c09fd9d1033e498da4c7dfdc26804ab3c981aa66c5e00decacfab8d62cb5f374" exitCode=0 Mar 10 23:14:27 crc kubenswrapper[4919]: I0310 23:14:27.976915 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-c44667757-vpzm4" event={"ID":"ed45484d-165d-4b13-8093-5a0ea05d4502","Type":"ContainerDied","Data":"c09fd9d1033e498da4c7dfdc26804ab3c981aa66c5e00decacfab8d62cb5f374"} Mar 10 23:14:27 crc kubenswrapper[4919]: I0310 23:14:27.984061 4919 generic.go:334] "Generic (PLEG): container finished" podID="9080a849-0d02-42dd-9cbb-4b6f29055ad1" containerID="74e3528ae26a8e0d9e22f98e0ed4fdfd2a6575916ef3986e352a8d4fc56ab434" exitCode=0 Mar 10 23:14:27 crc kubenswrapper[4919]: I0310 23:14:27.984103 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-ff89b6977-mvfqs" event={"ID":"9080a849-0d02-42dd-9cbb-4b6f29055ad1","Type":"ContainerDied","Data":"74e3528ae26a8e0d9e22f98e0ed4fdfd2a6575916ef3986e352a8d4fc56ab434"} Mar 10 23:14:27 crc kubenswrapper[4919]: I0310 23:14:27.984132 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-ff89b6977-mvfqs" event={"ID":"9080a849-0d02-42dd-9cbb-4b6f29055ad1","Type":"ContainerStarted","Data":"d09f4b7fe63949c0b307805dbf20814e4f86b625fe43a57cdc6a98e520699048"} Mar 10 23:14:28 crc kubenswrapper[4919]: I0310 23:14:28.025142 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55c76fd6b7-kw4sz"] Mar 10 23:14:28 crc kubenswrapper[4919]: I0310 23:14:28.035924 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-55c76fd6b7-kw4sz"] Mar 10 23:14:28 crc kubenswrapper[4919]: I0310 23:14:28.173972 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 10 23:14:28 crc kubenswrapper[4919]: E0310 23:14:28.311048 4919 log.go:32] "CreateContainer in sandbox from runtime service failed" err=< Mar 10 23:14:28 crc kubenswrapper[4919]: rpc error: code = Unknown desc = container create failed: mount `/var/lib/kubelet/pods/221bd2a2-46fa-44a6-a49d-4d21f0dc396e/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Mar 10 23:14:28 crc 
kubenswrapper[4919]: > podSandboxID="0d654d33522af7b60ed768d2d2e709eca4a41ad463987801c17877f4339e7f15" Mar 10 23:14:28 crc kubenswrapper[4919]: E0310 23:14:28.311373 4919 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 10 23:14:28 crc kubenswrapper[4919]: container &Container{Name:dnsmasq-dns,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:944833b50342d462c10637342bc85197a8cf099a3650df12e23854dde99af514,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nb6hc5h68h68h594h659hdbh679h65ch5f6hdch6h5b9h8fh55hfhf8h57fhc7h56ch687h669h559h678h5dhc7hf7h697h5d6h9ch669h54fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-l78x5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 
},Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-5fb77f9685-gcrhn_openstack(221bd2a2-46fa-44a6-a49d-4d21f0dc396e): CreateContainerError: container create failed: mount `/var/lib/kubelet/pods/221bd2a2-46fa-44a6-a49d-4d21f0dc396e/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Mar 10 23:14:28 crc kubenswrapper[4919]: > logger="UnhandledError" Mar 10 23:14:28 crc kubenswrapper[4919]: E0310 23:14:28.313040 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dnsmasq-dns\" with CreateContainerError: \"container create failed: mount `/var/lib/kubelet/pods/221bd2a2-46fa-44a6-a49d-4d21f0dc396e/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory\\n\"" pod="openstack/dnsmasq-dns-5fb77f9685-gcrhn" podUID="221bd2a2-46fa-44a6-a49d-4d21f0dc396e" Mar 10 23:14:28 crc kubenswrapper[4919]: I0310 23:14:28.385957 4919 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-c44667757-vpzm4" Mar 10 23:14:28 crc kubenswrapper[4919]: I0310 23:14:28.403340 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 10 23:14:28 crc kubenswrapper[4919]: W0310 23:14:28.406637 4919 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd5b3447d_7136_4c2b_bd66_18e26e7a157e.slice/crio-b9b05ed411750745ac7102a308f28b8f514c0caa506d31129b795adfdfdcf1f9 WatchSource:0}: Error finding container b9b05ed411750745ac7102a308f28b8f514c0caa506d31129b795adfdfdcf1f9: Status 404 returned error can't find the container with id b9b05ed411750745ac7102a308f28b8f514c0caa506d31129b795adfdfdcf1f9 Mar 10 23:14:28 crc kubenswrapper[4919]: I0310 23:14:28.511340 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-znqgd\" (UniqueName: \"kubernetes.io/projected/ed45484d-165d-4b13-8093-5a0ea05d4502-kube-api-access-znqgd\") pod \"ed45484d-165d-4b13-8093-5a0ea05d4502\" (UID: \"ed45484d-165d-4b13-8093-5a0ea05d4502\") " Mar 10 23:14:28 crc kubenswrapper[4919]: I0310 23:14:28.511613 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed45484d-165d-4b13-8093-5a0ea05d4502-config\") pod \"ed45484d-165d-4b13-8093-5a0ea05d4502\" (UID: \"ed45484d-165d-4b13-8093-5a0ea05d4502\") " Mar 10 23:14:28 crc kubenswrapper[4919]: I0310 23:14:28.515225 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed45484d-165d-4b13-8093-5a0ea05d4502-kube-api-access-znqgd" (OuterVolumeSpecName: "kube-api-access-znqgd") pod "ed45484d-165d-4b13-8093-5a0ea05d4502" (UID: "ed45484d-165d-4b13-8093-5a0ea05d4502"). InnerVolumeSpecName "kube-api-access-znqgd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 23:14:28 crc kubenswrapper[4919]: I0310 23:14:28.527242 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed45484d-165d-4b13-8093-5a0ea05d4502-config" (OuterVolumeSpecName: "config") pod "ed45484d-165d-4b13-8093-5a0ea05d4502" (UID: "ed45484d-165d-4b13-8093-5a0ea05d4502"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 23:14:28 crc kubenswrapper[4919]: I0310 23:14:28.613561 4919 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed45484d-165d-4b13-8093-5a0ea05d4502-config\") on node \"crc\" DevicePath \"\"" Mar 10 23:14:28 crc kubenswrapper[4919]: I0310 23:14:28.613599 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-znqgd\" (UniqueName: \"kubernetes.io/projected/ed45484d-165d-4b13-8093-5a0ea05d4502-kube-api-access-znqgd\") on node \"crc\" DevicePath \"\"" Mar 10 23:14:28 crc kubenswrapper[4919]: I0310 23:14:28.706821 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Mar 10 23:14:28 crc kubenswrapper[4919]: E0310 23:14:28.707753 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed45484d-165d-4b13-8093-5a0ea05d4502" containerName="init" Mar 10 23:14:28 crc kubenswrapper[4919]: I0310 23:14:28.707954 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed45484d-165d-4b13-8093-5a0ea05d4502" containerName="init" Mar 10 23:14:28 crc kubenswrapper[4919]: I0310 23:14:28.709099 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed45484d-165d-4b13-8093-5a0ea05d4502" containerName="init" Mar 10 23:14:28 crc kubenswrapper[4919]: I0310 23:14:28.710742 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Mar 10 23:14:28 crc kubenswrapper[4919]: I0310 23:14:28.714779 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-zdmkq" Mar 10 23:14:28 crc kubenswrapper[4919]: I0310 23:14:28.715729 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Mar 10 23:14:28 crc kubenswrapper[4919]: I0310 23:14:28.718014 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 10 23:14:28 crc kubenswrapper[4919]: I0310 23:14:28.757344 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Mar 10 23:14:28 crc kubenswrapper[4919]: I0310 23:14:28.760173 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Mar 10 23:14:28 crc kubenswrapper[4919]: I0310 23:14:28.762648 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Mar 10 23:14:28 crc kubenswrapper[4919]: I0310 23:14:28.816626 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/8137bb36-2761-459f-a700-3c497dbe0937-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"8137bb36-2761-459f-a700-3c497dbe0937\") " pod="openstack/openstack-galera-0" Mar 10 23:14:28 crc kubenswrapper[4919]: I0310 23:14:28.816671 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/8137bb36-2761-459f-a700-3c497dbe0937-config-data-generated\") pod \"openstack-galera-0\" (UID: \"8137bb36-2761-459f-a700-3c497dbe0937\") " pod="openstack/openstack-galera-0" Mar 10 23:14:28 crc kubenswrapper[4919]: I0310 23:14:28.816689 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8137bb36-2761-459f-a700-3c497dbe0937-operator-scripts\") pod \"openstack-galera-0\" (UID: \"8137bb36-2761-459f-a700-3c497dbe0937\") " pod="openstack/openstack-galera-0" Mar 10 23:14:28 crc kubenswrapper[4919]: I0310 23:14:28.816744 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vbdtb\" (UniqueName: \"kubernetes.io/projected/8137bb36-2761-459f-a700-3c497dbe0937-kube-api-access-vbdtb\") pod \"openstack-galera-0\" (UID: \"8137bb36-2761-459f-a700-3c497dbe0937\") " pod="openstack/openstack-galera-0" Mar 10 23:14:28 crc kubenswrapper[4919]: I0310 23:14:28.816802 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-5851f81b-4812-42a7-ab79-4465eedf7e7d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5851f81b-4812-42a7-ab79-4465eedf7e7d\") pod \"openstack-galera-0\" (UID: \"8137bb36-2761-459f-a700-3c497dbe0937\") " pod="openstack/openstack-galera-0" Mar 10 23:14:28 crc kubenswrapper[4919]: I0310 23:14:28.816825 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8137bb36-2761-459f-a700-3c497dbe0937-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"8137bb36-2761-459f-a700-3c497dbe0937\") " pod="openstack/openstack-galera-0" Mar 10 23:14:28 crc kubenswrapper[4919]: I0310 23:14:28.816843 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/8137bb36-2761-459f-a700-3c497dbe0937-kolla-config\") pod \"openstack-galera-0\" (UID: \"8137bb36-2761-459f-a700-3c497dbe0937\") " pod="openstack/openstack-galera-0" Mar 10 23:14:28 crc kubenswrapper[4919]: I0310 23:14:28.816867 4919 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/8137bb36-2761-459f-a700-3c497dbe0937-config-data-default\") pod \"openstack-galera-0\" (UID: \"8137bb36-2761-459f-a700-3c497dbe0937\") " pod="openstack/openstack-galera-0" Mar 10 23:14:28 crc kubenswrapper[4919]: I0310 23:14:28.918111 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/8137bb36-2761-459f-a700-3c497dbe0937-kolla-config\") pod \"openstack-galera-0\" (UID: \"8137bb36-2761-459f-a700-3c497dbe0937\") " pod="openstack/openstack-galera-0" Mar 10 23:14:28 crc kubenswrapper[4919]: I0310 23:14:28.918170 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/8137bb36-2761-459f-a700-3c497dbe0937-config-data-default\") pod \"openstack-galera-0\" (UID: \"8137bb36-2761-459f-a700-3c497dbe0937\") " pod="openstack/openstack-galera-0" Mar 10 23:14:28 crc kubenswrapper[4919]: I0310 23:14:28.918202 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/8137bb36-2761-459f-a700-3c497dbe0937-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"8137bb36-2761-459f-a700-3c497dbe0937\") " pod="openstack/openstack-galera-0" Mar 10 23:14:28 crc kubenswrapper[4919]: I0310 23:14:28.918221 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/8137bb36-2761-459f-a700-3c497dbe0937-config-data-generated\") pod \"openstack-galera-0\" (UID: \"8137bb36-2761-459f-a700-3c497dbe0937\") " pod="openstack/openstack-galera-0" Mar 10 23:14:28 crc kubenswrapper[4919]: I0310 23:14:28.918236 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/8137bb36-2761-459f-a700-3c497dbe0937-operator-scripts\") pod \"openstack-galera-0\" (UID: \"8137bb36-2761-459f-a700-3c497dbe0937\") " pod="openstack/openstack-galera-0" Mar 10 23:14:28 crc kubenswrapper[4919]: I0310 23:14:28.918269 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vbdtb\" (UniqueName: \"kubernetes.io/projected/8137bb36-2761-459f-a700-3c497dbe0937-kube-api-access-vbdtb\") pod \"openstack-galera-0\" (UID: \"8137bb36-2761-459f-a700-3c497dbe0937\") " pod="openstack/openstack-galera-0" Mar 10 23:14:28 crc kubenswrapper[4919]: I0310 23:14:28.918324 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-5851f81b-4812-42a7-ab79-4465eedf7e7d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5851f81b-4812-42a7-ab79-4465eedf7e7d\") pod \"openstack-galera-0\" (UID: \"8137bb36-2761-459f-a700-3c497dbe0937\") " pod="openstack/openstack-galera-0" Mar 10 23:14:28 crc kubenswrapper[4919]: I0310 23:14:28.918349 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8137bb36-2761-459f-a700-3c497dbe0937-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"8137bb36-2761-459f-a700-3c497dbe0937\") " pod="openstack/openstack-galera-0" Mar 10 23:14:28 crc kubenswrapper[4919]: I0310 23:14:28.919328 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/8137bb36-2761-459f-a700-3c497dbe0937-config-data-generated\") pod \"openstack-galera-0\" (UID: \"8137bb36-2761-459f-a700-3c497dbe0937\") " pod="openstack/openstack-galera-0" Mar 10 23:14:28 crc kubenswrapper[4919]: I0310 23:14:28.919563 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/8137bb36-2761-459f-a700-3c497dbe0937-kolla-config\") pod 
\"openstack-galera-0\" (UID: \"8137bb36-2761-459f-a700-3c497dbe0937\") " pod="openstack/openstack-galera-0" Mar 10 23:14:28 crc kubenswrapper[4919]: I0310 23:14:28.919764 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/8137bb36-2761-459f-a700-3c497dbe0937-config-data-default\") pod \"openstack-galera-0\" (UID: \"8137bb36-2761-459f-a700-3c497dbe0937\") " pod="openstack/openstack-galera-0" Mar 10 23:14:28 crc kubenswrapper[4919]: I0310 23:14:28.920733 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8137bb36-2761-459f-a700-3c497dbe0937-operator-scripts\") pod \"openstack-galera-0\" (UID: \"8137bb36-2761-459f-a700-3c497dbe0937\") " pod="openstack/openstack-galera-0" Mar 10 23:14:28 crc kubenswrapper[4919]: I0310 23:14:28.922523 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8137bb36-2761-459f-a700-3c497dbe0937-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"8137bb36-2761-459f-a700-3c497dbe0937\") " pod="openstack/openstack-galera-0" Mar 10 23:14:28 crc kubenswrapper[4919]: I0310 23:14:28.923916 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/8137bb36-2761-459f-a700-3c497dbe0937-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"8137bb36-2761-459f-a700-3c497dbe0937\") " pod="openstack/openstack-galera-0" Mar 10 23:14:28 crc kubenswrapper[4919]: I0310 23:14:28.929839 4919 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 10 23:14:28 crc kubenswrapper[4919]: I0310 23:14:28.929874 4919 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-5851f81b-4812-42a7-ab79-4465eedf7e7d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5851f81b-4812-42a7-ab79-4465eedf7e7d\") pod \"openstack-galera-0\" (UID: \"8137bb36-2761-459f-a700-3c497dbe0937\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/2e4a66c141c5b72d82936edb6c6dd55e534562e6903bd7a1337dcb970b51d777/globalmount\"" pod="openstack/openstack-galera-0" Mar 10 23:14:28 crc kubenswrapper[4919]: I0310 23:14:28.939753 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vbdtb\" (UniqueName: \"kubernetes.io/projected/8137bb36-2761-459f-a700-3c497dbe0937-kube-api-access-vbdtb\") pod \"openstack-galera-0\" (UID: \"8137bb36-2761-459f-a700-3c497dbe0937\") " pod="openstack/openstack-galera-0" Mar 10 23:14:28 crc kubenswrapper[4919]: I0310 23:14:28.955631 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-5851f81b-4812-42a7-ab79-4465eedf7e7d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5851f81b-4812-42a7-ab79-4465eedf7e7d\") pod \"openstack-galera-0\" (UID: \"8137bb36-2761-459f-a700-3c497dbe0937\") " pod="openstack/openstack-galera-0" Mar 10 23:14:29 crc kubenswrapper[4919]: I0310 23:14:29.001799 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c44667757-vpzm4" event={"ID":"ed45484d-165d-4b13-8093-5a0ea05d4502","Type":"ContainerDied","Data":"9e154b38a54d9a6e99e657452ac54b0230a5113cec5bce08523fbd4b61a92861"} Mar 10 23:14:29 crc kubenswrapper[4919]: I0310 23:14:29.001822 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-c44667757-vpzm4" Mar 10 23:14:29 crc kubenswrapper[4919]: I0310 23:14:29.001888 4919 scope.go:117] "RemoveContainer" containerID="c09fd9d1033e498da4c7dfdc26804ab3c981aa66c5e00decacfab8d62cb5f374" Mar 10 23:14:29 crc kubenswrapper[4919]: I0310 23:14:29.013648 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-ff89b6977-mvfqs" event={"ID":"9080a849-0d02-42dd-9cbb-4b6f29055ad1","Type":"ContainerStarted","Data":"157884d0fa01ebc69e5dfadc4a2076242cf67ed9b4830956c81f5dfdea466059"} Mar 10 23:14:29 crc kubenswrapper[4919]: I0310 23:14:29.013711 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-ff89b6977-mvfqs" Mar 10 23:14:29 crc kubenswrapper[4919]: I0310 23:14:29.018940 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"d5b3447d-7136-4c2b-bd66-18e26e7a157e","Type":"ContainerStarted","Data":"b9b05ed411750745ac7102a308f28b8f514c0caa506d31129b795adfdfdcf1f9"} Mar 10 23:14:29 crc kubenswrapper[4919]: I0310 23:14:29.026935 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"08f1d385-c9ed-4616-b201-0234049fa538","Type":"ContainerStarted","Data":"e4b70775fe46ffccfe8be626fb50dcbb85af911a84e9a2c394a2849494392f52"} Mar 10 23:14:29 crc kubenswrapper[4919]: I0310 23:14:29.066485 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-ff89b6977-mvfqs" podStartSLOduration=3.066460323 podStartE2EDuration="3.066460323s" podCreationTimestamp="2026-03-10 23:14:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 23:14:29.046333705 +0000 UTC m=+5056.288214313" watchObservedRunningTime="2026-03-10 23:14:29.066460323 +0000 UTC m=+5056.308340931" Mar 10 23:14:29 crc kubenswrapper[4919]: I0310 23:14:29.074965 4919 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Mar 10 23:14:29 crc kubenswrapper[4919]: I0310 23:14:29.146444 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-c44667757-vpzm4"] Mar 10 23:14:29 crc kubenswrapper[4919]: I0310 23:14:29.150071 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-c44667757-vpzm4"] Mar 10 23:14:29 crc kubenswrapper[4919]: I0310 23:14:29.391669 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 10 23:14:29 crc kubenswrapper[4919]: I0310 23:14:29.511570 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74bed246-fedc-4d7b-a81e-9217d69613fd" path="/var/lib/kubelet/pods/74bed246-fedc-4d7b-a81e-9217d69613fd/volumes" Mar 10 23:14:29 crc kubenswrapper[4919]: I0310 23:14:29.512944 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed45484d-165d-4b13-8093-5a0ea05d4502" path="/var/lib/kubelet/pods/ed45484d-165d-4b13-8093-5a0ea05d4502/volumes" Mar 10 23:14:30 crc kubenswrapper[4919]: I0310 23:14:30.037427 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"d5b3447d-7136-4c2b-bd66-18e26e7a157e","Type":"ContainerStarted","Data":"a1feafe96622f2d3069b55ef120e7457a8ff0ed1826d9b9f66073436ca33017d"} Mar 10 23:14:30 crc kubenswrapper[4919]: I0310 23:14:30.039086 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"08f1d385-c9ed-4616-b201-0234049fa538","Type":"ContainerStarted","Data":"eaa852c88b5ceb7e4121d56c05b283d9ddc1e39df0da4a8a863933118c3ac2aa"} Mar 10 23:14:30 crc kubenswrapper[4919]: I0310 23:14:30.041905 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fb77f9685-gcrhn" 
event={"ID":"221bd2a2-46fa-44a6-a49d-4d21f0dc396e","Type":"ContainerStarted","Data":"9a81193acfdbe0c5c45ee13bc0b144d396b2aacd5fca0c9f21d158259c7ce208"} Mar 10 23:14:30 crc kubenswrapper[4919]: I0310 23:14:30.042091 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5fb77f9685-gcrhn" Mar 10 23:14:30 crc kubenswrapper[4919]: I0310 23:14:30.046543 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"8137bb36-2761-459f-a700-3c497dbe0937","Type":"ContainerStarted","Data":"d570d49111087886aad764e6bfbfad665c17077dcbfebd0b8b87e52e52258e00"} Mar 10 23:14:30 crc kubenswrapper[4919]: I0310 23:14:30.046575 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"8137bb36-2761-459f-a700-3c497dbe0937","Type":"ContainerStarted","Data":"6b6ce9b4f4e9576f4e53e69d1ee4f5c941f20afdd592bab41ce75d10a14fd00c"} Mar 10 23:14:30 crc kubenswrapper[4919]: I0310 23:14:30.114839 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5fb77f9685-gcrhn" podStartSLOduration=4.114821341 podStartE2EDuration="4.114821341s" podCreationTimestamp="2026-03-10 23:14:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 23:14:30.107295335 +0000 UTC m=+5057.349175953" watchObservedRunningTime="2026-03-10 23:14:30.114821341 +0000 UTC m=+5057.356701939" Mar 10 23:14:30 crc kubenswrapper[4919]: I0310 23:14:30.150177 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 10 23:14:30 crc kubenswrapper[4919]: I0310 23:14:30.151333 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 10 23:14:30 crc kubenswrapper[4919]: I0310 23:14:30.153889 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-zrj7b" Mar 10 23:14:30 crc kubenswrapper[4919]: I0310 23:14:30.153908 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Mar 10 23:14:30 crc kubenswrapper[4919]: I0310 23:14:30.160297 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Mar 10 23:14:30 crc kubenswrapper[4919]: I0310 23:14:30.160860 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Mar 10 23:14:30 crc kubenswrapper[4919]: I0310 23:14:30.164742 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 10 23:14:30 crc kubenswrapper[4919]: I0310 23:14:30.350325 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d3d38d6-13a6-4aeb-850c-96c069d15e64-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"6d3d38d6-13a6-4aeb-850c-96c069d15e64\") " pod="openstack/openstack-cell1-galera-0" Mar 10 23:14:30 crc kubenswrapper[4919]: I0310 23:14:30.350373 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/6d3d38d6-13a6-4aeb-850c-96c069d15e64-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"6d3d38d6-13a6-4aeb-850c-96c069d15e64\") " pod="openstack/openstack-cell1-galera-0" Mar 10 23:14:30 crc kubenswrapper[4919]: I0310 23:14:30.350404 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: 
\"kubernetes.io/empty-dir/6d3d38d6-13a6-4aeb-850c-96c069d15e64-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"6d3d38d6-13a6-4aeb-850c-96c069d15e64\") " pod="openstack/openstack-cell1-galera-0" Mar 10 23:14:30 crc kubenswrapper[4919]: I0310 23:14:30.350431 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6d3d38d6-13a6-4aeb-850c-96c069d15e64-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"6d3d38d6-13a6-4aeb-850c-96c069d15e64\") " pod="openstack/openstack-cell1-galera-0" Mar 10 23:14:30 crc kubenswrapper[4919]: I0310 23:14:30.350490 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-htl4q\" (UniqueName: \"kubernetes.io/projected/6d3d38d6-13a6-4aeb-850c-96c069d15e64-kube-api-access-htl4q\") pod \"openstack-cell1-galera-0\" (UID: \"6d3d38d6-13a6-4aeb-850c-96c069d15e64\") " pod="openstack/openstack-cell1-galera-0" Mar 10 23:14:30 crc kubenswrapper[4919]: I0310 23:14:30.350509 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/6d3d38d6-13a6-4aeb-850c-96c069d15e64-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"6d3d38d6-13a6-4aeb-850c-96c069d15e64\") " pod="openstack/openstack-cell1-galera-0" Mar 10 23:14:30 crc kubenswrapper[4919]: I0310 23:14:30.350543 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-8f52cdfb-958a-4f60-9560-7a77a9716151\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8f52cdfb-958a-4f60-9560-7a77a9716151\") pod \"openstack-cell1-galera-0\" (UID: \"6d3d38d6-13a6-4aeb-850c-96c069d15e64\") " pod="openstack/openstack-cell1-galera-0" Mar 10 23:14:30 crc kubenswrapper[4919]: I0310 23:14:30.350568 4919 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/6d3d38d6-13a6-4aeb-850c-96c069d15e64-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"6d3d38d6-13a6-4aeb-850c-96c069d15e64\") " pod="openstack/openstack-cell1-galera-0" Mar 10 23:14:30 crc kubenswrapper[4919]: I0310 23:14:30.387426 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Mar 10 23:14:30 crc kubenswrapper[4919]: I0310 23:14:30.388380 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Mar 10 23:14:30 crc kubenswrapper[4919]: I0310 23:14:30.390355 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Mar 10 23:14:30 crc kubenswrapper[4919]: I0310 23:14:30.390693 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Mar 10 23:14:30 crc kubenswrapper[4919]: I0310 23:14:30.391786 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-hdxfp" Mar 10 23:14:30 crc kubenswrapper[4919]: I0310 23:14:30.406982 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 10 23:14:30 crc kubenswrapper[4919]: I0310 23:14:30.452784 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-8f52cdfb-958a-4f60-9560-7a77a9716151\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8f52cdfb-958a-4f60-9560-7a77a9716151\") pod \"openstack-cell1-galera-0\" (UID: \"6d3d38d6-13a6-4aeb-850c-96c069d15e64\") " pod="openstack/openstack-cell1-galera-0" Mar 10 23:14:30 crc kubenswrapper[4919]: I0310 23:14:30.452892 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/6d3d38d6-13a6-4aeb-850c-96c069d15e64-config-data-default\") pod 
\"openstack-cell1-galera-0\" (UID: \"6d3d38d6-13a6-4aeb-850c-96c069d15e64\") " pod="openstack/openstack-cell1-galera-0" Mar 10 23:14:30 crc kubenswrapper[4919]: I0310 23:14:30.452944 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d3d38d6-13a6-4aeb-850c-96c069d15e64-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"6d3d38d6-13a6-4aeb-850c-96c069d15e64\") " pod="openstack/openstack-cell1-galera-0" Mar 10 23:14:30 crc kubenswrapper[4919]: I0310 23:14:30.452968 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/6d3d38d6-13a6-4aeb-850c-96c069d15e64-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"6d3d38d6-13a6-4aeb-850c-96c069d15e64\") " pod="openstack/openstack-cell1-galera-0" Mar 10 23:14:30 crc kubenswrapper[4919]: I0310 23:14:30.452995 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/6d3d38d6-13a6-4aeb-850c-96c069d15e64-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"6d3d38d6-13a6-4aeb-850c-96c069d15e64\") " pod="openstack/openstack-cell1-galera-0" Mar 10 23:14:30 crc kubenswrapper[4919]: I0310 23:14:30.453026 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6d3d38d6-13a6-4aeb-850c-96c069d15e64-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"6d3d38d6-13a6-4aeb-850c-96c069d15e64\") " pod="openstack/openstack-cell1-galera-0" Mar 10 23:14:30 crc kubenswrapper[4919]: I0310 23:14:30.453093 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-htl4q\" (UniqueName: \"kubernetes.io/projected/6d3d38d6-13a6-4aeb-850c-96c069d15e64-kube-api-access-htl4q\") pod \"openstack-cell1-galera-0\" (UID: 
\"6d3d38d6-13a6-4aeb-850c-96c069d15e64\") " pod="openstack/openstack-cell1-galera-0" Mar 10 23:14:30 crc kubenswrapper[4919]: I0310 23:14:30.453121 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/6d3d38d6-13a6-4aeb-850c-96c069d15e64-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"6d3d38d6-13a6-4aeb-850c-96c069d15e64\") " pod="openstack/openstack-cell1-galera-0" Mar 10 23:14:30 crc kubenswrapper[4919]: I0310 23:14:30.453512 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/6d3d38d6-13a6-4aeb-850c-96c069d15e64-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"6d3d38d6-13a6-4aeb-850c-96c069d15e64\") " pod="openstack/openstack-cell1-galera-0" Mar 10 23:14:30 crc kubenswrapper[4919]: I0310 23:14:30.454733 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6d3d38d6-13a6-4aeb-850c-96c069d15e64-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"6d3d38d6-13a6-4aeb-850c-96c069d15e64\") " pod="openstack/openstack-cell1-galera-0" Mar 10 23:14:30 crc kubenswrapper[4919]: I0310 23:14:30.456356 4919 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 10 23:14:30 crc kubenswrapper[4919]: I0310 23:14:30.456386 4919 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-8f52cdfb-958a-4f60-9560-7a77a9716151\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8f52cdfb-958a-4f60-9560-7a77a9716151\") pod \"openstack-cell1-galera-0\" (UID: \"6d3d38d6-13a6-4aeb-850c-96c069d15e64\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/297a4eb176c884ab5e5e88e71a04f94b5d8914f1bbf309854221783c8f90518d/globalmount\"" pod="openstack/openstack-cell1-galera-0" Mar 10 23:14:30 crc kubenswrapper[4919]: I0310 23:14:30.456407 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/6d3d38d6-13a6-4aeb-850c-96c069d15e64-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"6d3d38d6-13a6-4aeb-850c-96c069d15e64\") " pod="openstack/openstack-cell1-galera-0" Mar 10 23:14:30 crc kubenswrapper[4919]: I0310 23:14:30.457270 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/6d3d38d6-13a6-4aeb-850c-96c069d15e64-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"6d3d38d6-13a6-4aeb-850c-96c069d15e64\") " pod="openstack/openstack-cell1-galera-0" Mar 10 23:14:30 crc kubenswrapper[4919]: I0310 23:14:30.458091 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/6d3d38d6-13a6-4aeb-850c-96c069d15e64-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"6d3d38d6-13a6-4aeb-850c-96c069d15e64\") " pod="openstack/openstack-cell1-galera-0" Mar 10 23:14:30 crc kubenswrapper[4919]: I0310 23:14:30.463352 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d3d38d6-13a6-4aeb-850c-96c069d15e64-combined-ca-bundle\") pod 
\"openstack-cell1-galera-0\" (UID: \"6d3d38d6-13a6-4aeb-850c-96c069d15e64\") " pod="openstack/openstack-cell1-galera-0" Mar 10 23:14:30 crc kubenswrapper[4919]: I0310 23:14:30.470227 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-htl4q\" (UniqueName: \"kubernetes.io/projected/6d3d38d6-13a6-4aeb-850c-96c069d15e64-kube-api-access-htl4q\") pod \"openstack-cell1-galera-0\" (UID: \"6d3d38d6-13a6-4aeb-850c-96c069d15e64\") " pod="openstack/openstack-cell1-galera-0" Mar 10 23:14:30 crc kubenswrapper[4919]: I0310 23:14:30.498635 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-8f52cdfb-958a-4f60-9560-7a77a9716151\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8f52cdfb-958a-4f60-9560-7a77a9716151\") pod \"openstack-cell1-galera-0\" (UID: \"6d3d38d6-13a6-4aeb-850c-96c069d15e64\") " pod="openstack/openstack-cell1-galera-0" Mar 10 23:14:30 crc kubenswrapper[4919]: I0310 23:14:30.554199 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be3159e3-fc48-4a94-a13f-6f179e5d8ad9-combined-ca-bundle\") pod \"memcached-0\" (UID: \"be3159e3-fc48-4a94-a13f-6f179e5d8ad9\") " pod="openstack/memcached-0" Mar 10 23:14:30 crc kubenswrapper[4919]: I0310 23:14:30.554246 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/be3159e3-fc48-4a94-a13f-6f179e5d8ad9-config-data\") pod \"memcached-0\" (UID: \"be3159e3-fc48-4a94-a13f-6f179e5d8ad9\") " pod="openstack/memcached-0" Mar 10 23:14:30 crc kubenswrapper[4919]: I0310 23:14:30.554262 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r88n2\" (UniqueName: \"kubernetes.io/projected/be3159e3-fc48-4a94-a13f-6f179e5d8ad9-kube-api-access-r88n2\") pod \"memcached-0\" (UID: 
\"be3159e3-fc48-4a94-a13f-6f179e5d8ad9\") " pod="openstack/memcached-0" Mar 10 23:14:30 crc kubenswrapper[4919]: I0310 23:14:30.554306 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/be3159e3-fc48-4a94-a13f-6f179e5d8ad9-kolla-config\") pod \"memcached-0\" (UID: \"be3159e3-fc48-4a94-a13f-6f179e5d8ad9\") " pod="openstack/memcached-0" Mar 10 23:14:30 crc kubenswrapper[4919]: I0310 23:14:30.554375 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/be3159e3-fc48-4a94-a13f-6f179e5d8ad9-memcached-tls-certs\") pod \"memcached-0\" (UID: \"be3159e3-fc48-4a94-a13f-6f179e5d8ad9\") " pod="openstack/memcached-0" Mar 10 23:14:30 crc kubenswrapper[4919]: I0310 23:14:30.655369 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/be3159e3-fc48-4a94-a13f-6f179e5d8ad9-memcached-tls-certs\") pod \"memcached-0\" (UID: \"be3159e3-fc48-4a94-a13f-6f179e5d8ad9\") " pod="openstack/memcached-0" Mar 10 23:14:30 crc kubenswrapper[4919]: I0310 23:14:30.655548 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/be3159e3-fc48-4a94-a13f-6f179e5d8ad9-config-data\") pod \"memcached-0\" (UID: \"be3159e3-fc48-4a94-a13f-6f179e5d8ad9\") " pod="openstack/memcached-0" Mar 10 23:14:30 crc kubenswrapper[4919]: I0310 23:14:30.655575 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be3159e3-fc48-4a94-a13f-6f179e5d8ad9-combined-ca-bundle\") pod \"memcached-0\" (UID: \"be3159e3-fc48-4a94-a13f-6f179e5d8ad9\") " pod="openstack/memcached-0" Mar 10 23:14:30 crc kubenswrapper[4919]: I0310 23:14:30.655605 4919 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-r88n2\" (UniqueName: \"kubernetes.io/projected/be3159e3-fc48-4a94-a13f-6f179e5d8ad9-kube-api-access-r88n2\") pod \"memcached-0\" (UID: \"be3159e3-fc48-4a94-a13f-6f179e5d8ad9\") " pod="openstack/memcached-0" Mar 10 23:14:30 crc kubenswrapper[4919]: I0310 23:14:30.656668 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/be3159e3-fc48-4a94-a13f-6f179e5d8ad9-kolla-config\") pod \"memcached-0\" (UID: \"be3159e3-fc48-4a94-a13f-6f179e5d8ad9\") " pod="openstack/memcached-0" Mar 10 23:14:30 crc kubenswrapper[4919]: I0310 23:14:30.657776 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/be3159e3-fc48-4a94-a13f-6f179e5d8ad9-config-data\") pod \"memcached-0\" (UID: \"be3159e3-fc48-4a94-a13f-6f179e5d8ad9\") " pod="openstack/memcached-0" Mar 10 23:14:30 crc kubenswrapper[4919]: I0310 23:14:30.657838 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/be3159e3-fc48-4a94-a13f-6f179e5d8ad9-kolla-config\") pod \"memcached-0\" (UID: \"be3159e3-fc48-4a94-a13f-6f179e5d8ad9\") " pod="openstack/memcached-0" Mar 10 23:14:30 crc kubenswrapper[4919]: I0310 23:14:30.660113 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be3159e3-fc48-4a94-a13f-6f179e5d8ad9-combined-ca-bundle\") pod \"memcached-0\" (UID: \"be3159e3-fc48-4a94-a13f-6f179e5d8ad9\") " pod="openstack/memcached-0" Mar 10 23:14:30 crc kubenswrapper[4919]: I0310 23:14:30.660308 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/be3159e3-fc48-4a94-a13f-6f179e5d8ad9-memcached-tls-certs\") pod \"memcached-0\" (UID: \"be3159e3-fc48-4a94-a13f-6f179e5d8ad9\") " 
pod="openstack/memcached-0" Mar 10 23:14:30 crc kubenswrapper[4919]: I0310 23:14:30.670636 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r88n2\" (UniqueName: \"kubernetes.io/projected/be3159e3-fc48-4a94-a13f-6f179e5d8ad9-kube-api-access-r88n2\") pod \"memcached-0\" (UID: \"be3159e3-fc48-4a94-a13f-6f179e5d8ad9\") " pod="openstack/memcached-0" Mar 10 23:14:30 crc kubenswrapper[4919]: I0310 23:14:30.704713 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Mar 10 23:14:30 crc kubenswrapper[4919]: I0310 23:14:30.767770 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 10 23:14:31 crc kubenswrapper[4919]: I0310 23:14:31.128912 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 10 23:14:31 crc kubenswrapper[4919]: W0310 23:14:31.132679 4919 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbe3159e3_fc48_4a94_a13f_6f179e5d8ad9.slice/crio-59e084c6ded6f48dddd620b9443f6372b56f29068714c9176603aad987bc43b4 WatchSource:0}: Error finding container 59e084c6ded6f48dddd620b9443f6372b56f29068714c9176603aad987bc43b4: Status 404 returned error can't find the container with id 59e084c6ded6f48dddd620b9443f6372b56f29068714c9176603aad987bc43b4 Mar 10 23:14:31 crc kubenswrapper[4919]: I0310 23:14:31.290611 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 10 23:14:31 crc kubenswrapper[4919]: W0310 23:14:31.298456 4919 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6d3d38d6_13a6_4aeb_850c_96c069d15e64.slice/crio-3380c59c09e3b9cd98632523d0958b9be7e3d00978b02947435808d819c7a319 WatchSource:0}: Error finding container 
3380c59c09e3b9cd98632523d0958b9be7e3d00978b02947435808d819c7a319: Status 404 returned error can't find the container with id 3380c59c09e3b9cd98632523d0958b9be7e3d00978b02947435808d819c7a319
Mar 10 23:14:32 crc kubenswrapper[4919]: I0310 23:14:32.063274 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"be3159e3-fc48-4a94-a13f-6f179e5d8ad9","Type":"ContainerStarted","Data":"b4db00e4831617c391796ac6575b263975b5335181adef70cd4afd2d2241853e"}
Mar 10 23:14:32 crc kubenswrapper[4919]: I0310 23:14:32.063960 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0"
Mar 10 23:14:32 crc kubenswrapper[4919]: I0310 23:14:32.064069 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"be3159e3-fc48-4a94-a13f-6f179e5d8ad9","Type":"ContainerStarted","Data":"59e084c6ded6f48dddd620b9443f6372b56f29068714c9176603aad987bc43b4"}
Mar 10 23:14:32 crc kubenswrapper[4919]: I0310 23:14:32.064778 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"6d3d38d6-13a6-4aeb-850c-96c069d15e64","Type":"ContainerStarted","Data":"c586aa0fc14340eee49ebd69bd2d8f7efb9596a77f9b07c25b5a9cc866251997"}
Mar 10 23:14:32 crc kubenswrapper[4919]: I0310 23:14:32.064949 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"6d3d38d6-13a6-4aeb-850c-96c069d15e64","Type":"ContainerStarted","Data":"3380c59c09e3b9cd98632523d0958b9be7e3d00978b02947435808d819c7a319"}
Mar 10 23:14:32 crc kubenswrapper[4919]: I0310 23:14:32.086347 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=2.086321973 podStartE2EDuration="2.086321973s" podCreationTimestamp="2026-03-10 23:14:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 23:14:32.079203309 +0000 UTC m=+5059.321083957" watchObservedRunningTime="2026-03-10 23:14:32.086321973 +0000 UTC m=+5059.328202621"
Mar 10 23:14:34 crc kubenswrapper[4919]: I0310 23:14:34.081440 4919 generic.go:334] "Generic (PLEG): container finished" podID="8137bb36-2761-459f-a700-3c497dbe0937" containerID="d570d49111087886aad764e6bfbfad665c17077dcbfebd0b8b87e52e52258e00" exitCode=0
Mar 10 23:14:34 crc kubenswrapper[4919]: I0310 23:14:34.081521 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"8137bb36-2761-459f-a700-3c497dbe0937","Type":"ContainerDied","Data":"d570d49111087886aad764e6bfbfad665c17077dcbfebd0b8b87e52e52258e00"}
Mar 10 23:14:35 crc kubenswrapper[4919]: I0310 23:14:35.585021 4919 generic.go:334] "Generic (PLEG): container finished" podID="6d3d38d6-13a6-4aeb-850c-96c069d15e64" containerID="c586aa0fc14340eee49ebd69bd2d8f7efb9596a77f9b07c25b5a9cc866251997" exitCode=0
Mar 10 23:14:35 crc kubenswrapper[4919]: I0310 23:14:35.585114 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"6d3d38d6-13a6-4aeb-850c-96c069d15e64","Type":"ContainerDied","Data":"c586aa0fc14340eee49ebd69bd2d8f7efb9596a77f9b07c25b5a9cc866251997"}
Mar 10 23:14:35 crc kubenswrapper[4919]: I0310 23:14:35.592641 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"8137bb36-2761-459f-a700-3c497dbe0937","Type":"ContainerStarted","Data":"e69e5822378077818a474e7ed176ba7fa429bc28a0bc342f73b85b26da305d0a"}
Mar 10 23:14:35 crc kubenswrapper[4919]: I0310 23:14:35.685359 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=8.685324054 podStartE2EDuration="8.685324054s" podCreationTimestamp="2026-03-10 23:14:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 23:14:35.661039542 +0000 UTC m=+5062.902920160" watchObservedRunningTime="2026-03-10 23:14:35.685324054 +0000 UTC m=+5062.927204672"
Mar 10 23:14:36 crc kubenswrapper[4919]: I0310 23:14:36.473553 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5fb77f9685-gcrhn"
Mar 10 23:14:36 crc kubenswrapper[4919]: I0310 23:14:36.605786 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"6d3d38d6-13a6-4aeb-850c-96c069d15e64","Type":"ContainerStarted","Data":"c85113f0aa7afd7ec00f3c19a05a2ce3ff6779ba6dc977496237a288e35e46cc"}
Mar 10 23:14:36 crc kubenswrapper[4919]: I0310 23:14:36.636072 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=7.636049 podStartE2EDuration="7.636049s" podCreationTimestamp="2026-03-10 23:14:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 23:14:36.628801353 +0000 UTC m=+5063.870681971" watchObservedRunningTime="2026-03-10 23:14:36.636049 +0000 UTC m=+5063.877929618"
Mar 10 23:14:36 crc kubenswrapper[4919]: I0310 23:14:36.797819 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-ff89b6977-mvfqs"
Mar 10 23:14:36 crc kubenswrapper[4919]: I0310 23:14:36.845538 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5fb77f9685-gcrhn"]
Mar 10 23:14:36 crc kubenswrapper[4919]: I0310 23:14:36.845862 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5fb77f9685-gcrhn" podUID="221bd2a2-46fa-44a6-a49d-4d21f0dc396e" containerName="dnsmasq-dns" containerID="cri-o://9a81193acfdbe0c5c45ee13bc0b144d396b2aacd5fca0c9f21d158259c7ce208" gracePeriod=10
Mar 10 23:14:37 crc kubenswrapper[4919]: I0310 23:14:37.387453 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5fb77f9685-gcrhn"
Mar 10 23:14:37 crc kubenswrapper[4919]: I0310 23:14:37.473249 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l78x5\" (UniqueName: \"kubernetes.io/projected/221bd2a2-46fa-44a6-a49d-4d21f0dc396e-kube-api-access-l78x5\") pod \"221bd2a2-46fa-44a6-a49d-4d21f0dc396e\" (UID: \"221bd2a2-46fa-44a6-a49d-4d21f0dc396e\") "
Mar 10 23:14:37 crc kubenswrapper[4919]: I0310 23:14:37.473483 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/221bd2a2-46fa-44a6-a49d-4d21f0dc396e-config\") pod \"221bd2a2-46fa-44a6-a49d-4d21f0dc396e\" (UID: \"221bd2a2-46fa-44a6-a49d-4d21f0dc396e\") "
Mar 10 23:14:37 crc kubenswrapper[4919]: I0310 23:14:37.473551 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/221bd2a2-46fa-44a6-a49d-4d21f0dc396e-dns-svc\") pod \"221bd2a2-46fa-44a6-a49d-4d21f0dc396e\" (UID: \"221bd2a2-46fa-44a6-a49d-4d21f0dc396e\") "
Mar 10 23:14:37 crc kubenswrapper[4919]: I0310 23:14:37.480754 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/221bd2a2-46fa-44a6-a49d-4d21f0dc396e-kube-api-access-l78x5" (OuterVolumeSpecName: "kube-api-access-l78x5") pod "221bd2a2-46fa-44a6-a49d-4d21f0dc396e" (UID: "221bd2a2-46fa-44a6-a49d-4d21f0dc396e"). InnerVolumeSpecName "kube-api-access-l78x5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 23:14:37 crc kubenswrapper[4919]: I0310 23:14:37.504819 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/221bd2a2-46fa-44a6-a49d-4d21f0dc396e-config" (OuterVolumeSpecName: "config") pod "221bd2a2-46fa-44a6-a49d-4d21f0dc396e" (UID: "221bd2a2-46fa-44a6-a49d-4d21f0dc396e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 23:14:37 crc kubenswrapper[4919]: I0310 23:14:37.509682 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/221bd2a2-46fa-44a6-a49d-4d21f0dc396e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "221bd2a2-46fa-44a6-a49d-4d21f0dc396e" (UID: "221bd2a2-46fa-44a6-a49d-4d21f0dc396e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 23:14:37 crc kubenswrapper[4919]: I0310 23:14:37.575227 4919 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/221bd2a2-46fa-44a6-a49d-4d21f0dc396e-config\") on node \"crc\" DevicePath \"\""
Mar 10 23:14:37 crc kubenswrapper[4919]: I0310 23:14:37.575259 4919 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/221bd2a2-46fa-44a6-a49d-4d21f0dc396e-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 10 23:14:37 crc kubenswrapper[4919]: I0310 23:14:37.575268 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l78x5\" (UniqueName: \"kubernetes.io/projected/221bd2a2-46fa-44a6-a49d-4d21f0dc396e-kube-api-access-l78x5\") on node \"crc\" DevicePath \"\""
Mar 10 23:14:37 crc kubenswrapper[4919]: I0310 23:14:37.614067 4919 generic.go:334] "Generic (PLEG): container finished" podID="221bd2a2-46fa-44a6-a49d-4d21f0dc396e" containerID="9a81193acfdbe0c5c45ee13bc0b144d396b2aacd5fca0c9f21d158259c7ce208" exitCode=0
Mar 10 23:14:37 crc kubenswrapper[4919]: I0310 23:14:37.614103 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5fb77f9685-gcrhn"
Mar 10 23:14:37 crc kubenswrapper[4919]: I0310 23:14:37.614113 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fb77f9685-gcrhn" event={"ID":"221bd2a2-46fa-44a6-a49d-4d21f0dc396e","Type":"ContainerDied","Data":"9a81193acfdbe0c5c45ee13bc0b144d396b2aacd5fca0c9f21d158259c7ce208"}
Mar 10 23:14:37 crc kubenswrapper[4919]: I0310 23:14:37.614142 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fb77f9685-gcrhn" event={"ID":"221bd2a2-46fa-44a6-a49d-4d21f0dc396e","Type":"ContainerDied","Data":"0d654d33522af7b60ed768d2d2e709eca4a41ad463987801c17877f4339e7f15"}
Mar 10 23:14:37 crc kubenswrapper[4919]: I0310 23:14:37.614161 4919 scope.go:117] "RemoveContainer" containerID="9a81193acfdbe0c5c45ee13bc0b144d396b2aacd5fca0c9f21d158259c7ce208"
Mar 10 23:14:37 crc kubenswrapper[4919]: I0310 23:14:37.635981 4919 scope.go:117] "RemoveContainer" containerID="d18f9a95648798331fdfa0323a47aa45d422f4d48a27ddbf4733664c79edd1c0"
Mar 10 23:14:37 crc kubenswrapper[4919]: I0310 23:14:37.643592 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5fb77f9685-gcrhn"]
Mar 10 23:14:37 crc kubenswrapper[4919]: I0310 23:14:37.649543 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5fb77f9685-gcrhn"]
Mar 10 23:14:37 crc kubenswrapper[4919]: I0310 23:14:37.663120 4919 scope.go:117] "RemoveContainer" containerID="9a81193acfdbe0c5c45ee13bc0b144d396b2aacd5fca0c9f21d158259c7ce208"
Mar 10 23:14:37 crc kubenswrapper[4919]: E0310 23:14:37.663876 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9a81193acfdbe0c5c45ee13bc0b144d396b2aacd5fca0c9f21d158259c7ce208\": container with ID starting with 9a81193acfdbe0c5c45ee13bc0b144d396b2aacd5fca0c9f21d158259c7ce208 not found: ID does not exist" containerID="9a81193acfdbe0c5c45ee13bc0b144d396b2aacd5fca0c9f21d158259c7ce208"
Mar 10 23:14:37 crc kubenswrapper[4919]: I0310 23:14:37.663910 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a81193acfdbe0c5c45ee13bc0b144d396b2aacd5fca0c9f21d158259c7ce208"} err="failed to get container status \"9a81193acfdbe0c5c45ee13bc0b144d396b2aacd5fca0c9f21d158259c7ce208\": rpc error: code = NotFound desc = could not find container \"9a81193acfdbe0c5c45ee13bc0b144d396b2aacd5fca0c9f21d158259c7ce208\": container with ID starting with 9a81193acfdbe0c5c45ee13bc0b144d396b2aacd5fca0c9f21d158259c7ce208 not found: ID does not exist"
Mar 10 23:14:37 crc kubenswrapper[4919]: I0310 23:14:37.663931 4919 scope.go:117] "RemoveContainer" containerID="d18f9a95648798331fdfa0323a47aa45d422f4d48a27ddbf4733664c79edd1c0"
Mar 10 23:14:37 crc kubenswrapper[4919]: E0310 23:14:37.664189 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d18f9a95648798331fdfa0323a47aa45d422f4d48a27ddbf4733664c79edd1c0\": container with ID starting with d18f9a95648798331fdfa0323a47aa45d422f4d48a27ddbf4733664c79edd1c0 not found: ID does not exist" containerID="d18f9a95648798331fdfa0323a47aa45d422f4d48a27ddbf4733664c79edd1c0"
Mar 10 23:14:37 crc kubenswrapper[4919]: I0310 23:14:37.664237 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d18f9a95648798331fdfa0323a47aa45d422f4d48a27ddbf4733664c79edd1c0"} err="failed to get container status \"d18f9a95648798331fdfa0323a47aa45d422f4d48a27ddbf4733664c79edd1c0\": rpc error: code = NotFound desc = could not find container \"d18f9a95648798331fdfa0323a47aa45d422f4d48a27ddbf4733664c79edd1c0\": container with ID starting with d18f9a95648798331fdfa0323a47aa45d422f4d48a27ddbf4733664c79edd1c0 not found: ID does not exist"
Mar 10 23:14:38 crc kubenswrapper[4919]: I0310 23:14:38.480436 4919 scope.go:117] "RemoveContainer" containerID="db23846019f61e1fa4301ac7d8406453060a07c19e1dddc6fc1714cf596701e8"
Mar 10 23:14:38 crc kubenswrapper[4919]: E0310 23:14:38.481421 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc"
Mar 10 23:14:39 crc kubenswrapper[4919]: I0310 23:14:39.075761 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0"
Mar 10 23:14:39 crc kubenswrapper[4919]: I0310 23:14:39.075806 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0"
Mar 10 23:14:39 crc kubenswrapper[4919]: I0310 23:14:39.139967 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0"
Mar 10 23:14:39 crc kubenswrapper[4919]: I0310 23:14:39.490630 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="221bd2a2-46fa-44a6-a49d-4d21f0dc396e" path="/var/lib/kubelet/pods/221bd2a2-46fa-44a6-a49d-4d21f0dc396e/volumes"
Mar 10 23:14:39 crc kubenswrapper[4919]: I0310 23:14:39.711055 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0"
Mar 10 23:14:40 crc kubenswrapper[4919]: I0310 23:14:40.706601 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0"
Mar 10 23:14:40 crc kubenswrapper[4919]: I0310 23:14:40.768503 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0"
Mar 10 23:14:40 crc kubenswrapper[4919]: I0310 23:14:40.768562 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0"
Mar 10 23:14:40 crc kubenswrapper[4919]: I0310 23:14:40.858953 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0"
Mar 10 23:14:41 crc kubenswrapper[4919]: I0310 23:14:41.720225 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0"
Mar 10 23:14:46 crc kubenswrapper[4919]: I0310 23:14:46.549270 4919 scope.go:117] "RemoveContainer" containerID="eeca2a7898f2ae11e8e319b59e6ff80256559b0394751354bd8f5e008da20524"
Mar 10 23:14:47 crc kubenswrapper[4919]: I0310 23:14:47.733439 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-fb9p6"]
Mar 10 23:14:47 crc kubenswrapper[4919]: E0310 23:14:47.734006 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="221bd2a2-46fa-44a6-a49d-4d21f0dc396e" containerName="dnsmasq-dns"
Mar 10 23:14:47 crc kubenswrapper[4919]: I0310 23:14:47.734032 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="221bd2a2-46fa-44a6-a49d-4d21f0dc396e" containerName="dnsmasq-dns"
Mar 10 23:14:47 crc kubenswrapper[4919]: E0310 23:14:47.734067 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="221bd2a2-46fa-44a6-a49d-4d21f0dc396e" containerName="init"
Mar 10 23:14:47 crc kubenswrapper[4919]: I0310 23:14:47.734083 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="221bd2a2-46fa-44a6-a49d-4d21f0dc396e" containerName="init"
Mar 10 23:14:47 crc kubenswrapper[4919]: I0310 23:14:47.734491 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="221bd2a2-46fa-44a6-a49d-4d21f0dc396e" containerName="dnsmasq-dns"
Mar 10 23:14:47 crc kubenswrapper[4919]: I0310 23:14:47.735308 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-fb9p6"
Mar 10 23:14:47 crc kubenswrapper[4919]: I0310 23:14:47.749648 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret"
Mar 10 23:14:47 crc kubenswrapper[4919]: I0310 23:14:47.754242 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-fb9p6"]
Mar 10 23:14:47 crc kubenswrapper[4919]: I0310 23:14:47.842775 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-znq2g\" (UniqueName: \"kubernetes.io/projected/48b60e49-8c81-4bc7-86d2-6ba2f0ac2d3f-kube-api-access-znq2g\") pod \"root-account-create-update-fb9p6\" (UID: \"48b60e49-8c81-4bc7-86d2-6ba2f0ac2d3f\") " pod="openstack/root-account-create-update-fb9p6"
Mar 10 23:14:47 crc kubenswrapper[4919]: I0310 23:14:47.843090 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/48b60e49-8c81-4bc7-86d2-6ba2f0ac2d3f-operator-scripts\") pod \"root-account-create-update-fb9p6\" (UID: \"48b60e49-8c81-4bc7-86d2-6ba2f0ac2d3f\") " pod="openstack/root-account-create-update-fb9p6"
Mar 10 23:14:47 crc kubenswrapper[4919]: I0310 23:14:47.944998 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-znq2g\" (UniqueName: \"kubernetes.io/projected/48b60e49-8c81-4bc7-86d2-6ba2f0ac2d3f-kube-api-access-znq2g\") pod \"root-account-create-update-fb9p6\" (UID: \"48b60e49-8c81-4bc7-86d2-6ba2f0ac2d3f\") " pod="openstack/root-account-create-update-fb9p6"
Mar 10 23:14:47 crc kubenswrapper[4919]: I0310 23:14:47.945114 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/48b60e49-8c81-4bc7-86d2-6ba2f0ac2d3f-operator-scripts\") pod \"root-account-create-update-fb9p6\" (UID: \"48b60e49-8c81-4bc7-86d2-6ba2f0ac2d3f\") " pod="openstack/root-account-create-update-fb9p6"
Mar 10 23:14:47 crc kubenswrapper[4919]: I0310 23:14:47.945939 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/48b60e49-8c81-4bc7-86d2-6ba2f0ac2d3f-operator-scripts\") pod \"root-account-create-update-fb9p6\" (UID: \"48b60e49-8c81-4bc7-86d2-6ba2f0ac2d3f\") " pod="openstack/root-account-create-update-fb9p6"
Mar 10 23:14:47 crc kubenswrapper[4919]: I0310 23:14:47.966920 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-znq2g\" (UniqueName: \"kubernetes.io/projected/48b60e49-8c81-4bc7-86d2-6ba2f0ac2d3f-kube-api-access-znq2g\") pod \"root-account-create-update-fb9p6\" (UID: \"48b60e49-8c81-4bc7-86d2-6ba2f0ac2d3f\") " pod="openstack/root-account-create-update-fb9p6"
Mar 10 23:14:48 crc kubenswrapper[4919]: I0310 23:14:48.109965 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-fb9p6"
Mar 10 23:14:48 crc kubenswrapper[4919]: I0310 23:14:48.572813 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-fb9p6"]
Mar 10 23:14:48 crc kubenswrapper[4919]: W0310 23:14:48.581905 4919 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod48b60e49_8c81_4bc7_86d2_6ba2f0ac2d3f.slice/crio-23be3fd993146e10a368d5e19c7f2dfe4749e44cb394ab93f104772a8be742b8 WatchSource:0}: Error finding container 23be3fd993146e10a368d5e19c7f2dfe4749e44cb394ab93f104772a8be742b8: Status 404 returned error can't find the container with id 23be3fd993146e10a368d5e19c7f2dfe4749e44cb394ab93f104772a8be742b8
Mar 10 23:14:48 crc kubenswrapper[4919]: I0310 23:14:48.711065 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-fb9p6" event={"ID":"48b60e49-8c81-4bc7-86d2-6ba2f0ac2d3f","Type":"ContainerStarted","Data":"23be3fd993146e10a368d5e19c7f2dfe4749e44cb394ab93f104772a8be742b8"}
Mar 10 23:14:49 crc kubenswrapper[4919]: I0310 23:14:49.719533 4919 generic.go:334] "Generic (PLEG): container finished" podID="48b60e49-8c81-4bc7-86d2-6ba2f0ac2d3f" containerID="b979cf7db5c663c1ded688e8317bb54c8ece3d27bce5d23f0cee13bd4b69c88a" exitCode=0
Mar 10 23:14:49 crc kubenswrapper[4919]: I0310 23:14:49.719578 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-fb9p6" event={"ID":"48b60e49-8c81-4bc7-86d2-6ba2f0ac2d3f","Type":"ContainerDied","Data":"b979cf7db5c663c1ded688e8317bb54c8ece3d27bce5d23f0cee13bd4b69c88a"}
Mar 10 23:14:51 crc kubenswrapper[4919]: I0310 23:14:51.058195 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-fb9p6"
Mar 10 23:14:51 crc kubenswrapper[4919]: I0310 23:14:51.099926 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-znq2g\" (UniqueName: \"kubernetes.io/projected/48b60e49-8c81-4bc7-86d2-6ba2f0ac2d3f-kube-api-access-znq2g\") pod \"48b60e49-8c81-4bc7-86d2-6ba2f0ac2d3f\" (UID: \"48b60e49-8c81-4bc7-86d2-6ba2f0ac2d3f\") "
Mar 10 23:14:51 crc kubenswrapper[4919]: I0310 23:14:51.100131 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/48b60e49-8c81-4bc7-86d2-6ba2f0ac2d3f-operator-scripts\") pod \"48b60e49-8c81-4bc7-86d2-6ba2f0ac2d3f\" (UID: \"48b60e49-8c81-4bc7-86d2-6ba2f0ac2d3f\") "
Mar 10 23:14:51 crc kubenswrapper[4919]: I0310 23:14:51.100931 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/48b60e49-8c81-4bc7-86d2-6ba2f0ac2d3f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "48b60e49-8c81-4bc7-86d2-6ba2f0ac2d3f" (UID: "48b60e49-8c81-4bc7-86d2-6ba2f0ac2d3f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 23:14:51 crc kubenswrapper[4919]: I0310 23:14:51.106514 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48b60e49-8c81-4bc7-86d2-6ba2f0ac2d3f-kube-api-access-znq2g" (OuterVolumeSpecName: "kube-api-access-znq2g") pod "48b60e49-8c81-4bc7-86d2-6ba2f0ac2d3f" (UID: "48b60e49-8c81-4bc7-86d2-6ba2f0ac2d3f"). InnerVolumeSpecName "kube-api-access-znq2g". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 23:14:51 crc kubenswrapper[4919]: I0310 23:14:51.202594 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-znq2g\" (UniqueName: \"kubernetes.io/projected/48b60e49-8c81-4bc7-86d2-6ba2f0ac2d3f-kube-api-access-znq2g\") on node \"crc\" DevicePath \"\""
Mar 10 23:14:51 crc kubenswrapper[4919]: I0310 23:14:51.202636 4919 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/48b60e49-8c81-4bc7-86d2-6ba2f0ac2d3f-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 10 23:14:51 crc kubenswrapper[4919]: I0310 23:14:51.733915 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-fb9p6" event={"ID":"48b60e49-8c81-4bc7-86d2-6ba2f0ac2d3f","Type":"ContainerDied","Data":"23be3fd993146e10a368d5e19c7f2dfe4749e44cb394ab93f104772a8be742b8"}
Mar 10 23:14:51 crc kubenswrapper[4919]: I0310 23:14:51.733961 4919 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="23be3fd993146e10a368d5e19c7f2dfe4749e44cb394ab93f104772a8be742b8"
Mar 10 23:14:51 crc kubenswrapper[4919]: I0310 23:14:51.734021 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-fb9p6"
Mar 10 23:14:52 crc kubenswrapper[4919]: I0310 23:14:52.481009 4919 scope.go:117] "RemoveContainer" containerID="db23846019f61e1fa4301ac7d8406453060a07c19e1dddc6fc1714cf596701e8"
Mar 10 23:14:52 crc kubenswrapper[4919]: E0310 23:14:52.481567 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc"
Mar 10 23:14:54 crc kubenswrapper[4919]: I0310 23:14:54.151470 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-fb9p6"]
Mar 10 23:14:54 crc kubenswrapper[4919]: I0310 23:14:54.159050 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-fb9p6"]
Mar 10 23:14:55 crc kubenswrapper[4919]: I0310 23:14:55.490456 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48b60e49-8c81-4bc7-86d2-6ba2f0ac2d3f" path="/var/lib/kubelet/pods/48b60e49-8c81-4bc7-86d2-6ba2f0ac2d3f/volumes"
Mar 10 23:14:59 crc kubenswrapper[4919]: I0310 23:14:59.162162 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-msh6h"]
Mar 10 23:14:59 crc kubenswrapper[4919]: E0310 23:14:59.162667 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48b60e49-8c81-4bc7-86d2-6ba2f0ac2d3f" containerName="mariadb-account-create-update"
Mar 10 23:14:59 crc kubenswrapper[4919]: I0310 23:14:59.162691 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="48b60e49-8c81-4bc7-86d2-6ba2f0ac2d3f" containerName="mariadb-account-create-update"
Mar 10 23:14:59 crc kubenswrapper[4919]: I0310 23:14:59.162928 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="48b60e49-8c81-4bc7-86d2-6ba2f0ac2d3f" containerName="mariadb-account-create-update"
Mar 10 23:14:59 crc kubenswrapper[4919]: I0310 23:14:59.163634 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-msh6h"
Mar 10 23:14:59 crc kubenswrapper[4919]: I0310 23:14:59.168190 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret"
Mar 10 23:14:59 crc kubenswrapper[4919]: I0310 23:14:59.171612 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-msh6h"]
Mar 10 23:14:59 crc kubenswrapper[4919]: I0310 23:14:59.227478 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a7d07324-1300-42ef-a68c-2dc6f9548352-operator-scripts\") pod \"root-account-create-update-msh6h\" (UID: \"a7d07324-1300-42ef-a68c-2dc6f9548352\") " pod="openstack/root-account-create-update-msh6h"
Mar 10 23:14:59 crc kubenswrapper[4919]: I0310 23:14:59.227527 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7njcd\" (UniqueName: \"kubernetes.io/projected/a7d07324-1300-42ef-a68c-2dc6f9548352-kube-api-access-7njcd\") pod \"root-account-create-update-msh6h\" (UID: \"a7d07324-1300-42ef-a68c-2dc6f9548352\") " pod="openstack/root-account-create-update-msh6h"
Mar 10 23:14:59 crc kubenswrapper[4919]: I0310 23:14:59.328855 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a7d07324-1300-42ef-a68c-2dc6f9548352-operator-scripts\") pod \"root-account-create-update-msh6h\" (UID: \"a7d07324-1300-42ef-a68c-2dc6f9548352\") " pod="openstack/root-account-create-update-msh6h"
Mar 10 23:14:59 crc kubenswrapper[4919]: I0310 23:14:59.328904 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7njcd\" (UniqueName: \"kubernetes.io/projected/a7d07324-1300-42ef-a68c-2dc6f9548352-kube-api-access-7njcd\") pod \"root-account-create-update-msh6h\" (UID: \"a7d07324-1300-42ef-a68c-2dc6f9548352\") " pod="openstack/root-account-create-update-msh6h"
Mar 10 23:14:59 crc kubenswrapper[4919]: I0310 23:14:59.329785 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a7d07324-1300-42ef-a68c-2dc6f9548352-operator-scripts\") pod \"root-account-create-update-msh6h\" (UID: \"a7d07324-1300-42ef-a68c-2dc6f9548352\") " pod="openstack/root-account-create-update-msh6h"
Mar 10 23:14:59 crc kubenswrapper[4919]: I0310 23:14:59.348780 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7njcd\" (UniqueName: \"kubernetes.io/projected/a7d07324-1300-42ef-a68c-2dc6f9548352-kube-api-access-7njcd\") pod \"root-account-create-update-msh6h\" (UID: \"a7d07324-1300-42ef-a68c-2dc6f9548352\") " pod="openstack/root-account-create-update-msh6h"
Mar 10 23:14:59 crc kubenswrapper[4919]: I0310 23:14:59.508655 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-msh6h"
Mar 10 23:14:59 crc kubenswrapper[4919]: I0310 23:14:59.771298 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-msh6h"]
Mar 10 23:14:59 crc kubenswrapper[4919]: I0310 23:14:59.814856 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-msh6h" event={"ID":"a7d07324-1300-42ef-a68c-2dc6f9548352","Type":"ContainerStarted","Data":"c2144d70843c2be8dd6ea7b5366f8b0ca4b9e63cbb7d7df09d3f54f1e29e1903"}
Mar 10 23:15:00 crc kubenswrapper[4919]: I0310 23:15:00.132059 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29553075-25cgz"]
Mar 10 23:15:00 crc kubenswrapper[4919]: I0310 23:15:00.133179 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29553075-25cgz"
Mar 10 23:15:00 crc kubenswrapper[4919]: I0310 23:15:00.135377 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Mar 10 23:15:00 crc kubenswrapper[4919]: I0310 23:15:00.135806 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Mar 10 23:15:00 crc kubenswrapper[4919]: I0310 23:15:00.144284 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29553075-25cgz"]
Mar 10 23:15:00 crc kubenswrapper[4919]: I0310 23:15:00.251160 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2aa53580-88ef-455a-b1a6-84c3d116dc0c-secret-volume\") pod \"collect-profiles-29553075-25cgz\" (UID: \"2aa53580-88ef-455a-b1a6-84c3d116dc0c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553075-25cgz"
Mar 10 23:15:00 crc kubenswrapper[4919]: I0310 23:15:00.251595 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98hdj\" (UniqueName: \"kubernetes.io/projected/2aa53580-88ef-455a-b1a6-84c3d116dc0c-kube-api-access-98hdj\") pod \"collect-profiles-29553075-25cgz\" (UID: \"2aa53580-88ef-455a-b1a6-84c3d116dc0c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553075-25cgz"
Mar 10 23:15:00 crc kubenswrapper[4919]: I0310 23:15:00.251634 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2aa53580-88ef-455a-b1a6-84c3d116dc0c-config-volume\") pod \"collect-profiles-29553075-25cgz\" (UID: \"2aa53580-88ef-455a-b1a6-84c3d116dc0c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553075-25cgz"
Mar 10 23:15:00 crc kubenswrapper[4919]: I0310 23:15:00.353167 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2aa53580-88ef-455a-b1a6-84c3d116dc0c-secret-volume\") pod \"collect-profiles-29553075-25cgz\" (UID: \"2aa53580-88ef-455a-b1a6-84c3d116dc0c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553075-25cgz"
Mar 10 23:15:00 crc kubenswrapper[4919]: I0310 23:15:00.353235 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-98hdj\" (UniqueName: \"kubernetes.io/projected/2aa53580-88ef-455a-b1a6-84c3d116dc0c-kube-api-access-98hdj\") pod \"collect-profiles-29553075-25cgz\" (UID: \"2aa53580-88ef-455a-b1a6-84c3d116dc0c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553075-25cgz"
Mar 10 23:15:00 crc kubenswrapper[4919]: I0310 23:15:00.353287 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2aa53580-88ef-455a-b1a6-84c3d116dc0c-config-volume\") pod \"collect-profiles-29553075-25cgz\" (UID: \"2aa53580-88ef-455a-b1a6-84c3d116dc0c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553075-25cgz"
Mar 10 23:15:00 crc kubenswrapper[4919]: I0310 23:15:00.354223 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2aa53580-88ef-455a-b1a6-84c3d116dc0c-config-volume\") pod \"collect-profiles-29553075-25cgz\" (UID: \"2aa53580-88ef-455a-b1a6-84c3d116dc0c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553075-25cgz"
Mar 10 23:15:00 crc kubenswrapper[4919]: I0310 23:15:00.365950 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2aa53580-88ef-455a-b1a6-84c3d116dc0c-secret-volume\") pod \"collect-profiles-29553075-25cgz\" (UID: \"2aa53580-88ef-455a-b1a6-84c3d116dc0c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553075-25cgz"
Mar 10 23:15:00 crc kubenswrapper[4919]: I0310 23:15:00.368179 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-98hdj\" (UniqueName: \"kubernetes.io/projected/2aa53580-88ef-455a-b1a6-84c3d116dc0c-kube-api-access-98hdj\") pod \"collect-profiles-29553075-25cgz\" (UID: \"2aa53580-88ef-455a-b1a6-84c3d116dc0c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553075-25cgz"
Mar 10 23:15:00 crc kubenswrapper[4919]: I0310 23:15:00.497227 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29553075-25cgz"
Mar 10 23:15:00 crc kubenswrapper[4919]: I0310 23:15:00.823832 4919 generic.go:334] "Generic (PLEG): container finished" podID="a7d07324-1300-42ef-a68c-2dc6f9548352" containerID="d1e058a8a51f6561649c6d390051f8caceeb232836bb3428b88feb597fdd6fa2" exitCode=0
Mar 10 23:15:00 crc kubenswrapper[4919]: I0310 23:15:00.823880 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-msh6h" event={"ID":"a7d07324-1300-42ef-a68c-2dc6f9548352","Type":"ContainerDied","Data":"d1e058a8a51f6561649c6d390051f8caceeb232836bb3428b88feb597fdd6fa2"}
Mar 10 23:15:00 crc kubenswrapper[4919]: W0310 23:15:00.917363 4919 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2aa53580_88ef_455a_b1a6_84c3d116dc0c.slice/crio-1b5174f5cfbc515f84743885ee893a669071917f63c14340dd08964e3d04230c WatchSource:0}: Error finding container 1b5174f5cfbc515f84743885ee893a669071917f63c14340dd08964e3d04230c: Status 404 returned error can't find the container with id 1b5174f5cfbc515f84743885ee893a669071917f63c14340dd08964e3d04230c
Mar 10 23:15:00 crc kubenswrapper[4919]: I0310 23:15:00.923300 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29553075-25cgz"]
Mar 10 23:15:01 crc kubenswrapper[4919]: I0310 23:15:01.832169 4919 generic.go:334] "Generic (PLEG): container finished" podID="d5b3447d-7136-4c2b-bd66-18e26e7a157e" containerID="a1feafe96622f2d3069b55ef120e7457a8ff0ed1826d9b9f66073436ca33017d" exitCode=0
Mar 10 23:15:01 crc kubenswrapper[4919]: I0310 23:15:01.832254 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"d5b3447d-7136-4c2b-bd66-18e26e7a157e","Type":"ContainerDied","Data":"a1feafe96622f2d3069b55ef120e7457a8ff0ed1826d9b9f66073436ca33017d"}
Mar 10 23:15:01 crc kubenswrapper[4919]: I0310 23:15:01.834421 4919 generic.go:334] "Generic (PLEG): container finished" podID="08f1d385-c9ed-4616-b201-0234049fa538" containerID="eaa852c88b5ceb7e4121d56c05b283d9ddc1e39df0da4a8a863933118c3ac2aa" exitCode=0
Mar 10 23:15:01 crc kubenswrapper[4919]: I0310 23:15:01.834510 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"08f1d385-c9ed-4616-b201-0234049fa538","Type":"ContainerDied","Data":"eaa852c88b5ceb7e4121d56c05b283d9ddc1e39df0da4a8a863933118c3ac2aa"}
Mar 10 23:15:01 crc kubenswrapper[4919]: I0310 23:15:01.836863 4919 generic.go:334] "Generic (PLEG): container finished" podID="2aa53580-88ef-455a-b1a6-84c3d116dc0c" containerID="16dc7142f8bbf9ae9e84c5c3a09c5a6ab46fff61fa155d1baaf6eb396276d6c0" exitCode=0
Mar 10 23:15:01 crc kubenswrapper[4919]: I0310 23:15:01.836939 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29553075-25cgz" event={"ID":"2aa53580-88ef-455a-b1a6-84c3d116dc0c","Type":"ContainerDied","Data":"16dc7142f8bbf9ae9e84c5c3a09c5a6ab46fff61fa155d1baaf6eb396276d6c0"}
Mar 10 23:15:01 crc kubenswrapper[4919]: I0310 23:15:01.836962 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29553075-25cgz" event={"ID":"2aa53580-88ef-455a-b1a6-84c3d116dc0c","Type":"ContainerStarted","Data":"1b5174f5cfbc515f84743885ee893a669071917f63c14340dd08964e3d04230c"}
Mar 10 23:15:02 crc kubenswrapper[4919]: I0310 23:15:02.107766 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-msh6h"
Mar 10 23:15:02 crc kubenswrapper[4919]: I0310 23:15:02.288464 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a7d07324-1300-42ef-a68c-2dc6f9548352-operator-scripts\") pod \"a7d07324-1300-42ef-a68c-2dc6f9548352\" (UID: \"a7d07324-1300-42ef-a68c-2dc6f9548352\") "
Mar 10 23:15:02 crc kubenswrapper[4919]: I0310 23:15:02.288549 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7njcd\" (UniqueName: \"kubernetes.io/projected/a7d07324-1300-42ef-a68c-2dc6f9548352-kube-api-access-7njcd\") pod \"a7d07324-1300-42ef-a68c-2dc6f9548352\" (UID: \"a7d07324-1300-42ef-a68c-2dc6f9548352\") "
Mar 10 23:15:02 crc kubenswrapper[4919]: I0310 23:15:02.290230 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a7d07324-1300-42ef-a68c-2dc6f9548352-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a7d07324-1300-42ef-a68c-2dc6f9548352" (UID: "a7d07324-1300-42ef-a68c-2dc6f9548352"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 23:15:02 crc kubenswrapper[4919]: I0310 23:15:02.297731 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7d07324-1300-42ef-a68c-2dc6f9548352-kube-api-access-7njcd" (OuterVolumeSpecName: "kube-api-access-7njcd") pod "a7d07324-1300-42ef-a68c-2dc6f9548352" (UID: "a7d07324-1300-42ef-a68c-2dc6f9548352"). InnerVolumeSpecName "kube-api-access-7njcd".
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 23:15:02 crc kubenswrapper[4919]: I0310 23:15:02.390989 4919 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a7d07324-1300-42ef-a68c-2dc6f9548352-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 23:15:02 crc kubenswrapper[4919]: I0310 23:15:02.391033 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7njcd\" (UniqueName: \"kubernetes.io/projected/a7d07324-1300-42ef-a68c-2dc6f9548352-kube-api-access-7njcd\") on node \"crc\" DevicePath \"\"" Mar 10 23:15:02 crc kubenswrapper[4919]: I0310 23:15:02.844739 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"d5b3447d-7136-4c2b-bd66-18e26e7a157e","Type":"ContainerStarted","Data":"3e8c3962918fab87a19b7dfa8b706cbb5969c613c7ff3b9959efd5b1a0af6dc2"} Mar 10 23:15:02 crc kubenswrapper[4919]: I0310 23:15:02.844985 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 10 23:15:02 crc kubenswrapper[4919]: I0310 23:15:02.847035 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"08f1d385-c9ed-4616-b201-0234049fa538","Type":"ContainerStarted","Data":"b3a42f443ccc31c38bda0a67e2822484646b8bd333314001d0e9bb17c11248fb"} Mar 10 23:15:02 crc kubenswrapper[4919]: I0310 23:15:02.847509 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 10 23:15:02 crc kubenswrapper[4919]: I0310 23:15:02.849443 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-msh6h" event={"ID":"a7d07324-1300-42ef-a68c-2dc6f9548352","Type":"ContainerDied","Data":"c2144d70843c2be8dd6ea7b5366f8b0ca4b9e63cbb7d7df09d3f54f1e29e1903"} Mar 10 23:15:02 crc kubenswrapper[4919]: I0310 23:15:02.849548 4919 pod_container_deletor.go:80] "Container not found in 
pod's containers" containerID="c2144d70843c2be8dd6ea7b5366f8b0ca4b9e63cbb7d7df09d3f54f1e29e1903" Mar 10 23:15:02 crc kubenswrapper[4919]: I0310 23:15:02.849486 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-msh6h" Mar 10 23:15:02 crc kubenswrapper[4919]: I0310 23:15:02.880792 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=36.880771125 podStartE2EDuration="36.880771125s" podCreationTimestamp="2026-03-10 23:14:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 23:15:02.879128149 +0000 UTC m=+5090.121008787" watchObservedRunningTime="2026-03-10 23:15:02.880771125 +0000 UTC m=+5090.122651733" Mar 10 23:15:02 crc kubenswrapper[4919]: I0310 23:15:02.913138 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=36.913123686 podStartE2EDuration="36.913123686s" podCreationTimestamp="2026-03-10 23:14:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 23:15:02.909564729 +0000 UTC m=+5090.151445357" watchObservedRunningTime="2026-03-10 23:15:02.913123686 +0000 UTC m=+5090.155004294" Mar 10 23:15:03 crc kubenswrapper[4919]: I0310 23:15:03.186245 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29553075-25cgz" Mar 10 23:15:03 crc kubenswrapper[4919]: I0310 23:15:03.304234 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2aa53580-88ef-455a-b1a6-84c3d116dc0c-secret-volume\") pod \"2aa53580-88ef-455a-b1a6-84c3d116dc0c\" (UID: \"2aa53580-88ef-455a-b1a6-84c3d116dc0c\") " Mar 10 23:15:03 crc kubenswrapper[4919]: I0310 23:15:03.304451 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2aa53580-88ef-455a-b1a6-84c3d116dc0c-config-volume\") pod \"2aa53580-88ef-455a-b1a6-84c3d116dc0c\" (UID: \"2aa53580-88ef-455a-b1a6-84c3d116dc0c\") " Mar 10 23:15:03 crc kubenswrapper[4919]: I0310 23:15:03.304478 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-98hdj\" (UniqueName: \"kubernetes.io/projected/2aa53580-88ef-455a-b1a6-84c3d116dc0c-kube-api-access-98hdj\") pod \"2aa53580-88ef-455a-b1a6-84c3d116dc0c\" (UID: \"2aa53580-88ef-455a-b1a6-84c3d116dc0c\") " Mar 10 23:15:03 crc kubenswrapper[4919]: I0310 23:15:03.305030 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2aa53580-88ef-455a-b1a6-84c3d116dc0c-config-volume" (OuterVolumeSpecName: "config-volume") pod "2aa53580-88ef-455a-b1a6-84c3d116dc0c" (UID: "2aa53580-88ef-455a-b1a6-84c3d116dc0c"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 23:15:03 crc kubenswrapper[4919]: I0310 23:15:03.308883 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2aa53580-88ef-455a-b1a6-84c3d116dc0c-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "2aa53580-88ef-455a-b1a6-84c3d116dc0c" (UID: "2aa53580-88ef-455a-b1a6-84c3d116dc0c"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 23:15:03 crc kubenswrapper[4919]: I0310 23:15:03.309200 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2aa53580-88ef-455a-b1a6-84c3d116dc0c-kube-api-access-98hdj" (OuterVolumeSpecName: "kube-api-access-98hdj") pod "2aa53580-88ef-455a-b1a6-84c3d116dc0c" (UID: "2aa53580-88ef-455a-b1a6-84c3d116dc0c"). InnerVolumeSpecName "kube-api-access-98hdj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 23:15:03 crc kubenswrapper[4919]: I0310 23:15:03.406478 4919 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2aa53580-88ef-455a-b1a6-84c3d116dc0c-config-volume\") on node \"crc\" DevicePath \"\"" Mar 10 23:15:03 crc kubenswrapper[4919]: I0310 23:15:03.406510 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-98hdj\" (UniqueName: \"kubernetes.io/projected/2aa53580-88ef-455a-b1a6-84c3d116dc0c-kube-api-access-98hdj\") on node \"crc\" DevicePath \"\"" Mar 10 23:15:03 crc kubenswrapper[4919]: I0310 23:15:03.406528 4919 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2aa53580-88ef-455a-b1a6-84c3d116dc0c-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 10 23:15:03 crc kubenswrapper[4919]: I0310 23:15:03.859597 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29553075-25cgz" event={"ID":"2aa53580-88ef-455a-b1a6-84c3d116dc0c","Type":"ContainerDied","Data":"1b5174f5cfbc515f84743885ee893a669071917f63c14340dd08964e3d04230c"} Mar 10 23:15:03 crc kubenswrapper[4919]: I0310 23:15:03.859925 4919 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1b5174f5cfbc515f84743885ee893a669071917f63c14340dd08964e3d04230c" Mar 10 23:15:03 crc kubenswrapper[4919]: I0310 23:15:03.860003 4919 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29553075-25cgz" Mar 10 23:15:04 crc kubenswrapper[4919]: I0310 23:15:04.263228 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29553030-z5tsh"] Mar 10 23:15:04 crc kubenswrapper[4919]: I0310 23:15:04.268858 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29553030-z5tsh"] Mar 10 23:15:05 crc kubenswrapper[4919]: I0310 23:15:05.490245 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd0768ea-0564-4f66-8103-00f1652bab8e" path="/var/lib/kubelet/pods/dd0768ea-0564-4f66-8103-00f1652bab8e/volumes" Mar 10 23:15:07 crc kubenswrapper[4919]: I0310 23:15:07.481115 4919 scope.go:117] "RemoveContainer" containerID="db23846019f61e1fa4301ac7d8406453060a07c19e1dddc6fc1714cf596701e8" Mar 10 23:15:07 crc kubenswrapper[4919]: E0310 23:15:07.482903 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" Mar 10 23:15:17 crc kubenswrapper[4919]: I0310 23:15:17.663686 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Mar 10 23:15:17 crc kubenswrapper[4919]: I0310 23:15:17.924689 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Mar 10 23:15:21 crc kubenswrapper[4919]: I0310 23:15:21.480256 4919 scope.go:117] "RemoveContainer" containerID="db23846019f61e1fa4301ac7d8406453060a07c19e1dddc6fc1714cf596701e8" Mar 10 23:15:21 crc kubenswrapper[4919]: E0310 
23:15:21.480908 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" Mar 10 23:15:22 crc kubenswrapper[4919]: I0310 23:15:22.866997 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-66d5bf7c87-q568m"] Mar 10 23:15:22 crc kubenswrapper[4919]: E0310 23:15:22.867722 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7d07324-1300-42ef-a68c-2dc6f9548352" containerName="mariadb-account-create-update" Mar 10 23:15:22 crc kubenswrapper[4919]: I0310 23:15:22.867740 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7d07324-1300-42ef-a68c-2dc6f9548352" containerName="mariadb-account-create-update" Mar 10 23:15:22 crc kubenswrapper[4919]: E0310 23:15:22.867761 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2aa53580-88ef-455a-b1a6-84c3d116dc0c" containerName="collect-profiles" Mar 10 23:15:22 crc kubenswrapper[4919]: I0310 23:15:22.867770 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="2aa53580-88ef-455a-b1a6-84c3d116dc0c" containerName="collect-profiles" Mar 10 23:15:22 crc kubenswrapper[4919]: I0310 23:15:22.867933 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7d07324-1300-42ef-a68c-2dc6f9548352" containerName="mariadb-account-create-update" Mar 10 23:15:22 crc kubenswrapper[4919]: I0310 23:15:22.867948 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="2aa53580-88ef-455a-b1a6-84c3d116dc0c" containerName="collect-profiles" Mar 10 23:15:22 crc kubenswrapper[4919]: I0310 23:15:22.868973 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-66d5bf7c87-q568m" Mar 10 23:15:22 crc kubenswrapper[4919]: I0310 23:15:22.882748 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-66d5bf7c87-q568m"] Mar 10 23:15:22 crc kubenswrapper[4919]: I0310 23:15:22.970090 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6b5a9723-b287-46d6-a9c7-136a560e3e38-dns-svc\") pod \"dnsmasq-dns-66d5bf7c87-q568m\" (UID: \"6b5a9723-b287-46d6-a9c7-136a560e3e38\") " pod="openstack/dnsmasq-dns-66d5bf7c87-q568m" Mar 10 23:15:22 crc kubenswrapper[4919]: I0310 23:15:22.970200 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5n4g7\" (UniqueName: \"kubernetes.io/projected/6b5a9723-b287-46d6-a9c7-136a560e3e38-kube-api-access-5n4g7\") pod \"dnsmasq-dns-66d5bf7c87-q568m\" (UID: \"6b5a9723-b287-46d6-a9c7-136a560e3e38\") " pod="openstack/dnsmasq-dns-66d5bf7c87-q568m" Mar 10 23:15:22 crc kubenswrapper[4919]: I0310 23:15:22.970284 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6b5a9723-b287-46d6-a9c7-136a560e3e38-config\") pod \"dnsmasq-dns-66d5bf7c87-q568m\" (UID: \"6b5a9723-b287-46d6-a9c7-136a560e3e38\") " pod="openstack/dnsmasq-dns-66d5bf7c87-q568m" Mar 10 23:15:23 crc kubenswrapper[4919]: I0310 23:15:23.071553 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5n4g7\" (UniqueName: \"kubernetes.io/projected/6b5a9723-b287-46d6-a9c7-136a560e3e38-kube-api-access-5n4g7\") pod \"dnsmasq-dns-66d5bf7c87-q568m\" (UID: \"6b5a9723-b287-46d6-a9c7-136a560e3e38\") " pod="openstack/dnsmasq-dns-66d5bf7c87-q568m" Mar 10 23:15:23 crc kubenswrapper[4919]: I0310 23:15:23.071668 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/6b5a9723-b287-46d6-a9c7-136a560e3e38-config\") pod \"dnsmasq-dns-66d5bf7c87-q568m\" (UID: \"6b5a9723-b287-46d6-a9c7-136a560e3e38\") " pod="openstack/dnsmasq-dns-66d5bf7c87-q568m" Mar 10 23:15:23 crc kubenswrapper[4919]: I0310 23:15:23.071719 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6b5a9723-b287-46d6-a9c7-136a560e3e38-dns-svc\") pod \"dnsmasq-dns-66d5bf7c87-q568m\" (UID: \"6b5a9723-b287-46d6-a9c7-136a560e3e38\") " pod="openstack/dnsmasq-dns-66d5bf7c87-q568m" Mar 10 23:15:23 crc kubenswrapper[4919]: I0310 23:15:23.072554 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6b5a9723-b287-46d6-a9c7-136a560e3e38-config\") pod \"dnsmasq-dns-66d5bf7c87-q568m\" (UID: \"6b5a9723-b287-46d6-a9c7-136a560e3e38\") " pod="openstack/dnsmasq-dns-66d5bf7c87-q568m" Mar 10 23:15:23 crc kubenswrapper[4919]: I0310 23:15:23.072793 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6b5a9723-b287-46d6-a9c7-136a560e3e38-dns-svc\") pod \"dnsmasq-dns-66d5bf7c87-q568m\" (UID: \"6b5a9723-b287-46d6-a9c7-136a560e3e38\") " pod="openstack/dnsmasq-dns-66d5bf7c87-q568m" Mar 10 23:15:23 crc kubenswrapper[4919]: I0310 23:15:23.088242 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5n4g7\" (UniqueName: \"kubernetes.io/projected/6b5a9723-b287-46d6-a9c7-136a560e3e38-kube-api-access-5n4g7\") pod \"dnsmasq-dns-66d5bf7c87-q568m\" (UID: \"6b5a9723-b287-46d6-a9c7-136a560e3e38\") " pod="openstack/dnsmasq-dns-66d5bf7c87-q568m" Mar 10 23:15:23 crc kubenswrapper[4919]: I0310 23:15:23.188152 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-66d5bf7c87-q568m" Mar 10 23:15:23 crc kubenswrapper[4919]: I0310 23:15:23.678022 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-66d5bf7c87-q568m"] Mar 10 23:15:23 crc kubenswrapper[4919]: W0310 23:15:23.680887 4919 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6b5a9723_b287_46d6_a9c7_136a560e3e38.slice/crio-b2c7dbef18d4d750dc9278d95da8c47f1e853e4391aa894d24c50cc68e8f5983 WatchSource:0}: Error finding container b2c7dbef18d4d750dc9278d95da8c47f1e853e4391aa894d24c50cc68e8f5983: Status 404 returned error can't find the container with id b2c7dbef18d4d750dc9278d95da8c47f1e853e4391aa894d24c50cc68e8f5983 Mar 10 23:15:24 crc kubenswrapper[4919]: I0310 23:15:24.015540 4919 generic.go:334] "Generic (PLEG): container finished" podID="6b5a9723-b287-46d6-a9c7-136a560e3e38" containerID="da60792319079583978efb78a311173cf6aecb6382bc0bc852b6215e877fdfaa" exitCode=0 Mar 10 23:15:24 crc kubenswrapper[4919]: I0310 23:15:24.015645 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66d5bf7c87-q568m" event={"ID":"6b5a9723-b287-46d6-a9c7-136a560e3e38","Type":"ContainerDied","Data":"da60792319079583978efb78a311173cf6aecb6382bc0bc852b6215e877fdfaa"} Mar 10 23:15:24 crc kubenswrapper[4919]: I0310 23:15:24.016005 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66d5bf7c87-q568m" event={"ID":"6b5a9723-b287-46d6-a9c7-136a560e3e38","Type":"ContainerStarted","Data":"b2c7dbef18d4d750dc9278d95da8c47f1e853e4391aa894d24c50cc68e8f5983"} Mar 10 23:15:24 crc kubenswrapper[4919]: I0310 23:15:24.340802 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 10 23:15:24 crc kubenswrapper[4919]: I0310 23:15:24.423808 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 10 23:15:25 crc 
kubenswrapper[4919]: I0310 23:15:25.024542 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66d5bf7c87-q568m" event={"ID":"6b5a9723-b287-46d6-a9c7-136a560e3e38","Type":"ContainerStarted","Data":"39ee255a5c86d773bbe6f095ea98eb731d0733b9d9f784153a66400205082b40"} Mar 10 23:15:25 crc kubenswrapper[4919]: I0310 23:15:25.024657 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-66d5bf7c87-q568m" Mar 10 23:15:25 crc kubenswrapper[4919]: I0310 23:15:25.057433 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-66d5bf7c87-q568m" podStartSLOduration=3.057418675 podStartE2EDuration="3.057418675s" podCreationTimestamp="2026-03-10 23:15:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 23:15:25.05499358 +0000 UTC m=+5112.296874188" watchObservedRunningTime="2026-03-10 23:15:25.057418675 +0000 UTC m=+5112.299299283" Mar 10 23:15:28 crc kubenswrapper[4919]: I0310 23:15:28.284181 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="08f1d385-c9ed-4616-b201-0234049fa538" containerName="rabbitmq" containerID="cri-o://b3a42f443ccc31c38bda0a67e2822484646b8bd333314001d0e9bb17c11248fb" gracePeriod=604797 Mar 10 23:15:28 crc kubenswrapper[4919]: I0310 23:15:28.385528 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="d5b3447d-7136-4c2b-bd66-18e26e7a157e" containerName="rabbitmq" containerID="cri-o://3e8c3962918fab87a19b7dfa8b706cbb5969c613c7ff3b9959efd5b1a0af6dc2" gracePeriod=604796 Mar 10 23:15:33 crc kubenswrapper[4919]: I0310 23:15:33.190818 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-66d5bf7c87-q568m" Mar 10 23:15:33 crc kubenswrapper[4919]: I0310 23:15:33.285263 4919 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-ff89b6977-mvfqs"] Mar 10 23:15:33 crc kubenswrapper[4919]: I0310 23:15:33.285631 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-ff89b6977-mvfqs" podUID="9080a849-0d02-42dd-9cbb-4b6f29055ad1" containerName="dnsmasq-dns" containerID="cri-o://157884d0fa01ebc69e5dfadc4a2076242cf67ed9b4830956c81f5dfdea466059" gracePeriod=10 Mar 10 23:15:33 crc kubenswrapper[4919]: I0310 23:15:33.753656 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-ff89b6977-mvfqs" Mar 10 23:15:33 crc kubenswrapper[4919]: I0310 23:15:33.848955 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8pgl4\" (UniqueName: \"kubernetes.io/projected/9080a849-0d02-42dd-9cbb-4b6f29055ad1-kube-api-access-8pgl4\") pod \"9080a849-0d02-42dd-9cbb-4b6f29055ad1\" (UID: \"9080a849-0d02-42dd-9cbb-4b6f29055ad1\") " Mar 10 23:15:33 crc kubenswrapper[4919]: I0310 23:15:33.849041 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9080a849-0d02-42dd-9cbb-4b6f29055ad1-config\") pod \"9080a849-0d02-42dd-9cbb-4b6f29055ad1\" (UID: \"9080a849-0d02-42dd-9cbb-4b6f29055ad1\") " Mar 10 23:15:33 crc kubenswrapper[4919]: I0310 23:15:33.849144 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9080a849-0d02-42dd-9cbb-4b6f29055ad1-dns-svc\") pod \"9080a849-0d02-42dd-9cbb-4b6f29055ad1\" (UID: \"9080a849-0d02-42dd-9cbb-4b6f29055ad1\") " Mar 10 23:15:33 crc kubenswrapper[4919]: I0310 23:15:33.865822 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9080a849-0d02-42dd-9cbb-4b6f29055ad1-kube-api-access-8pgl4" (OuterVolumeSpecName: "kube-api-access-8pgl4") pod 
"9080a849-0d02-42dd-9cbb-4b6f29055ad1" (UID: "9080a849-0d02-42dd-9cbb-4b6f29055ad1"). InnerVolumeSpecName "kube-api-access-8pgl4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 23:15:33 crc kubenswrapper[4919]: I0310 23:15:33.881117 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9080a849-0d02-42dd-9cbb-4b6f29055ad1-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9080a849-0d02-42dd-9cbb-4b6f29055ad1" (UID: "9080a849-0d02-42dd-9cbb-4b6f29055ad1"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 23:15:33 crc kubenswrapper[4919]: I0310 23:15:33.887886 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9080a849-0d02-42dd-9cbb-4b6f29055ad1-config" (OuterVolumeSpecName: "config") pod "9080a849-0d02-42dd-9cbb-4b6f29055ad1" (UID: "9080a849-0d02-42dd-9cbb-4b6f29055ad1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 23:15:33 crc kubenswrapper[4919]: I0310 23:15:33.951659 4919 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9080a849-0d02-42dd-9cbb-4b6f29055ad1-config\") on node \"crc\" DevicePath \"\"" Mar 10 23:15:33 crc kubenswrapper[4919]: I0310 23:15:33.951693 4919 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9080a849-0d02-42dd-9cbb-4b6f29055ad1-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 10 23:15:33 crc kubenswrapper[4919]: I0310 23:15:33.951706 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8pgl4\" (UniqueName: \"kubernetes.io/projected/9080a849-0d02-42dd-9cbb-4b6f29055ad1-kube-api-access-8pgl4\") on node \"crc\" DevicePath \"\"" Mar 10 23:15:34 crc kubenswrapper[4919]: I0310 23:15:34.100185 4919 generic.go:334] "Generic (PLEG): container finished" podID="9080a849-0d02-42dd-9cbb-4b6f29055ad1" 
containerID="157884d0fa01ebc69e5dfadc4a2076242cf67ed9b4830956c81f5dfdea466059" exitCode=0 Mar 10 23:15:34 crc kubenswrapper[4919]: I0310 23:15:34.100262 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-ff89b6977-mvfqs" event={"ID":"9080a849-0d02-42dd-9cbb-4b6f29055ad1","Type":"ContainerDied","Data":"157884d0fa01ebc69e5dfadc4a2076242cf67ed9b4830956c81f5dfdea466059"} Mar 10 23:15:34 crc kubenswrapper[4919]: I0310 23:15:34.100302 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-ff89b6977-mvfqs" event={"ID":"9080a849-0d02-42dd-9cbb-4b6f29055ad1","Type":"ContainerDied","Data":"d09f4b7fe63949c0b307805dbf20814e4f86b625fe43a57cdc6a98e520699048"} Mar 10 23:15:34 crc kubenswrapper[4919]: I0310 23:15:34.100331 4919 scope.go:117] "RemoveContainer" containerID="157884d0fa01ebc69e5dfadc4a2076242cf67ed9b4830956c81f5dfdea466059" Mar 10 23:15:34 crc kubenswrapper[4919]: I0310 23:15:34.101432 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-ff89b6977-mvfqs" Mar 10 23:15:34 crc kubenswrapper[4919]: I0310 23:15:34.134797 4919 scope.go:117] "RemoveContainer" containerID="74e3528ae26a8e0d9e22f98e0ed4fdfd2a6575916ef3986e352a8d4fc56ab434" Mar 10 23:15:34 crc kubenswrapper[4919]: I0310 23:15:34.164529 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-ff89b6977-mvfqs"] Mar 10 23:15:34 crc kubenswrapper[4919]: I0310 23:15:34.178612 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-ff89b6977-mvfqs"] Mar 10 23:15:34 crc kubenswrapper[4919]: I0310 23:15:34.179910 4919 scope.go:117] "RemoveContainer" containerID="157884d0fa01ebc69e5dfadc4a2076242cf67ed9b4830956c81f5dfdea466059" Mar 10 23:15:34 crc kubenswrapper[4919]: E0310 23:15:34.180480 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"157884d0fa01ebc69e5dfadc4a2076242cf67ed9b4830956c81f5dfdea466059\": container with ID starting with 157884d0fa01ebc69e5dfadc4a2076242cf67ed9b4830956c81f5dfdea466059 not found: ID does not exist" containerID="157884d0fa01ebc69e5dfadc4a2076242cf67ed9b4830956c81f5dfdea466059" Mar 10 23:15:34 crc kubenswrapper[4919]: I0310 23:15:34.180523 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"157884d0fa01ebc69e5dfadc4a2076242cf67ed9b4830956c81f5dfdea466059"} err="failed to get container status \"157884d0fa01ebc69e5dfadc4a2076242cf67ed9b4830956c81f5dfdea466059\": rpc error: code = NotFound desc = could not find container \"157884d0fa01ebc69e5dfadc4a2076242cf67ed9b4830956c81f5dfdea466059\": container with ID starting with 157884d0fa01ebc69e5dfadc4a2076242cf67ed9b4830956c81f5dfdea466059 not found: ID does not exist" Mar 10 23:15:34 crc kubenswrapper[4919]: I0310 23:15:34.180549 4919 scope.go:117] "RemoveContainer" containerID="74e3528ae26a8e0d9e22f98e0ed4fdfd2a6575916ef3986e352a8d4fc56ab434" Mar 10 
23:15:34 crc kubenswrapper[4919]: E0310 23:15:34.181030 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"74e3528ae26a8e0d9e22f98e0ed4fdfd2a6575916ef3986e352a8d4fc56ab434\": container with ID starting with 74e3528ae26a8e0d9e22f98e0ed4fdfd2a6575916ef3986e352a8d4fc56ab434 not found: ID does not exist" containerID="74e3528ae26a8e0d9e22f98e0ed4fdfd2a6575916ef3986e352a8d4fc56ab434" Mar 10 23:15:34 crc kubenswrapper[4919]: I0310 23:15:34.181166 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74e3528ae26a8e0d9e22f98e0ed4fdfd2a6575916ef3986e352a8d4fc56ab434"} err="failed to get container status \"74e3528ae26a8e0d9e22f98e0ed4fdfd2a6575916ef3986e352a8d4fc56ab434\": rpc error: code = NotFound desc = could not find container \"74e3528ae26a8e0d9e22f98e0ed4fdfd2a6575916ef3986e352a8d4fc56ab434\": container with ID starting with 74e3528ae26a8e0d9e22f98e0ed4fdfd2a6575916ef3986e352a8d4fc56ab434 not found: ID does not exist" Mar 10 23:15:34 crc kubenswrapper[4919]: E0310 23:15:34.308195 4919 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9080a849_0d02_42dd_9cbb_4b6f29055ad1.slice/crio-d09f4b7fe63949c0b307805dbf20814e4f86b625fe43a57cdc6a98e520699048\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9080a849_0d02_42dd_9cbb_4b6f29055ad1.slice\": RecentStats: unable to find data in memory cache]" Mar 10 23:15:34 crc kubenswrapper[4919]: I0310 23:15:34.900555 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 10 23:15:34 crc kubenswrapper[4919]: I0310 23:15:34.964934 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 10 23:15:34 crc kubenswrapper[4919]: I0310 23:15:34.968361 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lsb7s\" (UniqueName: \"kubernetes.io/projected/08f1d385-c9ed-4616-b201-0234049fa538-kube-api-access-lsb7s\") pod \"08f1d385-c9ed-4616-b201-0234049fa538\" (UID: \"08f1d385-c9ed-4616-b201-0234049fa538\") " Mar 10 23:15:34 crc kubenswrapper[4919]: I0310 23:15:34.968415 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/08f1d385-c9ed-4616-b201-0234049fa538-pod-info\") pod \"08f1d385-c9ed-4616-b201-0234049fa538\" (UID: \"08f1d385-c9ed-4616-b201-0234049fa538\") " Mar 10 23:15:34 crc kubenswrapper[4919]: I0310 23:15:34.968440 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/08f1d385-c9ed-4616-b201-0234049fa538-config-data\") pod \"08f1d385-c9ed-4616-b201-0234049fa538\" (UID: \"08f1d385-c9ed-4616-b201-0234049fa538\") " Mar 10 23:15:34 crc kubenswrapper[4919]: I0310 23:15:34.968465 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/08f1d385-c9ed-4616-b201-0234049fa538-rabbitmq-tls\") pod \"08f1d385-c9ed-4616-b201-0234049fa538\" (UID: \"08f1d385-c9ed-4616-b201-0234049fa538\") " Mar 10 23:15:34 crc kubenswrapper[4919]: I0310 23:15:34.968495 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/08f1d385-c9ed-4616-b201-0234049fa538-rabbitmq-confd\") pod \"08f1d385-c9ed-4616-b201-0234049fa538\" (UID: \"08f1d385-c9ed-4616-b201-0234049fa538\") " Mar 10 23:15:34 crc kubenswrapper[4919]: I0310 23:15:34.968524 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/08f1d385-c9ed-4616-b201-0234049fa538-server-conf\") pod \"08f1d385-c9ed-4616-b201-0234049fa538\" (UID: \"08f1d385-c9ed-4616-b201-0234049fa538\") " Mar 10 23:15:34 crc kubenswrapper[4919]: I0310 23:15:34.968586 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/08f1d385-c9ed-4616-b201-0234049fa538-plugins-conf\") pod \"08f1d385-c9ed-4616-b201-0234049fa538\" (UID: \"08f1d385-c9ed-4616-b201-0234049fa538\") " Mar 10 23:15:34 crc kubenswrapper[4919]: I0310 23:15:34.968616 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/08f1d385-c9ed-4616-b201-0234049fa538-erlang-cookie-secret\") pod \"08f1d385-c9ed-4616-b201-0234049fa538\" (UID: \"08f1d385-c9ed-4616-b201-0234049fa538\") " Mar 10 23:15:34 crc kubenswrapper[4919]: I0310 23:15:34.968696 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/08f1d385-c9ed-4616-b201-0234049fa538-rabbitmq-plugins\") pod \"08f1d385-c9ed-4616-b201-0234049fa538\" (UID: \"08f1d385-c9ed-4616-b201-0234049fa538\") " Mar 10 23:15:34 crc kubenswrapper[4919]: I0310 23:15:34.968815 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e9f5ddd6-654a-48f8-a8b1-ea03e8e2c7e2\") pod \"08f1d385-c9ed-4616-b201-0234049fa538\" (UID: \"08f1d385-c9ed-4616-b201-0234049fa538\") " Mar 10 23:15:34 crc kubenswrapper[4919]: I0310 23:15:34.968860 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/08f1d385-c9ed-4616-b201-0234049fa538-rabbitmq-erlang-cookie\") pod \"08f1d385-c9ed-4616-b201-0234049fa538\" (UID: \"08f1d385-c9ed-4616-b201-0234049fa538\") " Mar 10 23:15:34 
crc kubenswrapper[4919]: I0310 23:15:34.969981 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/08f1d385-c9ed-4616-b201-0234049fa538-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "08f1d385-c9ed-4616-b201-0234049fa538" (UID: "08f1d385-c9ed-4616-b201-0234049fa538"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 23:15:34 crc kubenswrapper[4919]: I0310 23:15:34.972919 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/08f1d385-c9ed-4616-b201-0234049fa538-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "08f1d385-c9ed-4616-b201-0234049fa538" (UID: "08f1d385-c9ed-4616-b201-0234049fa538"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 23:15:34 crc kubenswrapper[4919]: I0310 23:15:34.974577 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08f1d385-c9ed-4616-b201-0234049fa538-kube-api-access-lsb7s" (OuterVolumeSpecName: "kube-api-access-lsb7s") pod "08f1d385-c9ed-4616-b201-0234049fa538" (UID: "08f1d385-c9ed-4616-b201-0234049fa538"). InnerVolumeSpecName "kube-api-access-lsb7s". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 23:15:34 crc kubenswrapper[4919]: I0310 23:15:34.974115 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/08f1d385-c9ed-4616-b201-0234049fa538-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "08f1d385-c9ed-4616-b201-0234049fa538" (UID: "08f1d385-c9ed-4616-b201-0234049fa538"). InnerVolumeSpecName "plugins-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 23:15:34 crc kubenswrapper[4919]: I0310 23:15:34.977143 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/08f1d385-c9ed-4616-b201-0234049fa538-pod-info" (OuterVolumeSpecName: "pod-info") pod "08f1d385-c9ed-4616-b201-0234049fa538" (UID: "08f1d385-c9ed-4616-b201-0234049fa538"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 10 23:15:34 crc kubenswrapper[4919]: I0310 23:15:34.977274 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08f1d385-c9ed-4616-b201-0234049fa538-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "08f1d385-c9ed-4616-b201-0234049fa538" (UID: "08f1d385-c9ed-4616-b201-0234049fa538"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 23:15:34 crc kubenswrapper[4919]: I0310 23:15:34.980446 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08f1d385-c9ed-4616-b201-0234049fa538-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "08f1d385-c9ed-4616-b201-0234049fa538" (UID: "08f1d385-c9ed-4616-b201-0234049fa538"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.000720 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/08f1d385-c9ed-4616-b201-0234049fa538-config-data" (OuterVolumeSpecName: "config-data") pod "08f1d385-c9ed-4616-b201-0234049fa538" (UID: "08f1d385-c9ed-4616-b201-0234049fa538"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.002720 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e9f5ddd6-654a-48f8-a8b1-ea03e8e2c7e2" (OuterVolumeSpecName: "persistence") pod "08f1d385-c9ed-4616-b201-0234049fa538" (UID: "08f1d385-c9ed-4616-b201-0234049fa538"). InnerVolumeSpecName "pvc-e9f5ddd6-654a-48f8-a8b1-ea03e8e2c7e2". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.027163 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/08f1d385-c9ed-4616-b201-0234049fa538-server-conf" (OuterVolumeSpecName: "server-conf") pod "08f1d385-c9ed-4616-b201-0234049fa538" (UID: "08f1d385-c9ed-4616-b201-0234049fa538"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.064577 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08f1d385-c9ed-4616-b201-0234049fa538-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "08f1d385-c9ed-4616-b201-0234049fa538" (UID: "08f1d385-c9ed-4616-b201-0234049fa538"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.070120 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d5b3447d-7136-4c2b-bd66-18e26e7a157e-pod-info\") pod \"d5b3447d-7136-4c2b-bd66-18e26e7a157e\" (UID: \"d5b3447d-7136-4c2b-bd66-18e26e7a157e\") " Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.070172 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d5b3447d-7136-4c2b-bd66-18e26e7a157e-server-conf\") pod \"d5b3447d-7136-4c2b-bd66-18e26e7a157e\" (UID: \"d5b3447d-7136-4c2b-bd66-18e26e7a157e\") " Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.070193 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d5b3447d-7136-4c2b-bd66-18e26e7a157e-erlang-cookie-secret\") pod \"d5b3447d-7136-4c2b-bd66-18e26e7a157e\" (UID: \"d5b3447d-7136-4c2b-bd66-18e26e7a157e\") " Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.070270 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d5b3447d-7136-4c2b-bd66-18e26e7a157e-plugins-conf\") pod \"d5b3447d-7136-4c2b-bd66-18e26e7a157e\" (UID: \"d5b3447d-7136-4c2b-bd66-18e26e7a157e\") " Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.070309 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f7t24\" (UniqueName: \"kubernetes.io/projected/d5b3447d-7136-4c2b-bd66-18e26e7a157e-kube-api-access-f7t24\") pod \"d5b3447d-7136-4c2b-bd66-18e26e7a157e\" (UID: \"d5b3447d-7136-4c2b-bd66-18e26e7a157e\") " Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.070338 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" 
(UniqueName: \"kubernetes.io/empty-dir/d5b3447d-7136-4c2b-bd66-18e26e7a157e-rabbitmq-plugins\") pod \"d5b3447d-7136-4c2b-bd66-18e26e7a157e\" (UID: \"d5b3447d-7136-4c2b-bd66-18e26e7a157e\") " Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.070356 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d5b3447d-7136-4c2b-bd66-18e26e7a157e-rabbitmq-tls\") pod \"d5b3447d-7136-4c2b-bd66-18e26e7a157e\" (UID: \"d5b3447d-7136-4c2b-bd66-18e26e7a157e\") " Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.070380 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d5b3447d-7136-4c2b-bd66-18e26e7a157e-config-data\") pod \"d5b3447d-7136-4c2b-bd66-18e26e7a157e\" (UID: \"d5b3447d-7136-4c2b-bd66-18e26e7a157e\") " Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.070416 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d5b3447d-7136-4c2b-bd66-18e26e7a157e-rabbitmq-erlang-cookie\") pod \"d5b3447d-7136-4c2b-bd66-18e26e7a157e\" (UID: \"d5b3447d-7136-4c2b-bd66-18e26e7a157e\") " Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.070585 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5858d03f-30af-4b3b-8ca1-f1d04993b325\") pod \"d5b3447d-7136-4c2b-bd66-18e26e7a157e\" (UID: \"d5b3447d-7136-4c2b-bd66-18e26e7a157e\") " Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.070706 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d5b3447d-7136-4c2b-bd66-18e26e7a157e-rabbitmq-confd\") pod \"d5b3447d-7136-4c2b-bd66-18e26e7a157e\" (UID: \"d5b3447d-7136-4c2b-bd66-18e26e7a157e\") " Mar 10 23:15:35 crc 
kubenswrapper[4919]: I0310 23:15:35.070962 4919 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/08f1d385-c9ed-4616-b201-0234049fa538-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.070978 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lsb7s\" (UniqueName: \"kubernetes.io/projected/08f1d385-c9ed-4616-b201-0234049fa538-kube-api-access-lsb7s\") on node \"crc\" DevicePath \"\"" Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.070988 4919 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/08f1d385-c9ed-4616-b201-0234049fa538-pod-info\") on node \"crc\" DevicePath \"\"" Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.071000 4919 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/08f1d385-c9ed-4616-b201-0234049fa538-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.071009 4919 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/08f1d385-c9ed-4616-b201-0234049fa538-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.071019 4919 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/08f1d385-c9ed-4616-b201-0234049fa538-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.071028 4919 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/08f1d385-c9ed-4616-b201-0234049fa538-server-conf\") on node \"crc\" DevicePath \"\"" Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.071037 4919 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" 
(UniqueName: \"kubernetes.io/configmap/08f1d385-c9ed-4616-b201-0234049fa538-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.071047 4919 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/08f1d385-c9ed-4616-b201-0234049fa538-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.071056 4919 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/08f1d385-c9ed-4616-b201-0234049fa538-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.071081 4919 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-e9f5ddd6-654a-48f8-a8b1-ea03e8e2c7e2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e9f5ddd6-654a-48f8-a8b1-ea03e8e2c7e2\") on node \"crc\" " Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.071503 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d5b3447d-7136-4c2b-bd66-18e26e7a157e-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "d5b3447d-7136-4c2b-bd66-18e26e7a157e" (UID: "d5b3447d-7136-4c2b-bd66-18e26e7a157e"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.071892 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d5b3447d-7136-4c2b-bd66-18e26e7a157e-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "d5b3447d-7136-4c2b-bd66-18e26e7a157e" (UID: "d5b3447d-7136-4c2b-bd66-18e26e7a157e"). InnerVolumeSpecName "rabbitmq-erlang-cookie". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.071993 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d5b3447d-7136-4c2b-bd66-18e26e7a157e-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "d5b3447d-7136-4c2b-bd66-18e26e7a157e" (UID: "d5b3447d-7136-4c2b-bd66-18e26e7a157e"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.074449 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/d5b3447d-7136-4c2b-bd66-18e26e7a157e-pod-info" (OuterVolumeSpecName: "pod-info") pod "d5b3447d-7136-4c2b-bd66-18e26e7a157e" (UID: "d5b3447d-7136-4c2b-bd66-18e26e7a157e"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.074623 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5b3447d-7136-4c2b-bd66-18e26e7a157e-kube-api-access-f7t24" (OuterVolumeSpecName: "kube-api-access-f7t24") pod "d5b3447d-7136-4c2b-bd66-18e26e7a157e" (UID: "d5b3447d-7136-4c2b-bd66-18e26e7a157e"). InnerVolumeSpecName "kube-api-access-f7t24". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.077516 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5b3447d-7136-4c2b-bd66-18e26e7a157e-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "d5b3447d-7136-4c2b-bd66-18e26e7a157e" (UID: "d5b3447d-7136-4c2b-bd66-18e26e7a157e"). InnerVolumeSpecName "rabbitmq-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.077528 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5b3447d-7136-4c2b-bd66-18e26e7a157e-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "d5b3447d-7136-4c2b-bd66-18e26e7a157e" (UID: "d5b3447d-7136-4c2b-bd66-18e26e7a157e"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.092401 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5858d03f-30af-4b3b-8ca1-f1d04993b325" (OuterVolumeSpecName: "persistence") pod "d5b3447d-7136-4c2b-bd66-18e26e7a157e" (UID: "d5b3447d-7136-4c2b-bd66-18e26e7a157e"). InnerVolumeSpecName "pvc-5858d03f-30af-4b3b-8ca1-f1d04993b325". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.094648 4919 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.094685 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d5b3447d-7136-4c2b-bd66-18e26e7a157e-config-data" (OuterVolumeSpecName: "config-data") pod "d5b3447d-7136-4c2b-bd66-18e26e7a157e" (UID: "d5b3447d-7136-4c2b-bd66-18e26e7a157e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.094935 4919 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-e9f5ddd6-654a-48f8-a8b1-ea03e8e2c7e2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e9f5ddd6-654a-48f8-a8b1-ea03e8e2c7e2") on node "crc" Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.109585 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d5b3447d-7136-4c2b-bd66-18e26e7a157e-server-conf" (OuterVolumeSpecName: "server-conf") pod "d5b3447d-7136-4c2b-bd66-18e26e7a157e" (UID: "d5b3447d-7136-4c2b-bd66-18e26e7a157e"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.121144 4919 generic.go:334] "Generic (PLEG): container finished" podID="d5b3447d-7136-4c2b-bd66-18e26e7a157e" containerID="3e8c3962918fab87a19b7dfa8b706cbb5969c613c7ff3b9959efd5b1a0af6dc2" exitCode=0 Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.121280 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"d5b3447d-7136-4c2b-bd66-18e26e7a157e","Type":"ContainerDied","Data":"3e8c3962918fab87a19b7dfa8b706cbb5969c613c7ff3b9959efd5b1a0af6dc2"} Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.121371 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"d5b3447d-7136-4c2b-bd66-18e26e7a157e","Type":"ContainerDied","Data":"b9b05ed411750745ac7102a308f28b8f514c0caa506d31129b795adfdfdcf1f9"} Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.121462 4919 scope.go:117] "RemoveContainer" containerID="3e8c3962918fab87a19b7dfa8b706cbb5969c613c7ff3b9959efd5b1a0af6dc2" Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.121634 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.129707 4919 generic.go:334] "Generic (PLEG): container finished" podID="08f1d385-c9ed-4616-b201-0234049fa538" containerID="b3a42f443ccc31c38bda0a67e2822484646b8bd333314001d0e9bb17c11248fb" exitCode=0 Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.129800 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"08f1d385-c9ed-4616-b201-0234049fa538","Type":"ContainerDied","Data":"b3a42f443ccc31c38bda0a67e2822484646b8bd333314001d0e9bb17c11248fb"} Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.129891 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"08f1d385-c9ed-4616-b201-0234049fa538","Type":"ContainerDied","Data":"e4b70775fe46ffccfe8be626fb50dcbb85af911a84e9a2c394a2849494392f52"} Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.130002 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.167549 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.171225 4919 scope.go:117] "RemoveContainer" containerID="a1feafe96622f2d3069b55ef120e7457a8ff0ed1826d9b9f66073436ca33017d" Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.171296 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5b3447d-7136-4c2b-bd66-18e26e7a157e-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "d5b3447d-7136-4c2b-bd66-18e26e7a157e" (UID: "d5b3447d-7136-4c2b-bd66-18e26e7a157e"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.172097 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d5b3447d-7136-4c2b-bd66-18e26e7a157e-rabbitmq-confd\") pod \"d5b3447d-7136-4c2b-bd66-18e26e7a157e\" (UID: \"d5b3447d-7136-4c2b-bd66-18e26e7a157e\") " Mar 10 23:15:35 crc kubenswrapper[4919]: W0310 23:15:35.172268 4919 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/d5b3447d-7136-4c2b-bd66-18e26e7a157e/volumes/kubernetes.io~projected/rabbitmq-confd Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.172361 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5b3447d-7136-4c2b-bd66-18e26e7a157e-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "d5b3447d-7136-4c2b-bd66-18e26e7a157e" (UID: "d5b3447d-7136-4c2b-bd66-18e26e7a157e"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.172631 4919 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d5b3447d-7136-4c2b-bd66-18e26e7a157e-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.172754 4919 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d5b3447d-7136-4c2b-bd66-18e26e7a157e-pod-info\") on node \"crc\" DevicePath \"\"" Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.172951 4919 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d5b3447d-7136-4c2b-bd66-18e26e7a157e-server-conf\") on node \"crc\" DevicePath \"\"" Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.173027 4919 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d5b3447d-7136-4c2b-bd66-18e26e7a157e-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.173087 4919 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d5b3447d-7136-4c2b-bd66-18e26e7a157e-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.173150 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f7t24\" (UniqueName: \"kubernetes.io/projected/d5b3447d-7136-4c2b-bd66-18e26e7a157e-kube-api-access-f7t24\") on node \"crc\" DevicePath \"\"" Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.173205 4919 reconciler_common.go:293] "Volume detached for volume \"pvc-e9f5ddd6-654a-48f8-a8b1-ea03e8e2c7e2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e9f5ddd6-654a-48f8-a8b1-ea03e8e2c7e2\") on node \"crc\" DevicePath \"\"" Mar 10 23:15:35 crc 
kubenswrapper[4919]: I0310 23:15:35.173343 4919 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d5b3447d-7136-4c2b-bd66-18e26e7a157e-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.173418 4919 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d5b3447d-7136-4c2b-bd66-18e26e7a157e-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.173484 4919 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d5b3447d-7136-4c2b-bd66-18e26e7a157e-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.173542 4919 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d5b3447d-7136-4c2b-bd66-18e26e7a157e-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.173614 4919 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-5858d03f-30af-4b3b-8ca1-f1d04993b325\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5858d03f-30af-4b3b-8ca1-f1d04993b325\") on node \"crc\" " Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.173461 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.195762 4919 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.195971 4919 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-5858d03f-30af-4b3b-8ca1-f1d04993b325" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5858d03f-30af-4b3b-8ca1-f1d04993b325") on node "crc" Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.198238 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 10 23:15:35 crc kubenswrapper[4919]: E0310 23:15:35.198660 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08f1d385-c9ed-4616-b201-0234049fa538" containerName="setup-container" Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.198729 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="08f1d385-c9ed-4616-b201-0234049fa538" containerName="setup-container" Mar 10 23:15:35 crc kubenswrapper[4919]: E0310 23:15:35.198789 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9080a849-0d02-42dd-9cbb-4b6f29055ad1" containerName="dnsmasq-dns" Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.198893 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="9080a849-0d02-42dd-9cbb-4b6f29055ad1" containerName="dnsmasq-dns" Mar 10 23:15:35 crc kubenswrapper[4919]: E0310 23:15:35.198972 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5b3447d-7136-4c2b-bd66-18e26e7a157e" containerName="setup-container" Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.199055 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5b3447d-7136-4c2b-bd66-18e26e7a157e" containerName="setup-container" Mar 10 23:15:35 crc kubenswrapper[4919]: E0310 23:15:35.199145 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9080a849-0d02-42dd-9cbb-4b6f29055ad1" containerName="init" Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.199217 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="9080a849-0d02-42dd-9cbb-4b6f29055ad1" 
containerName="init" Mar 10 23:15:35 crc kubenswrapper[4919]: E0310 23:15:35.199289 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08f1d385-c9ed-4616-b201-0234049fa538" containerName="rabbitmq" Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.199363 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="08f1d385-c9ed-4616-b201-0234049fa538" containerName="rabbitmq" Mar 10 23:15:35 crc kubenswrapper[4919]: E0310 23:15:35.199463 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5b3447d-7136-4c2b-bd66-18e26e7a157e" containerName="rabbitmq" Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.199536 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5b3447d-7136-4c2b-bd66-18e26e7a157e" containerName="rabbitmq" Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.199794 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5b3447d-7136-4c2b-bd66-18e26e7a157e" containerName="rabbitmq" Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.199897 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="08f1d385-c9ed-4616-b201-0234049fa538" containerName="rabbitmq" Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.199972 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="9080a849-0d02-42dd-9cbb-4b6f29055ad1" containerName="dnsmasq-dns" Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.202931 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.208587 4919 scope.go:117] "RemoveContainer" containerID="3e8c3962918fab87a19b7dfa8b706cbb5969c613c7ff3b9959efd5b1a0af6dc2" Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.208872 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.209080 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.209206 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.209307 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.209336 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.209160 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-2px2p" Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.209620 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Mar 10 23:15:35 crc kubenswrapper[4919]: E0310 23:15:35.209643 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e8c3962918fab87a19b7dfa8b706cbb5969c613c7ff3b9959efd5b1a0af6dc2\": container with ID starting with 3e8c3962918fab87a19b7dfa8b706cbb5969c613c7ff3b9959efd5b1a0af6dc2 not found: ID does not exist" containerID="3e8c3962918fab87a19b7dfa8b706cbb5969c613c7ff3b9959efd5b1a0af6dc2" Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.209671 4919 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e8c3962918fab87a19b7dfa8b706cbb5969c613c7ff3b9959efd5b1a0af6dc2"} err="failed to get container status \"3e8c3962918fab87a19b7dfa8b706cbb5969c613c7ff3b9959efd5b1a0af6dc2\": rpc error: code = NotFound desc = could not find container \"3e8c3962918fab87a19b7dfa8b706cbb5969c613c7ff3b9959efd5b1a0af6dc2\": container with ID starting with 3e8c3962918fab87a19b7dfa8b706cbb5969c613c7ff3b9959efd5b1a0af6dc2 not found: ID does not exist" Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.209695 4919 scope.go:117] "RemoveContainer" containerID="a1feafe96622f2d3069b55ef120e7457a8ff0ed1826d9b9f66073436ca33017d" Mar 10 23:15:35 crc kubenswrapper[4919]: E0310 23:15:35.209992 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a1feafe96622f2d3069b55ef120e7457a8ff0ed1826d9b9f66073436ca33017d\": container with ID starting with a1feafe96622f2d3069b55ef120e7457a8ff0ed1826d9b9f66073436ca33017d not found: ID does not exist" containerID="a1feafe96622f2d3069b55ef120e7457a8ff0ed1826d9b9f66073436ca33017d" Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.210027 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1feafe96622f2d3069b55ef120e7457a8ff0ed1826d9b9f66073436ca33017d"} err="failed to get container status \"a1feafe96622f2d3069b55ef120e7457a8ff0ed1826d9b9f66073436ca33017d\": rpc error: code = NotFound desc = could not find container \"a1feafe96622f2d3069b55ef120e7457a8ff0ed1826d9b9f66073436ca33017d\": container with ID starting with a1feafe96622f2d3069b55ef120e7457a8ff0ed1826d9b9f66073436ca33017d not found: ID does not exist" Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.210052 4919 scope.go:117] "RemoveContainer" containerID="b3a42f443ccc31c38bda0a67e2822484646b8bd333314001d0e9bb17c11248fb" Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 
23:15:35.218167 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.235685 4919 scope.go:117] "RemoveContainer" containerID="eaa852c88b5ceb7e4121d56c05b283d9ddc1e39df0da4a8a863933118c3ac2aa" Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.256933 4919 scope.go:117] "RemoveContainer" containerID="b3a42f443ccc31c38bda0a67e2822484646b8bd333314001d0e9bb17c11248fb" Mar 10 23:15:35 crc kubenswrapper[4919]: E0310 23:15:35.257314 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b3a42f443ccc31c38bda0a67e2822484646b8bd333314001d0e9bb17c11248fb\": container with ID starting with b3a42f443ccc31c38bda0a67e2822484646b8bd333314001d0e9bb17c11248fb not found: ID does not exist" containerID="b3a42f443ccc31c38bda0a67e2822484646b8bd333314001d0e9bb17c11248fb" Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.257445 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3a42f443ccc31c38bda0a67e2822484646b8bd333314001d0e9bb17c11248fb"} err="failed to get container status \"b3a42f443ccc31c38bda0a67e2822484646b8bd333314001d0e9bb17c11248fb\": rpc error: code = NotFound desc = could not find container \"b3a42f443ccc31c38bda0a67e2822484646b8bd333314001d0e9bb17c11248fb\": container with ID starting with b3a42f443ccc31c38bda0a67e2822484646b8bd333314001d0e9bb17c11248fb not found: ID does not exist" Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.257556 4919 scope.go:117] "RemoveContainer" containerID="eaa852c88b5ceb7e4121d56c05b283d9ddc1e39df0da4a8a863933118c3ac2aa" Mar 10 23:15:35 crc kubenswrapper[4919]: E0310 23:15:35.257931 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eaa852c88b5ceb7e4121d56c05b283d9ddc1e39df0da4a8a863933118c3ac2aa\": container with ID starting with 
eaa852c88b5ceb7e4121d56c05b283d9ddc1e39df0da4a8a863933118c3ac2aa not found: ID does not exist" containerID="eaa852c88b5ceb7e4121d56c05b283d9ddc1e39df0da4a8a863933118c3ac2aa" Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.257960 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eaa852c88b5ceb7e4121d56c05b283d9ddc1e39df0da4a8a863933118c3ac2aa"} err="failed to get container status \"eaa852c88b5ceb7e4121d56c05b283d9ddc1e39df0da4a8a863933118c3ac2aa\": rpc error: code = NotFound desc = could not find container \"eaa852c88b5ceb7e4121d56c05b283d9ddc1e39df0da4a8a863933118c3ac2aa\": container with ID starting with eaa852c88b5ceb7e4121d56c05b283d9ddc1e39df0da4a8a863933118c3ac2aa not found: ID does not exist" Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.275146 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/2101a481-5a05-4fe5-ae97-d3dd73ee8153-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"2101a481-5a05-4fe5-ae97-d3dd73ee8153\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.275198 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2101a481-5a05-4fe5-ae97-d3dd73ee8153-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"2101a481-5a05-4fe5-ae97-d3dd73ee8153\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.275231 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2101a481-5a05-4fe5-ae97-d3dd73ee8153-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"2101a481-5a05-4fe5-ae97-d3dd73ee8153\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 
23:15:35.275251 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2101a481-5a05-4fe5-ae97-d3dd73ee8153-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"2101a481-5a05-4fe5-ae97-d3dd73ee8153\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.275384 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2101a481-5a05-4fe5-ae97-d3dd73ee8153-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"2101a481-5a05-4fe5-ae97-d3dd73ee8153\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.275476 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2101a481-5a05-4fe5-ae97-d3dd73ee8153-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"2101a481-5a05-4fe5-ae97-d3dd73ee8153\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.275547 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2101a481-5a05-4fe5-ae97-d3dd73ee8153-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"2101a481-5a05-4fe5-ae97-d3dd73ee8153\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.275646 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sw5jm\" (UniqueName: \"kubernetes.io/projected/2101a481-5a05-4fe5-ae97-d3dd73ee8153-kube-api-access-sw5jm\") pod \"rabbitmq-cell1-server-0\" (UID: \"2101a481-5a05-4fe5-ae97-d3dd73ee8153\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 23:15:35 crc 
kubenswrapper[4919]: I0310 23:15:35.275710 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-e9f5ddd6-654a-48f8-a8b1-ea03e8e2c7e2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e9f5ddd6-654a-48f8-a8b1-ea03e8e2c7e2\") pod \"rabbitmq-cell1-server-0\" (UID: \"2101a481-5a05-4fe5-ae97-d3dd73ee8153\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.275759 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2101a481-5a05-4fe5-ae97-d3dd73ee8153-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"2101a481-5a05-4fe5-ae97-d3dd73ee8153\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.275800 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2101a481-5a05-4fe5-ae97-d3dd73ee8153-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"2101a481-5a05-4fe5-ae97-d3dd73ee8153\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.275920 4919 reconciler_common.go:293] "Volume detached for volume \"pvc-5858d03f-30af-4b3b-8ca1-f1d04993b325\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5858d03f-30af-4b3b-8ca1-f1d04993b325\") on node \"crc\" DevicePath \"\"" Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.377038 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2101a481-5a05-4fe5-ae97-d3dd73ee8153-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"2101a481-5a05-4fe5-ae97-d3dd73ee8153\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.377091 4919 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2101a481-5a05-4fe5-ae97-d3dd73ee8153-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"2101a481-5a05-4fe5-ae97-d3dd73ee8153\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.377163 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/2101a481-5a05-4fe5-ae97-d3dd73ee8153-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"2101a481-5a05-4fe5-ae97-d3dd73ee8153\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.377199 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2101a481-5a05-4fe5-ae97-d3dd73ee8153-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"2101a481-5a05-4fe5-ae97-d3dd73ee8153\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.377231 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2101a481-5a05-4fe5-ae97-d3dd73ee8153-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"2101a481-5a05-4fe5-ae97-d3dd73ee8153\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.377271 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2101a481-5a05-4fe5-ae97-d3dd73ee8153-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"2101a481-5a05-4fe5-ae97-d3dd73ee8153\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.377316 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/2101a481-5a05-4fe5-ae97-d3dd73ee8153-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"2101a481-5a05-4fe5-ae97-d3dd73ee8153\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.377342 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2101a481-5a05-4fe5-ae97-d3dd73ee8153-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"2101a481-5a05-4fe5-ae97-d3dd73ee8153\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.377379 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2101a481-5a05-4fe5-ae97-d3dd73ee8153-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"2101a481-5a05-4fe5-ae97-d3dd73ee8153\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.377428 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sw5jm\" (UniqueName: \"kubernetes.io/projected/2101a481-5a05-4fe5-ae97-d3dd73ee8153-kube-api-access-sw5jm\") pod \"rabbitmq-cell1-server-0\" (UID: \"2101a481-5a05-4fe5-ae97-d3dd73ee8153\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.377457 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-e9f5ddd6-654a-48f8-a8b1-ea03e8e2c7e2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e9f5ddd6-654a-48f8-a8b1-ea03e8e2c7e2\") pod \"rabbitmq-cell1-server-0\" (UID: \"2101a481-5a05-4fe5-ae97-d3dd73ee8153\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.377542 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/2101a481-5a05-4fe5-ae97-d3dd73ee8153-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"2101a481-5a05-4fe5-ae97-d3dd73ee8153\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.378560 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2101a481-5a05-4fe5-ae97-d3dd73ee8153-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"2101a481-5a05-4fe5-ae97-d3dd73ee8153\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.378683 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2101a481-5a05-4fe5-ae97-d3dd73ee8153-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"2101a481-5a05-4fe5-ae97-d3dd73ee8153\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.378860 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2101a481-5a05-4fe5-ae97-d3dd73ee8153-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"2101a481-5a05-4fe5-ae97-d3dd73ee8153\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.379030 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2101a481-5a05-4fe5-ae97-d3dd73ee8153-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"2101a481-5a05-4fe5-ae97-d3dd73ee8153\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.381524 4919 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.381655 4919 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-e9f5ddd6-654a-48f8-a8b1-ea03e8e2c7e2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e9f5ddd6-654a-48f8-a8b1-ea03e8e2c7e2\") pod \"rabbitmq-cell1-server-0\" (UID: \"2101a481-5a05-4fe5-ae97-d3dd73ee8153\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/3abe716c7a40e2495d9b84996c37ea2d6664c72a27ff15e380d09b03097a86e7/globalmount\"" pod="openstack/rabbitmq-cell1-server-0" Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.381928 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2101a481-5a05-4fe5-ae97-d3dd73ee8153-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"2101a481-5a05-4fe5-ae97-d3dd73ee8153\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.382056 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/2101a481-5a05-4fe5-ae97-d3dd73ee8153-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"2101a481-5a05-4fe5-ae97-d3dd73ee8153\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.382437 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2101a481-5a05-4fe5-ae97-d3dd73ee8153-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"2101a481-5a05-4fe5-ae97-d3dd73ee8153\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.390884 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2101a481-5a05-4fe5-ae97-d3dd73ee8153-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"2101a481-5a05-4fe5-ae97-d3dd73ee8153\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.394049 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sw5jm\" (UniqueName: \"kubernetes.io/projected/2101a481-5a05-4fe5-ae97-d3dd73ee8153-kube-api-access-sw5jm\") pod \"rabbitmq-cell1-server-0\" (UID: \"2101a481-5a05-4fe5-ae97-d3dd73ee8153\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.411426 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-e9f5ddd6-654a-48f8-a8b1-ea03e8e2c7e2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e9f5ddd6-654a-48f8-a8b1-ea03e8e2c7e2\") pod \"rabbitmq-cell1-server-0\" (UID: \"2101a481-5a05-4fe5-ae97-d3dd73ee8153\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.481843 4919 scope.go:117] "RemoveContainer" containerID="db23846019f61e1fa4301ac7d8406453060a07c19e1dddc6fc1714cf596701e8" Mar 10 23:15:35 crc kubenswrapper[4919]: E0310 23:15:35.482214 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.495588 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08f1d385-c9ed-4616-b201-0234049fa538" path="/var/lib/kubelet/pods/08f1d385-c9ed-4616-b201-0234049fa538/volumes" Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.496153 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9080a849-0d02-42dd-9cbb-4b6f29055ad1" 
path="/var/lib/kubelet/pods/9080a849-0d02-42dd-9cbb-4b6f29055ad1/volumes" Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.496667 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.511640 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.525319 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.526984 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.531010 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.531529 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.532209 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.532979 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.533052 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.533438 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-f9tl2" Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.533644 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.536112 4919 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/rabbitmq-server-0"] Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.538236 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.583841 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/7a0cd471-ab8f-4de8-bd7f-0392d7d7f903-server-conf\") pod \"rabbitmq-server-0\" (UID: \"7a0cd471-ab8f-4de8-bd7f-0392d7d7f903\") " pod="openstack/rabbitmq-server-0" Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.584214 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/7a0cd471-ab8f-4de8-bd7f-0392d7d7f903-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"7a0cd471-ab8f-4de8-bd7f-0392d7d7f903\") " pod="openstack/rabbitmq-server-0" Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.584249 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/7a0cd471-ab8f-4de8-bd7f-0392d7d7f903-pod-info\") pod \"rabbitmq-server-0\" (UID: \"7a0cd471-ab8f-4de8-bd7f-0392d7d7f903\") " pod="openstack/rabbitmq-server-0" Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.584300 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6x92h\" (UniqueName: \"kubernetes.io/projected/7a0cd471-ab8f-4de8-bd7f-0392d7d7f903-kube-api-access-6x92h\") pod \"rabbitmq-server-0\" (UID: \"7a0cd471-ab8f-4de8-bd7f-0392d7d7f903\") " pod="openstack/rabbitmq-server-0" Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.584338 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/7a0cd471-ab8f-4de8-bd7f-0392d7d7f903-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"7a0cd471-ab8f-4de8-bd7f-0392d7d7f903\") " pod="openstack/rabbitmq-server-0" Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.584372 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/7a0cd471-ab8f-4de8-bd7f-0392d7d7f903-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"7a0cd471-ab8f-4de8-bd7f-0392d7d7f903\") " pod="openstack/rabbitmq-server-0" Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.584416 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/7a0cd471-ab8f-4de8-bd7f-0392d7d7f903-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"7a0cd471-ab8f-4de8-bd7f-0392d7d7f903\") " pod="openstack/rabbitmq-server-0" Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.584479 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-5858d03f-30af-4b3b-8ca1-f1d04993b325\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5858d03f-30af-4b3b-8ca1-f1d04993b325\") pod \"rabbitmq-server-0\" (UID: \"7a0cd471-ab8f-4de8-bd7f-0392d7d7f903\") " pod="openstack/rabbitmq-server-0" Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.584514 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7a0cd471-ab8f-4de8-bd7f-0392d7d7f903-config-data\") pod \"rabbitmq-server-0\" (UID: \"7a0cd471-ab8f-4de8-bd7f-0392d7d7f903\") " pod="openstack/rabbitmq-server-0" Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.584638 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: 
\"kubernetes.io/projected/7a0cd471-ab8f-4de8-bd7f-0392d7d7f903-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"7a0cd471-ab8f-4de8-bd7f-0392d7d7f903\") " pod="openstack/rabbitmq-server-0" Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.584796 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/7a0cd471-ab8f-4de8-bd7f-0392d7d7f903-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"7a0cd471-ab8f-4de8-bd7f-0392d7d7f903\") " pod="openstack/rabbitmq-server-0" Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.686331 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-5858d03f-30af-4b3b-8ca1-f1d04993b325\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5858d03f-30af-4b3b-8ca1-f1d04993b325\") pod \"rabbitmq-server-0\" (UID: \"7a0cd471-ab8f-4de8-bd7f-0392d7d7f903\") " pod="openstack/rabbitmq-server-0" Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.686371 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7a0cd471-ab8f-4de8-bd7f-0392d7d7f903-config-data\") pod \"rabbitmq-server-0\" (UID: \"7a0cd471-ab8f-4de8-bd7f-0392d7d7f903\") " pod="openstack/rabbitmq-server-0" Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.686451 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/7a0cd471-ab8f-4de8-bd7f-0392d7d7f903-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"7a0cd471-ab8f-4de8-bd7f-0392d7d7f903\") " pod="openstack/rabbitmq-server-0" Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.686498 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/7a0cd471-ab8f-4de8-bd7f-0392d7d7f903-erlang-cookie-secret\") pod 
\"rabbitmq-server-0\" (UID: \"7a0cd471-ab8f-4de8-bd7f-0392d7d7f903\") " pod="openstack/rabbitmq-server-0" Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.686520 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/7a0cd471-ab8f-4de8-bd7f-0392d7d7f903-server-conf\") pod \"rabbitmq-server-0\" (UID: \"7a0cd471-ab8f-4de8-bd7f-0392d7d7f903\") " pod="openstack/rabbitmq-server-0" Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.686569 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/7a0cd471-ab8f-4de8-bd7f-0392d7d7f903-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"7a0cd471-ab8f-4de8-bd7f-0392d7d7f903\") " pod="openstack/rabbitmq-server-0" Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.686586 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/7a0cd471-ab8f-4de8-bd7f-0392d7d7f903-pod-info\") pod \"rabbitmq-server-0\" (UID: \"7a0cd471-ab8f-4de8-bd7f-0392d7d7f903\") " pod="openstack/rabbitmq-server-0" Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.686608 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6x92h\" (UniqueName: \"kubernetes.io/projected/7a0cd471-ab8f-4de8-bd7f-0392d7d7f903-kube-api-access-6x92h\") pod \"rabbitmq-server-0\" (UID: \"7a0cd471-ab8f-4de8-bd7f-0392d7d7f903\") " pod="openstack/rabbitmq-server-0" Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.686690 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/7a0cd471-ab8f-4de8-bd7f-0392d7d7f903-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"7a0cd471-ab8f-4de8-bd7f-0392d7d7f903\") " pod="openstack/rabbitmq-server-0" Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 
23:15:35.686707 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/7a0cd471-ab8f-4de8-bd7f-0392d7d7f903-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"7a0cd471-ab8f-4de8-bd7f-0392d7d7f903\") " pod="openstack/rabbitmq-server-0" Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.686740 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/7a0cd471-ab8f-4de8-bd7f-0392d7d7f903-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"7a0cd471-ab8f-4de8-bd7f-0392d7d7f903\") " pod="openstack/rabbitmq-server-0" Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.687142 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/7a0cd471-ab8f-4de8-bd7f-0392d7d7f903-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"7a0cd471-ab8f-4de8-bd7f-0392d7d7f903\") " pod="openstack/rabbitmq-server-0" Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.687859 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/7a0cd471-ab8f-4de8-bd7f-0392d7d7f903-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"7a0cd471-ab8f-4de8-bd7f-0392d7d7f903\") " pod="openstack/rabbitmq-server-0" Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.688067 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/7a0cd471-ab8f-4de8-bd7f-0392d7d7f903-server-conf\") pod \"rabbitmq-server-0\" (UID: \"7a0cd471-ab8f-4de8-bd7f-0392d7d7f903\") " pod="openstack/rabbitmq-server-0" Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.689321 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/7a0cd471-ab8f-4de8-bd7f-0392d7d7f903-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"7a0cd471-ab8f-4de8-bd7f-0392d7d7f903\") " pod="openstack/rabbitmq-server-0" Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.689435 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7a0cd471-ab8f-4de8-bd7f-0392d7d7f903-config-data\") pod \"rabbitmq-server-0\" (UID: \"7a0cd471-ab8f-4de8-bd7f-0392d7d7f903\") " pod="openstack/rabbitmq-server-0" Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.690533 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/7a0cd471-ab8f-4de8-bd7f-0392d7d7f903-pod-info\") pod \"rabbitmq-server-0\" (UID: \"7a0cd471-ab8f-4de8-bd7f-0392d7d7f903\") " pod="openstack/rabbitmq-server-0" Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.692284 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/7a0cd471-ab8f-4de8-bd7f-0392d7d7f903-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"7a0cd471-ab8f-4de8-bd7f-0392d7d7f903\") " pod="openstack/rabbitmq-server-0" Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.692752 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/7a0cd471-ab8f-4de8-bd7f-0392d7d7f903-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"7a0cd471-ab8f-4de8-bd7f-0392d7d7f903\") " pod="openstack/rabbitmq-server-0" Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.693164 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/7a0cd471-ab8f-4de8-bd7f-0392d7d7f903-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"7a0cd471-ab8f-4de8-bd7f-0392d7d7f903\") " pod="openstack/rabbitmq-server-0" Mar 10 23:15:35 crc 
kubenswrapper[4919]: I0310 23:15:35.695463 4919 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.695788 4919 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-5858d03f-30af-4b3b-8ca1-f1d04993b325\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5858d03f-30af-4b3b-8ca1-f1d04993b325\") pod \"rabbitmq-server-0\" (UID: \"7a0cd471-ab8f-4de8-bd7f-0392d7d7f903\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/931f8fb7a5c01cf1cb7d2760c8569b187e78d733dd4740fa54b0a4a2b50fa59c/globalmount\"" pod="openstack/rabbitmq-server-0" Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.706569 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6x92h\" (UniqueName: \"kubernetes.io/projected/7a0cd471-ab8f-4de8-bd7f-0392d7d7f903-kube-api-access-6x92h\") pod \"rabbitmq-server-0\" (UID: \"7a0cd471-ab8f-4de8-bd7f-0392d7d7f903\") " pod="openstack/rabbitmq-server-0" Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.727362 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-5858d03f-30af-4b3b-8ca1-f1d04993b325\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5858d03f-30af-4b3b-8ca1-f1d04993b325\") pod \"rabbitmq-server-0\" (UID: \"7a0cd471-ab8f-4de8-bd7f-0392d7d7f903\") " pod="openstack/rabbitmq-server-0" Mar 10 23:15:35 crc kubenswrapper[4919]: I0310 23:15:35.914296 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 10 23:15:36 crc kubenswrapper[4919]: I0310 23:15:36.074087 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 10 23:15:36 crc kubenswrapper[4919]: I0310 23:15:36.140296 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"2101a481-5a05-4fe5-ae97-d3dd73ee8153","Type":"ContainerStarted","Data":"17d60c816027ebc092068674bbdc61f9a7aed4d639548b7dd6c87168dc7d7b6d"} Mar 10 23:15:36 crc kubenswrapper[4919]: I0310 23:15:36.190874 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 10 23:15:36 crc kubenswrapper[4919]: W0310 23:15:36.198247 4919 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7a0cd471_ab8f_4de8_bd7f_0392d7d7f903.slice/crio-f078ab72fe775f90a7922ea00dc465d53d2d8a1be1dee84a13b01f78ba3f856e WatchSource:0}: Error finding container f078ab72fe775f90a7922ea00dc465d53d2d8a1be1dee84a13b01f78ba3f856e: Status 404 returned error can't find the container with id f078ab72fe775f90a7922ea00dc465d53d2d8a1be1dee84a13b01f78ba3f856e Mar 10 23:15:37 crc kubenswrapper[4919]: I0310 23:15:37.158155 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"7a0cd471-ab8f-4de8-bd7f-0392d7d7f903","Type":"ContainerStarted","Data":"f078ab72fe775f90a7922ea00dc465d53d2d8a1be1dee84a13b01f78ba3f856e"} Mar 10 23:15:37 crc kubenswrapper[4919]: I0310 23:15:37.500762 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5b3447d-7136-4c2b-bd66-18e26e7a157e" path="/var/lib/kubelet/pods/d5b3447d-7136-4c2b-bd66-18e26e7a157e/volumes" Mar 10 23:15:38 crc kubenswrapper[4919]: I0310 23:15:38.173569 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" 
event={"ID":"7a0cd471-ab8f-4de8-bd7f-0392d7d7f903","Type":"ContainerStarted","Data":"d8a021704141015c4435371209f6ad49de26690b72ce7965a5f825ec79fb4028"} Mar 10 23:15:38 crc kubenswrapper[4919]: I0310 23:15:38.175558 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"2101a481-5a05-4fe5-ae97-d3dd73ee8153","Type":"ContainerStarted","Data":"347ee17b88b3fc6474fdc5493646e5663b3dd8175e6e4cc4d571afbda0b5d3ff"} Mar 10 23:15:46 crc kubenswrapper[4919]: I0310 23:15:46.716251 4919 scope.go:117] "RemoveContainer" containerID="6dc61187129bc29ae275b3b18ecf2eb7ebc2a48a7b100292f01257a83ba12918" Mar 10 23:15:48 crc kubenswrapper[4919]: I0310 23:15:48.480152 4919 scope.go:117] "RemoveContainer" containerID="db23846019f61e1fa4301ac7d8406453060a07c19e1dddc6fc1714cf596701e8" Mar 10 23:15:48 crc kubenswrapper[4919]: E0310 23:15:48.481250 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" Mar 10 23:16:00 crc kubenswrapper[4919]: I0310 23:16:00.161894 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553076-zgksb"] Mar 10 23:16:00 crc kubenswrapper[4919]: I0310 23:16:00.164448 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553076-zgksb" Mar 10 23:16:00 crc kubenswrapper[4919]: I0310 23:16:00.168132 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 23:16:00 crc kubenswrapper[4919]: I0310 23:16:00.168707 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 23:16:00 crc kubenswrapper[4919]: I0310 23:16:00.169021 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wbvtv" Mar 10 23:16:00 crc kubenswrapper[4919]: I0310 23:16:00.169608 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553076-zgksb"] Mar 10 23:16:00 crc kubenswrapper[4919]: I0310 23:16:00.328343 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-szfw5\" (UniqueName: \"kubernetes.io/projected/1b20f686-1c1a-43cb-af86-20a0670592b9-kube-api-access-szfw5\") pod \"auto-csr-approver-29553076-zgksb\" (UID: \"1b20f686-1c1a-43cb-af86-20a0670592b9\") " pod="openshift-infra/auto-csr-approver-29553076-zgksb" Mar 10 23:16:00 crc kubenswrapper[4919]: I0310 23:16:00.431963 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-szfw5\" (UniqueName: \"kubernetes.io/projected/1b20f686-1c1a-43cb-af86-20a0670592b9-kube-api-access-szfw5\") pod \"auto-csr-approver-29553076-zgksb\" (UID: \"1b20f686-1c1a-43cb-af86-20a0670592b9\") " pod="openshift-infra/auto-csr-approver-29553076-zgksb" Mar 10 23:16:00 crc kubenswrapper[4919]: I0310 23:16:00.463328 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-szfw5\" (UniqueName: \"kubernetes.io/projected/1b20f686-1c1a-43cb-af86-20a0670592b9-kube-api-access-szfw5\") pod \"auto-csr-approver-29553076-zgksb\" (UID: \"1b20f686-1c1a-43cb-af86-20a0670592b9\") " 
pod="openshift-infra/auto-csr-approver-29553076-zgksb" Mar 10 23:16:00 crc kubenswrapper[4919]: I0310 23:16:00.480962 4919 scope.go:117] "RemoveContainer" containerID="db23846019f61e1fa4301ac7d8406453060a07c19e1dddc6fc1714cf596701e8" Mar 10 23:16:00 crc kubenswrapper[4919]: E0310 23:16:00.481533 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" Mar 10 23:16:00 crc kubenswrapper[4919]: I0310 23:16:00.504433 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553076-zgksb" Mar 10 23:16:00 crc kubenswrapper[4919]: W0310 23:16:00.837878 4919 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1b20f686_1c1a_43cb_af86_20a0670592b9.slice/crio-d6d180041d268af49979ca279922055d6fa01b851964642e692aedf414a3ccbf WatchSource:0}: Error finding container d6d180041d268af49979ca279922055d6fa01b851964642e692aedf414a3ccbf: Status 404 returned error can't find the container with id d6d180041d268af49979ca279922055d6fa01b851964642e692aedf414a3ccbf Mar 10 23:16:00 crc kubenswrapper[4919]: I0310 23:16:00.840281 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553076-zgksb"] Mar 10 23:16:00 crc kubenswrapper[4919]: I0310 23:16:00.847116 4919 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 10 23:16:01 crc kubenswrapper[4919]: I0310 23:16:01.388785 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553076-zgksb" 
event={"ID":"1b20f686-1c1a-43cb-af86-20a0670592b9","Type":"ContainerStarted","Data":"d6d180041d268af49979ca279922055d6fa01b851964642e692aedf414a3ccbf"} Mar 10 23:16:02 crc kubenswrapper[4919]: I0310 23:16:02.423045 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553076-zgksb" event={"ID":"1b20f686-1c1a-43cb-af86-20a0670592b9","Type":"ContainerStarted","Data":"37ac229fb614b71e9a5fae14609fd5ae7ba4f80af9252601f9bdbfaa40d3b960"} Mar 10 23:16:02 crc kubenswrapper[4919]: I0310 23:16:02.453333 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29553076-zgksb" podStartSLOduration=1.390243654 podStartE2EDuration="2.453302012s" podCreationTimestamp="2026-03-10 23:16:00 +0000 UTC" firstStartedPulling="2026-03-10 23:16:00.846864348 +0000 UTC m=+5148.088744956" lastFinishedPulling="2026-03-10 23:16:01.909922706 +0000 UTC m=+5149.151803314" observedRunningTime="2026-03-10 23:16:02.446484366 +0000 UTC m=+5149.688365014" watchObservedRunningTime="2026-03-10 23:16:02.453302012 +0000 UTC m=+5149.695182660" Mar 10 23:16:03 crc kubenswrapper[4919]: I0310 23:16:03.432264 4919 generic.go:334] "Generic (PLEG): container finished" podID="1b20f686-1c1a-43cb-af86-20a0670592b9" containerID="37ac229fb614b71e9a5fae14609fd5ae7ba4f80af9252601f9bdbfaa40d3b960" exitCode=0 Mar 10 23:16:03 crc kubenswrapper[4919]: I0310 23:16:03.432438 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553076-zgksb" event={"ID":"1b20f686-1c1a-43cb-af86-20a0670592b9","Type":"ContainerDied","Data":"37ac229fb614b71e9a5fae14609fd5ae7ba4f80af9252601f9bdbfaa40d3b960"} Mar 10 23:16:04 crc kubenswrapper[4919]: I0310 23:16:04.735605 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553076-zgksb" Mar 10 23:16:04 crc kubenswrapper[4919]: I0310 23:16:04.907925 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-szfw5\" (UniqueName: \"kubernetes.io/projected/1b20f686-1c1a-43cb-af86-20a0670592b9-kube-api-access-szfw5\") pod \"1b20f686-1c1a-43cb-af86-20a0670592b9\" (UID: \"1b20f686-1c1a-43cb-af86-20a0670592b9\") " Mar 10 23:16:04 crc kubenswrapper[4919]: I0310 23:16:04.913730 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b20f686-1c1a-43cb-af86-20a0670592b9-kube-api-access-szfw5" (OuterVolumeSpecName: "kube-api-access-szfw5") pod "1b20f686-1c1a-43cb-af86-20a0670592b9" (UID: "1b20f686-1c1a-43cb-af86-20a0670592b9"). InnerVolumeSpecName "kube-api-access-szfw5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 23:16:05 crc kubenswrapper[4919]: I0310 23:16:05.012757 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-szfw5\" (UniqueName: \"kubernetes.io/projected/1b20f686-1c1a-43cb-af86-20a0670592b9-kube-api-access-szfw5\") on node \"crc\" DevicePath \"\"" Mar 10 23:16:05 crc kubenswrapper[4919]: I0310 23:16:05.452893 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553076-zgksb" event={"ID":"1b20f686-1c1a-43cb-af86-20a0670592b9","Type":"ContainerDied","Data":"d6d180041d268af49979ca279922055d6fa01b851964642e692aedf414a3ccbf"} Mar 10 23:16:05 crc kubenswrapper[4919]: I0310 23:16:05.452952 4919 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d6d180041d268af49979ca279922055d6fa01b851964642e692aedf414a3ccbf" Mar 10 23:16:05 crc kubenswrapper[4919]: I0310 23:16:05.453042 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553076-zgksb" Mar 10 23:16:05 crc kubenswrapper[4919]: I0310 23:16:05.512476 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553070-zsclc"] Mar 10 23:16:05 crc kubenswrapper[4919]: I0310 23:16:05.517825 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553070-zsclc"] Mar 10 23:16:07 crc kubenswrapper[4919]: I0310 23:16:07.492979 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9bb1dc8b-5874-454e-b465-9b65f7510343" path="/var/lib/kubelet/pods/9bb1dc8b-5874-454e-b465-9b65f7510343/volumes" Mar 10 23:16:11 crc kubenswrapper[4919]: I0310 23:16:11.509350 4919 generic.go:334] "Generic (PLEG): container finished" podID="2101a481-5a05-4fe5-ae97-d3dd73ee8153" containerID="347ee17b88b3fc6474fdc5493646e5663b3dd8175e6e4cc4d571afbda0b5d3ff" exitCode=0 Mar 10 23:16:11 crc kubenswrapper[4919]: I0310 23:16:11.509459 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"2101a481-5a05-4fe5-ae97-d3dd73ee8153","Type":"ContainerDied","Data":"347ee17b88b3fc6474fdc5493646e5663b3dd8175e6e4cc4d571afbda0b5d3ff"} Mar 10 23:16:11 crc kubenswrapper[4919]: I0310 23:16:11.512275 4919 generic.go:334] "Generic (PLEG): container finished" podID="7a0cd471-ab8f-4de8-bd7f-0392d7d7f903" containerID="d8a021704141015c4435371209f6ad49de26690b72ce7965a5f825ec79fb4028" exitCode=0 Mar 10 23:16:11 crc kubenswrapper[4919]: I0310 23:16:11.512314 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"7a0cd471-ab8f-4de8-bd7f-0392d7d7f903","Type":"ContainerDied","Data":"d8a021704141015c4435371209f6ad49de26690b72ce7965a5f825ec79fb4028"} Mar 10 23:16:12 crc kubenswrapper[4919]: I0310 23:16:12.480260 4919 scope.go:117] "RemoveContainer" containerID="db23846019f61e1fa4301ac7d8406453060a07c19e1dddc6fc1714cf596701e8" Mar 10 23:16:12 crc 
kubenswrapper[4919]: E0310 23:16:12.480795 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" Mar 10 23:16:12 crc kubenswrapper[4919]: I0310 23:16:12.523160 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"2101a481-5a05-4fe5-ae97-d3dd73ee8153","Type":"ContainerStarted","Data":"e105d500e3224ed6b88b3756e267b97b860c531b68526ddd755098eaa3ab2ebe"} Mar 10 23:16:12 crc kubenswrapper[4919]: I0310 23:16:12.523484 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 10 23:16:12 crc kubenswrapper[4919]: I0310 23:16:12.525886 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"7a0cd471-ab8f-4de8-bd7f-0392d7d7f903","Type":"ContainerStarted","Data":"5ef08409bf236c0c0be23645d749648f6475da254ddaba39268929e3987ced02"} Mar 10 23:16:12 crc kubenswrapper[4919]: I0310 23:16:12.526163 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 10 23:16:12 crc kubenswrapper[4919]: I0310 23:16:12.549387 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=37.549358214 podStartE2EDuration="37.549358214s" podCreationTimestamp="2026-03-10 23:15:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 23:16:12.549159879 +0000 UTC m=+5159.791040507" watchObservedRunningTime="2026-03-10 23:16:12.549358214 +0000 UTC m=+5159.791238832" Mar 10 
23:16:23 crc kubenswrapper[4919]: I0310 23:16:23.485184 4919 scope.go:117] "RemoveContainer" containerID="db23846019f61e1fa4301ac7d8406453060a07c19e1dddc6fc1714cf596701e8" Mar 10 23:16:23 crc kubenswrapper[4919]: E0310 23:16:23.486430 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" Mar 10 23:16:25 crc kubenswrapper[4919]: I0310 23:16:25.534835 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Mar 10 23:16:25 crc kubenswrapper[4919]: I0310 23:16:25.568690 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=50.568670763 podStartE2EDuration="50.568670763s" podCreationTimestamp="2026-03-10 23:15:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 23:16:12.579091054 +0000 UTC m=+5159.820971672" watchObservedRunningTime="2026-03-10 23:16:25.568670763 +0000 UTC m=+5172.810551381" Mar 10 23:16:25 crc kubenswrapper[4919]: I0310 23:16:25.917553 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Mar 10 23:16:30 crc kubenswrapper[4919]: I0310 23:16:30.405660 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client"] Mar 10 23:16:30 crc kubenswrapper[4919]: E0310 23:16:30.407031 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b20f686-1c1a-43cb-af86-20a0670592b9" containerName="oc" Mar 10 23:16:30 crc kubenswrapper[4919]: I0310 23:16:30.407055 4919 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="1b20f686-1c1a-43cb-af86-20a0670592b9" containerName="oc" Mar 10 23:16:30 crc kubenswrapper[4919]: I0310 23:16:30.407307 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b20f686-1c1a-43cb-af86-20a0670592b9" containerName="oc" Mar 10 23:16:30 crc kubenswrapper[4919]: I0310 23:16:30.408099 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Mar 10 23:16:30 crc kubenswrapper[4919]: I0310 23:16:30.411435 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-dx7r9" Mar 10 23:16:30 crc kubenswrapper[4919]: I0310 23:16:30.419497 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Mar 10 23:16:30 crc kubenswrapper[4919]: I0310 23:16:30.553175 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lmhw2\" (UniqueName: \"kubernetes.io/projected/f741de46-11cd-4282-8bd4-61b098a0ef7f-kube-api-access-lmhw2\") pod \"mariadb-client\" (UID: \"f741de46-11cd-4282-8bd4-61b098a0ef7f\") " pod="openstack/mariadb-client" Mar 10 23:16:30 crc kubenswrapper[4919]: I0310 23:16:30.655014 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lmhw2\" (UniqueName: \"kubernetes.io/projected/f741de46-11cd-4282-8bd4-61b098a0ef7f-kube-api-access-lmhw2\") pod \"mariadb-client\" (UID: \"f741de46-11cd-4282-8bd4-61b098a0ef7f\") " pod="openstack/mariadb-client" Mar 10 23:16:30 crc kubenswrapper[4919]: I0310 23:16:30.684361 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lmhw2\" (UniqueName: \"kubernetes.io/projected/f741de46-11cd-4282-8bd4-61b098a0ef7f-kube-api-access-lmhw2\") pod \"mariadb-client\" (UID: \"f741de46-11cd-4282-8bd4-61b098a0ef7f\") " pod="openstack/mariadb-client" Mar 10 23:16:30 crc kubenswrapper[4919]: I0310 23:16:30.768515 4919 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Mar 10 23:16:31 crc kubenswrapper[4919]: W0310 23:16:31.119733 4919 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf741de46_11cd_4282_8bd4_61b098a0ef7f.slice/crio-0e6684b575939ab7d9cd6618f4d9b95480e73fe21a77c05ba6cc4164eddb953d WatchSource:0}: Error finding container 0e6684b575939ab7d9cd6618f4d9b95480e73fe21a77c05ba6cc4164eddb953d: Status 404 returned error can't find the container with id 0e6684b575939ab7d9cd6618f4d9b95480e73fe21a77c05ba6cc4164eddb953d Mar 10 23:16:31 crc kubenswrapper[4919]: I0310 23:16:31.121675 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Mar 10 23:16:31 crc kubenswrapper[4919]: I0310 23:16:31.681472 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"f741de46-11cd-4282-8bd4-61b098a0ef7f","Type":"ContainerStarted","Data":"0e6684b575939ab7d9cd6618f4d9b95480e73fe21a77c05ba6cc4164eddb953d"} Mar 10 23:16:37 crc kubenswrapper[4919]: I0310 23:16:37.480153 4919 scope.go:117] "RemoveContainer" containerID="db23846019f61e1fa4301ac7d8406453060a07c19e1dddc6fc1714cf596701e8" Mar 10 23:16:37 crc kubenswrapper[4919]: E0310 23:16:37.481212 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" Mar 10 23:16:37 crc kubenswrapper[4919]: I0310 23:16:37.725272 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" 
event={"ID":"f741de46-11cd-4282-8bd4-61b098a0ef7f","Type":"ContainerStarted","Data":"24cc1ca4ed26031c17fa227aee3919cb6f8a87a82bbebc33dd91095548e7f9d4"} Mar 10 23:16:37 crc kubenswrapper[4919]: I0310 23:16:37.745719 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mariadb-client" podStartSLOduration=1.591742278 podStartE2EDuration="7.74570291s" podCreationTimestamp="2026-03-10 23:16:30 +0000 UTC" firstStartedPulling="2026-03-10 23:16:31.122219984 +0000 UTC m=+5178.364100602" lastFinishedPulling="2026-03-10 23:16:37.276180626 +0000 UTC m=+5184.518061234" observedRunningTime="2026-03-10 23:16:37.740672654 +0000 UTC m=+5184.982553332" watchObservedRunningTime="2026-03-10 23:16:37.74570291 +0000 UTC m=+5184.987583518" Mar 10 23:16:46 crc kubenswrapper[4919]: I0310 23:16:46.841417 4919 scope.go:117] "RemoveContainer" containerID="8916826393f6f2a4e50449a00083bea961de7dafcc8cdd0c59556731ca7a7f5b" Mar 10 23:16:49 crc kubenswrapper[4919]: I0310 23:16:49.831063 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"] Mar 10 23:16:49 crc kubenswrapper[4919]: I0310 23:16:49.831865 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/mariadb-client" podUID="f741de46-11cd-4282-8bd4-61b098a0ef7f" containerName="mariadb-client" containerID="cri-o://24cc1ca4ed26031c17fa227aee3919cb6f8a87a82bbebc33dd91095548e7f9d4" gracePeriod=30 Mar 10 23:16:50 crc kubenswrapper[4919]: I0310 23:16:50.403834 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Mar 10 23:16:50 crc kubenswrapper[4919]: I0310 23:16:50.479697 4919 scope.go:117] "RemoveContainer" containerID="db23846019f61e1fa4301ac7d8406453060a07c19e1dddc6fc1714cf596701e8" Mar 10 23:16:50 crc kubenswrapper[4919]: E0310 23:16:50.480227 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" Mar 10 23:16:50 crc kubenswrapper[4919]: I0310 23:16:50.591924 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lmhw2\" (UniqueName: \"kubernetes.io/projected/f741de46-11cd-4282-8bd4-61b098a0ef7f-kube-api-access-lmhw2\") pod \"f741de46-11cd-4282-8bd4-61b098a0ef7f\" (UID: \"f741de46-11cd-4282-8bd4-61b098a0ef7f\") " Mar 10 23:16:50 crc kubenswrapper[4919]: I0310 23:16:50.602880 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f741de46-11cd-4282-8bd4-61b098a0ef7f-kube-api-access-lmhw2" (OuterVolumeSpecName: "kube-api-access-lmhw2") pod "f741de46-11cd-4282-8bd4-61b098a0ef7f" (UID: "f741de46-11cd-4282-8bd4-61b098a0ef7f"). InnerVolumeSpecName "kube-api-access-lmhw2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 23:16:50 crc kubenswrapper[4919]: I0310 23:16:50.696386 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lmhw2\" (UniqueName: \"kubernetes.io/projected/f741de46-11cd-4282-8bd4-61b098a0ef7f-kube-api-access-lmhw2\") on node \"crc\" DevicePath \"\"" Mar 10 23:16:50 crc kubenswrapper[4919]: I0310 23:16:50.830848 4919 generic.go:334] "Generic (PLEG): container finished" podID="f741de46-11cd-4282-8bd4-61b098a0ef7f" containerID="24cc1ca4ed26031c17fa227aee3919cb6f8a87a82bbebc33dd91095548e7f9d4" exitCode=143 Mar 10 23:16:50 crc kubenswrapper[4919]: I0310 23:16:50.830897 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Mar 10 23:16:50 crc kubenswrapper[4919]: I0310 23:16:50.830927 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"f741de46-11cd-4282-8bd4-61b098a0ef7f","Type":"ContainerDied","Data":"24cc1ca4ed26031c17fa227aee3919cb6f8a87a82bbebc33dd91095548e7f9d4"} Mar 10 23:16:50 crc kubenswrapper[4919]: I0310 23:16:50.830999 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"f741de46-11cd-4282-8bd4-61b098a0ef7f","Type":"ContainerDied","Data":"0e6684b575939ab7d9cd6618f4d9b95480e73fe21a77c05ba6cc4164eddb953d"} Mar 10 23:16:50 crc kubenswrapper[4919]: I0310 23:16:50.831042 4919 scope.go:117] "RemoveContainer" containerID="24cc1ca4ed26031c17fa227aee3919cb6f8a87a82bbebc33dd91095548e7f9d4" Mar 10 23:16:50 crc kubenswrapper[4919]: I0310 23:16:50.872031 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"] Mar 10 23:16:50 crc kubenswrapper[4919]: I0310 23:16:50.874493 4919 scope.go:117] "RemoveContainer" containerID="24cc1ca4ed26031c17fa227aee3919cb6f8a87a82bbebc33dd91095548e7f9d4" Mar 10 23:16:50 crc kubenswrapper[4919]: E0310 23:16:50.876648 4919 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"24cc1ca4ed26031c17fa227aee3919cb6f8a87a82bbebc33dd91095548e7f9d4\": container with ID starting with 24cc1ca4ed26031c17fa227aee3919cb6f8a87a82bbebc33dd91095548e7f9d4 not found: ID does not exist" containerID="24cc1ca4ed26031c17fa227aee3919cb6f8a87a82bbebc33dd91095548e7f9d4" Mar 10 23:16:50 crc kubenswrapper[4919]: I0310 23:16:50.876713 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24cc1ca4ed26031c17fa227aee3919cb6f8a87a82bbebc33dd91095548e7f9d4"} err="failed to get container status \"24cc1ca4ed26031c17fa227aee3919cb6f8a87a82bbebc33dd91095548e7f9d4\": rpc error: code = NotFound desc = could not find container \"24cc1ca4ed26031c17fa227aee3919cb6f8a87a82bbebc33dd91095548e7f9d4\": container with ID starting with 24cc1ca4ed26031c17fa227aee3919cb6f8a87a82bbebc33dd91095548e7f9d4 not found: ID does not exist" Mar 10 23:16:50 crc kubenswrapper[4919]: I0310 23:16:50.884056 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client"] Mar 10 23:16:51 crc kubenswrapper[4919]: I0310 23:16:51.496139 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f741de46-11cd-4282-8bd4-61b098a0ef7f" path="/var/lib/kubelet/pods/f741de46-11cd-4282-8bd4-61b098a0ef7f/volumes" Mar 10 23:17:02 crc kubenswrapper[4919]: I0310 23:17:02.480763 4919 scope.go:117] "RemoveContainer" containerID="db23846019f61e1fa4301ac7d8406453060a07c19e1dddc6fc1714cf596701e8" Mar 10 23:17:02 crc kubenswrapper[4919]: E0310 23:17:02.481386 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" 
podUID="566678d1-f416-4116-ab20-b30dceb86cdc" Mar 10 23:17:08 crc kubenswrapper[4919]: I0310 23:17:08.439434 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-jn8rw"] Mar 10 23:17:08 crc kubenswrapper[4919]: E0310 23:17:08.440628 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f741de46-11cd-4282-8bd4-61b098a0ef7f" containerName="mariadb-client" Mar 10 23:17:08 crc kubenswrapper[4919]: I0310 23:17:08.440648 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="f741de46-11cd-4282-8bd4-61b098a0ef7f" containerName="mariadb-client" Mar 10 23:17:08 crc kubenswrapper[4919]: I0310 23:17:08.440827 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="f741de46-11cd-4282-8bd4-61b098a0ef7f" containerName="mariadb-client" Mar 10 23:17:08 crc kubenswrapper[4919]: I0310 23:17:08.442432 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jn8rw" Mar 10 23:17:08 crc kubenswrapper[4919]: I0310 23:17:08.453366 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jn8rw"] Mar 10 23:17:08 crc kubenswrapper[4919]: I0310 23:17:08.527684 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8afc2e79-4dd9-4124-8cb4-aec6cb675a00-catalog-content\") pod \"redhat-operators-jn8rw\" (UID: \"8afc2e79-4dd9-4124-8cb4-aec6cb675a00\") " pod="openshift-marketplace/redhat-operators-jn8rw" Mar 10 23:17:08 crc kubenswrapper[4919]: I0310 23:17:08.527844 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8afc2e79-4dd9-4124-8cb4-aec6cb675a00-utilities\") pod \"redhat-operators-jn8rw\" (UID: \"8afc2e79-4dd9-4124-8cb4-aec6cb675a00\") " pod="openshift-marketplace/redhat-operators-jn8rw" Mar 10 23:17:08 crc 
kubenswrapper[4919]: I0310 23:17:08.527974 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rhg8h\" (UniqueName: \"kubernetes.io/projected/8afc2e79-4dd9-4124-8cb4-aec6cb675a00-kube-api-access-rhg8h\") pod \"redhat-operators-jn8rw\" (UID: \"8afc2e79-4dd9-4124-8cb4-aec6cb675a00\") " pod="openshift-marketplace/redhat-operators-jn8rw" Mar 10 23:17:08 crc kubenswrapper[4919]: I0310 23:17:08.628986 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8afc2e79-4dd9-4124-8cb4-aec6cb675a00-utilities\") pod \"redhat-operators-jn8rw\" (UID: \"8afc2e79-4dd9-4124-8cb4-aec6cb675a00\") " pod="openshift-marketplace/redhat-operators-jn8rw" Mar 10 23:17:08 crc kubenswrapper[4919]: I0310 23:17:08.629093 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rhg8h\" (UniqueName: \"kubernetes.io/projected/8afc2e79-4dd9-4124-8cb4-aec6cb675a00-kube-api-access-rhg8h\") pod \"redhat-operators-jn8rw\" (UID: \"8afc2e79-4dd9-4124-8cb4-aec6cb675a00\") " pod="openshift-marketplace/redhat-operators-jn8rw" Mar 10 23:17:08 crc kubenswrapper[4919]: I0310 23:17:08.629183 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8afc2e79-4dd9-4124-8cb4-aec6cb675a00-catalog-content\") pod \"redhat-operators-jn8rw\" (UID: \"8afc2e79-4dd9-4124-8cb4-aec6cb675a00\") " pod="openshift-marketplace/redhat-operators-jn8rw" Mar 10 23:17:08 crc kubenswrapper[4919]: I0310 23:17:08.629836 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8afc2e79-4dd9-4124-8cb4-aec6cb675a00-utilities\") pod \"redhat-operators-jn8rw\" (UID: \"8afc2e79-4dd9-4124-8cb4-aec6cb675a00\") " pod="openshift-marketplace/redhat-operators-jn8rw" Mar 10 23:17:08 crc kubenswrapper[4919]: I0310 
23:17:08.629915 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8afc2e79-4dd9-4124-8cb4-aec6cb675a00-catalog-content\") pod \"redhat-operators-jn8rw\" (UID: \"8afc2e79-4dd9-4124-8cb4-aec6cb675a00\") " pod="openshift-marketplace/redhat-operators-jn8rw" Mar 10 23:17:08 crc kubenswrapper[4919]: I0310 23:17:08.648428 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rhg8h\" (UniqueName: \"kubernetes.io/projected/8afc2e79-4dd9-4124-8cb4-aec6cb675a00-kube-api-access-rhg8h\") pod \"redhat-operators-jn8rw\" (UID: \"8afc2e79-4dd9-4124-8cb4-aec6cb675a00\") " pod="openshift-marketplace/redhat-operators-jn8rw" Mar 10 23:17:08 crc kubenswrapper[4919]: I0310 23:17:08.774709 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jn8rw" Mar 10 23:17:09 crc kubenswrapper[4919]: I0310 23:17:09.225343 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jn8rw"] Mar 10 23:17:10 crc kubenswrapper[4919]: I0310 23:17:10.009131 4919 generic.go:334] "Generic (PLEG): container finished" podID="8afc2e79-4dd9-4124-8cb4-aec6cb675a00" containerID="77bd2658506cd8b67de048f46f06fa07eaa7ec3b3ead1328ccc6a8538e98117c" exitCode=0 Mar 10 23:17:10 crc kubenswrapper[4919]: I0310 23:17:10.009469 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jn8rw" event={"ID":"8afc2e79-4dd9-4124-8cb4-aec6cb675a00","Type":"ContainerDied","Data":"77bd2658506cd8b67de048f46f06fa07eaa7ec3b3ead1328ccc6a8538e98117c"} Mar 10 23:17:10 crc kubenswrapper[4919]: I0310 23:17:10.009708 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jn8rw" event={"ID":"8afc2e79-4dd9-4124-8cb4-aec6cb675a00","Type":"ContainerStarted","Data":"8ca5ca9b04e0132a6555fb13daa7b7bf873593429a268fef03e36bfe2fe22e5a"} Mar 10 
23:17:11 crc kubenswrapper[4919]: I0310 23:17:11.018542 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jn8rw" event={"ID":"8afc2e79-4dd9-4124-8cb4-aec6cb675a00","Type":"ContainerStarted","Data":"a253656977bd6c91dc1b4e20fcdc4be5d801246dcf41ddbbd6d02e91f752d62d"} Mar 10 23:17:12 crc kubenswrapper[4919]: I0310 23:17:12.029264 4919 generic.go:334] "Generic (PLEG): container finished" podID="8afc2e79-4dd9-4124-8cb4-aec6cb675a00" containerID="a253656977bd6c91dc1b4e20fcdc4be5d801246dcf41ddbbd6d02e91f752d62d" exitCode=0 Mar 10 23:17:12 crc kubenswrapper[4919]: I0310 23:17:12.029344 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jn8rw" event={"ID":"8afc2e79-4dd9-4124-8cb4-aec6cb675a00","Type":"ContainerDied","Data":"a253656977bd6c91dc1b4e20fcdc4be5d801246dcf41ddbbd6d02e91f752d62d"} Mar 10 23:17:13 crc kubenswrapper[4919]: I0310 23:17:13.039313 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jn8rw" event={"ID":"8afc2e79-4dd9-4124-8cb4-aec6cb675a00","Type":"ContainerStarted","Data":"f0ad44113eb43ca4690df6ea75ac35ac11c115d381f103df25b69e75409cf8fb"} Mar 10 23:17:13 crc kubenswrapper[4919]: I0310 23:17:13.069464 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-jn8rw" podStartSLOduration=2.431811799 podStartE2EDuration="5.069439073s" podCreationTimestamp="2026-03-10 23:17:08 +0000 UTC" firstStartedPulling="2026-03-10 23:17:10.014998261 +0000 UTC m=+5217.256878879" lastFinishedPulling="2026-03-10 23:17:12.652625535 +0000 UTC m=+5219.894506153" observedRunningTime="2026-03-10 23:17:13.06164121 +0000 UTC m=+5220.303521818" watchObservedRunningTime="2026-03-10 23:17:13.069439073 +0000 UTC m=+5220.311319711" Mar 10 23:17:16 crc kubenswrapper[4919]: I0310 23:17:16.480336 4919 scope.go:117] "RemoveContainer" 
containerID="db23846019f61e1fa4301ac7d8406453060a07c19e1dddc6fc1714cf596701e8" Mar 10 23:17:16 crc kubenswrapper[4919]: E0310 23:17:16.480950 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" Mar 10 23:17:18 crc kubenswrapper[4919]: I0310 23:17:18.776254 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-jn8rw" Mar 10 23:17:18 crc kubenswrapper[4919]: I0310 23:17:18.776676 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-jn8rw" Mar 10 23:17:19 crc kubenswrapper[4919]: I0310 23:17:19.835558 4919 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-jn8rw" podUID="8afc2e79-4dd9-4124-8cb4-aec6cb675a00" containerName="registry-server" probeResult="failure" output=< Mar 10 23:17:19 crc kubenswrapper[4919]: timeout: failed to connect service ":50051" within 1s Mar 10 23:17:19 crc kubenswrapper[4919]: > Mar 10 23:17:27 crc kubenswrapper[4919]: I0310 23:17:27.479921 4919 scope.go:117] "RemoveContainer" containerID="db23846019f61e1fa4301ac7d8406453060a07c19e1dddc6fc1714cf596701e8" Mar 10 23:17:27 crc kubenswrapper[4919]: E0310 23:17:27.484477 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" 
podUID="566678d1-f416-4116-ab20-b30dceb86cdc" Mar 10 23:17:28 crc kubenswrapper[4919]: I0310 23:17:28.862964 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-jn8rw" Mar 10 23:17:28 crc kubenswrapper[4919]: I0310 23:17:28.931612 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-jn8rw" Mar 10 23:17:29 crc kubenswrapper[4919]: I0310 23:17:29.108591 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jn8rw"] Mar 10 23:17:30 crc kubenswrapper[4919]: I0310 23:17:30.183918 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-jn8rw" podUID="8afc2e79-4dd9-4124-8cb4-aec6cb675a00" containerName="registry-server" containerID="cri-o://f0ad44113eb43ca4690df6ea75ac35ac11c115d381f103df25b69e75409cf8fb" gracePeriod=2 Mar 10 23:17:30 crc kubenswrapper[4919]: I0310 23:17:30.636181 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jn8rw" Mar 10 23:17:30 crc kubenswrapper[4919]: I0310 23:17:30.781227 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8afc2e79-4dd9-4124-8cb4-aec6cb675a00-catalog-content\") pod \"8afc2e79-4dd9-4124-8cb4-aec6cb675a00\" (UID: \"8afc2e79-4dd9-4124-8cb4-aec6cb675a00\") " Mar 10 23:17:30 crc kubenswrapper[4919]: I0310 23:17:30.781371 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rhg8h\" (UniqueName: \"kubernetes.io/projected/8afc2e79-4dd9-4124-8cb4-aec6cb675a00-kube-api-access-rhg8h\") pod \"8afc2e79-4dd9-4124-8cb4-aec6cb675a00\" (UID: \"8afc2e79-4dd9-4124-8cb4-aec6cb675a00\") " Mar 10 23:17:30 crc kubenswrapper[4919]: I0310 23:17:30.781434 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8afc2e79-4dd9-4124-8cb4-aec6cb675a00-utilities\") pod \"8afc2e79-4dd9-4124-8cb4-aec6cb675a00\" (UID: \"8afc2e79-4dd9-4124-8cb4-aec6cb675a00\") " Mar 10 23:17:30 crc kubenswrapper[4919]: I0310 23:17:30.782345 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8afc2e79-4dd9-4124-8cb4-aec6cb675a00-utilities" (OuterVolumeSpecName: "utilities") pod "8afc2e79-4dd9-4124-8cb4-aec6cb675a00" (UID: "8afc2e79-4dd9-4124-8cb4-aec6cb675a00"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 23:17:30 crc kubenswrapper[4919]: I0310 23:17:30.789099 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8afc2e79-4dd9-4124-8cb4-aec6cb675a00-kube-api-access-rhg8h" (OuterVolumeSpecName: "kube-api-access-rhg8h") pod "8afc2e79-4dd9-4124-8cb4-aec6cb675a00" (UID: "8afc2e79-4dd9-4124-8cb4-aec6cb675a00"). InnerVolumeSpecName "kube-api-access-rhg8h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 23:17:30 crc kubenswrapper[4919]: I0310 23:17:30.883747 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rhg8h\" (UniqueName: \"kubernetes.io/projected/8afc2e79-4dd9-4124-8cb4-aec6cb675a00-kube-api-access-rhg8h\") on node \"crc\" DevicePath \"\"" Mar 10 23:17:30 crc kubenswrapper[4919]: I0310 23:17:30.883785 4919 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8afc2e79-4dd9-4124-8cb4-aec6cb675a00-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 23:17:30 crc kubenswrapper[4919]: I0310 23:17:30.923981 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8afc2e79-4dd9-4124-8cb4-aec6cb675a00-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8afc2e79-4dd9-4124-8cb4-aec6cb675a00" (UID: "8afc2e79-4dd9-4124-8cb4-aec6cb675a00"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 23:17:30 crc kubenswrapper[4919]: I0310 23:17:30.985546 4919 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8afc2e79-4dd9-4124-8cb4-aec6cb675a00-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 23:17:31 crc kubenswrapper[4919]: I0310 23:17:31.192915 4919 generic.go:334] "Generic (PLEG): container finished" podID="8afc2e79-4dd9-4124-8cb4-aec6cb675a00" containerID="f0ad44113eb43ca4690df6ea75ac35ac11c115d381f103df25b69e75409cf8fb" exitCode=0 Mar 10 23:17:31 crc kubenswrapper[4919]: I0310 23:17:31.192973 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jn8rw" event={"ID":"8afc2e79-4dd9-4124-8cb4-aec6cb675a00","Type":"ContainerDied","Data":"f0ad44113eb43ca4690df6ea75ac35ac11c115d381f103df25b69e75409cf8fb"} Mar 10 23:17:31 crc kubenswrapper[4919]: I0310 23:17:31.193005 4919 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jn8rw" Mar 10 23:17:31 crc kubenswrapper[4919]: I0310 23:17:31.193039 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jn8rw" event={"ID":"8afc2e79-4dd9-4124-8cb4-aec6cb675a00","Type":"ContainerDied","Data":"8ca5ca9b04e0132a6555fb13daa7b7bf873593429a268fef03e36bfe2fe22e5a"} Mar 10 23:17:31 crc kubenswrapper[4919]: I0310 23:17:31.193099 4919 scope.go:117] "RemoveContainer" containerID="f0ad44113eb43ca4690df6ea75ac35ac11c115d381f103df25b69e75409cf8fb" Mar 10 23:17:31 crc kubenswrapper[4919]: I0310 23:17:31.224070 4919 scope.go:117] "RemoveContainer" containerID="a253656977bd6c91dc1b4e20fcdc4be5d801246dcf41ddbbd6d02e91f752d62d" Mar 10 23:17:31 crc kubenswrapper[4919]: I0310 23:17:31.230081 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jn8rw"] Mar 10 23:17:31 crc kubenswrapper[4919]: I0310 23:17:31.239321 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-jn8rw"] Mar 10 23:17:31 crc kubenswrapper[4919]: I0310 23:17:31.247602 4919 scope.go:117] "RemoveContainer" containerID="77bd2658506cd8b67de048f46f06fa07eaa7ec3b3ead1328ccc6a8538e98117c" Mar 10 23:17:31 crc kubenswrapper[4919]: I0310 23:17:31.299769 4919 scope.go:117] "RemoveContainer" containerID="f0ad44113eb43ca4690df6ea75ac35ac11c115d381f103df25b69e75409cf8fb" Mar 10 23:17:31 crc kubenswrapper[4919]: E0310 23:17:31.300711 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f0ad44113eb43ca4690df6ea75ac35ac11c115d381f103df25b69e75409cf8fb\": container with ID starting with f0ad44113eb43ca4690df6ea75ac35ac11c115d381f103df25b69e75409cf8fb not found: ID does not exist" containerID="f0ad44113eb43ca4690df6ea75ac35ac11c115d381f103df25b69e75409cf8fb" Mar 10 23:17:31 crc kubenswrapper[4919]: I0310 23:17:31.300752 4919 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0ad44113eb43ca4690df6ea75ac35ac11c115d381f103df25b69e75409cf8fb"} err="failed to get container status \"f0ad44113eb43ca4690df6ea75ac35ac11c115d381f103df25b69e75409cf8fb\": rpc error: code = NotFound desc = could not find container \"f0ad44113eb43ca4690df6ea75ac35ac11c115d381f103df25b69e75409cf8fb\": container with ID starting with f0ad44113eb43ca4690df6ea75ac35ac11c115d381f103df25b69e75409cf8fb not found: ID does not exist" Mar 10 23:17:31 crc kubenswrapper[4919]: I0310 23:17:31.300783 4919 scope.go:117] "RemoveContainer" containerID="a253656977bd6c91dc1b4e20fcdc4be5d801246dcf41ddbbd6d02e91f752d62d" Mar 10 23:17:31 crc kubenswrapper[4919]: E0310 23:17:31.301479 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a253656977bd6c91dc1b4e20fcdc4be5d801246dcf41ddbbd6d02e91f752d62d\": container with ID starting with a253656977bd6c91dc1b4e20fcdc4be5d801246dcf41ddbbd6d02e91f752d62d not found: ID does not exist" containerID="a253656977bd6c91dc1b4e20fcdc4be5d801246dcf41ddbbd6d02e91f752d62d" Mar 10 23:17:31 crc kubenswrapper[4919]: I0310 23:17:31.301513 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a253656977bd6c91dc1b4e20fcdc4be5d801246dcf41ddbbd6d02e91f752d62d"} err="failed to get container status \"a253656977bd6c91dc1b4e20fcdc4be5d801246dcf41ddbbd6d02e91f752d62d\": rpc error: code = NotFound desc = could not find container \"a253656977bd6c91dc1b4e20fcdc4be5d801246dcf41ddbbd6d02e91f752d62d\": container with ID starting with a253656977bd6c91dc1b4e20fcdc4be5d801246dcf41ddbbd6d02e91f752d62d not found: ID does not exist" Mar 10 23:17:31 crc kubenswrapper[4919]: I0310 23:17:31.301537 4919 scope.go:117] "RemoveContainer" containerID="77bd2658506cd8b67de048f46f06fa07eaa7ec3b3ead1328ccc6a8538e98117c" Mar 10 23:17:31 crc kubenswrapper[4919]: E0310 
23:17:31.301973 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"77bd2658506cd8b67de048f46f06fa07eaa7ec3b3ead1328ccc6a8538e98117c\": container with ID starting with 77bd2658506cd8b67de048f46f06fa07eaa7ec3b3ead1328ccc6a8538e98117c not found: ID does not exist" containerID="77bd2658506cd8b67de048f46f06fa07eaa7ec3b3ead1328ccc6a8538e98117c" Mar 10 23:17:31 crc kubenswrapper[4919]: I0310 23:17:31.302000 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77bd2658506cd8b67de048f46f06fa07eaa7ec3b3ead1328ccc6a8538e98117c"} err="failed to get container status \"77bd2658506cd8b67de048f46f06fa07eaa7ec3b3ead1328ccc6a8538e98117c\": rpc error: code = NotFound desc = could not find container \"77bd2658506cd8b67de048f46f06fa07eaa7ec3b3ead1328ccc6a8538e98117c\": container with ID starting with 77bd2658506cd8b67de048f46f06fa07eaa7ec3b3ead1328ccc6a8538e98117c not found: ID does not exist" Mar 10 23:17:31 crc kubenswrapper[4919]: I0310 23:17:31.490883 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8afc2e79-4dd9-4124-8cb4-aec6cb675a00" path="/var/lib/kubelet/pods/8afc2e79-4dd9-4124-8cb4-aec6cb675a00/volumes" Mar 10 23:17:42 crc kubenswrapper[4919]: I0310 23:17:42.480017 4919 scope.go:117] "RemoveContainer" containerID="db23846019f61e1fa4301ac7d8406453060a07c19e1dddc6fc1714cf596701e8" Mar 10 23:17:43 crc kubenswrapper[4919]: I0310 23:17:43.303580 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" event={"ID":"566678d1-f416-4116-ab20-b30dceb86cdc","Type":"ContainerStarted","Data":"5ed85a234e315a1e8c0b68df55722ea097e0f8687391361bc1b4250e4cb84b0a"} Mar 10 23:18:00 crc kubenswrapper[4919]: I0310 23:18:00.147210 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553078-ndkn5"] Mar 10 23:18:00 crc kubenswrapper[4919]: E0310 
23:18:00.148497 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8afc2e79-4dd9-4124-8cb4-aec6cb675a00" containerName="extract-utilities" Mar 10 23:18:00 crc kubenswrapper[4919]: I0310 23:18:00.148524 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="8afc2e79-4dd9-4124-8cb4-aec6cb675a00" containerName="extract-utilities" Mar 10 23:18:00 crc kubenswrapper[4919]: E0310 23:18:00.148572 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8afc2e79-4dd9-4124-8cb4-aec6cb675a00" containerName="extract-content" Mar 10 23:18:00 crc kubenswrapper[4919]: I0310 23:18:00.148584 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="8afc2e79-4dd9-4124-8cb4-aec6cb675a00" containerName="extract-content" Mar 10 23:18:00 crc kubenswrapper[4919]: E0310 23:18:00.148606 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8afc2e79-4dd9-4124-8cb4-aec6cb675a00" containerName="registry-server" Mar 10 23:18:00 crc kubenswrapper[4919]: I0310 23:18:00.148618 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="8afc2e79-4dd9-4124-8cb4-aec6cb675a00" containerName="registry-server" Mar 10 23:18:00 crc kubenswrapper[4919]: I0310 23:18:00.148888 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="8afc2e79-4dd9-4124-8cb4-aec6cb675a00" containerName="registry-server" Mar 10 23:18:00 crc kubenswrapper[4919]: I0310 23:18:00.149774 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553078-ndkn5" Mar 10 23:18:00 crc kubenswrapper[4919]: I0310 23:18:00.153759 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 23:18:00 crc kubenswrapper[4919]: I0310 23:18:00.154602 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 23:18:00 crc kubenswrapper[4919]: I0310 23:18:00.155519 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wbvtv" Mar 10 23:18:00 crc kubenswrapper[4919]: I0310 23:18:00.163052 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553078-ndkn5"] Mar 10 23:18:00 crc kubenswrapper[4919]: I0310 23:18:00.301629 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t87c7\" (UniqueName: \"kubernetes.io/projected/ae93b879-4c13-4274-8d35-9ab108c2922d-kube-api-access-t87c7\") pod \"auto-csr-approver-29553078-ndkn5\" (UID: \"ae93b879-4c13-4274-8d35-9ab108c2922d\") " pod="openshift-infra/auto-csr-approver-29553078-ndkn5" Mar 10 23:18:00 crc kubenswrapper[4919]: I0310 23:18:00.403084 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t87c7\" (UniqueName: \"kubernetes.io/projected/ae93b879-4c13-4274-8d35-9ab108c2922d-kube-api-access-t87c7\") pod \"auto-csr-approver-29553078-ndkn5\" (UID: \"ae93b879-4c13-4274-8d35-9ab108c2922d\") " pod="openshift-infra/auto-csr-approver-29553078-ndkn5" Mar 10 23:18:00 crc kubenswrapper[4919]: I0310 23:18:00.426510 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t87c7\" (UniqueName: \"kubernetes.io/projected/ae93b879-4c13-4274-8d35-9ab108c2922d-kube-api-access-t87c7\") pod \"auto-csr-approver-29553078-ndkn5\" (UID: \"ae93b879-4c13-4274-8d35-9ab108c2922d\") " 
pod="openshift-infra/auto-csr-approver-29553078-ndkn5" Mar 10 23:18:00 crc kubenswrapper[4919]: I0310 23:18:00.481174 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553078-ndkn5" Mar 10 23:18:00 crc kubenswrapper[4919]: I0310 23:18:00.942547 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553078-ndkn5"] Mar 10 23:18:01 crc kubenswrapper[4919]: I0310 23:18:01.487981 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553078-ndkn5" event={"ID":"ae93b879-4c13-4274-8d35-9ab108c2922d","Type":"ContainerStarted","Data":"62afcf50e6e5087254dfe2fd56c5afefa9dc5e387593af9cf1872ad8c927cfdd"} Mar 10 23:18:03 crc kubenswrapper[4919]: I0310 23:18:03.498530 4919 generic.go:334] "Generic (PLEG): container finished" podID="ae93b879-4c13-4274-8d35-9ab108c2922d" containerID="e0cd0c839d2d67aaec8e7382039914e169ea2bd9742a049061016e85c02e01ed" exitCode=0 Mar 10 23:18:03 crc kubenswrapper[4919]: I0310 23:18:03.498642 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553078-ndkn5" event={"ID":"ae93b879-4c13-4274-8d35-9ab108c2922d","Type":"ContainerDied","Data":"e0cd0c839d2d67aaec8e7382039914e169ea2bd9742a049061016e85c02e01ed"} Mar 10 23:18:04 crc kubenswrapper[4919]: I0310 23:18:04.831660 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553078-ndkn5" Mar 10 23:18:04 crc kubenswrapper[4919]: I0310 23:18:04.976491 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t87c7\" (UniqueName: \"kubernetes.io/projected/ae93b879-4c13-4274-8d35-9ab108c2922d-kube-api-access-t87c7\") pod \"ae93b879-4c13-4274-8d35-9ab108c2922d\" (UID: \"ae93b879-4c13-4274-8d35-9ab108c2922d\") " Mar 10 23:18:04 crc kubenswrapper[4919]: I0310 23:18:04.983730 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae93b879-4c13-4274-8d35-9ab108c2922d-kube-api-access-t87c7" (OuterVolumeSpecName: "kube-api-access-t87c7") pod "ae93b879-4c13-4274-8d35-9ab108c2922d" (UID: "ae93b879-4c13-4274-8d35-9ab108c2922d"). InnerVolumeSpecName "kube-api-access-t87c7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 23:18:05 crc kubenswrapper[4919]: I0310 23:18:05.078538 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t87c7\" (UniqueName: \"kubernetes.io/projected/ae93b879-4c13-4274-8d35-9ab108c2922d-kube-api-access-t87c7\") on node \"crc\" DevicePath \"\"" Mar 10 23:18:05 crc kubenswrapper[4919]: I0310 23:18:05.517410 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553078-ndkn5" event={"ID":"ae93b879-4c13-4274-8d35-9ab108c2922d","Type":"ContainerDied","Data":"62afcf50e6e5087254dfe2fd56c5afefa9dc5e387593af9cf1872ad8c927cfdd"} Mar 10 23:18:05 crc kubenswrapper[4919]: I0310 23:18:05.517449 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553078-ndkn5" Mar 10 23:18:05 crc kubenswrapper[4919]: I0310 23:18:05.517457 4919 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="62afcf50e6e5087254dfe2fd56c5afefa9dc5e387593af9cf1872ad8c927cfdd" Mar 10 23:18:05 crc kubenswrapper[4919]: I0310 23:18:05.908663 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553072-zq5sf"] Mar 10 23:18:05 crc kubenswrapper[4919]: I0310 23:18:05.920873 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553072-zq5sf"] Mar 10 23:18:07 crc kubenswrapper[4919]: I0310 23:18:07.496859 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a67e1f39-f53e-4fdc-a7df-612e0f16da3d" path="/var/lib/kubelet/pods/a67e1f39-f53e-4fdc-a7df-612e0f16da3d/volumes" Mar 10 23:18:46 crc kubenswrapper[4919]: I0310 23:18:46.947746 4919 scope.go:117] "RemoveContainer" containerID="a60687c7d2ead121ac4cc1ccf2d34efe00ca55b38145c8900e1ae0d4742a56ce" Mar 10 23:18:46 crc kubenswrapper[4919]: I0310 23:18:46.985121 4919 scope.go:117] "RemoveContainer" containerID="66e3d69b634938b442f517d1b95772cfc4d8e1093abee852a01f740f4406a872" Mar 10 23:19:59 crc kubenswrapper[4919]: I0310 23:19:59.175524 4919 patch_prober.go:28] interesting pod/machine-config-daemon-z7v4t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 23:19:59 crc kubenswrapper[4919]: I0310 23:19:59.176271 4919 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Mar 10 23:20:00 crc kubenswrapper[4919]: I0310 23:20:00.159828 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553080-5z76m"] Mar 10 23:20:00 crc kubenswrapper[4919]: E0310 23:20:00.160363 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae93b879-4c13-4274-8d35-9ab108c2922d" containerName="oc" Mar 10 23:20:00 crc kubenswrapper[4919]: I0310 23:20:00.160381 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae93b879-4c13-4274-8d35-9ab108c2922d" containerName="oc" Mar 10 23:20:00 crc kubenswrapper[4919]: I0310 23:20:00.160636 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae93b879-4c13-4274-8d35-9ab108c2922d" containerName="oc" Mar 10 23:20:00 crc kubenswrapper[4919]: I0310 23:20:00.161260 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553080-5z76m" Mar 10 23:20:00 crc kubenswrapper[4919]: I0310 23:20:00.164238 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 23:20:00 crc kubenswrapper[4919]: I0310 23:20:00.164841 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 23:20:00 crc kubenswrapper[4919]: I0310 23:20:00.165385 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wbvtv" Mar 10 23:20:00 crc kubenswrapper[4919]: I0310 23:20:00.193979 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553080-5z76m"] Mar 10 23:20:00 crc kubenswrapper[4919]: I0310 23:20:00.283459 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b9p9q\" (UniqueName: \"kubernetes.io/projected/a3adab12-0a85-47f8-a8f0-ccbc0b4b275d-kube-api-access-b9p9q\") pod \"auto-csr-approver-29553080-5z76m\" (UID: 
\"a3adab12-0a85-47f8-a8f0-ccbc0b4b275d\") " pod="openshift-infra/auto-csr-approver-29553080-5z76m" Mar 10 23:20:00 crc kubenswrapper[4919]: I0310 23:20:00.384897 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b9p9q\" (UniqueName: \"kubernetes.io/projected/a3adab12-0a85-47f8-a8f0-ccbc0b4b275d-kube-api-access-b9p9q\") pod \"auto-csr-approver-29553080-5z76m\" (UID: \"a3adab12-0a85-47f8-a8f0-ccbc0b4b275d\") " pod="openshift-infra/auto-csr-approver-29553080-5z76m" Mar 10 23:20:00 crc kubenswrapper[4919]: I0310 23:20:00.414753 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b9p9q\" (UniqueName: \"kubernetes.io/projected/a3adab12-0a85-47f8-a8f0-ccbc0b4b275d-kube-api-access-b9p9q\") pod \"auto-csr-approver-29553080-5z76m\" (UID: \"a3adab12-0a85-47f8-a8f0-ccbc0b4b275d\") " pod="openshift-infra/auto-csr-approver-29553080-5z76m" Mar 10 23:20:00 crc kubenswrapper[4919]: I0310 23:20:00.498191 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553080-5z76m" Mar 10 23:20:00 crc kubenswrapper[4919]: I0310 23:20:00.812136 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553080-5z76m"] Mar 10 23:20:01 crc kubenswrapper[4919]: I0310 23:20:01.517755 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553080-5z76m" event={"ID":"a3adab12-0a85-47f8-a8f0-ccbc0b4b275d","Type":"ContainerStarted","Data":"89b628b3751fc96ae8e1a337b8287364881deadae709a414f3a01d7e9404c4fb"} Mar 10 23:20:02 crc kubenswrapper[4919]: I0310 23:20:02.529596 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553080-5z76m" event={"ID":"a3adab12-0a85-47f8-a8f0-ccbc0b4b275d","Type":"ContainerStarted","Data":"eafabcee6bed67be05d1a878a651478ba03e70bde5e961897e86648fe25aaa3d"} Mar 10 23:20:02 crc kubenswrapper[4919]: I0310 23:20:02.548357 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29553080-5z76m" podStartSLOduration=1.261211657 podStartE2EDuration="2.54833436s" podCreationTimestamp="2026-03-10 23:20:00 +0000 UTC" firstStartedPulling="2026-03-10 23:20:00.826287775 +0000 UTC m=+5388.068168383" lastFinishedPulling="2026-03-10 23:20:02.113410478 +0000 UTC m=+5389.355291086" observedRunningTime="2026-03-10 23:20:02.542894752 +0000 UTC m=+5389.784775370" watchObservedRunningTime="2026-03-10 23:20:02.54833436 +0000 UTC m=+5389.790214968" Mar 10 23:20:03 crc kubenswrapper[4919]: I0310 23:20:03.540989 4919 generic.go:334] "Generic (PLEG): container finished" podID="a3adab12-0a85-47f8-a8f0-ccbc0b4b275d" containerID="eafabcee6bed67be05d1a878a651478ba03e70bde5e961897e86648fe25aaa3d" exitCode=0 Mar 10 23:20:03 crc kubenswrapper[4919]: I0310 23:20:03.541055 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553080-5z76m" 
event={"ID":"a3adab12-0a85-47f8-a8f0-ccbc0b4b275d","Type":"ContainerDied","Data":"eafabcee6bed67be05d1a878a651478ba03e70bde5e961897e86648fe25aaa3d"} Mar 10 23:20:04 crc kubenswrapper[4919]: I0310 23:20:04.858068 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553080-5z76m" Mar 10 23:20:04 crc kubenswrapper[4919]: I0310 23:20:04.960346 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b9p9q\" (UniqueName: \"kubernetes.io/projected/a3adab12-0a85-47f8-a8f0-ccbc0b4b275d-kube-api-access-b9p9q\") pod \"a3adab12-0a85-47f8-a8f0-ccbc0b4b275d\" (UID: \"a3adab12-0a85-47f8-a8f0-ccbc0b4b275d\") " Mar 10 23:20:04 crc kubenswrapper[4919]: I0310 23:20:04.967903 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3adab12-0a85-47f8-a8f0-ccbc0b4b275d-kube-api-access-b9p9q" (OuterVolumeSpecName: "kube-api-access-b9p9q") pod "a3adab12-0a85-47f8-a8f0-ccbc0b4b275d" (UID: "a3adab12-0a85-47f8-a8f0-ccbc0b4b275d"). InnerVolumeSpecName "kube-api-access-b9p9q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 23:20:05 crc kubenswrapper[4919]: I0310 23:20:05.062317 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b9p9q\" (UniqueName: \"kubernetes.io/projected/a3adab12-0a85-47f8-a8f0-ccbc0b4b275d-kube-api-access-b9p9q\") on node \"crc\" DevicePath \"\"" Mar 10 23:20:05 crc kubenswrapper[4919]: I0310 23:20:05.555398 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553080-5z76m" event={"ID":"a3adab12-0a85-47f8-a8f0-ccbc0b4b275d","Type":"ContainerDied","Data":"89b628b3751fc96ae8e1a337b8287364881deadae709a414f3a01d7e9404c4fb"} Mar 10 23:20:05 crc kubenswrapper[4919]: I0310 23:20:05.555431 4919 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="89b628b3751fc96ae8e1a337b8287364881deadae709a414f3a01d7e9404c4fb" Mar 10 23:20:05 crc kubenswrapper[4919]: I0310 23:20:05.555455 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553080-5z76m" Mar 10 23:20:05 crc kubenswrapper[4919]: I0310 23:20:05.609866 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553074-w4sgs"] Mar 10 23:20:05 crc kubenswrapper[4919]: I0310 23:20:05.615378 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553074-w4sgs"] Mar 10 23:20:07 crc kubenswrapper[4919]: I0310 23:20:07.497269 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15dd46e4-0f3d-43d2-9c42-f2538297c74f" path="/var/lib/kubelet/pods/15dd46e4-0f3d-43d2-9c42-f2538297c74f/volumes" Mar 10 23:20:15 crc kubenswrapper[4919]: I0310 23:20:15.283035 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-copy-data"] Mar 10 23:20:15 crc kubenswrapper[4919]: E0310 23:20:15.284510 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3adab12-0a85-47f8-a8f0-ccbc0b4b275d" 
containerName="oc" Mar 10 23:20:15 crc kubenswrapper[4919]: I0310 23:20:15.284546 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3adab12-0a85-47f8-a8f0-ccbc0b4b275d" containerName="oc" Mar 10 23:20:15 crc kubenswrapper[4919]: I0310 23:20:15.284829 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3adab12-0a85-47f8-a8f0-ccbc0b4b275d" containerName="oc" Mar 10 23:20:15 crc kubenswrapper[4919]: I0310 23:20:15.285743 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-copy-data" Mar 10 23:20:15 crc kubenswrapper[4919]: I0310 23:20:15.289565 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-dx7r9" Mar 10 23:20:15 crc kubenswrapper[4919]: I0310 23:20:15.297047 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-copy-data"] Mar 10 23:20:15 crc kubenswrapper[4919]: I0310 23:20:15.433495 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wghx5\" (UniqueName: \"kubernetes.io/projected/1cc53092-aa12-4c0a-8de9-d1c9e1bbbc19-kube-api-access-wghx5\") pod \"mariadb-copy-data\" (UID: \"1cc53092-aa12-4c0a-8de9-d1c9e1bbbc19\") " pod="openstack/mariadb-copy-data" Mar 10 23:20:15 crc kubenswrapper[4919]: I0310 23:20:15.433556 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-bdd67fa2-54ad-4c98-8716-97722121e43d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-bdd67fa2-54ad-4c98-8716-97722121e43d\") pod \"mariadb-copy-data\" (UID: \"1cc53092-aa12-4c0a-8de9-d1c9e1bbbc19\") " pod="openstack/mariadb-copy-data" Mar 10 23:20:15 crc kubenswrapper[4919]: I0310 23:20:15.537319 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wghx5\" (UniqueName: \"kubernetes.io/projected/1cc53092-aa12-4c0a-8de9-d1c9e1bbbc19-kube-api-access-wghx5\") pod 
\"mariadb-copy-data\" (UID: \"1cc53092-aa12-4c0a-8de9-d1c9e1bbbc19\") " pod="openstack/mariadb-copy-data" Mar 10 23:20:15 crc kubenswrapper[4919]: I0310 23:20:15.537383 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-bdd67fa2-54ad-4c98-8716-97722121e43d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-bdd67fa2-54ad-4c98-8716-97722121e43d\") pod \"mariadb-copy-data\" (UID: \"1cc53092-aa12-4c0a-8de9-d1c9e1bbbc19\") " pod="openstack/mariadb-copy-data" Mar 10 23:20:15 crc kubenswrapper[4919]: I0310 23:20:15.540718 4919 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 10 23:20:15 crc kubenswrapper[4919]: I0310 23:20:15.540747 4919 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-bdd67fa2-54ad-4c98-8716-97722121e43d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-bdd67fa2-54ad-4c98-8716-97722121e43d\") pod \"mariadb-copy-data\" (UID: \"1cc53092-aa12-4c0a-8de9-d1c9e1bbbc19\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/a1adcddab143d268670208f0400a9fe0404933c023cb068abd16d144649926ea/globalmount\"" pod="openstack/mariadb-copy-data" Mar 10 23:20:15 crc kubenswrapper[4919]: I0310 23:20:15.565284 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wghx5\" (UniqueName: \"kubernetes.io/projected/1cc53092-aa12-4c0a-8de9-d1c9e1bbbc19-kube-api-access-wghx5\") pod \"mariadb-copy-data\" (UID: \"1cc53092-aa12-4c0a-8de9-d1c9e1bbbc19\") " pod="openstack/mariadb-copy-data" Mar 10 23:20:15 crc kubenswrapper[4919]: I0310 23:20:15.581517 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-bdd67fa2-54ad-4c98-8716-97722121e43d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-bdd67fa2-54ad-4c98-8716-97722121e43d\") pod \"mariadb-copy-data\" (UID: 
\"1cc53092-aa12-4c0a-8de9-d1c9e1bbbc19\") " pod="openstack/mariadb-copy-data" Mar 10 23:20:15 crc kubenswrapper[4919]: I0310 23:20:15.626955 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-copy-data" Mar 10 23:20:16 crc kubenswrapper[4919]: I0310 23:20:16.185843 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-copy-data"] Mar 10 23:20:16 crc kubenswrapper[4919]: I0310 23:20:16.659468 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"1cc53092-aa12-4c0a-8de9-d1c9e1bbbc19","Type":"ContainerStarted","Data":"059c21102726da877bfaed01dc944d15f2106b89790d51ef0e6f2c5d7aa76420"} Mar 10 23:20:16 crc kubenswrapper[4919]: I0310 23:20:16.660039 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"1cc53092-aa12-4c0a-8de9-d1c9e1bbbc19","Type":"ContainerStarted","Data":"96f73e9f72688b497b4254cdacbbe526f03267d8fbb10457ac485ba7cc9baad8"} Mar 10 23:20:16 crc kubenswrapper[4919]: I0310 23:20:16.680201 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mariadb-copy-data" podStartSLOduration=2.680170525 podStartE2EDuration="2.680170525s" podCreationTimestamp="2026-03-10 23:20:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 23:20:16.676418493 +0000 UTC m=+5403.918299141" watchObservedRunningTime="2026-03-10 23:20:16.680170525 +0000 UTC m=+5403.922051173" Mar 10 23:20:19 crc kubenswrapper[4919]: I0310 23:20:19.558868 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client"] Mar 10 23:20:19 crc kubenswrapper[4919]: I0310 23:20:19.561174 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Mar 10 23:20:19 crc kubenswrapper[4919]: I0310 23:20:19.569632 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Mar 10 23:20:19 crc kubenswrapper[4919]: I0310 23:20:19.722725 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jc5pr\" (UniqueName: \"kubernetes.io/projected/867ec0dd-fa15-4c70-979a-93a4eb6d4ac8-kube-api-access-jc5pr\") pod \"mariadb-client\" (UID: \"867ec0dd-fa15-4c70-979a-93a4eb6d4ac8\") " pod="openstack/mariadb-client" Mar 10 23:20:19 crc kubenswrapper[4919]: I0310 23:20:19.824748 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jc5pr\" (UniqueName: \"kubernetes.io/projected/867ec0dd-fa15-4c70-979a-93a4eb6d4ac8-kube-api-access-jc5pr\") pod \"mariadb-client\" (UID: \"867ec0dd-fa15-4c70-979a-93a4eb6d4ac8\") " pod="openstack/mariadb-client" Mar 10 23:20:19 crc kubenswrapper[4919]: I0310 23:20:19.848699 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jc5pr\" (UniqueName: \"kubernetes.io/projected/867ec0dd-fa15-4c70-979a-93a4eb6d4ac8-kube-api-access-jc5pr\") pod \"mariadb-client\" (UID: \"867ec0dd-fa15-4c70-979a-93a4eb6d4ac8\") " pod="openstack/mariadb-client" Mar 10 23:20:19 crc kubenswrapper[4919]: I0310 23:20:19.910836 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Mar 10 23:20:20 crc kubenswrapper[4919]: I0310 23:20:20.207761 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Mar 10 23:20:20 crc kubenswrapper[4919]: W0310 23:20:20.209672 4919 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod867ec0dd_fa15_4c70_979a_93a4eb6d4ac8.slice/crio-1a92eb19bd98e3f7ce2fa5b09d0af332b68331e73a285a8d9236305e8bfcb6c6 WatchSource:0}: Error finding container 1a92eb19bd98e3f7ce2fa5b09d0af332b68331e73a285a8d9236305e8bfcb6c6: Status 404 returned error can't find the container with id 1a92eb19bd98e3f7ce2fa5b09d0af332b68331e73a285a8d9236305e8bfcb6c6 Mar 10 23:20:20 crc kubenswrapper[4919]: I0310 23:20:20.697697 4919 generic.go:334] "Generic (PLEG): container finished" podID="867ec0dd-fa15-4c70-979a-93a4eb6d4ac8" containerID="00fc2bae4ab7c8ec504ccf63f06086294dcffabc24ea6aa26e253e1d55b64937" exitCode=0 Mar 10 23:20:20 crc kubenswrapper[4919]: I0310 23:20:20.697748 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"867ec0dd-fa15-4c70-979a-93a4eb6d4ac8","Type":"ContainerDied","Data":"00fc2bae4ab7c8ec504ccf63f06086294dcffabc24ea6aa26e253e1d55b64937"} Mar 10 23:20:20 crc kubenswrapper[4919]: I0310 23:20:20.697778 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"867ec0dd-fa15-4c70-979a-93a4eb6d4ac8","Type":"ContainerStarted","Data":"1a92eb19bd98e3f7ce2fa5b09d0af332b68331e73a285a8d9236305e8bfcb6c6"} Mar 10 23:20:22 crc kubenswrapper[4919]: I0310 23:20:22.130216 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Mar 10 23:20:22 crc kubenswrapper[4919]: I0310 23:20:22.153858 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client_867ec0dd-fa15-4c70-979a-93a4eb6d4ac8/mariadb-client/0.log" Mar 10 23:20:22 crc kubenswrapper[4919]: I0310 23:20:22.181203 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"] Mar 10 23:20:22 crc kubenswrapper[4919]: I0310 23:20:22.186115 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client"] Mar 10 23:20:22 crc kubenswrapper[4919]: I0310 23:20:22.266167 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jc5pr\" (UniqueName: \"kubernetes.io/projected/867ec0dd-fa15-4c70-979a-93a4eb6d4ac8-kube-api-access-jc5pr\") pod \"867ec0dd-fa15-4c70-979a-93a4eb6d4ac8\" (UID: \"867ec0dd-fa15-4c70-979a-93a4eb6d4ac8\") " Mar 10 23:20:22 crc kubenswrapper[4919]: I0310 23:20:22.272140 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/867ec0dd-fa15-4c70-979a-93a4eb6d4ac8-kube-api-access-jc5pr" (OuterVolumeSpecName: "kube-api-access-jc5pr") pod "867ec0dd-fa15-4c70-979a-93a4eb6d4ac8" (UID: "867ec0dd-fa15-4c70-979a-93a4eb6d4ac8"). InnerVolumeSpecName "kube-api-access-jc5pr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 23:20:22 crc kubenswrapper[4919]: I0310 23:20:22.319680 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client"] Mar 10 23:20:22 crc kubenswrapper[4919]: E0310 23:20:22.320543 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="867ec0dd-fa15-4c70-979a-93a4eb6d4ac8" containerName="mariadb-client" Mar 10 23:20:22 crc kubenswrapper[4919]: I0310 23:20:22.320569 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="867ec0dd-fa15-4c70-979a-93a4eb6d4ac8" containerName="mariadb-client" Mar 10 23:20:22 crc kubenswrapper[4919]: I0310 23:20:22.321026 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="867ec0dd-fa15-4c70-979a-93a4eb6d4ac8" containerName="mariadb-client" Mar 10 23:20:22 crc kubenswrapper[4919]: I0310 23:20:22.322470 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Mar 10 23:20:22 crc kubenswrapper[4919]: I0310 23:20:22.342696 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Mar 10 23:20:22 crc kubenswrapper[4919]: I0310 23:20:22.368036 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jc5pr\" (UniqueName: \"kubernetes.io/projected/867ec0dd-fa15-4c70-979a-93a4eb6d4ac8-kube-api-access-jc5pr\") on node \"crc\" DevicePath \"\"" Mar 10 23:20:22 crc kubenswrapper[4919]: I0310 23:20:22.469748 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fcgwn\" (UniqueName: \"kubernetes.io/projected/bc19e404-8c11-4e4b-8d4a-28ade8ce2ac1-kube-api-access-fcgwn\") pod \"mariadb-client\" (UID: \"bc19e404-8c11-4e4b-8d4a-28ade8ce2ac1\") " pod="openstack/mariadb-client" Mar 10 23:20:22 crc kubenswrapper[4919]: I0310 23:20:22.572331 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fcgwn\" (UniqueName: 
\"kubernetes.io/projected/bc19e404-8c11-4e4b-8d4a-28ade8ce2ac1-kube-api-access-fcgwn\") pod \"mariadb-client\" (UID: \"bc19e404-8c11-4e4b-8d4a-28ade8ce2ac1\") " pod="openstack/mariadb-client" Mar 10 23:20:22 crc kubenswrapper[4919]: I0310 23:20:22.604024 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fcgwn\" (UniqueName: \"kubernetes.io/projected/bc19e404-8c11-4e4b-8d4a-28ade8ce2ac1-kube-api-access-fcgwn\") pod \"mariadb-client\" (UID: \"bc19e404-8c11-4e4b-8d4a-28ade8ce2ac1\") " pod="openstack/mariadb-client" Mar 10 23:20:22 crc kubenswrapper[4919]: I0310 23:20:22.644738 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Mar 10 23:20:22 crc kubenswrapper[4919]: I0310 23:20:22.724272 4919 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1a92eb19bd98e3f7ce2fa5b09d0af332b68331e73a285a8d9236305e8bfcb6c6" Mar 10 23:20:22 crc kubenswrapper[4919]: I0310 23:20:22.724331 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Mar 10 23:20:22 crc kubenswrapper[4919]: I0310 23:20:22.745341 4919 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/mariadb-client" oldPodUID="867ec0dd-fa15-4c70-979a-93a4eb6d4ac8" podUID="bc19e404-8c11-4e4b-8d4a-28ade8ce2ac1" Mar 10 23:20:23 crc kubenswrapper[4919]: W0310 23:20:23.061578 4919 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbc19e404_8c11_4e4b_8d4a_28ade8ce2ac1.slice/crio-1d7d627b6fd4298632de7e171d6b96658270c63f768510e026f181e606858ab9 WatchSource:0}: Error finding container 1d7d627b6fd4298632de7e171d6b96658270c63f768510e026f181e606858ab9: Status 404 returned error can't find the container with id 1d7d627b6fd4298632de7e171d6b96658270c63f768510e026f181e606858ab9 Mar 10 23:20:23 crc kubenswrapper[4919]: I0310 23:20:23.069929 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Mar 10 23:20:23 crc kubenswrapper[4919]: I0310 23:20:23.497737 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="867ec0dd-fa15-4c70-979a-93a4eb6d4ac8" path="/var/lib/kubelet/pods/867ec0dd-fa15-4c70-979a-93a4eb6d4ac8/volumes" Mar 10 23:20:23 crc kubenswrapper[4919]: I0310 23:20:23.738638 4919 generic.go:334] "Generic (PLEG): container finished" podID="bc19e404-8c11-4e4b-8d4a-28ade8ce2ac1" containerID="4d4f6c143955680d3dd8147c989cc18c7ef6bbe7c7a6653c21c9e95b7d2cfefb" exitCode=0 Mar 10 23:20:23 crc kubenswrapper[4919]: I0310 23:20:23.738715 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"bc19e404-8c11-4e4b-8d4a-28ade8ce2ac1","Type":"ContainerDied","Data":"4d4f6c143955680d3dd8147c989cc18c7ef6bbe7c7a6653c21c9e95b7d2cfefb"} Mar 10 23:20:23 crc kubenswrapper[4919]: I0310 23:20:23.738769 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" 
event={"ID":"bc19e404-8c11-4e4b-8d4a-28ade8ce2ac1","Type":"ContainerStarted","Data":"1d7d627b6fd4298632de7e171d6b96658270c63f768510e026f181e606858ab9"} Mar 10 23:20:25 crc kubenswrapper[4919]: I0310 23:20:25.110186 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Mar 10 23:20:25 crc kubenswrapper[4919]: I0310 23:20:25.132372 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client_bc19e404-8c11-4e4b-8d4a-28ade8ce2ac1/mariadb-client/0.log" Mar 10 23:20:25 crc kubenswrapper[4919]: I0310 23:20:25.160918 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"] Mar 10 23:20:25 crc kubenswrapper[4919]: I0310 23:20:25.165649 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client"] Mar 10 23:20:25 crc kubenswrapper[4919]: I0310 23:20:25.220373 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcgwn\" (UniqueName: \"kubernetes.io/projected/bc19e404-8c11-4e4b-8d4a-28ade8ce2ac1-kube-api-access-fcgwn\") pod \"bc19e404-8c11-4e4b-8d4a-28ade8ce2ac1\" (UID: \"bc19e404-8c11-4e4b-8d4a-28ade8ce2ac1\") " Mar 10 23:20:25 crc kubenswrapper[4919]: I0310 23:20:25.226008 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc19e404-8c11-4e4b-8d4a-28ade8ce2ac1-kube-api-access-fcgwn" (OuterVolumeSpecName: "kube-api-access-fcgwn") pod "bc19e404-8c11-4e4b-8d4a-28ade8ce2ac1" (UID: "bc19e404-8c11-4e4b-8d4a-28ade8ce2ac1"). InnerVolumeSpecName "kube-api-access-fcgwn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 23:20:25 crc kubenswrapper[4919]: I0310 23:20:25.322018 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcgwn\" (UniqueName: \"kubernetes.io/projected/bc19e404-8c11-4e4b-8d4a-28ade8ce2ac1-kube-api-access-fcgwn\") on node \"crc\" DevicePath \"\"" Mar 10 23:20:25 crc kubenswrapper[4919]: I0310 23:20:25.495588 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc19e404-8c11-4e4b-8d4a-28ade8ce2ac1" path="/var/lib/kubelet/pods/bc19e404-8c11-4e4b-8d4a-28ade8ce2ac1/volumes" Mar 10 23:20:25 crc kubenswrapper[4919]: I0310 23:20:25.757591 4919 scope.go:117] "RemoveContainer" containerID="4d4f6c143955680d3dd8147c989cc18c7ef6bbe7c7a6653c21c9e95b7d2cfefb" Mar 10 23:20:25 crc kubenswrapper[4919]: I0310 23:20:25.757640 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Mar 10 23:20:29 crc kubenswrapper[4919]: I0310 23:20:29.175934 4919 patch_prober.go:28] interesting pod/machine-config-daemon-z7v4t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 23:20:29 crc kubenswrapper[4919]: I0310 23:20:29.176577 4919 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 23:20:47 crc kubenswrapper[4919]: I0310 23:20:47.088039 4919 scope.go:117] "RemoveContainer" containerID="aa5de3339c47524494914d6e6d673b75c0f7cc4913f4875021340bef57256bb5" Mar 10 23:20:58 crc kubenswrapper[4919]: I0310 23:20:58.876685 4919 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/ovsdbserver-nb-0"] Mar 10 23:20:58 crc kubenswrapper[4919]: E0310 23:20:58.877766 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc19e404-8c11-4e4b-8d4a-28ade8ce2ac1" containerName="mariadb-client" Mar 10 23:20:58 crc kubenswrapper[4919]: I0310 23:20:58.877787 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc19e404-8c11-4e4b-8d4a-28ade8ce2ac1" containerName="mariadb-client" Mar 10 23:20:58 crc kubenswrapper[4919]: I0310 23:20:58.878070 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc19e404-8c11-4e4b-8d4a-28ade8ce2ac1" containerName="mariadb-client" Mar 10 23:20:58 crc kubenswrapper[4919]: I0310 23:20:58.879322 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 10 23:20:58 crc kubenswrapper[4919]: I0310 23:20:58.881586 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Mar 10 23:20:58 crc kubenswrapper[4919]: I0310 23:20:58.884196 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-bzgv8" Mar 10 23:20:58 crc kubenswrapper[4919]: I0310 23:20:58.884210 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Mar 10 23:20:58 crc kubenswrapper[4919]: I0310 23:20:58.886943 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Mar 10 23:20:58 crc kubenswrapper[4919]: I0310 23:20:58.891796 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Mar 10 23:20:58 crc kubenswrapper[4919]: I0310 23:20:58.908998 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-2"] Mar 10 23:20:58 crc kubenswrapper[4919]: I0310 23:20:58.912174 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-2" Mar 10 23:20:58 crc kubenswrapper[4919]: I0310 23:20:58.953216 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-1"] Mar 10 23:20:58 crc kubenswrapper[4919]: I0310 23:20:58.958168 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-1" Mar 10 23:20:58 crc kubenswrapper[4919]: I0310 23:20:58.968795 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 10 23:20:58 crc kubenswrapper[4919]: I0310 23:20:58.983837 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-2"] Mar 10 23:20:58 crc kubenswrapper[4919]: I0310 23:20:58.999339 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-1"] Mar 10 23:20:59 crc kubenswrapper[4919]: I0310 23:20:59.004013 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b407bebc-85fa-422e-a5ec-e6d586e4ae11-config\") pod \"ovsdbserver-nb-2\" (UID: \"b407bebc-85fa-422e-a5ec-e6d586e4ae11\") " pod="openstack/ovsdbserver-nb-2" Mar 10 23:20:59 crc kubenswrapper[4919]: I0310 23:20:59.004072 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/40dccac8-8ebf-4106-8211-8baffde0a119-config\") pod \"ovsdbserver-nb-1\" (UID: \"40dccac8-8ebf-4106-8211-8baffde0a119\") " pod="openstack/ovsdbserver-nb-1" Mar 10 23:20:59 crc kubenswrapper[4919]: I0310 23:20:59.004102 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9mw7\" (UniqueName: \"kubernetes.io/projected/40dccac8-8ebf-4106-8211-8baffde0a119-kube-api-access-g9mw7\") pod \"ovsdbserver-nb-1\" (UID: \"40dccac8-8ebf-4106-8211-8baffde0a119\") " pod="openstack/ovsdbserver-nb-1" Mar 10 23:20:59 crc 
kubenswrapper[4919]: I0310 23:20:59.004141 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-78cf8e6d-9837-433a-bfe6-9ce8b9e45945\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-78cf8e6d-9837-433a-bfe6-9ce8b9e45945\") pod \"ovsdbserver-nb-2\" (UID: \"b407bebc-85fa-422e-a5ec-e6d586e4ae11\") " pod="openstack/ovsdbserver-nb-2" Mar 10 23:20:59 crc kubenswrapper[4919]: I0310 23:20:59.004169 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cf54d78f-9a52-4e30-9f54-ebbe74ad8c6a-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"cf54d78f-9a52-4e30-9f54-ebbe74ad8c6a\") " pod="openstack/ovsdbserver-nb-0" Mar 10 23:20:59 crc kubenswrapper[4919]: I0310 23:20:59.004194 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40dccac8-8ebf-4106-8211-8baffde0a119-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"40dccac8-8ebf-4106-8211-8baffde0a119\") " pod="openstack/ovsdbserver-nb-1" Mar 10 23:20:59 crc kubenswrapper[4919]: I0310 23:20:59.004220 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf54d78f-9a52-4e30-9f54-ebbe74ad8c6a-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"cf54d78f-9a52-4e30-9f54-ebbe74ad8c6a\") " pod="openstack/ovsdbserver-nb-0" Mar 10 23:20:59 crc kubenswrapper[4919]: I0310 23:20:59.004244 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/40dccac8-8ebf-4106-8211-8baffde0a119-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-1\" (UID: \"40dccac8-8ebf-4106-8211-8baffde0a119\") " pod="openstack/ovsdbserver-nb-1" Mar 10 23:20:59 crc 
kubenswrapper[4919]: I0310 23:20:59.004268 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvkms\" (UniqueName: \"kubernetes.io/projected/cf54d78f-9a52-4e30-9f54-ebbe74ad8c6a-kube-api-access-rvkms\") pod \"ovsdbserver-nb-0\" (UID: \"cf54d78f-9a52-4e30-9f54-ebbe74ad8c6a\") " pod="openstack/ovsdbserver-nb-0" Mar 10 23:20:59 crc kubenswrapper[4919]: I0310 23:20:59.004292 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j9z4b\" (UniqueName: \"kubernetes.io/projected/b407bebc-85fa-422e-a5ec-e6d586e4ae11-kube-api-access-j9z4b\") pod \"ovsdbserver-nb-2\" (UID: \"b407bebc-85fa-422e-a5ec-e6d586e4ae11\") " pod="openstack/ovsdbserver-nb-2" Mar 10 23:20:59 crc kubenswrapper[4919]: I0310 23:20:59.004322 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b407bebc-85fa-422e-a5ec-e6d586e4ae11-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-2\" (UID: \"b407bebc-85fa-422e-a5ec-e6d586e4ae11\") " pod="openstack/ovsdbserver-nb-2" Mar 10 23:20:59 crc kubenswrapper[4919]: I0310 23:20:59.004552 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf54d78f-9a52-4e30-9f54-ebbe74ad8c6a-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"cf54d78f-9a52-4e30-9f54-ebbe74ad8c6a\") " pod="openstack/ovsdbserver-nb-0" Mar 10 23:20:59 crc kubenswrapper[4919]: I0310 23:20:59.004602 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf54d78f-9a52-4e30-9f54-ebbe74ad8c6a-config\") pod \"ovsdbserver-nb-0\" (UID: \"cf54d78f-9a52-4e30-9f54-ebbe74ad8c6a\") " pod="openstack/ovsdbserver-nb-0" Mar 10 23:20:59 crc kubenswrapper[4919]: I0310 23:20:59.004626 
4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/40dccac8-8ebf-4106-8211-8baffde0a119-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"40dccac8-8ebf-4106-8211-8baffde0a119\") " pod="openstack/ovsdbserver-nb-1" Mar 10 23:20:59 crc kubenswrapper[4919]: I0310 23:20:59.004706 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/cf54d78f-9a52-4e30-9f54-ebbe74ad8c6a-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"cf54d78f-9a52-4e30-9f54-ebbe74ad8c6a\") " pod="openstack/ovsdbserver-nb-0" Mar 10 23:20:59 crc kubenswrapper[4919]: I0310 23:20:59.004735 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b407bebc-85fa-422e-a5ec-e6d586e4ae11-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"b407bebc-85fa-422e-a5ec-e6d586e4ae11\") " pod="openstack/ovsdbserver-nb-2" Mar 10 23:20:59 crc kubenswrapper[4919]: I0310 23:20:59.004769 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b407bebc-85fa-422e-a5ec-e6d586e4ae11-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"b407bebc-85fa-422e-a5ec-e6d586e4ae11\") " pod="openstack/ovsdbserver-nb-2" Mar 10 23:20:59 crc kubenswrapper[4919]: I0310 23:20:59.004794 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b407bebc-85fa-422e-a5ec-e6d586e4ae11-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-2\" (UID: \"b407bebc-85fa-422e-a5ec-e6d586e4ae11\") " pod="openstack/ovsdbserver-nb-2" Mar 10 23:20:59 crc kubenswrapper[4919]: I0310 23:20:59.004873 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf54d78f-9a52-4e30-9f54-ebbe74ad8c6a-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"cf54d78f-9a52-4e30-9f54-ebbe74ad8c6a\") " pod="openstack/ovsdbserver-nb-0" Mar 10 23:20:59 crc kubenswrapper[4919]: I0310 23:20:59.004908 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/40dccac8-8ebf-4106-8211-8baffde0a119-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"40dccac8-8ebf-4106-8211-8baffde0a119\") " pod="openstack/ovsdbserver-nb-1" Mar 10 23:20:59 crc kubenswrapper[4919]: I0310 23:20:59.004950 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b407bebc-85fa-422e-a5ec-e6d586e4ae11-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"b407bebc-85fa-422e-a5ec-e6d586e4ae11\") " pod="openstack/ovsdbserver-nb-2" Mar 10 23:20:59 crc kubenswrapper[4919]: I0310 23:20:59.004974 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/40dccac8-8ebf-4106-8211-8baffde0a119-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-1\" (UID: \"40dccac8-8ebf-4106-8211-8baffde0a119\") " pod="openstack/ovsdbserver-nb-1" Mar 10 23:20:59 crc kubenswrapper[4919]: I0310 23:20:59.005053 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-7db920af-94c6-4cb1-9a7c-312572e19d1c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7db920af-94c6-4cb1-9a7c-312572e19d1c\") pod \"ovsdbserver-nb-0\" (UID: \"cf54d78f-9a52-4e30-9f54-ebbe74ad8c6a\") " pod="openstack/ovsdbserver-nb-0" Mar 10 23:20:59 crc kubenswrapper[4919]: I0310 23:20:59.005107 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"pvc-e8c4f002-ee0c-4950-b4d2-55a118c76e1e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e8c4f002-ee0c-4950-b4d2-55a118c76e1e\") pod \"ovsdbserver-nb-1\" (UID: \"40dccac8-8ebf-4106-8211-8baffde0a119\") " pod="openstack/ovsdbserver-nb-1" Mar 10 23:20:59 crc kubenswrapper[4919]: I0310 23:20:59.106419 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-7db920af-94c6-4cb1-9a7c-312572e19d1c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7db920af-94c6-4cb1-9a7c-312572e19d1c\") pod \"ovsdbserver-nb-0\" (UID: \"cf54d78f-9a52-4e30-9f54-ebbe74ad8c6a\") " pod="openstack/ovsdbserver-nb-0" Mar 10 23:20:59 crc kubenswrapper[4919]: I0310 23:20:59.106476 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-e8c4f002-ee0c-4950-b4d2-55a118c76e1e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e8c4f002-ee0c-4950-b4d2-55a118c76e1e\") pod \"ovsdbserver-nb-1\" (UID: \"40dccac8-8ebf-4106-8211-8baffde0a119\") " pod="openstack/ovsdbserver-nb-1" Mar 10 23:20:59 crc kubenswrapper[4919]: I0310 23:20:59.106507 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b407bebc-85fa-422e-a5ec-e6d586e4ae11-config\") pod \"ovsdbserver-nb-2\" (UID: \"b407bebc-85fa-422e-a5ec-e6d586e4ae11\") " pod="openstack/ovsdbserver-nb-2" Mar 10 23:20:59 crc kubenswrapper[4919]: I0310 23:20:59.106529 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/40dccac8-8ebf-4106-8211-8baffde0a119-config\") pod \"ovsdbserver-nb-1\" (UID: \"40dccac8-8ebf-4106-8211-8baffde0a119\") " pod="openstack/ovsdbserver-nb-1" Mar 10 23:20:59 crc kubenswrapper[4919]: I0310 23:20:59.106545 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g9mw7\" (UniqueName: 
\"kubernetes.io/projected/40dccac8-8ebf-4106-8211-8baffde0a119-kube-api-access-g9mw7\") pod \"ovsdbserver-nb-1\" (UID: \"40dccac8-8ebf-4106-8211-8baffde0a119\") " pod="openstack/ovsdbserver-nb-1" Mar 10 23:20:59 crc kubenswrapper[4919]: I0310 23:20:59.106573 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-78cf8e6d-9837-433a-bfe6-9ce8b9e45945\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-78cf8e6d-9837-433a-bfe6-9ce8b9e45945\") pod \"ovsdbserver-nb-2\" (UID: \"b407bebc-85fa-422e-a5ec-e6d586e4ae11\") " pod="openstack/ovsdbserver-nb-2" Mar 10 23:20:59 crc kubenswrapper[4919]: I0310 23:20:59.106591 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cf54d78f-9a52-4e30-9f54-ebbe74ad8c6a-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"cf54d78f-9a52-4e30-9f54-ebbe74ad8c6a\") " pod="openstack/ovsdbserver-nb-0" Mar 10 23:20:59 crc kubenswrapper[4919]: I0310 23:20:59.106611 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40dccac8-8ebf-4106-8211-8baffde0a119-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"40dccac8-8ebf-4106-8211-8baffde0a119\") " pod="openstack/ovsdbserver-nb-1" Mar 10 23:20:59 crc kubenswrapper[4919]: I0310 23:20:59.106630 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf54d78f-9a52-4e30-9f54-ebbe74ad8c6a-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"cf54d78f-9a52-4e30-9f54-ebbe74ad8c6a\") " pod="openstack/ovsdbserver-nb-0" Mar 10 23:20:59 crc kubenswrapper[4919]: I0310 23:20:59.106649 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/40dccac8-8ebf-4106-8211-8baffde0a119-metrics-certs-tls-certs\") pod 
\"ovsdbserver-nb-1\" (UID: \"40dccac8-8ebf-4106-8211-8baffde0a119\") " pod="openstack/ovsdbserver-nb-1" Mar 10 23:20:59 crc kubenswrapper[4919]: I0310 23:20:59.106667 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rvkms\" (UniqueName: \"kubernetes.io/projected/cf54d78f-9a52-4e30-9f54-ebbe74ad8c6a-kube-api-access-rvkms\") pod \"ovsdbserver-nb-0\" (UID: \"cf54d78f-9a52-4e30-9f54-ebbe74ad8c6a\") " pod="openstack/ovsdbserver-nb-0" Mar 10 23:20:59 crc kubenswrapper[4919]: I0310 23:20:59.106684 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j9z4b\" (UniqueName: \"kubernetes.io/projected/b407bebc-85fa-422e-a5ec-e6d586e4ae11-kube-api-access-j9z4b\") pod \"ovsdbserver-nb-2\" (UID: \"b407bebc-85fa-422e-a5ec-e6d586e4ae11\") " pod="openstack/ovsdbserver-nb-2" Mar 10 23:20:59 crc kubenswrapper[4919]: I0310 23:20:59.106704 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b407bebc-85fa-422e-a5ec-e6d586e4ae11-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-2\" (UID: \"b407bebc-85fa-422e-a5ec-e6d586e4ae11\") " pod="openstack/ovsdbserver-nb-2" Mar 10 23:20:59 crc kubenswrapper[4919]: I0310 23:20:59.106721 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf54d78f-9a52-4e30-9f54-ebbe74ad8c6a-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"cf54d78f-9a52-4e30-9f54-ebbe74ad8c6a\") " pod="openstack/ovsdbserver-nb-0" Mar 10 23:20:59 crc kubenswrapper[4919]: I0310 23:20:59.106739 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/40dccac8-8ebf-4106-8211-8baffde0a119-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"40dccac8-8ebf-4106-8211-8baffde0a119\") " pod="openstack/ovsdbserver-nb-1" Mar 10 23:20:59 crc 
kubenswrapper[4919]: I0310 23:20:59.106757 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf54d78f-9a52-4e30-9f54-ebbe74ad8c6a-config\") pod \"ovsdbserver-nb-0\" (UID: \"cf54d78f-9a52-4e30-9f54-ebbe74ad8c6a\") " pod="openstack/ovsdbserver-nb-0" Mar 10 23:20:59 crc kubenswrapper[4919]: I0310 23:20:59.106778 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/cf54d78f-9a52-4e30-9f54-ebbe74ad8c6a-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"cf54d78f-9a52-4e30-9f54-ebbe74ad8c6a\") " pod="openstack/ovsdbserver-nb-0" Mar 10 23:20:59 crc kubenswrapper[4919]: I0310 23:20:59.106796 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b407bebc-85fa-422e-a5ec-e6d586e4ae11-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"b407bebc-85fa-422e-a5ec-e6d586e4ae11\") " pod="openstack/ovsdbserver-nb-2" Mar 10 23:20:59 crc kubenswrapper[4919]: I0310 23:20:59.106813 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b407bebc-85fa-422e-a5ec-e6d586e4ae11-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"b407bebc-85fa-422e-a5ec-e6d586e4ae11\") " pod="openstack/ovsdbserver-nb-2" Mar 10 23:20:59 crc kubenswrapper[4919]: I0310 23:20:59.106830 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b407bebc-85fa-422e-a5ec-e6d586e4ae11-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-2\" (UID: \"b407bebc-85fa-422e-a5ec-e6d586e4ae11\") " pod="openstack/ovsdbserver-nb-2" Mar 10 23:20:59 crc kubenswrapper[4919]: I0310 23:20:59.106851 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/cf54d78f-9a52-4e30-9f54-ebbe74ad8c6a-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"cf54d78f-9a52-4e30-9f54-ebbe74ad8c6a\") " pod="openstack/ovsdbserver-nb-0" Mar 10 23:20:59 crc kubenswrapper[4919]: I0310 23:20:59.106870 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/40dccac8-8ebf-4106-8211-8baffde0a119-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"40dccac8-8ebf-4106-8211-8baffde0a119\") " pod="openstack/ovsdbserver-nb-1" Mar 10 23:20:59 crc kubenswrapper[4919]: I0310 23:20:59.106890 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b407bebc-85fa-422e-a5ec-e6d586e4ae11-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"b407bebc-85fa-422e-a5ec-e6d586e4ae11\") " pod="openstack/ovsdbserver-nb-2" Mar 10 23:20:59 crc kubenswrapper[4919]: I0310 23:20:59.106905 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/40dccac8-8ebf-4106-8211-8baffde0a119-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-1\" (UID: \"40dccac8-8ebf-4106-8211-8baffde0a119\") " pod="openstack/ovsdbserver-nb-1" Mar 10 23:20:59 crc kubenswrapper[4919]: I0310 23:20:59.107490 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b407bebc-85fa-422e-a5ec-e6d586e4ae11-config\") pod \"ovsdbserver-nb-2\" (UID: \"b407bebc-85fa-422e-a5ec-e6d586e4ae11\") " pod="openstack/ovsdbserver-nb-2" Mar 10 23:20:59 crc kubenswrapper[4919]: I0310 23:20:59.107894 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b407bebc-85fa-422e-a5ec-e6d586e4ae11-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"b407bebc-85fa-422e-a5ec-e6d586e4ae11\") " pod="openstack/ovsdbserver-nb-2" Mar 10 23:20:59 
crc kubenswrapper[4919]: I0310 23:20:59.108090 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/40dccac8-8ebf-4106-8211-8baffde0a119-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"40dccac8-8ebf-4106-8211-8baffde0a119\") " pod="openstack/ovsdbserver-nb-1" Mar 10 23:20:59 crc kubenswrapper[4919]: I0310 23:20:59.108234 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/40dccac8-8ebf-4106-8211-8baffde0a119-config\") pod \"ovsdbserver-nb-1\" (UID: \"40dccac8-8ebf-4106-8211-8baffde0a119\") " pod="openstack/ovsdbserver-nb-1" Mar 10 23:20:59 crc kubenswrapper[4919]: I0310 23:20:59.108775 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/40dccac8-8ebf-4106-8211-8baffde0a119-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"40dccac8-8ebf-4106-8211-8baffde0a119\") " pod="openstack/ovsdbserver-nb-1" Mar 10 23:20:59 crc kubenswrapper[4919]: I0310 23:20:59.109129 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/cf54d78f-9a52-4e30-9f54-ebbe74ad8c6a-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"cf54d78f-9a52-4e30-9f54-ebbe74ad8c6a\") " pod="openstack/ovsdbserver-nb-0" Mar 10 23:20:59 crc kubenswrapper[4919]: I0310 23:20:59.109369 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b407bebc-85fa-422e-a5ec-e6d586e4ae11-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"b407bebc-85fa-422e-a5ec-e6d586e4ae11\") " pod="openstack/ovsdbserver-nb-2" Mar 10 23:20:59 crc kubenswrapper[4919]: I0310 23:20:59.110254 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cf54d78f-9a52-4e30-9f54-ebbe74ad8c6a-scripts\") pod \"ovsdbserver-nb-0\" (UID: 
\"cf54d78f-9a52-4e30-9f54-ebbe74ad8c6a\") " pod="openstack/ovsdbserver-nb-0" Mar 10 23:20:59 crc kubenswrapper[4919]: I0310 23:20:59.112595 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf54d78f-9a52-4e30-9f54-ebbe74ad8c6a-config\") pod \"ovsdbserver-nb-0\" (UID: \"cf54d78f-9a52-4e30-9f54-ebbe74ad8c6a\") " pod="openstack/ovsdbserver-nb-0" Mar 10 23:20:59 crc kubenswrapper[4919]: I0310 23:20:59.113524 4919 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 10 23:20:59 crc kubenswrapper[4919]: I0310 23:20:59.113576 4919 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-e8c4f002-ee0c-4950-b4d2-55a118c76e1e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e8c4f002-ee0c-4950-b4d2-55a118c76e1e\") pod \"ovsdbserver-nb-1\" (UID: \"40dccac8-8ebf-4106-8211-8baffde0a119\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/2be5fedf54a79607ae8507a30ca6b54a02d812039e4e0e58af2aaa819c6af2e6/globalmount\"" pod="openstack/ovsdbserver-nb-1" Mar 10 23:20:59 crc kubenswrapper[4919]: I0310 23:20:59.114859 4919 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 10 23:20:59 crc kubenswrapper[4919]: I0310 23:20:59.114923 4919 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-7db920af-94c6-4cb1-9a7c-312572e19d1c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7db920af-94c6-4cb1-9a7c-312572e19d1c\") pod \"ovsdbserver-nb-0\" (UID: \"cf54d78f-9a52-4e30-9f54-ebbe74ad8c6a\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/dc1cbfcbd5331734a72fbd851ae1ea1dbca0365042f310096e98e92972c9460f/globalmount\"" pod="openstack/ovsdbserver-nb-0" Mar 10 23:20:59 crc kubenswrapper[4919]: I0310 23:20:59.114863 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40dccac8-8ebf-4106-8211-8baffde0a119-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"40dccac8-8ebf-4106-8211-8baffde0a119\") " pod="openstack/ovsdbserver-nb-1" Mar 10 23:20:59 crc kubenswrapper[4919]: I0310 23:20:59.116315 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b407bebc-85fa-422e-a5ec-e6d586e4ae11-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"b407bebc-85fa-422e-a5ec-e6d586e4ae11\") " pod="openstack/ovsdbserver-nb-2" Mar 10 23:20:59 crc kubenswrapper[4919]: I0310 23:20:59.116520 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b407bebc-85fa-422e-a5ec-e6d586e4ae11-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-2\" (UID: \"b407bebc-85fa-422e-a5ec-e6d586e4ae11\") " pod="openstack/ovsdbserver-nb-2" Mar 10 23:20:59 crc kubenswrapper[4919]: I0310 23:20:59.116865 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf54d78f-9a52-4e30-9f54-ebbe74ad8c6a-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: 
\"cf54d78f-9a52-4e30-9f54-ebbe74ad8c6a\") " pod="openstack/ovsdbserver-nb-0" Mar 10 23:20:59 crc kubenswrapper[4919]: I0310 23:20:59.119556 4919 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 10 23:20:59 crc kubenswrapper[4919]: I0310 23:20:59.119623 4919 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-78cf8e6d-9837-433a-bfe6-9ce8b9e45945\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-78cf8e6d-9837-433a-bfe6-9ce8b9e45945\") pod \"ovsdbserver-nb-2\" (UID: \"b407bebc-85fa-422e-a5ec-e6d586e4ae11\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1db6c1bb0e8b51ce21b1314ee352a1dc976db440991d794d30cbb3c096eaf039/globalmount\"" pod="openstack/ovsdbserver-nb-2" Mar 10 23:20:59 crc kubenswrapper[4919]: I0310 23:20:59.121387 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf54d78f-9a52-4e30-9f54-ebbe74ad8c6a-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"cf54d78f-9a52-4e30-9f54-ebbe74ad8c6a\") " pod="openstack/ovsdbserver-nb-0" Mar 10 23:20:59 crc kubenswrapper[4919]: I0310 23:20:59.124103 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/40dccac8-8ebf-4106-8211-8baffde0a119-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-1\" (UID: \"40dccac8-8ebf-4106-8211-8baffde0a119\") " pod="openstack/ovsdbserver-nb-1" Mar 10 23:20:59 crc kubenswrapper[4919]: I0310 23:20:59.126301 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9mw7\" (UniqueName: \"kubernetes.io/projected/40dccac8-8ebf-4106-8211-8baffde0a119-kube-api-access-g9mw7\") pod \"ovsdbserver-nb-1\" (UID: \"40dccac8-8ebf-4106-8211-8baffde0a119\") " pod="openstack/ovsdbserver-nb-1" Mar 10 23:20:59 crc 
kubenswrapper[4919]: I0310 23:20:59.126614 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf54d78f-9a52-4e30-9f54-ebbe74ad8c6a-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"cf54d78f-9a52-4e30-9f54-ebbe74ad8c6a\") " pod="openstack/ovsdbserver-nb-0" Mar 10 23:20:59 crc kubenswrapper[4919]: I0310 23:20:59.127986 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b407bebc-85fa-422e-a5ec-e6d586e4ae11-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-2\" (UID: \"b407bebc-85fa-422e-a5ec-e6d586e4ae11\") " pod="openstack/ovsdbserver-nb-2" Mar 10 23:20:59 crc kubenswrapper[4919]: I0310 23:20:59.130150 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvkms\" (UniqueName: \"kubernetes.io/projected/cf54d78f-9a52-4e30-9f54-ebbe74ad8c6a-kube-api-access-rvkms\") pod \"ovsdbserver-nb-0\" (UID: \"cf54d78f-9a52-4e30-9f54-ebbe74ad8c6a\") " pod="openstack/ovsdbserver-nb-0" Mar 10 23:20:59 crc kubenswrapper[4919]: I0310 23:20:59.131097 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j9z4b\" (UniqueName: \"kubernetes.io/projected/b407bebc-85fa-422e-a5ec-e6d586e4ae11-kube-api-access-j9z4b\") pod \"ovsdbserver-nb-2\" (UID: \"b407bebc-85fa-422e-a5ec-e6d586e4ae11\") " pod="openstack/ovsdbserver-nb-2" Mar 10 23:20:59 crc kubenswrapper[4919]: I0310 23:20:59.132305 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/40dccac8-8ebf-4106-8211-8baffde0a119-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-1\" (UID: \"40dccac8-8ebf-4106-8211-8baffde0a119\") " pod="openstack/ovsdbserver-nb-1" Mar 10 23:20:59 crc kubenswrapper[4919]: I0310 23:20:59.154623 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"pvc-e8c4f002-ee0c-4950-b4d2-55a118c76e1e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e8c4f002-ee0c-4950-b4d2-55a118c76e1e\") pod \"ovsdbserver-nb-1\" (UID: \"40dccac8-8ebf-4106-8211-8baffde0a119\") " pod="openstack/ovsdbserver-nb-1" Mar 10 23:20:59 crc kubenswrapper[4919]: I0310 23:20:59.161584 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-78cf8e6d-9837-433a-bfe6-9ce8b9e45945\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-78cf8e6d-9837-433a-bfe6-9ce8b9e45945\") pod \"ovsdbserver-nb-2\" (UID: \"b407bebc-85fa-422e-a5ec-e6d586e4ae11\") " pod="openstack/ovsdbserver-nb-2" Mar 10 23:20:59 crc kubenswrapper[4919]: I0310 23:20:59.171513 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-7db920af-94c6-4cb1-9a7c-312572e19d1c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7db920af-94c6-4cb1-9a7c-312572e19d1c\") pod \"ovsdbserver-nb-0\" (UID: \"cf54d78f-9a52-4e30-9f54-ebbe74ad8c6a\") " pod="openstack/ovsdbserver-nb-0" Mar 10 23:20:59 crc kubenswrapper[4919]: I0310 23:20:59.175874 4919 patch_prober.go:28] interesting pod/machine-config-daemon-z7v4t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 23:20:59 crc kubenswrapper[4919]: I0310 23:20:59.175932 4919 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 23:20:59 crc kubenswrapper[4919]: I0310 23:20:59.175980 4919 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" Mar 10 23:20:59 crc kubenswrapper[4919]: I0310 23:20:59.176689 4919 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5ed85a234e315a1e8c0b68df55722ea097e0f8687391361bc1b4250e4cb84b0a"} pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 10 23:20:59 crc kubenswrapper[4919]: I0310 23:20:59.176744 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" containerName="machine-config-daemon" containerID="cri-o://5ed85a234e315a1e8c0b68df55722ea097e0f8687391361bc1b4250e4cb84b0a" gracePeriod=600 Mar 10 23:20:59 crc kubenswrapper[4919]: I0310 23:20:59.211483 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 10 23:20:59 crc kubenswrapper[4919]: I0310 23:20:59.244988 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-2" Mar 10 23:20:59 crc kubenswrapper[4919]: I0310 23:20:59.278189 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-1" Mar 10 23:20:59 crc kubenswrapper[4919]: I0310 23:20:59.801860 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 10 23:20:59 crc kubenswrapper[4919]: W0310 23:20:59.813518 4919 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcf54d78f_9a52_4e30_9f54_ebbe74ad8c6a.slice/crio-1c5d0a0c360a61b7033d1fc27d964d93d0e1d6982d917163a09ed22d1187b22a WatchSource:0}: Error finding container 1c5d0a0c360a61b7033d1fc27d964d93d0e1d6982d917163a09ed22d1187b22a: Status 404 returned error can't find the container with id 1c5d0a0c360a61b7033d1fc27d964d93d0e1d6982d917163a09ed22d1187b22a Mar 10 23:20:59 crc kubenswrapper[4919]: I0310 23:20:59.906906 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-2"] Mar 10 23:21:00 crc kubenswrapper[4919]: I0310 23:21:00.002688 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-1"] Mar 10 23:21:00 crc kubenswrapper[4919]: W0310 23:21:00.009743 4919 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod40dccac8_8ebf_4106_8211_8baffde0a119.slice/crio-f277247a045a133c858b01fea00fcb295389aab350f612e64536dbc54c85cb82 WatchSource:0}: Error finding container f277247a045a133c858b01fea00fcb295389aab350f612e64536dbc54c85cb82: Status 404 returned error can't find the container with id f277247a045a133c858b01fea00fcb295389aab350f612e64536dbc54c85cb82 Mar 10 23:21:00 crc kubenswrapper[4919]: I0310 23:21:00.062304 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"b407bebc-85fa-422e-a5ec-e6d586e4ae11","Type":"ContainerStarted","Data":"067a286ec7ef53064b309c0cf51f98e458fd0baa65f74025c757c94b62959554"} Mar 10 23:21:00 crc kubenswrapper[4919]: I0310 23:21:00.064226 4919 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"40dccac8-8ebf-4106-8211-8baffde0a119","Type":"ContainerStarted","Data":"f277247a045a133c858b01fea00fcb295389aab350f612e64536dbc54c85cb82"} Mar 10 23:21:00 crc kubenswrapper[4919]: I0310 23:21:00.069282 4919 generic.go:334] "Generic (PLEG): container finished" podID="566678d1-f416-4116-ab20-b30dceb86cdc" containerID="5ed85a234e315a1e8c0b68df55722ea097e0f8687391361bc1b4250e4cb84b0a" exitCode=0 Mar 10 23:21:00 crc kubenswrapper[4919]: I0310 23:21:00.069558 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" event={"ID":"566678d1-f416-4116-ab20-b30dceb86cdc","Type":"ContainerDied","Data":"5ed85a234e315a1e8c0b68df55722ea097e0f8687391361bc1b4250e4cb84b0a"} Mar 10 23:21:00 crc kubenswrapper[4919]: I0310 23:21:00.069792 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" event={"ID":"566678d1-f416-4116-ab20-b30dceb86cdc","Type":"ContainerStarted","Data":"bd5f980b375940b8ac50763b940d3f98ebbeb27c9473430bf186d0966bbdfefc"} Mar 10 23:21:00 crc kubenswrapper[4919]: I0310 23:21:00.069904 4919 scope.go:117] "RemoveContainer" containerID="db23846019f61e1fa4301ac7d8406453060a07c19e1dddc6fc1714cf596701e8" Mar 10 23:21:00 crc kubenswrapper[4919]: I0310 23:21:00.077899 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"cf54d78f-9a52-4e30-9f54-ebbe74ad8c6a","Type":"ContainerStarted","Data":"305c33499385c11167667ce6f80f30ed905d76a88b28d0db7e8ee60aecaafb01"} Mar 10 23:21:00 crc kubenswrapper[4919]: I0310 23:21:00.077937 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"cf54d78f-9a52-4e30-9f54-ebbe74ad8c6a","Type":"ContainerStarted","Data":"1c5d0a0c360a61b7033d1fc27d964d93d0e1d6982d917163a09ed22d1187b22a"} Mar 10 23:21:00 crc kubenswrapper[4919]: I0310 23:21:00.587772 4919 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 10 23:21:00 crc kubenswrapper[4919]: I0310 23:21:00.589074 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 10 23:21:00 crc kubenswrapper[4919]: I0310 23:21:00.591684 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Mar 10 23:21:00 crc kubenswrapper[4919]: I0310 23:21:00.591947 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-m9gwn" Mar 10 23:21:00 crc kubenswrapper[4919]: I0310 23:21:00.592101 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Mar 10 23:21:00 crc kubenswrapper[4919]: I0310 23:21:00.593348 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Mar 10 23:21:00 crc kubenswrapper[4919]: I0310 23:21:00.598257 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-2"] Mar 10 23:21:00 crc kubenswrapper[4919]: I0310 23:21:00.599555 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-2" Mar 10 23:21:00 crc kubenswrapper[4919]: I0310 23:21:00.605023 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-1"] Mar 10 23:21:00 crc kubenswrapper[4919]: I0310 23:21:00.606282 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-1" Mar 10 23:21:00 crc kubenswrapper[4919]: I0310 23:21:00.610782 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 10 23:21:00 crc kubenswrapper[4919]: I0310 23:21:00.619479 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-1"] Mar 10 23:21:00 crc kubenswrapper[4919]: I0310 23:21:00.643938 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-2"] Mar 10 23:21:00 crc kubenswrapper[4919]: I0310 23:21:00.729602 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8746d615-5c8a-463f-9b24-b8a4e86fd413-config\") pod \"ovsdbserver-sb-1\" (UID: \"8746d615-5c8a-463f-9b24-b8a4e86fd413\") " pod="openstack/ovsdbserver-sb-1" Mar 10 23:21:00 crc kubenswrapper[4919]: I0310 23:21:00.729653 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/8746d615-5c8a-463f-9b24-b8a4e86fd413-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"8746d615-5c8a-463f-9b24-b8a4e86fd413\") " pod="openstack/ovsdbserver-sb-1" Mar 10 23:21:00 crc kubenswrapper[4919]: I0310 23:21:00.729686 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-fe180f0f-3104-4089-a437-5695a087ddb3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-fe180f0f-3104-4089-a437-5695a087ddb3\") pod \"ovsdbserver-sb-0\" (UID: \"814add4c-f2ef-480d-b701-5c1ea6b8a834\") " pod="openstack/ovsdbserver-sb-0" Mar 10 23:21:00 crc kubenswrapper[4919]: I0310 23:21:00.729711 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vzp2k\" (UniqueName: \"kubernetes.io/projected/814add4c-f2ef-480d-b701-5c1ea6b8a834-kube-api-access-vzp2k\") pod 
\"ovsdbserver-sb-0\" (UID: \"814add4c-f2ef-480d-b701-5c1ea6b8a834\") " pod="openstack/ovsdbserver-sb-0" Mar 10 23:21:00 crc kubenswrapper[4919]: I0310 23:21:00.729732 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-f31594a5-b70b-462f-ae7e-b2c16f2655bf\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f31594a5-b70b-462f-ae7e-b2c16f2655bf\") pod \"ovsdbserver-sb-2\" (UID: \"c0c2cff6-2a56-4b36-b872-cdafb3bf419a\") " pod="openstack/ovsdbserver-sb-2" Mar 10 23:21:00 crc kubenswrapper[4919]: I0310 23:21:00.729753 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/814add4c-f2ef-480d-b701-5c1ea6b8a834-config\") pod \"ovsdbserver-sb-0\" (UID: \"814add4c-f2ef-480d-b701-5c1ea6b8a834\") " pod="openstack/ovsdbserver-sb-0" Mar 10 23:21:00 crc kubenswrapper[4919]: I0310 23:21:00.729771 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/814add4c-f2ef-480d-b701-5c1ea6b8a834-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"814add4c-f2ef-480d-b701-5c1ea6b8a834\") " pod="openstack/ovsdbserver-sb-0" Mar 10 23:21:00 crc kubenswrapper[4919]: I0310 23:21:00.729787 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/814add4c-f2ef-480d-b701-5c1ea6b8a834-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"814add4c-f2ef-480d-b701-5c1ea6b8a834\") " pod="openstack/ovsdbserver-sb-0" Mar 10 23:21:00 crc kubenswrapper[4919]: I0310 23:21:00.729807 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fqtrz\" (UniqueName: \"kubernetes.io/projected/8746d615-5c8a-463f-9b24-b8a4e86fd413-kube-api-access-fqtrz\") pod \"ovsdbserver-sb-1\" (UID: 
\"8746d615-5c8a-463f-9b24-b8a4e86fd413\") " pod="openstack/ovsdbserver-sb-1" Mar 10 23:21:00 crc kubenswrapper[4919]: I0310 23:21:00.729827 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8746d615-5c8a-463f-9b24-b8a4e86fd413-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"8746d615-5c8a-463f-9b24-b8a4e86fd413\") " pod="openstack/ovsdbserver-sb-1" Mar 10 23:21:00 crc kubenswrapper[4919]: I0310 23:21:00.729855 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c0c2cff6-2a56-4b36-b872-cdafb3bf419a-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-2\" (UID: \"c0c2cff6-2a56-4b36-b872-cdafb3bf419a\") " pod="openstack/ovsdbserver-sb-2" Mar 10 23:21:00 crc kubenswrapper[4919]: I0310 23:21:00.729882 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/814add4c-f2ef-480d-b701-5c1ea6b8a834-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"814add4c-f2ef-480d-b701-5c1ea6b8a834\") " pod="openstack/ovsdbserver-sb-0" Mar 10 23:21:00 crc kubenswrapper[4919]: I0310 23:21:00.729900 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c0c2cff6-2a56-4b36-b872-cdafb3bf419a-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"c0c2cff6-2a56-4b36-b872-cdafb3bf419a\") " pod="openstack/ovsdbserver-sb-2" Mar 10 23:21:00 crc kubenswrapper[4919]: I0310 23:21:00.729995 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/814add4c-f2ef-480d-b701-5c1ea6b8a834-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"814add4c-f2ef-480d-b701-5c1ea6b8a834\") " pod="openstack/ovsdbserver-sb-0" 
Mar 10 23:21:00 crc kubenswrapper[4919]: I0310 23:21:00.730047 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0c2cff6-2a56-4b36-b872-cdafb3bf419a-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"c0c2cff6-2a56-4b36-b872-cdafb3bf419a\") " pod="openstack/ovsdbserver-sb-2" Mar 10 23:21:00 crc kubenswrapper[4919]: I0310 23:21:00.730073 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8746d615-5c8a-463f-9b24-b8a4e86fd413-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"8746d615-5c8a-463f-9b24-b8a4e86fd413\") " pod="openstack/ovsdbserver-sb-1" Mar 10 23:21:00 crc kubenswrapper[4919]: I0310 23:21:00.730097 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0c2cff6-2a56-4b36-b872-cdafb3bf419a-config\") pod \"ovsdbserver-sb-2\" (UID: \"c0c2cff6-2a56-4b36-b872-cdafb3bf419a\") " pod="openstack/ovsdbserver-sb-2" Mar 10 23:21:00 crc kubenswrapper[4919]: I0310 23:21:00.730128 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c0c2cff6-2a56-4b36-b872-cdafb3bf419a-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-2\" (UID: \"c0c2cff6-2a56-4b36-b872-cdafb3bf419a\") " pod="openstack/ovsdbserver-sb-2" Mar 10 23:21:00 crc kubenswrapper[4919]: I0310 23:21:00.730235 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c0c2cff6-2a56-4b36-b872-cdafb3bf419a-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"c0c2cff6-2a56-4b36-b872-cdafb3bf419a\") " pod="openstack/ovsdbserver-sb-2" Mar 10 23:21:00 crc kubenswrapper[4919]: I0310 23:21:00.730267 4919 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8746d615-5c8a-463f-9b24-b8a4e86fd413-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-1\" (UID: \"8746d615-5c8a-463f-9b24-b8a4e86fd413\") " pod="openstack/ovsdbserver-sb-1" Mar 10 23:21:00 crc kubenswrapper[4919]: I0310 23:21:00.730291 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-e8e39a38-7b14-4289-9f54-e1e1ddb4fb63\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e8e39a38-7b14-4289-9f54-e1e1ddb4fb63\") pod \"ovsdbserver-sb-1\" (UID: \"8746d615-5c8a-463f-9b24-b8a4e86fd413\") " pod="openstack/ovsdbserver-sb-1" Mar 10 23:21:00 crc kubenswrapper[4919]: I0310 23:21:00.730334 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/8746d615-5c8a-463f-9b24-b8a4e86fd413-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-1\" (UID: \"8746d615-5c8a-463f-9b24-b8a4e86fd413\") " pod="openstack/ovsdbserver-sb-1" Mar 10 23:21:00 crc kubenswrapper[4919]: I0310 23:21:00.730369 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rwd54\" (UniqueName: \"kubernetes.io/projected/c0c2cff6-2a56-4b36-b872-cdafb3bf419a-kube-api-access-rwd54\") pod \"ovsdbserver-sb-2\" (UID: \"c0c2cff6-2a56-4b36-b872-cdafb3bf419a\") " pod="openstack/ovsdbserver-sb-2" Mar 10 23:21:00 crc kubenswrapper[4919]: I0310 23:21:00.730407 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/814add4c-f2ef-480d-b701-5c1ea6b8a834-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"814add4c-f2ef-480d-b701-5c1ea6b8a834\") " pod="openstack/ovsdbserver-sb-0" Mar 10 23:21:00 crc kubenswrapper[4919]: 
I0310 23:21:00.835308 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vzp2k\" (UniqueName: \"kubernetes.io/projected/814add4c-f2ef-480d-b701-5c1ea6b8a834-kube-api-access-vzp2k\") pod \"ovsdbserver-sb-0\" (UID: \"814add4c-f2ef-480d-b701-5c1ea6b8a834\") " pod="openstack/ovsdbserver-sb-0" Mar 10 23:21:00 crc kubenswrapper[4919]: I0310 23:21:00.835675 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-f31594a5-b70b-462f-ae7e-b2c16f2655bf\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f31594a5-b70b-462f-ae7e-b2c16f2655bf\") pod \"ovsdbserver-sb-2\" (UID: \"c0c2cff6-2a56-4b36-b872-cdafb3bf419a\") " pod="openstack/ovsdbserver-sb-2" Mar 10 23:21:00 crc kubenswrapper[4919]: I0310 23:21:00.835724 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/814add4c-f2ef-480d-b701-5c1ea6b8a834-config\") pod \"ovsdbserver-sb-0\" (UID: \"814add4c-f2ef-480d-b701-5c1ea6b8a834\") " pod="openstack/ovsdbserver-sb-0" Mar 10 23:21:00 crc kubenswrapper[4919]: I0310 23:21:00.835747 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/814add4c-f2ef-480d-b701-5c1ea6b8a834-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"814add4c-f2ef-480d-b701-5c1ea6b8a834\") " pod="openstack/ovsdbserver-sb-0" Mar 10 23:21:00 crc kubenswrapper[4919]: I0310 23:21:00.835779 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/814add4c-f2ef-480d-b701-5c1ea6b8a834-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"814add4c-f2ef-480d-b701-5c1ea6b8a834\") " pod="openstack/ovsdbserver-sb-0" Mar 10 23:21:00 crc kubenswrapper[4919]: I0310 23:21:00.835818 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fqtrz\" 
(UniqueName: \"kubernetes.io/projected/8746d615-5c8a-463f-9b24-b8a4e86fd413-kube-api-access-fqtrz\") pod \"ovsdbserver-sb-1\" (UID: \"8746d615-5c8a-463f-9b24-b8a4e86fd413\") " pod="openstack/ovsdbserver-sb-1" Mar 10 23:21:00 crc kubenswrapper[4919]: I0310 23:21:00.835850 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8746d615-5c8a-463f-9b24-b8a4e86fd413-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"8746d615-5c8a-463f-9b24-b8a4e86fd413\") " pod="openstack/ovsdbserver-sb-1" Mar 10 23:21:00 crc kubenswrapper[4919]: I0310 23:21:00.835916 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c0c2cff6-2a56-4b36-b872-cdafb3bf419a-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-2\" (UID: \"c0c2cff6-2a56-4b36-b872-cdafb3bf419a\") " pod="openstack/ovsdbserver-sb-2" Mar 10 23:21:00 crc kubenswrapper[4919]: I0310 23:21:00.835986 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/814add4c-f2ef-480d-b701-5c1ea6b8a834-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"814add4c-f2ef-480d-b701-5c1ea6b8a834\") " pod="openstack/ovsdbserver-sb-0" Mar 10 23:21:00 crc kubenswrapper[4919]: I0310 23:21:00.836016 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c0c2cff6-2a56-4b36-b872-cdafb3bf419a-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"c0c2cff6-2a56-4b36-b872-cdafb3bf419a\") " pod="openstack/ovsdbserver-sb-2" Mar 10 23:21:00 crc kubenswrapper[4919]: I0310 23:21:00.836063 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/814add4c-f2ef-480d-b701-5c1ea6b8a834-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"814add4c-f2ef-480d-b701-5c1ea6b8a834\") 
" pod="openstack/ovsdbserver-sb-0" Mar 10 23:21:00 crc kubenswrapper[4919]: I0310 23:21:00.836091 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0c2cff6-2a56-4b36-b872-cdafb3bf419a-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"c0c2cff6-2a56-4b36-b872-cdafb3bf419a\") " pod="openstack/ovsdbserver-sb-2" Mar 10 23:21:00 crc kubenswrapper[4919]: I0310 23:21:00.836113 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8746d615-5c8a-463f-9b24-b8a4e86fd413-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"8746d615-5c8a-463f-9b24-b8a4e86fd413\") " pod="openstack/ovsdbserver-sb-1" Mar 10 23:21:00 crc kubenswrapper[4919]: I0310 23:21:00.836133 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0c2cff6-2a56-4b36-b872-cdafb3bf419a-config\") pod \"ovsdbserver-sb-2\" (UID: \"c0c2cff6-2a56-4b36-b872-cdafb3bf419a\") " pod="openstack/ovsdbserver-sb-2" Mar 10 23:21:00 crc kubenswrapper[4919]: I0310 23:21:00.836165 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c0c2cff6-2a56-4b36-b872-cdafb3bf419a-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-2\" (UID: \"c0c2cff6-2a56-4b36-b872-cdafb3bf419a\") " pod="openstack/ovsdbserver-sb-2" Mar 10 23:21:00 crc kubenswrapper[4919]: I0310 23:21:00.836231 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c0c2cff6-2a56-4b36-b872-cdafb3bf419a-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"c0c2cff6-2a56-4b36-b872-cdafb3bf419a\") " pod="openstack/ovsdbserver-sb-2" Mar 10 23:21:00 crc kubenswrapper[4919]: I0310 23:21:00.836267 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8746d615-5c8a-463f-9b24-b8a4e86fd413-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-1\" (UID: \"8746d615-5c8a-463f-9b24-b8a4e86fd413\") " pod="openstack/ovsdbserver-sb-1" Mar 10 23:21:00 crc kubenswrapper[4919]: I0310 23:21:00.836302 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-e8e39a38-7b14-4289-9f54-e1e1ddb4fb63\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e8e39a38-7b14-4289-9f54-e1e1ddb4fb63\") pod \"ovsdbserver-sb-1\" (UID: \"8746d615-5c8a-463f-9b24-b8a4e86fd413\") " pod="openstack/ovsdbserver-sb-1" Mar 10 23:21:00 crc kubenswrapper[4919]: I0310 23:21:00.836325 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/8746d615-5c8a-463f-9b24-b8a4e86fd413-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-1\" (UID: \"8746d615-5c8a-463f-9b24-b8a4e86fd413\") " pod="openstack/ovsdbserver-sb-1" Mar 10 23:21:00 crc kubenswrapper[4919]: I0310 23:21:00.836348 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rwd54\" (UniqueName: \"kubernetes.io/projected/c0c2cff6-2a56-4b36-b872-cdafb3bf419a-kube-api-access-rwd54\") pod \"ovsdbserver-sb-2\" (UID: \"c0c2cff6-2a56-4b36-b872-cdafb3bf419a\") " pod="openstack/ovsdbserver-sb-2" Mar 10 23:21:00 crc kubenswrapper[4919]: I0310 23:21:00.836401 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/814add4c-f2ef-480d-b701-5c1ea6b8a834-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"814add4c-f2ef-480d-b701-5c1ea6b8a834\") " pod="openstack/ovsdbserver-sb-0" Mar 10 23:21:00 crc kubenswrapper[4919]: I0310 23:21:00.836436 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/8746d615-5c8a-463f-9b24-b8a4e86fd413-config\") pod \"ovsdbserver-sb-1\" (UID: \"8746d615-5c8a-463f-9b24-b8a4e86fd413\") " pod="openstack/ovsdbserver-sb-1" Mar 10 23:21:00 crc kubenswrapper[4919]: I0310 23:21:00.836479 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/8746d615-5c8a-463f-9b24-b8a4e86fd413-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"8746d615-5c8a-463f-9b24-b8a4e86fd413\") " pod="openstack/ovsdbserver-sb-1" Mar 10 23:21:00 crc kubenswrapper[4919]: I0310 23:21:00.836529 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-fe180f0f-3104-4089-a437-5695a087ddb3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-fe180f0f-3104-4089-a437-5695a087ddb3\") pod \"ovsdbserver-sb-0\" (UID: \"814add4c-f2ef-480d-b701-5c1ea6b8a834\") " pod="openstack/ovsdbserver-sb-0" Mar 10 23:21:00 crc kubenswrapper[4919]: I0310 23:21:00.836750 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/814add4c-f2ef-480d-b701-5c1ea6b8a834-config\") pod \"ovsdbserver-sb-0\" (UID: \"814add4c-f2ef-480d-b701-5c1ea6b8a834\") " pod="openstack/ovsdbserver-sb-0" Mar 10 23:21:00 crc kubenswrapper[4919]: I0310 23:21:00.837959 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/814add4c-f2ef-480d-b701-5c1ea6b8a834-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"814add4c-f2ef-480d-b701-5c1ea6b8a834\") " pod="openstack/ovsdbserver-sb-0" Mar 10 23:21:00 crc kubenswrapper[4919]: I0310 23:21:00.841951 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/8746d615-5c8a-463f-9b24-b8a4e86fd413-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"8746d615-5c8a-463f-9b24-b8a4e86fd413\") " pod="openstack/ovsdbserver-sb-1" Mar 10 
23:21:00 crc kubenswrapper[4919]: I0310 23:21:00.842710 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8746d615-5c8a-463f-9b24-b8a4e86fd413-config\") pod \"ovsdbserver-sb-1\" (UID: \"8746d615-5c8a-463f-9b24-b8a4e86fd413\") " pod="openstack/ovsdbserver-sb-1" Mar 10 23:21:00 crc kubenswrapper[4919]: I0310 23:21:00.843353 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c0c2cff6-2a56-4b36-b872-cdafb3bf419a-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-2\" (UID: \"c0c2cff6-2a56-4b36-b872-cdafb3bf419a\") " pod="openstack/ovsdbserver-sb-2" Mar 10 23:21:00 crc kubenswrapper[4919]: I0310 23:21:00.843835 4919 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 10 23:21:00 crc kubenswrapper[4919]: I0310 23:21:00.843883 4919 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-f31594a5-b70b-462f-ae7e-b2c16f2655bf\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f31594a5-b70b-462f-ae7e-b2c16f2655bf\") pod \"ovsdbserver-sb-2\" (UID: \"c0c2cff6-2a56-4b36-b872-cdafb3bf419a\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/d61213ca1fec37e1d411d53d63f95a1d4e6dda62e005a38476e1b2deb53e92c2/globalmount\"" pod="openstack/ovsdbserver-sb-2" Mar 10 23:21:00 crc kubenswrapper[4919]: I0310 23:21:00.844663 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8746d615-5c8a-463f-9b24-b8a4e86fd413-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"8746d615-5c8a-463f-9b24-b8a4e86fd413\") " pod="openstack/ovsdbserver-sb-1" Mar 10 23:21:00 crc kubenswrapper[4919]: I0310 23:21:00.845235 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/814add4c-f2ef-480d-b701-5c1ea6b8a834-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"814add4c-f2ef-480d-b701-5c1ea6b8a834\") " pod="openstack/ovsdbserver-sb-0" Mar 10 23:21:00 crc kubenswrapper[4919]: I0310 23:21:00.845509 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/814add4c-f2ef-480d-b701-5c1ea6b8a834-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"814add4c-f2ef-480d-b701-5c1ea6b8a834\") " pod="openstack/ovsdbserver-sb-0" Mar 10 23:21:00 crc kubenswrapper[4919]: I0310 23:21:00.845573 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/814add4c-f2ef-480d-b701-5c1ea6b8a834-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"814add4c-f2ef-480d-b701-5c1ea6b8a834\") " pod="openstack/ovsdbserver-sb-0" Mar 10 23:21:00 crc kubenswrapper[4919]: I0310 23:21:00.846720 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c0c2cff6-2a56-4b36-b872-cdafb3bf419a-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"c0c2cff6-2a56-4b36-b872-cdafb3bf419a\") " pod="openstack/ovsdbserver-sb-2" Mar 10 23:21:00 crc kubenswrapper[4919]: I0310 23:21:00.846866 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c0c2cff6-2a56-4b36-b872-cdafb3bf419a-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"c0c2cff6-2a56-4b36-b872-cdafb3bf419a\") " pod="openstack/ovsdbserver-sb-2" Mar 10 23:21:00 crc kubenswrapper[4919]: I0310 23:21:00.846874 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0c2cff6-2a56-4b36-b872-cdafb3bf419a-config\") pod \"ovsdbserver-sb-2\" (UID: \"c0c2cff6-2a56-4b36-b872-cdafb3bf419a\") " pod="openstack/ovsdbserver-sb-2" Mar 10 23:21:00 crc kubenswrapper[4919]: 
I0310 23:21:00.847774 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8746d615-5c8a-463f-9b24-b8a4e86fd413-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-1\" (UID: \"8746d615-5c8a-463f-9b24-b8a4e86fd413\") " pod="openstack/ovsdbserver-sb-1" Mar 10 23:21:00 crc kubenswrapper[4919]: I0310 23:21:00.854732 4919 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 10 23:21:00 crc kubenswrapper[4919]: I0310 23:21:00.854767 4919 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-e8e39a38-7b14-4289-9f54-e1e1ddb4fb63\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e8e39a38-7b14-4289-9f54-e1e1ddb4fb63\") pod \"ovsdbserver-sb-1\" (UID: \"8746d615-5c8a-463f-9b24-b8a4e86fd413\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/0087590dfb2edda9459aa18600a4f8c6f4a586641ac5ea05e67d73cb53d6706f/globalmount\"" pod="openstack/ovsdbserver-sb-1" Mar 10 23:21:00 crc kubenswrapper[4919]: I0310 23:21:00.854897 4919 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 10 23:21:00 crc kubenswrapper[4919]: I0310 23:21:00.854944 4919 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-fe180f0f-3104-4089-a437-5695a087ddb3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-fe180f0f-3104-4089-a437-5695a087ddb3\") pod \"ovsdbserver-sb-0\" (UID: \"814add4c-f2ef-480d-b701-5c1ea6b8a834\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/6bbd84935818969fb9fd96eb1ac69d45e5ffb23e117a3c021579a70805aa5063/globalmount\"" pod="openstack/ovsdbserver-sb-0" Mar 10 23:21:00 crc kubenswrapper[4919]: I0310 23:21:00.859754 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/8746d615-5c8a-463f-9b24-b8a4e86fd413-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-1\" (UID: \"8746d615-5c8a-463f-9b24-b8a4e86fd413\") " pod="openstack/ovsdbserver-sb-1" Mar 10 23:21:00 crc kubenswrapper[4919]: I0310 23:21:00.862107 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rwd54\" (UniqueName: \"kubernetes.io/projected/c0c2cff6-2a56-4b36-b872-cdafb3bf419a-kube-api-access-rwd54\") pod \"ovsdbserver-sb-2\" (UID: \"c0c2cff6-2a56-4b36-b872-cdafb3bf419a\") " pod="openstack/ovsdbserver-sb-2" Mar 10 23:21:00 crc kubenswrapper[4919]: I0310 23:21:00.863332 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c0c2cff6-2a56-4b36-b872-cdafb3bf419a-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-2\" (UID: \"c0c2cff6-2a56-4b36-b872-cdafb3bf419a\") " pod="openstack/ovsdbserver-sb-2" Mar 10 23:21:00 crc kubenswrapper[4919]: I0310 23:21:00.864685 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/814add4c-f2ef-480d-b701-5c1ea6b8a834-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: 
\"814add4c-f2ef-480d-b701-5c1ea6b8a834\") " pod="openstack/ovsdbserver-sb-0" Mar 10 23:21:00 crc kubenswrapper[4919]: I0310 23:21:00.864831 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0c2cff6-2a56-4b36-b872-cdafb3bf419a-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"c0c2cff6-2a56-4b36-b872-cdafb3bf419a\") " pod="openstack/ovsdbserver-sb-2" Mar 10 23:21:00 crc kubenswrapper[4919]: I0310 23:21:00.865270 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8746d615-5c8a-463f-9b24-b8a4e86fd413-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"8746d615-5c8a-463f-9b24-b8a4e86fd413\") " pod="openstack/ovsdbserver-sb-1" Mar 10 23:21:00 crc kubenswrapper[4919]: I0310 23:21:00.865835 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fqtrz\" (UniqueName: \"kubernetes.io/projected/8746d615-5c8a-463f-9b24-b8a4e86fd413-kube-api-access-fqtrz\") pod \"ovsdbserver-sb-1\" (UID: \"8746d615-5c8a-463f-9b24-b8a4e86fd413\") " pod="openstack/ovsdbserver-sb-1" Mar 10 23:21:00 crc kubenswrapper[4919]: I0310 23:21:00.867522 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vzp2k\" (UniqueName: \"kubernetes.io/projected/814add4c-f2ef-480d-b701-5c1ea6b8a834-kube-api-access-vzp2k\") pod \"ovsdbserver-sb-0\" (UID: \"814add4c-f2ef-480d-b701-5c1ea6b8a834\") " pod="openstack/ovsdbserver-sb-0" Mar 10 23:21:00 crc kubenswrapper[4919]: I0310 23:21:00.888718 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-fe180f0f-3104-4089-a437-5695a087ddb3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-fe180f0f-3104-4089-a437-5695a087ddb3\") pod \"ovsdbserver-sb-0\" (UID: \"814add4c-f2ef-480d-b701-5c1ea6b8a834\") " pod="openstack/ovsdbserver-sb-0" Mar 10 23:21:00 crc kubenswrapper[4919]: I0310 
23:21:00.899653 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-f31594a5-b70b-462f-ae7e-b2c16f2655bf\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f31594a5-b70b-462f-ae7e-b2c16f2655bf\") pod \"ovsdbserver-sb-2\" (UID: \"c0c2cff6-2a56-4b36-b872-cdafb3bf419a\") " pod="openstack/ovsdbserver-sb-2" Mar 10 23:21:00 crc kubenswrapper[4919]: I0310 23:21:00.901528 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 10 23:21:00 crc kubenswrapper[4919]: I0310 23:21:00.903722 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-e8e39a38-7b14-4289-9f54-e1e1ddb4fb63\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e8e39a38-7b14-4289-9f54-e1e1ddb4fb63\") pod \"ovsdbserver-sb-1\" (UID: \"8746d615-5c8a-463f-9b24-b8a4e86fd413\") " pod="openstack/ovsdbserver-sb-1" Mar 10 23:21:00 crc kubenswrapper[4919]: I0310 23:21:00.970022 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-2" Mar 10 23:21:00 crc kubenswrapper[4919]: I0310 23:21:00.977300 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-1" Mar 10 23:21:01 crc kubenswrapper[4919]: I0310 23:21:01.108776 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"cf54d78f-9a52-4e30-9f54-ebbe74ad8c6a","Type":"ContainerStarted","Data":"f81be565efcb69e653e8c1eebe307e79be4d2f95b6c7d9143527eb9c310e5f67"} Mar 10 23:21:01 crc kubenswrapper[4919]: I0310 23:21:01.140725 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=4.140707621 podStartE2EDuration="4.140707621s" podCreationTimestamp="2026-03-10 23:20:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 23:21:01.138372867 +0000 UTC m=+5448.380253465" watchObservedRunningTime="2026-03-10 23:21:01.140707621 +0000 UTC m=+5448.382588229" Mar 10 23:21:01 crc kubenswrapper[4919]: I0310 23:21:01.159527 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"b407bebc-85fa-422e-a5ec-e6d586e4ae11","Type":"ContainerStarted","Data":"66ab61bf9e5bb93cde9cf19a08ccdfccdf2d93e8210a63ea92243c65b639cc01"} Mar 10 23:21:01 crc kubenswrapper[4919]: I0310 23:21:01.159591 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"b407bebc-85fa-422e-a5ec-e6d586e4ae11","Type":"ContainerStarted","Data":"12dc730f3590cc9d2b414b25f4015001c413f8f5d1992f9a31c794e8f68e88fd"} Mar 10 23:21:01 crc kubenswrapper[4919]: I0310 23:21:01.164826 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"40dccac8-8ebf-4106-8211-8baffde0a119","Type":"ContainerStarted","Data":"842b633b73705e57bd78c8998fecfaee9bb9088af2852e0bb54572bf551d93dc"} Mar 10 23:21:01 crc kubenswrapper[4919]: I0310 23:21:01.164868 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" 
event={"ID":"40dccac8-8ebf-4106-8211-8baffde0a119","Type":"ContainerStarted","Data":"5214a143ea7ec3719882e2788e6ceaaaeb65afb7c08d567d0bdaf794be39ce47"} Mar 10 23:21:01 crc kubenswrapper[4919]: I0310 23:21:01.182851 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-2" podStartSLOduration=4.182832369 podStartE2EDuration="4.182832369s" podCreationTimestamp="2026-03-10 23:20:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 23:21:01.17919784 +0000 UTC m=+5448.421078448" watchObservedRunningTime="2026-03-10 23:21:01.182832369 +0000 UTC m=+5448.424712977" Mar 10 23:21:01 crc kubenswrapper[4919]: I0310 23:21:01.214050 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-1" podStartSLOduration=4.214020159 podStartE2EDuration="4.214020159s" podCreationTimestamp="2026-03-10 23:20:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 23:21:01.202552846 +0000 UTC m=+5448.444433474" watchObservedRunningTime="2026-03-10 23:21:01.214020159 +0000 UTC m=+5448.455900787" Mar 10 23:21:01 crc kubenswrapper[4919]: W0310 23:21:01.404588 4919 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod814add4c_f2ef_480d_b701_5c1ea6b8a834.slice/crio-68cab553623a3c874594f4994dcea12cbd82dff4307523bc83bdddd54006ae62 WatchSource:0}: Error finding container 68cab553623a3c874594f4994dcea12cbd82dff4307523bc83bdddd54006ae62: Status 404 returned error can't find the container with id 68cab553623a3c874594f4994dcea12cbd82dff4307523bc83bdddd54006ae62 Mar 10 23:21:01 crc kubenswrapper[4919]: I0310 23:21:01.404751 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 10 23:21:01 crc 
kubenswrapper[4919]: I0310 23:21:01.548729 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-2"] Mar 10 23:21:01 crc kubenswrapper[4919]: I0310 23:21:01.652131 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-1"] Mar 10 23:21:01 crc kubenswrapper[4919]: W0310 23:21:01.656028 4919 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8746d615_5c8a_463f_9b24_b8a4e86fd413.slice/crio-dd0ee384aab6f6f31e2b4294de890afe269d6fca681d71be18282dbc07447950 WatchSource:0}: Error finding container dd0ee384aab6f6f31e2b4294de890afe269d6fca681d71be18282dbc07447950: Status 404 returned error can't find the container with id dd0ee384aab6f6f31e2b4294de890afe269d6fca681d71be18282dbc07447950 Mar 10 23:21:02 crc kubenswrapper[4919]: I0310 23:21:02.176384 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"c0c2cff6-2a56-4b36-b872-cdafb3bf419a","Type":"ContainerStarted","Data":"5a9b3cdd5800ddd72979b5b8b76137a023049ef8eeafe11058da17b17172285e"} Mar 10 23:21:02 crc kubenswrapper[4919]: I0310 23:21:02.176850 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"c0c2cff6-2a56-4b36-b872-cdafb3bf419a","Type":"ContainerStarted","Data":"8f6f2a64dd954d1b5d59b6eddadd45a4a59b18d9e1e42c0289c2abc34985d826"} Mar 10 23:21:02 crc kubenswrapper[4919]: I0310 23:21:02.176867 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"c0c2cff6-2a56-4b36-b872-cdafb3bf419a","Type":"ContainerStarted","Data":"eb62c23cf892f26de2a5b2e217c9018409e7119f3140348a75dda8c147f272ab"} Mar 10 23:21:02 crc kubenswrapper[4919]: I0310 23:21:02.179299 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" 
event={"ID":"8746d615-5c8a-463f-9b24-b8a4e86fd413","Type":"ContainerStarted","Data":"8ad73cb825f8b2111ad64d1e5257b4f1ad100f2fb58c62126d4056abb4d6ebb8"} Mar 10 23:21:02 crc kubenswrapper[4919]: I0310 23:21:02.179323 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"8746d615-5c8a-463f-9b24-b8a4e86fd413","Type":"ContainerStarted","Data":"33e17decb117bfb4582d41006cd85430c37ab3eeb0a925b949fe31d22ddefcbe"} Mar 10 23:21:02 crc kubenswrapper[4919]: I0310 23:21:02.179334 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"8746d615-5c8a-463f-9b24-b8a4e86fd413","Type":"ContainerStarted","Data":"dd0ee384aab6f6f31e2b4294de890afe269d6fca681d71be18282dbc07447950"} Mar 10 23:21:02 crc kubenswrapper[4919]: I0310 23:21:02.180762 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"814add4c-f2ef-480d-b701-5c1ea6b8a834","Type":"ContainerStarted","Data":"e7d9540da63c4a1178dd8bb63920f3d0116136ae1b90561021033a9b6a287a8a"} Mar 10 23:21:02 crc kubenswrapper[4919]: I0310 23:21:02.180808 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"814add4c-f2ef-480d-b701-5c1ea6b8a834","Type":"ContainerStarted","Data":"f9ae97fc49fdc4c8be91dd3af2de1eadb0b82fd5e1a162b4af5981c3d769f969"} Mar 10 23:21:02 crc kubenswrapper[4919]: I0310 23:21:02.180825 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"814add4c-f2ef-480d-b701-5c1ea6b8a834","Type":"ContainerStarted","Data":"68cab553623a3c874594f4994dcea12cbd82dff4307523bc83bdddd54006ae62"} Mar 10 23:21:02 crc kubenswrapper[4919]: I0310 23:21:02.201108 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-2" podStartSLOduration=3.201082264 podStartE2EDuration="3.201082264s" podCreationTimestamp="2026-03-10 23:20:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 
+0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 23:21:02.192757357 +0000 UTC m=+5449.434637965" watchObservedRunningTime="2026-03-10 23:21:02.201082264 +0000 UTC m=+5449.442962912" Mar 10 23:21:02 crc kubenswrapper[4919]: I0310 23:21:02.213828 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Mar 10 23:21:02 crc kubenswrapper[4919]: I0310 23:21:02.225159 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=3.22513634 podStartE2EDuration="3.22513634s" podCreationTimestamp="2026-03-10 23:20:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 23:21:02.214037728 +0000 UTC m=+5449.455918336" watchObservedRunningTime="2026-03-10 23:21:02.22513634 +0000 UTC m=+5449.467016948" Mar 10 23:21:02 crc kubenswrapper[4919]: I0310 23:21:02.237664 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-1" podStartSLOduration=3.23763681 podStartE2EDuration="3.23763681s" podCreationTimestamp="2026-03-10 23:20:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 23:21:02.233720594 +0000 UTC m=+5449.475601202" watchObservedRunningTime="2026-03-10 23:21:02.23763681 +0000 UTC m=+5449.479517438" Mar 10 23:21:02 crc kubenswrapper[4919]: I0310 23:21:02.245753 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-2" Mar 10 23:21:02 crc kubenswrapper[4919]: I0310 23:21:02.260222 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Mar 10 23:21:02 crc kubenswrapper[4919]: I0310 23:21:02.279670 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack/ovsdbserver-nb-1" Mar 10 23:21:03 crc kubenswrapper[4919]: I0310 23:21:03.189752 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Mar 10 23:21:03 crc kubenswrapper[4919]: I0310 23:21:03.902378 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Mar 10 23:21:03 crc kubenswrapper[4919]: I0310 23:21:03.970950 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-2" Mar 10 23:21:03 crc kubenswrapper[4919]: I0310 23:21:03.979213 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-1" Mar 10 23:21:04 crc kubenswrapper[4919]: I0310 23:21:04.246537 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-2" Mar 10 23:21:04 crc kubenswrapper[4919]: I0310 23:21:04.250235 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Mar 10 23:21:04 crc kubenswrapper[4919]: I0310 23:21:04.279783 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-1" Mar 10 23:21:04 crc kubenswrapper[4919]: I0310 23:21:04.525549 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-594d96f99f-mbws7"] Mar 10 23:21:04 crc kubenswrapper[4919]: I0310 23:21:04.526761 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-594d96f99f-mbws7" Mar 10 23:21:04 crc kubenswrapper[4919]: I0310 23:21:04.529362 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Mar 10 23:21:04 crc kubenswrapper[4919]: I0310 23:21:04.550606 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-594d96f99f-mbws7"] Mar 10 23:21:04 crc kubenswrapper[4919]: I0310 23:21:04.608420 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79633431-52da-4a16-9cdc-f7f7f9f75234-config\") pod \"dnsmasq-dns-594d96f99f-mbws7\" (UID: \"79633431-52da-4a16-9cdc-f7f7f9f75234\") " pod="openstack/dnsmasq-dns-594d96f99f-mbws7" Mar 10 23:21:04 crc kubenswrapper[4919]: I0310 23:21:04.608524 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8zjk\" (UniqueName: \"kubernetes.io/projected/79633431-52da-4a16-9cdc-f7f7f9f75234-kube-api-access-r8zjk\") pod \"dnsmasq-dns-594d96f99f-mbws7\" (UID: \"79633431-52da-4a16-9cdc-f7f7f9f75234\") " pod="openstack/dnsmasq-dns-594d96f99f-mbws7" Mar 10 23:21:04 crc kubenswrapper[4919]: I0310 23:21:04.608555 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/79633431-52da-4a16-9cdc-f7f7f9f75234-ovsdbserver-nb\") pod \"dnsmasq-dns-594d96f99f-mbws7\" (UID: \"79633431-52da-4a16-9cdc-f7f7f9f75234\") " pod="openstack/dnsmasq-dns-594d96f99f-mbws7" Mar 10 23:21:04 crc kubenswrapper[4919]: I0310 23:21:04.608586 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/79633431-52da-4a16-9cdc-f7f7f9f75234-dns-svc\") pod \"dnsmasq-dns-594d96f99f-mbws7\" (UID: \"79633431-52da-4a16-9cdc-f7f7f9f75234\") " 
pod="openstack/dnsmasq-dns-594d96f99f-mbws7" Mar 10 23:21:04 crc kubenswrapper[4919]: I0310 23:21:04.710242 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/79633431-52da-4a16-9cdc-f7f7f9f75234-ovsdbserver-nb\") pod \"dnsmasq-dns-594d96f99f-mbws7\" (UID: \"79633431-52da-4a16-9cdc-f7f7f9f75234\") " pod="openstack/dnsmasq-dns-594d96f99f-mbws7" Mar 10 23:21:04 crc kubenswrapper[4919]: I0310 23:21:04.710305 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/79633431-52da-4a16-9cdc-f7f7f9f75234-dns-svc\") pod \"dnsmasq-dns-594d96f99f-mbws7\" (UID: \"79633431-52da-4a16-9cdc-f7f7f9f75234\") " pod="openstack/dnsmasq-dns-594d96f99f-mbws7" Mar 10 23:21:04 crc kubenswrapper[4919]: I0310 23:21:04.710408 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79633431-52da-4a16-9cdc-f7f7f9f75234-config\") pod \"dnsmasq-dns-594d96f99f-mbws7\" (UID: \"79633431-52da-4a16-9cdc-f7f7f9f75234\") " pod="openstack/dnsmasq-dns-594d96f99f-mbws7" Mar 10 23:21:04 crc kubenswrapper[4919]: I0310 23:21:04.710464 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r8zjk\" (UniqueName: \"kubernetes.io/projected/79633431-52da-4a16-9cdc-f7f7f9f75234-kube-api-access-r8zjk\") pod \"dnsmasq-dns-594d96f99f-mbws7\" (UID: \"79633431-52da-4a16-9cdc-f7f7f9f75234\") " pod="openstack/dnsmasq-dns-594d96f99f-mbws7" Mar 10 23:21:04 crc kubenswrapper[4919]: I0310 23:21:04.711049 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/79633431-52da-4a16-9cdc-f7f7f9f75234-ovsdbserver-nb\") pod \"dnsmasq-dns-594d96f99f-mbws7\" (UID: \"79633431-52da-4a16-9cdc-f7f7f9f75234\") " pod="openstack/dnsmasq-dns-594d96f99f-mbws7" Mar 10 23:21:04 crc 
kubenswrapper[4919]: I0310 23:21:04.711174 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79633431-52da-4a16-9cdc-f7f7f9f75234-config\") pod \"dnsmasq-dns-594d96f99f-mbws7\" (UID: \"79633431-52da-4a16-9cdc-f7f7f9f75234\") " pod="openstack/dnsmasq-dns-594d96f99f-mbws7" Mar 10 23:21:04 crc kubenswrapper[4919]: I0310 23:21:04.711302 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/79633431-52da-4a16-9cdc-f7f7f9f75234-dns-svc\") pod \"dnsmasq-dns-594d96f99f-mbws7\" (UID: \"79633431-52da-4a16-9cdc-f7f7f9f75234\") " pod="openstack/dnsmasq-dns-594d96f99f-mbws7" Mar 10 23:21:04 crc kubenswrapper[4919]: I0310 23:21:04.731899 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r8zjk\" (UniqueName: \"kubernetes.io/projected/79633431-52da-4a16-9cdc-f7f7f9f75234-kube-api-access-r8zjk\") pod \"dnsmasq-dns-594d96f99f-mbws7\" (UID: \"79633431-52da-4a16-9cdc-f7f7f9f75234\") " pod="openstack/dnsmasq-dns-594d96f99f-mbws7" Mar 10 23:21:04 crc kubenswrapper[4919]: I0310 23:21:04.847056 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-594d96f99f-mbws7" Mar 10 23:21:05 crc kubenswrapper[4919]: I0310 23:21:05.064679 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-594d96f99f-mbws7"] Mar 10 23:21:05 crc kubenswrapper[4919]: I0310 23:21:05.207455 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-594d96f99f-mbws7" event={"ID":"79633431-52da-4a16-9cdc-f7f7f9f75234","Type":"ContainerStarted","Data":"5741cd78e32a7084b32a72e5bca7265b855eb8bd7d51e47b90958610b28956ee"} Mar 10 23:21:05 crc kubenswrapper[4919]: I0310 23:21:05.285790 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-2" Mar 10 23:21:05 crc kubenswrapper[4919]: I0310 23:21:05.318033 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-1" Mar 10 23:21:05 crc kubenswrapper[4919]: I0310 23:21:05.331382 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-2" Mar 10 23:21:05 crc kubenswrapper[4919]: I0310 23:21:05.360088 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-1" Mar 10 23:21:05 crc kubenswrapper[4919]: I0310 23:21:05.902461 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Mar 10 23:21:05 crc kubenswrapper[4919]: I0310 23:21:05.971221 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-2" Mar 10 23:21:05 crc kubenswrapper[4919]: I0310 23:21:05.978958 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-1" Mar 10 23:21:06 crc kubenswrapper[4919]: I0310 23:21:06.226016 4919 generic.go:334] "Generic (PLEG): container finished" podID="79633431-52da-4a16-9cdc-f7f7f9f75234" containerID="a6f09eb3bd53234f7d2ca905a7b3587b457fd009cf4ae0de9692da0bca669a29" exitCode=0 Mar 10 23:21:06 crc 
kubenswrapper[4919]: I0310 23:21:06.226128 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-594d96f99f-mbws7" event={"ID":"79633431-52da-4a16-9cdc-f7f7f9f75234","Type":"ContainerDied","Data":"a6f09eb3bd53234f7d2ca905a7b3587b457fd009cf4ae0de9692da0bca669a29"} Mar 10 23:21:06 crc kubenswrapper[4919]: I0310 23:21:06.966769 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Mar 10 23:21:07 crc kubenswrapper[4919]: I0310 23:21:07.021438 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Mar 10 23:21:07 crc kubenswrapper[4919]: I0310 23:21:07.024216 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-2" Mar 10 23:21:07 crc kubenswrapper[4919]: I0310 23:21:07.035145 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-1" Mar 10 23:21:07 crc kubenswrapper[4919]: I0310 23:21:07.075310 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-2" Mar 10 23:21:07 crc kubenswrapper[4919]: I0310 23:21:07.087980 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-1" Mar 10 23:21:07 crc kubenswrapper[4919]: I0310 23:21:07.223451 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-594d96f99f-mbws7"] Mar 10 23:21:07 crc kubenswrapper[4919]: I0310 23:21:07.240374 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-594d96f99f-mbws7" event={"ID":"79633431-52da-4a16-9cdc-f7f7f9f75234","Type":"ContainerStarted","Data":"f1b13c85913fed41a291927cda7a0eaa62938a74919ce82d446881bbfa37b6e7"} Mar 10 23:21:07 crc kubenswrapper[4919]: I0310 23:21:07.258218 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-68577db887-c75ws"] Mar 10 23:21:07 crc kubenswrapper[4919]: I0310 
23:21:07.259610 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-68577db887-c75ws" Mar 10 23:21:07 crc kubenswrapper[4919]: I0310 23:21:07.262650 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Mar 10 23:21:07 crc kubenswrapper[4919]: I0310 23:21:07.276739 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-68577db887-c75ws"] Mar 10 23:21:07 crc kubenswrapper[4919]: I0310 23:21:07.285232 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-594d96f99f-mbws7" podStartSLOduration=3.285214694 podStartE2EDuration="3.285214694s" podCreationTimestamp="2026-03-10 23:21:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 23:21:07.257321824 +0000 UTC m=+5454.499202432" watchObservedRunningTime="2026-03-10 23:21:07.285214694 +0000 UTC m=+5454.527095302" Mar 10 23:21:07 crc kubenswrapper[4919]: I0310 23:21:07.296377 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ec560d1e-297a-4ffb-bb92-8a5128c709a9-ovsdbserver-nb\") pod \"dnsmasq-dns-68577db887-c75ws\" (UID: \"ec560d1e-297a-4ffb-bb92-8a5128c709a9\") " pod="openstack/dnsmasq-dns-68577db887-c75ws" Mar 10 23:21:07 crc kubenswrapper[4919]: I0310 23:21:07.297601 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhqkl\" (UniqueName: \"kubernetes.io/projected/ec560d1e-297a-4ffb-bb92-8a5128c709a9-kube-api-access-lhqkl\") pod \"dnsmasq-dns-68577db887-c75ws\" (UID: \"ec560d1e-297a-4ffb-bb92-8a5128c709a9\") " pod="openstack/dnsmasq-dns-68577db887-c75ws" Mar 10 23:21:07 crc kubenswrapper[4919]: I0310 23:21:07.297810 4919 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ec560d1e-297a-4ffb-bb92-8a5128c709a9-dns-svc\") pod \"dnsmasq-dns-68577db887-c75ws\" (UID: \"ec560d1e-297a-4ffb-bb92-8a5128c709a9\") " pod="openstack/dnsmasq-dns-68577db887-c75ws" Mar 10 23:21:07 crc kubenswrapper[4919]: I0310 23:21:07.297962 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ec560d1e-297a-4ffb-bb92-8a5128c709a9-ovsdbserver-sb\") pod \"dnsmasq-dns-68577db887-c75ws\" (UID: \"ec560d1e-297a-4ffb-bb92-8a5128c709a9\") " pod="openstack/dnsmasq-dns-68577db887-c75ws" Mar 10 23:21:07 crc kubenswrapper[4919]: I0310 23:21:07.298028 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec560d1e-297a-4ffb-bb92-8a5128c709a9-config\") pod \"dnsmasq-dns-68577db887-c75ws\" (UID: \"ec560d1e-297a-4ffb-bb92-8a5128c709a9\") " pod="openstack/dnsmasq-dns-68577db887-c75ws" Mar 10 23:21:07 crc kubenswrapper[4919]: I0310 23:21:07.399901 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ec560d1e-297a-4ffb-bb92-8a5128c709a9-dns-svc\") pod \"dnsmasq-dns-68577db887-c75ws\" (UID: \"ec560d1e-297a-4ffb-bb92-8a5128c709a9\") " pod="openstack/dnsmasq-dns-68577db887-c75ws" Mar 10 23:21:07 crc kubenswrapper[4919]: I0310 23:21:07.400017 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ec560d1e-297a-4ffb-bb92-8a5128c709a9-ovsdbserver-sb\") pod \"dnsmasq-dns-68577db887-c75ws\" (UID: \"ec560d1e-297a-4ffb-bb92-8a5128c709a9\") " pod="openstack/dnsmasq-dns-68577db887-c75ws" Mar 10 23:21:07 crc kubenswrapper[4919]: I0310 23:21:07.400069 4919 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec560d1e-297a-4ffb-bb92-8a5128c709a9-config\") pod \"dnsmasq-dns-68577db887-c75ws\" (UID: \"ec560d1e-297a-4ffb-bb92-8a5128c709a9\") " pod="openstack/dnsmasq-dns-68577db887-c75ws" Mar 10 23:21:07 crc kubenswrapper[4919]: I0310 23:21:07.400093 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ec560d1e-297a-4ffb-bb92-8a5128c709a9-ovsdbserver-nb\") pod \"dnsmasq-dns-68577db887-c75ws\" (UID: \"ec560d1e-297a-4ffb-bb92-8a5128c709a9\") " pod="openstack/dnsmasq-dns-68577db887-c75ws" Mar 10 23:21:07 crc kubenswrapper[4919]: I0310 23:21:07.400134 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lhqkl\" (UniqueName: \"kubernetes.io/projected/ec560d1e-297a-4ffb-bb92-8a5128c709a9-kube-api-access-lhqkl\") pod \"dnsmasq-dns-68577db887-c75ws\" (UID: \"ec560d1e-297a-4ffb-bb92-8a5128c709a9\") " pod="openstack/dnsmasq-dns-68577db887-c75ws" Mar 10 23:21:07 crc kubenswrapper[4919]: I0310 23:21:07.400937 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ec560d1e-297a-4ffb-bb92-8a5128c709a9-ovsdbserver-sb\") pod \"dnsmasq-dns-68577db887-c75ws\" (UID: \"ec560d1e-297a-4ffb-bb92-8a5128c709a9\") " pod="openstack/dnsmasq-dns-68577db887-c75ws" Mar 10 23:21:07 crc kubenswrapper[4919]: I0310 23:21:07.400951 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ec560d1e-297a-4ffb-bb92-8a5128c709a9-ovsdbserver-nb\") pod \"dnsmasq-dns-68577db887-c75ws\" (UID: \"ec560d1e-297a-4ffb-bb92-8a5128c709a9\") " pod="openstack/dnsmasq-dns-68577db887-c75ws" Mar 10 23:21:07 crc kubenswrapper[4919]: I0310 23:21:07.401196 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/ec560d1e-297a-4ffb-bb92-8a5128c709a9-config\") pod \"dnsmasq-dns-68577db887-c75ws\" (UID: \"ec560d1e-297a-4ffb-bb92-8a5128c709a9\") " pod="openstack/dnsmasq-dns-68577db887-c75ws" Mar 10 23:21:07 crc kubenswrapper[4919]: I0310 23:21:07.402131 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ec560d1e-297a-4ffb-bb92-8a5128c709a9-dns-svc\") pod \"dnsmasq-dns-68577db887-c75ws\" (UID: \"ec560d1e-297a-4ffb-bb92-8a5128c709a9\") " pod="openstack/dnsmasq-dns-68577db887-c75ws" Mar 10 23:21:07 crc kubenswrapper[4919]: I0310 23:21:07.421352 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhqkl\" (UniqueName: \"kubernetes.io/projected/ec560d1e-297a-4ffb-bb92-8a5128c709a9-kube-api-access-lhqkl\") pod \"dnsmasq-dns-68577db887-c75ws\" (UID: \"ec560d1e-297a-4ffb-bb92-8a5128c709a9\") " pod="openstack/dnsmasq-dns-68577db887-c75ws" Mar 10 23:21:07 crc kubenswrapper[4919]: I0310 23:21:07.584062 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-68577db887-c75ws" Mar 10 23:21:08 crc kubenswrapper[4919]: I0310 23:21:08.050935 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-68577db887-c75ws"] Mar 10 23:21:08 crc kubenswrapper[4919]: W0310 23:21:08.059646 4919 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podec560d1e_297a_4ffb_bb92_8a5128c709a9.slice/crio-147cdac924d2bd72e3bd1eaad62c0ffa3903f8b2ee9e4bb07bcdc67548990957 WatchSource:0}: Error finding container 147cdac924d2bd72e3bd1eaad62c0ffa3903f8b2ee9e4bb07bcdc67548990957: Status 404 returned error can't find the container with id 147cdac924d2bd72e3bd1eaad62c0ffa3903f8b2ee9e4bb07bcdc67548990957 Mar 10 23:21:08 crc kubenswrapper[4919]: I0310 23:21:08.248470 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68577db887-c75ws" event={"ID":"ec560d1e-297a-4ffb-bb92-8a5128c709a9","Type":"ContainerStarted","Data":"147cdac924d2bd72e3bd1eaad62c0ffa3903f8b2ee9e4bb07bcdc67548990957"} Mar 10 23:21:08 crc kubenswrapper[4919]: I0310 23:21:08.248675 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-594d96f99f-mbws7" podUID="79633431-52da-4a16-9cdc-f7f7f9f75234" containerName="dnsmasq-dns" containerID="cri-o://f1b13c85913fed41a291927cda7a0eaa62938a74919ce82d446881bbfa37b6e7" gracePeriod=10 Mar 10 23:21:08 crc kubenswrapper[4919]: I0310 23:21:08.249028 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-594d96f99f-mbws7" Mar 10 23:21:08 crc kubenswrapper[4919]: I0310 23:21:08.874621 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-594d96f99f-mbws7" Mar 10 23:21:08 crc kubenswrapper[4919]: I0310 23:21:08.928778 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79633431-52da-4a16-9cdc-f7f7f9f75234-config\") pod \"79633431-52da-4a16-9cdc-f7f7f9f75234\" (UID: \"79633431-52da-4a16-9cdc-f7f7f9f75234\") " Mar 10 23:21:08 crc kubenswrapper[4919]: I0310 23:21:08.928827 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r8zjk\" (UniqueName: \"kubernetes.io/projected/79633431-52da-4a16-9cdc-f7f7f9f75234-kube-api-access-r8zjk\") pod \"79633431-52da-4a16-9cdc-f7f7f9f75234\" (UID: \"79633431-52da-4a16-9cdc-f7f7f9f75234\") " Mar 10 23:21:08 crc kubenswrapper[4919]: I0310 23:21:08.928896 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/79633431-52da-4a16-9cdc-f7f7f9f75234-ovsdbserver-nb\") pod \"79633431-52da-4a16-9cdc-f7f7f9f75234\" (UID: \"79633431-52da-4a16-9cdc-f7f7f9f75234\") " Mar 10 23:21:08 crc kubenswrapper[4919]: I0310 23:21:08.928940 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/79633431-52da-4a16-9cdc-f7f7f9f75234-dns-svc\") pod \"79633431-52da-4a16-9cdc-f7f7f9f75234\" (UID: \"79633431-52da-4a16-9cdc-f7f7f9f75234\") " Mar 10 23:21:08 crc kubenswrapper[4919]: I0310 23:21:08.937353 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79633431-52da-4a16-9cdc-f7f7f9f75234-kube-api-access-r8zjk" (OuterVolumeSpecName: "kube-api-access-r8zjk") pod "79633431-52da-4a16-9cdc-f7f7f9f75234" (UID: "79633431-52da-4a16-9cdc-f7f7f9f75234"). InnerVolumeSpecName "kube-api-access-r8zjk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 23:21:08 crc kubenswrapper[4919]: I0310 23:21:08.966793 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/79633431-52da-4a16-9cdc-f7f7f9f75234-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "79633431-52da-4a16-9cdc-f7f7f9f75234" (UID: "79633431-52da-4a16-9cdc-f7f7f9f75234"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 23:21:08 crc kubenswrapper[4919]: I0310 23:21:08.967771 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/79633431-52da-4a16-9cdc-f7f7f9f75234-config" (OuterVolumeSpecName: "config") pod "79633431-52da-4a16-9cdc-f7f7f9f75234" (UID: "79633431-52da-4a16-9cdc-f7f7f9f75234"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 23:21:08 crc kubenswrapper[4919]: I0310 23:21:08.970665 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/79633431-52da-4a16-9cdc-f7f7f9f75234-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "79633431-52da-4a16-9cdc-f7f7f9f75234" (UID: "79633431-52da-4a16-9cdc-f7f7f9f75234"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 23:21:09 crc kubenswrapper[4919]: I0310 23:21:09.032833 4919 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/79633431-52da-4a16-9cdc-f7f7f9f75234-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 10 23:21:09 crc kubenswrapper[4919]: I0310 23:21:09.032865 4919 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/79633431-52da-4a16-9cdc-f7f7f9f75234-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 10 23:21:09 crc kubenswrapper[4919]: I0310 23:21:09.032875 4919 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79633431-52da-4a16-9cdc-f7f7f9f75234-config\") on node \"crc\" DevicePath \"\"" Mar 10 23:21:09 crc kubenswrapper[4919]: I0310 23:21:09.032885 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r8zjk\" (UniqueName: \"kubernetes.io/projected/79633431-52da-4a16-9cdc-f7f7f9f75234-kube-api-access-r8zjk\") on node \"crc\" DevicePath \"\"" Mar 10 23:21:09 crc kubenswrapper[4919]: I0310 23:21:09.256709 4919 generic.go:334] "Generic (PLEG): container finished" podID="ec560d1e-297a-4ffb-bb92-8a5128c709a9" containerID="a10dc40e205e001d66c066b85347f64dcf7fa171f240b40e4ac5191dda2bb755" exitCode=0 Mar 10 23:21:09 crc kubenswrapper[4919]: I0310 23:21:09.256770 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68577db887-c75ws" event={"ID":"ec560d1e-297a-4ffb-bb92-8a5128c709a9","Type":"ContainerDied","Data":"a10dc40e205e001d66c066b85347f64dcf7fa171f240b40e4ac5191dda2bb755"} Mar 10 23:21:09 crc kubenswrapper[4919]: I0310 23:21:09.259310 4919 generic.go:334] "Generic (PLEG): container finished" podID="79633431-52da-4a16-9cdc-f7f7f9f75234" containerID="f1b13c85913fed41a291927cda7a0eaa62938a74919ce82d446881bbfa37b6e7" exitCode=0 Mar 10 23:21:09 crc kubenswrapper[4919]: I0310 23:21:09.259360 
4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-594d96f99f-mbws7" event={"ID":"79633431-52da-4a16-9cdc-f7f7f9f75234","Type":"ContainerDied","Data":"f1b13c85913fed41a291927cda7a0eaa62938a74919ce82d446881bbfa37b6e7"} Mar 10 23:21:09 crc kubenswrapper[4919]: I0310 23:21:09.259416 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-594d96f99f-mbws7" event={"ID":"79633431-52da-4a16-9cdc-f7f7f9f75234","Type":"ContainerDied","Data":"5741cd78e32a7084b32a72e5bca7265b855eb8bd7d51e47b90958610b28956ee"} Mar 10 23:21:09 crc kubenswrapper[4919]: I0310 23:21:09.259440 4919 scope.go:117] "RemoveContainer" containerID="f1b13c85913fed41a291927cda7a0eaa62938a74919ce82d446881bbfa37b6e7" Mar 10 23:21:09 crc kubenswrapper[4919]: I0310 23:21:09.259535 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-594d96f99f-mbws7" Mar 10 23:21:09 crc kubenswrapper[4919]: I0310 23:21:09.285726 4919 scope.go:117] "RemoveContainer" containerID="a6f09eb3bd53234f7d2ca905a7b3587b457fd009cf4ae0de9692da0bca669a29" Mar 10 23:21:09 crc kubenswrapper[4919]: I0310 23:21:09.312122 4919 scope.go:117] "RemoveContainer" containerID="f1b13c85913fed41a291927cda7a0eaa62938a74919ce82d446881bbfa37b6e7" Mar 10 23:21:09 crc kubenswrapper[4919]: E0310 23:21:09.325210 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f1b13c85913fed41a291927cda7a0eaa62938a74919ce82d446881bbfa37b6e7\": container with ID starting with f1b13c85913fed41a291927cda7a0eaa62938a74919ce82d446881bbfa37b6e7 not found: ID does not exist" containerID="f1b13c85913fed41a291927cda7a0eaa62938a74919ce82d446881bbfa37b6e7" Mar 10 23:21:09 crc kubenswrapper[4919]: I0310 23:21:09.325259 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1b13c85913fed41a291927cda7a0eaa62938a74919ce82d446881bbfa37b6e7"} err="failed to 
get container status \"f1b13c85913fed41a291927cda7a0eaa62938a74919ce82d446881bbfa37b6e7\": rpc error: code = NotFound desc = could not find container \"f1b13c85913fed41a291927cda7a0eaa62938a74919ce82d446881bbfa37b6e7\": container with ID starting with f1b13c85913fed41a291927cda7a0eaa62938a74919ce82d446881bbfa37b6e7 not found: ID does not exist" Mar 10 23:21:09 crc kubenswrapper[4919]: I0310 23:21:09.325287 4919 scope.go:117] "RemoveContainer" containerID="a6f09eb3bd53234f7d2ca905a7b3587b457fd009cf4ae0de9692da0bca669a29" Mar 10 23:21:09 crc kubenswrapper[4919]: E0310 23:21:09.325611 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a6f09eb3bd53234f7d2ca905a7b3587b457fd009cf4ae0de9692da0bca669a29\": container with ID starting with a6f09eb3bd53234f7d2ca905a7b3587b457fd009cf4ae0de9692da0bca669a29 not found: ID does not exist" containerID="a6f09eb3bd53234f7d2ca905a7b3587b457fd009cf4ae0de9692da0bca669a29" Mar 10 23:21:09 crc kubenswrapper[4919]: I0310 23:21:09.325660 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a6f09eb3bd53234f7d2ca905a7b3587b457fd009cf4ae0de9692da0bca669a29"} err="failed to get container status \"a6f09eb3bd53234f7d2ca905a7b3587b457fd009cf4ae0de9692da0bca669a29\": rpc error: code = NotFound desc = could not find container \"a6f09eb3bd53234f7d2ca905a7b3587b457fd009cf4ae0de9692da0bca669a29\": container with ID starting with a6f09eb3bd53234f7d2ca905a7b3587b457fd009cf4ae0de9692da0bca669a29 not found: ID does not exist" Mar 10 23:21:09 crc kubenswrapper[4919]: I0310 23:21:09.336644 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-594d96f99f-mbws7"] Mar 10 23:21:09 crc kubenswrapper[4919]: I0310 23:21:09.345117 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-594d96f99f-mbws7"] Mar 10 23:21:09 crc kubenswrapper[4919]: I0310 23:21:09.491715 4919 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79633431-52da-4a16-9cdc-f7f7f9f75234" path="/var/lib/kubelet/pods/79633431-52da-4a16-9cdc-f7f7f9f75234/volumes" Mar 10 23:21:09 crc kubenswrapper[4919]: I0310 23:21:09.640683 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-copy-data"] Mar 10 23:21:09 crc kubenswrapper[4919]: E0310 23:21:09.641298 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79633431-52da-4a16-9cdc-f7f7f9f75234" containerName="init" Mar 10 23:21:09 crc kubenswrapper[4919]: I0310 23:21:09.641328 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="79633431-52da-4a16-9cdc-f7f7f9f75234" containerName="init" Mar 10 23:21:09 crc kubenswrapper[4919]: E0310 23:21:09.641372 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79633431-52da-4a16-9cdc-f7f7f9f75234" containerName="dnsmasq-dns" Mar 10 23:21:09 crc kubenswrapper[4919]: I0310 23:21:09.641414 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="79633431-52da-4a16-9cdc-f7f7f9f75234" containerName="dnsmasq-dns" Mar 10 23:21:09 crc kubenswrapper[4919]: I0310 23:21:09.641716 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="79633431-52da-4a16-9cdc-f7f7f9f75234" containerName="dnsmasq-dns" Mar 10 23:21:09 crc kubenswrapper[4919]: I0310 23:21:09.642685 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-copy-data" Mar 10 23:21:09 crc kubenswrapper[4919]: I0310 23:21:09.646758 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-copy-data"] Mar 10 23:21:09 crc kubenswrapper[4919]: I0310 23:21:09.647183 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovn-data-cert" Mar 10 23:21:09 crc kubenswrapper[4919]: I0310 23:21:09.746752 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-15ce6224-5083-4933-8288-7e0425a39f9f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-15ce6224-5083-4933-8288-7e0425a39f9f\") pod \"ovn-copy-data\" (UID: \"278a021a-4088-4dea-809d-3068fff9357b\") " pod="openstack/ovn-copy-data" Mar 10 23:21:09 crc kubenswrapper[4919]: I0310 23:21:09.747038 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-29h6l\" (UniqueName: \"kubernetes.io/projected/278a021a-4088-4dea-809d-3068fff9357b-kube-api-access-29h6l\") pod \"ovn-copy-data\" (UID: \"278a021a-4088-4dea-809d-3068fff9357b\") " pod="openstack/ovn-copy-data" Mar 10 23:21:09 crc kubenswrapper[4919]: I0310 23:21:09.747337 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/278a021a-4088-4dea-809d-3068fff9357b-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"278a021a-4088-4dea-809d-3068fff9357b\") " pod="openstack/ovn-copy-data" Mar 10 23:21:09 crc kubenswrapper[4919]: I0310 23:21:09.848837 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-15ce6224-5083-4933-8288-7e0425a39f9f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-15ce6224-5083-4933-8288-7e0425a39f9f\") pod \"ovn-copy-data\" (UID: \"278a021a-4088-4dea-809d-3068fff9357b\") " pod="openstack/ovn-copy-data" Mar 10 23:21:09 crc kubenswrapper[4919]: I0310 
23:21:09.848974 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-29h6l\" (UniqueName: \"kubernetes.io/projected/278a021a-4088-4dea-809d-3068fff9357b-kube-api-access-29h6l\") pod \"ovn-copy-data\" (UID: \"278a021a-4088-4dea-809d-3068fff9357b\") " pod="openstack/ovn-copy-data" Mar 10 23:21:09 crc kubenswrapper[4919]: I0310 23:21:09.849018 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/278a021a-4088-4dea-809d-3068fff9357b-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"278a021a-4088-4dea-809d-3068fff9357b\") " pod="openstack/ovn-copy-data" Mar 10 23:21:09 crc kubenswrapper[4919]: I0310 23:21:09.852233 4919 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 10 23:21:09 crc kubenswrapper[4919]: I0310 23:21:09.852299 4919 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-15ce6224-5083-4933-8288-7e0425a39f9f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-15ce6224-5083-4933-8288-7e0425a39f9f\") pod \"ovn-copy-data\" (UID: \"278a021a-4088-4dea-809d-3068fff9357b\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/18f958652f86d872b37e491ac53e52b70c594ab2b1b77c36c9d89d78467acdf6/globalmount\"" pod="openstack/ovn-copy-data" Mar 10 23:21:09 crc kubenswrapper[4919]: I0310 23:21:09.854977 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/278a021a-4088-4dea-809d-3068fff9357b-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"278a021a-4088-4dea-809d-3068fff9357b\") " pod="openstack/ovn-copy-data" Mar 10 23:21:09 crc kubenswrapper[4919]: I0310 23:21:09.866257 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-29h6l\" (UniqueName: 
\"kubernetes.io/projected/278a021a-4088-4dea-809d-3068fff9357b-kube-api-access-29h6l\") pod \"ovn-copy-data\" (UID: \"278a021a-4088-4dea-809d-3068fff9357b\") " pod="openstack/ovn-copy-data" Mar 10 23:21:09 crc kubenswrapper[4919]: I0310 23:21:09.879271 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-15ce6224-5083-4933-8288-7e0425a39f9f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-15ce6224-5083-4933-8288-7e0425a39f9f\") pod \"ovn-copy-data\" (UID: \"278a021a-4088-4dea-809d-3068fff9357b\") " pod="openstack/ovn-copy-data" Mar 10 23:21:09 crc kubenswrapper[4919]: I0310 23:21:09.960540 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-copy-data" Mar 10 23:21:10 crc kubenswrapper[4919]: I0310 23:21:10.269228 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68577db887-c75ws" event={"ID":"ec560d1e-297a-4ffb-bb92-8a5128c709a9","Type":"ContainerStarted","Data":"89c57195f9c894baa93b0d47af1113143abf4d74d949824d51c839e05a2492ef"} Mar 10 23:21:10 crc kubenswrapper[4919]: I0310 23:21:10.269484 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-68577db887-c75ws" Mar 10 23:21:10 crc kubenswrapper[4919]: I0310 23:21:10.297538 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-68577db887-c75ws" podStartSLOduration=3.297471266 podStartE2EDuration="3.297471266s" podCreationTimestamp="2026-03-10 23:21:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 23:21:10.294383122 +0000 UTC m=+5457.536263760" watchObservedRunningTime="2026-03-10 23:21:10.297471266 +0000 UTC m=+5457.539351874" Mar 10 23:21:10 crc kubenswrapper[4919]: I0310 23:21:10.482994 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-copy-data"] Mar 10 23:21:10 crc 
kubenswrapper[4919]: W0310 23:21:10.487708 4919 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod278a021a_4088_4dea_809d_3068fff9357b.slice/crio-d4546102981693ff9a2f29be5e9a1b103ceddcef887aa72fce154fcdb9be3c54 WatchSource:0}: Error finding container d4546102981693ff9a2f29be5e9a1b103ceddcef887aa72fce154fcdb9be3c54: Status 404 returned error can't find the container with id d4546102981693ff9a2f29be5e9a1b103ceddcef887aa72fce154fcdb9be3c54 Mar 10 23:21:10 crc kubenswrapper[4919]: I0310 23:21:10.490863 4919 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 10 23:21:11 crc kubenswrapper[4919]: I0310 23:21:11.281766 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"278a021a-4088-4dea-809d-3068fff9357b","Type":"ContainerStarted","Data":"d4546102981693ff9a2f29be5e9a1b103ceddcef887aa72fce154fcdb9be3c54"} Mar 10 23:21:14 crc kubenswrapper[4919]: I0310 23:21:14.315815 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"278a021a-4088-4dea-809d-3068fff9357b","Type":"ContainerStarted","Data":"489fd19634ded346538f700f4ef176dc96ec733e5f0ac33f5c32d84c2dd3c00b"} Mar 10 23:21:14 crc kubenswrapper[4919]: I0310 23:21:14.341072 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-copy-data" podStartSLOduration=3.736766786 podStartE2EDuration="6.341045301s" podCreationTimestamp="2026-03-10 23:21:08 +0000 UTC" firstStartedPulling="2026-03-10 23:21:10.490610459 +0000 UTC m=+5457.732491077" lastFinishedPulling="2026-03-10 23:21:13.094888984 +0000 UTC m=+5460.336769592" observedRunningTime="2026-03-10 23:21:14.336169588 +0000 UTC m=+5461.578050196" watchObservedRunningTime="2026-03-10 23:21:14.341045301 +0000 UTC m=+5461.582925929" Mar 10 23:21:17 crc kubenswrapper[4919]: I0310 23:21:17.585805 4919 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack/dnsmasq-dns-68577db887-c75ws" Mar 10 23:21:17 crc kubenswrapper[4919]: I0310 23:21:17.710454 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-66d5bf7c87-q568m"] Mar 10 23:21:17 crc kubenswrapper[4919]: I0310 23:21:17.710833 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-66d5bf7c87-q568m" podUID="6b5a9723-b287-46d6-a9c7-136a560e3e38" containerName="dnsmasq-dns" containerID="cri-o://39ee255a5c86d773bbe6f095ea98eb731d0733b9d9f784153a66400205082b40" gracePeriod=10 Mar 10 23:21:18 crc kubenswrapper[4919]: I0310 23:21:18.133797 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-66d5bf7c87-q568m" Mar 10 23:21:18 crc kubenswrapper[4919]: I0310 23:21:18.298893 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6b5a9723-b287-46d6-a9c7-136a560e3e38-dns-svc\") pod \"6b5a9723-b287-46d6-a9c7-136a560e3e38\" (UID: \"6b5a9723-b287-46d6-a9c7-136a560e3e38\") " Mar 10 23:21:18 crc kubenswrapper[4919]: I0310 23:21:18.298984 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5n4g7\" (UniqueName: \"kubernetes.io/projected/6b5a9723-b287-46d6-a9c7-136a560e3e38-kube-api-access-5n4g7\") pod \"6b5a9723-b287-46d6-a9c7-136a560e3e38\" (UID: \"6b5a9723-b287-46d6-a9c7-136a560e3e38\") " Mar 10 23:21:18 crc kubenswrapper[4919]: I0310 23:21:18.299009 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6b5a9723-b287-46d6-a9c7-136a560e3e38-config\") pod \"6b5a9723-b287-46d6-a9c7-136a560e3e38\" (UID: \"6b5a9723-b287-46d6-a9c7-136a560e3e38\") " Mar 10 23:21:18 crc kubenswrapper[4919]: I0310 23:21:18.304567 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/6b5a9723-b287-46d6-a9c7-136a560e3e38-kube-api-access-5n4g7" (OuterVolumeSpecName: "kube-api-access-5n4g7") pod "6b5a9723-b287-46d6-a9c7-136a560e3e38" (UID: "6b5a9723-b287-46d6-a9c7-136a560e3e38"). InnerVolumeSpecName "kube-api-access-5n4g7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 23:21:18 crc kubenswrapper[4919]: I0310 23:21:18.338516 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b5a9723-b287-46d6-a9c7-136a560e3e38-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6b5a9723-b287-46d6-a9c7-136a560e3e38" (UID: "6b5a9723-b287-46d6-a9c7-136a560e3e38"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 23:21:18 crc kubenswrapper[4919]: I0310 23:21:18.339668 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b5a9723-b287-46d6-a9c7-136a560e3e38-config" (OuterVolumeSpecName: "config") pod "6b5a9723-b287-46d6-a9c7-136a560e3e38" (UID: "6b5a9723-b287-46d6-a9c7-136a560e3e38"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 23:21:18 crc kubenswrapper[4919]: I0310 23:21:18.362282 4919 generic.go:334] "Generic (PLEG): container finished" podID="6b5a9723-b287-46d6-a9c7-136a560e3e38" containerID="39ee255a5c86d773bbe6f095ea98eb731d0733b9d9f784153a66400205082b40" exitCode=0 Mar 10 23:21:18 crc kubenswrapper[4919]: I0310 23:21:18.362323 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66d5bf7c87-q568m" event={"ID":"6b5a9723-b287-46d6-a9c7-136a560e3e38","Type":"ContainerDied","Data":"39ee255a5c86d773bbe6f095ea98eb731d0733b9d9f784153a66400205082b40"} Mar 10 23:21:18 crc kubenswrapper[4919]: I0310 23:21:18.362348 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66d5bf7c87-q568m" event={"ID":"6b5a9723-b287-46d6-a9c7-136a560e3e38","Type":"ContainerDied","Data":"b2c7dbef18d4d750dc9278d95da8c47f1e853e4391aa894d24c50cc68e8f5983"} Mar 10 23:21:18 crc kubenswrapper[4919]: I0310 23:21:18.362345 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-66d5bf7c87-q568m" Mar 10 23:21:18 crc kubenswrapper[4919]: I0310 23:21:18.362365 4919 scope.go:117] "RemoveContainer" containerID="39ee255a5c86d773bbe6f095ea98eb731d0733b9d9f784153a66400205082b40" Mar 10 23:21:18 crc kubenswrapper[4919]: I0310 23:21:18.400846 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5n4g7\" (UniqueName: \"kubernetes.io/projected/6b5a9723-b287-46d6-a9c7-136a560e3e38-kube-api-access-5n4g7\") on node \"crc\" DevicePath \"\"" Mar 10 23:21:18 crc kubenswrapper[4919]: I0310 23:21:18.400880 4919 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6b5a9723-b287-46d6-a9c7-136a560e3e38-config\") on node \"crc\" DevicePath \"\"" Mar 10 23:21:18 crc kubenswrapper[4919]: I0310 23:21:18.400893 4919 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6b5a9723-b287-46d6-a9c7-136a560e3e38-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 10 23:21:18 crc kubenswrapper[4919]: I0310 23:21:18.404030 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-66d5bf7c87-q568m"] Mar 10 23:21:18 crc kubenswrapper[4919]: I0310 23:21:18.410305 4919 scope.go:117] "RemoveContainer" containerID="da60792319079583978efb78a311173cf6aecb6382bc0bc852b6215e877fdfaa" Mar 10 23:21:18 crc kubenswrapper[4919]: I0310 23:21:18.410635 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-66d5bf7c87-q568m"] Mar 10 23:21:18 crc kubenswrapper[4919]: I0310 23:21:18.433802 4919 scope.go:117] "RemoveContainer" containerID="39ee255a5c86d773bbe6f095ea98eb731d0733b9d9f784153a66400205082b40" Mar 10 23:21:18 crc kubenswrapper[4919]: E0310 23:21:18.434217 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"39ee255a5c86d773bbe6f095ea98eb731d0733b9d9f784153a66400205082b40\": container with ID 
starting with 39ee255a5c86d773bbe6f095ea98eb731d0733b9d9f784153a66400205082b40 not found: ID does not exist" containerID="39ee255a5c86d773bbe6f095ea98eb731d0733b9d9f784153a66400205082b40" Mar 10 23:21:18 crc kubenswrapper[4919]: I0310 23:21:18.434257 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39ee255a5c86d773bbe6f095ea98eb731d0733b9d9f784153a66400205082b40"} err="failed to get container status \"39ee255a5c86d773bbe6f095ea98eb731d0733b9d9f784153a66400205082b40\": rpc error: code = NotFound desc = could not find container \"39ee255a5c86d773bbe6f095ea98eb731d0733b9d9f784153a66400205082b40\": container with ID starting with 39ee255a5c86d773bbe6f095ea98eb731d0733b9d9f784153a66400205082b40 not found: ID does not exist" Mar 10 23:21:18 crc kubenswrapper[4919]: I0310 23:21:18.434283 4919 scope.go:117] "RemoveContainer" containerID="da60792319079583978efb78a311173cf6aecb6382bc0bc852b6215e877fdfaa" Mar 10 23:21:18 crc kubenswrapper[4919]: E0310 23:21:18.434652 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da60792319079583978efb78a311173cf6aecb6382bc0bc852b6215e877fdfaa\": container with ID starting with da60792319079583978efb78a311173cf6aecb6382bc0bc852b6215e877fdfaa not found: ID does not exist" containerID="da60792319079583978efb78a311173cf6aecb6382bc0bc852b6215e877fdfaa" Mar 10 23:21:18 crc kubenswrapper[4919]: I0310 23:21:18.434705 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da60792319079583978efb78a311173cf6aecb6382bc0bc852b6215e877fdfaa"} err="failed to get container status \"da60792319079583978efb78a311173cf6aecb6382bc0bc852b6215e877fdfaa\": rpc error: code = NotFound desc = could not find container \"da60792319079583978efb78a311173cf6aecb6382bc0bc852b6215e877fdfaa\": container with ID starting with da60792319079583978efb78a311173cf6aecb6382bc0bc852b6215e877fdfaa not found: 
ID does not exist" Mar 10 23:21:19 crc kubenswrapper[4919]: I0310 23:21:19.493033 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b5a9723-b287-46d6-a9c7-136a560e3e38" path="/var/lib/kubelet/pods/6b5a9723-b287-46d6-a9c7-136a560e3e38/volumes" Mar 10 23:21:19 crc kubenswrapper[4919]: I0310 23:21:19.734864 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Mar 10 23:21:19 crc kubenswrapper[4919]: E0310 23:21:19.735332 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b5a9723-b287-46d6-a9c7-136a560e3e38" containerName="init" Mar 10 23:21:19 crc kubenswrapper[4919]: I0310 23:21:19.735362 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b5a9723-b287-46d6-a9c7-136a560e3e38" containerName="init" Mar 10 23:21:19 crc kubenswrapper[4919]: E0310 23:21:19.735385 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b5a9723-b287-46d6-a9c7-136a560e3e38" containerName="dnsmasq-dns" Mar 10 23:21:19 crc kubenswrapper[4919]: I0310 23:21:19.735420 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b5a9723-b287-46d6-a9c7-136a560e3e38" containerName="dnsmasq-dns" Mar 10 23:21:19 crc kubenswrapper[4919]: I0310 23:21:19.735701 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b5a9723-b287-46d6-a9c7-136a560e3e38" containerName="dnsmasq-dns" Mar 10 23:21:19 crc kubenswrapper[4919]: I0310 23:21:19.737127 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Mar 10 23:21:19 crc kubenswrapper[4919]: I0310 23:21:19.738789 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Mar 10 23:21:19 crc kubenswrapper[4919]: I0310 23:21:19.738933 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Mar 10 23:21:19 crc kubenswrapper[4919]: I0310 23:21:19.739477 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Mar 10 23:21:19 crc kubenswrapper[4919]: I0310 23:21:19.739793 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-mc7lm" Mar 10 23:21:19 crc kubenswrapper[4919]: I0310 23:21:19.760343 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 10 23:21:19 crc kubenswrapper[4919]: I0310 23:21:19.822543 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7gsj\" (UniqueName: \"kubernetes.io/projected/365b5cdc-86c4-4d46-b368-c12553375bce-kube-api-access-m7gsj\") pod \"ovn-northd-0\" (UID: \"365b5cdc-86c4-4d46-b368-c12553375bce\") " pod="openstack/ovn-northd-0" Mar 10 23:21:19 crc kubenswrapper[4919]: I0310 23:21:19.822626 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/365b5cdc-86c4-4d46-b368-c12553375bce-config\") pod \"ovn-northd-0\" (UID: \"365b5cdc-86c4-4d46-b368-c12553375bce\") " pod="openstack/ovn-northd-0" Mar 10 23:21:19 crc kubenswrapper[4919]: I0310 23:21:19.822658 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/365b5cdc-86c4-4d46-b368-c12553375bce-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"365b5cdc-86c4-4d46-b368-c12553375bce\") " pod="openstack/ovn-northd-0" Mar 
10 23:21:19 crc kubenswrapper[4919]: I0310 23:21:19.822720 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/365b5cdc-86c4-4d46-b368-c12553375bce-scripts\") pod \"ovn-northd-0\" (UID: \"365b5cdc-86c4-4d46-b368-c12553375bce\") " pod="openstack/ovn-northd-0" Mar 10 23:21:19 crc kubenswrapper[4919]: I0310 23:21:19.822823 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/365b5cdc-86c4-4d46-b368-c12553375bce-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"365b5cdc-86c4-4d46-b368-c12553375bce\") " pod="openstack/ovn-northd-0" Mar 10 23:21:19 crc kubenswrapper[4919]: I0310 23:21:19.822857 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/365b5cdc-86c4-4d46-b368-c12553375bce-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"365b5cdc-86c4-4d46-b368-c12553375bce\") " pod="openstack/ovn-northd-0" Mar 10 23:21:19 crc kubenswrapper[4919]: I0310 23:21:19.822905 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/365b5cdc-86c4-4d46-b368-c12553375bce-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"365b5cdc-86c4-4d46-b368-c12553375bce\") " pod="openstack/ovn-northd-0" Mar 10 23:21:19 crc kubenswrapper[4919]: I0310 23:21:19.923971 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/365b5cdc-86c4-4d46-b368-c12553375bce-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"365b5cdc-86c4-4d46-b368-c12553375bce\") " pod="openstack/ovn-northd-0" Mar 10 23:21:19 crc kubenswrapper[4919]: I0310 23:21:19.924377 4919 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-m7gsj\" (UniqueName: \"kubernetes.io/projected/365b5cdc-86c4-4d46-b368-c12553375bce-kube-api-access-m7gsj\") pod \"ovn-northd-0\" (UID: \"365b5cdc-86c4-4d46-b368-c12553375bce\") " pod="openstack/ovn-northd-0" Mar 10 23:21:19 crc kubenswrapper[4919]: I0310 23:21:19.924452 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/365b5cdc-86c4-4d46-b368-c12553375bce-config\") pod \"ovn-northd-0\" (UID: \"365b5cdc-86c4-4d46-b368-c12553375bce\") " pod="openstack/ovn-northd-0" Mar 10 23:21:19 crc kubenswrapper[4919]: I0310 23:21:19.924487 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/365b5cdc-86c4-4d46-b368-c12553375bce-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"365b5cdc-86c4-4d46-b368-c12553375bce\") " pod="openstack/ovn-northd-0" Mar 10 23:21:19 crc kubenswrapper[4919]: I0310 23:21:19.924523 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/365b5cdc-86c4-4d46-b368-c12553375bce-scripts\") pod \"ovn-northd-0\" (UID: \"365b5cdc-86c4-4d46-b368-c12553375bce\") " pod="openstack/ovn-northd-0" Mar 10 23:21:19 crc kubenswrapper[4919]: I0310 23:21:19.924586 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/365b5cdc-86c4-4d46-b368-c12553375bce-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"365b5cdc-86c4-4d46-b368-c12553375bce\") " pod="openstack/ovn-northd-0" Mar 10 23:21:19 crc kubenswrapper[4919]: I0310 23:21:19.924618 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/365b5cdc-86c4-4d46-b368-c12553375bce-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: 
\"365b5cdc-86c4-4d46-b368-c12553375bce\") " pod="openstack/ovn-northd-0" Mar 10 23:21:19 crc kubenswrapper[4919]: I0310 23:21:19.925341 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/365b5cdc-86c4-4d46-b368-c12553375bce-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"365b5cdc-86c4-4d46-b368-c12553375bce\") " pod="openstack/ovn-northd-0" Mar 10 23:21:19 crc kubenswrapper[4919]: I0310 23:21:19.925760 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/365b5cdc-86c4-4d46-b368-c12553375bce-scripts\") pod \"ovn-northd-0\" (UID: \"365b5cdc-86c4-4d46-b368-c12553375bce\") " pod="openstack/ovn-northd-0" Mar 10 23:21:19 crc kubenswrapper[4919]: I0310 23:21:19.926058 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/365b5cdc-86c4-4d46-b368-c12553375bce-config\") pod \"ovn-northd-0\" (UID: \"365b5cdc-86c4-4d46-b368-c12553375bce\") " pod="openstack/ovn-northd-0" Mar 10 23:21:19 crc kubenswrapper[4919]: I0310 23:21:19.930156 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/365b5cdc-86c4-4d46-b368-c12553375bce-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"365b5cdc-86c4-4d46-b368-c12553375bce\") " pod="openstack/ovn-northd-0" Mar 10 23:21:19 crc kubenswrapper[4919]: I0310 23:21:19.931683 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/365b5cdc-86c4-4d46-b368-c12553375bce-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"365b5cdc-86c4-4d46-b368-c12553375bce\") " pod="openstack/ovn-northd-0" Mar 10 23:21:19 crc kubenswrapper[4919]: I0310 23:21:19.944601 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7gsj\" (UniqueName: 
\"kubernetes.io/projected/365b5cdc-86c4-4d46-b368-c12553375bce-kube-api-access-m7gsj\") pod \"ovn-northd-0\" (UID: \"365b5cdc-86c4-4d46-b368-c12553375bce\") " pod="openstack/ovn-northd-0" Mar 10 23:21:19 crc kubenswrapper[4919]: I0310 23:21:19.945543 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/365b5cdc-86c4-4d46-b368-c12553375bce-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"365b5cdc-86c4-4d46-b368-c12553375bce\") " pod="openstack/ovn-northd-0" Mar 10 23:21:20 crc kubenswrapper[4919]: I0310 23:21:20.061095 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Mar 10 23:21:20 crc kubenswrapper[4919]: I0310 23:21:20.482107 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 10 23:21:21 crc kubenswrapper[4919]: I0310 23:21:21.389972 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"365b5cdc-86c4-4d46-b368-c12553375bce","Type":"ContainerStarted","Data":"46168e0c0ee2fd7f10c6f9b4c41906bac15f69e950a49e2f0c7156220896ea7c"} Mar 10 23:21:21 crc kubenswrapper[4919]: I0310 23:21:21.390418 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"365b5cdc-86c4-4d46-b368-c12553375bce","Type":"ContainerStarted","Data":"f4b3349a5514c4cddd520147571e2d5233e8090d037b1e5cc899f81a1b382bcb"} Mar 10 23:21:21 crc kubenswrapper[4919]: I0310 23:21:21.390431 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"365b5cdc-86c4-4d46-b368-c12553375bce","Type":"ContainerStarted","Data":"afc883c3080d627fef344ea92547cd54cb3ca841d21e5feb73cb9d9441a5b513"} Mar 10 23:21:21 crc kubenswrapper[4919]: I0310 23:21:21.391708 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Mar 10 23:21:21 crc kubenswrapper[4919]: I0310 23:21:21.418801 4919 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.418778306 podStartE2EDuration="2.418778306s" podCreationTimestamp="2026-03-10 23:21:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 23:21:21.415084295 +0000 UTC m=+5468.656964903" watchObservedRunningTime="2026-03-10 23:21:21.418778306 +0000 UTC m=+5468.660658914" Mar 10 23:21:24 crc kubenswrapper[4919]: I0310 23:21:24.847798 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-ck5qr"] Mar 10 23:21:24 crc kubenswrapper[4919]: I0310 23:21:24.851534 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-ck5qr" Mar 10 23:21:24 crc kubenswrapper[4919]: I0310 23:21:24.880032 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-ck5qr"] Mar 10 23:21:24 crc kubenswrapper[4919]: I0310 23:21:24.947240 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-18ba-account-create-update-2wmzn"] Mar 10 23:21:24 crc kubenswrapper[4919]: I0310 23:21:24.948585 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-18ba-account-create-update-2wmzn" Mar 10 23:21:24 crc kubenswrapper[4919]: I0310 23:21:24.950638 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Mar 10 23:21:24 crc kubenswrapper[4919]: I0310 23:21:24.956513 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-18ba-account-create-update-2wmzn"] Mar 10 23:21:25 crc kubenswrapper[4919]: I0310 23:21:25.011866 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1df0c044-9ad4-4c72-bf4c-7bd4f5ce723c-operator-scripts\") pod \"keystone-db-create-ck5qr\" (UID: \"1df0c044-9ad4-4c72-bf4c-7bd4f5ce723c\") " pod="openstack/keystone-db-create-ck5qr" Mar 10 23:21:25 crc kubenswrapper[4919]: I0310 23:21:25.011926 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n6vxl\" (UniqueName: \"kubernetes.io/projected/1df0c044-9ad4-4c72-bf4c-7bd4f5ce723c-kube-api-access-n6vxl\") pod \"keystone-db-create-ck5qr\" (UID: \"1df0c044-9ad4-4c72-bf4c-7bd4f5ce723c\") " pod="openstack/keystone-db-create-ck5qr" Mar 10 23:21:25 crc kubenswrapper[4919]: I0310 23:21:25.113905 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n6vxl\" (UniqueName: \"kubernetes.io/projected/1df0c044-9ad4-4c72-bf4c-7bd4f5ce723c-kube-api-access-n6vxl\") pod \"keystone-db-create-ck5qr\" (UID: \"1df0c044-9ad4-4c72-bf4c-7bd4f5ce723c\") " pod="openstack/keystone-db-create-ck5qr" Mar 10 23:21:25 crc kubenswrapper[4919]: I0310 23:21:25.114047 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jpjr9\" (UniqueName: \"kubernetes.io/projected/526d94b1-75fd-42aa-a1bb-829018a77826-kube-api-access-jpjr9\") pod \"keystone-18ba-account-create-update-2wmzn\" (UID: 
\"526d94b1-75fd-42aa-a1bb-829018a77826\") " pod="openstack/keystone-18ba-account-create-update-2wmzn" Mar 10 23:21:25 crc kubenswrapper[4919]: I0310 23:21:25.114086 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/526d94b1-75fd-42aa-a1bb-829018a77826-operator-scripts\") pod \"keystone-18ba-account-create-update-2wmzn\" (UID: \"526d94b1-75fd-42aa-a1bb-829018a77826\") " pod="openstack/keystone-18ba-account-create-update-2wmzn" Mar 10 23:21:25 crc kubenswrapper[4919]: I0310 23:21:25.114142 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1df0c044-9ad4-4c72-bf4c-7bd4f5ce723c-operator-scripts\") pod \"keystone-db-create-ck5qr\" (UID: \"1df0c044-9ad4-4c72-bf4c-7bd4f5ce723c\") " pod="openstack/keystone-db-create-ck5qr" Mar 10 23:21:25 crc kubenswrapper[4919]: I0310 23:21:25.114914 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1df0c044-9ad4-4c72-bf4c-7bd4f5ce723c-operator-scripts\") pod \"keystone-db-create-ck5qr\" (UID: \"1df0c044-9ad4-4c72-bf4c-7bd4f5ce723c\") " pod="openstack/keystone-db-create-ck5qr" Mar 10 23:21:25 crc kubenswrapper[4919]: I0310 23:21:25.140074 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n6vxl\" (UniqueName: \"kubernetes.io/projected/1df0c044-9ad4-4c72-bf4c-7bd4f5ce723c-kube-api-access-n6vxl\") pod \"keystone-db-create-ck5qr\" (UID: \"1df0c044-9ad4-4c72-bf4c-7bd4f5ce723c\") " pod="openstack/keystone-db-create-ck5qr" Mar 10 23:21:25 crc kubenswrapper[4919]: I0310 23:21:25.185341 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-ck5qr" Mar 10 23:21:25 crc kubenswrapper[4919]: I0310 23:21:25.215379 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jpjr9\" (UniqueName: \"kubernetes.io/projected/526d94b1-75fd-42aa-a1bb-829018a77826-kube-api-access-jpjr9\") pod \"keystone-18ba-account-create-update-2wmzn\" (UID: \"526d94b1-75fd-42aa-a1bb-829018a77826\") " pod="openstack/keystone-18ba-account-create-update-2wmzn" Mar 10 23:21:25 crc kubenswrapper[4919]: I0310 23:21:25.215456 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/526d94b1-75fd-42aa-a1bb-829018a77826-operator-scripts\") pod \"keystone-18ba-account-create-update-2wmzn\" (UID: \"526d94b1-75fd-42aa-a1bb-829018a77826\") " pod="openstack/keystone-18ba-account-create-update-2wmzn" Mar 10 23:21:25 crc kubenswrapper[4919]: I0310 23:21:25.216224 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/526d94b1-75fd-42aa-a1bb-829018a77826-operator-scripts\") pod \"keystone-18ba-account-create-update-2wmzn\" (UID: \"526d94b1-75fd-42aa-a1bb-829018a77826\") " pod="openstack/keystone-18ba-account-create-update-2wmzn" Mar 10 23:21:25 crc kubenswrapper[4919]: I0310 23:21:25.241976 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jpjr9\" (UniqueName: \"kubernetes.io/projected/526d94b1-75fd-42aa-a1bb-829018a77826-kube-api-access-jpjr9\") pod \"keystone-18ba-account-create-update-2wmzn\" (UID: \"526d94b1-75fd-42aa-a1bb-829018a77826\") " pod="openstack/keystone-18ba-account-create-update-2wmzn" Mar 10 23:21:25 crc kubenswrapper[4919]: I0310 23:21:25.275767 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-18ba-account-create-update-2wmzn" Mar 10 23:21:25 crc kubenswrapper[4919]: I0310 23:21:25.698279 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-ck5qr"] Mar 10 23:21:25 crc kubenswrapper[4919]: W0310 23:21:25.711047 4919 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1df0c044_9ad4_4c72_bf4c_7bd4f5ce723c.slice/crio-68ba7f1c20259c34b9671907ef9b8222ae9a0b9e5171521f30c2c443691d90a5 WatchSource:0}: Error finding container 68ba7f1c20259c34b9671907ef9b8222ae9a0b9e5171521f30c2c443691d90a5: Status 404 returned error can't find the container with id 68ba7f1c20259c34b9671907ef9b8222ae9a0b9e5171521f30c2c443691d90a5 Mar 10 23:21:25 crc kubenswrapper[4919]: I0310 23:21:25.778584 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-18ba-account-create-update-2wmzn"] Mar 10 23:21:25 crc kubenswrapper[4919]: W0310 23:21:25.785076 4919 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod526d94b1_75fd_42aa_a1bb_829018a77826.slice/crio-9bce1473919e8ff65695c2e06bad679b8cf6e52ad703f08b2beb17c57b0f8f3b WatchSource:0}: Error finding container 9bce1473919e8ff65695c2e06bad679b8cf6e52ad703f08b2beb17c57b0f8f3b: Status 404 returned error can't find the container with id 9bce1473919e8ff65695c2e06bad679b8cf6e52ad703f08b2beb17c57b0f8f3b Mar 10 23:21:26 crc kubenswrapper[4919]: I0310 23:21:26.438002 4919 generic.go:334] "Generic (PLEG): container finished" podID="1df0c044-9ad4-4c72-bf4c-7bd4f5ce723c" containerID="4473f8e69522b250c4e26eeb3afad1666e65f2fe01029b72390dc8bdeab9382c" exitCode=0 Mar 10 23:21:26 crc kubenswrapper[4919]: I0310 23:21:26.438072 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-ck5qr" 
event={"ID":"1df0c044-9ad4-4c72-bf4c-7bd4f5ce723c","Type":"ContainerDied","Data":"4473f8e69522b250c4e26eeb3afad1666e65f2fe01029b72390dc8bdeab9382c"} Mar 10 23:21:26 crc kubenswrapper[4919]: I0310 23:21:26.438101 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-ck5qr" event={"ID":"1df0c044-9ad4-4c72-bf4c-7bd4f5ce723c","Type":"ContainerStarted","Data":"68ba7f1c20259c34b9671907ef9b8222ae9a0b9e5171521f30c2c443691d90a5"} Mar 10 23:21:26 crc kubenswrapper[4919]: I0310 23:21:26.439840 4919 generic.go:334] "Generic (PLEG): container finished" podID="526d94b1-75fd-42aa-a1bb-829018a77826" containerID="28b04fa300293e024a7efe611602004613b6896d1434a6f92d09d617ccbeaedb" exitCode=0 Mar 10 23:21:26 crc kubenswrapper[4919]: I0310 23:21:26.439876 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-18ba-account-create-update-2wmzn" event={"ID":"526d94b1-75fd-42aa-a1bb-829018a77826","Type":"ContainerDied","Data":"28b04fa300293e024a7efe611602004613b6896d1434a6f92d09d617ccbeaedb"} Mar 10 23:21:26 crc kubenswrapper[4919]: I0310 23:21:26.439899 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-18ba-account-create-update-2wmzn" event={"ID":"526d94b1-75fd-42aa-a1bb-829018a77826","Type":"ContainerStarted","Data":"9bce1473919e8ff65695c2e06bad679b8cf6e52ad703f08b2beb17c57b0f8f3b"} Mar 10 23:21:27 crc kubenswrapper[4919]: I0310 23:21:27.936009 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-ck5qr" Mar 10 23:21:27 crc kubenswrapper[4919]: I0310 23:21:27.942231 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-18ba-account-create-update-2wmzn" Mar 10 23:21:28 crc kubenswrapper[4919]: I0310 23:21:28.058372 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n6vxl\" (UniqueName: \"kubernetes.io/projected/1df0c044-9ad4-4c72-bf4c-7bd4f5ce723c-kube-api-access-n6vxl\") pod \"1df0c044-9ad4-4c72-bf4c-7bd4f5ce723c\" (UID: \"1df0c044-9ad4-4c72-bf4c-7bd4f5ce723c\") " Mar 10 23:21:28 crc kubenswrapper[4919]: I0310 23:21:28.058446 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jpjr9\" (UniqueName: \"kubernetes.io/projected/526d94b1-75fd-42aa-a1bb-829018a77826-kube-api-access-jpjr9\") pod \"526d94b1-75fd-42aa-a1bb-829018a77826\" (UID: \"526d94b1-75fd-42aa-a1bb-829018a77826\") " Mar 10 23:21:28 crc kubenswrapper[4919]: I0310 23:21:28.058608 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/526d94b1-75fd-42aa-a1bb-829018a77826-operator-scripts\") pod \"526d94b1-75fd-42aa-a1bb-829018a77826\" (UID: \"526d94b1-75fd-42aa-a1bb-829018a77826\") " Mar 10 23:21:28 crc kubenswrapper[4919]: I0310 23:21:28.058632 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1df0c044-9ad4-4c72-bf4c-7bd4f5ce723c-operator-scripts\") pod \"1df0c044-9ad4-4c72-bf4c-7bd4f5ce723c\" (UID: \"1df0c044-9ad4-4c72-bf4c-7bd4f5ce723c\") " Mar 10 23:21:28 crc kubenswrapper[4919]: I0310 23:21:28.059522 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/526d94b1-75fd-42aa-a1bb-829018a77826-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "526d94b1-75fd-42aa-a1bb-829018a77826" (UID: "526d94b1-75fd-42aa-a1bb-829018a77826"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 23:21:28 crc kubenswrapper[4919]: I0310 23:21:28.059543 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1df0c044-9ad4-4c72-bf4c-7bd4f5ce723c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1df0c044-9ad4-4c72-bf4c-7bd4f5ce723c" (UID: "1df0c044-9ad4-4c72-bf4c-7bd4f5ce723c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 23:21:28 crc kubenswrapper[4919]: I0310 23:21:28.064682 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/526d94b1-75fd-42aa-a1bb-829018a77826-kube-api-access-jpjr9" (OuterVolumeSpecName: "kube-api-access-jpjr9") pod "526d94b1-75fd-42aa-a1bb-829018a77826" (UID: "526d94b1-75fd-42aa-a1bb-829018a77826"). InnerVolumeSpecName "kube-api-access-jpjr9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 23:21:28 crc kubenswrapper[4919]: I0310 23:21:28.064930 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1df0c044-9ad4-4c72-bf4c-7bd4f5ce723c-kube-api-access-n6vxl" (OuterVolumeSpecName: "kube-api-access-n6vxl") pod "1df0c044-9ad4-4c72-bf4c-7bd4f5ce723c" (UID: "1df0c044-9ad4-4c72-bf4c-7bd4f5ce723c"). InnerVolumeSpecName "kube-api-access-n6vxl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 23:21:28 crc kubenswrapper[4919]: I0310 23:21:28.161083 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n6vxl\" (UniqueName: \"kubernetes.io/projected/1df0c044-9ad4-4c72-bf4c-7bd4f5ce723c-kube-api-access-n6vxl\") on node \"crc\" DevicePath \"\"" Mar 10 23:21:28 crc kubenswrapper[4919]: I0310 23:21:28.161121 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jpjr9\" (UniqueName: \"kubernetes.io/projected/526d94b1-75fd-42aa-a1bb-829018a77826-kube-api-access-jpjr9\") on node \"crc\" DevicePath \"\"" Mar 10 23:21:28 crc kubenswrapper[4919]: I0310 23:21:28.161131 4919 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/526d94b1-75fd-42aa-a1bb-829018a77826-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 23:21:28 crc kubenswrapper[4919]: I0310 23:21:28.161140 4919 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1df0c044-9ad4-4c72-bf4c-7bd4f5ce723c-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 23:21:28 crc kubenswrapper[4919]: I0310 23:21:28.463187 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-ck5qr" event={"ID":"1df0c044-9ad4-4c72-bf4c-7bd4f5ce723c","Type":"ContainerDied","Data":"68ba7f1c20259c34b9671907ef9b8222ae9a0b9e5171521f30c2c443691d90a5"} Mar 10 23:21:28 crc kubenswrapper[4919]: I0310 23:21:28.463241 4919 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="68ba7f1c20259c34b9671907ef9b8222ae9a0b9e5171521f30c2c443691d90a5" Mar 10 23:21:28 crc kubenswrapper[4919]: I0310 23:21:28.463195 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-ck5qr" Mar 10 23:21:28 crc kubenswrapper[4919]: I0310 23:21:28.466427 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-18ba-account-create-update-2wmzn" event={"ID":"526d94b1-75fd-42aa-a1bb-829018a77826","Type":"ContainerDied","Data":"9bce1473919e8ff65695c2e06bad679b8cf6e52ad703f08b2beb17c57b0f8f3b"} Mar 10 23:21:28 crc kubenswrapper[4919]: I0310 23:21:28.466471 4919 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9bce1473919e8ff65695c2e06bad679b8cf6e52ad703f08b2beb17c57b0f8f3b" Mar 10 23:21:28 crc kubenswrapper[4919]: I0310 23:21:28.466494 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-18ba-account-create-update-2wmzn" Mar 10 23:21:30 crc kubenswrapper[4919]: I0310 23:21:30.170294 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Mar 10 23:21:30 crc kubenswrapper[4919]: I0310 23:21:30.466938 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-6cvb8"] Mar 10 23:21:30 crc kubenswrapper[4919]: E0310 23:21:30.467894 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="526d94b1-75fd-42aa-a1bb-829018a77826" containerName="mariadb-account-create-update" Mar 10 23:21:30 crc kubenswrapper[4919]: I0310 23:21:30.468025 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="526d94b1-75fd-42aa-a1bb-829018a77826" containerName="mariadb-account-create-update" Mar 10 23:21:30 crc kubenswrapper[4919]: E0310 23:21:30.468164 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1df0c044-9ad4-4c72-bf4c-7bd4f5ce723c" containerName="mariadb-database-create" Mar 10 23:21:30 crc kubenswrapper[4919]: I0310 23:21:30.468254 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="1df0c044-9ad4-4c72-bf4c-7bd4f5ce723c" containerName="mariadb-database-create" Mar 10 23:21:30 crc 
kubenswrapper[4919]: I0310 23:21:30.468656 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="526d94b1-75fd-42aa-a1bb-829018a77826" containerName="mariadb-account-create-update" Mar 10 23:21:30 crc kubenswrapper[4919]: I0310 23:21:30.468808 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="1df0c044-9ad4-4c72-bf4c-7bd4f5ce723c" containerName="mariadb-database-create" Mar 10 23:21:30 crc kubenswrapper[4919]: I0310 23:21:30.469840 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-6cvb8" Mar 10 23:21:30 crc kubenswrapper[4919]: I0310 23:21:30.472406 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-tbx77" Mar 10 23:21:30 crc kubenswrapper[4919]: I0310 23:21:30.472778 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 10 23:21:30 crc kubenswrapper[4919]: I0310 23:21:30.472960 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 10 23:21:30 crc kubenswrapper[4919]: I0310 23:21:30.474228 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 10 23:21:30 crc kubenswrapper[4919]: I0310 23:21:30.480904 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-6cvb8"] Mar 10 23:21:30 crc kubenswrapper[4919]: I0310 23:21:30.608581 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cd5ae8b-b383-4d88-945d-4b494b3e322e-config-data\") pod \"keystone-db-sync-6cvb8\" (UID: \"7cd5ae8b-b383-4d88-945d-4b494b3e322e\") " pod="openstack/keystone-db-sync-6cvb8" Mar 10 23:21:30 crc kubenswrapper[4919]: I0310 23:21:30.608633 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/7cd5ae8b-b383-4d88-945d-4b494b3e322e-combined-ca-bundle\") pod \"keystone-db-sync-6cvb8\" (UID: \"7cd5ae8b-b383-4d88-945d-4b494b3e322e\") " pod="openstack/keystone-db-sync-6cvb8" Mar 10 23:21:30 crc kubenswrapper[4919]: I0310 23:21:30.608714 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t72jl\" (UniqueName: \"kubernetes.io/projected/7cd5ae8b-b383-4d88-945d-4b494b3e322e-kube-api-access-t72jl\") pod \"keystone-db-sync-6cvb8\" (UID: \"7cd5ae8b-b383-4d88-945d-4b494b3e322e\") " pod="openstack/keystone-db-sync-6cvb8" Mar 10 23:21:30 crc kubenswrapper[4919]: I0310 23:21:30.710590 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cd5ae8b-b383-4d88-945d-4b494b3e322e-config-data\") pod \"keystone-db-sync-6cvb8\" (UID: \"7cd5ae8b-b383-4d88-945d-4b494b3e322e\") " pod="openstack/keystone-db-sync-6cvb8" Mar 10 23:21:30 crc kubenswrapper[4919]: I0310 23:21:30.710664 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cd5ae8b-b383-4d88-945d-4b494b3e322e-combined-ca-bundle\") pod \"keystone-db-sync-6cvb8\" (UID: \"7cd5ae8b-b383-4d88-945d-4b494b3e322e\") " pod="openstack/keystone-db-sync-6cvb8" Mar 10 23:21:30 crc kubenswrapper[4919]: I0310 23:21:30.710776 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t72jl\" (UniqueName: \"kubernetes.io/projected/7cd5ae8b-b383-4d88-945d-4b494b3e322e-kube-api-access-t72jl\") pod \"keystone-db-sync-6cvb8\" (UID: \"7cd5ae8b-b383-4d88-945d-4b494b3e322e\") " pod="openstack/keystone-db-sync-6cvb8" Mar 10 23:21:30 crc kubenswrapper[4919]: I0310 23:21:30.716710 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/7cd5ae8b-b383-4d88-945d-4b494b3e322e-combined-ca-bundle\") pod \"keystone-db-sync-6cvb8\" (UID: \"7cd5ae8b-b383-4d88-945d-4b494b3e322e\") " pod="openstack/keystone-db-sync-6cvb8" Mar 10 23:21:30 crc kubenswrapper[4919]: I0310 23:21:30.717056 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cd5ae8b-b383-4d88-945d-4b494b3e322e-config-data\") pod \"keystone-db-sync-6cvb8\" (UID: \"7cd5ae8b-b383-4d88-945d-4b494b3e322e\") " pod="openstack/keystone-db-sync-6cvb8" Mar 10 23:21:30 crc kubenswrapper[4919]: I0310 23:21:30.726670 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t72jl\" (UniqueName: \"kubernetes.io/projected/7cd5ae8b-b383-4d88-945d-4b494b3e322e-kube-api-access-t72jl\") pod \"keystone-db-sync-6cvb8\" (UID: \"7cd5ae8b-b383-4d88-945d-4b494b3e322e\") " pod="openstack/keystone-db-sync-6cvb8" Mar 10 23:21:30 crc kubenswrapper[4919]: I0310 23:21:30.820726 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-6cvb8" Mar 10 23:21:31 crc kubenswrapper[4919]: I0310 23:21:31.268583 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-6cvb8"] Mar 10 23:21:31 crc kubenswrapper[4919]: W0310 23:21:31.273721 4919 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7cd5ae8b_b383_4d88_945d_4b494b3e322e.slice/crio-413d4a0316391c32a2ab6a530bcdc4bc782ddbabdc3506f8794bff77adafd81f WatchSource:0}: Error finding container 413d4a0316391c32a2ab6a530bcdc4bc782ddbabdc3506f8794bff77adafd81f: Status 404 returned error can't find the container with id 413d4a0316391c32a2ab6a530bcdc4bc782ddbabdc3506f8794bff77adafd81f Mar 10 23:21:31 crc kubenswrapper[4919]: I0310 23:21:31.502640 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-6cvb8" event={"ID":"7cd5ae8b-b383-4d88-945d-4b494b3e322e","Type":"ContainerStarted","Data":"06195add5c6c87f0958d7c873603cf9e3bfc699a62ce88aa893f5c12f04a5554"} Mar 10 23:21:31 crc kubenswrapper[4919]: I0310 23:21:31.502683 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-6cvb8" event={"ID":"7cd5ae8b-b383-4d88-945d-4b494b3e322e","Type":"ContainerStarted","Data":"413d4a0316391c32a2ab6a530bcdc4bc782ddbabdc3506f8794bff77adafd81f"} Mar 10 23:21:31 crc kubenswrapper[4919]: I0310 23:21:31.527107 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-6cvb8" podStartSLOduration=1.52708136 podStartE2EDuration="1.52708136s" podCreationTimestamp="2026-03-10 23:21:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 23:21:31.52594488 +0000 UTC m=+5478.767825528" watchObservedRunningTime="2026-03-10 23:21:31.52708136 +0000 UTC m=+5478.768961998" Mar 10 23:21:33 crc kubenswrapper[4919]: I0310 
23:21:33.522709 4919 generic.go:334] "Generic (PLEG): container finished" podID="7cd5ae8b-b383-4d88-945d-4b494b3e322e" containerID="06195add5c6c87f0958d7c873603cf9e3bfc699a62ce88aa893f5c12f04a5554" exitCode=0 Mar 10 23:21:33 crc kubenswrapper[4919]: I0310 23:21:33.522777 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-6cvb8" event={"ID":"7cd5ae8b-b383-4d88-945d-4b494b3e322e","Type":"ContainerDied","Data":"06195add5c6c87f0958d7c873603cf9e3bfc699a62ce88aa893f5c12f04a5554"} Mar 10 23:21:35 crc kubenswrapper[4919]: I0310 23:21:35.930594 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-6cvb8" Mar 10 23:21:36 crc kubenswrapper[4919]: I0310 23:21:36.012916 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t72jl\" (UniqueName: \"kubernetes.io/projected/7cd5ae8b-b383-4d88-945d-4b494b3e322e-kube-api-access-t72jl\") pod \"7cd5ae8b-b383-4d88-945d-4b494b3e322e\" (UID: \"7cd5ae8b-b383-4d88-945d-4b494b3e322e\") " Mar 10 23:21:36 crc kubenswrapper[4919]: I0310 23:21:36.013116 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cd5ae8b-b383-4d88-945d-4b494b3e322e-config-data\") pod \"7cd5ae8b-b383-4d88-945d-4b494b3e322e\" (UID: \"7cd5ae8b-b383-4d88-945d-4b494b3e322e\") " Mar 10 23:21:36 crc kubenswrapper[4919]: I0310 23:21:36.013165 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cd5ae8b-b383-4d88-945d-4b494b3e322e-combined-ca-bundle\") pod \"7cd5ae8b-b383-4d88-945d-4b494b3e322e\" (UID: \"7cd5ae8b-b383-4d88-945d-4b494b3e322e\") " Mar 10 23:21:36 crc kubenswrapper[4919]: I0310 23:21:36.020207 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7cd5ae8b-b383-4d88-945d-4b494b3e322e-kube-api-access-t72jl" 
(OuterVolumeSpecName: "kube-api-access-t72jl") pod "7cd5ae8b-b383-4d88-945d-4b494b3e322e" (UID: "7cd5ae8b-b383-4d88-945d-4b494b3e322e"). InnerVolumeSpecName "kube-api-access-t72jl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 23:21:36 crc kubenswrapper[4919]: I0310 23:21:36.036331 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7cd5ae8b-b383-4d88-945d-4b494b3e322e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7cd5ae8b-b383-4d88-945d-4b494b3e322e" (UID: "7cd5ae8b-b383-4d88-945d-4b494b3e322e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 23:21:36 crc kubenswrapper[4919]: I0310 23:21:36.075277 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7cd5ae8b-b383-4d88-945d-4b494b3e322e-config-data" (OuterVolumeSpecName: "config-data") pod "7cd5ae8b-b383-4d88-945d-4b494b3e322e" (UID: "7cd5ae8b-b383-4d88-945d-4b494b3e322e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 23:21:36 crc kubenswrapper[4919]: I0310 23:21:36.115849 4919 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cd5ae8b-b383-4d88-945d-4b494b3e322e-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 23:21:36 crc kubenswrapper[4919]: I0310 23:21:36.115900 4919 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cd5ae8b-b383-4d88-945d-4b494b3e322e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 23:21:36 crc kubenswrapper[4919]: I0310 23:21:36.115917 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t72jl\" (UniqueName: \"kubernetes.io/projected/7cd5ae8b-b383-4d88-945d-4b494b3e322e-kube-api-access-t72jl\") on node \"crc\" DevicePath \"\"" Mar 10 23:21:36 crc kubenswrapper[4919]: I0310 23:21:36.554489 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-6cvb8" event={"ID":"7cd5ae8b-b383-4d88-945d-4b494b3e322e","Type":"ContainerDied","Data":"413d4a0316391c32a2ab6a530bcdc4bc782ddbabdc3506f8794bff77adafd81f"} Mar 10 23:21:36 crc kubenswrapper[4919]: I0310 23:21:36.554548 4919 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="413d4a0316391c32a2ab6a530bcdc4bc782ddbabdc3506f8794bff77adafd81f" Mar 10 23:21:36 crc kubenswrapper[4919]: I0310 23:21:36.554558 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-6cvb8" Mar 10 23:21:37 crc kubenswrapper[4919]: I0310 23:21:37.201566 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-66d5956757-c9nr5"] Mar 10 23:21:37 crc kubenswrapper[4919]: E0310 23:21:37.211064 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cd5ae8b-b383-4d88-945d-4b494b3e322e" containerName="keystone-db-sync" Mar 10 23:21:37 crc kubenswrapper[4919]: I0310 23:21:37.211110 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cd5ae8b-b383-4d88-945d-4b494b3e322e" containerName="keystone-db-sync" Mar 10 23:21:37 crc kubenswrapper[4919]: I0310 23:21:37.229479 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="7cd5ae8b-b383-4d88-945d-4b494b3e322e" containerName="keystone-db-sync" Mar 10 23:21:37 crc kubenswrapper[4919]: I0310 23:21:37.233642 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-66d5956757-c9nr5" Mar 10 23:21:37 crc kubenswrapper[4919]: I0310 23:21:37.236312 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-66d5956757-c9nr5"] Mar 10 23:21:37 crc kubenswrapper[4919]: I0310 23:21:37.252598 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-4hjzp"] Mar 10 23:21:37 crc kubenswrapper[4919]: I0310 23:21:37.260867 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-4hjzp" Mar 10 23:21:37 crc kubenswrapper[4919]: I0310 23:21:37.262367 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-4hjzp"] Mar 10 23:21:37 crc kubenswrapper[4919]: I0310 23:21:37.264837 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 10 23:21:37 crc kubenswrapper[4919]: I0310 23:21:37.265227 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 10 23:21:37 crc kubenswrapper[4919]: I0310 23:21:37.265530 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-tbx77" Mar 10 23:21:37 crc kubenswrapper[4919]: I0310 23:21:37.265775 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 10 23:21:37 crc kubenswrapper[4919]: I0310 23:21:37.266050 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 10 23:21:37 crc kubenswrapper[4919]: I0310 23:21:37.337748 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1de63e5e-28f2-49b3-95b0-3400b96262e8-scripts\") pod \"keystone-bootstrap-4hjzp\" (UID: \"1de63e5e-28f2-49b3-95b0-3400b96262e8\") " pod="openstack/keystone-bootstrap-4hjzp" Mar 10 23:21:37 crc kubenswrapper[4919]: I0310 23:21:37.337798 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1de63e5e-28f2-49b3-95b0-3400b96262e8-fernet-keys\") pod \"keystone-bootstrap-4hjzp\" (UID: \"1de63e5e-28f2-49b3-95b0-3400b96262e8\") " pod="openstack/keystone-bootstrap-4hjzp" Mar 10 23:21:37 crc kubenswrapper[4919]: I0310 23:21:37.337842 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/0daed46b-e851-4d26-9867-827a7973aece-dns-svc\") pod \"dnsmasq-dns-66d5956757-c9nr5\" (UID: \"0daed46b-e851-4d26-9867-827a7973aece\") " pod="openstack/dnsmasq-dns-66d5956757-c9nr5" Mar 10 23:21:37 crc kubenswrapper[4919]: I0310 23:21:37.337870 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0daed46b-e851-4d26-9867-827a7973aece-ovsdbserver-sb\") pod \"dnsmasq-dns-66d5956757-c9nr5\" (UID: \"0daed46b-e851-4d26-9867-827a7973aece\") " pod="openstack/dnsmasq-dns-66d5956757-c9nr5" Mar 10 23:21:37 crc kubenswrapper[4919]: I0310 23:21:37.337934 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0daed46b-e851-4d26-9867-827a7973aece-ovsdbserver-nb\") pod \"dnsmasq-dns-66d5956757-c9nr5\" (UID: \"0daed46b-e851-4d26-9867-827a7973aece\") " pod="openstack/dnsmasq-dns-66d5956757-c9nr5" Mar 10 23:21:37 crc kubenswrapper[4919]: I0310 23:21:37.337977 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0daed46b-e851-4d26-9867-827a7973aece-config\") pod \"dnsmasq-dns-66d5956757-c9nr5\" (UID: \"0daed46b-e851-4d26-9867-827a7973aece\") " pod="openstack/dnsmasq-dns-66d5956757-c9nr5" Mar 10 23:21:37 crc kubenswrapper[4919]: I0310 23:21:37.338108 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1de63e5e-28f2-49b3-95b0-3400b96262e8-config-data\") pod \"keystone-bootstrap-4hjzp\" (UID: \"1de63e5e-28f2-49b3-95b0-3400b96262e8\") " pod="openstack/keystone-bootstrap-4hjzp" Mar 10 23:21:37 crc kubenswrapper[4919]: I0310 23:21:37.338210 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p8k82\" 
(UniqueName: \"kubernetes.io/projected/1de63e5e-28f2-49b3-95b0-3400b96262e8-kube-api-access-p8k82\") pod \"keystone-bootstrap-4hjzp\" (UID: \"1de63e5e-28f2-49b3-95b0-3400b96262e8\") " pod="openstack/keystone-bootstrap-4hjzp" Mar 10 23:21:37 crc kubenswrapper[4919]: I0310 23:21:37.338285 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1de63e5e-28f2-49b3-95b0-3400b96262e8-combined-ca-bundle\") pod \"keystone-bootstrap-4hjzp\" (UID: \"1de63e5e-28f2-49b3-95b0-3400b96262e8\") " pod="openstack/keystone-bootstrap-4hjzp" Mar 10 23:21:37 crc kubenswrapper[4919]: I0310 23:21:37.338367 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gz5rr\" (UniqueName: \"kubernetes.io/projected/0daed46b-e851-4d26-9867-827a7973aece-kube-api-access-gz5rr\") pod \"dnsmasq-dns-66d5956757-c9nr5\" (UID: \"0daed46b-e851-4d26-9867-827a7973aece\") " pod="openstack/dnsmasq-dns-66d5956757-c9nr5" Mar 10 23:21:37 crc kubenswrapper[4919]: I0310 23:21:37.338436 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1de63e5e-28f2-49b3-95b0-3400b96262e8-credential-keys\") pod \"keystone-bootstrap-4hjzp\" (UID: \"1de63e5e-28f2-49b3-95b0-3400b96262e8\") " pod="openstack/keystone-bootstrap-4hjzp" Mar 10 23:21:37 crc kubenswrapper[4919]: I0310 23:21:37.439843 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gz5rr\" (UniqueName: \"kubernetes.io/projected/0daed46b-e851-4d26-9867-827a7973aece-kube-api-access-gz5rr\") pod \"dnsmasq-dns-66d5956757-c9nr5\" (UID: \"0daed46b-e851-4d26-9867-827a7973aece\") " pod="openstack/dnsmasq-dns-66d5956757-c9nr5" Mar 10 23:21:37 crc kubenswrapper[4919]: I0310 23:21:37.440253 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"credential-keys\" (UniqueName: \"kubernetes.io/secret/1de63e5e-28f2-49b3-95b0-3400b96262e8-credential-keys\") pod \"keystone-bootstrap-4hjzp\" (UID: \"1de63e5e-28f2-49b3-95b0-3400b96262e8\") " pod="openstack/keystone-bootstrap-4hjzp" Mar 10 23:21:37 crc kubenswrapper[4919]: I0310 23:21:37.440368 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1de63e5e-28f2-49b3-95b0-3400b96262e8-scripts\") pod \"keystone-bootstrap-4hjzp\" (UID: \"1de63e5e-28f2-49b3-95b0-3400b96262e8\") " pod="openstack/keystone-bootstrap-4hjzp" Mar 10 23:21:37 crc kubenswrapper[4919]: I0310 23:21:37.440442 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1de63e5e-28f2-49b3-95b0-3400b96262e8-fernet-keys\") pod \"keystone-bootstrap-4hjzp\" (UID: \"1de63e5e-28f2-49b3-95b0-3400b96262e8\") " pod="openstack/keystone-bootstrap-4hjzp" Mar 10 23:21:37 crc kubenswrapper[4919]: I0310 23:21:37.440491 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0daed46b-e851-4d26-9867-827a7973aece-dns-svc\") pod \"dnsmasq-dns-66d5956757-c9nr5\" (UID: \"0daed46b-e851-4d26-9867-827a7973aece\") " pod="openstack/dnsmasq-dns-66d5956757-c9nr5" Mar 10 23:21:37 crc kubenswrapper[4919]: I0310 23:21:37.440516 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0daed46b-e851-4d26-9867-827a7973aece-ovsdbserver-sb\") pod \"dnsmasq-dns-66d5956757-c9nr5\" (UID: \"0daed46b-e851-4d26-9867-827a7973aece\") " pod="openstack/dnsmasq-dns-66d5956757-c9nr5" Mar 10 23:21:37 crc kubenswrapper[4919]: I0310 23:21:37.440539 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0daed46b-e851-4d26-9867-827a7973aece-ovsdbserver-nb\") pod 
\"dnsmasq-dns-66d5956757-c9nr5\" (UID: \"0daed46b-e851-4d26-9867-827a7973aece\") " pod="openstack/dnsmasq-dns-66d5956757-c9nr5" Mar 10 23:21:37 crc kubenswrapper[4919]: I0310 23:21:37.440564 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0daed46b-e851-4d26-9867-827a7973aece-config\") pod \"dnsmasq-dns-66d5956757-c9nr5\" (UID: \"0daed46b-e851-4d26-9867-827a7973aece\") " pod="openstack/dnsmasq-dns-66d5956757-c9nr5" Mar 10 23:21:37 crc kubenswrapper[4919]: I0310 23:21:37.440595 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1de63e5e-28f2-49b3-95b0-3400b96262e8-config-data\") pod \"keystone-bootstrap-4hjzp\" (UID: \"1de63e5e-28f2-49b3-95b0-3400b96262e8\") " pod="openstack/keystone-bootstrap-4hjzp" Mar 10 23:21:37 crc kubenswrapper[4919]: I0310 23:21:37.440634 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p8k82\" (UniqueName: \"kubernetes.io/projected/1de63e5e-28f2-49b3-95b0-3400b96262e8-kube-api-access-p8k82\") pod \"keystone-bootstrap-4hjzp\" (UID: \"1de63e5e-28f2-49b3-95b0-3400b96262e8\") " pod="openstack/keystone-bootstrap-4hjzp" Mar 10 23:21:37 crc kubenswrapper[4919]: I0310 23:21:37.440664 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1de63e5e-28f2-49b3-95b0-3400b96262e8-combined-ca-bundle\") pod \"keystone-bootstrap-4hjzp\" (UID: \"1de63e5e-28f2-49b3-95b0-3400b96262e8\") " pod="openstack/keystone-bootstrap-4hjzp" Mar 10 23:21:37 crc kubenswrapper[4919]: I0310 23:21:37.441950 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0daed46b-e851-4d26-9867-827a7973aece-ovsdbserver-nb\") pod \"dnsmasq-dns-66d5956757-c9nr5\" (UID: \"0daed46b-e851-4d26-9867-827a7973aece\") " 
pod="openstack/dnsmasq-dns-66d5956757-c9nr5" Mar 10 23:21:37 crc kubenswrapper[4919]: I0310 23:21:37.442147 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0daed46b-e851-4d26-9867-827a7973aece-ovsdbserver-sb\") pod \"dnsmasq-dns-66d5956757-c9nr5\" (UID: \"0daed46b-e851-4d26-9867-827a7973aece\") " pod="openstack/dnsmasq-dns-66d5956757-c9nr5" Mar 10 23:21:37 crc kubenswrapper[4919]: I0310 23:21:37.442656 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0daed46b-e851-4d26-9867-827a7973aece-dns-svc\") pod \"dnsmasq-dns-66d5956757-c9nr5\" (UID: \"0daed46b-e851-4d26-9867-827a7973aece\") " pod="openstack/dnsmasq-dns-66d5956757-c9nr5" Mar 10 23:21:37 crc kubenswrapper[4919]: I0310 23:21:37.443007 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0daed46b-e851-4d26-9867-827a7973aece-config\") pod \"dnsmasq-dns-66d5956757-c9nr5\" (UID: \"0daed46b-e851-4d26-9867-827a7973aece\") " pod="openstack/dnsmasq-dns-66d5956757-c9nr5" Mar 10 23:21:37 crc kubenswrapper[4919]: I0310 23:21:37.446037 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1de63e5e-28f2-49b3-95b0-3400b96262e8-combined-ca-bundle\") pod \"keystone-bootstrap-4hjzp\" (UID: \"1de63e5e-28f2-49b3-95b0-3400b96262e8\") " pod="openstack/keystone-bootstrap-4hjzp" Mar 10 23:21:37 crc kubenswrapper[4919]: I0310 23:21:37.446547 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1de63e5e-28f2-49b3-95b0-3400b96262e8-config-data\") pod \"keystone-bootstrap-4hjzp\" (UID: \"1de63e5e-28f2-49b3-95b0-3400b96262e8\") " pod="openstack/keystone-bootstrap-4hjzp" Mar 10 23:21:37 crc kubenswrapper[4919]: I0310 23:21:37.447331 4919 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1de63e5e-28f2-49b3-95b0-3400b96262e8-fernet-keys\") pod \"keystone-bootstrap-4hjzp\" (UID: \"1de63e5e-28f2-49b3-95b0-3400b96262e8\") " pod="openstack/keystone-bootstrap-4hjzp" Mar 10 23:21:37 crc kubenswrapper[4919]: I0310 23:21:37.448803 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1de63e5e-28f2-49b3-95b0-3400b96262e8-credential-keys\") pod \"keystone-bootstrap-4hjzp\" (UID: \"1de63e5e-28f2-49b3-95b0-3400b96262e8\") " pod="openstack/keystone-bootstrap-4hjzp" Mar 10 23:21:37 crc kubenswrapper[4919]: I0310 23:21:37.459165 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1de63e5e-28f2-49b3-95b0-3400b96262e8-scripts\") pod \"keystone-bootstrap-4hjzp\" (UID: \"1de63e5e-28f2-49b3-95b0-3400b96262e8\") " pod="openstack/keystone-bootstrap-4hjzp" Mar 10 23:21:37 crc kubenswrapper[4919]: I0310 23:21:37.461593 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gz5rr\" (UniqueName: \"kubernetes.io/projected/0daed46b-e851-4d26-9867-827a7973aece-kube-api-access-gz5rr\") pod \"dnsmasq-dns-66d5956757-c9nr5\" (UID: \"0daed46b-e851-4d26-9867-827a7973aece\") " pod="openstack/dnsmasq-dns-66d5956757-c9nr5" Mar 10 23:21:37 crc kubenswrapper[4919]: I0310 23:21:37.466877 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p8k82\" (UniqueName: \"kubernetes.io/projected/1de63e5e-28f2-49b3-95b0-3400b96262e8-kube-api-access-p8k82\") pod \"keystone-bootstrap-4hjzp\" (UID: \"1de63e5e-28f2-49b3-95b0-3400b96262e8\") " pod="openstack/keystone-bootstrap-4hjzp" Mar 10 23:21:37 crc kubenswrapper[4919]: I0310 23:21:37.563933 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-66d5956757-c9nr5" Mar 10 23:21:37 crc kubenswrapper[4919]: I0310 23:21:37.583868 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-4hjzp" Mar 10 23:21:38 crc kubenswrapper[4919]: W0310 23:21:38.069749 4919 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0daed46b_e851_4d26_9867_827a7973aece.slice/crio-d71be2cb51ac99ef3ada12393ebeb9082027040c3f9e9f55b00fbd806ad659d7 WatchSource:0}: Error finding container d71be2cb51ac99ef3ada12393ebeb9082027040c3f9e9f55b00fbd806ad659d7: Status 404 returned error can't find the container with id d71be2cb51ac99ef3ada12393ebeb9082027040c3f9e9f55b00fbd806ad659d7 Mar 10 23:21:38 crc kubenswrapper[4919]: I0310 23:21:38.080709 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-66d5956757-c9nr5"] Mar 10 23:21:38 crc kubenswrapper[4919]: I0310 23:21:38.169052 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-4hjzp"] Mar 10 23:21:38 crc kubenswrapper[4919]: I0310 23:21:38.573548 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-4hjzp" event={"ID":"1de63e5e-28f2-49b3-95b0-3400b96262e8","Type":"ContainerStarted","Data":"e6b38706ddc51ac06fd68b55fcbd250a38f1d602f1ea64638ebc97f024f4f9ee"} Mar 10 23:21:38 crc kubenswrapper[4919]: I0310 23:21:38.573841 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-4hjzp" event={"ID":"1de63e5e-28f2-49b3-95b0-3400b96262e8","Type":"ContainerStarted","Data":"4faf7f6c54173e2910e1fd2f87dd4ed780fb32fe14d63b5d383bc1e0cda7f32c"} Mar 10 23:21:38 crc kubenswrapper[4919]: I0310 23:21:38.575724 4919 generic.go:334] "Generic (PLEG): container finished" podID="0daed46b-e851-4d26-9867-827a7973aece" containerID="4156b77a6f4428debb77fbc3813f211736a5d1cb0438b2ca14f97e0acc3522c6" exitCode=0 Mar 10 
23:21:38 crc kubenswrapper[4919]: I0310 23:21:38.575797 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66d5956757-c9nr5" event={"ID":"0daed46b-e851-4d26-9867-827a7973aece","Type":"ContainerDied","Data":"4156b77a6f4428debb77fbc3813f211736a5d1cb0438b2ca14f97e0acc3522c6"} Mar 10 23:21:38 crc kubenswrapper[4919]: I0310 23:21:38.575832 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66d5956757-c9nr5" event={"ID":"0daed46b-e851-4d26-9867-827a7973aece","Type":"ContainerStarted","Data":"d71be2cb51ac99ef3ada12393ebeb9082027040c3f9e9f55b00fbd806ad659d7"} Mar 10 23:21:38 crc kubenswrapper[4919]: I0310 23:21:38.598534 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-4hjzp" podStartSLOduration=1.598515334 podStartE2EDuration="1.598515334s" podCreationTimestamp="2026-03-10 23:21:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 23:21:38.592238323 +0000 UTC m=+5485.834118931" watchObservedRunningTime="2026-03-10 23:21:38.598515334 +0000 UTC m=+5485.840395942" Mar 10 23:21:39 crc kubenswrapper[4919]: I0310 23:21:39.586075 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66d5956757-c9nr5" event={"ID":"0daed46b-e851-4d26-9867-827a7973aece","Type":"ContainerStarted","Data":"30114ca0cfadf33ae7aad0d79551ec1bd3efe0ec1c1e2f4bd7409647124cf169"} Mar 10 23:21:39 crc kubenswrapper[4919]: I0310 23:21:39.586408 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-66d5956757-c9nr5" Mar 10 23:21:39 crc kubenswrapper[4919]: I0310 23:21:39.604960 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-66d5956757-c9nr5" podStartSLOduration=2.604939459 podStartE2EDuration="2.604939459s" podCreationTimestamp="2026-03-10 23:21:37 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 23:21:39.604535887 +0000 UTC m=+5486.846416495" watchObservedRunningTime="2026-03-10 23:21:39.604939459 +0000 UTC m=+5486.846820067" Mar 10 23:21:42 crc kubenswrapper[4919]: I0310 23:21:42.607362 4919 generic.go:334] "Generic (PLEG): container finished" podID="1de63e5e-28f2-49b3-95b0-3400b96262e8" containerID="e6b38706ddc51ac06fd68b55fcbd250a38f1d602f1ea64638ebc97f024f4f9ee" exitCode=0 Mar 10 23:21:42 crc kubenswrapper[4919]: I0310 23:21:42.607463 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-4hjzp" event={"ID":"1de63e5e-28f2-49b3-95b0-3400b96262e8","Type":"ContainerDied","Data":"e6b38706ddc51ac06fd68b55fcbd250a38f1d602f1ea64638ebc97f024f4f9ee"} Mar 10 23:21:43 crc kubenswrapper[4919]: I0310 23:21:43.964401 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-4hjzp" Mar 10 23:21:44 crc kubenswrapper[4919]: I0310 23:21:44.062123 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1de63e5e-28f2-49b3-95b0-3400b96262e8-combined-ca-bundle\") pod \"1de63e5e-28f2-49b3-95b0-3400b96262e8\" (UID: \"1de63e5e-28f2-49b3-95b0-3400b96262e8\") " Mar 10 23:21:44 crc kubenswrapper[4919]: I0310 23:21:44.062220 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p8k82\" (UniqueName: \"kubernetes.io/projected/1de63e5e-28f2-49b3-95b0-3400b96262e8-kube-api-access-p8k82\") pod \"1de63e5e-28f2-49b3-95b0-3400b96262e8\" (UID: \"1de63e5e-28f2-49b3-95b0-3400b96262e8\") " Mar 10 23:21:44 crc kubenswrapper[4919]: I0310 23:21:44.062258 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/1de63e5e-28f2-49b3-95b0-3400b96262e8-credential-keys\") pod \"1de63e5e-28f2-49b3-95b0-3400b96262e8\" (UID: \"1de63e5e-28f2-49b3-95b0-3400b96262e8\") " Mar 10 23:21:44 crc kubenswrapper[4919]: I0310 23:21:44.062274 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1de63e5e-28f2-49b3-95b0-3400b96262e8-config-data\") pod \"1de63e5e-28f2-49b3-95b0-3400b96262e8\" (UID: \"1de63e5e-28f2-49b3-95b0-3400b96262e8\") " Mar 10 23:21:44 crc kubenswrapper[4919]: I0310 23:21:44.062301 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1de63e5e-28f2-49b3-95b0-3400b96262e8-scripts\") pod \"1de63e5e-28f2-49b3-95b0-3400b96262e8\" (UID: \"1de63e5e-28f2-49b3-95b0-3400b96262e8\") " Mar 10 23:21:44 crc kubenswrapper[4919]: I0310 23:21:44.062348 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1de63e5e-28f2-49b3-95b0-3400b96262e8-fernet-keys\") pod \"1de63e5e-28f2-49b3-95b0-3400b96262e8\" (UID: \"1de63e5e-28f2-49b3-95b0-3400b96262e8\") " Mar 10 23:21:44 crc kubenswrapper[4919]: I0310 23:21:44.067420 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1de63e5e-28f2-49b3-95b0-3400b96262e8-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "1de63e5e-28f2-49b3-95b0-3400b96262e8" (UID: "1de63e5e-28f2-49b3-95b0-3400b96262e8"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 23:21:44 crc kubenswrapper[4919]: I0310 23:21:44.067768 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1de63e5e-28f2-49b3-95b0-3400b96262e8-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "1de63e5e-28f2-49b3-95b0-3400b96262e8" (UID: "1de63e5e-28f2-49b3-95b0-3400b96262e8"). 
InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 23:21:44 crc kubenswrapper[4919]: I0310 23:21:44.067786 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1de63e5e-28f2-49b3-95b0-3400b96262e8-kube-api-access-p8k82" (OuterVolumeSpecName: "kube-api-access-p8k82") pod "1de63e5e-28f2-49b3-95b0-3400b96262e8" (UID: "1de63e5e-28f2-49b3-95b0-3400b96262e8"). InnerVolumeSpecName "kube-api-access-p8k82". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 23:21:44 crc kubenswrapper[4919]: I0310 23:21:44.068483 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1de63e5e-28f2-49b3-95b0-3400b96262e8-scripts" (OuterVolumeSpecName: "scripts") pod "1de63e5e-28f2-49b3-95b0-3400b96262e8" (UID: "1de63e5e-28f2-49b3-95b0-3400b96262e8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 23:21:44 crc kubenswrapper[4919]: I0310 23:21:44.084307 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1de63e5e-28f2-49b3-95b0-3400b96262e8-config-data" (OuterVolumeSpecName: "config-data") pod "1de63e5e-28f2-49b3-95b0-3400b96262e8" (UID: "1de63e5e-28f2-49b3-95b0-3400b96262e8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 23:21:44 crc kubenswrapper[4919]: I0310 23:21:44.084794 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1de63e5e-28f2-49b3-95b0-3400b96262e8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1de63e5e-28f2-49b3-95b0-3400b96262e8" (UID: "1de63e5e-28f2-49b3-95b0-3400b96262e8"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 23:21:44 crc kubenswrapper[4919]: I0310 23:21:44.164792 4919 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1de63e5e-28f2-49b3-95b0-3400b96262e8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 23:21:44 crc kubenswrapper[4919]: I0310 23:21:44.164829 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p8k82\" (UniqueName: \"kubernetes.io/projected/1de63e5e-28f2-49b3-95b0-3400b96262e8-kube-api-access-p8k82\") on node \"crc\" DevicePath \"\"" Mar 10 23:21:44 crc kubenswrapper[4919]: I0310 23:21:44.164842 4919 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1de63e5e-28f2-49b3-95b0-3400b96262e8-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 10 23:21:44 crc kubenswrapper[4919]: I0310 23:21:44.164854 4919 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1de63e5e-28f2-49b3-95b0-3400b96262e8-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 23:21:44 crc kubenswrapper[4919]: I0310 23:21:44.164864 4919 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1de63e5e-28f2-49b3-95b0-3400b96262e8-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 23:21:44 crc kubenswrapper[4919]: I0310 23:21:44.164874 4919 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1de63e5e-28f2-49b3-95b0-3400b96262e8-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 10 23:21:44 crc kubenswrapper[4919]: I0310 23:21:44.625468 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-4hjzp" event={"ID":"1de63e5e-28f2-49b3-95b0-3400b96262e8","Type":"ContainerDied","Data":"4faf7f6c54173e2910e1fd2f87dd4ed780fb32fe14d63b5d383bc1e0cda7f32c"} Mar 10 23:21:44 crc kubenswrapper[4919]: I0310 
23:21:44.625509 4919 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4faf7f6c54173e2910e1fd2f87dd4ed780fb32fe14d63b5d383bc1e0cda7f32c" Mar 10 23:21:44 crc kubenswrapper[4919]: I0310 23:21:44.625557 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-4hjzp" Mar 10 23:21:44 crc kubenswrapper[4919]: I0310 23:21:44.712154 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-4hjzp"] Mar 10 23:21:44 crc kubenswrapper[4919]: I0310 23:21:44.718218 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-4hjzp"] Mar 10 23:21:44 crc kubenswrapper[4919]: I0310 23:21:44.825904 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-vgm8x"] Mar 10 23:21:44 crc kubenswrapper[4919]: E0310 23:21:44.826230 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1de63e5e-28f2-49b3-95b0-3400b96262e8" containerName="keystone-bootstrap" Mar 10 23:21:44 crc kubenswrapper[4919]: I0310 23:21:44.826249 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="1de63e5e-28f2-49b3-95b0-3400b96262e8" containerName="keystone-bootstrap" Mar 10 23:21:44 crc kubenswrapper[4919]: I0310 23:21:44.826503 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="1de63e5e-28f2-49b3-95b0-3400b96262e8" containerName="keystone-bootstrap" Mar 10 23:21:44 crc kubenswrapper[4919]: I0310 23:21:44.827049 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-vgm8x" Mar 10 23:21:44 crc kubenswrapper[4919]: I0310 23:21:44.830471 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 10 23:21:44 crc kubenswrapper[4919]: I0310 23:21:44.830600 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 10 23:21:44 crc kubenswrapper[4919]: I0310 23:21:44.830477 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-tbx77" Mar 10 23:21:44 crc kubenswrapper[4919]: I0310 23:21:44.830788 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 10 23:21:44 crc kubenswrapper[4919]: I0310 23:21:44.832766 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 10 23:21:44 crc kubenswrapper[4919]: I0310 23:21:44.845139 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-vgm8x"] Mar 10 23:21:44 crc kubenswrapper[4919]: I0310 23:21:44.881370 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd1e6096-3ede-4a8d-a82c-6c919dafb2d8-combined-ca-bundle\") pod \"keystone-bootstrap-vgm8x\" (UID: \"dd1e6096-3ede-4a8d-a82c-6c919dafb2d8\") " pod="openstack/keystone-bootstrap-vgm8x" Mar 10 23:21:44 crc kubenswrapper[4919]: I0310 23:21:44.881668 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd1e6096-3ede-4a8d-a82c-6c919dafb2d8-config-data\") pod \"keystone-bootstrap-vgm8x\" (UID: \"dd1e6096-3ede-4a8d-a82c-6c919dafb2d8\") " pod="openstack/keystone-bootstrap-vgm8x" Mar 10 23:21:44 crc kubenswrapper[4919]: I0310 23:21:44.881689 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/dd1e6096-3ede-4a8d-a82c-6c919dafb2d8-scripts\") pod \"keystone-bootstrap-vgm8x\" (UID: \"dd1e6096-3ede-4a8d-a82c-6c919dafb2d8\") " pod="openstack/keystone-bootstrap-vgm8x" Mar 10 23:21:44 crc kubenswrapper[4919]: I0310 23:21:44.881746 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/dd1e6096-3ede-4a8d-a82c-6c919dafb2d8-fernet-keys\") pod \"keystone-bootstrap-vgm8x\" (UID: \"dd1e6096-3ede-4a8d-a82c-6c919dafb2d8\") " pod="openstack/keystone-bootstrap-vgm8x" Mar 10 23:21:44 crc kubenswrapper[4919]: I0310 23:21:44.881766 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/dd1e6096-3ede-4a8d-a82c-6c919dafb2d8-credential-keys\") pod \"keystone-bootstrap-vgm8x\" (UID: \"dd1e6096-3ede-4a8d-a82c-6c919dafb2d8\") " pod="openstack/keystone-bootstrap-vgm8x" Mar 10 23:21:44 crc kubenswrapper[4919]: I0310 23:21:44.881855 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cv76h\" (UniqueName: \"kubernetes.io/projected/dd1e6096-3ede-4a8d-a82c-6c919dafb2d8-kube-api-access-cv76h\") pod \"keystone-bootstrap-vgm8x\" (UID: \"dd1e6096-3ede-4a8d-a82c-6c919dafb2d8\") " pod="openstack/keystone-bootstrap-vgm8x" Mar 10 23:21:44 crc kubenswrapper[4919]: I0310 23:21:44.983385 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cv76h\" (UniqueName: \"kubernetes.io/projected/dd1e6096-3ede-4a8d-a82c-6c919dafb2d8-kube-api-access-cv76h\") pod \"keystone-bootstrap-vgm8x\" (UID: \"dd1e6096-3ede-4a8d-a82c-6c919dafb2d8\") " pod="openstack/keystone-bootstrap-vgm8x" Mar 10 23:21:44 crc kubenswrapper[4919]: I0310 23:21:44.983480 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/dd1e6096-3ede-4a8d-a82c-6c919dafb2d8-combined-ca-bundle\") pod \"keystone-bootstrap-vgm8x\" (UID: \"dd1e6096-3ede-4a8d-a82c-6c919dafb2d8\") " pod="openstack/keystone-bootstrap-vgm8x" Mar 10 23:21:44 crc kubenswrapper[4919]: I0310 23:21:44.983573 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dd1e6096-3ede-4a8d-a82c-6c919dafb2d8-scripts\") pod \"keystone-bootstrap-vgm8x\" (UID: \"dd1e6096-3ede-4a8d-a82c-6c919dafb2d8\") " pod="openstack/keystone-bootstrap-vgm8x" Mar 10 23:21:44 crc kubenswrapper[4919]: I0310 23:21:44.983596 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd1e6096-3ede-4a8d-a82c-6c919dafb2d8-config-data\") pod \"keystone-bootstrap-vgm8x\" (UID: \"dd1e6096-3ede-4a8d-a82c-6c919dafb2d8\") " pod="openstack/keystone-bootstrap-vgm8x" Mar 10 23:21:44 crc kubenswrapper[4919]: I0310 23:21:44.983719 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/dd1e6096-3ede-4a8d-a82c-6c919dafb2d8-fernet-keys\") pod \"keystone-bootstrap-vgm8x\" (UID: \"dd1e6096-3ede-4a8d-a82c-6c919dafb2d8\") " pod="openstack/keystone-bootstrap-vgm8x" Mar 10 23:21:44 crc kubenswrapper[4919]: I0310 23:21:44.983741 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/dd1e6096-3ede-4a8d-a82c-6c919dafb2d8-credential-keys\") pod \"keystone-bootstrap-vgm8x\" (UID: \"dd1e6096-3ede-4a8d-a82c-6c919dafb2d8\") " pod="openstack/keystone-bootstrap-vgm8x" Mar 10 23:21:44 crc kubenswrapper[4919]: I0310 23:21:44.988447 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/dd1e6096-3ede-4a8d-a82c-6c919dafb2d8-fernet-keys\") pod \"keystone-bootstrap-vgm8x\" (UID: 
\"dd1e6096-3ede-4a8d-a82c-6c919dafb2d8\") " pod="openstack/keystone-bootstrap-vgm8x" Mar 10 23:21:44 crc kubenswrapper[4919]: I0310 23:21:44.988622 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd1e6096-3ede-4a8d-a82c-6c919dafb2d8-combined-ca-bundle\") pod \"keystone-bootstrap-vgm8x\" (UID: \"dd1e6096-3ede-4a8d-a82c-6c919dafb2d8\") " pod="openstack/keystone-bootstrap-vgm8x" Mar 10 23:21:44 crc kubenswrapper[4919]: I0310 23:21:44.988965 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/dd1e6096-3ede-4a8d-a82c-6c919dafb2d8-credential-keys\") pod \"keystone-bootstrap-vgm8x\" (UID: \"dd1e6096-3ede-4a8d-a82c-6c919dafb2d8\") " pod="openstack/keystone-bootstrap-vgm8x" Mar 10 23:21:44 crc kubenswrapper[4919]: I0310 23:21:44.991332 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dd1e6096-3ede-4a8d-a82c-6c919dafb2d8-scripts\") pod \"keystone-bootstrap-vgm8x\" (UID: \"dd1e6096-3ede-4a8d-a82c-6c919dafb2d8\") " pod="openstack/keystone-bootstrap-vgm8x" Mar 10 23:21:44 crc kubenswrapper[4919]: I0310 23:21:44.996807 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd1e6096-3ede-4a8d-a82c-6c919dafb2d8-config-data\") pod \"keystone-bootstrap-vgm8x\" (UID: \"dd1e6096-3ede-4a8d-a82c-6c919dafb2d8\") " pod="openstack/keystone-bootstrap-vgm8x" Mar 10 23:21:45 crc kubenswrapper[4919]: I0310 23:21:45.002134 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cv76h\" (UniqueName: \"kubernetes.io/projected/dd1e6096-3ede-4a8d-a82c-6c919dafb2d8-kube-api-access-cv76h\") pod \"keystone-bootstrap-vgm8x\" (UID: \"dd1e6096-3ede-4a8d-a82c-6c919dafb2d8\") " pod="openstack/keystone-bootstrap-vgm8x" Mar 10 23:21:45 crc kubenswrapper[4919]: I0310 
23:21:45.190487 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-vgm8x" Mar 10 23:21:45 crc kubenswrapper[4919]: I0310 23:21:45.488609 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1de63e5e-28f2-49b3-95b0-3400b96262e8" path="/var/lib/kubelet/pods/1de63e5e-28f2-49b3-95b0-3400b96262e8/volumes" Mar 10 23:21:45 crc kubenswrapper[4919]: I0310 23:21:45.663740 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-vgm8x"] Mar 10 23:21:45 crc kubenswrapper[4919]: W0310 23:21:45.669811 4919 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddd1e6096_3ede_4a8d_a82c_6c919dafb2d8.slice/crio-344e73354637dcc8ce8cac76118e39d369c4c09ec9e3358b20331bbb5310c07b WatchSource:0}: Error finding container 344e73354637dcc8ce8cac76118e39d369c4c09ec9e3358b20331bbb5310c07b: Status 404 returned error can't find the container with id 344e73354637dcc8ce8cac76118e39d369c4c09ec9e3358b20331bbb5310c07b Mar 10 23:21:46 crc kubenswrapper[4919]: I0310 23:21:46.642617 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-vgm8x" event={"ID":"dd1e6096-3ede-4a8d-a82c-6c919dafb2d8","Type":"ContainerStarted","Data":"aa1022fa595f8ec918291d8c569411d8edd6ca76b2a06abc402db412276c4b86"} Mar 10 23:21:46 crc kubenswrapper[4919]: I0310 23:21:46.642931 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-vgm8x" event={"ID":"dd1e6096-3ede-4a8d-a82c-6c919dafb2d8","Type":"ContainerStarted","Data":"344e73354637dcc8ce8cac76118e39d369c4c09ec9e3358b20331bbb5310c07b"} Mar 10 23:21:46 crc kubenswrapper[4919]: I0310 23:21:46.667553 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-vgm8x" podStartSLOduration=2.667512789 podStartE2EDuration="2.667512789s" podCreationTimestamp="2026-03-10 23:21:44 +0000 
UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 23:21:46.664663851 +0000 UTC m=+5493.906544489" watchObservedRunningTime="2026-03-10 23:21:46.667512789 +0000 UTC m=+5493.909393417" Mar 10 23:21:47 crc kubenswrapper[4919]: I0310 23:21:47.174942 4919 scope.go:117] "RemoveContainer" containerID="b979cf7db5c663c1ded688e8317bb54c8ece3d27bce5d23f0cee13bd4b69c88a" Mar 10 23:21:47 crc kubenswrapper[4919]: I0310 23:21:47.565924 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-66d5956757-c9nr5" Mar 10 23:21:47 crc kubenswrapper[4919]: I0310 23:21:47.648685 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-68577db887-c75ws"] Mar 10 23:21:47 crc kubenswrapper[4919]: I0310 23:21:47.648997 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-68577db887-c75ws" podUID="ec560d1e-297a-4ffb-bb92-8a5128c709a9" containerName="dnsmasq-dns" containerID="cri-o://89c57195f9c894baa93b0d47af1113143abf4d74d949824d51c839e05a2492ef" gracePeriod=10 Mar 10 23:21:48 crc kubenswrapper[4919]: I0310 23:21:48.132680 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-68577db887-c75ws" Mar 10 23:21:48 crc kubenswrapper[4919]: I0310 23:21:48.242946 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lhqkl\" (UniqueName: \"kubernetes.io/projected/ec560d1e-297a-4ffb-bb92-8a5128c709a9-kube-api-access-lhqkl\") pod \"ec560d1e-297a-4ffb-bb92-8a5128c709a9\" (UID: \"ec560d1e-297a-4ffb-bb92-8a5128c709a9\") " Mar 10 23:21:48 crc kubenswrapper[4919]: I0310 23:21:48.243000 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ec560d1e-297a-4ffb-bb92-8a5128c709a9-ovsdbserver-nb\") pod \"ec560d1e-297a-4ffb-bb92-8a5128c709a9\" (UID: \"ec560d1e-297a-4ffb-bb92-8a5128c709a9\") " Mar 10 23:21:48 crc kubenswrapper[4919]: I0310 23:21:48.243065 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ec560d1e-297a-4ffb-bb92-8a5128c709a9-ovsdbserver-sb\") pod \"ec560d1e-297a-4ffb-bb92-8a5128c709a9\" (UID: \"ec560d1e-297a-4ffb-bb92-8a5128c709a9\") " Mar 10 23:21:48 crc kubenswrapper[4919]: I0310 23:21:48.243126 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ec560d1e-297a-4ffb-bb92-8a5128c709a9-dns-svc\") pod \"ec560d1e-297a-4ffb-bb92-8a5128c709a9\" (UID: \"ec560d1e-297a-4ffb-bb92-8a5128c709a9\") " Mar 10 23:21:48 crc kubenswrapper[4919]: I0310 23:21:48.243218 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec560d1e-297a-4ffb-bb92-8a5128c709a9-config\") pod \"ec560d1e-297a-4ffb-bb92-8a5128c709a9\" (UID: \"ec560d1e-297a-4ffb-bb92-8a5128c709a9\") " Mar 10 23:21:48 crc kubenswrapper[4919]: I0310 23:21:48.267517 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/ec560d1e-297a-4ffb-bb92-8a5128c709a9-kube-api-access-lhqkl" (OuterVolumeSpecName: "kube-api-access-lhqkl") pod "ec560d1e-297a-4ffb-bb92-8a5128c709a9" (UID: "ec560d1e-297a-4ffb-bb92-8a5128c709a9"). InnerVolumeSpecName "kube-api-access-lhqkl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 23:21:48 crc kubenswrapper[4919]: I0310 23:21:48.285795 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec560d1e-297a-4ffb-bb92-8a5128c709a9-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ec560d1e-297a-4ffb-bb92-8a5128c709a9" (UID: "ec560d1e-297a-4ffb-bb92-8a5128c709a9"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 23:21:48 crc kubenswrapper[4919]: I0310 23:21:48.288093 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec560d1e-297a-4ffb-bb92-8a5128c709a9-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ec560d1e-297a-4ffb-bb92-8a5128c709a9" (UID: "ec560d1e-297a-4ffb-bb92-8a5128c709a9"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 23:21:48 crc kubenswrapper[4919]: I0310 23:21:48.292712 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec560d1e-297a-4ffb-bb92-8a5128c709a9-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ec560d1e-297a-4ffb-bb92-8a5128c709a9" (UID: "ec560d1e-297a-4ffb-bb92-8a5128c709a9"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 23:21:48 crc kubenswrapper[4919]: I0310 23:21:48.294492 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec560d1e-297a-4ffb-bb92-8a5128c709a9-config" (OuterVolumeSpecName: "config") pod "ec560d1e-297a-4ffb-bb92-8a5128c709a9" (UID: "ec560d1e-297a-4ffb-bb92-8a5128c709a9"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 23:21:48 crc kubenswrapper[4919]: I0310 23:21:48.344908 4919 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ec560d1e-297a-4ffb-bb92-8a5128c709a9-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 10 23:21:48 crc kubenswrapper[4919]: I0310 23:21:48.344958 4919 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ec560d1e-297a-4ffb-bb92-8a5128c709a9-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 10 23:21:48 crc kubenswrapper[4919]: I0310 23:21:48.344969 4919 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ec560d1e-297a-4ffb-bb92-8a5128c709a9-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 10 23:21:48 crc kubenswrapper[4919]: I0310 23:21:48.344978 4919 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec560d1e-297a-4ffb-bb92-8a5128c709a9-config\") on node \"crc\" DevicePath \"\"" Mar 10 23:21:48 crc kubenswrapper[4919]: I0310 23:21:48.344988 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lhqkl\" (UniqueName: \"kubernetes.io/projected/ec560d1e-297a-4ffb-bb92-8a5128c709a9-kube-api-access-lhqkl\") on node \"crc\" DevicePath \"\"" Mar 10 23:21:48 crc kubenswrapper[4919]: I0310 23:21:48.678107 4919 generic.go:334] "Generic (PLEG): container finished" podID="ec560d1e-297a-4ffb-bb92-8a5128c709a9" containerID="89c57195f9c894baa93b0d47af1113143abf4d74d949824d51c839e05a2492ef" exitCode=0 Mar 10 23:21:48 crc kubenswrapper[4919]: I0310 23:21:48.678152 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68577db887-c75ws" event={"ID":"ec560d1e-297a-4ffb-bb92-8a5128c709a9","Type":"ContainerDied","Data":"89c57195f9c894baa93b0d47af1113143abf4d74d949824d51c839e05a2492ef"} Mar 10 23:21:48 crc kubenswrapper[4919]: I0310 
23:21:48.678177 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68577db887-c75ws" event={"ID":"ec560d1e-297a-4ffb-bb92-8a5128c709a9","Type":"ContainerDied","Data":"147cdac924d2bd72e3bd1eaad62c0ffa3903f8b2ee9e4bb07bcdc67548990957"} Mar 10 23:21:48 crc kubenswrapper[4919]: I0310 23:21:48.678193 4919 scope.go:117] "RemoveContainer" containerID="89c57195f9c894baa93b0d47af1113143abf4d74d949824d51c839e05a2492ef" Mar 10 23:21:48 crc kubenswrapper[4919]: I0310 23:21:48.680093 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-68577db887-c75ws" Mar 10 23:21:48 crc kubenswrapper[4919]: I0310 23:21:48.704086 4919 scope.go:117] "RemoveContainer" containerID="a10dc40e205e001d66c066b85347f64dcf7fa171f240b40e4ac5191dda2bb755" Mar 10 23:21:48 crc kubenswrapper[4919]: I0310 23:21:48.777210 4919 scope.go:117] "RemoveContainer" containerID="89c57195f9c894baa93b0d47af1113143abf4d74d949824d51c839e05a2492ef" Mar 10 23:21:48 crc kubenswrapper[4919]: E0310 23:21:48.777655 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"89c57195f9c894baa93b0d47af1113143abf4d74d949824d51c839e05a2492ef\": container with ID starting with 89c57195f9c894baa93b0d47af1113143abf4d74d949824d51c839e05a2492ef not found: ID does not exist" containerID="89c57195f9c894baa93b0d47af1113143abf4d74d949824d51c839e05a2492ef" Mar 10 23:21:48 crc kubenswrapper[4919]: I0310 23:21:48.777729 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89c57195f9c894baa93b0d47af1113143abf4d74d949824d51c839e05a2492ef"} err="failed to get container status \"89c57195f9c894baa93b0d47af1113143abf4d74d949824d51c839e05a2492ef\": rpc error: code = NotFound desc = could not find container \"89c57195f9c894baa93b0d47af1113143abf4d74d949824d51c839e05a2492ef\": container with ID starting with 
89c57195f9c894baa93b0d47af1113143abf4d74d949824d51c839e05a2492ef not found: ID does not exist" Mar 10 23:21:48 crc kubenswrapper[4919]: I0310 23:21:48.777779 4919 scope.go:117] "RemoveContainer" containerID="a10dc40e205e001d66c066b85347f64dcf7fa171f240b40e4ac5191dda2bb755" Mar 10 23:21:48 crc kubenswrapper[4919]: E0310 23:21:48.778223 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a10dc40e205e001d66c066b85347f64dcf7fa171f240b40e4ac5191dda2bb755\": container with ID starting with a10dc40e205e001d66c066b85347f64dcf7fa171f240b40e4ac5191dda2bb755 not found: ID does not exist" containerID="a10dc40e205e001d66c066b85347f64dcf7fa171f240b40e4ac5191dda2bb755" Mar 10 23:21:48 crc kubenswrapper[4919]: I0310 23:21:48.778299 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a10dc40e205e001d66c066b85347f64dcf7fa171f240b40e4ac5191dda2bb755"} err="failed to get container status \"a10dc40e205e001d66c066b85347f64dcf7fa171f240b40e4ac5191dda2bb755\": rpc error: code = NotFound desc = could not find container \"a10dc40e205e001d66c066b85347f64dcf7fa171f240b40e4ac5191dda2bb755\": container with ID starting with a10dc40e205e001d66c066b85347f64dcf7fa171f240b40e4ac5191dda2bb755 not found: ID does not exist" Mar 10 23:21:48 crc kubenswrapper[4919]: I0310 23:21:48.780887 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-68577db887-c75ws"] Mar 10 23:21:48 crc kubenswrapper[4919]: I0310 23:21:48.793710 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-68577db887-c75ws"] Mar 10 23:21:49 crc kubenswrapper[4919]: I0310 23:21:49.502089 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec560d1e-297a-4ffb-bb92-8a5128c709a9" path="/var/lib/kubelet/pods/ec560d1e-297a-4ffb-bb92-8a5128c709a9/volumes" Mar 10 23:21:49 crc kubenswrapper[4919]: I0310 23:21:49.691438 4919 generic.go:334] 
"Generic (PLEG): container finished" podID="dd1e6096-3ede-4a8d-a82c-6c919dafb2d8" containerID="aa1022fa595f8ec918291d8c569411d8edd6ca76b2a06abc402db412276c4b86" exitCode=0 Mar 10 23:21:49 crc kubenswrapper[4919]: I0310 23:21:49.691538 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-vgm8x" event={"ID":"dd1e6096-3ede-4a8d-a82c-6c919dafb2d8","Type":"ContainerDied","Data":"aa1022fa595f8ec918291d8c569411d8edd6ca76b2a06abc402db412276c4b86"} Mar 10 23:21:51 crc kubenswrapper[4919]: I0310 23:21:51.078864 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-vgm8x" Mar 10 23:21:51 crc kubenswrapper[4919]: I0310 23:21:51.199229 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dd1e6096-3ede-4a8d-a82c-6c919dafb2d8-scripts\") pod \"dd1e6096-3ede-4a8d-a82c-6c919dafb2d8\" (UID: \"dd1e6096-3ede-4a8d-a82c-6c919dafb2d8\") " Mar 10 23:21:51 crc kubenswrapper[4919]: I0310 23:21:51.199319 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd1e6096-3ede-4a8d-a82c-6c919dafb2d8-combined-ca-bundle\") pod \"dd1e6096-3ede-4a8d-a82c-6c919dafb2d8\" (UID: \"dd1e6096-3ede-4a8d-a82c-6c919dafb2d8\") " Mar 10 23:21:51 crc kubenswrapper[4919]: I0310 23:21:51.199353 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/dd1e6096-3ede-4a8d-a82c-6c919dafb2d8-fernet-keys\") pod \"dd1e6096-3ede-4a8d-a82c-6c919dafb2d8\" (UID: \"dd1e6096-3ede-4a8d-a82c-6c919dafb2d8\") " Mar 10 23:21:51 crc kubenswrapper[4919]: I0310 23:21:51.199507 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/dd1e6096-3ede-4a8d-a82c-6c919dafb2d8-credential-keys\") pod 
\"dd1e6096-3ede-4a8d-a82c-6c919dafb2d8\" (UID: \"dd1e6096-3ede-4a8d-a82c-6c919dafb2d8\") " Mar 10 23:21:51 crc kubenswrapper[4919]: I0310 23:21:51.199544 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd1e6096-3ede-4a8d-a82c-6c919dafb2d8-config-data\") pod \"dd1e6096-3ede-4a8d-a82c-6c919dafb2d8\" (UID: \"dd1e6096-3ede-4a8d-a82c-6c919dafb2d8\") " Mar 10 23:21:51 crc kubenswrapper[4919]: I0310 23:21:51.199565 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cv76h\" (UniqueName: \"kubernetes.io/projected/dd1e6096-3ede-4a8d-a82c-6c919dafb2d8-kube-api-access-cv76h\") pod \"dd1e6096-3ede-4a8d-a82c-6c919dafb2d8\" (UID: \"dd1e6096-3ede-4a8d-a82c-6c919dafb2d8\") " Mar 10 23:21:51 crc kubenswrapper[4919]: I0310 23:21:51.204316 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd1e6096-3ede-4a8d-a82c-6c919dafb2d8-scripts" (OuterVolumeSpecName: "scripts") pod "dd1e6096-3ede-4a8d-a82c-6c919dafb2d8" (UID: "dd1e6096-3ede-4a8d-a82c-6c919dafb2d8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 23:21:51 crc kubenswrapper[4919]: I0310 23:21:51.204490 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd1e6096-3ede-4a8d-a82c-6c919dafb2d8-kube-api-access-cv76h" (OuterVolumeSpecName: "kube-api-access-cv76h") pod "dd1e6096-3ede-4a8d-a82c-6c919dafb2d8" (UID: "dd1e6096-3ede-4a8d-a82c-6c919dafb2d8"). InnerVolumeSpecName "kube-api-access-cv76h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 23:21:51 crc kubenswrapper[4919]: I0310 23:21:51.204545 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd1e6096-3ede-4a8d-a82c-6c919dafb2d8-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "dd1e6096-3ede-4a8d-a82c-6c919dafb2d8" (UID: "dd1e6096-3ede-4a8d-a82c-6c919dafb2d8"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 23:21:51 crc kubenswrapper[4919]: I0310 23:21:51.205641 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd1e6096-3ede-4a8d-a82c-6c919dafb2d8-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "dd1e6096-3ede-4a8d-a82c-6c919dafb2d8" (UID: "dd1e6096-3ede-4a8d-a82c-6c919dafb2d8"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 23:21:51 crc kubenswrapper[4919]: I0310 23:21:51.225981 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd1e6096-3ede-4a8d-a82c-6c919dafb2d8-config-data" (OuterVolumeSpecName: "config-data") pod "dd1e6096-3ede-4a8d-a82c-6c919dafb2d8" (UID: "dd1e6096-3ede-4a8d-a82c-6c919dafb2d8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 23:21:51 crc kubenswrapper[4919]: I0310 23:21:51.234727 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd1e6096-3ede-4a8d-a82c-6c919dafb2d8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dd1e6096-3ede-4a8d-a82c-6c919dafb2d8" (UID: "dd1e6096-3ede-4a8d-a82c-6c919dafb2d8"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 23:21:51 crc kubenswrapper[4919]: I0310 23:21:51.301623 4919 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dd1e6096-3ede-4a8d-a82c-6c919dafb2d8-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 23:21:51 crc kubenswrapper[4919]: I0310 23:21:51.301674 4919 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd1e6096-3ede-4a8d-a82c-6c919dafb2d8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 23:21:51 crc kubenswrapper[4919]: I0310 23:21:51.301685 4919 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/dd1e6096-3ede-4a8d-a82c-6c919dafb2d8-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 10 23:21:51 crc kubenswrapper[4919]: I0310 23:21:51.301693 4919 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/dd1e6096-3ede-4a8d-a82c-6c919dafb2d8-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 10 23:21:51 crc kubenswrapper[4919]: I0310 23:21:51.301702 4919 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd1e6096-3ede-4a8d-a82c-6c919dafb2d8-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 23:21:51 crc kubenswrapper[4919]: I0310 23:21:51.301711 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cv76h\" (UniqueName: \"kubernetes.io/projected/dd1e6096-3ede-4a8d-a82c-6c919dafb2d8-kube-api-access-cv76h\") on node \"crc\" DevicePath \"\"" Mar 10 23:21:51 crc kubenswrapper[4919]: I0310 23:21:51.711286 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-vgm8x" event={"ID":"dd1e6096-3ede-4a8d-a82c-6c919dafb2d8","Type":"ContainerDied","Data":"344e73354637dcc8ce8cac76118e39d369c4c09ec9e3358b20331bbb5310c07b"} Mar 10 23:21:51 crc kubenswrapper[4919]: I0310 
23:21:51.711328 4919 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="344e73354637dcc8ce8cac76118e39d369c4c09ec9e3358b20331bbb5310c07b" Mar 10 23:21:51 crc kubenswrapper[4919]: I0310 23:21:51.711362 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-vgm8x" Mar 10 23:21:51 crc kubenswrapper[4919]: I0310 23:21:51.797920 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-7c8596f879-gt5t2"] Mar 10 23:21:51 crc kubenswrapper[4919]: E0310 23:21:51.798305 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec560d1e-297a-4ffb-bb92-8a5128c709a9" containerName="dnsmasq-dns" Mar 10 23:21:51 crc kubenswrapper[4919]: I0310 23:21:51.798322 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec560d1e-297a-4ffb-bb92-8a5128c709a9" containerName="dnsmasq-dns" Mar 10 23:21:51 crc kubenswrapper[4919]: E0310 23:21:51.798341 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec560d1e-297a-4ffb-bb92-8a5128c709a9" containerName="init" Mar 10 23:21:51 crc kubenswrapper[4919]: I0310 23:21:51.798348 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec560d1e-297a-4ffb-bb92-8a5128c709a9" containerName="init" Mar 10 23:21:51 crc kubenswrapper[4919]: E0310 23:21:51.798368 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd1e6096-3ede-4a8d-a82c-6c919dafb2d8" containerName="keystone-bootstrap" Mar 10 23:21:51 crc kubenswrapper[4919]: I0310 23:21:51.798378 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd1e6096-3ede-4a8d-a82c-6c919dafb2d8" containerName="keystone-bootstrap" Mar 10 23:21:51 crc kubenswrapper[4919]: I0310 23:21:51.798591 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd1e6096-3ede-4a8d-a82c-6c919dafb2d8" containerName="keystone-bootstrap" Mar 10 23:21:51 crc kubenswrapper[4919]: I0310 23:21:51.798616 4919 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="ec560d1e-297a-4ffb-bb92-8a5128c709a9" containerName="dnsmasq-dns" Mar 10 23:21:51 crc kubenswrapper[4919]: I0310 23:21:51.799353 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-7c8596f879-gt5t2" Mar 10 23:21:51 crc kubenswrapper[4919]: I0310 23:21:51.802067 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 10 23:21:51 crc kubenswrapper[4919]: I0310 23:21:51.802668 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 10 23:21:51 crc kubenswrapper[4919]: I0310 23:21:51.802828 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-tbx77" Mar 10 23:21:51 crc kubenswrapper[4919]: I0310 23:21:51.806913 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 10 23:21:51 crc kubenswrapper[4919]: I0310 23:21:51.806923 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Mar 10 23:21:51 crc kubenswrapper[4919]: I0310 23:21:51.808842 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Mar 10 23:21:51 crc kubenswrapper[4919]: I0310 23:21:51.809538 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-7c8596f879-gt5t2"] Mar 10 23:21:51 crc kubenswrapper[4919]: I0310 23:21:51.918332 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b314d88f-85bc-49c0-9090-8f59e1f16982-scripts\") pod \"keystone-7c8596f879-gt5t2\" (UID: \"b314d88f-85bc-49c0-9090-8f59e1f16982\") " pod="openstack/keystone-7c8596f879-gt5t2" Mar 10 23:21:51 crc kubenswrapper[4919]: I0310 23:21:51.918725 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-8j68x\" (UniqueName: \"kubernetes.io/projected/b314d88f-85bc-49c0-9090-8f59e1f16982-kube-api-access-8j68x\") pod \"keystone-7c8596f879-gt5t2\" (UID: \"b314d88f-85bc-49c0-9090-8f59e1f16982\") " pod="openstack/keystone-7c8596f879-gt5t2" Mar 10 23:21:51 crc kubenswrapper[4919]: I0310 23:21:51.918793 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b314d88f-85bc-49c0-9090-8f59e1f16982-internal-tls-certs\") pod \"keystone-7c8596f879-gt5t2\" (UID: \"b314d88f-85bc-49c0-9090-8f59e1f16982\") " pod="openstack/keystone-7c8596f879-gt5t2" Mar 10 23:21:51 crc kubenswrapper[4919]: I0310 23:21:51.918875 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b314d88f-85bc-49c0-9090-8f59e1f16982-fernet-keys\") pod \"keystone-7c8596f879-gt5t2\" (UID: \"b314d88f-85bc-49c0-9090-8f59e1f16982\") " pod="openstack/keystone-7c8596f879-gt5t2" Mar 10 23:21:51 crc kubenswrapper[4919]: I0310 23:21:51.919002 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b314d88f-85bc-49c0-9090-8f59e1f16982-public-tls-certs\") pod \"keystone-7c8596f879-gt5t2\" (UID: \"b314d88f-85bc-49c0-9090-8f59e1f16982\") " pod="openstack/keystone-7c8596f879-gt5t2" Mar 10 23:21:51 crc kubenswrapper[4919]: I0310 23:21:51.919036 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b314d88f-85bc-49c0-9090-8f59e1f16982-combined-ca-bundle\") pod \"keystone-7c8596f879-gt5t2\" (UID: \"b314d88f-85bc-49c0-9090-8f59e1f16982\") " pod="openstack/keystone-7c8596f879-gt5t2" Mar 10 23:21:51 crc kubenswrapper[4919]: I0310 23:21:51.919060 4919 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b314d88f-85bc-49c0-9090-8f59e1f16982-config-data\") pod \"keystone-7c8596f879-gt5t2\" (UID: \"b314d88f-85bc-49c0-9090-8f59e1f16982\") " pod="openstack/keystone-7c8596f879-gt5t2" Mar 10 23:21:51 crc kubenswrapper[4919]: I0310 23:21:51.919111 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b314d88f-85bc-49c0-9090-8f59e1f16982-credential-keys\") pod \"keystone-7c8596f879-gt5t2\" (UID: \"b314d88f-85bc-49c0-9090-8f59e1f16982\") " pod="openstack/keystone-7c8596f879-gt5t2" Mar 10 23:21:52 crc kubenswrapper[4919]: I0310 23:21:52.020153 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b314d88f-85bc-49c0-9090-8f59e1f16982-public-tls-certs\") pod \"keystone-7c8596f879-gt5t2\" (UID: \"b314d88f-85bc-49c0-9090-8f59e1f16982\") " pod="openstack/keystone-7c8596f879-gt5t2" Mar 10 23:21:52 crc kubenswrapper[4919]: I0310 23:21:52.020196 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b314d88f-85bc-49c0-9090-8f59e1f16982-combined-ca-bundle\") pod \"keystone-7c8596f879-gt5t2\" (UID: \"b314d88f-85bc-49c0-9090-8f59e1f16982\") " pod="openstack/keystone-7c8596f879-gt5t2" Mar 10 23:21:52 crc kubenswrapper[4919]: I0310 23:21:52.020221 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b314d88f-85bc-49c0-9090-8f59e1f16982-config-data\") pod \"keystone-7c8596f879-gt5t2\" (UID: \"b314d88f-85bc-49c0-9090-8f59e1f16982\") " pod="openstack/keystone-7c8596f879-gt5t2" Mar 10 23:21:52 crc kubenswrapper[4919]: I0310 23:21:52.020249 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"credential-keys\" (UniqueName: \"kubernetes.io/secret/b314d88f-85bc-49c0-9090-8f59e1f16982-credential-keys\") pod \"keystone-7c8596f879-gt5t2\" (UID: \"b314d88f-85bc-49c0-9090-8f59e1f16982\") " pod="openstack/keystone-7c8596f879-gt5t2" Mar 10 23:21:52 crc kubenswrapper[4919]: I0310 23:21:52.020281 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b314d88f-85bc-49c0-9090-8f59e1f16982-scripts\") pod \"keystone-7c8596f879-gt5t2\" (UID: \"b314d88f-85bc-49c0-9090-8f59e1f16982\") " pod="openstack/keystone-7c8596f879-gt5t2" Mar 10 23:21:52 crc kubenswrapper[4919]: I0310 23:21:52.020318 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8j68x\" (UniqueName: \"kubernetes.io/projected/b314d88f-85bc-49c0-9090-8f59e1f16982-kube-api-access-8j68x\") pod \"keystone-7c8596f879-gt5t2\" (UID: \"b314d88f-85bc-49c0-9090-8f59e1f16982\") " pod="openstack/keystone-7c8596f879-gt5t2" Mar 10 23:21:52 crc kubenswrapper[4919]: I0310 23:21:52.020346 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b314d88f-85bc-49c0-9090-8f59e1f16982-internal-tls-certs\") pod \"keystone-7c8596f879-gt5t2\" (UID: \"b314d88f-85bc-49c0-9090-8f59e1f16982\") " pod="openstack/keystone-7c8596f879-gt5t2" Mar 10 23:21:52 crc kubenswrapper[4919]: I0310 23:21:52.020399 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b314d88f-85bc-49c0-9090-8f59e1f16982-fernet-keys\") pod \"keystone-7c8596f879-gt5t2\" (UID: \"b314d88f-85bc-49c0-9090-8f59e1f16982\") " pod="openstack/keystone-7c8596f879-gt5t2" Mar 10 23:21:52 crc kubenswrapper[4919]: I0310 23:21:52.024494 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/b314d88f-85bc-49c0-9090-8f59e1f16982-combined-ca-bundle\") pod \"keystone-7c8596f879-gt5t2\" (UID: \"b314d88f-85bc-49c0-9090-8f59e1f16982\") " pod="openstack/keystone-7c8596f879-gt5t2" Mar 10 23:21:52 crc kubenswrapper[4919]: I0310 23:21:52.024522 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b314d88f-85bc-49c0-9090-8f59e1f16982-internal-tls-certs\") pod \"keystone-7c8596f879-gt5t2\" (UID: \"b314d88f-85bc-49c0-9090-8f59e1f16982\") " pod="openstack/keystone-7c8596f879-gt5t2" Mar 10 23:21:52 crc kubenswrapper[4919]: I0310 23:21:52.026335 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b314d88f-85bc-49c0-9090-8f59e1f16982-config-data\") pod \"keystone-7c8596f879-gt5t2\" (UID: \"b314d88f-85bc-49c0-9090-8f59e1f16982\") " pod="openstack/keystone-7c8596f879-gt5t2" Mar 10 23:21:52 crc kubenswrapper[4919]: I0310 23:21:52.026594 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b314d88f-85bc-49c0-9090-8f59e1f16982-fernet-keys\") pod \"keystone-7c8596f879-gt5t2\" (UID: \"b314d88f-85bc-49c0-9090-8f59e1f16982\") " pod="openstack/keystone-7c8596f879-gt5t2" Mar 10 23:21:52 crc kubenswrapper[4919]: I0310 23:21:52.026736 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b314d88f-85bc-49c0-9090-8f59e1f16982-credential-keys\") pod \"keystone-7c8596f879-gt5t2\" (UID: \"b314d88f-85bc-49c0-9090-8f59e1f16982\") " pod="openstack/keystone-7c8596f879-gt5t2" Mar 10 23:21:52 crc kubenswrapper[4919]: I0310 23:21:52.026965 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b314d88f-85bc-49c0-9090-8f59e1f16982-scripts\") pod \"keystone-7c8596f879-gt5t2\" (UID: 
\"b314d88f-85bc-49c0-9090-8f59e1f16982\") " pod="openstack/keystone-7c8596f879-gt5t2" Mar 10 23:21:52 crc kubenswrapper[4919]: I0310 23:21:52.027362 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b314d88f-85bc-49c0-9090-8f59e1f16982-public-tls-certs\") pod \"keystone-7c8596f879-gt5t2\" (UID: \"b314d88f-85bc-49c0-9090-8f59e1f16982\") " pod="openstack/keystone-7c8596f879-gt5t2" Mar 10 23:21:52 crc kubenswrapper[4919]: I0310 23:21:52.041070 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8j68x\" (UniqueName: \"kubernetes.io/projected/b314d88f-85bc-49c0-9090-8f59e1f16982-kube-api-access-8j68x\") pod \"keystone-7c8596f879-gt5t2\" (UID: \"b314d88f-85bc-49c0-9090-8f59e1f16982\") " pod="openstack/keystone-7c8596f879-gt5t2" Mar 10 23:21:52 crc kubenswrapper[4919]: I0310 23:21:52.118132 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-7c8596f879-gt5t2" Mar 10 23:21:52 crc kubenswrapper[4919]: I0310 23:21:52.552029 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-7c8596f879-gt5t2"] Mar 10 23:21:52 crc kubenswrapper[4919]: I0310 23:21:52.719475 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7c8596f879-gt5t2" event={"ID":"b314d88f-85bc-49c0-9090-8f59e1f16982","Type":"ContainerStarted","Data":"65b9437879e80cc9a98469720d2b1d2fb5965636032bb10fd2db9bd09fc53e9c"} Mar 10 23:21:53 crc kubenswrapper[4919]: I0310 23:21:53.730946 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7c8596f879-gt5t2" event={"ID":"b314d88f-85bc-49c0-9090-8f59e1f16982","Type":"ContainerStarted","Data":"0131f2c6e9902346d352b3b8e5ba6584fbcd43f7dac366bbae0c26cf64514d4a"} Mar 10 23:21:53 crc kubenswrapper[4919]: I0310 23:21:53.731378 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-7c8596f879-gt5t2" Mar 10 
23:21:53 crc kubenswrapper[4919]: I0310 23:21:53.765248 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-7c8596f879-gt5t2" podStartSLOduration=2.765217067 podStartE2EDuration="2.765217067s" podCreationTimestamp="2026-03-10 23:21:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 23:21:53.754034192 +0000 UTC m=+5500.995914900" watchObservedRunningTime="2026-03-10 23:21:53.765217067 +0000 UTC m=+5501.007097715" Mar 10 23:22:00 crc kubenswrapper[4919]: I0310 23:22:00.145305 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553082-w8szq"] Mar 10 23:22:00 crc kubenswrapper[4919]: I0310 23:22:00.147659 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553082-w8szq" Mar 10 23:22:00 crc kubenswrapper[4919]: I0310 23:22:00.150295 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 23:22:00 crc kubenswrapper[4919]: I0310 23:22:00.150477 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 23:22:00 crc kubenswrapper[4919]: I0310 23:22:00.155701 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wbvtv" Mar 10 23:22:00 crc kubenswrapper[4919]: I0310 23:22:00.158188 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553082-w8szq"] Mar 10 23:22:00 crc kubenswrapper[4919]: I0310 23:22:00.291115 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fxkdx\" (UniqueName: \"kubernetes.io/projected/5887dac0-0c52-4936-8d4d-5616781311c8-kube-api-access-fxkdx\") pod \"auto-csr-approver-29553082-w8szq\" (UID: 
\"5887dac0-0c52-4936-8d4d-5616781311c8\") " pod="openshift-infra/auto-csr-approver-29553082-w8szq" Mar 10 23:22:00 crc kubenswrapper[4919]: I0310 23:22:00.392879 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fxkdx\" (UniqueName: \"kubernetes.io/projected/5887dac0-0c52-4936-8d4d-5616781311c8-kube-api-access-fxkdx\") pod \"auto-csr-approver-29553082-w8szq\" (UID: \"5887dac0-0c52-4936-8d4d-5616781311c8\") " pod="openshift-infra/auto-csr-approver-29553082-w8szq" Mar 10 23:22:00 crc kubenswrapper[4919]: I0310 23:22:00.418962 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fxkdx\" (UniqueName: \"kubernetes.io/projected/5887dac0-0c52-4936-8d4d-5616781311c8-kube-api-access-fxkdx\") pod \"auto-csr-approver-29553082-w8szq\" (UID: \"5887dac0-0c52-4936-8d4d-5616781311c8\") " pod="openshift-infra/auto-csr-approver-29553082-w8szq" Mar 10 23:22:00 crc kubenswrapper[4919]: I0310 23:22:00.489358 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553082-w8szq" Mar 10 23:22:00 crc kubenswrapper[4919]: I0310 23:22:00.787462 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553082-w8szq"] Mar 10 23:22:00 crc kubenswrapper[4919]: W0310 23:22:00.793139 4919 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5887dac0_0c52_4936_8d4d_5616781311c8.slice/crio-fd9a98db45933535b99159f611c50e6a1b5eede6292a8edf8cd7270fc108f189 WatchSource:0}: Error finding container fd9a98db45933535b99159f611c50e6a1b5eede6292a8edf8cd7270fc108f189: Status 404 returned error can't find the container with id fd9a98db45933535b99159f611c50e6a1b5eede6292a8edf8cd7270fc108f189 Mar 10 23:22:00 crc kubenswrapper[4919]: I0310 23:22:00.805269 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553082-w8szq" event={"ID":"5887dac0-0c52-4936-8d4d-5616781311c8","Type":"ContainerStarted","Data":"fd9a98db45933535b99159f611c50e6a1b5eede6292a8edf8cd7270fc108f189"} Mar 10 23:22:02 crc kubenswrapper[4919]: I0310 23:22:02.826378 4919 generic.go:334] "Generic (PLEG): container finished" podID="5887dac0-0c52-4936-8d4d-5616781311c8" containerID="d0ad6fa2b2d31bdf18fcff5ca063537748763fa764ab5f9529fff19af1fad576" exitCode=0 Mar 10 23:22:02 crc kubenswrapper[4919]: I0310 23:22:02.826436 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553082-w8szq" event={"ID":"5887dac0-0c52-4936-8d4d-5616781311c8","Type":"ContainerDied","Data":"d0ad6fa2b2d31bdf18fcff5ca063537748763fa764ab5f9529fff19af1fad576"} Mar 10 23:22:04 crc kubenswrapper[4919]: I0310 23:22:04.136385 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-pl5k6"] Mar 10 23:22:04 crc kubenswrapper[4919]: I0310 23:22:04.139274 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-pl5k6" Mar 10 23:22:04 crc kubenswrapper[4919]: I0310 23:22:04.142836 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-pl5k6"] Mar 10 23:22:04 crc kubenswrapper[4919]: I0310 23:22:04.262346 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d1190774-892e-40dd-a15b-b5c7e5c20ac5-utilities\") pod \"community-operators-pl5k6\" (UID: \"d1190774-892e-40dd-a15b-b5c7e5c20ac5\") " pod="openshift-marketplace/community-operators-pl5k6" Mar 10 23:22:04 crc kubenswrapper[4919]: I0310 23:22:04.262432 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d1190774-892e-40dd-a15b-b5c7e5c20ac5-catalog-content\") pod \"community-operators-pl5k6\" (UID: \"d1190774-892e-40dd-a15b-b5c7e5c20ac5\") " pod="openshift-marketplace/community-operators-pl5k6" Mar 10 23:22:04 crc kubenswrapper[4919]: I0310 23:22:04.262518 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7khzl\" (UniqueName: \"kubernetes.io/projected/d1190774-892e-40dd-a15b-b5c7e5c20ac5-kube-api-access-7khzl\") pod \"community-operators-pl5k6\" (UID: \"d1190774-892e-40dd-a15b-b5c7e5c20ac5\") " pod="openshift-marketplace/community-operators-pl5k6" Mar 10 23:22:04 crc kubenswrapper[4919]: I0310 23:22:04.288955 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553082-w8szq" Mar 10 23:22:04 crc kubenswrapper[4919]: I0310 23:22:04.363329 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fxkdx\" (UniqueName: \"kubernetes.io/projected/5887dac0-0c52-4936-8d4d-5616781311c8-kube-api-access-fxkdx\") pod \"5887dac0-0c52-4936-8d4d-5616781311c8\" (UID: \"5887dac0-0c52-4936-8d4d-5616781311c8\") " Mar 10 23:22:04 crc kubenswrapper[4919]: I0310 23:22:04.363830 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d1190774-892e-40dd-a15b-b5c7e5c20ac5-utilities\") pod \"community-operators-pl5k6\" (UID: \"d1190774-892e-40dd-a15b-b5c7e5c20ac5\") " pod="openshift-marketplace/community-operators-pl5k6" Mar 10 23:22:04 crc kubenswrapper[4919]: I0310 23:22:04.363857 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d1190774-892e-40dd-a15b-b5c7e5c20ac5-catalog-content\") pod \"community-operators-pl5k6\" (UID: \"d1190774-892e-40dd-a15b-b5c7e5c20ac5\") " pod="openshift-marketplace/community-operators-pl5k6" Mar 10 23:22:04 crc kubenswrapper[4919]: I0310 23:22:04.363883 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7khzl\" (UniqueName: \"kubernetes.io/projected/d1190774-892e-40dd-a15b-b5c7e5c20ac5-kube-api-access-7khzl\") pod \"community-operators-pl5k6\" (UID: \"d1190774-892e-40dd-a15b-b5c7e5c20ac5\") " pod="openshift-marketplace/community-operators-pl5k6" Mar 10 23:22:04 crc kubenswrapper[4919]: I0310 23:22:04.364439 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d1190774-892e-40dd-a15b-b5c7e5c20ac5-utilities\") pod \"community-operators-pl5k6\" (UID: \"d1190774-892e-40dd-a15b-b5c7e5c20ac5\") " 
pod="openshift-marketplace/community-operators-pl5k6" Mar 10 23:22:04 crc kubenswrapper[4919]: I0310 23:22:04.364525 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d1190774-892e-40dd-a15b-b5c7e5c20ac5-catalog-content\") pod \"community-operators-pl5k6\" (UID: \"d1190774-892e-40dd-a15b-b5c7e5c20ac5\") " pod="openshift-marketplace/community-operators-pl5k6" Mar 10 23:22:04 crc kubenswrapper[4919]: I0310 23:22:04.369676 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5887dac0-0c52-4936-8d4d-5616781311c8-kube-api-access-fxkdx" (OuterVolumeSpecName: "kube-api-access-fxkdx") pod "5887dac0-0c52-4936-8d4d-5616781311c8" (UID: "5887dac0-0c52-4936-8d4d-5616781311c8"). InnerVolumeSpecName "kube-api-access-fxkdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 23:22:04 crc kubenswrapper[4919]: I0310 23:22:04.382418 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7khzl\" (UniqueName: \"kubernetes.io/projected/d1190774-892e-40dd-a15b-b5c7e5c20ac5-kube-api-access-7khzl\") pod \"community-operators-pl5k6\" (UID: \"d1190774-892e-40dd-a15b-b5c7e5c20ac5\") " pod="openshift-marketplace/community-operators-pl5k6" Mar 10 23:22:04 crc kubenswrapper[4919]: I0310 23:22:04.465111 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fxkdx\" (UniqueName: \"kubernetes.io/projected/5887dac0-0c52-4936-8d4d-5616781311c8-kube-api-access-fxkdx\") on node \"crc\" DevicePath \"\"" Mar 10 23:22:04 crc kubenswrapper[4919]: I0310 23:22:04.482452 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-pl5k6" Mar 10 23:22:04 crc kubenswrapper[4919]: I0310 23:22:04.846640 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553082-w8szq" event={"ID":"5887dac0-0c52-4936-8d4d-5616781311c8","Type":"ContainerDied","Data":"fd9a98db45933535b99159f611c50e6a1b5eede6292a8edf8cd7270fc108f189"} Mar 10 23:22:04 crc kubenswrapper[4919]: I0310 23:22:04.846897 4919 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fd9a98db45933535b99159f611c50e6a1b5eede6292a8edf8cd7270fc108f189" Mar 10 23:22:04 crc kubenswrapper[4919]: I0310 23:22:04.846692 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553082-w8szq" Mar 10 23:22:04 crc kubenswrapper[4919]: I0310 23:22:04.956415 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-pl5k6"] Mar 10 23:22:04 crc kubenswrapper[4919]: W0310 23:22:04.961757 4919 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1190774_892e_40dd_a15b_b5c7e5c20ac5.slice/crio-2a7ddc06975961501ea4dd165c3d434c3ea7fbdff63e08092960f2ddef769563 WatchSource:0}: Error finding container 2a7ddc06975961501ea4dd165c3d434c3ea7fbdff63e08092960f2ddef769563: Status 404 returned error can't find the container with id 2a7ddc06975961501ea4dd165c3d434c3ea7fbdff63e08092960f2ddef769563 Mar 10 23:22:05 crc kubenswrapper[4919]: I0310 23:22:05.349249 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553076-zgksb"] Mar 10 23:22:05 crc kubenswrapper[4919]: I0310 23:22:05.354917 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553076-zgksb"] Mar 10 23:22:05 crc kubenswrapper[4919]: I0310 23:22:05.498744 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="1b20f686-1c1a-43cb-af86-20a0670592b9" path="/var/lib/kubelet/pods/1b20f686-1c1a-43cb-af86-20a0670592b9/volumes" Mar 10 23:22:05 crc kubenswrapper[4919]: I0310 23:22:05.858256 4919 generic.go:334] "Generic (PLEG): container finished" podID="d1190774-892e-40dd-a15b-b5c7e5c20ac5" containerID="2d1b44f3ec7a0d0782a29e0a01855f88e4ae2a49a8e64872355eb7875858fa16" exitCode=0 Mar 10 23:22:05 crc kubenswrapper[4919]: I0310 23:22:05.858322 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pl5k6" event={"ID":"d1190774-892e-40dd-a15b-b5c7e5c20ac5","Type":"ContainerDied","Data":"2d1b44f3ec7a0d0782a29e0a01855f88e4ae2a49a8e64872355eb7875858fa16"} Mar 10 23:22:05 crc kubenswrapper[4919]: I0310 23:22:05.858373 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pl5k6" event={"ID":"d1190774-892e-40dd-a15b-b5c7e5c20ac5","Type":"ContainerStarted","Data":"2a7ddc06975961501ea4dd165c3d434c3ea7fbdff63e08092960f2ddef769563"} Mar 10 23:22:05 crc kubenswrapper[4919]: I0310 23:22:05.928255 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-sgd9s"] Mar 10 23:22:05 crc kubenswrapper[4919]: E0310 23:22:05.928670 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5887dac0-0c52-4936-8d4d-5616781311c8" containerName="oc" Mar 10 23:22:05 crc kubenswrapper[4919]: I0310 23:22:05.928691 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="5887dac0-0c52-4936-8d4d-5616781311c8" containerName="oc" Mar 10 23:22:05 crc kubenswrapper[4919]: I0310 23:22:05.929857 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="5887dac0-0c52-4936-8d4d-5616781311c8" containerName="oc" Mar 10 23:22:05 crc kubenswrapper[4919]: I0310 23:22:05.931849 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sgd9s" Mar 10 23:22:05 crc kubenswrapper[4919]: I0310 23:22:05.947853 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-sgd9s"] Mar 10 23:22:06 crc kubenswrapper[4919]: I0310 23:22:06.093800 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3865f2ad-9f56-4d05-a6ef-2bc21bb20548-catalog-content\") pod \"redhat-marketplace-sgd9s\" (UID: \"3865f2ad-9f56-4d05-a6ef-2bc21bb20548\") " pod="openshift-marketplace/redhat-marketplace-sgd9s" Mar 10 23:22:06 crc kubenswrapper[4919]: I0310 23:22:06.094214 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3865f2ad-9f56-4d05-a6ef-2bc21bb20548-utilities\") pod \"redhat-marketplace-sgd9s\" (UID: \"3865f2ad-9f56-4d05-a6ef-2bc21bb20548\") " pod="openshift-marketplace/redhat-marketplace-sgd9s" Mar 10 23:22:06 crc kubenswrapper[4919]: I0310 23:22:06.094310 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rj52x\" (UniqueName: \"kubernetes.io/projected/3865f2ad-9f56-4d05-a6ef-2bc21bb20548-kube-api-access-rj52x\") pod \"redhat-marketplace-sgd9s\" (UID: \"3865f2ad-9f56-4d05-a6ef-2bc21bb20548\") " pod="openshift-marketplace/redhat-marketplace-sgd9s" Mar 10 23:22:06 crc kubenswrapper[4919]: I0310 23:22:06.195862 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3865f2ad-9f56-4d05-a6ef-2bc21bb20548-utilities\") pod \"redhat-marketplace-sgd9s\" (UID: \"3865f2ad-9f56-4d05-a6ef-2bc21bb20548\") " pod="openshift-marketplace/redhat-marketplace-sgd9s" Mar 10 23:22:06 crc kubenswrapper[4919]: I0310 23:22:06.195990 4919 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-rj52x\" (UniqueName: \"kubernetes.io/projected/3865f2ad-9f56-4d05-a6ef-2bc21bb20548-kube-api-access-rj52x\") pod \"redhat-marketplace-sgd9s\" (UID: \"3865f2ad-9f56-4d05-a6ef-2bc21bb20548\") " pod="openshift-marketplace/redhat-marketplace-sgd9s" Mar 10 23:22:06 crc kubenswrapper[4919]: I0310 23:22:06.196500 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3865f2ad-9f56-4d05-a6ef-2bc21bb20548-utilities\") pod \"redhat-marketplace-sgd9s\" (UID: \"3865f2ad-9f56-4d05-a6ef-2bc21bb20548\") " pod="openshift-marketplace/redhat-marketplace-sgd9s" Mar 10 23:22:06 crc kubenswrapper[4919]: I0310 23:22:06.196552 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3865f2ad-9f56-4d05-a6ef-2bc21bb20548-catalog-content\") pod \"redhat-marketplace-sgd9s\" (UID: \"3865f2ad-9f56-4d05-a6ef-2bc21bb20548\") " pod="openshift-marketplace/redhat-marketplace-sgd9s" Mar 10 23:22:06 crc kubenswrapper[4919]: I0310 23:22:06.196589 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3865f2ad-9f56-4d05-a6ef-2bc21bb20548-catalog-content\") pod \"redhat-marketplace-sgd9s\" (UID: \"3865f2ad-9f56-4d05-a6ef-2bc21bb20548\") " pod="openshift-marketplace/redhat-marketplace-sgd9s" Mar 10 23:22:06 crc kubenswrapper[4919]: I0310 23:22:06.219305 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rj52x\" (UniqueName: \"kubernetes.io/projected/3865f2ad-9f56-4d05-a6ef-2bc21bb20548-kube-api-access-rj52x\") pod \"redhat-marketplace-sgd9s\" (UID: \"3865f2ad-9f56-4d05-a6ef-2bc21bb20548\") " pod="openshift-marketplace/redhat-marketplace-sgd9s" Mar 10 23:22:06 crc kubenswrapper[4919]: I0310 23:22:06.279441 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sgd9s" Mar 10 23:22:06 crc kubenswrapper[4919]: I0310 23:22:06.754103 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-sgd9s"] Mar 10 23:22:06 crc kubenswrapper[4919]: I0310 23:22:06.873188 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pl5k6" event={"ID":"d1190774-892e-40dd-a15b-b5c7e5c20ac5","Type":"ContainerStarted","Data":"f130cc740a098fe02ff1a9cb8a2c0fc68e147c1c4a6f9736b819f78d42490de7"} Mar 10 23:22:06 crc kubenswrapper[4919]: I0310 23:22:06.875827 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sgd9s" event={"ID":"3865f2ad-9f56-4d05-a6ef-2bc21bb20548","Type":"ContainerStarted","Data":"c08c46c96e55b347ecd5b7b53720757b511b222aa01437bbc686649fb5cf0846"} Mar 10 23:22:07 crc kubenswrapper[4919]: I0310 23:22:07.320714 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-s9czm"] Mar 10 23:22:07 crc kubenswrapper[4919]: I0310 23:22:07.322281 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-s9czm" Mar 10 23:22:07 crc kubenswrapper[4919]: I0310 23:22:07.345307 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-s9czm"] Mar 10 23:22:07 crc kubenswrapper[4919]: I0310 23:22:07.418698 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f6ee806-3545-424a-8b52-3116d438d035-utilities\") pod \"certified-operators-s9czm\" (UID: \"2f6ee806-3545-424a-8b52-3116d438d035\") " pod="openshift-marketplace/certified-operators-s9czm" Mar 10 23:22:07 crc kubenswrapper[4919]: I0310 23:22:07.418775 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f6ee806-3545-424a-8b52-3116d438d035-catalog-content\") pod \"certified-operators-s9czm\" (UID: \"2f6ee806-3545-424a-8b52-3116d438d035\") " pod="openshift-marketplace/certified-operators-s9czm" Mar 10 23:22:07 crc kubenswrapper[4919]: I0310 23:22:07.418877 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twdqh\" (UniqueName: \"kubernetes.io/projected/2f6ee806-3545-424a-8b52-3116d438d035-kube-api-access-twdqh\") pod \"certified-operators-s9czm\" (UID: \"2f6ee806-3545-424a-8b52-3116d438d035\") " pod="openshift-marketplace/certified-operators-s9czm" Mar 10 23:22:07 crc kubenswrapper[4919]: I0310 23:22:07.520137 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f6ee806-3545-424a-8b52-3116d438d035-utilities\") pod \"certified-operators-s9czm\" (UID: \"2f6ee806-3545-424a-8b52-3116d438d035\") " pod="openshift-marketplace/certified-operators-s9czm" Mar 10 23:22:07 crc kubenswrapper[4919]: I0310 23:22:07.520263 4919 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f6ee806-3545-424a-8b52-3116d438d035-catalog-content\") pod \"certified-operators-s9czm\" (UID: \"2f6ee806-3545-424a-8b52-3116d438d035\") " pod="openshift-marketplace/certified-operators-s9czm" Mar 10 23:22:07 crc kubenswrapper[4919]: I0310 23:22:07.520356 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-twdqh\" (UniqueName: \"kubernetes.io/projected/2f6ee806-3545-424a-8b52-3116d438d035-kube-api-access-twdqh\") pod \"certified-operators-s9czm\" (UID: \"2f6ee806-3545-424a-8b52-3116d438d035\") " pod="openshift-marketplace/certified-operators-s9czm" Mar 10 23:22:07 crc kubenswrapper[4919]: I0310 23:22:07.520924 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f6ee806-3545-424a-8b52-3116d438d035-catalog-content\") pod \"certified-operators-s9czm\" (UID: \"2f6ee806-3545-424a-8b52-3116d438d035\") " pod="openshift-marketplace/certified-operators-s9czm" Mar 10 23:22:07 crc kubenswrapper[4919]: I0310 23:22:07.521285 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f6ee806-3545-424a-8b52-3116d438d035-utilities\") pod \"certified-operators-s9czm\" (UID: \"2f6ee806-3545-424a-8b52-3116d438d035\") " pod="openshift-marketplace/certified-operators-s9czm" Mar 10 23:22:07 crc kubenswrapper[4919]: I0310 23:22:07.541436 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-twdqh\" (UniqueName: \"kubernetes.io/projected/2f6ee806-3545-424a-8b52-3116d438d035-kube-api-access-twdqh\") pod \"certified-operators-s9czm\" (UID: \"2f6ee806-3545-424a-8b52-3116d438d035\") " pod="openshift-marketplace/certified-operators-s9czm" Mar 10 23:22:07 crc kubenswrapper[4919]: I0310 23:22:07.641462 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-s9czm" Mar 10 23:22:07 crc kubenswrapper[4919]: I0310 23:22:07.889547 4919 generic.go:334] "Generic (PLEG): container finished" podID="d1190774-892e-40dd-a15b-b5c7e5c20ac5" containerID="f130cc740a098fe02ff1a9cb8a2c0fc68e147c1c4a6f9736b819f78d42490de7" exitCode=0 Mar 10 23:22:07 crc kubenswrapper[4919]: I0310 23:22:07.889613 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pl5k6" event={"ID":"d1190774-892e-40dd-a15b-b5c7e5c20ac5","Type":"ContainerDied","Data":"f130cc740a098fe02ff1a9cb8a2c0fc68e147c1c4a6f9736b819f78d42490de7"} Mar 10 23:22:07 crc kubenswrapper[4919]: I0310 23:22:07.899829 4919 generic.go:334] "Generic (PLEG): container finished" podID="3865f2ad-9f56-4d05-a6ef-2bc21bb20548" containerID="900a548b3240ff7d1bc101399522ec8600966f523bb232ec6738ac4702705f35" exitCode=0 Mar 10 23:22:07 crc kubenswrapper[4919]: I0310 23:22:07.899889 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sgd9s" event={"ID":"3865f2ad-9f56-4d05-a6ef-2bc21bb20548","Type":"ContainerDied","Data":"900a548b3240ff7d1bc101399522ec8600966f523bb232ec6738ac4702705f35"} Mar 10 23:22:08 crc kubenswrapper[4919]: I0310 23:22:08.109740 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-s9czm"] Mar 10 23:22:08 crc kubenswrapper[4919]: W0310 23:22:08.113216 4919 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2f6ee806_3545_424a_8b52_3116d438d035.slice/crio-ff7271f4d2baad492065a1f902f9097351fd463b4be501f7ca4fd57b740b6326 WatchSource:0}: Error finding container ff7271f4d2baad492065a1f902f9097351fd463b4be501f7ca4fd57b740b6326: Status 404 returned error can't find the container with id ff7271f4d2baad492065a1f902f9097351fd463b4be501f7ca4fd57b740b6326 Mar 10 23:22:08 crc kubenswrapper[4919]: I0310 
23:22:08.910940 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pl5k6" event={"ID":"d1190774-892e-40dd-a15b-b5c7e5c20ac5","Type":"ContainerStarted","Data":"016909a4588a1bc004ad0783c23141edd278ce948a36f998cbac50e8917c4ddd"} Mar 10 23:22:08 crc kubenswrapper[4919]: I0310 23:22:08.912978 4919 generic.go:334] "Generic (PLEG): container finished" podID="3865f2ad-9f56-4d05-a6ef-2bc21bb20548" containerID="0aca5fd2233f3fa0926ffd03cfc8899100cda41863ae788128375a8322615f7a" exitCode=0 Mar 10 23:22:08 crc kubenswrapper[4919]: I0310 23:22:08.913028 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sgd9s" event={"ID":"3865f2ad-9f56-4d05-a6ef-2bc21bb20548","Type":"ContainerDied","Data":"0aca5fd2233f3fa0926ffd03cfc8899100cda41863ae788128375a8322615f7a"} Mar 10 23:22:08 crc kubenswrapper[4919]: I0310 23:22:08.920455 4919 generic.go:334] "Generic (PLEG): container finished" podID="2f6ee806-3545-424a-8b52-3116d438d035" containerID="b09777193452a02e60f615a9cdab206f44857ab64ecd7d2f177ece6fe0a4594d" exitCode=0 Mar 10 23:22:08 crc kubenswrapper[4919]: I0310 23:22:08.920503 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s9czm" event={"ID":"2f6ee806-3545-424a-8b52-3116d438d035","Type":"ContainerDied","Data":"b09777193452a02e60f615a9cdab206f44857ab64ecd7d2f177ece6fe0a4594d"} Mar 10 23:22:08 crc kubenswrapper[4919]: I0310 23:22:08.920530 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s9czm" event={"ID":"2f6ee806-3545-424a-8b52-3116d438d035","Type":"ContainerStarted","Data":"ff7271f4d2baad492065a1f902f9097351fd463b4be501f7ca4fd57b740b6326"} Mar 10 23:22:08 crc kubenswrapper[4919]: I0310 23:22:08.934511 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-pl5k6" podStartSLOduration=2.507539989 
podStartE2EDuration="4.934489983s" podCreationTimestamp="2026-03-10 23:22:04 +0000 UTC" firstStartedPulling="2026-03-10 23:22:05.861168856 +0000 UTC m=+5513.103049474" lastFinishedPulling="2026-03-10 23:22:08.28811885 +0000 UTC m=+5515.529999468" observedRunningTime="2026-03-10 23:22:08.928107269 +0000 UTC m=+5516.169987877" watchObservedRunningTime="2026-03-10 23:22:08.934489983 +0000 UTC m=+5516.176370601" Mar 10 23:22:09 crc kubenswrapper[4919]: I0310 23:22:09.931543 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sgd9s" event={"ID":"3865f2ad-9f56-4d05-a6ef-2bc21bb20548","Type":"ContainerStarted","Data":"19ab55a66150e5b0461b34187e5f3d9f38aa9284b10141673c938126eeae6883"} Mar 10 23:22:09 crc kubenswrapper[4919]: I0310 23:22:09.957507 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-sgd9s" podStartSLOduration=3.504245579 podStartE2EDuration="4.957486819s" podCreationTimestamp="2026-03-10 23:22:05 +0000 UTC" firstStartedPulling="2026-03-10 23:22:07.903243302 +0000 UTC m=+5515.145123910" lastFinishedPulling="2026-03-10 23:22:09.356484542 +0000 UTC m=+5516.598365150" observedRunningTime="2026-03-10 23:22:09.949008948 +0000 UTC m=+5517.190889546" watchObservedRunningTime="2026-03-10 23:22:09.957486819 +0000 UTC m=+5517.199367437" Mar 10 23:22:13 crc kubenswrapper[4919]: I0310 23:22:13.985294 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s9czm" event={"ID":"2f6ee806-3545-424a-8b52-3116d438d035","Type":"ContainerStarted","Data":"dffadb6c9666c869ccb6b3e34be666b87b340b2a79dbeb7eec5c83f40c860cfe"} Mar 10 23:22:14 crc kubenswrapper[4919]: I0310 23:22:14.483607 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-pl5k6" Mar 10 23:22:14 crc kubenswrapper[4919]: I0310 23:22:14.483714 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/community-operators-pl5k6" Mar 10 23:22:14 crc kubenswrapper[4919]: I0310 23:22:14.641817 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-pl5k6" Mar 10 23:22:14 crc kubenswrapper[4919]: I0310 23:22:14.994641 4919 generic.go:334] "Generic (PLEG): container finished" podID="2f6ee806-3545-424a-8b52-3116d438d035" containerID="dffadb6c9666c869ccb6b3e34be666b87b340b2a79dbeb7eec5c83f40c860cfe" exitCode=0 Mar 10 23:22:14 crc kubenswrapper[4919]: I0310 23:22:14.996073 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s9czm" event={"ID":"2f6ee806-3545-424a-8b52-3116d438d035","Type":"ContainerDied","Data":"dffadb6c9666c869ccb6b3e34be666b87b340b2a79dbeb7eec5c83f40c860cfe"} Mar 10 23:22:15 crc kubenswrapper[4919]: I0310 23:22:15.066474 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-pl5k6" Mar 10 23:22:16 crc kubenswrapper[4919]: I0310 23:22:16.039787 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s9czm" event={"ID":"2f6ee806-3545-424a-8b52-3116d438d035","Type":"ContainerStarted","Data":"7fc2d9af48dcda3f372b6b7849e14fb71bdd1d7431608fa806110335b8849bd4"} Mar 10 23:22:16 crc kubenswrapper[4919]: I0310 23:22:16.060596 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-s9czm" podStartSLOduration=2.522296351 podStartE2EDuration="9.060575086s" podCreationTimestamp="2026-03-10 23:22:07 +0000 UTC" firstStartedPulling="2026-03-10 23:22:08.922727242 +0000 UTC m=+5516.164607850" lastFinishedPulling="2026-03-10 23:22:15.461005957 +0000 UTC m=+5522.702886585" observedRunningTime="2026-03-10 23:22:16.057718928 +0000 UTC m=+5523.299599546" watchObservedRunningTime="2026-03-10 23:22:16.060575086 +0000 UTC m=+5523.302455704" Mar 10 
23:22:16 crc kubenswrapper[4919]: I0310 23:22:16.280428 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-sgd9s" Mar 10 23:22:16 crc kubenswrapper[4919]: I0310 23:22:16.280493 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-sgd9s" Mar 10 23:22:16 crc kubenswrapper[4919]: I0310 23:22:16.321368 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-sgd9s" Mar 10 23:22:16 crc kubenswrapper[4919]: I0310 23:22:16.719456 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-pl5k6"] Mar 10 23:22:17 crc kubenswrapper[4919]: I0310 23:22:17.048936 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-pl5k6" podUID="d1190774-892e-40dd-a15b-b5c7e5c20ac5" containerName="registry-server" containerID="cri-o://016909a4588a1bc004ad0783c23141edd278ce948a36f998cbac50e8917c4ddd" gracePeriod=2 Mar 10 23:22:17 crc kubenswrapper[4919]: I0310 23:22:17.092103 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-sgd9s" Mar 10 23:22:17 crc kubenswrapper[4919]: I0310 23:22:17.516884 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-pl5k6" Mar 10 23:22:17 crc kubenswrapper[4919]: I0310 23:22:17.613255 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d1190774-892e-40dd-a15b-b5c7e5c20ac5-catalog-content\") pod \"d1190774-892e-40dd-a15b-b5c7e5c20ac5\" (UID: \"d1190774-892e-40dd-a15b-b5c7e5c20ac5\") " Mar 10 23:22:17 crc kubenswrapper[4919]: I0310 23:22:17.613315 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d1190774-892e-40dd-a15b-b5c7e5c20ac5-utilities\") pod \"d1190774-892e-40dd-a15b-b5c7e5c20ac5\" (UID: \"d1190774-892e-40dd-a15b-b5c7e5c20ac5\") " Mar 10 23:22:17 crc kubenswrapper[4919]: I0310 23:22:17.613403 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7khzl\" (UniqueName: \"kubernetes.io/projected/d1190774-892e-40dd-a15b-b5c7e5c20ac5-kube-api-access-7khzl\") pod \"d1190774-892e-40dd-a15b-b5c7e5c20ac5\" (UID: \"d1190774-892e-40dd-a15b-b5c7e5c20ac5\") " Mar 10 23:22:17 crc kubenswrapper[4919]: I0310 23:22:17.614585 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d1190774-892e-40dd-a15b-b5c7e5c20ac5-utilities" (OuterVolumeSpecName: "utilities") pod "d1190774-892e-40dd-a15b-b5c7e5c20ac5" (UID: "d1190774-892e-40dd-a15b-b5c7e5c20ac5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 23:22:17 crc kubenswrapper[4919]: I0310 23:22:17.622179 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1190774-892e-40dd-a15b-b5c7e5c20ac5-kube-api-access-7khzl" (OuterVolumeSpecName: "kube-api-access-7khzl") pod "d1190774-892e-40dd-a15b-b5c7e5c20ac5" (UID: "d1190774-892e-40dd-a15b-b5c7e5c20ac5"). InnerVolumeSpecName "kube-api-access-7khzl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 23:22:17 crc kubenswrapper[4919]: I0310 23:22:17.642699 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-s9czm" Mar 10 23:22:17 crc kubenswrapper[4919]: I0310 23:22:17.642762 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-s9czm" Mar 10 23:22:17 crc kubenswrapper[4919]: I0310 23:22:17.661910 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d1190774-892e-40dd-a15b-b5c7e5c20ac5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d1190774-892e-40dd-a15b-b5c7e5c20ac5" (UID: "d1190774-892e-40dd-a15b-b5c7e5c20ac5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 23:22:17 crc kubenswrapper[4919]: I0310 23:22:17.715784 4919 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d1190774-892e-40dd-a15b-b5c7e5c20ac5-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 23:22:17 crc kubenswrapper[4919]: I0310 23:22:17.715821 4919 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d1190774-892e-40dd-a15b-b5c7e5c20ac5-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 23:22:17 crc kubenswrapper[4919]: I0310 23:22:17.715835 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7khzl\" (UniqueName: \"kubernetes.io/projected/d1190774-892e-40dd-a15b-b5c7e5c20ac5-kube-api-access-7khzl\") on node \"crc\" DevicePath \"\"" Mar 10 23:22:18 crc kubenswrapper[4919]: I0310 23:22:18.060530 4919 generic.go:334] "Generic (PLEG): container finished" podID="d1190774-892e-40dd-a15b-b5c7e5c20ac5" containerID="016909a4588a1bc004ad0783c23141edd278ce948a36f998cbac50e8917c4ddd" exitCode=0 Mar 10 23:22:18 crc kubenswrapper[4919]: I0310 
23:22:18.060614 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pl5k6" event={"ID":"d1190774-892e-40dd-a15b-b5c7e5c20ac5","Type":"ContainerDied","Data":"016909a4588a1bc004ad0783c23141edd278ce948a36f998cbac50e8917c4ddd"} Mar 10 23:22:18 crc kubenswrapper[4919]: I0310 23:22:18.060667 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pl5k6" event={"ID":"d1190774-892e-40dd-a15b-b5c7e5c20ac5","Type":"ContainerDied","Data":"2a7ddc06975961501ea4dd165c3d434c3ea7fbdff63e08092960f2ddef769563"} Mar 10 23:22:18 crc kubenswrapper[4919]: I0310 23:22:18.060688 4919 scope.go:117] "RemoveContainer" containerID="016909a4588a1bc004ad0783c23141edd278ce948a36f998cbac50e8917c4ddd" Mar 10 23:22:18 crc kubenswrapper[4919]: I0310 23:22:18.060632 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-pl5k6" Mar 10 23:22:18 crc kubenswrapper[4919]: I0310 23:22:18.096280 4919 scope.go:117] "RemoveContainer" containerID="f130cc740a098fe02ff1a9cb8a2c0fc68e147c1c4a6f9736b819f78d42490de7" Mar 10 23:22:18 crc kubenswrapper[4919]: I0310 23:22:18.108768 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-pl5k6"] Mar 10 23:22:18 crc kubenswrapper[4919]: I0310 23:22:18.116379 4919 scope.go:117] "RemoveContainer" containerID="2d1b44f3ec7a0d0782a29e0a01855f88e4ae2a49a8e64872355eb7875858fa16" Mar 10 23:22:18 crc kubenswrapper[4919]: I0310 23:22:18.118473 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-pl5k6"] Mar 10 23:22:18 crc kubenswrapper[4919]: I0310 23:22:18.177746 4919 scope.go:117] "RemoveContainer" containerID="016909a4588a1bc004ad0783c23141edd278ce948a36f998cbac50e8917c4ddd" Mar 10 23:22:18 crc kubenswrapper[4919]: E0310 23:22:18.178286 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: 
code = NotFound desc = could not find container \"016909a4588a1bc004ad0783c23141edd278ce948a36f998cbac50e8917c4ddd\": container with ID starting with 016909a4588a1bc004ad0783c23141edd278ce948a36f998cbac50e8917c4ddd not found: ID does not exist" containerID="016909a4588a1bc004ad0783c23141edd278ce948a36f998cbac50e8917c4ddd" Mar 10 23:22:18 crc kubenswrapper[4919]: I0310 23:22:18.178333 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"016909a4588a1bc004ad0783c23141edd278ce948a36f998cbac50e8917c4ddd"} err="failed to get container status \"016909a4588a1bc004ad0783c23141edd278ce948a36f998cbac50e8917c4ddd\": rpc error: code = NotFound desc = could not find container \"016909a4588a1bc004ad0783c23141edd278ce948a36f998cbac50e8917c4ddd\": container with ID starting with 016909a4588a1bc004ad0783c23141edd278ce948a36f998cbac50e8917c4ddd not found: ID does not exist" Mar 10 23:22:18 crc kubenswrapper[4919]: I0310 23:22:18.178365 4919 scope.go:117] "RemoveContainer" containerID="f130cc740a098fe02ff1a9cb8a2c0fc68e147c1c4a6f9736b819f78d42490de7" Mar 10 23:22:18 crc kubenswrapper[4919]: E0310 23:22:18.178945 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f130cc740a098fe02ff1a9cb8a2c0fc68e147c1c4a6f9736b819f78d42490de7\": container with ID starting with f130cc740a098fe02ff1a9cb8a2c0fc68e147c1c4a6f9736b819f78d42490de7 not found: ID does not exist" containerID="f130cc740a098fe02ff1a9cb8a2c0fc68e147c1c4a6f9736b819f78d42490de7" Mar 10 23:22:18 crc kubenswrapper[4919]: I0310 23:22:18.178970 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f130cc740a098fe02ff1a9cb8a2c0fc68e147c1c4a6f9736b819f78d42490de7"} err="failed to get container status \"f130cc740a098fe02ff1a9cb8a2c0fc68e147c1c4a6f9736b819f78d42490de7\": rpc error: code = NotFound desc = could not find container 
\"f130cc740a098fe02ff1a9cb8a2c0fc68e147c1c4a6f9736b819f78d42490de7\": container with ID starting with f130cc740a098fe02ff1a9cb8a2c0fc68e147c1c4a6f9736b819f78d42490de7 not found: ID does not exist" Mar 10 23:22:18 crc kubenswrapper[4919]: I0310 23:22:18.178985 4919 scope.go:117] "RemoveContainer" containerID="2d1b44f3ec7a0d0782a29e0a01855f88e4ae2a49a8e64872355eb7875858fa16" Mar 10 23:22:18 crc kubenswrapper[4919]: E0310 23:22:18.179536 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d1b44f3ec7a0d0782a29e0a01855f88e4ae2a49a8e64872355eb7875858fa16\": container with ID starting with 2d1b44f3ec7a0d0782a29e0a01855f88e4ae2a49a8e64872355eb7875858fa16 not found: ID does not exist" containerID="2d1b44f3ec7a0d0782a29e0a01855f88e4ae2a49a8e64872355eb7875858fa16" Mar 10 23:22:18 crc kubenswrapper[4919]: I0310 23:22:18.179562 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d1b44f3ec7a0d0782a29e0a01855f88e4ae2a49a8e64872355eb7875858fa16"} err="failed to get container status \"2d1b44f3ec7a0d0782a29e0a01855f88e4ae2a49a8e64872355eb7875858fa16\": rpc error: code = NotFound desc = could not find container \"2d1b44f3ec7a0d0782a29e0a01855f88e4ae2a49a8e64872355eb7875858fa16\": container with ID starting with 2d1b44f3ec7a0d0782a29e0a01855f88e4ae2a49a8e64872355eb7875858fa16 not found: ID does not exist" Mar 10 23:22:18 crc kubenswrapper[4919]: I0310 23:22:18.684507 4919 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-s9czm" podUID="2f6ee806-3545-424a-8b52-3116d438d035" containerName="registry-server" probeResult="failure" output=< Mar 10 23:22:18 crc kubenswrapper[4919]: timeout: failed to connect service ":50051" within 1s Mar 10 23:22:18 crc kubenswrapper[4919]: > Mar 10 23:22:19 crc kubenswrapper[4919]: I0310 23:22:19.114899 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-sgd9s"] Mar 10 23:22:19 crc kubenswrapper[4919]: I0310 23:22:19.115619 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-sgd9s" podUID="3865f2ad-9f56-4d05-a6ef-2bc21bb20548" containerName="registry-server" containerID="cri-o://19ab55a66150e5b0461b34187e5f3d9f38aa9284b10141673c938126eeae6883" gracePeriod=2 Mar 10 23:22:19 crc kubenswrapper[4919]: I0310 23:22:19.498448 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d1190774-892e-40dd-a15b-b5c7e5c20ac5" path="/var/lib/kubelet/pods/d1190774-892e-40dd-a15b-b5c7e5c20ac5/volumes" Mar 10 23:22:20 crc kubenswrapper[4919]: I0310 23:22:20.078616 4919 generic.go:334] "Generic (PLEG): container finished" podID="3865f2ad-9f56-4d05-a6ef-2bc21bb20548" containerID="19ab55a66150e5b0461b34187e5f3d9f38aa9284b10141673c938126eeae6883" exitCode=0 Mar 10 23:22:20 crc kubenswrapper[4919]: I0310 23:22:20.078814 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sgd9s" event={"ID":"3865f2ad-9f56-4d05-a6ef-2bc21bb20548","Type":"ContainerDied","Data":"19ab55a66150e5b0461b34187e5f3d9f38aa9284b10141673c938126eeae6883"} Mar 10 23:22:20 crc kubenswrapper[4919]: I0310 23:22:20.680765 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sgd9s" Mar 10 23:22:20 crc kubenswrapper[4919]: I0310 23:22:20.766926 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3865f2ad-9f56-4d05-a6ef-2bc21bb20548-catalog-content\") pod \"3865f2ad-9f56-4d05-a6ef-2bc21bb20548\" (UID: \"3865f2ad-9f56-4d05-a6ef-2bc21bb20548\") " Mar 10 23:22:20 crc kubenswrapper[4919]: I0310 23:22:20.767045 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3865f2ad-9f56-4d05-a6ef-2bc21bb20548-utilities\") pod \"3865f2ad-9f56-4d05-a6ef-2bc21bb20548\" (UID: \"3865f2ad-9f56-4d05-a6ef-2bc21bb20548\") " Mar 10 23:22:20 crc kubenswrapper[4919]: I0310 23:22:20.767233 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rj52x\" (UniqueName: \"kubernetes.io/projected/3865f2ad-9f56-4d05-a6ef-2bc21bb20548-kube-api-access-rj52x\") pod \"3865f2ad-9f56-4d05-a6ef-2bc21bb20548\" (UID: \"3865f2ad-9f56-4d05-a6ef-2bc21bb20548\") " Mar 10 23:22:20 crc kubenswrapper[4919]: I0310 23:22:20.767980 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3865f2ad-9f56-4d05-a6ef-2bc21bb20548-utilities" (OuterVolumeSpecName: "utilities") pod "3865f2ad-9f56-4d05-a6ef-2bc21bb20548" (UID: "3865f2ad-9f56-4d05-a6ef-2bc21bb20548"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 23:22:20 crc kubenswrapper[4919]: I0310 23:22:20.773196 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3865f2ad-9f56-4d05-a6ef-2bc21bb20548-kube-api-access-rj52x" (OuterVolumeSpecName: "kube-api-access-rj52x") pod "3865f2ad-9f56-4d05-a6ef-2bc21bb20548" (UID: "3865f2ad-9f56-4d05-a6ef-2bc21bb20548"). InnerVolumeSpecName "kube-api-access-rj52x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 23:22:20 crc kubenswrapper[4919]: I0310 23:22:20.792730 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3865f2ad-9f56-4d05-a6ef-2bc21bb20548-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3865f2ad-9f56-4d05-a6ef-2bc21bb20548" (UID: "3865f2ad-9f56-4d05-a6ef-2bc21bb20548"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 23:22:20 crc kubenswrapper[4919]: I0310 23:22:20.869059 4919 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3865f2ad-9f56-4d05-a6ef-2bc21bb20548-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 23:22:20 crc kubenswrapper[4919]: I0310 23:22:20.869087 4919 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3865f2ad-9f56-4d05-a6ef-2bc21bb20548-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 23:22:20 crc kubenswrapper[4919]: I0310 23:22:20.869100 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rj52x\" (UniqueName: \"kubernetes.io/projected/3865f2ad-9f56-4d05-a6ef-2bc21bb20548-kube-api-access-rj52x\") on node \"crc\" DevicePath \"\"" Mar 10 23:22:21 crc kubenswrapper[4919]: I0310 23:22:21.087101 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sgd9s" event={"ID":"3865f2ad-9f56-4d05-a6ef-2bc21bb20548","Type":"ContainerDied","Data":"c08c46c96e55b347ecd5b7b53720757b511b222aa01437bbc686649fb5cf0846"} Mar 10 23:22:21 crc kubenswrapper[4919]: I0310 23:22:21.087206 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sgd9s" Mar 10 23:22:21 crc kubenswrapper[4919]: I0310 23:22:21.087366 4919 scope.go:117] "RemoveContainer" containerID="19ab55a66150e5b0461b34187e5f3d9f38aa9284b10141673c938126eeae6883" Mar 10 23:22:21 crc kubenswrapper[4919]: I0310 23:22:21.109688 4919 scope.go:117] "RemoveContainer" containerID="0aca5fd2233f3fa0926ffd03cfc8899100cda41863ae788128375a8322615f7a" Mar 10 23:22:21 crc kubenswrapper[4919]: I0310 23:22:21.143853 4919 scope.go:117] "RemoveContainer" containerID="900a548b3240ff7d1bc101399522ec8600966f523bb232ec6738ac4702705f35" Mar 10 23:22:21 crc kubenswrapper[4919]: I0310 23:22:21.151561 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-sgd9s"] Mar 10 23:22:21 crc kubenswrapper[4919]: I0310 23:22:21.157020 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-sgd9s"] Mar 10 23:22:21 crc kubenswrapper[4919]: I0310 23:22:21.490501 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3865f2ad-9f56-4d05-a6ef-2bc21bb20548" path="/var/lib/kubelet/pods/3865f2ad-9f56-4d05-a6ef-2bc21bb20548/volumes" Mar 10 23:22:23 crc kubenswrapper[4919]: I0310 23:22:23.559264 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-7c8596f879-gt5t2" Mar 10 23:22:24 crc kubenswrapper[4919]: I0310 23:22:24.872938 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Mar 10 23:22:24 crc kubenswrapper[4919]: E0310 23:22:24.873645 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1190774-892e-40dd-a15b-b5c7e5c20ac5" containerName="extract-content" Mar 10 23:22:24 crc kubenswrapper[4919]: I0310 23:22:24.873668 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1190774-892e-40dd-a15b-b5c7e5c20ac5" containerName="extract-content" Mar 10 23:22:24 crc kubenswrapper[4919]: E0310 23:22:24.873700 
4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1190774-892e-40dd-a15b-b5c7e5c20ac5" containerName="registry-server" Mar 10 23:22:24 crc kubenswrapper[4919]: I0310 23:22:24.873711 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1190774-892e-40dd-a15b-b5c7e5c20ac5" containerName="registry-server" Mar 10 23:22:24 crc kubenswrapper[4919]: E0310 23:22:24.873730 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3865f2ad-9f56-4d05-a6ef-2bc21bb20548" containerName="extract-content" Mar 10 23:22:24 crc kubenswrapper[4919]: I0310 23:22:24.873743 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="3865f2ad-9f56-4d05-a6ef-2bc21bb20548" containerName="extract-content" Mar 10 23:22:24 crc kubenswrapper[4919]: E0310 23:22:24.873764 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3865f2ad-9f56-4d05-a6ef-2bc21bb20548" containerName="extract-utilities" Mar 10 23:22:24 crc kubenswrapper[4919]: I0310 23:22:24.873776 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="3865f2ad-9f56-4d05-a6ef-2bc21bb20548" containerName="extract-utilities" Mar 10 23:22:24 crc kubenswrapper[4919]: E0310 23:22:24.873792 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3865f2ad-9f56-4d05-a6ef-2bc21bb20548" containerName="registry-server" Mar 10 23:22:24 crc kubenswrapper[4919]: I0310 23:22:24.873802 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="3865f2ad-9f56-4d05-a6ef-2bc21bb20548" containerName="registry-server" Mar 10 23:22:24 crc kubenswrapper[4919]: E0310 23:22:24.873843 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1190774-892e-40dd-a15b-b5c7e5c20ac5" containerName="extract-utilities" Mar 10 23:22:24 crc kubenswrapper[4919]: I0310 23:22:24.873854 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1190774-892e-40dd-a15b-b5c7e5c20ac5" containerName="extract-utilities" Mar 10 23:22:24 crc kubenswrapper[4919]: I0310 23:22:24.874094 4919 
memory_manager.go:354] "RemoveStaleState removing state" podUID="d1190774-892e-40dd-a15b-b5c7e5c20ac5" containerName="registry-server" Mar 10 23:22:24 crc kubenswrapper[4919]: I0310 23:22:24.874130 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="3865f2ad-9f56-4d05-a6ef-2bc21bb20548" containerName="registry-server" Mar 10 23:22:24 crc kubenswrapper[4919]: I0310 23:22:24.875094 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 10 23:22:24 crc kubenswrapper[4919]: I0310 23:22:24.880218 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Mar 10 23:22:24 crc kubenswrapper[4919]: I0310 23:22:24.880660 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-tkj6t" Mar 10 23:22:24 crc kubenswrapper[4919]: I0310 23:22:24.880855 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Mar 10 23:22:24 crc kubenswrapper[4919]: I0310 23:22:24.900620 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 10 23:22:24 crc kubenswrapper[4919]: I0310 23:22:24.936187 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/165d12a6-fb5d-4a40-a903-3d8176434969-combined-ca-bundle\") pod \"openstackclient\" (UID: \"165d12a6-fb5d-4a40-a903-3d8176434969\") " pod="openstack/openstackclient" Mar 10 23:22:24 crc kubenswrapper[4919]: I0310 23:22:24.936332 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/165d12a6-fb5d-4a40-a903-3d8176434969-openstack-config-secret\") pod \"openstackclient\" (UID: \"165d12a6-fb5d-4a40-a903-3d8176434969\") " pod="openstack/openstackclient" Mar 10 23:22:24 crc 
kubenswrapper[4919]: I0310 23:22:24.936448 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/165d12a6-fb5d-4a40-a903-3d8176434969-openstack-config\") pod \"openstackclient\" (UID: \"165d12a6-fb5d-4a40-a903-3d8176434969\") " pod="openstack/openstackclient" Mar 10 23:22:24 crc kubenswrapper[4919]: I0310 23:22:24.936484 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9wmqr\" (UniqueName: \"kubernetes.io/projected/165d12a6-fb5d-4a40-a903-3d8176434969-kube-api-access-9wmqr\") pod \"openstackclient\" (UID: \"165d12a6-fb5d-4a40-a903-3d8176434969\") " pod="openstack/openstackclient" Mar 10 23:22:25 crc kubenswrapper[4919]: I0310 23:22:25.039353 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/165d12a6-fb5d-4a40-a903-3d8176434969-openstack-config-secret\") pod \"openstackclient\" (UID: \"165d12a6-fb5d-4a40-a903-3d8176434969\") " pod="openstack/openstackclient" Mar 10 23:22:25 crc kubenswrapper[4919]: I0310 23:22:25.039491 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/165d12a6-fb5d-4a40-a903-3d8176434969-openstack-config\") pod \"openstackclient\" (UID: \"165d12a6-fb5d-4a40-a903-3d8176434969\") " pod="openstack/openstackclient" Mar 10 23:22:25 crc kubenswrapper[4919]: I0310 23:22:25.039520 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9wmqr\" (UniqueName: \"kubernetes.io/projected/165d12a6-fb5d-4a40-a903-3d8176434969-kube-api-access-9wmqr\") pod \"openstackclient\" (UID: \"165d12a6-fb5d-4a40-a903-3d8176434969\") " pod="openstack/openstackclient" Mar 10 23:22:25 crc kubenswrapper[4919]: I0310 23:22:25.039582 4919 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/165d12a6-fb5d-4a40-a903-3d8176434969-combined-ca-bundle\") pod \"openstackclient\" (UID: \"165d12a6-fb5d-4a40-a903-3d8176434969\") " pod="openstack/openstackclient" Mar 10 23:22:25 crc kubenswrapper[4919]: I0310 23:22:25.042152 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/165d12a6-fb5d-4a40-a903-3d8176434969-openstack-config\") pod \"openstackclient\" (UID: \"165d12a6-fb5d-4a40-a903-3d8176434969\") " pod="openstack/openstackclient" Mar 10 23:22:25 crc kubenswrapper[4919]: I0310 23:22:25.046997 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/165d12a6-fb5d-4a40-a903-3d8176434969-combined-ca-bundle\") pod \"openstackclient\" (UID: \"165d12a6-fb5d-4a40-a903-3d8176434969\") " pod="openstack/openstackclient" Mar 10 23:22:25 crc kubenswrapper[4919]: I0310 23:22:25.055836 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/165d12a6-fb5d-4a40-a903-3d8176434969-openstack-config-secret\") pod \"openstackclient\" (UID: \"165d12a6-fb5d-4a40-a903-3d8176434969\") " pod="openstack/openstackclient" Mar 10 23:22:25 crc kubenswrapper[4919]: I0310 23:22:25.067316 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9wmqr\" (UniqueName: \"kubernetes.io/projected/165d12a6-fb5d-4a40-a903-3d8176434969-kube-api-access-9wmqr\") pod \"openstackclient\" (UID: \"165d12a6-fb5d-4a40-a903-3d8176434969\") " pod="openstack/openstackclient" Mar 10 23:22:25 crc kubenswrapper[4919]: I0310 23:22:25.203973 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 10 23:22:25 crc kubenswrapper[4919]: I0310 23:22:25.647264 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 10 23:22:25 crc kubenswrapper[4919]: W0310 23:22:25.657943 4919 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod165d12a6_fb5d_4a40_a903_3d8176434969.slice/crio-da4034295c171b601bbc029e0dcdbebdf11e7ca5abb25744e41c5fb92cf58b75 WatchSource:0}: Error finding container da4034295c171b601bbc029e0dcdbebdf11e7ca5abb25744e41c5fb92cf58b75: Status 404 returned error can't find the container with id da4034295c171b601bbc029e0dcdbebdf11e7ca5abb25744e41c5fb92cf58b75 Mar 10 23:22:26 crc kubenswrapper[4919]: I0310 23:22:26.134376 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"165d12a6-fb5d-4a40-a903-3d8176434969","Type":"ContainerStarted","Data":"3de46841baf1aff078fb3795f6d84df49123d7a40888a59d539c58b86f292a8f"} Mar 10 23:22:26 crc kubenswrapper[4919]: I0310 23:22:26.134476 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"165d12a6-fb5d-4a40-a903-3d8176434969","Type":"ContainerStarted","Data":"da4034295c171b601bbc029e0dcdbebdf11e7ca5abb25744e41c5fb92cf58b75"} Mar 10 23:22:26 crc kubenswrapper[4919]: I0310 23:22:26.162714 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.162693803 podStartE2EDuration="2.162693803s" podCreationTimestamp="2026-03-10 23:22:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 23:22:26.156909595 +0000 UTC m=+5533.398790243" watchObservedRunningTime="2026-03-10 23:22:26.162693803 +0000 UTC m=+5533.404574421" Mar 10 23:22:27 crc kubenswrapper[4919]: I0310 23:22:27.695534 4919 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-s9czm" Mar 10 23:22:27 crc kubenswrapper[4919]: I0310 23:22:27.760768 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-s9czm" Mar 10 23:22:27 crc kubenswrapper[4919]: I0310 23:22:27.870111 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-s9czm"] Mar 10 23:22:27 crc kubenswrapper[4919]: I0310 23:22:27.947346 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xxm8b"] Mar 10 23:22:27 crc kubenswrapper[4919]: I0310 23:22:27.947961 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-xxm8b" podUID="ff7f55ab-64a4-44d6-8f11-a28f589bcfd7" containerName="registry-server" containerID="cri-o://5aee0092c1d493f23deb1789ca4f715fc37b89f6a7cbd18ecec584d2dfecb870" gracePeriod=2 Mar 10 23:22:28 crc kubenswrapper[4919]: I0310 23:22:28.162438 4919 generic.go:334] "Generic (PLEG): container finished" podID="ff7f55ab-64a4-44d6-8f11-a28f589bcfd7" containerID="5aee0092c1d493f23deb1789ca4f715fc37b89f6a7cbd18ecec584d2dfecb870" exitCode=0 Mar 10 23:22:28 crc kubenswrapper[4919]: I0310 23:22:28.162581 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xxm8b" event={"ID":"ff7f55ab-64a4-44d6-8f11-a28f589bcfd7","Type":"ContainerDied","Data":"5aee0092c1d493f23deb1789ca4f715fc37b89f6a7cbd18ecec584d2dfecb870"} Mar 10 23:22:28 crc kubenswrapper[4919]: I0310 23:22:28.323036 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-xxm8b" Mar 10 23:22:28 crc kubenswrapper[4919]: I0310 23:22:28.501806 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xsvm6\" (UniqueName: \"kubernetes.io/projected/ff7f55ab-64a4-44d6-8f11-a28f589bcfd7-kube-api-access-xsvm6\") pod \"ff7f55ab-64a4-44d6-8f11-a28f589bcfd7\" (UID: \"ff7f55ab-64a4-44d6-8f11-a28f589bcfd7\") " Mar 10 23:22:28 crc kubenswrapper[4919]: I0310 23:22:28.501861 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff7f55ab-64a4-44d6-8f11-a28f589bcfd7-catalog-content\") pod \"ff7f55ab-64a4-44d6-8f11-a28f589bcfd7\" (UID: \"ff7f55ab-64a4-44d6-8f11-a28f589bcfd7\") " Mar 10 23:22:28 crc kubenswrapper[4919]: I0310 23:22:28.501964 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff7f55ab-64a4-44d6-8f11-a28f589bcfd7-utilities\") pod \"ff7f55ab-64a4-44d6-8f11-a28f589bcfd7\" (UID: \"ff7f55ab-64a4-44d6-8f11-a28f589bcfd7\") " Mar 10 23:22:28 crc kubenswrapper[4919]: I0310 23:22:28.502548 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff7f55ab-64a4-44d6-8f11-a28f589bcfd7-utilities" (OuterVolumeSpecName: "utilities") pod "ff7f55ab-64a4-44d6-8f11-a28f589bcfd7" (UID: "ff7f55ab-64a4-44d6-8f11-a28f589bcfd7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 23:22:28 crc kubenswrapper[4919]: I0310 23:22:28.516603 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff7f55ab-64a4-44d6-8f11-a28f589bcfd7-kube-api-access-xsvm6" (OuterVolumeSpecName: "kube-api-access-xsvm6") pod "ff7f55ab-64a4-44d6-8f11-a28f589bcfd7" (UID: "ff7f55ab-64a4-44d6-8f11-a28f589bcfd7"). InnerVolumeSpecName "kube-api-access-xsvm6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 23:22:28 crc kubenswrapper[4919]: I0310 23:22:28.548695 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff7f55ab-64a4-44d6-8f11-a28f589bcfd7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ff7f55ab-64a4-44d6-8f11-a28f589bcfd7" (UID: "ff7f55ab-64a4-44d6-8f11-a28f589bcfd7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 23:22:28 crc kubenswrapper[4919]: I0310 23:22:28.604158 4919 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff7f55ab-64a4-44d6-8f11-a28f589bcfd7-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 23:22:28 crc kubenswrapper[4919]: I0310 23:22:28.604375 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xsvm6\" (UniqueName: \"kubernetes.io/projected/ff7f55ab-64a4-44d6-8f11-a28f589bcfd7-kube-api-access-xsvm6\") on node \"crc\" DevicePath \"\"" Mar 10 23:22:28 crc kubenswrapper[4919]: I0310 23:22:28.604453 4919 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff7f55ab-64a4-44d6-8f11-a28f589bcfd7-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 23:22:29 crc kubenswrapper[4919]: I0310 23:22:29.171876 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xxm8b" event={"ID":"ff7f55ab-64a4-44d6-8f11-a28f589bcfd7","Type":"ContainerDied","Data":"a23955d6b9537b7ea53621c7eaec30979ef2a78541d13857d5565331ed34d329"} Mar 10 23:22:29 crc kubenswrapper[4919]: I0310 23:22:29.171949 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-xxm8b" Mar 10 23:22:29 crc kubenswrapper[4919]: I0310 23:22:29.171958 4919 scope.go:117] "RemoveContainer" containerID="5aee0092c1d493f23deb1789ca4f715fc37b89f6a7cbd18ecec584d2dfecb870" Mar 10 23:22:29 crc kubenswrapper[4919]: I0310 23:22:29.218878 4919 scope.go:117] "RemoveContainer" containerID="ca089684278834a339b5a6889d4cf85122def52b92d6ca8d6229825ec05e7f62" Mar 10 23:22:29 crc kubenswrapper[4919]: I0310 23:22:29.226789 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xxm8b"] Mar 10 23:22:29 crc kubenswrapper[4919]: I0310 23:22:29.242918 4919 scope.go:117] "RemoveContainer" containerID="b8a921d0eee6a88332744ea0594fdd10a0f2ec934cdb3cead809eae0572affa6" Mar 10 23:22:29 crc kubenswrapper[4919]: I0310 23:22:29.248240 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-xxm8b"] Mar 10 23:22:29 crc kubenswrapper[4919]: I0310 23:22:29.488569 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff7f55ab-64a4-44d6-8f11-a28f589bcfd7" path="/var/lib/kubelet/pods/ff7f55ab-64a4-44d6-8f11-a28f589bcfd7/volumes" Mar 10 23:22:47 crc kubenswrapper[4919]: I0310 23:22:47.294151 4919 scope.go:117] "RemoveContainer" containerID="37ac229fb614b71e9a5fae14609fd5ae7ba4f80af9252601f9bdbfaa40d3b960" Mar 10 23:22:59 crc kubenswrapper[4919]: I0310 23:22:59.176603 4919 patch_prober.go:28] interesting pod/machine-config-daemon-z7v4t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 23:22:59 crc kubenswrapper[4919]: I0310 23:22:59.177289 4919 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 23:23:29 crc kubenswrapper[4919]: I0310 23:23:29.175743 4919 patch_prober.go:28] interesting pod/machine-config-daemon-z7v4t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 23:23:29 crc kubenswrapper[4919]: I0310 23:23:29.176320 4919 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 23:23:59 crc kubenswrapper[4919]: I0310 23:23:59.175771 4919 patch_prober.go:28] interesting pod/machine-config-daemon-z7v4t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 23:23:59 crc kubenswrapper[4919]: I0310 23:23:59.176364 4919 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 23:23:59 crc kubenswrapper[4919]: I0310 23:23:59.176445 4919 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" Mar 10 23:23:59 crc kubenswrapper[4919]: I0310 23:23:59.177141 4919 kuberuntime_manager.go:1027] "Message for Container of pod" 
containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"bd5f980b375940b8ac50763b940d3f98ebbeb27c9473430bf186d0966bbdfefc"} pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 10 23:23:59 crc kubenswrapper[4919]: I0310 23:23:59.177195 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" containerName="machine-config-daemon" containerID="cri-o://bd5f980b375940b8ac50763b940d3f98ebbeb27c9473430bf186d0966bbdfefc" gracePeriod=600 Mar 10 23:23:59 crc kubenswrapper[4919]: E0310 23:23:59.333486 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" Mar 10 23:24:00 crc kubenswrapper[4919]: I0310 23:24:00.017930 4919 generic.go:334] "Generic (PLEG): container finished" podID="566678d1-f416-4116-ab20-b30dceb86cdc" containerID="bd5f980b375940b8ac50763b940d3f98ebbeb27c9473430bf186d0966bbdfefc" exitCode=0 Mar 10 23:24:00 crc kubenswrapper[4919]: I0310 23:24:00.017991 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" event={"ID":"566678d1-f416-4116-ab20-b30dceb86cdc","Type":"ContainerDied","Data":"bd5f980b375940b8ac50763b940d3f98ebbeb27c9473430bf186d0966bbdfefc"} Mar 10 23:24:00 crc kubenswrapper[4919]: I0310 23:24:00.018035 4919 scope.go:117] "RemoveContainer" containerID="5ed85a234e315a1e8c0b68df55722ea097e0f8687391361bc1b4250e4cb84b0a" Mar 10 23:24:00 crc 
kubenswrapper[4919]: I0310 23:24:00.018686 4919 scope.go:117] "RemoveContainer" containerID="bd5f980b375940b8ac50763b940d3f98ebbeb27c9473430bf186d0966bbdfefc" Mar 10 23:24:00 crc kubenswrapper[4919]: E0310 23:24:00.018924 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" Mar 10 23:24:00 crc kubenswrapper[4919]: I0310 23:24:00.141727 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553084-x72px"] Mar 10 23:24:00 crc kubenswrapper[4919]: E0310 23:24:00.142049 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff7f55ab-64a4-44d6-8f11-a28f589bcfd7" containerName="extract-utilities" Mar 10 23:24:00 crc kubenswrapper[4919]: I0310 23:24:00.142067 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff7f55ab-64a4-44d6-8f11-a28f589bcfd7" containerName="extract-utilities" Mar 10 23:24:00 crc kubenswrapper[4919]: E0310 23:24:00.142081 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff7f55ab-64a4-44d6-8f11-a28f589bcfd7" containerName="registry-server" Mar 10 23:24:00 crc kubenswrapper[4919]: I0310 23:24:00.142087 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff7f55ab-64a4-44d6-8f11-a28f589bcfd7" containerName="registry-server" Mar 10 23:24:00 crc kubenswrapper[4919]: E0310 23:24:00.142109 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff7f55ab-64a4-44d6-8f11-a28f589bcfd7" containerName="extract-content" Mar 10 23:24:00 crc kubenswrapper[4919]: I0310 23:24:00.142115 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff7f55ab-64a4-44d6-8f11-a28f589bcfd7" 
containerName="extract-content" Mar 10 23:24:00 crc kubenswrapper[4919]: I0310 23:24:00.142267 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff7f55ab-64a4-44d6-8f11-a28f589bcfd7" containerName="registry-server" Mar 10 23:24:00 crc kubenswrapper[4919]: I0310 23:24:00.142807 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553084-x72px" Mar 10 23:24:00 crc kubenswrapper[4919]: I0310 23:24:00.146889 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 23:24:00 crc kubenswrapper[4919]: I0310 23:24:00.147189 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wbvtv" Mar 10 23:24:00 crc kubenswrapper[4919]: I0310 23:24:00.147410 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 23:24:00 crc kubenswrapper[4919]: I0310 23:24:00.164274 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553084-x72px"] Mar 10 23:24:00 crc kubenswrapper[4919]: I0310 23:24:00.194572 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7nsv\" (UniqueName: \"kubernetes.io/projected/861c7fae-c56f-4836-b171-bf20f341437d-kube-api-access-v7nsv\") pod \"auto-csr-approver-29553084-x72px\" (UID: \"861c7fae-c56f-4836-b171-bf20f341437d\") " pod="openshift-infra/auto-csr-approver-29553084-x72px" Mar 10 23:24:00 crc kubenswrapper[4919]: I0310 23:24:00.297542 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v7nsv\" (UniqueName: \"kubernetes.io/projected/861c7fae-c56f-4836-b171-bf20f341437d-kube-api-access-v7nsv\") pod \"auto-csr-approver-29553084-x72px\" (UID: \"861c7fae-c56f-4836-b171-bf20f341437d\") " pod="openshift-infra/auto-csr-approver-29553084-x72px" Mar 10 
23:24:00 crc kubenswrapper[4919]: I0310 23:24:00.331320 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7nsv\" (UniqueName: \"kubernetes.io/projected/861c7fae-c56f-4836-b171-bf20f341437d-kube-api-access-v7nsv\") pod \"auto-csr-approver-29553084-x72px\" (UID: \"861c7fae-c56f-4836-b171-bf20f341437d\") " pod="openshift-infra/auto-csr-approver-29553084-x72px" Mar 10 23:24:00 crc kubenswrapper[4919]: I0310 23:24:00.461092 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553084-x72px" Mar 10 23:24:00 crc kubenswrapper[4919]: I0310 23:24:00.905378 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553084-x72px"] Mar 10 23:24:00 crc kubenswrapper[4919]: W0310 23:24:00.908358 4919 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod861c7fae_c56f_4836_b171_bf20f341437d.slice/crio-48e2b1cb369180931f1e9c58b5675d57ada57a7970091a721af531f57568d87d WatchSource:0}: Error finding container 48e2b1cb369180931f1e9c58b5675d57ada57a7970091a721af531f57568d87d: Status 404 returned error can't find the container with id 48e2b1cb369180931f1e9c58b5675d57ada57a7970091a721af531f57568d87d Mar 10 23:24:01 crc kubenswrapper[4919]: I0310 23:24:01.037933 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553084-x72px" event={"ID":"861c7fae-c56f-4836-b171-bf20f341437d","Type":"ContainerStarted","Data":"48e2b1cb369180931f1e9c58b5675d57ada57a7970091a721af531f57568d87d"} Mar 10 23:24:03 crc kubenswrapper[4919]: I0310 23:24:03.056300 4919 generic.go:334] "Generic (PLEG): container finished" podID="861c7fae-c56f-4836-b171-bf20f341437d" containerID="125054635d5dca106b5c76648e9ff90d862350067b398dade3d763eacfc74aba" exitCode=0 Mar 10 23:24:03 crc kubenswrapper[4919]: I0310 23:24:03.056370 4919 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-infra/auto-csr-approver-29553084-x72px" event={"ID":"861c7fae-c56f-4836-b171-bf20f341437d","Type":"ContainerDied","Data":"125054635d5dca106b5c76648e9ff90d862350067b398dade3d763eacfc74aba"} Mar 10 23:24:04 crc kubenswrapper[4919]: I0310 23:24:04.429026 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553084-x72px" Mar 10 23:24:04 crc kubenswrapper[4919]: I0310 23:24:04.563120 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v7nsv\" (UniqueName: \"kubernetes.io/projected/861c7fae-c56f-4836-b171-bf20f341437d-kube-api-access-v7nsv\") pod \"861c7fae-c56f-4836-b171-bf20f341437d\" (UID: \"861c7fae-c56f-4836-b171-bf20f341437d\") " Mar 10 23:24:04 crc kubenswrapper[4919]: I0310 23:24:04.570876 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/861c7fae-c56f-4836-b171-bf20f341437d-kube-api-access-v7nsv" (OuterVolumeSpecName: "kube-api-access-v7nsv") pod "861c7fae-c56f-4836-b171-bf20f341437d" (UID: "861c7fae-c56f-4836-b171-bf20f341437d"). InnerVolumeSpecName "kube-api-access-v7nsv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 23:24:04 crc kubenswrapper[4919]: I0310 23:24:04.665960 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v7nsv\" (UniqueName: \"kubernetes.io/projected/861c7fae-c56f-4836-b171-bf20f341437d-kube-api-access-v7nsv\") on node \"crc\" DevicePath \"\"" Mar 10 23:24:05 crc kubenswrapper[4919]: I0310 23:24:05.077492 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553084-x72px" event={"ID":"861c7fae-c56f-4836-b171-bf20f341437d","Type":"ContainerDied","Data":"48e2b1cb369180931f1e9c58b5675d57ada57a7970091a721af531f57568d87d"} Mar 10 23:24:05 crc kubenswrapper[4919]: I0310 23:24:05.077781 4919 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="48e2b1cb369180931f1e9c58b5675d57ada57a7970091a721af531f57568d87d" Mar 10 23:24:05 crc kubenswrapper[4919]: I0310 23:24:05.077691 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553084-x72px" Mar 10 23:24:05 crc kubenswrapper[4919]: I0310 23:24:05.504521 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553078-ndkn5"] Mar 10 23:24:05 crc kubenswrapper[4919]: I0310 23:24:05.512719 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553078-ndkn5"] Mar 10 23:24:07 crc kubenswrapper[4919]: I0310 23:24:07.491938 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae93b879-4c13-4274-8d35-9ab108c2922d" path="/var/lib/kubelet/pods/ae93b879-4c13-4274-8d35-9ab108c2922d/volumes" Mar 10 23:24:10 crc kubenswrapper[4919]: I0310 23:24:10.480202 4919 scope.go:117] "RemoveContainer" containerID="bd5f980b375940b8ac50763b940d3f98ebbeb27c9473430bf186d0966bbdfefc" Mar 10 23:24:10 crc kubenswrapper[4919]: E0310 23:24:10.481073 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" Mar 10 23:24:22 crc kubenswrapper[4919]: I0310 23:24:22.480757 4919 scope.go:117] "RemoveContainer" containerID="bd5f980b375940b8ac50763b940d3f98ebbeb27c9473430bf186d0966bbdfefc" Mar 10 23:24:22 crc kubenswrapper[4919]: E0310 23:24:22.481739 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" Mar 10 23:24:33 crc kubenswrapper[4919]: I0310 23:24:33.493020 4919 scope.go:117] "RemoveContainer" containerID="bd5f980b375940b8ac50763b940d3f98ebbeb27c9473430bf186d0966bbdfefc" Mar 10 23:24:33 crc kubenswrapper[4919]: E0310 23:24:33.493923 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" Mar 10 23:24:47 crc kubenswrapper[4919]: I0310 23:24:47.450559 4919 scope.go:117] "RemoveContainer" containerID="e0cd0c839d2d67aaec8e7382039914e169ea2bd9742a049061016e85c02e01ed" Mar 10 23:24:47 crc kubenswrapper[4919]: I0310 23:24:47.484311 4919 scope.go:117] "RemoveContainer" 
containerID="bd5f980b375940b8ac50763b940d3f98ebbeb27c9473430bf186d0966bbdfefc" Mar 10 23:24:47 crc kubenswrapper[4919]: E0310 23:24:47.484710 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" Mar 10 23:24:59 crc kubenswrapper[4919]: I0310 23:24:59.480911 4919 scope.go:117] "RemoveContainer" containerID="bd5f980b375940b8ac50763b940d3f98ebbeb27c9473430bf186d0966bbdfefc" Mar 10 23:24:59 crc kubenswrapper[4919]: E0310 23:24:59.482352 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" Mar 10 23:25:03 crc kubenswrapper[4919]: I0310 23:25:03.072707 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-msh6h"] Mar 10 23:25:03 crc kubenswrapper[4919]: I0310 23:25:03.084570 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-msh6h"] Mar 10 23:25:03 crc kubenswrapper[4919]: I0310 23:25:03.498547 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7d07324-1300-42ef-a68c-2dc6f9548352" path="/var/lib/kubelet/pods/a7d07324-1300-42ef-a68c-2dc6f9548352/volumes" Mar 10 23:25:14 crc kubenswrapper[4919]: I0310 23:25:14.480479 4919 scope.go:117] "RemoveContainer" 
containerID="bd5f980b375940b8ac50763b940d3f98ebbeb27c9473430bf186d0966bbdfefc" Mar 10 23:25:14 crc kubenswrapper[4919]: E0310 23:25:14.481685 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" Mar 10 23:25:25 crc kubenswrapper[4919]: I0310 23:25:25.481342 4919 scope.go:117] "RemoveContainer" containerID="bd5f980b375940b8ac50763b940d3f98ebbeb27c9473430bf186d0966bbdfefc" Mar 10 23:25:25 crc kubenswrapper[4919]: E0310 23:25:25.482155 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" Mar 10 23:25:37 crc kubenswrapper[4919]: I0310 23:25:37.480580 4919 scope.go:117] "RemoveContainer" containerID="bd5f980b375940b8ac50763b940d3f98ebbeb27c9473430bf186d0966bbdfefc" Mar 10 23:25:37 crc kubenswrapper[4919]: E0310 23:25:37.481811 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" Mar 10 23:25:47 crc kubenswrapper[4919]: I0310 23:25:47.524850 4919 scope.go:117] 
"RemoveContainer" containerID="d1e058a8a51f6561649c6d390051f8caceeb232836bb3428b88feb597fdd6fa2" Mar 10 23:25:48 crc kubenswrapper[4919]: I0310 23:25:48.480827 4919 scope.go:117] "RemoveContainer" containerID="bd5f980b375940b8ac50763b940d3f98ebbeb27c9473430bf186d0966bbdfefc" Mar 10 23:25:48 crc kubenswrapper[4919]: E0310 23:25:48.481529 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" Mar 10 23:26:00 crc kubenswrapper[4919]: I0310 23:26:00.173673 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553086-wx4c6"] Mar 10 23:26:00 crc kubenswrapper[4919]: E0310 23:26:00.175323 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="861c7fae-c56f-4836-b171-bf20f341437d" containerName="oc" Mar 10 23:26:00 crc kubenswrapper[4919]: I0310 23:26:00.175353 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="861c7fae-c56f-4836-b171-bf20f341437d" containerName="oc" Mar 10 23:26:00 crc kubenswrapper[4919]: I0310 23:26:00.175741 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="861c7fae-c56f-4836-b171-bf20f341437d" containerName="oc" Mar 10 23:26:00 crc kubenswrapper[4919]: I0310 23:26:00.176737 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553086-wx4c6" Mar 10 23:26:00 crc kubenswrapper[4919]: I0310 23:26:00.184874 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 23:26:00 crc kubenswrapper[4919]: I0310 23:26:00.185261 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 23:26:00 crc kubenswrapper[4919]: I0310 23:26:00.185582 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wbvtv" Mar 10 23:26:00 crc kubenswrapper[4919]: I0310 23:26:00.187232 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553086-wx4c6"] Mar 10 23:26:00 crc kubenswrapper[4919]: I0310 23:26:00.259854 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2z8tf\" (UniqueName: \"kubernetes.io/projected/9db97bbc-a7b8-4e49-80ad-0e20f0217752-kube-api-access-2z8tf\") pod \"auto-csr-approver-29553086-wx4c6\" (UID: \"9db97bbc-a7b8-4e49-80ad-0e20f0217752\") " pod="openshift-infra/auto-csr-approver-29553086-wx4c6" Mar 10 23:26:00 crc kubenswrapper[4919]: I0310 23:26:00.361418 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2z8tf\" (UniqueName: \"kubernetes.io/projected/9db97bbc-a7b8-4e49-80ad-0e20f0217752-kube-api-access-2z8tf\") pod \"auto-csr-approver-29553086-wx4c6\" (UID: \"9db97bbc-a7b8-4e49-80ad-0e20f0217752\") " pod="openshift-infra/auto-csr-approver-29553086-wx4c6" Mar 10 23:26:00 crc kubenswrapper[4919]: I0310 23:26:00.392575 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2z8tf\" (UniqueName: \"kubernetes.io/projected/9db97bbc-a7b8-4e49-80ad-0e20f0217752-kube-api-access-2z8tf\") pod \"auto-csr-approver-29553086-wx4c6\" (UID: \"9db97bbc-a7b8-4e49-80ad-0e20f0217752\") " 
pod="openshift-infra/auto-csr-approver-29553086-wx4c6" Mar 10 23:26:00 crc kubenswrapper[4919]: I0310 23:26:00.519953 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553086-wx4c6" Mar 10 23:26:01 crc kubenswrapper[4919]: I0310 23:26:01.035902 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553086-wx4c6"] Mar 10 23:26:01 crc kubenswrapper[4919]: I0310 23:26:01.154661 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553086-wx4c6" event={"ID":"9db97bbc-a7b8-4e49-80ad-0e20f0217752","Type":"ContainerStarted","Data":"e0c14744195c21e586802b256d7781da767cebf0d577e0217e709415fcc86ee2"} Mar 10 23:26:02 crc kubenswrapper[4919]: I0310 23:26:02.480428 4919 scope.go:117] "RemoveContainer" containerID="bd5f980b375940b8ac50763b940d3f98ebbeb27c9473430bf186d0966bbdfefc" Mar 10 23:26:02 crc kubenswrapper[4919]: E0310 23:26:02.480967 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" Mar 10 23:26:03 crc kubenswrapper[4919]: I0310 23:26:03.174504 4919 generic.go:334] "Generic (PLEG): container finished" podID="9db97bbc-a7b8-4e49-80ad-0e20f0217752" containerID="f016f11200fc339df91ae6221b4cef87a85ec49086f3eb37888b52bb86244099" exitCode=0 Mar 10 23:26:03 crc kubenswrapper[4919]: I0310 23:26:03.174678 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553086-wx4c6" event={"ID":"9db97bbc-a7b8-4e49-80ad-0e20f0217752","Type":"ContainerDied","Data":"f016f11200fc339df91ae6221b4cef87a85ec49086f3eb37888b52bb86244099"} 
Mar 10 23:26:04 crc kubenswrapper[4919]: I0310 23:26:04.557255 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553086-wx4c6" Mar 10 23:26:04 crc kubenswrapper[4919]: I0310 23:26:04.650592 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2z8tf\" (UniqueName: \"kubernetes.io/projected/9db97bbc-a7b8-4e49-80ad-0e20f0217752-kube-api-access-2z8tf\") pod \"9db97bbc-a7b8-4e49-80ad-0e20f0217752\" (UID: \"9db97bbc-a7b8-4e49-80ad-0e20f0217752\") " Mar 10 23:26:04 crc kubenswrapper[4919]: I0310 23:26:04.656867 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9db97bbc-a7b8-4e49-80ad-0e20f0217752-kube-api-access-2z8tf" (OuterVolumeSpecName: "kube-api-access-2z8tf") pod "9db97bbc-a7b8-4e49-80ad-0e20f0217752" (UID: "9db97bbc-a7b8-4e49-80ad-0e20f0217752"). InnerVolumeSpecName "kube-api-access-2z8tf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 23:26:04 crc kubenswrapper[4919]: I0310 23:26:04.752303 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2z8tf\" (UniqueName: \"kubernetes.io/projected/9db97bbc-a7b8-4e49-80ad-0e20f0217752-kube-api-access-2z8tf\") on node \"crc\" DevicePath \"\"" Mar 10 23:26:05 crc kubenswrapper[4919]: I0310 23:26:05.194550 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553086-wx4c6" event={"ID":"9db97bbc-a7b8-4e49-80ad-0e20f0217752","Type":"ContainerDied","Data":"e0c14744195c21e586802b256d7781da767cebf0d577e0217e709415fcc86ee2"} Mar 10 23:26:05 crc kubenswrapper[4919]: I0310 23:26:05.194989 4919 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e0c14744195c21e586802b256d7781da767cebf0d577e0217e709415fcc86ee2" Mar 10 23:26:05 crc kubenswrapper[4919]: I0310 23:26:05.194671 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553086-wx4c6" Mar 10 23:26:05 crc kubenswrapper[4919]: I0310 23:26:05.468959 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553080-5z76m"] Mar 10 23:26:05 crc kubenswrapper[4919]: I0310 23:26:05.476015 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553080-5z76m"] Mar 10 23:26:05 crc kubenswrapper[4919]: I0310 23:26:05.496033 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3adab12-0a85-47f8-a8f0-ccbc0b4b275d" path="/var/lib/kubelet/pods/a3adab12-0a85-47f8-a8f0-ccbc0b4b275d/volumes" Mar 10 23:26:13 crc kubenswrapper[4919]: I0310 23:26:13.485714 4919 scope.go:117] "RemoveContainer" containerID="bd5f980b375940b8ac50763b940d3f98ebbeb27c9473430bf186d0966bbdfefc" Mar 10 23:26:13 crc kubenswrapper[4919]: E0310 23:26:13.486788 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" Mar 10 23:26:26 crc kubenswrapper[4919]: I0310 23:26:26.484775 4919 scope.go:117] "RemoveContainer" containerID="bd5f980b375940b8ac50763b940d3f98ebbeb27c9473430bf186d0966bbdfefc" Mar 10 23:26:26 crc kubenswrapper[4919]: E0310 23:26:26.485649 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" 
podUID="566678d1-f416-4116-ab20-b30dceb86cdc" Mar 10 23:26:37 crc kubenswrapper[4919]: I0310 23:26:37.481046 4919 scope.go:117] "RemoveContainer" containerID="bd5f980b375940b8ac50763b940d3f98ebbeb27c9473430bf186d0966bbdfefc" Mar 10 23:26:37 crc kubenswrapper[4919]: E0310 23:26:37.481773 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" Mar 10 23:26:47 crc kubenswrapper[4919]: I0310 23:26:47.585065 4919 scope.go:117] "RemoveContainer" containerID="eafabcee6bed67be05d1a878a651478ba03e70bde5e961897e86648fe25aaa3d" Mar 10 23:26:47 crc kubenswrapper[4919]: I0310 23:26:47.630977 4919 scope.go:117] "RemoveContainer" containerID="00fc2bae4ab7c8ec504ccf63f06086294dcffabc24ea6aa26e253e1d55b64937" Mar 10 23:26:48 crc kubenswrapper[4919]: I0310 23:26:48.479956 4919 scope.go:117] "RemoveContainer" containerID="bd5f980b375940b8ac50763b940d3f98ebbeb27c9473430bf186d0966bbdfefc" Mar 10 23:26:48 crc kubenswrapper[4919]: E0310 23:26:48.480331 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" Mar 10 23:27:02 crc kubenswrapper[4919]: I0310 23:27:02.480535 4919 scope.go:117] "RemoveContainer" containerID="bd5f980b375940b8ac50763b940d3f98ebbeb27c9473430bf186d0966bbdfefc" Mar 10 23:27:02 crc kubenswrapper[4919]: E0310 23:27:02.481251 4919 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" Mar 10 23:27:17 crc kubenswrapper[4919]: I0310 23:27:17.480466 4919 scope.go:117] "RemoveContainer" containerID="bd5f980b375940b8ac50763b940d3f98ebbeb27c9473430bf186d0966bbdfefc" Mar 10 23:27:17 crc kubenswrapper[4919]: E0310 23:27:17.481703 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" Mar 10 23:27:29 crc kubenswrapper[4919]: I0310 23:27:29.154007 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-tcpds"] Mar 10 23:27:29 crc kubenswrapper[4919]: E0310 23:27:29.155104 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9db97bbc-a7b8-4e49-80ad-0e20f0217752" containerName="oc" Mar 10 23:27:29 crc kubenswrapper[4919]: I0310 23:27:29.155127 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="9db97bbc-a7b8-4e49-80ad-0e20f0217752" containerName="oc" Mar 10 23:27:29 crc kubenswrapper[4919]: I0310 23:27:29.155459 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="9db97bbc-a7b8-4e49-80ad-0e20f0217752" containerName="oc" Mar 10 23:27:29 crc kubenswrapper[4919]: I0310 23:27:29.157577 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-tcpds" Mar 10 23:27:29 crc kubenswrapper[4919]: I0310 23:27:29.190724 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tcpds"] Mar 10 23:27:29 crc kubenswrapper[4919]: I0310 23:27:29.323818 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63621ecc-1746-4ea4-8339-bb56dde222ec-catalog-content\") pod \"redhat-operators-tcpds\" (UID: \"63621ecc-1746-4ea4-8339-bb56dde222ec\") " pod="openshift-marketplace/redhat-operators-tcpds" Mar 10 23:27:29 crc kubenswrapper[4919]: I0310 23:27:29.324109 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-969b8\" (UniqueName: \"kubernetes.io/projected/63621ecc-1746-4ea4-8339-bb56dde222ec-kube-api-access-969b8\") pod \"redhat-operators-tcpds\" (UID: \"63621ecc-1746-4ea4-8339-bb56dde222ec\") " pod="openshift-marketplace/redhat-operators-tcpds" Mar 10 23:27:29 crc kubenswrapper[4919]: I0310 23:27:29.324267 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63621ecc-1746-4ea4-8339-bb56dde222ec-utilities\") pod \"redhat-operators-tcpds\" (UID: \"63621ecc-1746-4ea4-8339-bb56dde222ec\") " pod="openshift-marketplace/redhat-operators-tcpds" Mar 10 23:27:29 crc kubenswrapper[4919]: I0310 23:27:29.426130 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-969b8\" (UniqueName: \"kubernetes.io/projected/63621ecc-1746-4ea4-8339-bb56dde222ec-kube-api-access-969b8\") pod \"redhat-operators-tcpds\" (UID: \"63621ecc-1746-4ea4-8339-bb56dde222ec\") " pod="openshift-marketplace/redhat-operators-tcpds" Mar 10 23:27:29 crc kubenswrapper[4919]: I0310 23:27:29.426553 4919 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63621ecc-1746-4ea4-8339-bb56dde222ec-utilities\") pod \"redhat-operators-tcpds\" (UID: \"63621ecc-1746-4ea4-8339-bb56dde222ec\") " pod="openshift-marketplace/redhat-operators-tcpds" Mar 10 23:27:29 crc kubenswrapper[4919]: I0310 23:27:29.426592 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63621ecc-1746-4ea4-8339-bb56dde222ec-catalog-content\") pod \"redhat-operators-tcpds\" (UID: \"63621ecc-1746-4ea4-8339-bb56dde222ec\") " pod="openshift-marketplace/redhat-operators-tcpds" Mar 10 23:27:29 crc kubenswrapper[4919]: I0310 23:27:29.427375 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63621ecc-1746-4ea4-8339-bb56dde222ec-catalog-content\") pod \"redhat-operators-tcpds\" (UID: \"63621ecc-1746-4ea4-8339-bb56dde222ec\") " pod="openshift-marketplace/redhat-operators-tcpds" Mar 10 23:27:29 crc kubenswrapper[4919]: I0310 23:27:29.428278 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63621ecc-1746-4ea4-8339-bb56dde222ec-utilities\") pod \"redhat-operators-tcpds\" (UID: \"63621ecc-1746-4ea4-8339-bb56dde222ec\") " pod="openshift-marketplace/redhat-operators-tcpds" Mar 10 23:27:29 crc kubenswrapper[4919]: I0310 23:27:29.448675 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-969b8\" (UniqueName: \"kubernetes.io/projected/63621ecc-1746-4ea4-8339-bb56dde222ec-kube-api-access-969b8\") pod \"redhat-operators-tcpds\" (UID: \"63621ecc-1746-4ea4-8339-bb56dde222ec\") " pod="openshift-marketplace/redhat-operators-tcpds" Mar 10 23:27:29 crc kubenswrapper[4919]: I0310 23:27:29.484687 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-tcpds" Mar 10 23:27:29 crc kubenswrapper[4919]: I0310 23:27:29.925465 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tcpds"] Mar 10 23:27:30 crc kubenswrapper[4919]: I0310 23:27:30.001103 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tcpds" event={"ID":"63621ecc-1746-4ea4-8339-bb56dde222ec","Type":"ContainerStarted","Data":"f6e025a263de89aae3abd122869b059d47d97a456320a302232a5a02961033db"} Mar 10 23:27:31 crc kubenswrapper[4919]: I0310 23:27:31.015265 4919 generic.go:334] "Generic (PLEG): container finished" podID="63621ecc-1746-4ea4-8339-bb56dde222ec" containerID="470854d14182be3b3bfbdab8547b0d865e730cb0dd5705f787253a290cd186db" exitCode=0 Mar 10 23:27:31 crc kubenswrapper[4919]: I0310 23:27:31.020650 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tcpds" event={"ID":"63621ecc-1746-4ea4-8339-bb56dde222ec","Type":"ContainerDied","Data":"470854d14182be3b3bfbdab8547b0d865e730cb0dd5705f787253a290cd186db"} Mar 10 23:27:31 crc kubenswrapper[4919]: I0310 23:27:31.025249 4919 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 10 23:27:32 crc kubenswrapper[4919]: I0310 23:27:32.030611 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tcpds" event={"ID":"63621ecc-1746-4ea4-8339-bb56dde222ec","Type":"ContainerStarted","Data":"fe9821b978f5965e43b2df641db67e91c909ca163c53021d4f14ca7306acd371"} Mar 10 23:27:32 crc kubenswrapper[4919]: I0310 23:27:32.480478 4919 scope.go:117] "RemoveContainer" containerID="bd5f980b375940b8ac50763b940d3f98ebbeb27c9473430bf186d0966bbdfefc" Mar 10 23:27:32 crc kubenswrapper[4919]: E0310 23:27:32.480894 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" Mar 10 23:27:33 crc kubenswrapper[4919]: I0310 23:27:33.045043 4919 generic.go:334] "Generic (PLEG): container finished" podID="63621ecc-1746-4ea4-8339-bb56dde222ec" containerID="fe9821b978f5965e43b2df641db67e91c909ca163c53021d4f14ca7306acd371" exitCode=0 Mar 10 23:27:33 crc kubenswrapper[4919]: I0310 23:27:33.045127 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tcpds" event={"ID":"63621ecc-1746-4ea4-8339-bb56dde222ec","Type":"ContainerDied","Data":"fe9821b978f5965e43b2df641db67e91c909ca163c53021d4f14ca7306acd371"} Mar 10 23:27:34 crc kubenswrapper[4919]: I0310 23:27:34.054997 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tcpds" event={"ID":"63621ecc-1746-4ea4-8339-bb56dde222ec","Type":"ContainerStarted","Data":"800551d77379b8006dfb74c7bd38da57637392c684cbe3def4f1e17fadf71793"} Mar 10 23:27:34 crc kubenswrapper[4919]: I0310 23:27:34.078617 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-tcpds" podStartSLOduration=2.612132639 podStartE2EDuration="5.078595173s" podCreationTimestamp="2026-03-10 23:27:29 +0000 UTC" firstStartedPulling="2026-03-10 23:27:31.024909572 +0000 UTC m=+5838.266790180" lastFinishedPulling="2026-03-10 23:27:33.491372076 +0000 UTC m=+5840.733252714" observedRunningTime="2026-03-10 23:27:34.076007012 +0000 UTC m=+5841.317887620" watchObservedRunningTime="2026-03-10 23:27:34.078595173 +0000 UTC m=+5841.320475801" Mar 10 23:27:39 crc kubenswrapper[4919]: I0310 23:27:39.500754 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/redhat-operators-tcpds" Mar 10 23:27:39 crc kubenswrapper[4919]: I0310 23:27:39.501290 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-tcpds" Mar 10 23:27:40 crc kubenswrapper[4919]: I0310 23:27:40.576463 4919 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-tcpds" podUID="63621ecc-1746-4ea4-8339-bb56dde222ec" containerName="registry-server" probeResult="failure" output=< Mar 10 23:27:40 crc kubenswrapper[4919]: timeout: failed to connect service ":50051" within 1s Mar 10 23:27:40 crc kubenswrapper[4919]: > Mar 10 23:27:46 crc kubenswrapper[4919]: I0310 23:27:46.480768 4919 scope.go:117] "RemoveContainer" containerID="bd5f980b375940b8ac50763b940d3f98ebbeb27c9473430bf186d0966bbdfefc" Mar 10 23:27:46 crc kubenswrapper[4919]: E0310 23:27:46.482729 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" Mar 10 23:27:47 crc kubenswrapper[4919]: I0310 23:27:47.700238 4919 scope.go:117] "RemoveContainer" containerID="e6b38706ddc51ac06fd68b55fcbd250a38f1d602f1ea64638ebc97f024f4f9ee" Mar 10 23:27:49 crc kubenswrapper[4919]: I0310 23:27:49.552545 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-tcpds" Mar 10 23:27:49 crc kubenswrapper[4919]: I0310 23:27:49.617367 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-tcpds" Mar 10 23:27:49 crc kubenswrapper[4919]: I0310 23:27:49.804267 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-operators-tcpds"] Mar 10 23:27:50 crc kubenswrapper[4919]: I0310 23:27:50.664952 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-tcpds" podUID="63621ecc-1746-4ea4-8339-bb56dde222ec" containerName="registry-server" containerID="cri-o://800551d77379b8006dfb74c7bd38da57637392c684cbe3def4f1e17fadf71793" gracePeriod=2 Mar 10 23:27:51 crc kubenswrapper[4919]: I0310 23:27:51.126466 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tcpds" Mar 10 23:27:51 crc kubenswrapper[4919]: I0310 23:27:51.265633 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63621ecc-1746-4ea4-8339-bb56dde222ec-catalog-content\") pod \"63621ecc-1746-4ea4-8339-bb56dde222ec\" (UID: \"63621ecc-1746-4ea4-8339-bb56dde222ec\") " Mar 10 23:27:51 crc kubenswrapper[4919]: I0310 23:27:51.265790 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63621ecc-1746-4ea4-8339-bb56dde222ec-utilities\") pod \"63621ecc-1746-4ea4-8339-bb56dde222ec\" (UID: \"63621ecc-1746-4ea4-8339-bb56dde222ec\") " Mar 10 23:27:51 crc kubenswrapper[4919]: I0310 23:27:51.265811 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-969b8\" (UniqueName: \"kubernetes.io/projected/63621ecc-1746-4ea4-8339-bb56dde222ec-kube-api-access-969b8\") pod \"63621ecc-1746-4ea4-8339-bb56dde222ec\" (UID: \"63621ecc-1746-4ea4-8339-bb56dde222ec\") " Mar 10 23:27:51 crc kubenswrapper[4919]: I0310 23:27:51.266601 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/63621ecc-1746-4ea4-8339-bb56dde222ec-utilities" (OuterVolumeSpecName: "utilities") pod "63621ecc-1746-4ea4-8339-bb56dde222ec" (UID: 
"63621ecc-1746-4ea4-8339-bb56dde222ec"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 23:27:51 crc kubenswrapper[4919]: I0310 23:27:51.271557 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63621ecc-1746-4ea4-8339-bb56dde222ec-kube-api-access-969b8" (OuterVolumeSpecName: "kube-api-access-969b8") pod "63621ecc-1746-4ea4-8339-bb56dde222ec" (UID: "63621ecc-1746-4ea4-8339-bb56dde222ec"). InnerVolumeSpecName "kube-api-access-969b8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 23:27:51 crc kubenswrapper[4919]: I0310 23:27:51.367754 4919 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63621ecc-1746-4ea4-8339-bb56dde222ec-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 23:27:51 crc kubenswrapper[4919]: I0310 23:27:51.367798 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-969b8\" (UniqueName: \"kubernetes.io/projected/63621ecc-1746-4ea4-8339-bb56dde222ec-kube-api-access-969b8\") on node \"crc\" DevicePath \"\"" Mar 10 23:27:51 crc kubenswrapper[4919]: I0310 23:27:51.387375 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/63621ecc-1746-4ea4-8339-bb56dde222ec-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "63621ecc-1746-4ea4-8339-bb56dde222ec" (UID: "63621ecc-1746-4ea4-8339-bb56dde222ec"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 23:27:51 crc kubenswrapper[4919]: I0310 23:27:51.469199 4919 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63621ecc-1746-4ea4-8339-bb56dde222ec-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 23:27:51 crc kubenswrapper[4919]: I0310 23:27:51.672189 4919 generic.go:334] "Generic (PLEG): container finished" podID="63621ecc-1746-4ea4-8339-bb56dde222ec" containerID="800551d77379b8006dfb74c7bd38da57637392c684cbe3def4f1e17fadf71793" exitCode=0 Mar 10 23:27:51 crc kubenswrapper[4919]: I0310 23:27:51.672228 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tcpds" event={"ID":"63621ecc-1746-4ea4-8339-bb56dde222ec","Type":"ContainerDied","Data":"800551d77379b8006dfb74c7bd38da57637392c684cbe3def4f1e17fadf71793"} Mar 10 23:27:51 crc kubenswrapper[4919]: I0310 23:27:51.672255 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tcpds" event={"ID":"63621ecc-1746-4ea4-8339-bb56dde222ec","Type":"ContainerDied","Data":"f6e025a263de89aae3abd122869b059d47d97a456320a302232a5a02961033db"} Mar 10 23:27:51 crc kubenswrapper[4919]: I0310 23:27:51.672273 4919 scope.go:117] "RemoveContainer" containerID="800551d77379b8006dfb74c7bd38da57637392c684cbe3def4f1e17fadf71793" Mar 10 23:27:51 crc kubenswrapper[4919]: I0310 23:27:51.672288 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-tcpds" Mar 10 23:27:51 crc kubenswrapper[4919]: I0310 23:27:51.696515 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-tcpds"] Mar 10 23:27:51 crc kubenswrapper[4919]: I0310 23:27:51.703209 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-tcpds"] Mar 10 23:27:51 crc kubenswrapper[4919]: I0310 23:27:51.704324 4919 scope.go:117] "RemoveContainer" containerID="fe9821b978f5965e43b2df641db67e91c909ca163c53021d4f14ca7306acd371" Mar 10 23:27:51 crc kubenswrapper[4919]: I0310 23:27:51.736338 4919 scope.go:117] "RemoveContainer" containerID="470854d14182be3b3bfbdab8547b0d865e730cb0dd5705f787253a290cd186db" Mar 10 23:27:51 crc kubenswrapper[4919]: I0310 23:27:51.759228 4919 scope.go:117] "RemoveContainer" containerID="800551d77379b8006dfb74c7bd38da57637392c684cbe3def4f1e17fadf71793" Mar 10 23:27:51 crc kubenswrapper[4919]: E0310 23:27:51.759785 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"800551d77379b8006dfb74c7bd38da57637392c684cbe3def4f1e17fadf71793\": container with ID starting with 800551d77379b8006dfb74c7bd38da57637392c684cbe3def4f1e17fadf71793 not found: ID does not exist" containerID="800551d77379b8006dfb74c7bd38da57637392c684cbe3def4f1e17fadf71793" Mar 10 23:27:51 crc kubenswrapper[4919]: I0310 23:27:51.759826 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"800551d77379b8006dfb74c7bd38da57637392c684cbe3def4f1e17fadf71793"} err="failed to get container status \"800551d77379b8006dfb74c7bd38da57637392c684cbe3def4f1e17fadf71793\": rpc error: code = NotFound desc = could not find container \"800551d77379b8006dfb74c7bd38da57637392c684cbe3def4f1e17fadf71793\": container with ID starting with 800551d77379b8006dfb74c7bd38da57637392c684cbe3def4f1e17fadf71793 not found: ID does 
not exist" Mar 10 23:27:51 crc kubenswrapper[4919]: I0310 23:27:51.759853 4919 scope.go:117] "RemoveContainer" containerID="fe9821b978f5965e43b2df641db67e91c909ca163c53021d4f14ca7306acd371" Mar 10 23:27:51 crc kubenswrapper[4919]: E0310 23:27:51.760233 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe9821b978f5965e43b2df641db67e91c909ca163c53021d4f14ca7306acd371\": container with ID starting with fe9821b978f5965e43b2df641db67e91c909ca163c53021d4f14ca7306acd371 not found: ID does not exist" containerID="fe9821b978f5965e43b2df641db67e91c909ca163c53021d4f14ca7306acd371" Mar 10 23:27:51 crc kubenswrapper[4919]: I0310 23:27:51.760384 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe9821b978f5965e43b2df641db67e91c909ca163c53021d4f14ca7306acd371"} err="failed to get container status \"fe9821b978f5965e43b2df641db67e91c909ca163c53021d4f14ca7306acd371\": rpc error: code = NotFound desc = could not find container \"fe9821b978f5965e43b2df641db67e91c909ca163c53021d4f14ca7306acd371\": container with ID starting with fe9821b978f5965e43b2df641db67e91c909ca163c53021d4f14ca7306acd371 not found: ID does not exist" Mar 10 23:27:51 crc kubenswrapper[4919]: I0310 23:27:51.760605 4919 scope.go:117] "RemoveContainer" containerID="470854d14182be3b3bfbdab8547b0d865e730cb0dd5705f787253a290cd186db" Mar 10 23:27:51 crc kubenswrapper[4919]: E0310 23:27:51.761434 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"470854d14182be3b3bfbdab8547b0d865e730cb0dd5705f787253a290cd186db\": container with ID starting with 470854d14182be3b3bfbdab8547b0d865e730cb0dd5705f787253a290cd186db not found: ID does not exist" containerID="470854d14182be3b3bfbdab8547b0d865e730cb0dd5705f787253a290cd186db" Mar 10 23:27:51 crc kubenswrapper[4919]: I0310 23:27:51.761458 4919 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"470854d14182be3b3bfbdab8547b0d865e730cb0dd5705f787253a290cd186db"} err="failed to get container status \"470854d14182be3b3bfbdab8547b0d865e730cb0dd5705f787253a290cd186db\": rpc error: code = NotFound desc = could not find container \"470854d14182be3b3bfbdab8547b0d865e730cb0dd5705f787253a290cd186db\": container with ID starting with 470854d14182be3b3bfbdab8547b0d865e730cb0dd5705f787253a290cd186db not found: ID does not exist" Mar 10 23:27:53 crc kubenswrapper[4919]: I0310 23:27:53.495018 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="63621ecc-1746-4ea4-8339-bb56dde222ec" path="/var/lib/kubelet/pods/63621ecc-1746-4ea4-8339-bb56dde222ec/volumes" Mar 10 23:27:59 crc kubenswrapper[4919]: I0310 23:27:59.480266 4919 scope.go:117] "RemoveContainer" containerID="bd5f980b375940b8ac50763b940d3f98ebbeb27c9473430bf186d0966bbdfefc" Mar 10 23:27:59 crc kubenswrapper[4919]: E0310 23:27:59.481083 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" Mar 10 23:28:00 crc kubenswrapper[4919]: I0310 23:28:00.160707 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553088-rb2n9"] Mar 10 23:28:00 crc kubenswrapper[4919]: E0310 23:28:00.161205 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63621ecc-1746-4ea4-8339-bb56dde222ec" containerName="extract-utilities" Mar 10 23:28:00 crc kubenswrapper[4919]: I0310 23:28:00.161238 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="63621ecc-1746-4ea4-8339-bb56dde222ec" containerName="extract-utilities" Mar 10 
23:28:00 crc kubenswrapper[4919]: E0310 23:28:00.161263 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63621ecc-1746-4ea4-8339-bb56dde222ec" containerName="extract-content" Mar 10 23:28:00 crc kubenswrapper[4919]: I0310 23:28:00.161274 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="63621ecc-1746-4ea4-8339-bb56dde222ec" containerName="extract-content" Mar 10 23:28:00 crc kubenswrapper[4919]: E0310 23:28:00.161290 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63621ecc-1746-4ea4-8339-bb56dde222ec" containerName="registry-server" Mar 10 23:28:00 crc kubenswrapper[4919]: I0310 23:28:00.161301 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="63621ecc-1746-4ea4-8339-bb56dde222ec" containerName="registry-server" Mar 10 23:28:00 crc kubenswrapper[4919]: I0310 23:28:00.161590 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="63621ecc-1746-4ea4-8339-bb56dde222ec" containerName="registry-server" Mar 10 23:28:00 crc kubenswrapper[4919]: I0310 23:28:00.162435 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553088-rb2n9" Mar 10 23:28:00 crc kubenswrapper[4919]: I0310 23:28:00.167630 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 23:28:00 crc kubenswrapper[4919]: I0310 23:28:00.168707 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 23:28:00 crc kubenswrapper[4919]: I0310 23:28:00.169153 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wbvtv" Mar 10 23:28:00 crc kubenswrapper[4919]: I0310 23:28:00.174320 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553088-rb2n9"] Mar 10 23:28:00 crc kubenswrapper[4919]: I0310 23:28:00.334876 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zlnhj\" (UniqueName: \"kubernetes.io/projected/53fb8f64-55ce-4656-b544-1d1ebac9681f-kube-api-access-zlnhj\") pod \"auto-csr-approver-29553088-rb2n9\" (UID: \"53fb8f64-55ce-4656-b544-1d1ebac9681f\") " pod="openshift-infra/auto-csr-approver-29553088-rb2n9" Mar 10 23:28:00 crc kubenswrapper[4919]: I0310 23:28:00.436982 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zlnhj\" (UniqueName: \"kubernetes.io/projected/53fb8f64-55ce-4656-b544-1d1ebac9681f-kube-api-access-zlnhj\") pod \"auto-csr-approver-29553088-rb2n9\" (UID: \"53fb8f64-55ce-4656-b544-1d1ebac9681f\") " pod="openshift-infra/auto-csr-approver-29553088-rb2n9" Mar 10 23:28:00 crc kubenswrapper[4919]: I0310 23:28:00.459892 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zlnhj\" (UniqueName: \"kubernetes.io/projected/53fb8f64-55ce-4656-b544-1d1ebac9681f-kube-api-access-zlnhj\") pod \"auto-csr-approver-29553088-rb2n9\" (UID: \"53fb8f64-55ce-4656-b544-1d1ebac9681f\") " 
pod="openshift-infra/auto-csr-approver-29553088-rb2n9" Mar 10 23:28:00 crc kubenswrapper[4919]: I0310 23:28:00.524828 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553088-rb2n9" Mar 10 23:28:00 crc kubenswrapper[4919]: I0310 23:28:00.970662 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553088-rb2n9"] Mar 10 23:28:01 crc kubenswrapper[4919]: I0310 23:28:01.755425 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553088-rb2n9" event={"ID":"53fb8f64-55ce-4656-b544-1d1ebac9681f","Type":"ContainerStarted","Data":"3c42588e4bc02b41c62946f4b36fa8b847c9f56fd924b9cab9abf7ad01ed4de8"} Mar 10 23:28:02 crc kubenswrapper[4919]: I0310 23:28:02.765244 4919 generic.go:334] "Generic (PLEG): container finished" podID="53fb8f64-55ce-4656-b544-1d1ebac9681f" containerID="a8099016ddb7edb6afc54805d838e3f7e0d7fa81a7ba1b1b5d27a0c035dec8a9" exitCode=0 Mar 10 23:28:02 crc kubenswrapper[4919]: I0310 23:28:02.765333 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553088-rb2n9" event={"ID":"53fb8f64-55ce-4656-b544-1d1ebac9681f","Type":"ContainerDied","Data":"a8099016ddb7edb6afc54805d838e3f7e0d7fa81a7ba1b1b5d27a0c035dec8a9"} Mar 10 23:28:04 crc kubenswrapper[4919]: I0310 23:28:04.165434 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553088-rb2n9" Mar 10 23:28:04 crc kubenswrapper[4919]: I0310 23:28:04.316854 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zlnhj\" (UniqueName: \"kubernetes.io/projected/53fb8f64-55ce-4656-b544-1d1ebac9681f-kube-api-access-zlnhj\") pod \"53fb8f64-55ce-4656-b544-1d1ebac9681f\" (UID: \"53fb8f64-55ce-4656-b544-1d1ebac9681f\") " Mar 10 23:28:04 crc kubenswrapper[4919]: I0310 23:28:04.324076 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53fb8f64-55ce-4656-b544-1d1ebac9681f-kube-api-access-zlnhj" (OuterVolumeSpecName: "kube-api-access-zlnhj") pod "53fb8f64-55ce-4656-b544-1d1ebac9681f" (UID: "53fb8f64-55ce-4656-b544-1d1ebac9681f"). InnerVolumeSpecName "kube-api-access-zlnhj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 23:28:04 crc kubenswrapper[4919]: I0310 23:28:04.419151 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zlnhj\" (UniqueName: \"kubernetes.io/projected/53fb8f64-55ce-4656-b544-1d1ebac9681f-kube-api-access-zlnhj\") on node \"crc\" DevicePath \"\"" Mar 10 23:28:04 crc kubenswrapper[4919]: I0310 23:28:04.789277 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553088-rb2n9" event={"ID":"53fb8f64-55ce-4656-b544-1d1ebac9681f","Type":"ContainerDied","Data":"3c42588e4bc02b41c62946f4b36fa8b847c9f56fd924b9cab9abf7ad01ed4de8"} Mar 10 23:28:04 crc kubenswrapper[4919]: I0310 23:28:04.789332 4919 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3c42588e4bc02b41c62946f4b36fa8b847c9f56fd924b9cab9abf7ad01ed4de8" Mar 10 23:28:04 crc kubenswrapper[4919]: I0310 23:28:04.789371 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553088-rb2n9" Mar 10 23:28:05 crc kubenswrapper[4919]: I0310 23:28:05.251479 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553082-w8szq"] Mar 10 23:28:05 crc kubenswrapper[4919]: I0310 23:28:05.261096 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553082-w8szq"] Mar 10 23:28:05 crc kubenswrapper[4919]: I0310 23:28:05.495766 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5887dac0-0c52-4936-8d4d-5616781311c8" path="/var/lib/kubelet/pods/5887dac0-0c52-4936-8d4d-5616781311c8/volumes" Mar 10 23:28:13 crc kubenswrapper[4919]: I0310 23:28:13.491230 4919 scope.go:117] "RemoveContainer" containerID="bd5f980b375940b8ac50763b940d3f98ebbeb27c9473430bf186d0966bbdfefc" Mar 10 23:28:13 crc kubenswrapper[4919]: E0310 23:28:13.492393 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" Mar 10 23:28:25 crc kubenswrapper[4919]: I0310 23:28:25.480755 4919 scope.go:117] "RemoveContainer" containerID="bd5f980b375940b8ac50763b940d3f98ebbeb27c9473430bf186d0966bbdfefc" Mar 10 23:28:25 crc kubenswrapper[4919]: E0310 23:28:25.481617 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" 
podUID="566678d1-f416-4116-ab20-b30dceb86cdc" Mar 10 23:28:37 crc kubenswrapper[4919]: I0310 23:28:37.480898 4919 scope.go:117] "RemoveContainer" containerID="bd5f980b375940b8ac50763b940d3f98ebbeb27c9473430bf186d0966bbdfefc" Mar 10 23:28:37 crc kubenswrapper[4919]: E0310 23:28:37.482033 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" Mar 10 23:28:47 crc kubenswrapper[4919]: I0310 23:28:47.783333 4919 scope.go:117] "RemoveContainer" containerID="d0ad6fa2b2d31bdf18fcff5ca063537748763fa764ab5f9529fff19af1fad576" Mar 10 23:28:50 crc kubenswrapper[4919]: I0310 23:28:50.480841 4919 scope.go:117] "RemoveContainer" containerID="bd5f980b375940b8ac50763b940d3f98ebbeb27c9473430bf186d0966bbdfefc" Mar 10 23:28:50 crc kubenswrapper[4919]: E0310 23:28:50.481850 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" Mar 10 23:29:04 crc kubenswrapper[4919]: I0310 23:29:04.481191 4919 scope.go:117] "RemoveContainer" containerID="bd5f980b375940b8ac50763b940d3f98ebbeb27c9473430bf186d0966bbdfefc" Mar 10 23:29:05 crc kubenswrapper[4919]: I0310 23:29:05.373714 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" 
event={"ID":"566678d1-f416-4116-ab20-b30dceb86cdc","Type":"ContainerStarted","Data":"6381e7ebeadb4b34bbcba77c714c147fb1e92469be16b46f210ac4fed6343795"} Mar 10 23:30:00 crc kubenswrapper[4919]: I0310 23:30:00.165539 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553090-lkq6w"] Mar 10 23:30:00 crc kubenswrapper[4919]: E0310 23:30:00.166654 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53fb8f64-55ce-4656-b544-1d1ebac9681f" containerName="oc" Mar 10 23:30:00 crc kubenswrapper[4919]: I0310 23:30:00.166674 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="53fb8f64-55ce-4656-b544-1d1ebac9681f" containerName="oc" Mar 10 23:30:00 crc kubenswrapper[4919]: I0310 23:30:00.166871 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="53fb8f64-55ce-4656-b544-1d1ebac9681f" containerName="oc" Mar 10 23:30:00 crc kubenswrapper[4919]: I0310 23:30:00.167596 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553090-lkq6w" Mar 10 23:30:00 crc kubenswrapper[4919]: I0310 23:30:00.173365 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 23:30:00 crc kubenswrapper[4919]: I0310 23:30:00.173852 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 23:30:00 crc kubenswrapper[4919]: I0310 23:30:00.173933 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wbvtv" Mar 10 23:30:00 crc kubenswrapper[4919]: I0310 23:30:00.178145 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29553090-lbrcw"] Mar 10 23:30:00 crc kubenswrapper[4919]: I0310 23:30:00.179444 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29553090-lbrcw" Mar 10 23:30:00 crc kubenswrapper[4919]: I0310 23:30:00.184733 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 10 23:30:00 crc kubenswrapper[4919]: I0310 23:30:00.184985 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 10 23:30:00 crc kubenswrapper[4919]: I0310 23:30:00.192860 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553090-lkq6w"] Mar 10 23:30:00 crc kubenswrapper[4919]: I0310 23:30:00.208211 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29553090-lbrcw"] Mar 10 23:30:00 crc kubenswrapper[4919]: I0310 23:30:00.286815 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rslsx\" (UniqueName: \"kubernetes.io/projected/7411634e-354a-4051-a87b-e7c3a45f48b8-kube-api-access-rslsx\") pod \"auto-csr-approver-29553090-lkq6w\" (UID: \"7411634e-354a-4051-a87b-e7c3a45f48b8\") " pod="openshift-infra/auto-csr-approver-29553090-lkq6w" Mar 10 23:30:00 crc kubenswrapper[4919]: I0310 23:30:00.286925 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c9a1fb13-70b3-4610-b5d9-3b5fd57604e4-config-volume\") pod \"collect-profiles-29553090-lbrcw\" (UID: \"c9a1fb13-70b3-4610-b5d9-3b5fd57604e4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553090-lbrcw" Mar 10 23:30:00 crc kubenswrapper[4919]: I0310 23:30:00.286948 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/c9a1fb13-70b3-4610-b5d9-3b5fd57604e4-secret-volume\") pod \"collect-profiles-29553090-lbrcw\" (UID: \"c9a1fb13-70b3-4610-b5d9-3b5fd57604e4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553090-lbrcw" Mar 10 23:30:00 crc kubenswrapper[4919]: I0310 23:30:00.287144 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75q7c\" (UniqueName: \"kubernetes.io/projected/c9a1fb13-70b3-4610-b5d9-3b5fd57604e4-kube-api-access-75q7c\") pod \"collect-profiles-29553090-lbrcw\" (UID: \"c9a1fb13-70b3-4610-b5d9-3b5fd57604e4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553090-lbrcw" Mar 10 23:30:00 crc kubenswrapper[4919]: I0310 23:30:00.388159 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rslsx\" (UniqueName: \"kubernetes.io/projected/7411634e-354a-4051-a87b-e7c3a45f48b8-kube-api-access-rslsx\") pod \"auto-csr-approver-29553090-lkq6w\" (UID: \"7411634e-354a-4051-a87b-e7c3a45f48b8\") " pod="openshift-infra/auto-csr-approver-29553090-lkq6w" Mar 10 23:30:00 crc kubenswrapper[4919]: I0310 23:30:00.388285 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c9a1fb13-70b3-4610-b5d9-3b5fd57604e4-config-volume\") pod \"collect-profiles-29553090-lbrcw\" (UID: \"c9a1fb13-70b3-4610-b5d9-3b5fd57604e4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553090-lbrcw" Mar 10 23:30:00 crc kubenswrapper[4919]: I0310 23:30:00.388305 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c9a1fb13-70b3-4610-b5d9-3b5fd57604e4-secret-volume\") pod \"collect-profiles-29553090-lbrcw\" (UID: \"c9a1fb13-70b3-4610-b5d9-3b5fd57604e4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553090-lbrcw" Mar 10 23:30:00 crc kubenswrapper[4919]: I0310 
23:30:00.388348 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-75q7c\" (UniqueName: \"kubernetes.io/projected/c9a1fb13-70b3-4610-b5d9-3b5fd57604e4-kube-api-access-75q7c\") pod \"collect-profiles-29553090-lbrcw\" (UID: \"c9a1fb13-70b3-4610-b5d9-3b5fd57604e4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553090-lbrcw" Mar 10 23:30:00 crc kubenswrapper[4919]: I0310 23:30:00.389343 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c9a1fb13-70b3-4610-b5d9-3b5fd57604e4-config-volume\") pod \"collect-profiles-29553090-lbrcw\" (UID: \"c9a1fb13-70b3-4610-b5d9-3b5fd57604e4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553090-lbrcw" Mar 10 23:30:00 crc kubenswrapper[4919]: I0310 23:30:00.397580 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c9a1fb13-70b3-4610-b5d9-3b5fd57604e4-secret-volume\") pod \"collect-profiles-29553090-lbrcw\" (UID: \"c9a1fb13-70b3-4610-b5d9-3b5fd57604e4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553090-lbrcw" Mar 10 23:30:00 crc kubenswrapper[4919]: I0310 23:30:00.408284 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-75q7c\" (UniqueName: \"kubernetes.io/projected/c9a1fb13-70b3-4610-b5d9-3b5fd57604e4-kube-api-access-75q7c\") pod \"collect-profiles-29553090-lbrcw\" (UID: \"c9a1fb13-70b3-4610-b5d9-3b5fd57604e4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553090-lbrcw" Mar 10 23:30:00 crc kubenswrapper[4919]: I0310 23:30:00.415197 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rslsx\" (UniqueName: \"kubernetes.io/projected/7411634e-354a-4051-a87b-e7c3a45f48b8-kube-api-access-rslsx\") pod \"auto-csr-approver-29553090-lkq6w\" (UID: \"7411634e-354a-4051-a87b-e7c3a45f48b8\") " 
pod="openshift-infra/auto-csr-approver-29553090-lkq6w" Mar 10 23:30:00 crc kubenswrapper[4919]: I0310 23:30:00.495756 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553090-lkq6w" Mar 10 23:30:00 crc kubenswrapper[4919]: I0310 23:30:00.505126 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29553090-lbrcw" Mar 10 23:30:00 crc kubenswrapper[4919]: I0310 23:30:00.994888 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553090-lkq6w"] Mar 10 23:30:00 crc kubenswrapper[4919]: W0310 23:30:00.998619 4919 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7411634e_354a_4051_a87b_e7c3a45f48b8.slice/crio-dc36910b5cb5cbd1f37b20f7fd43d32d4d40dacc56bce227f17e85b954fcf76f WatchSource:0}: Error finding container dc36910b5cb5cbd1f37b20f7fd43d32d4d40dacc56bce227f17e85b954fcf76f: Status 404 returned error can't find the container with id dc36910b5cb5cbd1f37b20f7fd43d32d4d40dacc56bce227f17e85b954fcf76f Mar 10 23:30:01 crc kubenswrapper[4919]: I0310 23:30:01.082056 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29553090-lbrcw"] Mar 10 23:30:01 crc kubenswrapper[4919]: W0310 23:30:01.084729 4919 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc9a1fb13_70b3_4610_b5d9_3b5fd57604e4.slice/crio-6a7ca1939cccc753dfc44bfa946875455669bd83747e7b4ebac34c37ff005880 WatchSource:0}: Error finding container 6a7ca1939cccc753dfc44bfa946875455669bd83747e7b4ebac34c37ff005880: Status 404 returned error can't find the container with id 6a7ca1939cccc753dfc44bfa946875455669bd83747e7b4ebac34c37ff005880 Mar 10 23:30:01 crc kubenswrapper[4919]: I0310 23:30:01.859978 4919 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553090-lkq6w" event={"ID":"7411634e-354a-4051-a87b-e7c3a45f48b8","Type":"ContainerStarted","Data":"dc36910b5cb5cbd1f37b20f7fd43d32d4d40dacc56bce227f17e85b954fcf76f"} Mar 10 23:30:01 crc kubenswrapper[4919]: I0310 23:30:01.861707 4919 generic.go:334] "Generic (PLEG): container finished" podID="c9a1fb13-70b3-4610-b5d9-3b5fd57604e4" containerID="fa36de31f79de1715d2168af1a4efa77afed362acd2ff102a210d69a98c1a1c0" exitCode=0 Mar 10 23:30:01 crc kubenswrapper[4919]: I0310 23:30:01.861762 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29553090-lbrcw" event={"ID":"c9a1fb13-70b3-4610-b5d9-3b5fd57604e4","Type":"ContainerDied","Data":"fa36de31f79de1715d2168af1a4efa77afed362acd2ff102a210d69a98c1a1c0"} Mar 10 23:30:01 crc kubenswrapper[4919]: I0310 23:30:01.861800 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29553090-lbrcw" event={"ID":"c9a1fb13-70b3-4610-b5d9-3b5fd57604e4","Type":"ContainerStarted","Data":"6a7ca1939cccc753dfc44bfa946875455669bd83747e7b4ebac34c37ff005880"} Mar 10 23:30:03 crc kubenswrapper[4919]: I0310 23:30:03.223250 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29553090-lbrcw" Mar 10 23:30:03 crc kubenswrapper[4919]: I0310 23:30:03.249189 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c9a1fb13-70b3-4610-b5d9-3b5fd57604e4-config-volume\") pod \"c9a1fb13-70b3-4610-b5d9-3b5fd57604e4\" (UID: \"c9a1fb13-70b3-4610-b5d9-3b5fd57604e4\") " Mar 10 23:30:03 crc kubenswrapper[4919]: I0310 23:30:03.249345 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c9a1fb13-70b3-4610-b5d9-3b5fd57604e4-secret-volume\") pod \"c9a1fb13-70b3-4610-b5d9-3b5fd57604e4\" (UID: \"c9a1fb13-70b3-4610-b5d9-3b5fd57604e4\") " Mar 10 23:30:03 crc kubenswrapper[4919]: I0310 23:30:03.249660 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-75q7c\" (UniqueName: \"kubernetes.io/projected/c9a1fb13-70b3-4610-b5d9-3b5fd57604e4-kube-api-access-75q7c\") pod \"c9a1fb13-70b3-4610-b5d9-3b5fd57604e4\" (UID: \"c9a1fb13-70b3-4610-b5d9-3b5fd57604e4\") " Mar 10 23:30:03 crc kubenswrapper[4919]: I0310 23:30:03.250281 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c9a1fb13-70b3-4610-b5d9-3b5fd57604e4-config-volume" (OuterVolumeSpecName: "config-volume") pod "c9a1fb13-70b3-4610-b5d9-3b5fd57604e4" (UID: "c9a1fb13-70b3-4610-b5d9-3b5fd57604e4"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 23:30:03 crc kubenswrapper[4919]: I0310 23:30:03.252406 4919 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c9a1fb13-70b3-4610-b5d9-3b5fd57604e4-config-volume\") on node \"crc\" DevicePath \"\"" Mar 10 23:30:03 crc kubenswrapper[4919]: I0310 23:30:03.256100 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9a1fb13-70b3-4610-b5d9-3b5fd57604e4-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "c9a1fb13-70b3-4610-b5d9-3b5fd57604e4" (UID: "c9a1fb13-70b3-4610-b5d9-3b5fd57604e4"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 23:30:03 crc kubenswrapper[4919]: I0310 23:30:03.256469 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9a1fb13-70b3-4610-b5d9-3b5fd57604e4-kube-api-access-75q7c" (OuterVolumeSpecName: "kube-api-access-75q7c") pod "c9a1fb13-70b3-4610-b5d9-3b5fd57604e4" (UID: "c9a1fb13-70b3-4610-b5d9-3b5fd57604e4"). InnerVolumeSpecName "kube-api-access-75q7c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 23:30:03 crc kubenswrapper[4919]: I0310 23:30:03.354103 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-75q7c\" (UniqueName: \"kubernetes.io/projected/c9a1fb13-70b3-4610-b5d9-3b5fd57604e4-kube-api-access-75q7c\") on node \"crc\" DevicePath \"\"" Mar 10 23:30:03 crc kubenswrapper[4919]: I0310 23:30:03.354138 4919 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c9a1fb13-70b3-4610-b5d9-3b5fd57604e4-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 10 23:30:03 crc kubenswrapper[4919]: I0310 23:30:03.882329 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29553090-lbrcw" event={"ID":"c9a1fb13-70b3-4610-b5d9-3b5fd57604e4","Type":"ContainerDied","Data":"6a7ca1939cccc753dfc44bfa946875455669bd83747e7b4ebac34c37ff005880"} Mar 10 23:30:03 crc kubenswrapper[4919]: I0310 23:30:03.882369 4919 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6a7ca1939cccc753dfc44bfa946875455669bd83747e7b4ebac34c37ff005880" Mar 10 23:30:03 crc kubenswrapper[4919]: I0310 23:30:03.882384 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29553090-lbrcw" Mar 10 23:30:04 crc kubenswrapper[4919]: I0310 23:30:04.328118 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29553045-qjtdg"] Mar 10 23:30:04 crc kubenswrapper[4919]: I0310 23:30:04.341768 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29553045-qjtdg"] Mar 10 23:30:04 crc kubenswrapper[4919]: I0310 23:30:04.890004 4919 generic.go:334] "Generic (PLEG): container finished" podID="7411634e-354a-4051-a87b-e7c3a45f48b8" containerID="72e10cdc972de4f4f56d47da7898831f53cc9d7b6df3c4859bceb81627cf764e" exitCode=0 Mar 10 23:30:04 crc kubenswrapper[4919]: I0310 23:30:04.890043 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553090-lkq6w" event={"ID":"7411634e-354a-4051-a87b-e7c3a45f48b8","Type":"ContainerDied","Data":"72e10cdc972de4f4f56d47da7898831f53cc9d7b6df3c4859bceb81627cf764e"} Mar 10 23:30:05 crc kubenswrapper[4919]: I0310 23:30:05.488768 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae254e6e-926c-44a5-b39e-c8cebf45b5a6" path="/var/lib/kubelet/pods/ae254e6e-926c-44a5-b39e-c8cebf45b5a6/volumes" Mar 10 23:30:06 crc kubenswrapper[4919]: I0310 23:30:06.210053 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553090-lkq6w" Mar 10 23:30:06 crc kubenswrapper[4919]: I0310 23:30:06.309127 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rslsx\" (UniqueName: \"kubernetes.io/projected/7411634e-354a-4051-a87b-e7c3a45f48b8-kube-api-access-rslsx\") pod \"7411634e-354a-4051-a87b-e7c3a45f48b8\" (UID: \"7411634e-354a-4051-a87b-e7c3a45f48b8\") " Mar 10 23:30:06 crc kubenswrapper[4919]: I0310 23:30:06.318086 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7411634e-354a-4051-a87b-e7c3a45f48b8-kube-api-access-rslsx" (OuterVolumeSpecName: "kube-api-access-rslsx") pod "7411634e-354a-4051-a87b-e7c3a45f48b8" (UID: "7411634e-354a-4051-a87b-e7c3a45f48b8"). InnerVolumeSpecName "kube-api-access-rslsx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 23:30:06 crc kubenswrapper[4919]: I0310 23:30:06.410882 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rslsx\" (UniqueName: \"kubernetes.io/projected/7411634e-354a-4051-a87b-e7c3a45f48b8-kube-api-access-rslsx\") on node \"crc\" DevicePath \"\"" Mar 10 23:30:06 crc kubenswrapper[4919]: I0310 23:30:06.910965 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553090-lkq6w" event={"ID":"7411634e-354a-4051-a87b-e7c3a45f48b8","Type":"ContainerDied","Data":"dc36910b5cb5cbd1f37b20f7fd43d32d4d40dacc56bce227f17e85b954fcf76f"} Mar 10 23:30:06 crc kubenswrapper[4919]: I0310 23:30:06.911029 4919 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dc36910b5cb5cbd1f37b20f7fd43d32d4d40dacc56bce227f17e85b954fcf76f" Mar 10 23:30:06 crc kubenswrapper[4919]: I0310 23:30:06.911087 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553090-lkq6w" Mar 10 23:30:07 crc kubenswrapper[4919]: I0310 23:30:07.274325 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553084-x72px"] Mar 10 23:30:07 crc kubenswrapper[4919]: I0310 23:30:07.280021 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553084-x72px"] Mar 10 23:30:07 crc kubenswrapper[4919]: I0310 23:30:07.494308 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="861c7fae-c56f-4836-b171-bf20f341437d" path="/var/lib/kubelet/pods/861c7fae-c56f-4836-b171-bf20f341437d/volumes" Mar 10 23:30:47 crc kubenswrapper[4919]: I0310 23:30:47.932592 4919 scope.go:117] "RemoveContainer" containerID="125054635d5dca106b5c76648e9ff90d862350067b398dade3d763eacfc74aba" Mar 10 23:30:47 crc kubenswrapper[4919]: I0310 23:30:47.973907 4919 scope.go:117] "RemoveContainer" containerID="bfeb4f6e055eb98f397864a2be27b7b069aae2fe12f7686eaf704a8d64d702cb" Mar 10 23:31:28 crc kubenswrapper[4919]: I0310 23:31:28.054785 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-ck5qr"] Mar 10 23:31:28 crc kubenswrapper[4919]: I0310 23:31:28.062773 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-18ba-account-create-update-2wmzn"] Mar 10 23:31:28 crc kubenswrapper[4919]: I0310 23:31:28.068726 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-18ba-account-create-update-2wmzn"] Mar 10 23:31:28 crc kubenswrapper[4919]: I0310 23:31:28.084381 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-ck5qr"] Mar 10 23:31:29 crc kubenswrapper[4919]: I0310 23:31:29.175620 4919 patch_prober.go:28] interesting pod/machine-config-daemon-z7v4t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial 
tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 23:31:29 crc kubenswrapper[4919]: I0310 23:31:29.175687 4919 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 23:31:29 crc kubenswrapper[4919]: I0310 23:31:29.499313 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1df0c044-9ad4-4c72-bf4c-7bd4f5ce723c" path="/var/lib/kubelet/pods/1df0c044-9ad4-4c72-bf4c-7bd4f5ce723c/volumes" Mar 10 23:31:29 crc kubenswrapper[4919]: I0310 23:31:29.500588 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="526d94b1-75fd-42aa-a1bb-829018a77826" path="/var/lib/kubelet/pods/526d94b1-75fd-42aa-a1bb-829018a77826/volumes" Mar 10 23:31:36 crc kubenswrapper[4919]: I0310 23:31:36.053738 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-6cvb8"] Mar 10 23:31:36 crc kubenswrapper[4919]: I0310 23:31:36.062387 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-6cvb8"] Mar 10 23:31:37 crc kubenswrapper[4919]: I0310 23:31:37.497786 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7cd5ae8b-b383-4d88-945d-4b494b3e322e" path="/var/lib/kubelet/pods/7cd5ae8b-b383-4d88-945d-4b494b3e322e/volumes" Mar 10 23:31:48 crc kubenswrapper[4919]: I0310 23:31:48.063498 4919 scope.go:117] "RemoveContainer" containerID="28b04fa300293e024a7efe611602004613b6896d1434a6f92d09d617ccbeaedb" Mar 10 23:31:48 crc kubenswrapper[4919]: I0310 23:31:48.098103 4919 scope.go:117] "RemoveContainer" containerID="06195add5c6c87f0958d7c873603cf9e3bfc699a62ce88aa893f5c12f04a5554" Mar 10 23:31:48 crc kubenswrapper[4919]: I0310 23:31:48.176236 4919 scope.go:117] "RemoveContainer" 
containerID="4473f8e69522b250c4e26eeb3afad1666e65f2fe01029b72390dc8bdeab9382c" Mar 10 23:31:51 crc kubenswrapper[4919]: I0310 23:31:51.034480 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-vgm8x"] Mar 10 23:31:51 crc kubenswrapper[4919]: I0310 23:31:51.044765 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-vgm8x"] Mar 10 23:31:51 crc kubenswrapper[4919]: I0310 23:31:51.497078 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd1e6096-3ede-4a8d-a82c-6c919dafb2d8" path="/var/lib/kubelet/pods/dd1e6096-3ede-4a8d-a82c-6c919dafb2d8/volumes" Mar 10 23:31:59 crc kubenswrapper[4919]: I0310 23:31:59.175619 4919 patch_prober.go:28] interesting pod/machine-config-daemon-z7v4t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 23:31:59 crc kubenswrapper[4919]: I0310 23:31:59.176106 4919 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 23:32:00 crc kubenswrapper[4919]: I0310 23:32:00.164868 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553092-fdlds"] Mar 10 23:32:00 crc kubenswrapper[4919]: E0310 23:32:00.165375 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7411634e-354a-4051-a87b-e7c3a45f48b8" containerName="oc" Mar 10 23:32:00 crc kubenswrapper[4919]: I0310 23:32:00.165834 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="7411634e-354a-4051-a87b-e7c3a45f48b8" containerName="oc" Mar 10 23:32:00 crc kubenswrapper[4919]: E0310 
23:32:00.165898 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9a1fb13-70b3-4610-b5d9-3b5fd57604e4" containerName="collect-profiles" Mar 10 23:32:00 crc kubenswrapper[4919]: I0310 23:32:00.165917 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9a1fb13-70b3-4610-b5d9-3b5fd57604e4" containerName="collect-profiles" Mar 10 23:32:00 crc kubenswrapper[4919]: I0310 23:32:00.166210 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9a1fb13-70b3-4610-b5d9-3b5fd57604e4" containerName="collect-profiles" Mar 10 23:32:00 crc kubenswrapper[4919]: I0310 23:32:00.166255 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="7411634e-354a-4051-a87b-e7c3a45f48b8" containerName="oc" Mar 10 23:32:00 crc kubenswrapper[4919]: I0310 23:32:00.167231 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553092-fdlds" Mar 10 23:32:00 crc kubenswrapper[4919]: I0310 23:32:00.176114 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 23:32:00 crc kubenswrapper[4919]: I0310 23:32:00.176816 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wbvtv" Mar 10 23:32:00 crc kubenswrapper[4919]: I0310 23:32:00.180845 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 23:32:00 crc kubenswrapper[4919]: I0310 23:32:00.184522 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553092-fdlds"] Mar 10 23:32:00 crc kubenswrapper[4919]: I0310 23:32:00.250667 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-knd9s\" (UniqueName: \"kubernetes.io/projected/74a92de7-baac-4521-bf31-cb416ae4bd54-kube-api-access-knd9s\") pod \"auto-csr-approver-29553092-fdlds\" (UID: 
\"74a92de7-baac-4521-bf31-cb416ae4bd54\") " pod="openshift-infra/auto-csr-approver-29553092-fdlds" Mar 10 23:32:00 crc kubenswrapper[4919]: I0310 23:32:00.352586 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-knd9s\" (UniqueName: \"kubernetes.io/projected/74a92de7-baac-4521-bf31-cb416ae4bd54-kube-api-access-knd9s\") pod \"auto-csr-approver-29553092-fdlds\" (UID: \"74a92de7-baac-4521-bf31-cb416ae4bd54\") " pod="openshift-infra/auto-csr-approver-29553092-fdlds" Mar 10 23:32:00 crc kubenswrapper[4919]: I0310 23:32:00.391864 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-knd9s\" (UniqueName: \"kubernetes.io/projected/74a92de7-baac-4521-bf31-cb416ae4bd54-kube-api-access-knd9s\") pod \"auto-csr-approver-29553092-fdlds\" (UID: \"74a92de7-baac-4521-bf31-cb416ae4bd54\") " pod="openshift-infra/auto-csr-approver-29553092-fdlds" Mar 10 23:32:00 crc kubenswrapper[4919]: I0310 23:32:00.508644 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553092-fdlds" Mar 10 23:32:01 crc kubenswrapper[4919]: I0310 23:32:01.037096 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553092-fdlds"] Mar 10 23:32:01 crc kubenswrapper[4919]: I0310 23:32:01.997104 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553092-fdlds" event={"ID":"74a92de7-baac-4521-bf31-cb416ae4bd54","Type":"ContainerStarted","Data":"d7086505216b8973c2619f1c87b4d69d3b31e7d7ae4dcff46fdaaabaeaa09f03"} Mar 10 23:32:03 crc kubenswrapper[4919]: I0310 23:32:03.008524 4919 generic.go:334] "Generic (PLEG): container finished" podID="74a92de7-baac-4521-bf31-cb416ae4bd54" containerID="ca04b3b9e1245c4edc564fd540ff1da04cf8356d5a19f30ecc12ae162c0b54c8" exitCode=0 Mar 10 23:32:03 crc kubenswrapper[4919]: I0310 23:32:03.008652 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553092-fdlds" event={"ID":"74a92de7-baac-4521-bf31-cb416ae4bd54","Type":"ContainerDied","Data":"ca04b3b9e1245c4edc564fd540ff1da04cf8356d5a19f30ecc12ae162c0b54c8"} Mar 10 23:32:04 crc kubenswrapper[4919]: I0310 23:32:04.395035 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553092-fdlds" Mar 10 23:32:04 crc kubenswrapper[4919]: I0310 23:32:04.531151 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-knd9s\" (UniqueName: \"kubernetes.io/projected/74a92de7-baac-4521-bf31-cb416ae4bd54-kube-api-access-knd9s\") pod \"74a92de7-baac-4521-bf31-cb416ae4bd54\" (UID: \"74a92de7-baac-4521-bf31-cb416ae4bd54\") " Mar 10 23:32:04 crc kubenswrapper[4919]: I0310 23:32:04.536837 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74a92de7-baac-4521-bf31-cb416ae4bd54-kube-api-access-knd9s" (OuterVolumeSpecName: "kube-api-access-knd9s") pod "74a92de7-baac-4521-bf31-cb416ae4bd54" (UID: "74a92de7-baac-4521-bf31-cb416ae4bd54"). InnerVolumeSpecName "kube-api-access-knd9s". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 23:32:04 crc kubenswrapper[4919]: I0310 23:32:04.633541 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-knd9s\" (UniqueName: \"kubernetes.io/projected/74a92de7-baac-4521-bf31-cb416ae4bd54-kube-api-access-knd9s\") on node \"crc\" DevicePath \"\"" Mar 10 23:32:05 crc kubenswrapper[4919]: I0310 23:32:05.027965 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553092-fdlds" event={"ID":"74a92de7-baac-4521-bf31-cb416ae4bd54","Type":"ContainerDied","Data":"d7086505216b8973c2619f1c87b4d69d3b31e7d7ae4dcff46fdaaabaeaa09f03"} Mar 10 23:32:05 crc kubenswrapper[4919]: I0310 23:32:05.028215 4919 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d7086505216b8973c2619f1c87b4d69d3b31e7d7ae4dcff46fdaaabaeaa09f03" Mar 10 23:32:05 crc kubenswrapper[4919]: I0310 23:32:05.028024 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553092-fdlds" Mar 10 23:32:05 crc kubenswrapper[4919]: I0310 23:32:05.534318 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553086-wx4c6"] Mar 10 23:32:05 crc kubenswrapper[4919]: I0310 23:32:05.542687 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553086-wx4c6"] Mar 10 23:32:07 crc kubenswrapper[4919]: I0310 23:32:07.497062 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9db97bbc-a7b8-4e49-80ad-0e20f0217752" path="/var/lib/kubelet/pods/9db97bbc-a7b8-4e49-80ad-0e20f0217752/volumes" Mar 10 23:32:17 crc kubenswrapper[4919]: I0310 23:32:17.040162 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-fn4j5"] Mar 10 23:32:17 crc kubenswrapper[4919]: E0310 23:32:17.041252 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74a92de7-baac-4521-bf31-cb416ae4bd54" containerName="oc" Mar 10 23:32:17 crc kubenswrapper[4919]: I0310 23:32:17.041273 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="74a92de7-baac-4521-bf31-cb416ae4bd54" containerName="oc" Mar 10 23:32:17 crc kubenswrapper[4919]: I0310 23:32:17.041611 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="74a92de7-baac-4521-bf31-cb416ae4bd54" containerName="oc" Mar 10 23:32:17 crc kubenswrapper[4919]: I0310 23:32:17.047092 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-fn4j5" Mar 10 23:32:17 crc kubenswrapper[4919]: I0310 23:32:17.053591 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-fn4j5"] Mar 10 23:32:17 crc kubenswrapper[4919]: I0310 23:32:17.206946 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b346eb61-7a84-4815-8f0c-57802ae5d396-utilities\") pod \"certified-operators-fn4j5\" (UID: \"b346eb61-7a84-4815-8f0c-57802ae5d396\") " pod="openshift-marketplace/certified-operators-fn4j5" Mar 10 23:32:17 crc kubenswrapper[4919]: I0310 23:32:17.207048 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b346eb61-7a84-4815-8f0c-57802ae5d396-catalog-content\") pod \"certified-operators-fn4j5\" (UID: \"b346eb61-7a84-4815-8f0c-57802ae5d396\") " pod="openshift-marketplace/certified-operators-fn4j5" Mar 10 23:32:17 crc kubenswrapper[4919]: I0310 23:32:17.207218 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hg8nm\" (UniqueName: \"kubernetes.io/projected/b346eb61-7a84-4815-8f0c-57802ae5d396-kube-api-access-hg8nm\") pod \"certified-operators-fn4j5\" (UID: \"b346eb61-7a84-4815-8f0c-57802ae5d396\") " pod="openshift-marketplace/certified-operators-fn4j5" Mar 10 23:32:17 crc kubenswrapper[4919]: I0310 23:32:17.309278 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b346eb61-7a84-4815-8f0c-57802ae5d396-utilities\") pod \"certified-operators-fn4j5\" (UID: \"b346eb61-7a84-4815-8f0c-57802ae5d396\") " pod="openshift-marketplace/certified-operators-fn4j5" Mar 10 23:32:17 crc kubenswrapper[4919]: I0310 23:32:17.309339 4919 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b346eb61-7a84-4815-8f0c-57802ae5d396-catalog-content\") pod \"certified-operators-fn4j5\" (UID: \"b346eb61-7a84-4815-8f0c-57802ae5d396\") " pod="openshift-marketplace/certified-operators-fn4j5"
Mar 10 23:32:17 crc kubenswrapper[4919]: I0310 23:32:17.309363 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hg8nm\" (UniqueName: \"kubernetes.io/projected/b346eb61-7a84-4815-8f0c-57802ae5d396-kube-api-access-hg8nm\") pod \"certified-operators-fn4j5\" (UID: \"b346eb61-7a84-4815-8f0c-57802ae5d396\") " pod="openshift-marketplace/certified-operators-fn4j5"
Mar 10 23:32:17 crc kubenswrapper[4919]: I0310 23:32:17.310176 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b346eb61-7a84-4815-8f0c-57802ae5d396-utilities\") pod \"certified-operators-fn4j5\" (UID: \"b346eb61-7a84-4815-8f0c-57802ae5d396\") " pod="openshift-marketplace/certified-operators-fn4j5"
Mar 10 23:32:17 crc kubenswrapper[4919]: I0310 23:32:17.310404 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b346eb61-7a84-4815-8f0c-57802ae5d396-catalog-content\") pod \"certified-operators-fn4j5\" (UID: \"b346eb61-7a84-4815-8f0c-57802ae5d396\") " pod="openshift-marketplace/certified-operators-fn4j5"
Mar 10 23:32:17 crc kubenswrapper[4919]: I0310 23:32:17.338228 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hg8nm\" (UniqueName: \"kubernetes.io/projected/b346eb61-7a84-4815-8f0c-57802ae5d396-kube-api-access-hg8nm\") pod \"certified-operators-fn4j5\" (UID: \"b346eb61-7a84-4815-8f0c-57802ae5d396\") " pod="openshift-marketplace/certified-operators-fn4j5"
Mar 10 23:32:17 crc kubenswrapper[4919]: I0310 23:32:17.376888 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-fn4j5"
Mar 10 23:32:17 crc kubenswrapper[4919]: I0310 23:32:17.884216 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-fn4j5"]
Mar 10 23:32:18 crc kubenswrapper[4919]: I0310 23:32:18.149173 4919 generic.go:334] "Generic (PLEG): container finished" podID="b346eb61-7a84-4815-8f0c-57802ae5d396" containerID="91ec06a5d6ade2d97b84f37e80bacc706b634dab3a47b39677c9ba2b287a7917" exitCode=0
Mar 10 23:32:18 crc kubenswrapper[4919]: I0310 23:32:18.149249 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fn4j5" event={"ID":"b346eb61-7a84-4815-8f0c-57802ae5d396","Type":"ContainerDied","Data":"91ec06a5d6ade2d97b84f37e80bacc706b634dab3a47b39677c9ba2b287a7917"}
Mar 10 23:32:18 crc kubenswrapper[4919]: I0310 23:32:18.149280 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fn4j5" event={"ID":"b346eb61-7a84-4815-8f0c-57802ae5d396","Type":"ContainerStarted","Data":"1f4c27203135636a6a46eee030f0a1d6af041665a259c9f31b3be3316d6a13e4"}
Mar 10 23:32:20 crc kubenswrapper[4919]: I0310 23:32:20.169558 4919 generic.go:334] "Generic (PLEG): container finished" podID="b346eb61-7a84-4815-8f0c-57802ae5d396" containerID="d56427461c28a8b6bba6a9f42f766317b3c91785dd1ebb9c67a43871bb2b5013" exitCode=0
Mar 10 23:32:20 crc kubenswrapper[4919]: I0310 23:32:20.169755 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fn4j5" event={"ID":"b346eb61-7a84-4815-8f0c-57802ae5d396","Type":"ContainerDied","Data":"d56427461c28a8b6bba6a9f42f766317b3c91785dd1ebb9c67a43871bb2b5013"}
Mar 10 23:32:21 crc kubenswrapper[4919]: I0310 23:32:21.182248 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fn4j5" event={"ID":"b346eb61-7a84-4815-8f0c-57802ae5d396","Type":"ContainerStarted","Data":"fd142aaa573d119ac6a2889b7e671bf56e9e71d7fc3b14f60149e3d83b44e1fe"}
Mar 10 23:32:21 crc kubenswrapper[4919]: I0310 23:32:21.207733 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-fn4j5" podStartSLOduration=1.773331725 podStartE2EDuration="4.207709008s" podCreationTimestamp="2026-03-10 23:32:17 +0000 UTC" firstStartedPulling="2026-03-10 23:32:18.151479717 +0000 UTC m=+6125.393360365" lastFinishedPulling="2026-03-10 23:32:20.585857 +0000 UTC m=+6127.827737648" observedRunningTime="2026-03-10 23:32:21.200074567 +0000 UTC m=+6128.441955175" watchObservedRunningTime="2026-03-10 23:32:21.207709008 +0000 UTC m=+6128.449589616"
Mar 10 23:32:27 crc kubenswrapper[4919]: I0310 23:32:27.378036 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-fn4j5"
Mar 10 23:32:27 crc kubenswrapper[4919]: I0310 23:32:27.379235 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-fn4j5"
Mar 10 23:32:27 crc kubenswrapper[4919]: I0310 23:32:27.445197 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-fn4j5"
Mar 10 23:32:28 crc kubenswrapper[4919]: I0310 23:32:28.298230 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-fn4j5"
Mar 10 23:32:28 crc kubenswrapper[4919]: I0310 23:32:28.362967 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-fn4j5"]
Mar 10 23:32:29 crc kubenswrapper[4919]: I0310 23:32:29.175567 4919 patch_prober.go:28] interesting pod/machine-config-daemon-z7v4t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 10 23:32:29 crc kubenswrapper[4919]: I0310 23:32:29.175637 4919 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 10 23:32:29 crc kubenswrapper[4919]: I0310 23:32:29.175697 4919 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t"
Mar 10 23:32:29 crc kubenswrapper[4919]: I0310 23:32:29.176515 4919 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6381e7ebeadb4b34bbcba77c714c147fb1e92469be16b46f210ac4fed6343795"} pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 10 23:32:29 crc kubenswrapper[4919]: I0310 23:32:29.176622 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" containerName="machine-config-daemon" containerID="cri-o://6381e7ebeadb4b34bbcba77c714c147fb1e92469be16b46f210ac4fed6343795" gracePeriod=600
Mar 10 23:32:30 crc kubenswrapper[4919]: I0310 23:32:30.276435 4919 generic.go:334] "Generic (PLEG): container finished" podID="566678d1-f416-4116-ab20-b30dceb86cdc" containerID="6381e7ebeadb4b34bbcba77c714c147fb1e92469be16b46f210ac4fed6343795" exitCode=0
Mar 10 23:32:30 crc kubenswrapper[4919]: I0310 23:32:30.276532 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" event={"ID":"566678d1-f416-4116-ab20-b30dceb86cdc","Type":"ContainerDied","Data":"6381e7ebeadb4b34bbcba77c714c147fb1e92469be16b46f210ac4fed6343795"}
Mar 10 23:32:30 crc kubenswrapper[4919]: I0310 23:32:30.276859 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" event={"ID":"566678d1-f416-4116-ab20-b30dceb86cdc","Type":"ContainerStarted","Data":"01b5110cee4e2da7f2a13bbbe666538ca45148371492ad70b005d779b7734aee"}
Mar 10 23:32:30 crc kubenswrapper[4919]: I0310 23:32:30.276891 4919 scope.go:117] "RemoveContainer" containerID="bd5f980b375940b8ac50763b940d3f98ebbeb27c9473430bf186d0966bbdfefc"
Mar 10 23:32:30 crc kubenswrapper[4919]: I0310 23:32:30.277011 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-fn4j5" podUID="b346eb61-7a84-4815-8f0c-57802ae5d396" containerName="registry-server" containerID="cri-o://fd142aaa573d119ac6a2889b7e671bf56e9e71d7fc3b14f60149e3d83b44e1fe" gracePeriod=2
Mar 10 23:32:31 crc kubenswrapper[4919]: I0310 23:32:31.290863 4919 generic.go:334] "Generic (PLEG): container finished" podID="b346eb61-7a84-4815-8f0c-57802ae5d396" containerID="fd142aaa573d119ac6a2889b7e671bf56e9e71d7fc3b14f60149e3d83b44e1fe" exitCode=0
Mar 10 23:32:31 crc kubenswrapper[4919]: I0310 23:32:31.290964 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fn4j5" event={"ID":"b346eb61-7a84-4815-8f0c-57802ae5d396","Type":"ContainerDied","Data":"fd142aaa573d119ac6a2889b7e671bf56e9e71d7fc3b14f60149e3d83b44e1fe"}
Mar 10 23:32:31 crc kubenswrapper[4919]: I0310 23:32:31.291676 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fn4j5" event={"ID":"b346eb61-7a84-4815-8f0c-57802ae5d396","Type":"ContainerDied","Data":"1f4c27203135636a6a46eee030f0a1d6af041665a259c9f31b3be3316d6a13e4"}
Mar 10 23:32:31 crc kubenswrapper[4919]: I0310 23:32:31.291697 4919 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1f4c27203135636a6a46eee030f0a1d6af041665a259c9f31b3be3316d6a13e4"
Mar 10 23:32:31 crc kubenswrapper[4919]: I0310 23:32:31.291571 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-fn4j5"
Mar 10 23:32:31 crc kubenswrapper[4919]: I0310 23:32:31.395602 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hg8nm\" (UniqueName: \"kubernetes.io/projected/b346eb61-7a84-4815-8f0c-57802ae5d396-kube-api-access-hg8nm\") pod \"b346eb61-7a84-4815-8f0c-57802ae5d396\" (UID: \"b346eb61-7a84-4815-8f0c-57802ae5d396\") "
Mar 10 23:32:31 crc kubenswrapper[4919]: I0310 23:32:31.395667 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b346eb61-7a84-4815-8f0c-57802ae5d396-catalog-content\") pod \"b346eb61-7a84-4815-8f0c-57802ae5d396\" (UID: \"b346eb61-7a84-4815-8f0c-57802ae5d396\") "
Mar 10 23:32:31 crc kubenswrapper[4919]: I0310 23:32:31.395710 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b346eb61-7a84-4815-8f0c-57802ae5d396-utilities\") pod \"b346eb61-7a84-4815-8f0c-57802ae5d396\" (UID: \"b346eb61-7a84-4815-8f0c-57802ae5d396\") "
Mar 10 23:32:31 crc kubenswrapper[4919]: I0310 23:32:31.397086 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b346eb61-7a84-4815-8f0c-57802ae5d396-utilities" (OuterVolumeSpecName: "utilities") pod "b346eb61-7a84-4815-8f0c-57802ae5d396" (UID: "b346eb61-7a84-4815-8f0c-57802ae5d396"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 23:32:31 crc kubenswrapper[4919]: I0310 23:32:31.401429 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b346eb61-7a84-4815-8f0c-57802ae5d396-kube-api-access-hg8nm" (OuterVolumeSpecName: "kube-api-access-hg8nm") pod "b346eb61-7a84-4815-8f0c-57802ae5d396" (UID: "b346eb61-7a84-4815-8f0c-57802ae5d396"). InnerVolumeSpecName "kube-api-access-hg8nm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 23:32:31 crc kubenswrapper[4919]: I0310 23:32:31.457761 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b346eb61-7a84-4815-8f0c-57802ae5d396-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b346eb61-7a84-4815-8f0c-57802ae5d396" (UID: "b346eb61-7a84-4815-8f0c-57802ae5d396"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 23:32:31 crc kubenswrapper[4919]: I0310 23:32:31.497019 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hg8nm\" (UniqueName: \"kubernetes.io/projected/b346eb61-7a84-4815-8f0c-57802ae5d396-kube-api-access-hg8nm\") on node \"crc\" DevicePath \"\""
Mar 10 23:32:31 crc kubenswrapper[4919]: I0310 23:32:31.497048 4919 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b346eb61-7a84-4815-8f0c-57802ae5d396-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 10 23:32:31 crc kubenswrapper[4919]: I0310 23:32:31.497057 4919 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b346eb61-7a84-4815-8f0c-57802ae5d396-utilities\") on node \"crc\" DevicePath \"\""
Mar 10 23:32:32 crc kubenswrapper[4919]: I0310 23:32:32.306584 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-fn4j5"
Mar 10 23:32:32 crc kubenswrapper[4919]: I0310 23:32:32.336310 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-fn4j5"]
Mar 10 23:32:32 crc kubenswrapper[4919]: I0310 23:32:32.343888 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-fn4j5"]
Mar 10 23:32:33 crc kubenswrapper[4919]: I0310 23:32:33.503008 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b346eb61-7a84-4815-8f0c-57802ae5d396" path="/var/lib/kubelet/pods/b346eb61-7a84-4815-8f0c-57802ae5d396/volumes"
Mar 10 23:32:48 crc kubenswrapper[4919]: I0310 23:32:48.261340 4919 scope.go:117] "RemoveContainer" containerID="aa1022fa595f8ec918291d8c569411d8edd6ca76b2a06abc402db412276c4b86"
Mar 10 23:32:48 crc kubenswrapper[4919]: I0310 23:32:48.322806 4919 scope.go:117] "RemoveContainer" containerID="f016f11200fc339df91ae6221b4cef87a85ec49086f3eb37888b52bb86244099"
Mar 10 23:34:00 crc kubenswrapper[4919]: I0310 23:34:00.185762 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553094-tqc5l"]
Mar 10 23:34:00 crc kubenswrapper[4919]: E0310 23:34:00.186856 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b346eb61-7a84-4815-8f0c-57802ae5d396" containerName="registry-server"
Mar 10 23:34:00 crc kubenswrapper[4919]: I0310 23:34:00.186876 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="b346eb61-7a84-4815-8f0c-57802ae5d396" containerName="registry-server"
Mar 10 23:34:00 crc kubenswrapper[4919]: E0310 23:34:00.186897 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b346eb61-7a84-4815-8f0c-57802ae5d396" containerName="extract-content"
Mar 10 23:34:00 crc kubenswrapper[4919]: I0310 23:34:00.186905 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="b346eb61-7a84-4815-8f0c-57802ae5d396" containerName="extract-content"
Mar 10 23:34:00 crc kubenswrapper[4919]: E0310 23:34:00.186937 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b346eb61-7a84-4815-8f0c-57802ae5d396" containerName="extract-utilities"
Mar 10 23:34:00 crc kubenswrapper[4919]: I0310 23:34:00.186946 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="b346eb61-7a84-4815-8f0c-57802ae5d396" containerName="extract-utilities"
Mar 10 23:34:00 crc kubenswrapper[4919]: I0310 23:34:00.187151 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="b346eb61-7a84-4815-8f0c-57802ae5d396" containerName="registry-server"
Mar 10 23:34:00 crc kubenswrapper[4919]: I0310 23:34:00.187813 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553094-tqc5l"
Mar 10 23:34:00 crc kubenswrapper[4919]: I0310 23:34:00.190550 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 10 23:34:00 crc kubenswrapper[4919]: I0310 23:34:00.190751 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 10 23:34:00 crc kubenswrapper[4919]: I0310 23:34:00.190931 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wbvtv"
Mar 10 23:34:00 crc kubenswrapper[4919]: I0310 23:34:00.196860 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553094-tqc5l"]
Mar 10 23:34:00 crc kubenswrapper[4919]: I0310 23:34:00.245812 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r86nv\" (UniqueName: \"kubernetes.io/projected/8c095361-8631-4a6c-8a04-a9f4a735880a-kube-api-access-r86nv\") pod \"auto-csr-approver-29553094-tqc5l\" (UID: \"8c095361-8631-4a6c-8a04-a9f4a735880a\") " pod="openshift-infra/auto-csr-approver-29553094-tqc5l"
Mar 10 23:34:00 crc kubenswrapper[4919]: I0310 23:34:00.347661 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r86nv\" (UniqueName: \"kubernetes.io/projected/8c095361-8631-4a6c-8a04-a9f4a735880a-kube-api-access-r86nv\") pod \"auto-csr-approver-29553094-tqc5l\" (UID: \"8c095361-8631-4a6c-8a04-a9f4a735880a\") " pod="openshift-infra/auto-csr-approver-29553094-tqc5l"
Mar 10 23:34:00 crc kubenswrapper[4919]: I0310 23:34:00.373798 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r86nv\" (UniqueName: \"kubernetes.io/projected/8c095361-8631-4a6c-8a04-a9f4a735880a-kube-api-access-r86nv\") pod \"auto-csr-approver-29553094-tqc5l\" (UID: \"8c095361-8631-4a6c-8a04-a9f4a735880a\") " pod="openshift-infra/auto-csr-approver-29553094-tqc5l"
Mar 10 23:34:00 crc kubenswrapper[4919]: I0310 23:34:00.519935 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553094-tqc5l"
Mar 10 23:34:01 crc kubenswrapper[4919]: I0310 23:34:01.040235 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553094-tqc5l"]
Mar 10 23:34:01 crc kubenswrapper[4919]: I0310 23:34:01.051894 4919 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 10 23:34:01 crc kubenswrapper[4919]: I0310 23:34:01.190964 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553094-tqc5l" event={"ID":"8c095361-8631-4a6c-8a04-a9f4a735880a","Type":"ContainerStarted","Data":"184078f9c92566a1e993b5a778d7af76bdb865c6214c0814fffec3031e6b2c47"}
Mar 10 23:34:03 crc kubenswrapper[4919]: I0310 23:34:03.210888 4919 generic.go:334] "Generic (PLEG): container finished" podID="8c095361-8631-4a6c-8a04-a9f4a735880a" containerID="3cadf2e371e8f81ed5b8ee408bfe8cb4fb6c1841dde8bc0a7c63e38a481a7c0b" exitCode=0
Mar 10 23:34:03 crc kubenswrapper[4919]: I0310 23:34:03.211175 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553094-tqc5l" event={"ID":"8c095361-8631-4a6c-8a04-a9f4a735880a","Type":"ContainerDied","Data":"3cadf2e371e8f81ed5b8ee408bfe8cb4fb6c1841dde8bc0a7c63e38a481a7c0b"}
Mar 10 23:34:04 crc kubenswrapper[4919]: I0310 23:34:04.537931 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553094-tqc5l"
Mar 10 23:34:04 crc kubenswrapper[4919]: I0310 23:34:04.634629 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r86nv\" (UniqueName: \"kubernetes.io/projected/8c095361-8631-4a6c-8a04-a9f4a735880a-kube-api-access-r86nv\") pod \"8c095361-8631-4a6c-8a04-a9f4a735880a\" (UID: \"8c095361-8631-4a6c-8a04-a9f4a735880a\") "
Mar 10 23:34:04 crc kubenswrapper[4919]: I0310 23:34:04.639994 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c095361-8631-4a6c-8a04-a9f4a735880a-kube-api-access-r86nv" (OuterVolumeSpecName: "kube-api-access-r86nv") pod "8c095361-8631-4a6c-8a04-a9f4a735880a" (UID: "8c095361-8631-4a6c-8a04-a9f4a735880a"). InnerVolumeSpecName "kube-api-access-r86nv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 23:34:04 crc kubenswrapper[4919]: I0310 23:34:04.735817 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r86nv\" (UniqueName: \"kubernetes.io/projected/8c095361-8631-4a6c-8a04-a9f4a735880a-kube-api-access-r86nv\") on node \"crc\" DevicePath \"\""
Mar 10 23:34:05 crc kubenswrapper[4919]: I0310 23:34:05.240553 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553094-tqc5l" event={"ID":"8c095361-8631-4a6c-8a04-a9f4a735880a","Type":"ContainerDied","Data":"184078f9c92566a1e993b5a778d7af76bdb865c6214c0814fffec3031e6b2c47"}
Mar 10 23:34:05 crc kubenswrapper[4919]: I0310 23:34:05.240605 4919 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="184078f9c92566a1e993b5a778d7af76bdb865c6214c0814fffec3031e6b2c47"
Mar 10 23:34:05 crc kubenswrapper[4919]: I0310 23:34:05.240726 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553094-tqc5l"
Mar 10 23:34:05 crc kubenswrapper[4919]: I0310 23:34:05.631012 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553088-rb2n9"]
Mar 10 23:34:05 crc kubenswrapper[4919]: I0310 23:34:05.639681 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553088-rb2n9"]
Mar 10 23:34:07 crc kubenswrapper[4919]: I0310 23:34:07.496129 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53fb8f64-55ce-4656-b544-1d1ebac9681f" path="/var/lib/kubelet/pods/53fb8f64-55ce-4656-b544-1d1ebac9681f/volumes"
Mar 10 23:34:48 crc kubenswrapper[4919]: I0310 23:34:48.443455 4919 scope.go:117] "RemoveContainer" containerID="a8099016ddb7edb6afc54805d838e3f7e0d7fa81a7ba1b1b5d27a0c035dec8a9"
Mar 10 23:34:59 crc kubenswrapper[4919]: I0310 23:34:59.175296 4919 patch_prober.go:28] interesting pod/machine-config-daemon-z7v4t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 10 23:34:59 crc kubenswrapper[4919]: I0310 23:34:59.175908 4919 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 10 23:35:29 crc kubenswrapper[4919]: I0310 23:35:29.175819 4919 patch_prober.go:28] interesting pod/machine-config-daemon-z7v4t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 10 23:35:29 crc kubenswrapper[4919]: I0310 23:35:29.176498 4919 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 10 23:35:59 crc kubenswrapper[4919]: I0310 23:35:59.175894 4919 patch_prober.go:28] interesting pod/machine-config-daemon-z7v4t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 10 23:35:59 crc kubenswrapper[4919]: I0310 23:35:59.176595 4919 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 10 23:35:59 crc kubenswrapper[4919]: I0310 23:35:59.176663 4919 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t"
Mar 10 23:35:59 crc kubenswrapper[4919]: I0310 23:35:59.177676 4919 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"01b5110cee4e2da7f2a13bbbe666538ca45148371492ad70b005d779b7734aee"} pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 10 23:35:59 crc kubenswrapper[4919]: I0310 23:35:59.177881 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" containerName="machine-config-daemon" containerID="cri-o://01b5110cee4e2da7f2a13bbbe666538ca45148371492ad70b005d779b7734aee" gracePeriod=600
Mar 10 23:35:59 crc kubenswrapper[4919]: E0310 23:35:59.317480 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc"
Mar 10 23:36:00 crc kubenswrapper[4919]: I0310 23:36:00.166445 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553096-2n9ld"]
Mar 10 23:36:00 crc kubenswrapper[4919]: E0310 23:36:00.167145 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c095361-8631-4a6c-8a04-a9f4a735880a" containerName="oc"
Mar 10 23:36:00 crc kubenswrapper[4919]: I0310 23:36:00.167189 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c095361-8631-4a6c-8a04-a9f4a735880a" containerName="oc"
Mar 10 23:36:00 crc kubenswrapper[4919]: I0310 23:36:00.167613 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c095361-8631-4a6c-8a04-a9f4a735880a" containerName="oc"
Mar 10 23:36:00 crc kubenswrapper[4919]: I0310 23:36:00.168888 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553096-2n9ld"
Mar 10 23:36:00 crc kubenswrapper[4919]: I0310 23:36:00.172677 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wbvtv"
Mar 10 23:36:00 crc kubenswrapper[4919]: I0310 23:36:00.175544 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 10 23:36:00 crc kubenswrapper[4919]: I0310 23:36:00.179052 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 10 23:36:00 crc kubenswrapper[4919]: I0310 23:36:00.198484 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553096-2n9ld"]
Mar 10 23:36:00 crc kubenswrapper[4919]: I0310 23:36:00.255263 4919 generic.go:334] "Generic (PLEG): container finished" podID="566678d1-f416-4116-ab20-b30dceb86cdc" containerID="01b5110cee4e2da7f2a13bbbe666538ca45148371492ad70b005d779b7734aee" exitCode=0
Mar 10 23:36:00 crc kubenswrapper[4919]: I0310 23:36:00.255335 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" event={"ID":"566678d1-f416-4116-ab20-b30dceb86cdc","Type":"ContainerDied","Data":"01b5110cee4e2da7f2a13bbbe666538ca45148371492ad70b005d779b7734aee"}
Mar 10 23:36:00 crc kubenswrapper[4919]: I0310 23:36:00.255584 4919 scope.go:117] "RemoveContainer" containerID="6381e7ebeadb4b34bbcba77c714c147fb1e92469be16b46f210ac4fed6343795"
Mar 10 23:36:00 crc kubenswrapper[4919]: I0310 23:36:00.256168 4919 scope.go:117] "RemoveContainer" containerID="01b5110cee4e2da7f2a13bbbe666538ca45148371492ad70b005d779b7734aee"
Mar 10 23:36:00 crc kubenswrapper[4919]: E0310 23:36:00.256504 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc"
Mar 10 23:36:00 crc kubenswrapper[4919]: I0310 23:36:00.359388 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ghb49\" (UniqueName: \"kubernetes.io/projected/67e64a1a-500c-476f-bec9-ad4fa04765b0-kube-api-access-ghb49\") pod \"auto-csr-approver-29553096-2n9ld\" (UID: \"67e64a1a-500c-476f-bec9-ad4fa04765b0\") " pod="openshift-infra/auto-csr-approver-29553096-2n9ld"
Mar 10 23:36:00 crc kubenswrapper[4919]: I0310 23:36:00.460660 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ghb49\" (UniqueName: \"kubernetes.io/projected/67e64a1a-500c-476f-bec9-ad4fa04765b0-kube-api-access-ghb49\") pod \"auto-csr-approver-29553096-2n9ld\" (UID: \"67e64a1a-500c-476f-bec9-ad4fa04765b0\") " pod="openshift-infra/auto-csr-approver-29553096-2n9ld"
Mar 10 23:36:00 crc kubenswrapper[4919]: I0310 23:36:00.494993 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ghb49\" (UniqueName: \"kubernetes.io/projected/67e64a1a-500c-476f-bec9-ad4fa04765b0-kube-api-access-ghb49\") pod \"auto-csr-approver-29553096-2n9ld\" (UID: \"67e64a1a-500c-476f-bec9-ad4fa04765b0\") " pod="openshift-infra/auto-csr-approver-29553096-2n9ld"
Mar 10 23:36:00 crc kubenswrapper[4919]: I0310 23:36:00.511012 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553096-2n9ld"
Mar 10 23:36:00 crc kubenswrapper[4919]: I0310 23:36:00.958061 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553096-2n9ld"]
Mar 10 23:36:01 crc kubenswrapper[4919]: I0310 23:36:01.267427 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553096-2n9ld" event={"ID":"67e64a1a-500c-476f-bec9-ad4fa04765b0","Type":"ContainerStarted","Data":"3116840779ae7f9f70d25fdfb6b77c24124171f1850018972e7e752379d43419"}
Mar 10 23:36:02 crc kubenswrapper[4919]: I0310 23:36:02.280608 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553096-2n9ld" event={"ID":"67e64a1a-500c-476f-bec9-ad4fa04765b0","Type":"ContainerStarted","Data":"9b716b3991689d9dc6745310d9df4ff0c9a599514415ae3497bf51a4dcb7568f"}
Mar 10 23:36:03 crc kubenswrapper[4919]: I0310 23:36:03.294142 4919 generic.go:334] "Generic (PLEG): container finished" podID="67e64a1a-500c-476f-bec9-ad4fa04765b0" containerID="9b716b3991689d9dc6745310d9df4ff0c9a599514415ae3497bf51a4dcb7568f" exitCode=0
Mar 10 23:36:03 crc kubenswrapper[4919]: I0310 23:36:03.294188 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553096-2n9ld" event={"ID":"67e64a1a-500c-476f-bec9-ad4fa04765b0","Type":"ContainerDied","Data":"9b716b3991689d9dc6745310d9df4ff0c9a599514415ae3497bf51a4dcb7568f"}
Mar 10 23:36:04 crc kubenswrapper[4919]: I0310 23:36:04.680668 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553096-2n9ld"
Mar 10 23:36:04 crc kubenswrapper[4919]: I0310 23:36:04.752522 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ghb49\" (UniqueName: \"kubernetes.io/projected/67e64a1a-500c-476f-bec9-ad4fa04765b0-kube-api-access-ghb49\") pod \"67e64a1a-500c-476f-bec9-ad4fa04765b0\" (UID: \"67e64a1a-500c-476f-bec9-ad4fa04765b0\") "
Mar 10 23:36:04 crc kubenswrapper[4919]: I0310 23:36:04.762707 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67e64a1a-500c-476f-bec9-ad4fa04765b0-kube-api-access-ghb49" (OuterVolumeSpecName: "kube-api-access-ghb49") pod "67e64a1a-500c-476f-bec9-ad4fa04765b0" (UID: "67e64a1a-500c-476f-bec9-ad4fa04765b0"). InnerVolumeSpecName "kube-api-access-ghb49". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 23:36:04 crc kubenswrapper[4919]: I0310 23:36:04.853784 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ghb49\" (UniqueName: \"kubernetes.io/projected/67e64a1a-500c-476f-bec9-ad4fa04765b0-kube-api-access-ghb49\") on node \"crc\" DevicePath \"\""
Mar 10 23:36:05 crc kubenswrapper[4919]: I0310 23:36:05.320111 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553096-2n9ld" event={"ID":"67e64a1a-500c-476f-bec9-ad4fa04765b0","Type":"ContainerDied","Data":"3116840779ae7f9f70d25fdfb6b77c24124171f1850018972e7e752379d43419"}
Mar 10 23:36:05 crc kubenswrapper[4919]: I0310 23:36:05.320171 4919 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3116840779ae7f9f70d25fdfb6b77c24124171f1850018972e7e752379d43419"
Mar 10 23:36:05 crc kubenswrapper[4919]: I0310 23:36:05.320234 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553096-2n9ld"
Mar 10 23:36:05 crc kubenswrapper[4919]: I0310 23:36:05.373742 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553090-lkq6w"]
Mar 10 23:36:05 crc kubenswrapper[4919]: I0310 23:36:05.383609 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553090-lkq6w"]
Mar 10 23:36:05 crc kubenswrapper[4919]: I0310 23:36:05.493952 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7411634e-354a-4051-a87b-e7c3a45f48b8" path="/var/lib/kubelet/pods/7411634e-354a-4051-a87b-e7c3a45f48b8/volumes"
Mar 10 23:36:12 crc kubenswrapper[4919]: I0310 23:36:12.480803 4919 scope.go:117] "RemoveContainer" containerID="01b5110cee4e2da7f2a13bbbe666538ca45148371492ad70b005d779b7734aee"
Mar 10 23:36:12 crc kubenswrapper[4919]: E0310 23:36:12.481973 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc"
Mar 10 23:36:16 crc kubenswrapper[4919]: I0310 23:36:16.388624 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-tsb65"]
Mar 10 23:36:16 crc kubenswrapper[4919]: E0310 23:36:16.389953 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67e64a1a-500c-476f-bec9-ad4fa04765b0" containerName="oc"
Mar 10 23:36:16 crc kubenswrapper[4919]: I0310 23:36:16.389976 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="67e64a1a-500c-476f-bec9-ad4fa04765b0" containerName="oc"
Mar 10 23:36:16 crc kubenswrapper[4919]: I0310 23:36:16.390346 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="67e64a1a-500c-476f-bec9-ad4fa04765b0" containerName="oc"
Mar 10 23:36:16 crc kubenswrapper[4919]: I0310 23:36:16.392744 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tsb65"
Mar 10 23:36:16 crc kubenswrapper[4919]: I0310 23:36:16.402529 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-tsb65"]
Mar 10 23:36:16 crc kubenswrapper[4919]: I0310 23:36:16.576584 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28ca7d60-531e-4d13-b71b-b3b9301d1aed-utilities\") pod \"redhat-marketplace-tsb65\" (UID: \"28ca7d60-531e-4d13-b71b-b3b9301d1aed\") " pod="openshift-marketplace/redhat-marketplace-tsb65"
Mar 10 23:36:16 crc kubenswrapper[4919]: I0310 23:36:16.576652 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28ca7d60-531e-4d13-b71b-b3b9301d1aed-catalog-content\") pod \"redhat-marketplace-tsb65\" (UID: \"28ca7d60-531e-4d13-b71b-b3b9301d1aed\") " pod="openshift-marketplace/redhat-marketplace-tsb65"
Mar 10 23:36:16 crc kubenswrapper[4919]: I0310 23:36:16.576791 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vcbfd\" (UniqueName: \"kubernetes.io/projected/28ca7d60-531e-4d13-b71b-b3b9301d1aed-kube-api-access-vcbfd\") pod \"redhat-marketplace-tsb65\" (UID: \"28ca7d60-531e-4d13-b71b-b3b9301d1aed\") " pod="openshift-marketplace/redhat-marketplace-tsb65"
Mar 10 23:36:16 crc kubenswrapper[4919]: I0310 23:36:16.678979 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vcbfd\" (UniqueName: \"kubernetes.io/projected/28ca7d60-531e-4d13-b71b-b3b9301d1aed-kube-api-access-vcbfd\") pod \"redhat-marketplace-tsb65\" (UID: \"28ca7d60-531e-4d13-b71b-b3b9301d1aed\") " pod="openshift-marketplace/redhat-marketplace-tsb65"
Mar 10 23:36:16 crc kubenswrapper[4919]: I0310 23:36:16.679122 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28ca7d60-531e-4d13-b71b-b3b9301d1aed-utilities\") pod \"redhat-marketplace-tsb65\" (UID: \"28ca7d60-531e-4d13-b71b-b3b9301d1aed\") " pod="openshift-marketplace/redhat-marketplace-tsb65"
Mar 10 23:36:16 crc kubenswrapper[4919]: I0310 23:36:16.679167 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28ca7d60-531e-4d13-b71b-b3b9301d1aed-catalog-content\") pod \"redhat-marketplace-tsb65\" (UID: \"28ca7d60-531e-4d13-b71b-b3b9301d1aed\") " pod="openshift-marketplace/redhat-marketplace-tsb65"
Mar 10 23:36:16 crc kubenswrapper[4919]: I0310 23:36:16.679803 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28ca7d60-531e-4d13-b71b-b3b9301d1aed-catalog-content\") pod \"redhat-marketplace-tsb65\" (UID: \"28ca7d60-531e-4d13-b71b-b3b9301d1aed\") " pod="openshift-marketplace/redhat-marketplace-tsb65"
Mar 10 23:36:16 crc kubenswrapper[4919]: I0310 23:36:16.680293 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28ca7d60-531e-4d13-b71b-b3b9301d1aed-utilities\") pod \"redhat-marketplace-tsb65\" (UID: \"28ca7d60-531e-4d13-b71b-b3b9301d1aed\") " pod="openshift-marketplace/redhat-marketplace-tsb65"
Mar 10 23:36:16 crc kubenswrapper[4919]: I0310 23:36:16.720257 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vcbfd\" (UniqueName: \"kubernetes.io/projected/28ca7d60-531e-4d13-b71b-b3b9301d1aed-kube-api-access-vcbfd\") pod \"redhat-marketplace-tsb65\" (UID: \"28ca7d60-531e-4d13-b71b-b3b9301d1aed\") " 
pod="openshift-marketplace/redhat-marketplace-tsb65" Mar 10 23:36:16 crc kubenswrapper[4919]: I0310 23:36:16.729188 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tsb65" Mar 10 23:36:17 crc kubenswrapper[4919]: I0310 23:36:17.012675 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-tsb65"] Mar 10 23:36:17 crc kubenswrapper[4919]: W0310 23:36:17.022827 4919 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod28ca7d60_531e_4d13_b71b_b3b9301d1aed.slice/crio-fc9ee7854bc042eea84a12dc23d34d481ddf76cc197bc5ee8d81ecd50a78a88e WatchSource:0}: Error finding container fc9ee7854bc042eea84a12dc23d34d481ddf76cc197bc5ee8d81ecd50a78a88e: Status 404 returned error can't find the container with id fc9ee7854bc042eea84a12dc23d34d481ddf76cc197bc5ee8d81ecd50a78a88e Mar 10 23:36:17 crc kubenswrapper[4919]: I0310 23:36:17.458571 4919 generic.go:334] "Generic (PLEG): container finished" podID="28ca7d60-531e-4d13-b71b-b3b9301d1aed" containerID="bd9b9a89f21dac6e32cc14c59e4dc55221197082cfe14ec196b3e8da9164f167" exitCode=0 Mar 10 23:36:17 crc kubenswrapper[4919]: I0310 23:36:17.458645 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tsb65" event={"ID":"28ca7d60-531e-4d13-b71b-b3b9301d1aed","Type":"ContainerDied","Data":"bd9b9a89f21dac6e32cc14c59e4dc55221197082cfe14ec196b3e8da9164f167"} Mar 10 23:36:17 crc kubenswrapper[4919]: I0310 23:36:17.459148 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tsb65" event={"ID":"28ca7d60-531e-4d13-b71b-b3b9301d1aed","Type":"ContainerStarted","Data":"fc9ee7854bc042eea84a12dc23d34d481ddf76cc197bc5ee8d81ecd50a78a88e"} Mar 10 23:36:18 crc kubenswrapper[4919]: I0310 23:36:18.785299 4919 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/community-operators-j6jcl"] Mar 10 23:36:18 crc kubenswrapper[4919]: I0310 23:36:18.788672 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-j6jcl" Mar 10 23:36:18 crc kubenswrapper[4919]: I0310 23:36:18.797537 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-j6jcl"] Mar 10 23:36:18 crc kubenswrapper[4919]: I0310 23:36:18.928647 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wcwn9\" (UniqueName: \"kubernetes.io/projected/488c1761-7e4e-442e-b645-e3869754207f-kube-api-access-wcwn9\") pod \"community-operators-j6jcl\" (UID: \"488c1761-7e4e-442e-b645-e3869754207f\") " pod="openshift-marketplace/community-operators-j6jcl" Mar 10 23:36:18 crc kubenswrapper[4919]: I0310 23:36:18.929011 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/488c1761-7e4e-442e-b645-e3869754207f-utilities\") pod \"community-operators-j6jcl\" (UID: \"488c1761-7e4e-442e-b645-e3869754207f\") " pod="openshift-marketplace/community-operators-j6jcl" Mar 10 23:36:18 crc kubenswrapper[4919]: I0310 23:36:18.929120 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/488c1761-7e4e-442e-b645-e3869754207f-catalog-content\") pod \"community-operators-j6jcl\" (UID: \"488c1761-7e4e-442e-b645-e3869754207f\") " pod="openshift-marketplace/community-operators-j6jcl" Mar 10 23:36:19 crc kubenswrapper[4919]: I0310 23:36:19.030235 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wcwn9\" (UniqueName: \"kubernetes.io/projected/488c1761-7e4e-442e-b645-e3869754207f-kube-api-access-wcwn9\") pod \"community-operators-j6jcl\" (UID: 
\"488c1761-7e4e-442e-b645-e3869754207f\") " pod="openshift-marketplace/community-operators-j6jcl" Mar 10 23:36:19 crc kubenswrapper[4919]: I0310 23:36:19.030353 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/488c1761-7e4e-442e-b645-e3869754207f-utilities\") pod \"community-operators-j6jcl\" (UID: \"488c1761-7e4e-442e-b645-e3869754207f\") " pod="openshift-marketplace/community-operators-j6jcl" Mar 10 23:36:19 crc kubenswrapper[4919]: I0310 23:36:19.030410 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/488c1761-7e4e-442e-b645-e3869754207f-catalog-content\") pod \"community-operators-j6jcl\" (UID: \"488c1761-7e4e-442e-b645-e3869754207f\") " pod="openshift-marketplace/community-operators-j6jcl" Mar 10 23:36:19 crc kubenswrapper[4919]: I0310 23:36:19.030847 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/488c1761-7e4e-442e-b645-e3869754207f-utilities\") pod \"community-operators-j6jcl\" (UID: \"488c1761-7e4e-442e-b645-e3869754207f\") " pod="openshift-marketplace/community-operators-j6jcl" Mar 10 23:36:19 crc kubenswrapper[4919]: I0310 23:36:19.030880 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/488c1761-7e4e-442e-b645-e3869754207f-catalog-content\") pod \"community-operators-j6jcl\" (UID: \"488c1761-7e4e-442e-b645-e3869754207f\") " pod="openshift-marketplace/community-operators-j6jcl" Mar 10 23:36:19 crc kubenswrapper[4919]: I0310 23:36:19.056346 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wcwn9\" (UniqueName: \"kubernetes.io/projected/488c1761-7e4e-442e-b645-e3869754207f-kube-api-access-wcwn9\") pod \"community-operators-j6jcl\" (UID: \"488c1761-7e4e-442e-b645-e3869754207f\") " 
pod="openshift-marketplace/community-operators-j6jcl" Mar 10 23:36:19 crc kubenswrapper[4919]: I0310 23:36:19.138118 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-j6jcl" Mar 10 23:36:19 crc kubenswrapper[4919]: I0310 23:36:19.483602 4919 generic.go:334] "Generic (PLEG): container finished" podID="28ca7d60-531e-4d13-b71b-b3b9301d1aed" containerID="9f5b8999158d4d5a90a1ac4345fad3379ca8dec6d8f1d9ef0270d47406db37ac" exitCode=0 Mar 10 23:36:19 crc kubenswrapper[4919]: I0310 23:36:19.489151 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tsb65" event={"ID":"28ca7d60-531e-4d13-b71b-b3b9301d1aed","Type":"ContainerDied","Data":"9f5b8999158d4d5a90a1ac4345fad3379ca8dec6d8f1d9ef0270d47406db37ac"} Mar 10 23:36:19 crc kubenswrapper[4919]: I0310 23:36:19.689167 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-j6jcl"] Mar 10 23:36:20 crc kubenswrapper[4919]: I0310 23:36:20.494869 4919 generic.go:334] "Generic (PLEG): container finished" podID="488c1761-7e4e-442e-b645-e3869754207f" containerID="4a70a4ef142915b421b1c89de9b2dd1679a9c5d051de3ffe2f256245e0c0a6ea" exitCode=0 Mar 10 23:36:20 crc kubenswrapper[4919]: I0310 23:36:20.494973 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j6jcl" event={"ID":"488c1761-7e4e-442e-b645-e3869754207f","Type":"ContainerDied","Data":"4a70a4ef142915b421b1c89de9b2dd1679a9c5d051de3ffe2f256245e0c0a6ea"} Mar 10 23:36:20 crc kubenswrapper[4919]: I0310 23:36:20.495288 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j6jcl" event={"ID":"488c1761-7e4e-442e-b645-e3869754207f","Type":"ContainerStarted","Data":"93098f70aef4e9521a0f29692a77bc7367a9e992b493e77aeb0599f1728e47a4"} Mar 10 23:36:20 crc kubenswrapper[4919]: I0310 23:36:20.508161 4919 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/redhat-marketplace-tsb65" event={"ID":"28ca7d60-531e-4d13-b71b-b3b9301d1aed","Type":"ContainerStarted","Data":"e8ab7d9ba49c1717d777f5467836fa9038d08682198cf9913b5e85d893ca9f65"} Mar 10 23:36:20 crc kubenswrapper[4919]: I0310 23:36:20.559104 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-tsb65" podStartSLOduration=2.097353896 podStartE2EDuration="4.55908514s" podCreationTimestamp="2026-03-10 23:36:16 +0000 UTC" firstStartedPulling="2026-03-10 23:36:17.462885162 +0000 UTC m=+6364.704765780" lastFinishedPulling="2026-03-10 23:36:19.924616416 +0000 UTC m=+6367.166497024" observedRunningTime="2026-03-10 23:36:20.556790118 +0000 UTC m=+6367.798670726" watchObservedRunningTime="2026-03-10 23:36:20.55908514 +0000 UTC m=+6367.800965768" Mar 10 23:36:22 crc kubenswrapper[4919]: I0310 23:36:22.530272 4919 generic.go:334] "Generic (PLEG): container finished" podID="488c1761-7e4e-442e-b645-e3869754207f" containerID="9a6d3aa635fd92d7e1e0ccd9cd839a8feb49be88b6851377522a42d3129bacb1" exitCode=0 Mar 10 23:36:22 crc kubenswrapper[4919]: I0310 23:36:22.530334 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j6jcl" event={"ID":"488c1761-7e4e-442e-b645-e3869754207f","Type":"ContainerDied","Data":"9a6d3aa635fd92d7e1e0ccd9cd839a8feb49be88b6851377522a42d3129bacb1"} Mar 10 23:36:23 crc kubenswrapper[4919]: I0310 23:36:23.544715 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j6jcl" event={"ID":"488c1761-7e4e-442e-b645-e3869754207f","Type":"ContainerStarted","Data":"a3e7c56cd652c4555709c142b948d4856047c69bc83b45bb258a104a3fd2e96d"} Mar 10 23:36:23 crc kubenswrapper[4919]: I0310 23:36:23.578079 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-j6jcl" podStartSLOduration=3.106096172 
podStartE2EDuration="5.578051068s" podCreationTimestamp="2026-03-10 23:36:18 +0000 UTC" firstStartedPulling="2026-03-10 23:36:20.505888939 +0000 UTC m=+6367.747769547" lastFinishedPulling="2026-03-10 23:36:22.977843795 +0000 UTC m=+6370.219724443" observedRunningTime="2026-03-10 23:36:23.569235495 +0000 UTC m=+6370.811116123" watchObservedRunningTime="2026-03-10 23:36:23.578051068 +0000 UTC m=+6370.819931706" Mar 10 23:36:25 crc kubenswrapper[4919]: I0310 23:36:25.481072 4919 scope.go:117] "RemoveContainer" containerID="01b5110cee4e2da7f2a13bbbe666538ca45148371492ad70b005d779b7734aee" Mar 10 23:36:25 crc kubenswrapper[4919]: E0310 23:36:25.481831 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" Mar 10 23:36:26 crc kubenswrapper[4919]: I0310 23:36:26.730283 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-tsb65" Mar 10 23:36:26 crc kubenswrapper[4919]: I0310 23:36:26.730387 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-tsb65" Mar 10 23:36:26 crc kubenswrapper[4919]: I0310 23:36:26.788351 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-tsb65" Mar 10 23:36:27 crc kubenswrapper[4919]: I0310 23:36:27.662460 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-tsb65" Mar 10 23:36:27 crc kubenswrapper[4919]: I0310 23:36:27.963088 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-tsb65"] Mar 10 23:36:29 crc kubenswrapper[4919]: I0310 23:36:29.138507 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-j6jcl" Mar 10 23:36:29 crc kubenswrapper[4919]: I0310 23:36:29.138892 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-j6jcl" Mar 10 23:36:29 crc kubenswrapper[4919]: I0310 23:36:29.227870 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-j6jcl" Mar 10 23:36:29 crc kubenswrapper[4919]: I0310 23:36:29.601609 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-tsb65" podUID="28ca7d60-531e-4d13-b71b-b3b9301d1aed" containerName="registry-server" containerID="cri-o://e8ab7d9ba49c1717d777f5467836fa9038d08682198cf9913b5e85d893ca9f65" gracePeriod=2 Mar 10 23:36:29 crc kubenswrapper[4919]: I0310 23:36:29.661546 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-j6jcl" Mar 10 23:36:30 crc kubenswrapper[4919]: I0310 23:36:30.093561 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tsb65" Mar 10 23:36:30 crc kubenswrapper[4919]: I0310 23:36:30.256998 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28ca7d60-531e-4d13-b71b-b3b9301d1aed-utilities\") pod \"28ca7d60-531e-4d13-b71b-b3b9301d1aed\" (UID: \"28ca7d60-531e-4d13-b71b-b3b9301d1aed\") " Mar 10 23:36:30 crc kubenswrapper[4919]: I0310 23:36:30.257162 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28ca7d60-531e-4d13-b71b-b3b9301d1aed-catalog-content\") pod \"28ca7d60-531e-4d13-b71b-b3b9301d1aed\" (UID: \"28ca7d60-531e-4d13-b71b-b3b9301d1aed\") " Mar 10 23:36:30 crc kubenswrapper[4919]: I0310 23:36:30.257244 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vcbfd\" (UniqueName: \"kubernetes.io/projected/28ca7d60-531e-4d13-b71b-b3b9301d1aed-kube-api-access-vcbfd\") pod \"28ca7d60-531e-4d13-b71b-b3b9301d1aed\" (UID: \"28ca7d60-531e-4d13-b71b-b3b9301d1aed\") " Mar 10 23:36:30 crc kubenswrapper[4919]: I0310 23:36:30.258111 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/28ca7d60-531e-4d13-b71b-b3b9301d1aed-utilities" (OuterVolumeSpecName: "utilities") pod "28ca7d60-531e-4d13-b71b-b3b9301d1aed" (UID: "28ca7d60-531e-4d13-b71b-b3b9301d1aed"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 23:36:30 crc kubenswrapper[4919]: I0310 23:36:30.265939 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28ca7d60-531e-4d13-b71b-b3b9301d1aed-kube-api-access-vcbfd" (OuterVolumeSpecName: "kube-api-access-vcbfd") pod "28ca7d60-531e-4d13-b71b-b3b9301d1aed" (UID: "28ca7d60-531e-4d13-b71b-b3b9301d1aed"). InnerVolumeSpecName "kube-api-access-vcbfd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 23:36:30 crc kubenswrapper[4919]: I0310 23:36:30.286251 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/28ca7d60-531e-4d13-b71b-b3b9301d1aed-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "28ca7d60-531e-4d13-b71b-b3b9301d1aed" (UID: "28ca7d60-531e-4d13-b71b-b3b9301d1aed"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 23:36:30 crc kubenswrapper[4919]: I0310 23:36:30.359606 4919 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28ca7d60-531e-4d13-b71b-b3b9301d1aed-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 23:36:30 crc kubenswrapper[4919]: I0310 23:36:30.359646 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vcbfd\" (UniqueName: \"kubernetes.io/projected/28ca7d60-531e-4d13-b71b-b3b9301d1aed-kube-api-access-vcbfd\") on node \"crc\" DevicePath \"\"" Mar 10 23:36:30 crc kubenswrapper[4919]: I0310 23:36:30.359656 4919 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28ca7d60-531e-4d13-b71b-b3b9301d1aed-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 23:36:30 crc kubenswrapper[4919]: I0310 23:36:30.567381 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-j6jcl"] Mar 10 23:36:30 crc kubenswrapper[4919]: I0310 23:36:30.617046 4919 generic.go:334] "Generic (PLEG): container finished" podID="28ca7d60-531e-4d13-b71b-b3b9301d1aed" containerID="e8ab7d9ba49c1717d777f5467836fa9038d08682198cf9913b5e85d893ca9f65" exitCode=0 Mar 10 23:36:30 crc kubenswrapper[4919]: I0310 23:36:30.617128 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tsb65" Mar 10 23:36:30 crc kubenswrapper[4919]: I0310 23:36:30.617121 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tsb65" event={"ID":"28ca7d60-531e-4d13-b71b-b3b9301d1aed","Type":"ContainerDied","Data":"e8ab7d9ba49c1717d777f5467836fa9038d08682198cf9913b5e85d893ca9f65"} Mar 10 23:36:30 crc kubenswrapper[4919]: I0310 23:36:30.617240 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tsb65" event={"ID":"28ca7d60-531e-4d13-b71b-b3b9301d1aed","Type":"ContainerDied","Data":"fc9ee7854bc042eea84a12dc23d34d481ddf76cc197bc5ee8d81ecd50a78a88e"} Mar 10 23:36:30 crc kubenswrapper[4919]: I0310 23:36:30.617280 4919 scope.go:117] "RemoveContainer" containerID="e8ab7d9ba49c1717d777f5467836fa9038d08682198cf9913b5e85d893ca9f65" Mar 10 23:36:30 crc kubenswrapper[4919]: I0310 23:36:30.652009 4919 scope.go:117] "RemoveContainer" containerID="9f5b8999158d4d5a90a1ac4345fad3379ca8dec6d8f1d9ef0270d47406db37ac" Mar 10 23:36:30 crc kubenswrapper[4919]: I0310 23:36:30.677641 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-tsb65"] Mar 10 23:36:30 crc kubenswrapper[4919]: I0310 23:36:30.688151 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-tsb65"] Mar 10 23:36:30 crc kubenswrapper[4919]: I0310 23:36:30.705145 4919 scope.go:117] "RemoveContainer" containerID="bd9b9a89f21dac6e32cc14c59e4dc55221197082cfe14ec196b3e8da9164f167" Mar 10 23:36:30 crc kubenswrapper[4919]: I0310 23:36:30.744210 4919 scope.go:117] "RemoveContainer" containerID="e8ab7d9ba49c1717d777f5467836fa9038d08682198cf9913b5e85d893ca9f65" Mar 10 23:36:30 crc kubenswrapper[4919]: E0310 23:36:30.746007 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"e8ab7d9ba49c1717d777f5467836fa9038d08682198cf9913b5e85d893ca9f65\": container with ID starting with e8ab7d9ba49c1717d777f5467836fa9038d08682198cf9913b5e85d893ca9f65 not found: ID does not exist" containerID="e8ab7d9ba49c1717d777f5467836fa9038d08682198cf9913b5e85d893ca9f65" Mar 10 23:36:30 crc kubenswrapper[4919]: I0310 23:36:30.746046 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8ab7d9ba49c1717d777f5467836fa9038d08682198cf9913b5e85d893ca9f65"} err="failed to get container status \"e8ab7d9ba49c1717d777f5467836fa9038d08682198cf9913b5e85d893ca9f65\": rpc error: code = NotFound desc = could not find container \"e8ab7d9ba49c1717d777f5467836fa9038d08682198cf9913b5e85d893ca9f65\": container with ID starting with e8ab7d9ba49c1717d777f5467836fa9038d08682198cf9913b5e85d893ca9f65 not found: ID does not exist" Mar 10 23:36:30 crc kubenswrapper[4919]: I0310 23:36:30.746073 4919 scope.go:117] "RemoveContainer" containerID="9f5b8999158d4d5a90a1ac4345fad3379ca8dec6d8f1d9ef0270d47406db37ac" Mar 10 23:36:30 crc kubenswrapper[4919]: E0310 23:36:30.746521 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f5b8999158d4d5a90a1ac4345fad3379ca8dec6d8f1d9ef0270d47406db37ac\": container with ID starting with 9f5b8999158d4d5a90a1ac4345fad3379ca8dec6d8f1d9ef0270d47406db37ac not found: ID does not exist" containerID="9f5b8999158d4d5a90a1ac4345fad3379ca8dec6d8f1d9ef0270d47406db37ac" Mar 10 23:36:30 crc kubenswrapper[4919]: I0310 23:36:30.746558 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f5b8999158d4d5a90a1ac4345fad3379ca8dec6d8f1d9ef0270d47406db37ac"} err="failed to get container status \"9f5b8999158d4d5a90a1ac4345fad3379ca8dec6d8f1d9ef0270d47406db37ac\": rpc error: code = NotFound desc = could not find container \"9f5b8999158d4d5a90a1ac4345fad3379ca8dec6d8f1d9ef0270d47406db37ac\": container with ID 
starting with 9f5b8999158d4d5a90a1ac4345fad3379ca8dec6d8f1d9ef0270d47406db37ac not found: ID does not exist" Mar 10 23:36:30 crc kubenswrapper[4919]: I0310 23:36:30.746574 4919 scope.go:117] "RemoveContainer" containerID="bd9b9a89f21dac6e32cc14c59e4dc55221197082cfe14ec196b3e8da9164f167" Mar 10 23:36:30 crc kubenswrapper[4919]: E0310 23:36:30.746992 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd9b9a89f21dac6e32cc14c59e4dc55221197082cfe14ec196b3e8da9164f167\": container with ID starting with bd9b9a89f21dac6e32cc14c59e4dc55221197082cfe14ec196b3e8da9164f167 not found: ID does not exist" containerID="bd9b9a89f21dac6e32cc14c59e4dc55221197082cfe14ec196b3e8da9164f167" Mar 10 23:36:30 crc kubenswrapper[4919]: I0310 23:36:30.747014 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd9b9a89f21dac6e32cc14c59e4dc55221197082cfe14ec196b3e8da9164f167"} err="failed to get container status \"bd9b9a89f21dac6e32cc14c59e4dc55221197082cfe14ec196b3e8da9164f167\": rpc error: code = NotFound desc = could not find container \"bd9b9a89f21dac6e32cc14c59e4dc55221197082cfe14ec196b3e8da9164f167\": container with ID starting with bd9b9a89f21dac6e32cc14c59e4dc55221197082cfe14ec196b3e8da9164f167 not found: ID does not exist" Mar 10 23:36:31 crc kubenswrapper[4919]: I0310 23:36:31.495239 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28ca7d60-531e-4d13-b71b-b3b9301d1aed" path="/var/lib/kubelet/pods/28ca7d60-531e-4d13-b71b-b3b9301d1aed/volumes" Mar 10 23:36:31 crc kubenswrapper[4919]: I0310 23:36:31.628232 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-j6jcl" podUID="488c1761-7e4e-442e-b645-e3869754207f" containerName="registry-server" containerID="cri-o://a3e7c56cd652c4555709c142b948d4856047c69bc83b45bb258a104a3fd2e96d" gracePeriod=2 Mar 10 23:36:32 crc 
kubenswrapper[4919]: I0310 23:36:32.097147 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-j6jcl" Mar 10 23:36:32 crc kubenswrapper[4919]: I0310 23:36:32.194158 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/488c1761-7e4e-442e-b645-e3869754207f-catalog-content\") pod \"488c1761-7e4e-442e-b645-e3869754207f\" (UID: \"488c1761-7e4e-442e-b645-e3869754207f\") " Mar 10 23:36:32 crc kubenswrapper[4919]: I0310 23:36:32.194258 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wcwn9\" (UniqueName: \"kubernetes.io/projected/488c1761-7e4e-442e-b645-e3869754207f-kube-api-access-wcwn9\") pod \"488c1761-7e4e-442e-b645-e3869754207f\" (UID: \"488c1761-7e4e-442e-b645-e3869754207f\") " Mar 10 23:36:32 crc kubenswrapper[4919]: I0310 23:36:32.194303 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/488c1761-7e4e-442e-b645-e3869754207f-utilities\") pod \"488c1761-7e4e-442e-b645-e3869754207f\" (UID: \"488c1761-7e4e-442e-b645-e3869754207f\") " Mar 10 23:36:32 crc kubenswrapper[4919]: I0310 23:36:32.196022 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/488c1761-7e4e-442e-b645-e3869754207f-utilities" (OuterVolumeSpecName: "utilities") pod "488c1761-7e4e-442e-b645-e3869754207f" (UID: "488c1761-7e4e-442e-b645-e3869754207f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 23:36:32 crc kubenswrapper[4919]: I0310 23:36:32.204102 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/488c1761-7e4e-442e-b645-e3869754207f-kube-api-access-wcwn9" (OuterVolumeSpecName: "kube-api-access-wcwn9") pod "488c1761-7e4e-442e-b645-e3869754207f" (UID: "488c1761-7e4e-442e-b645-e3869754207f"). InnerVolumeSpecName "kube-api-access-wcwn9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 23:36:32 crc kubenswrapper[4919]: I0310 23:36:32.296183 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wcwn9\" (UniqueName: \"kubernetes.io/projected/488c1761-7e4e-442e-b645-e3869754207f-kube-api-access-wcwn9\") on node \"crc\" DevicePath \"\"" Mar 10 23:36:32 crc kubenswrapper[4919]: I0310 23:36:32.296214 4919 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/488c1761-7e4e-442e-b645-e3869754207f-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 23:36:32 crc kubenswrapper[4919]: I0310 23:36:32.394054 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/488c1761-7e4e-442e-b645-e3869754207f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "488c1761-7e4e-442e-b645-e3869754207f" (UID: "488c1761-7e4e-442e-b645-e3869754207f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 23:36:32 crc kubenswrapper[4919]: I0310 23:36:32.398069 4919 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/488c1761-7e4e-442e-b645-e3869754207f-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 10 23:36:32 crc kubenswrapper[4919]: I0310 23:36:32.637874 4919 generic.go:334] "Generic (PLEG): container finished" podID="488c1761-7e4e-442e-b645-e3869754207f" containerID="a3e7c56cd652c4555709c142b948d4856047c69bc83b45bb258a104a3fd2e96d" exitCode=0
Mar 10 23:36:32 crc kubenswrapper[4919]: I0310 23:36:32.637922 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j6jcl" event={"ID":"488c1761-7e4e-442e-b645-e3869754207f","Type":"ContainerDied","Data":"a3e7c56cd652c4555709c142b948d4856047c69bc83b45bb258a104a3fd2e96d"}
Mar 10 23:36:32 crc kubenswrapper[4919]: I0310 23:36:32.637955 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j6jcl" event={"ID":"488c1761-7e4e-442e-b645-e3869754207f","Type":"ContainerDied","Data":"93098f70aef4e9521a0f29692a77bc7367a9e992b493e77aeb0599f1728e47a4"}
Mar 10 23:36:32 crc kubenswrapper[4919]: I0310 23:36:32.637972 4919 scope.go:117] "RemoveContainer" containerID="a3e7c56cd652c4555709c142b948d4856047c69bc83b45bb258a104a3fd2e96d"
Mar 10 23:36:32 crc kubenswrapper[4919]: I0310 23:36:32.637986 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-j6jcl"
Mar 10 23:36:32 crc kubenswrapper[4919]: I0310 23:36:32.665504 4919 scope.go:117] "RemoveContainer" containerID="9a6d3aa635fd92d7e1e0ccd9cd839a8feb49be88b6851377522a42d3129bacb1"
Mar 10 23:36:32 crc kubenswrapper[4919]: I0310 23:36:32.678596 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-j6jcl"]
Mar 10 23:36:32 crc kubenswrapper[4919]: I0310 23:36:32.684479 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-j6jcl"]
Mar 10 23:36:32 crc kubenswrapper[4919]: I0310 23:36:32.707882 4919 scope.go:117] "RemoveContainer" containerID="4a70a4ef142915b421b1c89de9b2dd1679a9c5d051de3ffe2f256245e0c0a6ea"
Mar 10 23:36:32 crc kubenswrapper[4919]: I0310 23:36:32.731136 4919 scope.go:117] "RemoveContainer" containerID="a3e7c56cd652c4555709c142b948d4856047c69bc83b45bb258a104a3fd2e96d"
Mar 10 23:36:32 crc kubenswrapper[4919]: E0310 23:36:32.731614 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a3e7c56cd652c4555709c142b948d4856047c69bc83b45bb258a104a3fd2e96d\": container with ID starting with a3e7c56cd652c4555709c142b948d4856047c69bc83b45bb258a104a3fd2e96d not found: ID does not exist" containerID="a3e7c56cd652c4555709c142b948d4856047c69bc83b45bb258a104a3fd2e96d"
Mar 10 23:36:32 crc kubenswrapper[4919]: I0310 23:36:32.731652 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3e7c56cd652c4555709c142b948d4856047c69bc83b45bb258a104a3fd2e96d"} err="failed to get container status \"a3e7c56cd652c4555709c142b948d4856047c69bc83b45bb258a104a3fd2e96d\": rpc error: code = NotFound desc = could not find container \"a3e7c56cd652c4555709c142b948d4856047c69bc83b45bb258a104a3fd2e96d\": container with ID starting with a3e7c56cd652c4555709c142b948d4856047c69bc83b45bb258a104a3fd2e96d not found: ID does not exist"
Mar 10 23:36:32 crc kubenswrapper[4919]: I0310 23:36:32.731678 4919 scope.go:117] "RemoveContainer" containerID="9a6d3aa635fd92d7e1e0ccd9cd839a8feb49be88b6851377522a42d3129bacb1"
Mar 10 23:36:32 crc kubenswrapper[4919]: E0310 23:36:32.732042 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9a6d3aa635fd92d7e1e0ccd9cd839a8feb49be88b6851377522a42d3129bacb1\": container with ID starting with 9a6d3aa635fd92d7e1e0ccd9cd839a8feb49be88b6851377522a42d3129bacb1 not found: ID does not exist" containerID="9a6d3aa635fd92d7e1e0ccd9cd839a8feb49be88b6851377522a42d3129bacb1"
Mar 10 23:36:32 crc kubenswrapper[4919]: I0310 23:36:32.732075 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a6d3aa635fd92d7e1e0ccd9cd839a8feb49be88b6851377522a42d3129bacb1"} err="failed to get container status \"9a6d3aa635fd92d7e1e0ccd9cd839a8feb49be88b6851377522a42d3129bacb1\": rpc error: code = NotFound desc = could not find container \"9a6d3aa635fd92d7e1e0ccd9cd839a8feb49be88b6851377522a42d3129bacb1\": container with ID starting with 9a6d3aa635fd92d7e1e0ccd9cd839a8feb49be88b6851377522a42d3129bacb1 not found: ID does not exist"
Mar 10 23:36:32 crc kubenswrapper[4919]: I0310 23:36:32.732093 4919 scope.go:117] "RemoveContainer" containerID="4a70a4ef142915b421b1c89de9b2dd1679a9c5d051de3ffe2f256245e0c0a6ea"
Mar 10 23:36:32 crc kubenswrapper[4919]: E0310 23:36:32.732473 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a70a4ef142915b421b1c89de9b2dd1679a9c5d051de3ffe2f256245e0c0a6ea\": container with ID starting with 4a70a4ef142915b421b1c89de9b2dd1679a9c5d051de3ffe2f256245e0c0a6ea not found: ID does not exist" containerID="4a70a4ef142915b421b1c89de9b2dd1679a9c5d051de3ffe2f256245e0c0a6ea"
Mar 10 23:36:32 crc kubenswrapper[4919]: I0310 23:36:32.732520 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a70a4ef142915b421b1c89de9b2dd1679a9c5d051de3ffe2f256245e0c0a6ea"} err="failed to get container status \"4a70a4ef142915b421b1c89de9b2dd1679a9c5d051de3ffe2f256245e0c0a6ea\": rpc error: code = NotFound desc = could not find container \"4a70a4ef142915b421b1c89de9b2dd1679a9c5d051de3ffe2f256245e0c0a6ea\": container with ID starting with 4a70a4ef142915b421b1c89de9b2dd1679a9c5d051de3ffe2f256245e0c0a6ea not found: ID does not exist"
Mar 10 23:36:33 crc kubenswrapper[4919]: I0310 23:36:33.492635 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="488c1761-7e4e-442e-b645-e3869754207f" path="/var/lib/kubelet/pods/488c1761-7e4e-442e-b645-e3869754207f/volumes"
Mar 10 23:36:38 crc kubenswrapper[4919]: I0310 23:36:38.480647 4919 scope.go:117] "RemoveContainer" containerID="01b5110cee4e2da7f2a13bbbe666538ca45148371492ad70b005d779b7734aee"
Mar 10 23:36:38 crc kubenswrapper[4919]: E0310 23:36:38.481481 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc"
Mar 10 23:36:48 crc kubenswrapper[4919]: I0310 23:36:48.534185 4919 scope.go:117] "RemoveContainer" containerID="72e10cdc972de4f4f56d47da7898831f53cc9d7b6df3c4859bceb81627cf764e"
Mar 10 23:36:49 crc kubenswrapper[4919]: I0310 23:36:49.480690 4919 scope.go:117] "RemoveContainer" containerID="01b5110cee4e2da7f2a13bbbe666538ca45148371492ad70b005d779b7734aee"
Mar 10 23:36:49 crc kubenswrapper[4919]: E0310 23:36:49.481214 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc"
Mar 10 23:37:04 crc kubenswrapper[4919]: I0310 23:37:04.480255 4919 scope.go:117] "RemoveContainer" containerID="01b5110cee4e2da7f2a13bbbe666538ca45148371492ad70b005d779b7734aee"
Mar 10 23:37:04 crc kubenswrapper[4919]: E0310 23:37:04.481150 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc"
Mar 10 23:37:16 crc kubenswrapper[4919]: I0310 23:37:16.480002 4919 scope.go:117] "RemoveContainer" containerID="01b5110cee4e2da7f2a13bbbe666538ca45148371492ad70b005d779b7734aee"
Mar 10 23:37:16 crc kubenswrapper[4919]: E0310 23:37:16.480741 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc"
Mar 10 23:37:27 crc kubenswrapper[4919]: I0310 23:37:27.480784 4919 scope.go:117] "RemoveContainer" containerID="01b5110cee4e2da7f2a13bbbe666538ca45148371492ad70b005d779b7734aee"
Mar 10 23:37:27 crc kubenswrapper[4919]: E0310 23:37:27.481528 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc"
Mar 10 23:37:40 crc kubenswrapper[4919]: I0310 23:37:40.480604 4919 scope.go:117] "RemoveContainer" containerID="01b5110cee4e2da7f2a13bbbe666538ca45148371492ad70b005d779b7734aee"
Mar 10 23:37:40 crc kubenswrapper[4919]: E0310 23:37:40.481703 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc"
Mar 10 23:37:51 crc kubenswrapper[4919]: I0310 23:37:51.480309 4919 scope.go:117] "RemoveContainer" containerID="01b5110cee4e2da7f2a13bbbe666538ca45148371492ad70b005d779b7734aee"
Mar 10 23:37:51 crc kubenswrapper[4919]: E0310 23:37:51.480970 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc"
Mar 10 23:38:00 crc kubenswrapper[4919]: I0310 23:38:00.174231 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553098-96g44"]
Mar 10 23:38:00 crc kubenswrapper[4919]: E0310 23:38:00.175366 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28ca7d60-531e-4d13-b71b-b3b9301d1aed" containerName="extract-content"
Mar 10 23:38:00 crc kubenswrapper[4919]: I0310 23:38:00.175388 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="28ca7d60-531e-4d13-b71b-b3b9301d1aed" containerName="extract-content"
Mar 10 23:38:00 crc kubenswrapper[4919]: E0310 23:38:00.175492 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28ca7d60-531e-4d13-b71b-b3b9301d1aed" containerName="extract-utilities"
Mar 10 23:38:00 crc kubenswrapper[4919]: I0310 23:38:00.175521 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="28ca7d60-531e-4d13-b71b-b3b9301d1aed" containerName="extract-utilities"
Mar 10 23:38:00 crc kubenswrapper[4919]: E0310 23:38:00.175552 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="488c1761-7e4e-442e-b645-e3869754207f" containerName="registry-server"
Mar 10 23:38:00 crc kubenswrapper[4919]: I0310 23:38:00.175570 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="488c1761-7e4e-442e-b645-e3869754207f" containerName="registry-server"
Mar 10 23:38:00 crc kubenswrapper[4919]: E0310 23:38:00.175592 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="488c1761-7e4e-442e-b645-e3869754207f" containerName="extract-content"
Mar 10 23:38:00 crc kubenswrapper[4919]: I0310 23:38:00.175603 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="488c1761-7e4e-442e-b645-e3869754207f" containerName="extract-content"
Mar 10 23:38:00 crc kubenswrapper[4919]: E0310 23:38:00.175630 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="488c1761-7e4e-442e-b645-e3869754207f" containerName="extract-utilities"
Mar 10 23:38:00 crc kubenswrapper[4919]: I0310 23:38:00.175642 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="488c1761-7e4e-442e-b645-e3869754207f" containerName="extract-utilities"
Mar 10 23:38:00 crc kubenswrapper[4919]: E0310 23:38:00.175668 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28ca7d60-531e-4d13-b71b-b3b9301d1aed" containerName="registry-server"
Mar 10 23:38:00 crc kubenswrapper[4919]: I0310 23:38:00.175679 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="28ca7d60-531e-4d13-b71b-b3b9301d1aed" containerName="registry-server"
Mar 10 23:38:00 crc kubenswrapper[4919]: I0310 23:38:00.175964 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="28ca7d60-531e-4d13-b71b-b3b9301d1aed" containerName="registry-server"
Mar 10 23:38:00 crc kubenswrapper[4919]: I0310 23:38:00.176005 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="488c1761-7e4e-442e-b645-e3869754207f" containerName="registry-server"
Mar 10 23:38:00 crc kubenswrapper[4919]: I0310 23:38:00.176917 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553098-96g44"
Mar 10 23:38:00 crc kubenswrapper[4919]: I0310 23:38:00.181180 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 10 23:38:00 crc kubenswrapper[4919]: I0310 23:38:00.183709 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 10 23:38:00 crc kubenswrapper[4919]: I0310 23:38:00.185004 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wbvtv"
Mar 10 23:38:00 crc kubenswrapper[4919]: I0310 23:38:00.197474 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553098-96g44"]
Mar 10 23:38:00 crc kubenswrapper[4919]: I0310 23:38:00.249644 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4jj2w\" (UniqueName: \"kubernetes.io/projected/dbf95132-2ab1-4884-8911-64d13302fb1b-kube-api-access-4jj2w\") pod \"auto-csr-approver-29553098-96g44\" (UID: \"dbf95132-2ab1-4884-8911-64d13302fb1b\") " pod="openshift-infra/auto-csr-approver-29553098-96g44"
Mar 10 23:38:00 crc kubenswrapper[4919]: I0310 23:38:00.351623 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4jj2w\" (UniqueName: \"kubernetes.io/projected/dbf95132-2ab1-4884-8911-64d13302fb1b-kube-api-access-4jj2w\") pod \"auto-csr-approver-29553098-96g44\" (UID: \"dbf95132-2ab1-4884-8911-64d13302fb1b\") " pod="openshift-infra/auto-csr-approver-29553098-96g44"
Mar 10 23:38:00 crc kubenswrapper[4919]: I0310 23:38:00.370148 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4jj2w\" (UniqueName: \"kubernetes.io/projected/dbf95132-2ab1-4884-8911-64d13302fb1b-kube-api-access-4jj2w\") pod \"auto-csr-approver-29553098-96g44\" (UID: \"dbf95132-2ab1-4884-8911-64d13302fb1b\") " pod="openshift-infra/auto-csr-approver-29553098-96g44"
Mar 10 23:38:00 crc kubenswrapper[4919]: I0310 23:38:00.474281 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-vgxnh"]
Mar 10 23:38:00 crc kubenswrapper[4919]: I0310 23:38:00.476427 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vgxnh"
Mar 10 23:38:00 crc kubenswrapper[4919]: I0310 23:38:00.493508 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vgxnh"]
Mar 10 23:38:00 crc kubenswrapper[4919]: I0310 23:38:00.508140 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553098-96g44"
Mar 10 23:38:00 crc kubenswrapper[4919]: I0310 23:38:00.556137 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bdc9d813-fbe9-4381-8d5d-b30215ce98ff-utilities\") pod \"redhat-operators-vgxnh\" (UID: \"bdc9d813-fbe9-4381-8d5d-b30215ce98ff\") " pod="openshift-marketplace/redhat-operators-vgxnh"
Mar 10 23:38:00 crc kubenswrapper[4919]: I0310 23:38:00.556943 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n4cng\" (UniqueName: \"kubernetes.io/projected/bdc9d813-fbe9-4381-8d5d-b30215ce98ff-kube-api-access-n4cng\") pod \"redhat-operators-vgxnh\" (UID: \"bdc9d813-fbe9-4381-8d5d-b30215ce98ff\") " pod="openshift-marketplace/redhat-operators-vgxnh"
Mar 10 23:38:00 crc kubenswrapper[4919]: I0310 23:38:00.557057 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bdc9d813-fbe9-4381-8d5d-b30215ce98ff-catalog-content\") pod \"redhat-operators-vgxnh\" (UID: \"bdc9d813-fbe9-4381-8d5d-b30215ce98ff\") " pod="openshift-marketplace/redhat-operators-vgxnh"
Mar 10 23:38:00 crc kubenswrapper[4919]: I0310 23:38:00.659879 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bdc9d813-fbe9-4381-8d5d-b30215ce98ff-utilities\") pod \"redhat-operators-vgxnh\" (UID: \"bdc9d813-fbe9-4381-8d5d-b30215ce98ff\") " pod="openshift-marketplace/redhat-operators-vgxnh"
Mar 10 23:38:00 crc kubenswrapper[4919]: I0310 23:38:00.660325 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bdc9d813-fbe9-4381-8d5d-b30215ce98ff-utilities\") pod \"redhat-operators-vgxnh\" (UID:
\"bdc9d813-fbe9-4381-8d5d-b30215ce98ff\") " pod="openshift-marketplace/redhat-operators-vgxnh"
Mar 10 23:38:00 crc kubenswrapper[4919]: I0310 23:38:00.660850 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n4cng\" (UniqueName: \"kubernetes.io/projected/bdc9d813-fbe9-4381-8d5d-b30215ce98ff-kube-api-access-n4cng\") pod \"redhat-operators-vgxnh\" (UID: \"bdc9d813-fbe9-4381-8d5d-b30215ce98ff\") " pod="openshift-marketplace/redhat-operators-vgxnh"
Mar 10 23:38:00 crc kubenswrapper[4919]: I0310 23:38:00.660917 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bdc9d813-fbe9-4381-8d5d-b30215ce98ff-catalog-content\") pod \"redhat-operators-vgxnh\" (UID: \"bdc9d813-fbe9-4381-8d5d-b30215ce98ff\") " pod="openshift-marketplace/redhat-operators-vgxnh"
Mar 10 23:38:00 crc kubenswrapper[4919]: I0310 23:38:00.661675 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bdc9d813-fbe9-4381-8d5d-b30215ce98ff-catalog-content\") pod \"redhat-operators-vgxnh\" (UID: \"bdc9d813-fbe9-4381-8d5d-b30215ce98ff\") " pod="openshift-marketplace/redhat-operators-vgxnh"
Mar 10 23:38:00 crc kubenswrapper[4919]: I0310 23:38:00.707826 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n4cng\" (UniqueName: \"kubernetes.io/projected/bdc9d813-fbe9-4381-8d5d-b30215ce98ff-kube-api-access-n4cng\") pod \"redhat-operators-vgxnh\" (UID: \"bdc9d813-fbe9-4381-8d5d-b30215ce98ff\") " pod="openshift-marketplace/redhat-operators-vgxnh"
Mar 10 23:38:00 crc kubenswrapper[4919]: I0310 23:38:00.771260 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553098-96g44"]
Mar 10 23:38:00 crc kubenswrapper[4919]: I0310 23:38:00.809480 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vgxnh"
Mar 10 23:38:01 crc kubenswrapper[4919]: I0310 23:38:01.267282 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vgxnh"]
Mar 10 23:38:01 crc kubenswrapper[4919]: I0310 23:38:01.436356 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553098-96g44" event={"ID":"dbf95132-2ab1-4884-8911-64d13302fb1b","Type":"ContainerStarted","Data":"c4e51068c072314d7bf0343fe6bf3dfcae1ef971d5b0a63a68e9f16c84feb459"}
Mar 10 23:38:01 crc kubenswrapper[4919]: I0310 23:38:01.437735 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vgxnh" event={"ID":"bdc9d813-fbe9-4381-8d5d-b30215ce98ff","Type":"ContainerStarted","Data":"2cab1862450427a992777b02e30ccb6d8c693c19c7b0df62c2f7a70c8ac04ff7"}
Mar 10 23:38:01 crc kubenswrapper[4919]: I0310 23:38:01.437783 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vgxnh" event={"ID":"bdc9d813-fbe9-4381-8d5d-b30215ce98ff","Type":"ContainerStarted","Data":"ee1ceb5a24b7e852d1829ceb3def88adca2b539aa1d1a823c3be433fcfe69e88"}
Mar 10 23:38:02 crc kubenswrapper[4919]: I0310 23:38:02.445939 4919 generic.go:334] "Generic (PLEG): container finished" podID="bdc9d813-fbe9-4381-8d5d-b30215ce98ff" containerID="2cab1862450427a992777b02e30ccb6d8c693c19c7b0df62c2f7a70c8ac04ff7" exitCode=0
Mar 10 23:38:02 crc kubenswrapper[4919]: I0310 23:38:02.445995 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vgxnh" event={"ID":"bdc9d813-fbe9-4381-8d5d-b30215ce98ff","Type":"ContainerDied","Data":"2cab1862450427a992777b02e30ccb6d8c693c19c7b0df62c2f7a70c8ac04ff7"}
Mar 10 23:38:02 crc kubenswrapper[4919]: I0310 23:38:02.447830 4919 generic.go:334] "Generic (PLEG): container finished" podID="dbf95132-2ab1-4884-8911-64d13302fb1b" containerID="b3249d607beec9c6d4c28c761bc18363f0bafc78bec3d8ff65101e81f8e2af57" exitCode=0
Mar 10 23:38:02 crc kubenswrapper[4919]: I0310 23:38:02.447866 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553098-96g44" event={"ID":"dbf95132-2ab1-4884-8911-64d13302fb1b","Type":"ContainerDied","Data":"b3249d607beec9c6d4c28c761bc18363f0bafc78bec3d8ff65101e81f8e2af57"}
Mar 10 23:38:03 crc kubenswrapper[4919]: I0310 23:38:03.458249 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vgxnh" event={"ID":"bdc9d813-fbe9-4381-8d5d-b30215ce98ff","Type":"ContainerStarted","Data":"b352b35e3e360c6b5d279296e05e900cd217405bc7af72a01e100d5302317c9f"}
Mar 10 23:38:03 crc kubenswrapper[4919]: I0310 23:38:03.858588 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553098-96g44"
Mar 10 23:38:03 crc kubenswrapper[4919]: I0310 23:38:03.927034 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4jj2w\" (UniqueName: \"kubernetes.io/projected/dbf95132-2ab1-4884-8911-64d13302fb1b-kube-api-access-4jj2w\") pod \"dbf95132-2ab1-4884-8911-64d13302fb1b\" (UID: \"dbf95132-2ab1-4884-8911-64d13302fb1b\") "
Mar 10 23:38:03 crc kubenswrapper[4919]: I0310 23:38:03.936098 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dbf95132-2ab1-4884-8911-64d13302fb1b-kube-api-access-4jj2w" (OuterVolumeSpecName: "kube-api-access-4jj2w") pod "dbf95132-2ab1-4884-8911-64d13302fb1b" (UID: "dbf95132-2ab1-4884-8911-64d13302fb1b"). InnerVolumeSpecName "kube-api-access-4jj2w". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 23:38:04 crc kubenswrapper[4919]: I0310 23:38:04.029176 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4jj2w\" (UniqueName: \"kubernetes.io/projected/dbf95132-2ab1-4884-8911-64d13302fb1b-kube-api-access-4jj2w\") on node \"crc\" DevicePath \"\""
Mar 10 23:38:04 crc kubenswrapper[4919]: I0310 23:38:04.470859 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553098-96g44"
Mar 10 23:38:04 crc kubenswrapper[4919]: I0310 23:38:04.470869 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553098-96g44" event={"ID":"dbf95132-2ab1-4884-8911-64d13302fb1b","Type":"ContainerDied","Data":"c4e51068c072314d7bf0343fe6bf3dfcae1ef971d5b0a63a68e9f16c84feb459"}
Mar 10 23:38:04 crc kubenswrapper[4919]: I0310 23:38:04.470927 4919 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c4e51068c072314d7bf0343fe6bf3dfcae1ef971d5b0a63a68e9f16c84feb459"
Mar 10 23:38:04 crc kubenswrapper[4919]: I0310 23:38:04.473247 4919 generic.go:334] "Generic (PLEG): container finished" podID="bdc9d813-fbe9-4381-8d5d-b30215ce98ff" containerID="b352b35e3e360c6b5d279296e05e900cd217405bc7af72a01e100d5302317c9f" exitCode=0
Mar 10 23:38:04 crc kubenswrapper[4919]: I0310 23:38:04.473299 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vgxnh" event={"ID":"bdc9d813-fbe9-4381-8d5d-b30215ce98ff","Type":"ContainerDied","Data":"b352b35e3e360c6b5d279296e05e900cd217405bc7af72a01e100d5302317c9f"}
Mar 10 23:38:04 crc kubenswrapper[4919]: I0310 23:38:04.937343 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553092-fdlds"]
Mar 10 23:38:04 crc kubenswrapper[4919]: I0310 23:38:04.947765 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553092-fdlds"]
Mar 10 23:38:05 crc kubenswrapper[4919]: I0310 23:38:05.489421 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74a92de7-baac-4521-bf31-cb416ae4bd54" path="/var/lib/kubelet/pods/74a92de7-baac-4521-bf31-cb416ae4bd54/volumes"
Mar 10 23:38:05 crc kubenswrapper[4919]: I0310 23:38:05.490478 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vgxnh" event={"ID":"bdc9d813-fbe9-4381-8d5d-b30215ce98ff","Type":"ContainerStarted","Data":"a2ae0fbe2d82b6cac2582e1587a392e4268f19273bb26bdac4471e86cae90a5a"}
Mar 10 23:38:05 crc kubenswrapper[4919]: I0310 23:38:05.513158 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-vgxnh" podStartSLOduration=2.906564933 podStartE2EDuration="5.513139308s" podCreationTimestamp="2026-03-10 23:38:00 +0000 UTC" firstStartedPulling="2026-03-10 23:38:02.448380413 +0000 UTC m=+6469.690261021" lastFinishedPulling="2026-03-10 23:38:05.054954788 +0000 UTC m=+6472.296835396" observedRunningTime="2026-03-10 23:38:05.509335533 +0000 UTC m=+6472.751216141" watchObservedRunningTime="2026-03-10 23:38:05.513139308 +0000 UTC m=+6472.755019916"
Mar 10 23:38:06 crc kubenswrapper[4919]: I0310 23:38:06.480614 4919 scope.go:117] "RemoveContainer" containerID="01b5110cee4e2da7f2a13bbbe666538ca45148371492ad70b005d779b7734aee"
Mar 10 23:38:06 crc kubenswrapper[4919]: E0310 23:38:06.481156 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc"
Mar 10 23:38:10 crc kubenswrapper[4919]: I0310 23:38:10.810059 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup"
status="unhealthy" pod="openshift-marketplace/redhat-operators-vgxnh"
Mar 10 23:38:10 crc kubenswrapper[4919]: I0310 23:38:10.810418 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-vgxnh"
Mar 10 23:38:11 crc kubenswrapper[4919]: I0310 23:38:11.851035 4919 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-vgxnh" podUID="bdc9d813-fbe9-4381-8d5d-b30215ce98ff" containerName="registry-server" probeResult="failure" output=<
Mar 10 23:38:11 crc kubenswrapper[4919]: timeout: failed to connect service ":50051" within 1s
Mar 10 23:38:11 crc kubenswrapper[4919]: >
Mar 10 23:38:19 crc kubenswrapper[4919]: I0310 23:38:19.482447 4919 scope.go:117] "RemoveContainer" containerID="01b5110cee4e2da7f2a13bbbe666538ca45148371492ad70b005d779b7734aee"
Mar 10 23:38:19 crc kubenswrapper[4919]: E0310 23:38:19.483222 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc"
Mar 10 23:38:20 crc kubenswrapper[4919]: I0310 23:38:20.860518 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-vgxnh"
Mar 10 23:38:20 crc kubenswrapper[4919]: I0310 23:38:20.905691 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-vgxnh"
Mar 10 23:38:21 crc kubenswrapper[4919]: I0310 23:38:21.097812 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vgxnh"]
Mar 10 23:38:22 crc kubenswrapper[4919]: I0310 23:38:22.632139 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-vgxnh" podUID="bdc9d813-fbe9-4381-8d5d-b30215ce98ff" containerName="registry-server" containerID="cri-o://a2ae0fbe2d82b6cac2582e1587a392e4268f19273bb26bdac4471e86cae90a5a" gracePeriod=2
Mar 10 23:38:23 crc kubenswrapper[4919]: I0310 23:38:23.092672 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vgxnh"
Mar 10 23:38:23 crc kubenswrapper[4919]: I0310 23:38:23.281583 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bdc9d813-fbe9-4381-8d5d-b30215ce98ff-utilities\") pod \"bdc9d813-fbe9-4381-8d5d-b30215ce98ff\" (UID: \"bdc9d813-fbe9-4381-8d5d-b30215ce98ff\") "
Mar 10 23:38:23 crc kubenswrapper[4919]: I0310 23:38:23.281649 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n4cng\" (UniqueName: \"kubernetes.io/projected/bdc9d813-fbe9-4381-8d5d-b30215ce98ff-kube-api-access-n4cng\") pod \"bdc9d813-fbe9-4381-8d5d-b30215ce98ff\" (UID: \"bdc9d813-fbe9-4381-8d5d-b30215ce98ff\") "
Mar 10 23:38:23 crc kubenswrapper[4919]: I0310 23:38:23.281818 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bdc9d813-fbe9-4381-8d5d-b30215ce98ff-catalog-content\") pod \"bdc9d813-fbe9-4381-8d5d-b30215ce98ff\" (UID: \"bdc9d813-fbe9-4381-8d5d-b30215ce98ff\") "
Mar 10 23:38:23 crc kubenswrapper[4919]: I0310 23:38:23.284109 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bdc9d813-fbe9-4381-8d5d-b30215ce98ff-utilities" (OuterVolumeSpecName: "utilities") pod "bdc9d813-fbe9-4381-8d5d-b30215ce98ff" (UID: "bdc9d813-fbe9-4381-8d5d-b30215ce98ff"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 23:38:23 crc kubenswrapper[4919]: I0310 23:38:23.289134 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bdc9d813-fbe9-4381-8d5d-b30215ce98ff-kube-api-access-n4cng" (OuterVolumeSpecName: "kube-api-access-n4cng") pod "bdc9d813-fbe9-4381-8d5d-b30215ce98ff" (UID: "bdc9d813-fbe9-4381-8d5d-b30215ce98ff"). InnerVolumeSpecName "kube-api-access-n4cng". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 23:38:23 crc kubenswrapper[4919]: I0310 23:38:23.384930 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n4cng\" (UniqueName: \"kubernetes.io/projected/bdc9d813-fbe9-4381-8d5d-b30215ce98ff-kube-api-access-n4cng\") on node \"crc\" DevicePath \"\""
Mar 10 23:38:23 crc kubenswrapper[4919]: I0310 23:38:23.384971 4919 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bdc9d813-fbe9-4381-8d5d-b30215ce98ff-utilities\") on node \"crc\" DevicePath \"\""
Mar 10 23:38:23 crc kubenswrapper[4919]: I0310 23:38:23.410127 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bdc9d813-fbe9-4381-8d5d-b30215ce98ff-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bdc9d813-fbe9-4381-8d5d-b30215ce98ff" (UID: "bdc9d813-fbe9-4381-8d5d-b30215ce98ff"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 23:38:23 crc kubenswrapper[4919]: I0310 23:38:23.486968 4919 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bdc9d813-fbe9-4381-8d5d-b30215ce98ff-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 10 23:38:23 crc kubenswrapper[4919]: I0310 23:38:23.646551 4919 generic.go:334] "Generic (PLEG): container finished" podID="bdc9d813-fbe9-4381-8d5d-b30215ce98ff" containerID="a2ae0fbe2d82b6cac2582e1587a392e4268f19273bb26bdac4471e86cae90a5a" exitCode=0
Mar 10 23:38:23 crc kubenswrapper[4919]: I0310 23:38:23.646602 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vgxnh"
Mar 10 23:38:23 crc kubenswrapper[4919]: I0310 23:38:23.646612 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vgxnh" event={"ID":"bdc9d813-fbe9-4381-8d5d-b30215ce98ff","Type":"ContainerDied","Data":"a2ae0fbe2d82b6cac2582e1587a392e4268f19273bb26bdac4471e86cae90a5a"}
Mar 10 23:38:23 crc kubenswrapper[4919]: I0310 23:38:23.646652 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vgxnh" event={"ID":"bdc9d813-fbe9-4381-8d5d-b30215ce98ff","Type":"ContainerDied","Data":"ee1ceb5a24b7e852d1829ceb3def88adca2b539aa1d1a823c3be433fcfe69e88"}
Mar 10 23:38:23 crc kubenswrapper[4919]: I0310 23:38:23.646682 4919 scope.go:117] "RemoveContainer" containerID="a2ae0fbe2d82b6cac2582e1587a392e4268f19273bb26bdac4471e86cae90a5a"
Mar 10 23:38:23 crc kubenswrapper[4919]: I0310 23:38:23.675590 4919 scope.go:117] "RemoveContainer" containerID="b352b35e3e360c6b5d279296e05e900cd217405bc7af72a01e100d5302317c9f"
Mar 10 23:38:23 crc kubenswrapper[4919]: I0310 23:38:23.680191 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vgxnh"]
Mar 10 23:38:23 crc kubenswrapper[4919]: I0310 23:38:23.689377 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-vgxnh"]
Mar 10 23:38:23 crc kubenswrapper[4919]: I0310 23:38:23.720751 4919 scope.go:117] "RemoveContainer" containerID="2cab1862450427a992777b02e30ccb6d8c693c19c7b0df62c2f7a70c8ac04ff7"
Mar 10 23:38:23 crc kubenswrapper[4919]: I0310 23:38:23.747933 4919 scope.go:117] "RemoveContainer" containerID="a2ae0fbe2d82b6cac2582e1587a392e4268f19273bb26bdac4471e86cae90a5a"
Mar 10 23:38:23 crc kubenswrapper[4919]: E0310 23:38:23.748697 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a2ae0fbe2d82b6cac2582e1587a392e4268f19273bb26bdac4471e86cae90a5a\": container with ID starting with a2ae0fbe2d82b6cac2582e1587a392e4268f19273bb26bdac4471e86cae90a5a not found: ID does not exist" containerID="a2ae0fbe2d82b6cac2582e1587a392e4268f19273bb26bdac4471e86cae90a5a"
Mar 10 23:38:23 crc kubenswrapper[4919]: I0310 23:38:23.748761 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2ae0fbe2d82b6cac2582e1587a392e4268f19273bb26bdac4471e86cae90a5a"} err="failed to get container status \"a2ae0fbe2d82b6cac2582e1587a392e4268f19273bb26bdac4471e86cae90a5a\": rpc error: code = NotFound desc = could not find container \"a2ae0fbe2d82b6cac2582e1587a392e4268f19273bb26bdac4471e86cae90a5a\": container with ID starting with a2ae0fbe2d82b6cac2582e1587a392e4268f19273bb26bdac4471e86cae90a5a not found: ID does not exist"
Mar 10 23:38:23 crc kubenswrapper[4919]: I0310 23:38:23.748801 4919 scope.go:117] "RemoveContainer" containerID="b352b35e3e360c6b5d279296e05e900cd217405bc7af72a01e100d5302317c9f"
Mar 10 23:38:23 crc kubenswrapper[4919]: E0310 23:38:23.749141 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b352b35e3e360c6b5d279296e05e900cd217405bc7af72a01e100d5302317c9f\": container with ID starting with b352b35e3e360c6b5d279296e05e900cd217405bc7af72a01e100d5302317c9f not found: ID does not exist" containerID="b352b35e3e360c6b5d279296e05e900cd217405bc7af72a01e100d5302317c9f"
Mar 10 23:38:23 crc kubenswrapper[4919]: I0310 23:38:23.749270 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b352b35e3e360c6b5d279296e05e900cd217405bc7af72a01e100d5302317c9f"} err="failed to get container status \"b352b35e3e360c6b5d279296e05e900cd217405bc7af72a01e100d5302317c9f\": rpc error: code = NotFound desc = could not find container \"b352b35e3e360c6b5d279296e05e900cd217405bc7af72a01e100d5302317c9f\": container with ID starting with b352b35e3e360c6b5d279296e05e900cd217405bc7af72a01e100d5302317c9f not found: ID does not exist"
Mar 10 23:38:23 crc kubenswrapper[4919]: I0310 23:38:23.749377 4919 scope.go:117] "RemoveContainer" containerID="2cab1862450427a992777b02e30ccb6d8c693c19c7b0df62c2f7a70c8ac04ff7"
Mar 10 23:38:23 crc kubenswrapper[4919]: E0310 23:38:23.749956 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2cab1862450427a992777b02e30ccb6d8c693c19c7b0df62c2f7a70c8ac04ff7\": container with ID starting with 2cab1862450427a992777b02e30ccb6d8c693c19c7b0df62c2f7a70c8ac04ff7 not found: ID does not exist" containerID="2cab1862450427a992777b02e30ccb6d8c693c19c7b0df62c2f7a70c8ac04ff7"
Mar 10 23:38:23 crc kubenswrapper[4919]: I0310 23:38:23.749990 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2cab1862450427a992777b02e30ccb6d8c693c19c7b0df62c2f7a70c8ac04ff7"} err="failed to get container status \"2cab1862450427a992777b02e30ccb6d8c693c19c7b0df62c2f7a70c8ac04ff7\": rpc error: code = NotFound desc = could not find container \"2cab1862450427a992777b02e30ccb6d8c693c19c7b0df62c2f7a70c8ac04ff7\": container with ID starting with 2cab1862450427a992777b02e30ccb6d8c693c19c7b0df62c2f7a70c8ac04ff7 not found:
ID does not exist" Mar 10 23:38:25 crc kubenswrapper[4919]: I0310 23:38:25.496785 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bdc9d813-fbe9-4381-8d5d-b30215ce98ff" path="/var/lib/kubelet/pods/bdc9d813-fbe9-4381-8d5d-b30215ce98ff/volumes" Mar 10 23:38:33 crc kubenswrapper[4919]: I0310 23:38:33.492303 4919 scope.go:117] "RemoveContainer" containerID="01b5110cee4e2da7f2a13bbbe666538ca45148371492ad70b005d779b7734aee" Mar 10 23:38:33 crc kubenswrapper[4919]: E0310 23:38:33.496050 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" Mar 10 23:38:45 crc kubenswrapper[4919]: I0310 23:38:45.479750 4919 scope.go:117] "RemoveContainer" containerID="01b5110cee4e2da7f2a13bbbe666538ca45148371492ad70b005d779b7734aee" Mar 10 23:38:45 crc kubenswrapper[4919]: E0310 23:38:45.480304 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" Mar 10 23:38:48 crc kubenswrapper[4919]: I0310 23:38:48.651090 4919 scope.go:117] "RemoveContainer" containerID="ca04b3b9e1245c4edc564fd540ff1da04cf8356d5a19f30ecc12ae162c0b54c8" Mar 10 23:38:48 crc kubenswrapper[4919]: I0310 23:38:48.722858 4919 scope.go:117] "RemoveContainer" containerID="91ec06a5d6ade2d97b84f37e80bacc706b634dab3a47b39677c9ba2b287a7917" Mar 10 23:38:48 crc 
kubenswrapper[4919]: I0310 23:38:48.745006 4919 scope.go:117] "RemoveContainer" containerID="d56427461c28a8b6bba6a9f42f766317b3c91785dd1ebb9c67a43871bb2b5013" Mar 10 23:38:48 crc kubenswrapper[4919]: I0310 23:38:48.809654 4919 scope.go:117] "RemoveContainer" containerID="fd142aaa573d119ac6a2889b7e671bf56e9e71d7fc3b14f60149e3d83b44e1fe" Mar 10 23:38:56 crc kubenswrapper[4919]: I0310 23:38:56.480921 4919 scope.go:117] "RemoveContainer" containerID="01b5110cee4e2da7f2a13bbbe666538ca45148371492ad70b005d779b7734aee" Mar 10 23:38:56 crc kubenswrapper[4919]: E0310 23:38:56.481916 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" Mar 10 23:39:08 crc kubenswrapper[4919]: I0310 23:39:08.480167 4919 scope.go:117] "RemoveContainer" containerID="01b5110cee4e2da7f2a13bbbe666538ca45148371492ad70b005d779b7734aee" Mar 10 23:39:08 crc kubenswrapper[4919]: E0310 23:39:08.483064 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" Mar 10 23:39:20 crc kubenswrapper[4919]: I0310 23:39:20.480763 4919 scope.go:117] "RemoveContainer" containerID="01b5110cee4e2da7f2a13bbbe666538ca45148371492ad70b005d779b7734aee" Mar 10 23:39:20 crc kubenswrapper[4919]: E0310 23:39:20.482889 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" Mar 10 23:39:31 crc kubenswrapper[4919]: I0310 23:39:31.480198 4919 scope.go:117] "RemoveContainer" containerID="01b5110cee4e2da7f2a13bbbe666538ca45148371492ad70b005d779b7734aee" Mar 10 23:39:31 crc kubenswrapper[4919]: E0310 23:39:31.480846 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" Mar 10 23:39:42 crc kubenswrapper[4919]: I0310 23:39:42.480638 4919 scope.go:117] "RemoveContainer" containerID="01b5110cee4e2da7f2a13bbbe666538ca45148371492ad70b005d779b7734aee" Mar 10 23:39:42 crc kubenswrapper[4919]: E0310 23:39:42.481829 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" Mar 10 23:39:53 crc kubenswrapper[4919]: I0310 23:39:53.495510 4919 scope.go:117] "RemoveContainer" containerID="01b5110cee4e2da7f2a13bbbe666538ca45148371492ad70b005d779b7734aee" Mar 10 23:39:53 crc kubenswrapper[4919]: E0310 23:39:53.496840 4919 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" Mar 10 23:40:00 crc kubenswrapper[4919]: I0310 23:40:00.135562 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553100-gztqm"] Mar 10 23:40:00 crc kubenswrapper[4919]: E0310 23:40:00.136243 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bdc9d813-fbe9-4381-8d5d-b30215ce98ff" containerName="extract-utilities" Mar 10 23:40:00 crc kubenswrapper[4919]: I0310 23:40:00.136254 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="bdc9d813-fbe9-4381-8d5d-b30215ce98ff" containerName="extract-utilities" Mar 10 23:40:00 crc kubenswrapper[4919]: E0310 23:40:00.136271 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bdc9d813-fbe9-4381-8d5d-b30215ce98ff" containerName="registry-server" Mar 10 23:40:00 crc kubenswrapper[4919]: I0310 23:40:00.136277 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="bdc9d813-fbe9-4381-8d5d-b30215ce98ff" containerName="registry-server" Mar 10 23:40:00 crc kubenswrapper[4919]: E0310 23:40:00.136287 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbf95132-2ab1-4884-8911-64d13302fb1b" containerName="oc" Mar 10 23:40:00 crc kubenswrapper[4919]: I0310 23:40:00.136294 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbf95132-2ab1-4884-8911-64d13302fb1b" containerName="oc" Mar 10 23:40:00 crc kubenswrapper[4919]: E0310 23:40:00.136306 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bdc9d813-fbe9-4381-8d5d-b30215ce98ff" containerName="extract-content" Mar 10 23:40:00 crc kubenswrapper[4919]: I0310 23:40:00.136312 4919 
state_mem.go:107] "Deleted CPUSet assignment" podUID="bdc9d813-fbe9-4381-8d5d-b30215ce98ff" containerName="extract-content" Mar 10 23:40:00 crc kubenswrapper[4919]: I0310 23:40:00.136469 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="dbf95132-2ab1-4884-8911-64d13302fb1b" containerName="oc" Mar 10 23:40:00 crc kubenswrapper[4919]: I0310 23:40:00.136492 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="bdc9d813-fbe9-4381-8d5d-b30215ce98ff" containerName="registry-server" Mar 10 23:40:00 crc kubenswrapper[4919]: I0310 23:40:00.136960 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553100-gztqm" Mar 10 23:40:00 crc kubenswrapper[4919]: I0310 23:40:00.138994 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 23:40:00 crc kubenswrapper[4919]: I0310 23:40:00.139217 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 23:40:00 crc kubenswrapper[4919]: I0310 23:40:00.139343 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wbvtv" Mar 10 23:40:00 crc kubenswrapper[4919]: I0310 23:40:00.154254 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553100-gztqm"] Mar 10 23:40:00 crc kubenswrapper[4919]: I0310 23:40:00.297553 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nq96j\" (UniqueName: \"kubernetes.io/projected/ee6990c8-a58b-409e-82f8-f4b68997040d-kube-api-access-nq96j\") pod \"auto-csr-approver-29553100-gztqm\" (UID: \"ee6990c8-a58b-409e-82f8-f4b68997040d\") " pod="openshift-infra/auto-csr-approver-29553100-gztqm" Mar 10 23:40:00 crc kubenswrapper[4919]: I0310 23:40:00.399281 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-nq96j\" (UniqueName: \"kubernetes.io/projected/ee6990c8-a58b-409e-82f8-f4b68997040d-kube-api-access-nq96j\") pod \"auto-csr-approver-29553100-gztqm\" (UID: \"ee6990c8-a58b-409e-82f8-f4b68997040d\") " pod="openshift-infra/auto-csr-approver-29553100-gztqm" Mar 10 23:40:00 crc kubenswrapper[4919]: I0310 23:40:00.421189 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nq96j\" (UniqueName: \"kubernetes.io/projected/ee6990c8-a58b-409e-82f8-f4b68997040d-kube-api-access-nq96j\") pod \"auto-csr-approver-29553100-gztqm\" (UID: \"ee6990c8-a58b-409e-82f8-f4b68997040d\") " pod="openshift-infra/auto-csr-approver-29553100-gztqm" Mar 10 23:40:00 crc kubenswrapper[4919]: I0310 23:40:00.455733 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553100-gztqm" Mar 10 23:40:00 crc kubenswrapper[4919]: I0310 23:40:00.948990 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553100-gztqm"] Mar 10 23:40:00 crc kubenswrapper[4919]: I0310 23:40:00.968919 4919 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 10 23:40:01 crc kubenswrapper[4919]: I0310 23:40:01.622170 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553100-gztqm" event={"ID":"ee6990c8-a58b-409e-82f8-f4b68997040d","Type":"ContainerStarted","Data":"57b9a37f5dddae03d4e50d5cb4915d4f79c75cc974b508ab7d7f9c500342663f"} Mar 10 23:40:02 crc kubenswrapper[4919]: I0310 23:40:02.630749 4919 generic.go:334] "Generic (PLEG): container finished" podID="ee6990c8-a58b-409e-82f8-f4b68997040d" containerID="e317c9bf1ce4c45a750933e3dc231c4a1cf4bb881b0792d93d64858f60aee5dc" exitCode=0 Mar 10 23:40:02 crc kubenswrapper[4919]: I0310 23:40:02.630817 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553100-gztqm" 
event={"ID":"ee6990c8-a58b-409e-82f8-f4b68997040d","Type":"ContainerDied","Data":"e317c9bf1ce4c45a750933e3dc231c4a1cf4bb881b0792d93d64858f60aee5dc"} Mar 10 23:40:04 crc kubenswrapper[4919]: I0310 23:40:04.019803 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553100-gztqm" Mar 10 23:40:04 crc kubenswrapper[4919]: I0310 23:40:04.076930 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nq96j\" (UniqueName: \"kubernetes.io/projected/ee6990c8-a58b-409e-82f8-f4b68997040d-kube-api-access-nq96j\") pod \"ee6990c8-a58b-409e-82f8-f4b68997040d\" (UID: \"ee6990c8-a58b-409e-82f8-f4b68997040d\") " Mar 10 23:40:04 crc kubenswrapper[4919]: I0310 23:40:04.084155 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee6990c8-a58b-409e-82f8-f4b68997040d-kube-api-access-nq96j" (OuterVolumeSpecName: "kube-api-access-nq96j") pod "ee6990c8-a58b-409e-82f8-f4b68997040d" (UID: "ee6990c8-a58b-409e-82f8-f4b68997040d"). InnerVolumeSpecName "kube-api-access-nq96j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 23:40:04 crc kubenswrapper[4919]: I0310 23:40:04.178907 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nq96j\" (UniqueName: \"kubernetes.io/projected/ee6990c8-a58b-409e-82f8-f4b68997040d-kube-api-access-nq96j\") on node \"crc\" DevicePath \"\"" Mar 10 23:40:04 crc kubenswrapper[4919]: I0310 23:40:04.651766 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553100-gztqm" event={"ID":"ee6990c8-a58b-409e-82f8-f4b68997040d","Type":"ContainerDied","Data":"57b9a37f5dddae03d4e50d5cb4915d4f79c75cc974b508ab7d7f9c500342663f"} Mar 10 23:40:04 crc kubenswrapper[4919]: I0310 23:40:04.651995 4919 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="57b9a37f5dddae03d4e50d5cb4915d4f79c75cc974b508ab7d7f9c500342663f" Mar 10 23:40:04 crc kubenswrapper[4919]: I0310 23:40:04.651823 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553100-gztqm" Mar 10 23:40:05 crc kubenswrapper[4919]: I0310 23:40:05.103937 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553094-tqc5l"] Mar 10 23:40:05 crc kubenswrapper[4919]: I0310 23:40:05.114927 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553094-tqc5l"] Mar 10 23:40:05 crc kubenswrapper[4919]: I0310 23:40:05.493419 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c095361-8631-4a6c-8a04-a9f4a735880a" path="/var/lib/kubelet/pods/8c095361-8631-4a6c-8a04-a9f4a735880a/volumes" Mar 10 23:40:06 crc kubenswrapper[4919]: I0310 23:40:06.480061 4919 scope.go:117] "RemoveContainer" containerID="01b5110cee4e2da7f2a13bbbe666538ca45148371492ad70b005d779b7734aee" Mar 10 23:40:06 crc kubenswrapper[4919]: E0310 23:40:06.480669 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" Mar 10 23:40:17 crc kubenswrapper[4919]: I0310 23:40:17.480203 4919 scope.go:117] "RemoveContainer" containerID="01b5110cee4e2da7f2a13bbbe666538ca45148371492ad70b005d779b7734aee" Mar 10 23:40:17 crc kubenswrapper[4919]: E0310 23:40:17.481119 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" Mar 10 23:40:31 crc kubenswrapper[4919]: I0310 23:40:31.480061 4919 scope.go:117] "RemoveContainer" containerID="01b5110cee4e2da7f2a13bbbe666538ca45148371492ad70b005d779b7734aee" Mar 10 23:40:31 crc kubenswrapper[4919]: E0310 23:40:31.480655 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" Mar 10 23:40:43 crc kubenswrapper[4919]: I0310 23:40:43.489862 4919 scope.go:117] "RemoveContainer" containerID="01b5110cee4e2da7f2a13bbbe666538ca45148371492ad70b005d779b7734aee" Mar 10 23:40:43 crc kubenswrapper[4919]: E0310 23:40:43.492857 4919 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" Mar 10 23:40:48 crc kubenswrapper[4919]: I0310 23:40:48.908819 4919 scope.go:117] "RemoveContainer" containerID="3cadf2e371e8f81ed5b8ee408bfe8cb4fb6c1841dde8bc0a7c63e38a481a7c0b" Mar 10 23:40:55 crc kubenswrapper[4919]: I0310 23:40:55.480669 4919 scope.go:117] "RemoveContainer" containerID="01b5110cee4e2da7f2a13bbbe666538ca45148371492ad70b005d779b7734aee" Mar 10 23:40:55 crc kubenswrapper[4919]: E0310 23:40:55.481886 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" Mar 10 23:41:09 crc kubenswrapper[4919]: I0310 23:41:09.150560 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-56pqp/must-gather-7hlmv"] Mar 10 23:41:09 crc kubenswrapper[4919]: E0310 23:41:09.152260 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee6990c8-a58b-409e-82f8-f4b68997040d" containerName="oc" Mar 10 23:41:09 crc kubenswrapper[4919]: I0310 23:41:09.152373 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee6990c8-a58b-409e-82f8-f4b68997040d" containerName="oc" Mar 10 23:41:09 crc kubenswrapper[4919]: I0310 23:41:09.152627 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee6990c8-a58b-409e-82f8-f4b68997040d" containerName="oc" Mar 10 23:41:09 crc 
kubenswrapper[4919]: I0310 23:41:09.153487 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-56pqp/must-gather-7hlmv" Mar 10 23:41:09 crc kubenswrapper[4919]: I0310 23:41:09.157085 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-56pqp"/"default-dockercfg-6h2mp" Mar 10 23:41:09 crc kubenswrapper[4919]: I0310 23:41:09.157322 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-56pqp"/"openshift-service-ca.crt" Mar 10 23:41:09 crc kubenswrapper[4919]: I0310 23:41:09.157930 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-56pqp"/"kube-root-ca.crt" Mar 10 23:41:09 crc kubenswrapper[4919]: I0310 23:41:09.175115 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-56pqp/must-gather-7hlmv"] Mar 10 23:41:09 crc kubenswrapper[4919]: I0310 23:41:09.281263 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5kbc5\" (UniqueName: \"kubernetes.io/projected/1c06e98a-a64a-424e-b669-1d7815c55b6d-kube-api-access-5kbc5\") pod \"must-gather-7hlmv\" (UID: \"1c06e98a-a64a-424e-b669-1d7815c55b6d\") " pod="openshift-must-gather-56pqp/must-gather-7hlmv" Mar 10 23:41:09 crc kubenswrapper[4919]: I0310 23:41:09.281644 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/1c06e98a-a64a-424e-b669-1d7815c55b6d-must-gather-output\") pod \"must-gather-7hlmv\" (UID: \"1c06e98a-a64a-424e-b669-1d7815c55b6d\") " pod="openshift-must-gather-56pqp/must-gather-7hlmv" Mar 10 23:41:09 crc kubenswrapper[4919]: I0310 23:41:09.383658 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/1c06e98a-a64a-424e-b669-1d7815c55b6d-must-gather-output\") pod 
\"must-gather-7hlmv\" (UID: \"1c06e98a-a64a-424e-b669-1d7815c55b6d\") " pod="openshift-must-gather-56pqp/must-gather-7hlmv" Mar 10 23:41:09 crc kubenswrapper[4919]: I0310 23:41:09.383774 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5kbc5\" (UniqueName: \"kubernetes.io/projected/1c06e98a-a64a-424e-b669-1d7815c55b6d-kube-api-access-5kbc5\") pod \"must-gather-7hlmv\" (UID: \"1c06e98a-a64a-424e-b669-1d7815c55b6d\") " pod="openshift-must-gather-56pqp/must-gather-7hlmv" Mar 10 23:41:09 crc kubenswrapper[4919]: I0310 23:41:09.384254 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/1c06e98a-a64a-424e-b669-1d7815c55b6d-must-gather-output\") pod \"must-gather-7hlmv\" (UID: \"1c06e98a-a64a-424e-b669-1d7815c55b6d\") " pod="openshift-must-gather-56pqp/must-gather-7hlmv" Mar 10 23:41:09 crc kubenswrapper[4919]: I0310 23:41:09.405288 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5kbc5\" (UniqueName: \"kubernetes.io/projected/1c06e98a-a64a-424e-b669-1d7815c55b6d-kube-api-access-5kbc5\") pod \"must-gather-7hlmv\" (UID: \"1c06e98a-a64a-424e-b669-1d7815c55b6d\") " pod="openshift-must-gather-56pqp/must-gather-7hlmv" Mar 10 23:41:09 crc kubenswrapper[4919]: I0310 23:41:09.486008 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-56pqp/must-gather-7hlmv" Mar 10 23:41:09 crc kubenswrapper[4919]: I0310 23:41:09.897413 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-56pqp/must-gather-7hlmv"] Mar 10 23:41:10 crc kubenswrapper[4919]: I0310 23:41:10.301672 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-56pqp/must-gather-7hlmv" event={"ID":"1c06e98a-a64a-424e-b669-1d7815c55b6d","Type":"ContainerStarted","Data":"cd69048ab3b7cd210c0d03225be36cb4f8dd4713007720584ed851ada260ce10"} Mar 10 23:41:10 crc kubenswrapper[4919]: I0310 23:41:10.481190 4919 scope.go:117] "RemoveContainer" containerID="01b5110cee4e2da7f2a13bbbe666538ca45148371492ad70b005d779b7734aee" Mar 10 23:41:11 crc kubenswrapper[4919]: I0310 23:41:11.313622 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" event={"ID":"566678d1-f416-4116-ab20-b30dceb86cdc","Type":"ContainerStarted","Data":"e18b7db8c70c24109d95f2371d5b652db18ddf5ac55b9da74904e0d7d46a12e3"} Mar 10 23:41:17 crc kubenswrapper[4919]: I0310 23:41:17.367516 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-56pqp/must-gather-7hlmv" event={"ID":"1c06e98a-a64a-424e-b669-1d7815c55b6d","Type":"ContainerStarted","Data":"dadb8fa5b9ff0b602f79f6261e442a2f7a2055d6597718df51d88ec0983d81c9"} Mar 10 23:41:17 crc kubenswrapper[4919]: I0310 23:41:17.368547 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-56pqp/must-gather-7hlmv" event={"ID":"1c06e98a-a64a-424e-b669-1d7815c55b6d","Type":"ContainerStarted","Data":"8f34da97e0ebc3975065df1fc1daff70723dcf6dfd96b765745b86a1a50bbd16"} Mar 10 23:41:17 crc kubenswrapper[4919]: I0310 23:41:17.396481 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-56pqp/must-gather-7hlmv" podStartSLOduration=2.073034357 podStartE2EDuration="8.396453917s" 
podCreationTimestamp="2026-03-10 23:41:09 +0000 UTC" firstStartedPulling="2026-03-10 23:41:09.904342244 +0000 UTC m=+6657.146222882" lastFinishedPulling="2026-03-10 23:41:16.227761834 +0000 UTC m=+6663.469642442" observedRunningTime="2026-03-10 23:41:17.383143905 +0000 UTC m=+6664.625024533" watchObservedRunningTime="2026-03-10 23:41:17.396453917 +0000 UTC m=+6664.638334535" Mar 10 23:41:19 crc kubenswrapper[4919]: I0310 23:41:19.308023 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-56pqp/crc-debug-jsk5w"] Mar 10 23:41:19 crc kubenswrapper[4919]: I0310 23:41:19.309104 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-56pqp/crc-debug-jsk5w" Mar 10 23:41:19 crc kubenswrapper[4919]: I0310 23:41:19.415167 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8khwk\" (UniqueName: \"kubernetes.io/projected/1959b02c-dcd6-44ec-98c2-fa39617ea8a2-kube-api-access-8khwk\") pod \"crc-debug-jsk5w\" (UID: \"1959b02c-dcd6-44ec-98c2-fa39617ea8a2\") " pod="openshift-must-gather-56pqp/crc-debug-jsk5w" Mar 10 23:41:19 crc kubenswrapper[4919]: I0310 23:41:19.415647 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1959b02c-dcd6-44ec-98c2-fa39617ea8a2-host\") pod \"crc-debug-jsk5w\" (UID: \"1959b02c-dcd6-44ec-98c2-fa39617ea8a2\") " pod="openshift-must-gather-56pqp/crc-debug-jsk5w" Mar 10 23:41:19 crc kubenswrapper[4919]: I0310 23:41:19.517373 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1959b02c-dcd6-44ec-98c2-fa39617ea8a2-host\") pod \"crc-debug-jsk5w\" (UID: \"1959b02c-dcd6-44ec-98c2-fa39617ea8a2\") " pod="openshift-must-gather-56pqp/crc-debug-jsk5w" Mar 10 23:41:19 crc kubenswrapper[4919]: I0310 23:41:19.517547 4919 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1959b02c-dcd6-44ec-98c2-fa39617ea8a2-host\") pod \"crc-debug-jsk5w\" (UID: \"1959b02c-dcd6-44ec-98c2-fa39617ea8a2\") " pod="openshift-must-gather-56pqp/crc-debug-jsk5w" Mar 10 23:41:19 crc kubenswrapper[4919]: I0310 23:41:19.517637 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8khwk\" (UniqueName: \"kubernetes.io/projected/1959b02c-dcd6-44ec-98c2-fa39617ea8a2-kube-api-access-8khwk\") pod \"crc-debug-jsk5w\" (UID: \"1959b02c-dcd6-44ec-98c2-fa39617ea8a2\") " pod="openshift-must-gather-56pqp/crc-debug-jsk5w" Mar 10 23:41:19 crc kubenswrapper[4919]: I0310 23:41:19.538852 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8khwk\" (UniqueName: \"kubernetes.io/projected/1959b02c-dcd6-44ec-98c2-fa39617ea8a2-kube-api-access-8khwk\") pod \"crc-debug-jsk5w\" (UID: \"1959b02c-dcd6-44ec-98c2-fa39617ea8a2\") " pod="openshift-must-gather-56pqp/crc-debug-jsk5w" Mar 10 23:41:19 crc kubenswrapper[4919]: I0310 23:41:19.634064 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-56pqp/crc-debug-jsk5w" Mar 10 23:41:20 crc kubenswrapper[4919]: I0310 23:41:20.395509 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-56pqp/crc-debug-jsk5w" event={"ID":"1959b02c-dcd6-44ec-98c2-fa39617ea8a2","Type":"ContainerStarted","Data":"b333dd563989fa8011ecbf83c37f3311e318304287a0d7739eb00ecfaf831fc3"} Mar 10 23:41:31 crc kubenswrapper[4919]: I0310 23:41:31.492160 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-56pqp/crc-debug-jsk5w" event={"ID":"1959b02c-dcd6-44ec-98c2-fa39617ea8a2","Type":"ContainerStarted","Data":"6759152db79246c8f366a301c0862779647c325434096936d8dc75c2382c30ec"} Mar 10 23:41:31 crc kubenswrapper[4919]: I0310 23:41:31.517287 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-56pqp/crc-debug-jsk5w" podStartSLOduration=1.809740368 podStartE2EDuration="12.517268248s" podCreationTimestamp="2026-03-10 23:41:19 +0000 UTC" firstStartedPulling="2026-03-10 23:41:19.672137092 +0000 UTC m=+6666.914017700" lastFinishedPulling="2026-03-10 23:41:30.379664962 +0000 UTC m=+6677.621545580" observedRunningTime="2026-03-10 23:41:31.512464308 +0000 UTC m=+6678.754344946" watchObservedRunningTime="2026-03-10 23:41:31.517268248 +0000 UTC m=+6678.759148856" Mar 10 23:41:46 crc kubenswrapper[4919]: I0310 23:41:46.612272 4919 generic.go:334] "Generic (PLEG): container finished" podID="1959b02c-dcd6-44ec-98c2-fa39617ea8a2" containerID="6759152db79246c8f366a301c0862779647c325434096936d8dc75c2382c30ec" exitCode=0 Mar 10 23:41:46 crc kubenswrapper[4919]: I0310 23:41:46.612474 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-56pqp/crc-debug-jsk5w" event={"ID":"1959b02c-dcd6-44ec-98c2-fa39617ea8a2","Type":"ContainerDied","Data":"6759152db79246c8f366a301c0862779647c325434096936d8dc75c2382c30ec"} Mar 10 23:41:47 crc kubenswrapper[4919]: I0310 23:41:47.724017 4919 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-56pqp/crc-debug-jsk5w" Mar 10 23:41:47 crc kubenswrapper[4919]: I0310 23:41:47.769727 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-56pqp/crc-debug-jsk5w"] Mar 10 23:41:47 crc kubenswrapper[4919]: I0310 23:41:47.779671 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-56pqp/crc-debug-jsk5w"] Mar 10 23:41:47 crc kubenswrapper[4919]: I0310 23:41:47.891646 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8khwk\" (UniqueName: \"kubernetes.io/projected/1959b02c-dcd6-44ec-98c2-fa39617ea8a2-kube-api-access-8khwk\") pod \"1959b02c-dcd6-44ec-98c2-fa39617ea8a2\" (UID: \"1959b02c-dcd6-44ec-98c2-fa39617ea8a2\") " Mar 10 23:41:47 crc kubenswrapper[4919]: I0310 23:41:47.891713 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1959b02c-dcd6-44ec-98c2-fa39617ea8a2-host\") pod \"1959b02c-dcd6-44ec-98c2-fa39617ea8a2\" (UID: \"1959b02c-dcd6-44ec-98c2-fa39617ea8a2\") " Mar 10 23:41:47 crc kubenswrapper[4919]: I0310 23:41:47.891882 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1959b02c-dcd6-44ec-98c2-fa39617ea8a2-host" (OuterVolumeSpecName: "host") pod "1959b02c-dcd6-44ec-98c2-fa39617ea8a2" (UID: "1959b02c-dcd6-44ec-98c2-fa39617ea8a2"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 23:41:47 crc kubenswrapper[4919]: I0310 23:41:47.892235 4919 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1959b02c-dcd6-44ec-98c2-fa39617ea8a2-host\") on node \"crc\" DevicePath \"\"" Mar 10 23:41:47 crc kubenswrapper[4919]: I0310 23:41:47.898689 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1959b02c-dcd6-44ec-98c2-fa39617ea8a2-kube-api-access-8khwk" (OuterVolumeSpecName: "kube-api-access-8khwk") pod "1959b02c-dcd6-44ec-98c2-fa39617ea8a2" (UID: "1959b02c-dcd6-44ec-98c2-fa39617ea8a2"). InnerVolumeSpecName "kube-api-access-8khwk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 23:41:47 crc kubenswrapper[4919]: I0310 23:41:47.993933 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8khwk\" (UniqueName: \"kubernetes.io/projected/1959b02c-dcd6-44ec-98c2-fa39617ea8a2-kube-api-access-8khwk\") on node \"crc\" DevicePath \"\"" Mar 10 23:41:48 crc kubenswrapper[4919]: I0310 23:41:48.631763 4919 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b333dd563989fa8011ecbf83c37f3311e318304287a0d7739eb00ecfaf831fc3" Mar 10 23:41:48 crc kubenswrapper[4919]: I0310 23:41:48.631838 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-56pqp/crc-debug-jsk5w" Mar 10 23:41:48 crc kubenswrapper[4919]: I0310 23:41:48.990214 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-56pqp/crc-debug-fp4qw"] Mar 10 23:41:48 crc kubenswrapper[4919]: E0310 23:41:48.990583 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1959b02c-dcd6-44ec-98c2-fa39617ea8a2" containerName="container-00" Mar 10 23:41:48 crc kubenswrapper[4919]: I0310 23:41:48.990596 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="1959b02c-dcd6-44ec-98c2-fa39617ea8a2" containerName="container-00" Mar 10 23:41:48 crc kubenswrapper[4919]: I0310 23:41:48.990753 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="1959b02c-dcd6-44ec-98c2-fa39617ea8a2" containerName="container-00" Mar 10 23:41:48 crc kubenswrapper[4919]: I0310 23:41:48.991272 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-56pqp/crc-debug-fp4qw" Mar 10 23:41:49 crc kubenswrapper[4919]: I0310 23:41:49.020057 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbf7n\" (UniqueName: \"kubernetes.io/projected/fd94f395-3bb8-4521-80d7-9fe48813f94f-kube-api-access-bbf7n\") pod \"crc-debug-fp4qw\" (UID: \"fd94f395-3bb8-4521-80d7-9fe48813f94f\") " pod="openshift-must-gather-56pqp/crc-debug-fp4qw" Mar 10 23:41:49 crc kubenswrapper[4919]: I0310 23:41:49.020265 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fd94f395-3bb8-4521-80d7-9fe48813f94f-host\") pod \"crc-debug-fp4qw\" (UID: \"fd94f395-3bb8-4521-80d7-9fe48813f94f\") " pod="openshift-must-gather-56pqp/crc-debug-fp4qw" Mar 10 23:41:49 crc kubenswrapper[4919]: I0310 23:41:49.122801 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bbf7n\" (UniqueName: 
\"kubernetes.io/projected/fd94f395-3bb8-4521-80d7-9fe48813f94f-kube-api-access-bbf7n\") pod \"crc-debug-fp4qw\" (UID: \"fd94f395-3bb8-4521-80d7-9fe48813f94f\") " pod="openshift-must-gather-56pqp/crc-debug-fp4qw" Mar 10 23:41:49 crc kubenswrapper[4919]: I0310 23:41:49.123053 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fd94f395-3bb8-4521-80d7-9fe48813f94f-host\") pod \"crc-debug-fp4qw\" (UID: \"fd94f395-3bb8-4521-80d7-9fe48813f94f\") " pod="openshift-must-gather-56pqp/crc-debug-fp4qw" Mar 10 23:41:49 crc kubenswrapper[4919]: I0310 23:41:49.123280 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fd94f395-3bb8-4521-80d7-9fe48813f94f-host\") pod \"crc-debug-fp4qw\" (UID: \"fd94f395-3bb8-4521-80d7-9fe48813f94f\") " pod="openshift-must-gather-56pqp/crc-debug-fp4qw" Mar 10 23:41:49 crc kubenswrapper[4919]: I0310 23:41:49.157161 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbf7n\" (UniqueName: \"kubernetes.io/projected/fd94f395-3bb8-4521-80d7-9fe48813f94f-kube-api-access-bbf7n\") pod \"crc-debug-fp4qw\" (UID: \"fd94f395-3bb8-4521-80d7-9fe48813f94f\") " pod="openshift-must-gather-56pqp/crc-debug-fp4qw" Mar 10 23:41:49 crc kubenswrapper[4919]: I0310 23:41:49.321351 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-56pqp/crc-debug-fp4qw" Mar 10 23:41:49 crc kubenswrapper[4919]: W0310 23:41:49.364967 4919 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfd94f395_3bb8_4521_80d7_9fe48813f94f.slice/crio-88c8c8cd2e0751aae2cd7798921d78bc15c7982fe6878b698bae22ddaeac9f56 WatchSource:0}: Error finding container 88c8c8cd2e0751aae2cd7798921d78bc15c7982fe6878b698bae22ddaeac9f56: Status 404 returned error can't find the container with id 88c8c8cd2e0751aae2cd7798921d78bc15c7982fe6878b698bae22ddaeac9f56 Mar 10 23:41:49 crc kubenswrapper[4919]: I0310 23:41:49.489914 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1959b02c-dcd6-44ec-98c2-fa39617ea8a2" path="/var/lib/kubelet/pods/1959b02c-dcd6-44ec-98c2-fa39617ea8a2/volumes" Mar 10 23:41:49 crc kubenswrapper[4919]: I0310 23:41:49.640292 4919 generic.go:334] "Generic (PLEG): container finished" podID="fd94f395-3bb8-4521-80d7-9fe48813f94f" containerID="1548f8776e1706bbc6b8a4918042953fc0b381ed4db455c05801372028e44566" exitCode=1 Mar 10 23:41:49 crc kubenswrapper[4919]: I0310 23:41:49.640345 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-56pqp/crc-debug-fp4qw" event={"ID":"fd94f395-3bb8-4521-80d7-9fe48813f94f","Type":"ContainerDied","Data":"1548f8776e1706bbc6b8a4918042953fc0b381ed4db455c05801372028e44566"} Mar 10 23:41:49 crc kubenswrapper[4919]: I0310 23:41:49.640383 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-56pqp/crc-debug-fp4qw" event={"ID":"fd94f395-3bb8-4521-80d7-9fe48813f94f","Type":"ContainerStarted","Data":"88c8c8cd2e0751aae2cd7798921d78bc15c7982fe6878b698bae22ddaeac9f56"} Mar 10 23:41:49 crc kubenswrapper[4919]: I0310 23:41:49.684177 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-56pqp/crc-debug-fp4qw"] Mar 10 23:41:49 crc kubenswrapper[4919]: I0310 23:41:49.690890 4919 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-56pqp/crc-debug-fp4qw"] Mar 10 23:41:50 crc kubenswrapper[4919]: I0310 23:41:50.717649 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-56pqp/crc-debug-fp4qw" Mar 10 23:41:50 crc kubenswrapper[4919]: I0310 23:41:50.749751 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fd94f395-3bb8-4521-80d7-9fe48813f94f-host\") pod \"fd94f395-3bb8-4521-80d7-9fe48813f94f\" (UID: \"fd94f395-3bb8-4521-80d7-9fe48813f94f\") " Mar 10 23:41:50 crc kubenswrapper[4919]: I0310 23:41:50.749879 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fd94f395-3bb8-4521-80d7-9fe48813f94f-host" (OuterVolumeSpecName: "host") pod "fd94f395-3bb8-4521-80d7-9fe48813f94f" (UID: "fd94f395-3bb8-4521-80d7-9fe48813f94f"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 23:41:50 crc kubenswrapper[4919]: I0310 23:41:50.749969 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bbf7n\" (UniqueName: \"kubernetes.io/projected/fd94f395-3bb8-4521-80d7-9fe48813f94f-kube-api-access-bbf7n\") pod \"fd94f395-3bb8-4521-80d7-9fe48813f94f\" (UID: \"fd94f395-3bb8-4521-80d7-9fe48813f94f\") " Mar 10 23:41:50 crc kubenswrapper[4919]: I0310 23:41:50.750457 4919 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fd94f395-3bb8-4521-80d7-9fe48813f94f-host\") on node \"crc\" DevicePath \"\"" Mar 10 23:41:50 crc kubenswrapper[4919]: I0310 23:41:50.755112 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd94f395-3bb8-4521-80d7-9fe48813f94f-kube-api-access-bbf7n" (OuterVolumeSpecName: "kube-api-access-bbf7n") pod "fd94f395-3bb8-4521-80d7-9fe48813f94f" (UID: 
"fd94f395-3bb8-4521-80d7-9fe48813f94f"). InnerVolumeSpecName "kube-api-access-bbf7n". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 23:41:50 crc kubenswrapper[4919]: I0310 23:41:50.851856 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bbf7n\" (UniqueName: \"kubernetes.io/projected/fd94f395-3bb8-4521-80d7-9fe48813f94f-kube-api-access-bbf7n\") on node \"crc\" DevicePath \"\"" Mar 10 23:41:51 crc kubenswrapper[4919]: I0310 23:41:51.493365 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd94f395-3bb8-4521-80d7-9fe48813f94f" path="/var/lib/kubelet/pods/fd94f395-3bb8-4521-80d7-9fe48813f94f/volumes" Mar 10 23:41:51 crc kubenswrapper[4919]: I0310 23:41:51.658254 4919 scope.go:117] "RemoveContainer" containerID="1548f8776e1706bbc6b8a4918042953fc0b381ed4db455c05801372028e44566" Mar 10 23:41:51 crc kubenswrapper[4919]: I0310 23:41:51.658274 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-56pqp/crc-debug-fp4qw" Mar 10 23:42:00 crc kubenswrapper[4919]: I0310 23:42:00.150358 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553102-cg25b"] Mar 10 23:42:00 crc kubenswrapper[4919]: E0310 23:42:00.151227 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd94f395-3bb8-4521-80d7-9fe48813f94f" containerName="container-00" Mar 10 23:42:00 crc kubenswrapper[4919]: I0310 23:42:00.151240 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd94f395-3bb8-4521-80d7-9fe48813f94f" containerName="container-00" Mar 10 23:42:00 crc kubenswrapper[4919]: I0310 23:42:00.151421 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd94f395-3bb8-4521-80d7-9fe48813f94f" containerName="container-00" Mar 10 23:42:00 crc kubenswrapper[4919]: I0310 23:42:00.151900 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553102-cg25b" Mar 10 23:42:00 crc kubenswrapper[4919]: I0310 23:42:00.154268 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wbvtv" Mar 10 23:42:00 crc kubenswrapper[4919]: I0310 23:42:00.154890 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 23:42:00 crc kubenswrapper[4919]: I0310 23:42:00.155000 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 23:42:00 crc kubenswrapper[4919]: I0310 23:42:00.203099 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553102-cg25b"] Mar 10 23:42:00 crc kubenswrapper[4919]: I0310 23:42:00.225756 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gsgqj\" (UniqueName: \"kubernetes.io/projected/ac5c6cdf-ce81-4b26-92f3-9131bcdcd530-kube-api-access-gsgqj\") pod \"auto-csr-approver-29553102-cg25b\" (UID: \"ac5c6cdf-ce81-4b26-92f3-9131bcdcd530\") " pod="openshift-infra/auto-csr-approver-29553102-cg25b" Mar 10 23:42:00 crc kubenswrapper[4919]: I0310 23:42:00.327248 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gsgqj\" (UniqueName: \"kubernetes.io/projected/ac5c6cdf-ce81-4b26-92f3-9131bcdcd530-kube-api-access-gsgqj\") pod \"auto-csr-approver-29553102-cg25b\" (UID: \"ac5c6cdf-ce81-4b26-92f3-9131bcdcd530\") " pod="openshift-infra/auto-csr-approver-29553102-cg25b" Mar 10 23:42:00 crc kubenswrapper[4919]: I0310 23:42:00.360628 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gsgqj\" (UniqueName: \"kubernetes.io/projected/ac5c6cdf-ce81-4b26-92f3-9131bcdcd530-kube-api-access-gsgqj\") pod \"auto-csr-approver-29553102-cg25b\" (UID: \"ac5c6cdf-ce81-4b26-92f3-9131bcdcd530\") " 
pod="openshift-infra/auto-csr-approver-29553102-cg25b" Mar 10 23:42:00 crc kubenswrapper[4919]: I0310 23:42:00.478358 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553102-cg25b" Mar 10 23:42:00 crc kubenswrapper[4919]: I0310 23:42:00.817311 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553102-cg25b"] Mar 10 23:42:01 crc kubenswrapper[4919]: I0310 23:42:01.741077 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553102-cg25b" event={"ID":"ac5c6cdf-ce81-4b26-92f3-9131bcdcd530","Type":"ContainerStarted","Data":"4d47d5bddd0f566de04f431f55b5f1ecde02347391b326b8431b6710f2e05b08"} Mar 10 23:42:02 crc kubenswrapper[4919]: I0310 23:42:02.752270 4919 generic.go:334] "Generic (PLEG): container finished" podID="ac5c6cdf-ce81-4b26-92f3-9131bcdcd530" containerID="16d2207425b535a91738629ca4c5c321f46a4159bb611815693f68a0cdd17d10" exitCode=0 Mar 10 23:42:02 crc kubenswrapper[4919]: I0310 23:42:02.752357 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553102-cg25b" event={"ID":"ac5c6cdf-ce81-4b26-92f3-9131bcdcd530","Type":"ContainerDied","Data":"16d2207425b535a91738629ca4c5c321f46a4159bb611815693f68a0cdd17d10"} Mar 10 23:42:04 crc kubenswrapper[4919]: I0310 23:42:04.104088 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553102-cg25b" Mar 10 23:42:04 crc kubenswrapper[4919]: I0310 23:42:04.292333 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gsgqj\" (UniqueName: \"kubernetes.io/projected/ac5c6cdf-ce81-4b26-92f3-9131bcdcd530-kube-api-access-gsgqj\") pod \"ac5c6cdf-ce81-4b26-92f3-9131bcdcd530\" (UID: \"ac5c6cdf-ce81-4b26-92f3-9131bcdcd530\") " Mar 10 23:42:04 crc kubenswrapper[4919]: I0310 23:42:04.299031 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac5c6cdf-ce81-4b26-92f3-9131bcdcd530-kube-api-access-gsgqj" (OuterVolumeSpecName: "kube-api-access-gsgqj") pod "ac5c6cdf-ce81-4b26-92f3-9131bcdcd530" (UID: "ac5c6cdf-ce81-4b26-92f3-9131bcdcd530"). InnerVolumeSpecName "kube-api-access-gsgqj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 23:42:04 crc kubenswrapper[4919]: I0310 23:42:04.394368 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gsgqj\" (UniqueName: \"kubernetes.io/projected/ac5c6cdf-ce81-4b26-92f3-9131bcdcd530-kube-api-access-gsgqj\") on node \"crc\" DevicePath \"\"" Mar 10 23:42:04 crc kubenswrapper[4919]: I0310 23:42:04.770671 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553102-cg25b" event={"ID":"ac5c6cdf-ce81-4b26-92f3-9131bcdcd530","Type":"ContainerDied","Data":"4d47d5bddd0f566de04f431f55b5f1ecde02347391b326b8431b6710f2e05b08"} Mar 10 23:42:04 crc kubenswrapper[4919]: I0310 23:42:04.770725 4919 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4d47d5bddd0f566de04f431f55b5f1ecde02347391b326b8431b6710f2e05b08" Mar 10 23:42:04 crc kubenswrapper[4919]: I0310 23:42:04.770774 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553102-cg25b" Mar 10 23:42:05 crc kubenswrapper[4919]: I0310 23:42:05.161238 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553096-2n9ld"] Mar 10 23:42:05 crc kubenswrapper[4919]: I0310 23:42:05.166408 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553096-2n9ld"] Mar 10 23:42:05 crc kubenswrapper[4919]: I0310 23:42:05.489668 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="67e64a1a-500c-476f-bec9-ad4fa04765b0" path="/var/lib/kubelet/pods/67e64a1a-500c-476f-bec9-ad4fa04765b0/volumes" Mar 10 23:42:11 crc kubenswrapper[4919]: I0310 23:42:11.086936 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-66d5956757-c9nr5_0daed46b-e851-4d26-9867-827a7973aece/init/0.log" Mar 10 23:42:11 crc kubenswrapper[4919]: I0310 23:42:11.286345 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-66d5956757-c9nr5_0daed46b-e851-4d26-9867-827a7973aece/init/0.log" Mar 10 23:42:11 crc kubenswrapper[4919]: I0310 23:42:11.287732 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-66d5956757-c9nr5_0daed46b-e851-4d26-9867-827a7973aece/dnsmasq-dns/0.log" Mar 10 23:42:11 crc kubenswrapper[4919]: I0310 23:42:11.512475 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-7c8596f879-gt5t2_b314d88f-85bc-49c0-9090-8f59e1f16982/keystone-api/0.log" Mar 10 23:42:11 crc kubenswrapper[4919]: I0310 23:42:11.559422 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-copy-data_1cc53092-aa12-4c0a-8de9-d1c9e1bbbc19/adoption/0.log" Mar 10 23:42:11 crc kubenswrapper[4919]: I0310 23:42:11.767020 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_6d3d38d6-13a6-4aeb-850c-96c069d15e64/mysql-bootstrap/0.log" Mar 10 23:42:12 crc 
kubenswrapper[4919]: I0310 23:42:12.020776 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_6d3d38d6-13a6-4aeb-850c-96c069d15e64/galera/0.log" Mar 10 23:42:12 crc kubenswrapper[4919]: I0310 23:42:12.039978 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_6d3d38d6-13a6-4aeb-850c-96c069d15e64/mysql-bootstrap/0.log" Mar 10 23:42:12 crc kubenswrapper[4919]: I0310 23:42:12.208458 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_8137bb36-2761-459f-a700-3c497dbe0937/mysql-bootstrap/0.log" Mar 10 23:42:12 crc kubenswrapper[4919]: I0310 23:42:12.337823 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_8137bb36-2761-459f-a700-3c497dbe0937/mysql-bootstrap/0.log" Mar 10 23:42:12 crc kubenswrapper[4919]: I0310 23:42:12.470878 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_8137bb36-2761-459f-a700-3c497dbe0937/galera/0.log" Mar 10 23:42:12 crc kubenswrapper[4919]: I0310 23:42:12.522350 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_165d12a6-fb5d-4a40-a903-3d8176434969/openstackclient/0.log" Mar 10 23:42:12 crc kubenswrapper[4919]: I0310 23:42:12.733070 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-copy-data_278a021a-4088-4dea-809d-3068fff9357b/adoption/0.log" Mar 10 23:42:12 crc kubenswrapper[4919]: I0310 23:42:12.874488 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_365b5cdc-86c4-4d46-b368-c12553375bce/openstack-network-exporter/0.log" Mar 10 23:42:12 crc kubenswrapper[4919]: I0310 23:42:12.885258 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_be3159e3-fc48-4a94-a13f-6f179e5d8ad9/memcached/0.log" Mar 10 23:42:12 crc kubenswrapper[4919]: I0310 23:42:12.909383 4919 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack_ovn-northd-0_365b5cdc-86c4-4d46-b368-c12553375bce/ovn-northd/0.log" Mar 10 23:42:13 crc kubenswrapper[4919]: I0310 23:42:13.055894 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_cf54d78f-9a52-4e30-9f54-ebbe74ad8c6a/openstack-network-exporter/0.log" Mar 10 23:42:13 crc kubenswrapper[4919]: I0310 23:42:13.098227 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_cf54d78f-9a52-4e30-9f54-ebbe74ad8c6a/ovsdbserver-nb/0.log" Mar 10 23:42:13 crc kubenswrapper[4919]: I0310 23:42:13.180950 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-1_40dccac8-8ebf-4106-8211-8baffde0a119/openstack-network-exporter/0.log" Mar 10 23:42:13 crc kubenswrapper[4919]: I0310 23:42:13.247033 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-1_40dccac8-8ebf-4106-8211-8baffde0a119/ovsdbserver-nb/0.log" Mar 10 23:42:13 crc kubenswrapper[4919]: I0310 23:42:13.326646 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-2_b407bebc-85fa-422e-a5ec-e6d586e4ae11/openstack-network-exporter/0.log" Mar 10 23:42:13 crc kubenswrapper[4919]: I0310 23:42:13.380185 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-2_b407bebc-85fa-422e-a5ec-e6d586e4ae11/ovsdbserver-nb/0.log" Mar 10 23:42:13 crc kubenswrapper[4919]: I0310 23:42:13.452221 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_814add4c-f2ef-480d-b701-5c1ea6b8a834/openstack-network-exporter/0.log" Mar 10 23:42:13 crc kubenswrapper[4919]: I0310 23:42:13.555631 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_814add4c-f2ef-480d-b701-5c1ea6b8a834/ovsdbserver-sb/0.log" Mar 10 23:42:13 crc kubenswrapper[4919]: I0310 23:42:13.690057 4919 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovsdbserver-sb-1_8746d615-5c8a-463f-9b24-b8a4e86fd413/openstack-network-exporter/0.log" Mar 10 23:42:13 crc kubenswrapper[4919]: I0310 23:42:13.710224 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-1_8746d615-5c8a-463f-9b24-b8a4e86fd413/ovsdbserver-sb/0.log" Mar 10 23:42:13 crc kubenswrapper[4919]: I0310 23:42:13.836252 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-2_c0c2cff6-2a56-4b36-b872-cdafb3bf419a/openstack-network-exporter/0.log" Mar 10 23:42:13 crc kubenswrapper[4919]: I0310 23:42:13.872663 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-2_c0c2cff6-2a56-4b36-b872-cdafb3bf419a/ovsdbserver-sb/0.log" Mar 10 23:42:14 crc kubenswrapper[4919]: I0310 23:42:14.013231 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_2101a481-5a05-4fe5-ae97-d3dd73ee8153/setup-container/0.log" Mar 10 23:42:14 crc kubenswrapper[4919]: I0310 23:42:14.172895 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_2101a481-5a05-4fe5-ae97-d3dd73ee8153/rabbitmq/0.log" Mar 10 23:42:14 crc kubenswrapper[4919]: I0310 23:42:14.187989 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_7a0cd471-ab8f-4de8-bd7f-0392d7d7f903/setup-container/0.log" Mar 10 23:42:14 crc kubenswrapper[4919]: I0310 23:42:14.209917 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_2101a481-5a05-4fe5-ae97-d3dd73ee8153/setup-container/0.log" Mar 10 23:42:14 crc kubenswrapper[4919]: I0310 23:42:14.355179 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_7a0cd471-ab8f-4de8-bd7f-0392d7d7f903/setup-container/0.log" Mar 10 23:42:14 crc kubenswrapper[4919]: I0310 23:42:14.376690 4919 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_rabbitmq-server-0_7a0cd471-ab8f-4de8-bd7f-0392d7d7f903/rabbitmq/0.log" Mar 10 23:42:29 crc kubenswrapper[4919]: I0310 23:42:29.402345 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49ba6q872_b0b2027b-b395-4af2-967e-77bdd5ccc44e/util/0.log" Mar 10 23:42:29 crc kubenswrapper[4919]: I0310 23:42:29.556437 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49ba6q872_b0b2027b-b395-4af2-967e-77bdd5ccc44e/util/0.log" Mar 10 23:42:29 crc kubenswrapper[4919]: I0310 23:42:29.620266 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49ba6q872_b0b2027b-b395-4af2-967e-77bdd5ccc44e/pull/0.log" Mar 10 23:42:29 crc kubenswrapper[4919]: I0310 23:42:29.620916 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49ba6q872_b0b2027b-b395-4af2-967e-77bdd5ccc44e/pull/0.log" Mar 10 23:42:29 crc kubenswrapper[4919]: I0310 23:42:29.765700 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49ba6q872_b0b2027b-b395-4af2-967e-77bdd5ccc44e/util/0.log" Mar 10 23:42:29 crc kubenswrapper[4919]: I0310 23:42:29.768705 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49ba6q872_b0b2027b-b395-4af2-967e-77bdd5ccc44e/pull/0.log" Mar 10 23:42:29 crc kubenswrapper[4919]: I0310 23:42:29.786064 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49ba6q872_b0b2027b-b395-4af2-967e-77bdd5ccc44e/extract/0.log" Mar 10 23:42:30 crc kubenswrapper[4919]: I0310 23:42:30.163629 4919 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-66d56f6ff4-tw8tv_25bf7b22-52f2-40ef-bd19-efe9c061e8b8/manager/0.log" Mar 10 23:42:30 crc kubenswrapper[4919]: I0310 23:42:30.444765 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-5964f64c48-5jgr2_1b4083a5-cc88-4c92-9612-f06c0a36936d/manager/0.log" Mar 10 23:42:30 crc kubenswrapper[4919]: I0310 23:42:30.673591 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-77b6666d85-4bbgm_ce86157b-1544-4db8-8e8c-20d1ec8dde0d/manager/0.log" Mar 10 23:42:30 crc kubenswrapper[4919]: I0310 23:42:30.872224 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-6d9d6b584d-tsrgx_3de5f2ad-6f5f-4e54-99ad-0d00736dfdab/manager/0.log" Mar 10 23:42:31 crc kubenswrapper[4919]: I0310 23:42:31.276446 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6bbb499bbc-7gm62_8b54176b-b55a-43cd-9492-6f7d10b4e637/manager/0.log" Mar 10 23:42:31 crc kubenswrapper[4919]: I0310 23:42:31.600380 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-5995f4446f-sxg86_a197f90a-0c8f-47e6-ad18-f3c61cd51445/manager/0.log" Mar 10 23:42:31 crc kubenswrapper[4919]: I0310 23:42:31.734036 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-684f77d66d-b75jn_200ec9b1-fcd0-4699-9002-7efdc5447a6d/manager/0.log" Mar 10 23:42:31 crc kubenswrapper[4919]: I0310 23:42:31.935164 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-68f45f9d9f-7fkgj_7c849cfd-ade6-46ce-80f0-09df981cdafd/manager/0.log" Mar 10 23:42:32 crc kubenswrapper[4919]: I0310 
23:42:32.187679 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-658d4cdd5-fjrhv_588d49a0-7a32-4b7a-be73-ec3897d4653b/manager/0.log" Mar 10 23:42:32 crc kubenswrapper[4919]: I0310 23:42:32.430720 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-984cd4dcf-tm5dc_7fb7e7ce-d1a7-41e2-876e-42808a70c9e2/manager/0.log" Mar 10 23:42:32 crc kubenswrapper[4919]: I0310 23:42:32.449863 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-776c5696bf-tvmsl_cbf180f9-e934-462e-926a-520b21f22550/manager/0.log" Mar 10 23:42:32 crc kubenswrapper[4919]: I0310 23:42:32.750349 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-5f4f55cb5c-7llx7_d9571b0c-bd43-4789-942b-f833e4166418/manager/0.log" Mar 10 23:42:32 crc kubenswrapper[4919]: I0310 23:42:32.760861 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-569cc54c5-dk2nj_d6ac04fc-f3ea-4b69-aba1-b27490967c0e/manager/0.log" Mar 10 23:42:33 crc kubenswrapper[4919]: I0310 23:42:33.136717 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-6647d7885f8bgkq_768191d9-b4b4-44da-a525-b2ba92e1ceea/manager/0.log" Mar 10 23:42:33 crc kubenswrapper[4919]: I0310 23:42:33.552248 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-6cf8df7788-mq4dj_679d8d06-0146-48c0-b8dc-c26063604a77/operator/0.log" Mar 10 23:42:33 crc kubenswrapper[4919]: I0310 23:42:33.678795 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-nh4zc_e0cd634b-f079-4a58-9ac7-7c4f7e90756f/registry-server/0.log" Mar 10 23:42:33 crc kubenswrapper[4919]: 
I0310 23:42:33.875488 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-bbc5b68f9-xxntv_20945b47-e70f-45b2-b137-9525a0ec1b31/manager/0.log" Mar 10 23:42:33 crc kubenswrapper[4919]: I0310 23:42:33.953106 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-574d45c66c-9ccwt_512df763-915b-447f-b5c3-0756788212d6/manager/0.log" Mar 10 23:42:34 crc kubenswrapper[4919]: I0310 23:42:34.121751 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-rmbn4_a60700c9-46a9-4e84-9c13-afbb96d42f55/operator/0.log" Mar 10 23:42:34 crc kubenswrapper[4919]: I0310 23:42:34.329429 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-677c674df7-drwxk_2da3cd1a-93e3-4487-ab1f-b022662e06c0/manager/0.log" Mar 10 23:42:34 crc kubenswrapper[4919]: I0310 23:42:34.432351 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-6cd66dbd4b-dds4c_83ccb82d-2701-46e7-9aa9-3ed962bc31e0/manager/0.log" Mar 10 23:42:34 crc kubenswrapper[4919]: I0310 23:42:34.542132 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5c5cb9c4d7-kl9vj_fca37d2a-51b1-4b60-a7e5-0ebfbf87fb04/manager/0.log" Mar 10 23:42:34 crc kubenswrapper[4919]: I0310 23:42:34.714902 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6dd88c6f67-lkpnv_fb78ece6-180c-4237-8017-ec3087e0f47b/manager/0.log" Mar 10 23:42:34 crc kubenswrapper[4919]: I0310 23:42:34.899254 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-6679ddfdc7-pnxdc_fd276fb4-a047-472f-903a-8b343ec3894b/manager/0.log" Mar 10 23:42:41 crc 
kubenswrapper[4919]: I0310 23:42:41.801789 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-677bd678f7-k9qtp_c6a398e1-5a44-4601-98c8-9ac478b0502c/manager/0.log" Mar 10 23:42:48 crc kubenswrapper[4919]: I0310 23:42:48.992263 4919 scope.go:117] "RemoveContainer" containerID="9b716b3991689d9dc6745310d9df4ff0c9a599514415ae3497bf51a4dcb7568f" Mar 10 23:42:53 crc kubenswrapper[4919]: I0310 23:42:53.893835 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-qnjs8_425d1938-0668-4f53-aaee-dbc4a93297c7/control-plane-machine-set-operator/0.log" Mar 10 23:42:54 crc kubenswrapper[4919]: I0310 23:42:54.062967 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-mvwrl_69fc2ea4-6491-4109-8f4a-8b8fb369dcce/kube-rbac-proxy/0.log" Mar 10 23:42:54 crc kubenswrapper[4919]: I0310 23:42:54.063714 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-mvwrl_69fc2ea4-6491-4109-8f4a-8b8fb369dcce/machine-api-operator/0.log" Mar 10 23:43:06 crc kubenswrapper[4919]: I0310 23:43:06.508159 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-545d4d4674-tvl6b_febfa755-4560-47ec-9358-7a73e1336fb9/cert-manager-controller/0.log" Mar 10 23:43:06 crc kubenswrapper[4919]: I0310 23:43:06.667898 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-5545bd876-svlfd_515ce4db-5f17-4b18-894e-f93e7f82459c/cert-manager-cainjector/0.log" Mar 10 23:43:06 crc kubenswrapper[4919]: I0310 23:43:06.714843 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-6888856db4-c6lpm_11c3a36f-b65a-420e-aeaa-c1d372444660/cert-manager-webhook/0.log" Mar 10 23:43:19 crc kubenswrapper[4919]: I0310 23:43:19.120185 4919 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-5v5ct"] Mar 10 23:43:19 crc kubenswrapper[4919]: E0310 23:43:19.121057 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac5c6cdf-ce81-4b26-92f3-9131bcdcd530" containerName="oc" Mar 10 23:43:19 crc kubenswrapper[4919]: I0310 23:43:19.121070 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac5c6cdf-ce81-4b26-92f3-9131bcdcd530" containerName="oc" Mar 10 23:43:19 crc kubenswrapper[4919]: I0310 23:43:19.121257 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac5c6cdf-ce81-4b26-92f3-9131bcdcd530" containerName="oc" Mar 10 23:43:19 crc kubenswrapper[4919]: I0310 23:43:19.124194 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5v5ct" Mar 10 23:43:19 crc kubenswrapper[4919]: I0310 23:43:19.148930 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5v5ct"] Mar 10 23:43:19 crc kubenswrapper[4919]: I0310 23:43:19.201734 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/235a617f-ca0a-4cf0-840b-0615a73b696c-utilities\") pod \"certified-operators-5v5ct\" (UID: \"235a617f-ca0a-4cf0-840b-0615a73b696c\") " pod="openshift-marketplace/certified-operators-5v5ct" Mar 10 23:43:19 crc kubenswrapper[4919]: I0310 23:43:19.201805 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wctcw\" (UniqueName: \"kubernetes.io/projected/235a617f-ca0a-4cf0-840b-0615a73b696c-kube-api-access-wctcw\") pod \"certified-operators-5v5ct\" (UID: \"235a617f-ca0a-4cf0-840b-0615a73b696c\") " pod="openshift-marketplace/certified-operators-5v5ct" Mar 10 23:43:19 crc kubenswrapper[4919]: I0310 23:43:19.202205 4919 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/235a617f-ca0a-4cf0-840b-0615a73b696c-catalog-content\") pod \"certified-operators-5v5ct\" (UID: \"235a617f-ca0a-4cf0-840b-0615a73b696c\") " pod="openshift-marketplace/certified-operators-5v5ct" Mar 10 23:43:19 crc kubenswrapper[4919]: I0310 23:43:19.303818 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/235a617f-ca0a-4cf0-840b-0615a73b696c-catalog-content\") pod \"certified-operators-5v5ct\" (UID: \"235a617f-ca0a-4cf0-840b-0615a73b696c\") " pod="openshift-marketplace/certified-operators-5v5ct" Mar 10 23:43:19 crc kubenswrapper[4919]: I0310 23:43:19.303961 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/235a617f-ca0a-4cf0-840b-0615a73b696c-utilities\") pod \"certified-operators-5v5ct\" (UID: \"235a617f-ca0a-4cf0-840b-0615a73b696c\") " pod="openshift-marketplace/certified-operators-5v5ct" Mar 10 23:43:19 crc kubenswrapper[4919]: I0310 23:43:19.303988 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wctcw\" (UniqueName: \"kubernetes.io/projected/235a617f-ca0a-4cf0-840b-0615a73b696c-kube-api-access-wctcw\") pod \"certified-operators-5v5ct\" (UID: \"235a617f-ca0a-4cf0-840b-0615a73b696c\") " pod="openshift-marketplace/certified-operators-5v5ct" Mar 10 23:43:19 crc kubenswrapper[4919]: I0310 23:43:19.305004 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/235a617f-ca0a-4cf0-840b-0615a73b696c-catalog-content\") pod \"certified-operators-5v5ct\" (UID: \"235a617f-ca0a-4cf0-840b-0615a73b696c\") " pod="openshift-marketplace/certified-operators-5v5ct" Mar 10 23:43:19 crc kubenswrapper[4919]: I0310 23:43:19.305015 4919 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/235a617f-ca0a-4cf0-840b-0615a73b696c-utilities\") pod \"certified-operators-5v5ct\" (UID: \"235a617f-ca0a-4cf0-840b-0615a73b696c\") " pod="openshift-marketplace/certified-operators-5v5ct" Mar 10 23:43:19 crc kubenswrapper[4919]: I0310 23:43:19.331374 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wctcw\" (UniqueName: \"kubernetes.io/projected/235a617f-ca0a-4cf0-840b-0615a73b696c-kube-api-access-wctcw\") pod \"certified-operators-5v5ct\" (UID: \"235a617f-ca0a-4cf0-840b-0615a73b696c\") " pod="openshift-marketplace/certified-operators-5v5ct" Mar 10 23:43:19 crc kubenswrapper[4919]: I0310 23:43:19.469420 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5v5ct" Mar 10 23:43:19 crc kubenswrapper[4919]: I0310 23:43:19.646870 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-5dcbbd79cf-wv4hr_357040e9-d683-4de3-bf54-c414218b1705/nmstate-console-plugin/0.log" Mar 10 23:43:19 crc kubenswrapper[4919]: I0310 23:43:19.779244 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5v5ct"] Mar 10 23:43:20 crc kubenswrapper[4919]: I0310 23:43:20.085624 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-69594cc75-h5rdm_556dc7e4-4c98-4f59-8e60-a7997f766706/kube-rbac-proxy/0.log" Mar 10 23:43:20 crc kubenswrapper[4919]: I0310 23:43:20.239845 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-69594cc75-h5rdm_556dc7e4-4c98-4f59-8e60-a7997f766706/nmstate-metrics/0.log" Mar 10 23:43:20 crc kubenswrapper[4919]: I0310 23:43:20.242224 4919 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-handler-whpkz_d7074097-fdce-4f8e-b343-2c1196f997a2/nmstate-handler/0.log" Mar 10 23:43:20 crc kubenswrapper[4919]: I0310 23:43:20.333376 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-75c5dccd6c-pgjbr_7ff7f3f6-029a-4a7e-818c-658741a6afe9/nmstate-operator/0.log" Mar 10 23:43:20 crc kubenswrapper[4919]: I0310 23:43:20.406474 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-786f45cff4-n9hfz_ff569aa2-f933-44e4-bd70-ba0ff19efd02/nmstate-webhook/0.log" Mar 10 23:43:20 crc kubenswrapper[4919]: I0310 23:43:20.413356 4919 generic.go:334] "Generic (PLEG): container finished" podID="235a617f-ca0a-4cf0-840b-0615a73b696c" containerID="9d727e8df19a5818781e8645b309cb3e38f89a4d8f3a1c8cdf71c32694712c6e" exitCode=0 Mar 10 23:43:20 crc kubenswrapper[4919]: I0310 23:43:20.413421 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5v5ct" event={"ID":"235a617f-ca0a-4cf0-840b-0615a73b696c","Type":"ContainerDied","Data":"9d727e8df19a5818781e8645b309cb3e38f89a4d8f3a1c8cdf71c32694712c6e"} Mar 10 23:43:20 crc kubenswrapper[4919]: I0310 23:43:20.413473 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5v5ct" event={"ID":"235a617f-ca0a-4cf0-840b-0615a73b696c","Type":"ContainerStarted","Data":"3abed13f26e2d83a9a07d10fd3d45a10bc18b13b88b7108d92d12ae805a0c5e7"} Mar 10 23:43:21 crc kubenswrapper[4919]: I0310 23:43:21.422012 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5v5ct" event={"ID":"235a617f-ca0a-4cf0-840b-0615a73b696c","Type":"ContainerStarted","Data":"ddc152bf6a66d665dbd179cd4569fc831c313f2de0eb6bc7efe84c292eaffb63"} Mar 10 23:43:22 crc kubenswrapper[4919]: I0310 23:43:22.430173 4919 generic.go:334] "Generic (PLEG): container finished" podID="235a617f-ca0a-4cf0-840b-0615a73b696c" 
containerID="ddc152bf6a66d665dbd179cd4569fc831c313f2de0eb6bc7efe84c292eaffb63" exitCode=0 Mar 10 23:43:22 crc kubenswrapper[4919]: I0310 23:43:22.430253 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5v5ct" event={"ID":"235a617f-ca0a-4cf0-840b-0615a73b696c","Type":"ContainerDied","Data":"ddc152bf6a66d665dbd179cd4569fc831c313f2de0eb6bc7efe84c292eaffb63"} Mar 10 23:43:23 crc kubenswrapper[4919]: I0310 23:43:23.442523 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5v5ct" event={"ID":"235a617f-ca0a-4cf0-840b-0615a73b696c","Type":"ContainerStarted","Data":"00d42170298fd5739b5f2b0f796079d5fa154fda0caaf9ac8fdbd21f00a1aa37"} Mar 10 23:43:23 crc kubenswrapper[4919]: I0310 23:43:23.462518 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-5v5ct" podStartSLOduration=2.016637257 podStartE2EDuration="4.462499462s" podCreationTimestamp="2026-03-10 23:43:19 +0000 UTC" firstStartedPulling="2026-03-10 23:43:20.41519477 +0000 UTC m=+6787.657075378" lastFinishedPulling="2026-03-10 23:43:22.861056975 +0000 UTC m=+6790.102937583" observedRunningTime="2026-03-10 23:43:23.459761267 +0000 UTC m=+6790.701641875" watchObservedRunningTime="2026-03-10 23:43:23.462499462 +0000 UTC m=+6790.704380090" Mar 10 23:43:29 crc kubenswrapper[4919]: I0310 23:43:29.175609 4919 patch_prober.go:28] interesting pod/machine-config-daemon-z7v4t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 23:43:29 crc kubenswrapper[4919]: I0310 23:43:29.176290 4919 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" containerName="machine-config-daemon" 
probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 23:43:29 crc kubenswrapper[4919]: I0310 23:43:29.470005 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-5v5ct" Mar 10 23:43:29 crc kubenswrapper[4919]: I0310 23:43:29.471262 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-5v5ct" Mar 10 23:43:29 crc kubenswrapper[4919]: I0310 23:43:29.517509 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-5v5ct" Mar 10 23:43:29 crc kubenswrapper[4919]: I0310 23:43:29.600461 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-5v5ct" Mar 10 23:43:32 crc kubenswrapper[4919]: I0310 23:43:32.114734 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5v5ct"] Mar 10 23:43:32 crc kubenswrapper[4919]: I0310 23:43:32.544368 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-5v5ct" podUID="235a617f-ca0a-4cf0-840b-0615a73b696c" containerName="registry-server" containerID="cri-o://00d42170298fd5739b5f2b0f796079d5fa154fda0caaf9ac8fdbd21f00a1aa37" gracePeriod=2 Mar 10 23:43:33 crc kubenswrapper[4919]: I0310 23:43:33.033423 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-5v5ct" Mar 10 23:43:33 crc kubenswrapper[4919]: I0310 23:43:33.136859 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/235a617f-ca0a-4cf0-840b-0615a73b696c-catalog-content\") pod \"235a617f-ca0a-4cf0-840b-0615a73b696c\" (UID: \"235a617f-ca0a-4cf0-840b-0615a73b696c\") " Mar 10 23:43:33 crc kubenswrapper[4919]: I0310 23:43:33.136994 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/235a617f-ca0a-4cf0-840b-0615a73b696c-utilities\") pod \"235a617f-ca0a-4cf0-840b-0615a73b696c\" (UID: \"235a617f-ca0a-4cf0-840b-0615a73b696c\") " Mar 10 23:43:33 crc kubenswrapper[4919]: I0310 23:43:33.137062 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wctcw\" (UniqueName: \"kubernetes.io/projected/235a617f-ca0a-4cf0-840b-0615a73b696c-kube-api-access-wctcw\") pod \"235a617f-ca0a-4cf0-840b-0615a73b696c\" (UID: \"235a617f-ca0a-4cf0-840b-0615a73b696c\") " Mar 10 23:43:33 crc kubenswrapper[4919]: I0310 23:43:33.138431 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/235a617f-ca0a-4cf0-840b-0615a73b696c-utilities" (OuterVolumeSpecName: "utilities") pod "235a617f-ca0a-4cf0-840b-0615a73b696c" (UID: "235a617f-ca0a-4cf0-840b-0615a73b696c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 23:43:33 crc kubenswrapper[4919]: I0310 23:43:33.143196 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/235a617f-ca0a-4cf0-840b-0615a73b696c-kube-api-access-wctcw" (OuterVolumeSpecName: "kube-api-access-wctcw") pod "235a617f-ca0a-4cf0-840b-0615a73b696c" (UID: "235a617f-ca0a-4cf0-840b-0615a73b696c"). InnerVolumeSpecName "kube-api-access-wctcw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 23:43:33 crc kubenswrapper[4919]: I0310 23:43:33.206278 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/235a617f-ca0a-4cf0-840b-0615a73b696c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "235a617f-ca0a-4cf0-840b-0615a73b696c" (UID: "235a617f-ca0a-4cf0-840b-0615a73b696c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 23:43:33 crc kubenswrapper[4919]: I0310 23:43:33.239112 4919 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/235a617f-ca0a-4cf0-840b-0615a73b696c-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 23:43:33 crc kubenswrapper[4919]: I0310 23:43:33.239331 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wctcw\" (UniqueName: \"kubernetes.io/projected/235a617f-ca0a-4cf0-840b-0615a73b696c-kube-api-access-wctcw\") on node \"crc\" DevicePath \"\"" Mar 10 23:43:33 crc kubenswrapper[4919]: I0310 23:43:33.239446 4919 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/235a617f-ca0a-4cf0-840b-0615a73b696c-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 23:43:33 crc kubenswrapper[4919]: I0310 23:43:33.552782 4919 generic.go:334] "Generic (PLEG): container finished" podID="235a617f-ca0a-4cf0-840b-0615a73b696c" containerID="00d42170298fd5739b5f2b0f796079d5fa154fda0caaf9ac8fdbd21f00a1aa37" exitCode=0 Mar 10 23:43:33 crc kubenswrapper[4919]: I0310 23:43:33.552830 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-5v5ct" Mar 10 23:43:33 crc kubenswrapper[4919]: I0310 23:43:33.552845 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5v5ct" event={"ID":"235a617f-ca0a-4cf0-840b-0615a73b696c","Type":"ContainerDied","Data":"00d42170298fd5739b5f2b0f796079d5fa154fda0caaf9ac8fdbd21f00a1aa37"} Mar 10 23:43:33 crc kubenswrapper[4919]: I0310 23:43:33.553277 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5v5ct" event={"ID":"235a617f-ca0a-4cf0-840b-0615a73b696c","Type":"ContainerDied","Data":"3abed13f26e2d83a9a07d10fd3d45a10bc18b13b88b7108d92d12ae805a0c5e7"} Mar 10 23:43:33 crc kubenswrapper[4919]: I0310 23:43:33.553309 4919 scope.go:117] "RemoveContainer" containerID="00d42170298fd5739b5f2b0f796079d5fa154fda0caaf9ac8fdbd21f00a1aa37" Mar 10 23:43:33 crc kubenswrapper[4919]: I0310 23:43:33.568678 4919 scope.go:117] "RemoveContainer" containerID="ddc152bf6a66d665dbd179cd4569fc831c313f2de0eb6bc7efe84c292eaffb63" Mar 10 23:43:33 crc kubenswrapper[4919]: I0310 23:43:33.573542 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5v5ct"] Mar 10 23:43:33 crc kubenswrapper[4919]: I0310 23:43:33.581066 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-5v5ct"] Mar 10 23:43:33 crc kubenswrapper[4919]: I0310 23:43:33.584447 4919 scope.go:117] "RemoveContainer" containerID="9d727e8df19a5818781e8645b309cb3e38f89a4d8f3a1c8cdf71c32694712c6e" Mar 10 23:43:33 crc kubenswrapper[4919]: I0310 23:43:33.618031 4919 scope.go:117] "RemoveContainer" containerID="00d42170298fd5739b5f2b0f796079d5fa154fda0caaf9ac8fdbd21f00a1aa37" Mar 10 23:43:33 crc kubenswrapper[4919]: E0310 23:43:33.618591 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"00d42170298fd5739b5f2b0f796079d5fa154fda0caaf9ac8fdbd21f00a1aa37\": container with ID starting with 00d42170298fd5739b5f2b0f796079d5fa154fda0caaf9ac8fdbd21f00a1aa37 not found: ID does not exist" containerID="00d42170298fd5739b5f2b0f796079d5fa154fda0caaf9ac8fdbd21f00a1aa37" Mar 10 23:43:33 crc kubenswrapper[4919]: I0310 23:43:33.618625 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00d42170298fd5739b5f2b0f796079d5fa154fda0caaf9ac8fdbd21f00a1aa37"} err="failed to get container status \"00d42170298fd5739b5f2b0f796079d5fa154fda0caaf9ac8fdbd21f00a1aa37\": rpc error: code = NotFound desc = could not find container \"00d42170298fd5739b5f2b0f796079d5fa154fda0caaf9ac8fdbd21f00a1aa37\": container with ID starting with 00d42170298fd5739b5f2b0f796079d5fa154fda0caaf9ac8fdbd21f00a1aa37 not found: ID does not exist" Mar 10 23:43:33 crc kubenswrapper[4919]: I0310 23:43:33.618646 4919 scope.go:117] "RemoveContainer" containerID="ddc152bf6a66d665dbd179cd4569fc831c313f2de0eb6bc7efe84c292eaffb63" Mar 10 23:43:33 crc kubenswrapper[4919]: E0310 23:43:33.618971 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ddc152bf6a66d665dbd179cd4569fc831c313f2de0eb6bc7efe84c292eaffb63\": container with ID starting with ddc152bf6a66d665dbd179cd4569fc831c313f2de0eb6bc7efe84c292eaffb63 not found: ID does not exist" containerID="ddc152bf6a66d665dbd179cd4569fc831c313f2de0eb6bc7efe84c292eaffb63" Mar 10 23:43:33 crc kubenswrapper[4919]: I0310 23:43:33.618995 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ddc152bf6a66d665dbd179cd4569fc831c313f2de0eb6bc7efe84c292eaffb63"} err="failed to get container status \"ddc152bf6a66d665dbd179cd4569fc831c313f2de0eb6bc7efe84c292eaffb63\": rpc error: code = NotFound desc = could not find container \"ddc152bf6a66d665dbd179cd4569fc831c313f2de0eb6bc7efe84c292eaffb63\": container with ID 
starting with ddc152bf6a66d665dbd179cd4569fc831c313f2de0eb6bc7efe84c292eaffb63 not found: ID does not exist" Mar 10 23:43:33 crc kubenswrapper[4919]: I0310 23:43:33.619010 4919 scope.go:117] "RemoveContainer" containerID="9d727e8df19a5818781e8645b309cb3e38f89a4d8f3a1c8cdf71c32694712c6e" Mar 10 23:43:33 crc kubenswrapper[4919]: E0310 23:43:33.619244 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d727e8df19a5818781e8645b309cb3e38f89a4d8f3a1c8cdf71c32694712c6e\": container with ID starting with 9d727e8df19a5818781e8645b309cb3e38f89a4d8f3a1c8cdf71c32694712c6e not found: ID does not exist" containerID="9d727e8df19a5818781e8645b309cb3e38f89a4d8f3a1c8cdf71c32694712c6e" Mar 10 23:43:33 crc kubenswrapper[4919]: I0310 23:43:33.619266 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d727e8df19a5818781e8645b309cb3e38f89a4d8f3a1c8cdf71c32694712c6e"} err="failed to get container status \"9d727e8df19a5818781e8645b309cb3e38f89a4d8f3a1c8cdf71c32694712c6e\": rpc error: code = NotFound desc = could not find container \"9d727e8df19a5818781e8645b309cb3e38f89a4d8f3a1c8cdf71c32694712c6e\": container with ID starting with 9d727e8df19a5818781e8645b309cb3e38f89a4d8f3a1c8cdf71c32694712c6e not found: ID does not exist" Mar 10 23:43:35 crc kubenswrapper[4919]: I0310 23:43:35.490703 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="235a617f-ca0a-4cf0-840b-0615a73b696c" path="/var/lib/kubelet/pods/235a617f-ca0a-4cf0-840b-0615a73b696c/volumes" Mar 10 23:43:47 crc kubenswrapper[4919]: I0310 23:43:47.455084 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-86ddb6bd46-djm9l_1fc5122b-0945-47cf-8a35-cd496338269b/kube-rbac-proxy/0.log" Mar 10 23:43:47 crc kubenswrapper[4919]: I0310 23:43:47.726138 4919 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-tfrn4_81fac61b-b480-427e-ba18-1c699bf5620a/cp-frr-files/0.log" Mar 10 23:43:47 crc kubenswrapper[4919]: I0310 23:43:47.767130 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-86ddb6bd46-djm9l_1fc5122b-0945-47cf-8a35-cd496338269b/controller/0.log" Mar 10 23:43:47 crc kubenswrapper[4919]: I0310 23:43:47.888220 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tfrn4_81fac61b-b480-427e-ba18-1c699bf5620a/cp-reloader/0.log" Mar 10 23:43:47 crc kubenswrapper[4919]: I0310 23:43:47.900658 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tfrn4_81fac61b-b480-427e-ba18-1c699bf5620a/cp-frr-files/0.log" Mar 10 23:43:47 crc kubenswrapper[4919]: I0310 23:43:47.928118 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tfrn4_81fac61b-b480-427e-ba18-1c699bf5620a/cp-reloader/0.log" Mar 10 23:43:47 crc kubenswrapper[4919]: I0310 23:43:47.959696 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tfrn4_81fac61b-b480-427e-ba18-1c699bf5620a/cp-metrics/0.log" Mar 10 23:43:48 crc kubenswrapper[4919]: I0310 23:43:48.155897 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tfrn4_81fac61b-b480-427e-ba18-1c699bf5620a/cp-metrics/0.log" Mar 10 23:43:48 crc kubenswrapper[4919]: I0310 23:43:48.157588 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tfrn4_81fac61b-b480-427e-ba18-1c699bf5620a/cp-frr-files/0.log" Mar 10 23:43:48 crc kubenswrapper[4919]: I0310 23:43:48.164081 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tfrn4_81fac61b-b480-427e-ba18-1c699bf5620a/cp-metrics/0.log" Mar 10 23:43:48 crc kubenswrapper[4919]: I0310 23:43:48.165689 4919 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-tfrn4_81fac61b-b480-427e-ba18-1c699bf5620a/cp-reloader/0.log" Mar 10 23:43:48 crc kubenswrapper[4919]: I0310 23:43:48.349416 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tfrn4_81fac61b-b480-427e-ba18-1c699bf5620a/cp-metrics/0.log" Mar 10 23:43:48 crc kubenswrapper[4919]: I0310 23:43:48.349496 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tfrn4_81fac61b-b480-427e-ba18-1c699bf5620a/cp-frr-files/0.log" Mar 10 23:43:48 crc kubenswrapper[4919]: I0310 23:43:48.369449 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tfrn4_81fac61b-b480-427e-ba18-1c699bf5620a/cp-reloader/0.log" Mar 10 23:43:48 crc kubenswrapper[4919]: I0310 23:43:48.391132 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tfrn4_81fac61b-b480-427e-ba18-1c699bf5620a/controller/0.log" Mar 10 23:43:48 crc kubenswrapper[4919]: I0310 23:43:48.517492 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tfrn4_81fac61b-b480-427e-ba18-1c699bf5620a/frr-metrics/0.log" Mar 10 23:43:48 crc kubenswrapper[4919]: I0310 23:43:48.529934 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tfrn4_81fac61b-b480-427e-ba18-1c699bf5620a/kube-rbac-proxy/0.log" Mar 10 23:43:48 crc kubenswrapper[4919]: I0310 23:43:48.600054 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tfrn4_81fac61b-b480-427e-ba18-1c699bf5620a/kube-rbac-proxy-frr/0.log" Mar 10 23:43:48 crc kubenswrapper[4919]: I0310 23:43:48.731913 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tfrn4_81fac61b-b480-427e-ba18-1c699bf5620a/reloader/0.log" Mar 10 23:43:48 crc kubenswrapper[4919]: I0310 23:43:48.842616 4919 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7f989f654f-l4f9h_522e3074-d06d-4537-b6e4-cd60e9d7c216/frr-k8s-webhook-server/0.log" Mar 10 23:43:49 crc kubenswrapper[4919]: I0310 23:43:49.165962 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-696968d477-7csvb_180eb62c-34a6-4361-856a-419f01dc12df/manager/0.log" Mar 10 23:43:49 crc kubenswrapper[4919]: I0310 23:43:49.316921 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-867bfdd6b6-88z9n_d80e3fd6-53cc-4b97-83c0-45a7b093a415/webhook-server/0.log" Mar 10 23:43:49 crc kubenswrapper[4919]: I0310 23:43:49.445015 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-j9g4p_e6950f65-bbda-4846-9826-042bd5dbaf87/kube-rbac-proxy/0.log" Mar 10 23:43:50 crc kubenswrapper[4919]: I0310 23:43:50.107456 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-j9g4p_e6950f65-bbda-4846-9826-042bd5dbaf87/speaker/0.log" Mar 10 23:43:50 crc kubenswrapper[4919]: I0310 23:43:50.582144 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tfrn4_81fac61b-b480-427e-ba18-1c699bf5620a/frr/0.log" Mar 10 23:43:59 crc kubenswrapper[4919]: I0310 23:43:59.175632 4919 patch_prober.go:28] interesting pod/machine-config-daemon-z7v4t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 23:43:59 crc kubenswrapper[4919]: I0310 23:43:59.176190 4919 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused"
Mar 10 23:44:00 crc kubenswrapper[4919]: I0310 23:44:00.141146 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553104-bkhwg"]
Mar 10 23:44:00 crc kubenswrapper[4919]: E0310 23:44:00.141568 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="235a617f-ca0a-4cf0-840b-0615a73b696c" containerName="extract-content"
Mar 10 23:44:00 crc kubenswrapper[4919]: I0310 23:44:00.141584 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="235a617f-ca0a-4cf0-840b-0615a73b696c" containerName="extract-content"
Mar 10 23:44:00 crc kubenswrapper[4919]: E0310 23:44:00.141605 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="235a617f-ca0a-4cf0-840b-0615a73b696c" containerName="extract-utilities"
Mar 10 23:44:00 crc kubenswrapper[4919]: I0310 23:44:00.141612 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="235a617f-ca0a-4cf0-840b-0615a73b696c" containerName="extract-utilities"
Mar 10 23:44:00 crc kubenswrapper[4919]: E0310 23:44:00.141623 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="235a617f-ca0a-4cf0-840b-0615a73b696c" containerName="registry-server"
Mar 10 23:44:00 crc kubenswrapper[4919]: I0310 23:44:00.141629 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="235a617f-ca0a-4cf0-840b-0615a73b696c" containerName="registry-server"
Mar 10 23:44:00 crc kubenswrapper[4919]: I0310 23:44:00.141795 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="235a617f-ca0a-4cf0-840b-0615a73b696c" containerName="registry-server"
Mar 10 23:44:00 crc kubenswrapper[4919]: I0310 23:44:00.142362 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553104-bkhwg"
Mar 10 23:44:00 crc kubenswrapper[4919]: I0310 23:44:00.145179 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wbvtv"
Mar 10 23:44:00 crc kubenswrapper[4919]: I0310 23:44:00.145613 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 10 23:44:00 crc kubenswrapper[4919]: I0310 23:44:00.145724 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 10 23:44:00 crc kubenswrapper[4919]: I0310 23:44:00.157178 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553104-bkhwg"]
Mar 10 23:44:00 crc kubenswrapper[4919]: I0310 23:44:00.320451 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pqhxl\" (UniqueName: \"kubernetes.io/projected/2f5d647e-0f3c-406c-8537-a877ee2be1dc-kube-api-access-pqhxl\") pod \"auto-csr-approver-29553104-bkhwg\" (UID: \"2f5d647e-0f3c-406c-8537-a877ee2be1dc\") " pod="openshift-infra/auto-csr-approver-29553104-bkhwg"
Mar 10 23:44:00 crc kubenswrapper[4919]: I0310 23:44:00.422440 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pqhxl\" (UniqueName: \"kubernetes.io/projected/2f5d647e-0f3c-406c-8537-a877ee2be1dc-kube-api-access-pqhxl\") pod \"auto-csr-approver-29553104-bkhwg\" (UID: \"2f5d647e-0f3c-406c-8537-a877ee2be1dc\") " pod="openshift-infra/auto-csr-approver-29553104-bkhwg"
Mar 10 23:44:00 crc kubenswrapper[4919]: I0310 23:44:00.453675 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pqhxl\" (UniqueName: \"kubernetes.io/projected/2f5d647e-0f3c-406c-8537-a877ee2be1dc-kube-api-access-pqhxl\") pod \"auto-csr-approver-29553104-bkhwg\" (UID: \"2f5d647e-0f3c-406c-8537-a877ee2be1dc\") " pod="openshift-infra/auto-csr-approver-29553104-bkhwg"
Mar 10 23:44:00 crc kubenswrapper[4919]: I0310 23:44:00.460153 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553104-bkhwg"
Mar 10 23:44:00 crc kubenswrapper[4919]: I0310 23:44:00.988648 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553104-bkhwg"]
Mar 10 23:44:01 crc kubenswrapper[4919]: I0310 23:44:01.787535 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553104-bkhwg" event={"ID":"2f5d647e-0f3c-406c-8537-a877ee2be1dc","Type":"ContainerStarted","Data":"c72ebee0fac027ae1386532b870193f6dc1ed9025b4060cd7548843b8084a702"}
Mar 10 23:44:02 crc kubenswrapper[4919]: I0310 23:44:02.798303 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553104-bkhwg" event={"ID":"2f5d647e-0f3c-406c-8537-a877ee2be1dc","Type":"ContainerStarted","Data":"40152a02c4cae6c3b4ad9046f70cc7817735db9bd3a38bb01ef3add16606441b"}
Mar 10 23:44:02 crc kubenswrapper[4919]: I0310 23:44:02.814439 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29553104-bkhwg" podStartSLOduration=1.565656546 podStartE2EDuration="2.814418327s" podCreationTimestamp="2026-03-10 23:44:00 +0000 UTC" firstStartedPulling="2026-03-10 23:44:01.015568107 +0000 UTC m=+6828.257448715" lastFinishedPulling="2026-03-10 23:44:02.264329878 +0000 UTC m=+6829.506210496" observedRunningTime="2026-03-10 23:44:02.809872812 +0000 UTC m=+6830.051753430" watchObservedRunningTime="2026-03-10 23:44:02.814418327 +0000 UTC m=+6830.056298935"
Mar 10 23:44:03 crc kubenswrapper[4919]: I0310 23:44:03.824422 4919 generic.go:334] "Generic (PLEG): container finished" podID="2f5d647e-0f3c-406c-8537-a877ee2be1dc" containerID="40152a02c4cae6c3b4ad9046f70cc7817735db9bd3a38bb01ef3add16606441b" exitCode=0
Mar 10 23:44:03 crc kubenswrapper[4919]: I0310 23:44:03.824473 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553104-bkhwg" event={"ID":"2f5d647e-0f3c-406c-8537-a877ee2be1dc","Type":"ContainerDied","Data":"40152a02c4cae6c3b4ad9046f70cc7817735db9bd3a38bb01ef3add16606441b"}
Mar 10 23:44:03 crc kubenswrapper[4919]: I0310 23:44:03.866434 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a828gtn5_0bc29c7c-1201-487e-8c9a-5d802dde51a5/util/0.log"
Mar 10 23:44:04 crc kubenswrapper[4919]: I0310 23:44:04.114541 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a828gtn5_0bc29c7c-1201-487e-8c9a-5d802dde51a5/pull/0.log"
Mar 10 23:44:04 crc kubenswrapper[4919]: I0310 23:44:04.124091 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a828gtn5_0bc29c7c-1201-487e-8c9a-5d802dde51a5/pull/0.log"
Mar 10 23:44:04 crc kubenswrapper[4919]: I0310 23:44:04.139332 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a828gtn5_0bc29c7c-1201-487e-8c9a-5d802dde51a5/util/0.log"
Mar 10 23:44:04 crc kubenswrapper[4919]: I0310 23:44:04.326091 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a828gtn5_0bc29c7c-1201-487e-8c9a-5d802dde51a5/util/0.log"
Mar 10 23:44:04 crc kubenswrapper[4919]: I0310 23:44:04.352899 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a828gtn5_0bc29c7c-1201-487e-8c9a-5d802dde51a5/pull/0.log"
Mar 10 23:44:04 crc kubenswrapper[4919]: I0310 23:44:04.357270 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a828gtn5_0bc29c7c-1201-487e-8c9a-5d802dde51a5/extract/0.log"
Mar 10 23:44:04 crc kubenswrapper[4919]: I0310 23:44:04.490304 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5lktf2_e5fad1c3-3133-4fca-8614-ce814b312e72/util/0.log"
Mar 10 23:44:04 crc kubenswrapper[4919]: I0310 23:44:04.649692 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5lktf2_e5fad1c3-3133-4fca-8614-ce814b312e72/util/0.log"
Mar 10 23:44:04 crc kubenswrapper[4919]: I0310 23:44:04.671888 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5lktf2_e5fad1c3-3133-4fca-8614-ce814b312e72/pull/0.log"
Mar 10 23:44:04 crc kubenswrapper[4919]: I0310 23:44:04.728807 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5lktf2_e5fad1c3-3133-4fca-8614-ce814b312e72/pull/0.log"
Mar 10 23:44:04 crc kubenswrapper[4919]: I0310 23:44:04.877726 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5lktf2_e5fad1c3-3133-4fca-8614-ce814b312e72/util/0.log"
Mar 10 23:44:04 crc kubenswrapper[4919]: I0310 23:44:04.885410 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5lktf2_e5fad1c3-3133-4fca-8614-ce814b312e72/pull/0.log"
Mar 10 23:44:04 crc kubenswrapper[4919]: I0310 23:44:04.886522 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5lktf2_e5fad1c3-3133-4fca-8614-ce814b312e72/extract/0.log"
Mar 10 23:44:05 crc kubenswrapper[4919]: I0310 23:44:05.077720 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-s9czm_2f6ee806-3545-424a-8b52-3116d438d035/extract-utilities/0.log"
Mar 10 23:44:05 crc kubenswrapper[4919]: I0310 23:44:05.147325 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553104-bkhwg"
Mar 10 23:44:05 crc kubenswrapper[4919]: I0310 23:44:05.320921 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pqhxl\" (UniqueName: \"kubernetes.io/projected/2f5d647e-0f3c-406c-8537-a877ee2be1dc-kube-api-access-pqhxl\") pod \"2f5d647e-0f3c-406c-8537-a877ee2be1dc\" (UID: \"2f5d647e-0f3c-406c-8537-a877ee2be1dc\") "
Mar 10 23:44:05 crc kubenswrapper[4919]: I0310 23:44:05.328522 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f5d647e-0f3c-406c-8537-a877ee2be1dc-kube-api-access-pqhxl" (OuterVolumeSpecName: "kube-api-access-pqhxl") pod "2f5d647e-0f3c-406c-8537-a877ee2be1dc" (UID: "2f5d647e-0f3c-406c-8537-a877ee2be1dc"). InnerVolumeSpecName "kube-api-access-pqhxl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 23:44:05 crc kubenswrapper[4919]: I0310 23:44:05.342019 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-s9czm_2f6ee806-3545-424a-8b52-3116d438d035/extract-content/0.log"
Mar 10 23:44:05 crc kubenswrapper[4919]: I0310 23:44:05.358407 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-s9czm_2f6ee806-3545-424a-8b52-3116d438d035/extract-utilities/0.log"
Mar 10 23:44:05 crc kubenswrapper[4919]: I0310 23:44:05.369819 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-s9czm_2f6ee806-3545-424a-8b52-3116d438d035/extract-content/0.log"
Mar 10 23:44:05 crc kubenswrapper[4919]: I0310 23:44:05.422757 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pqhxl\" (UniqueName: \"kubernetes.io/projected/2f5d647e-0f3c-406c-8537-a877ee2be1dc-kube-api-access-pqhxl\") on node \"crc\" DevicePath \"\""
Mar 10 23:44:05 crc kubenswrapper[4919]: I0310 23:44:05.531937 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-s9czm_2f6ee806-3545-424a-8b52-3116d438d035/extract-utilities/0.log"
Mar 10 23:44:05 crc kubenswrapper[4919]: I0310 23:44:05.543079 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-s9czm_2f6ee806-3545-424a-8b52-3116d438d035/extract-content/0.log"
Mar 10 23:44:05 crc kubenswrapper[4919]: I0310 23:44:05.739179 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-tbwp6_eb2ad0d8-c5f7-43c3-b37c-2ec6c330703f/extract-utilities/0.log"
Mar 10 23:44:05 crc kubenswrapper[4919]: I0310 23:44:05.810728 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-s9czm_2f6ee806-3545-424a-8b52-3116d438d035/registry-server/0.log"
Mar 10 23:44:05 crc kubenswrapper[4919]: I0310 23:44:05.839337 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553104-bkhwg" event={"ID":"2f5d647e-0f3c-406c-8537-a877ee2be1dc","Type":"ContainerDied","Data":"c72ebee0fac027ae1386532b870193f6dc1ed9025b4060cd7548843b8084a702"}
Mar 10 23:44:05 crc kubenswrapper[4919]: I0310 23:44:05.839405 4919 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c72ebee0fac027ae1386532b870193f6dc1ed9025b4060cd7548843b8084a702"
Mar 10 23:44:05 crc kubenswrapper[4919]: I0310 23:44:05.839660 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553104-bkhwg"
Mar 10 23:44:05 crc kubenswrapper[4919]: I0310 23:44:05.877540 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553098-96g44"]
Mar 10 23:44:05 crc kubenswrapper[4919]: I0310 23:44:05.882718 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553098-96g44"]
Mar 10 23:44:05 crc kubenswrapper[4919]: I0310 23:44:05.914341 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-tbwp6_eb2ad0d8-c5f7-43c3-b37c-2ec6c330703f/extract-content/0.log"
Mar 10 23:44:05 crc kubenswrapper[4919]: I0310 23:44:05.920129 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-tbwp6_eb2ad0d8-c5f7-43c3-b37c-2ec6c330703f/extract-utilities/0.log"
Mar 10 23:44:05 crc kubenswrapper[4919]: I0310 23:44:05.928175 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-tbwp6_eb2ad0d8-c5f7-43c3-b37c-2ec6c330703f/extract-content/0.log"
Mar 10 23:44:06 crc kubenswrapper[4919]: I0310 23:44:06.137291 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-tbwp6_eb2ad0d8-c5f7-43c3-b37c-2ec6c330703f/extract-utilities/0.log"
Mar 10 23:44:06 crc kubenswrapper[4919]: I0310 23:44:06.150753 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-tbwp6_eb2ad0d8-c5f7-43c3-b37c-2ec6c330703f/extract-content/0.log"
Mar 10 23:44:06 crc kubenswrapper[4919]: I0310 23:44:06.360466 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4d2csv_6a1cfdf8-3455-43cb-9462-f4ad3632c7c6/util/0.log"
Mar 10 23:44:06 crc kubenswrapper[4919]: I0310 23:44:06.522134 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4d2csv_6a1cfdf8-3455-43cb-9462-f4ad3632c7c6/util/0.log"
Mar 10 23:44:06 crc kubenswrapper[4919]: I0310 23:44:06.541651 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4d2csv_6a1cfdf8-3455-43cb-9462-f4ad3632c7c6/pull/0.log"
Mar 10 23:44:06 crc kubenswrapper[4919]: I0310 23:44:06.621968 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4d2csv_6a1cfdf8-3455-43cb-9462-f4ad3632c7c6/pull/0.log"
Mar 10 23:44:06 crc kubenswrapper[4919]: I0310 23:44:06.789144 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4d2csv_6a1cfdf8-3455-43cb-9462-f4ad3632c7c6/pull/0.log"
Mar 10 23:44:06 crc kubenswrapper[4919]: I0310 23:44:06.790259 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4d2csv_6a1cfdf8-3455-43cb-9462-f4ad3632c7c6/util/0.log"
Mar 10 23:44:06 crc kubenswrapper[4919]: I0310 23:44:06.835505 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4d2csv_6a1cfdf8-3455-43cb-9462-f4ad3632c7c6/extract/0.log"
Mar 10 23:44:07 crc kubenswrapper[4919]: I0310 23:44:07.029009 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-w82bl_eacecaf1-f17c-4c5e-8a68-8b1cb1e01006/marketplace-operator/0.log"
Mar 10 23:44:07 crc kubenswrapper[4919]: I0310 23:44:07.197756 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-qjwrf_3fa4689a-13af-4ed2-b9be-699f7bd519c1/extract-utilities/0.log"
Mar 10 23:44:07 crc kubenswrapper[4919]: I0310 23:44:07.340968 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-tbwp6_eb2ad0d8-c5f7-43c3-b37c-2ec6c330703f/registry-server/0.log"
Mar 10 23:44:07 crc kubenswrapper[4919]: I0310 23:44:07.359200 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-qjwrf_3fa4689a-13af-4ed2-b9be-699f7bd519c1/extract-utilities/0.log"
Mar 10 23:44:07 crc kubenswrapper[4919]: I0310 23:44:07.382455 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-qjwrf_3fa4689a-13af-4ed2-b9be-699f7bd519c1/extract-content/0.log"
Mar 10 23:44:07 crc kubenswrapper[4919]: I0310 23:44:07.388526 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-qjwrf_3fa4689a-13af-4ed2-b9be-699f7bd519c1/extract-content/0.log"
Mar 10 23:44:07 crc kubenswrapper[4919]: I0310 23:44:07.489023 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dbf95132-2ab1-4884-8911-64d13302fb1b" path="/var/lib/kubelet/pods/dbf95132-2ab1-4884-8911-64d13302fb1b/volumes"
Mar 10 23:44:07 crc kubenswrapper[4919]: I0310 23:44:07.585354 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-qjwrf_3fa4689a-13af-4ed2-b9be-699f7bd519c1/extract-content/0.log"
Mar 10 23:44:07 crc kubenswrapper[4919]: I0310 23:44:07.604893 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-qjwrf_3fa4689a-13af-4ed2-b9be-699f7bd519c1/extract-utilities/0.log"
Mar 10 23:44:07 crc kubenswrapper[4919]: I0310 23:44:07.770498 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-qjwrf_3fa4689a-13af-4ed2-b9be-699f7bd519c1/registry-server/0.log"
Mar 10 23:44:07 crc kubenswrapper[4919]: I0310 23:44:07.800367 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-w5gcl_f6580e1d-0db8-4e06-9690-4fca67b2604a/extract-utilities/0.log"
Mar 10 23:44:08 crc kubenswrapper[4919]: I0310 23:44:08.086430 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-w5gcl_f6580e1d-0db8-4e06-9690-4fca67b2604a/extract-content/0.log"
Mar 10 23:44:08 crc kubenswrapper[4919]: I0310 23:44:08.094268 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-w5gcl_f6580e1d-0db8-4e06-9690-4fca67b2604a/extract-utilities/0.log"
Mar 10 23:44:08 crc kubenswrapper[4919]: I0310 23:44:08.118506 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-w5gcl_f6580e1d-0db8-4e06-9690-4fca67b2604a/extract-content/0.log"
Mar 10 23:44:08 crc kubenswrapper[4919]: I0310 23:44:08.245174 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-w5gcl_f6580e1d-0db8-4e06-9690-4fca67b2604a/extract-utilities/0.log"
Mar 10 23:44:08 crc kubenswrapper[4919]: I0310 23:44:08.292998 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-w5gcl_f6580e1d-0db8-4e06-9690-4fca67b2604a/extract-content/0.log"
Mar 10 23:44:08 crc kubenswrapper[4919]: I0310 23:44:08.783080 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-w5gcl_f6580e1d-0db8-4e06-9690-4fca67b2604a/registry-server/0.log"
Mar 10 23:44:29 crc kubenswrapper[4919]: I0310 23:44:29.175334 4919 patch_prober.go:28] interesting pod/machine-config-daemon-z7v4t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 10 23:44:29 crc kubenswrapper[4919]: I0310 23:44:29.176149 4919 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 10 23:44:29 crc kubenswrapper[4919]: I0310 23:44:29.176214 4919 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t"
Mar 10 23:44:29 crc kubenswrapper[4919]: I0310 23:44:29.177127 4919 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e18b7db8c70c24109d95f2371d5b652db18ddf5ac55b9da74904e0d7d46a12e3"} pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 10 23:44:29 crc kubenswrapper[4919]: I0310 23:44:29.177209 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" containerName="machine-config-daemon" containerID="cri-o://e18b7db8c70c24109d95f2371d5b652db18ddf5ac55b9da74904e0d7d46a12e3" gracePeriod=600
Mar 10 23:44:30 crc kubenswrapper[4919]: I0310 23:44:30.069484 4919 generic.go:334] "Generic (PLEG): container finished" podID="566678d1-f416-4116-ab20-b30dceb86cdc" containerID="e18b7db8c70c24109d95f2371d5b652db18ddf5ac55b9da74904e0d7d46a12e3" exitCode=0
Mar 10 23:44:30 crc kubenswrapper[4919]: I0310 23:44:30.069575 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" event={"ID":"566678d1-f416-4116-ab20-b30dceb86cdc","Type":"ContainerDied","Data":"e18b7db8c70c24109d95f2371d5b652db18ddf5ac55b9da74904e0d7d46a12e3"}
Mar 10 23:44:30 crc kubenswrapper[4919]: I0310 23:44:30.069933 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" event={"ID":"566678d1-f416-4116-ab20-b30dceb86cdc","Type":"ContainerStarted","Data":"944b6d38f8d0a4bee7702dc5e2dfa1edc051685f37ea597c38990dabab74480d"}
Mar 10 23:44:30 crc kubenswrapper[4919]: I0310 23:44:30.069951 4919 scope.go:117] "RemoveContainer" containerID="01b5110cee4e2da7f2a13bbbe666538ca45148371492ad70b005d779b7734aee"
Mar 10 23:44:49 crc kubenswrapper[4919]: I0310 23:44:49.133852 4919 scope.go:117] "RemoveContainer" containerID="b3249d607beec9c6d4c28c761bc18363f0bafc78bec3d8ff65101e81f8e2af57"
Mar 10 23:45:00 crc kubenswrapper[4919]: I0310 23:45:00.161634 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29553105-mhztm"]
Mar 10 23:45:00 crc kubenswrapper[4919]: E0310 23:45:00.162576 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f5d647e-0f3c-406c-8537-a877ee2be1dc" containerName="oc"
Mar 10 23:45:00 crc kubenswrapper[4919]: I0310 23:45:00.162595 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f5d647e-0f3c-406c-8537-a877ee2be1dc" containerName="oc"
Mar 10 23:45:00 crc kubenswrapper[4919]: I0310 23:45:00.162796 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f5d647e-0f3c-406c-8537-a877ee2be1dc" containerName="oc"
Mar 10 23:45:00 crc kubenswrapper[4919]: I0310 23:45:00.163508 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29553105-mhztm"
Mar 10 23:45:00 crc kubenswrapper[4919]: I0310 23:45:00.166111 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Mar 10 23:45:00 crc kubenswrapper[4919]: I0310 23:45:00.166881 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Mar 10 23:45:00 crc kubenswrapper[4919]: I0310 23:45:00.184293 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29553105-mhztm"]
Mar 10 23:45:00 crc kubenswrapper[4919]: I0310 23:45:00.262727 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rhb5k\" (UniqueName: \"kubernetes.io/projected/07e25d93-9442-40b1-9733-d1e9e03a3e34-kube-api-access-rhb5k\") pod \"collect-profiles-29553105-mhztm\" (UID: \"07e25d93-9442-40b1-9733-d1e9e03a3e34\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553105-mhztm"
Mar 10 23:45:00 crc kubenswrapper[4919]: I0310 23:45:00.262793 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/07e25d93-9442-40b1-9733-d1e9e03a3e34-secret-volume\") pod \"collect-profiles-29553105-mhztm\" (UID: \"07e25d93-9442-40b1-9733-d1e9e03a3e34\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553105-mhztm"
Mar 10 23:45:00 crc kubenswrapper[4919]: I0310 23:45:00.263263 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/07e25d93-9442-40b1-9733-d1e9e03a3e34-config-volume\") pod \"collect-profiles-29553105-mhztm\" (UID: \"07e25d93-9442-40b1-9733-d1e9e03a3e34\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553105-mhztm"
Mar 10 23:45:00 crc kubenswrapper[4919]: I0310 23:45:00.364956 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/07e25d93-9442-40b1-9733-d1e9e03a3e34-config-volume\") pod \"collect-profiles-29553105-mhztm\" (UID: \"07e25d93-9442-40b1-9733-d1e9e03a3e34\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553105-mhztm"
Mar 10 23:45:00 crc kubenswrapper[4919]: I0310 23:45:00.365038 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rhb5k\" (UniqueName: \"kubernetes.io/projected/07e25d93-9442-40b1-9733-d1e9e03a3e34-kube-api-access-rhb5k\") pod \"collect-profiles-29553105-mhztm\" (UID: \"07e25d93-9442-40b1-9733-d1e9e03a3e34\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553105-mhztm"
Mar 10 23:45:00 crc kubenswrapper[4919]: I0310 23:45:00.365093 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/07e25d93-9442-40b1-9733-d1e9e03a3e34-secret-volume\") pod \"collect-profiles-29553105-mhztm\" (UID: \"07e25d93-9442-40b1-9733-d1e9e03a3e34\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553105-mhztm"
Mar 10 23:45:00 crc kubenswrapper[4919]: I0310 23:45:00.373326 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/07e25d93-9442-40b1-9733-d1e9e03a3e34-config-volume\") pod \"collect-profiles-29553105-mhztm\" (UID: \"07e25d93-9442-40b1-9733-d1e9e03a3e34\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553105-mhztm"
Mar 10 23:45:00 crc kubenswrapper[4919]: I0310 23:45:00.387462 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/07e25d93-9442-40b1-9733-d1e9e03a3e34-secret-volume\") pod \"collect-profiles-29553105-mhztm\" (UID: \"07e25d93-9442-40b1-9733-d1e9e03a3e34\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553105-mhztm"
Mar 10 23:45:00 crc kubenswrapper[4919]: I0310 23:45:00.397361 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rhb5k\" (UniqueName: \"kubernetes.io/projected/07e25d93-9442-40b1-9733-d1e9e03a3e34-kube-api-access-rhb5k\") pod \"collect-profiles-29553105-mhztm\" (UID: \"07e25d93-9442-40b1-9733-d1e9e03a3e34\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553105-mhztm"
Mar 10 23:45:00 crc kubenswrapper[4919]: I0310 23:45:00.517055 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29553105-mhztm"
Mar 10 23:45:01 crc kubenswrapper[4919]: I0310 23:45:01.006465 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29553105-mhztm"]
Mar 10 23:45:01 crc kubenswrapper[4919]: I0310 23:45:01.392945 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29553105-mhztm" event={"ID":"07e25d93-9442-40b1-9733-d1e9e03a3e34","Type":"ContainerStarted","Data":"5da00720522c614fd5b202681856f895cf000111b3a17d5cf4870d4327f4646f"}
Mar 10 23:45:01 crc kubenswrapper[4919]: I0310 23:45:01.392998 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29553105-mhztm" event={"ID":"07e25d93-9442-40b1-9733-d1e9e03a3e34","Type":"ContainerStarted","Data":"e52884eb7162561f161ed2daac92597bf1d7ab456b92362b6c0613e579a9d1c2"}
Mar 10 23:45:01 crc kubenswrapper[4919]: I0310 23:45:01.421606 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29553105-mhztm" podStartSLOduration=1.421589767 podStartE2EDuration="1.421589767s" podCreationTimestamp="2026-03-10 23:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 23:45:01.411475411 +0000 UTC m=+6888.653356029" watchObservedRunningTime="2026-03-10 23:45:01.421589767 +0000 UTC m=+6888.663470375"
Mar 10 23:45:02 crc kubenswrapper[4919]: I0310 23:45:02.413856 4919 generic.go:334] "Generic (PLEG): container finished" podID="07e25d93-9442-40b1-9733-d1e9e03a3e34" containerID="5da00720522c614fd5b202681856f895cf000111b3a17d5cf4870d4327f4646f" exitCode=0
Mar 10 23:45:02 crc kubenswrapper[4919]: I0310 23:45:02.413910 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29553105-mhztm" event={"ID":"07e25d93-9442-40b1-9733-d1e9e03a3e34","Type":"ContainerDied","Data":"5da00720522c614fd5b202681856f895cf000111b3a17d5cf4870d4327f4646f"}
Mar 10 23:45:03 crc kubenswrapper[4919]: I0310 23:45:03.823008 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29553105-mhztm"
Mar 10 23:45:03 crc kubenswrapper[4919]: I0310 23:45:03.945167 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/07e25d93-9442-40b1-9733-d1e9e03a3e34-config-volume\") pod \"07e25d93-9442-40b1-9733-d1e9e03a3e34\" (UID: \"07e25d93-9442-40b1-9733-d1e9e03a3e34\") "
Mar 10 23:45:03 crc kubenswrapper[4919]: I0310 23:45:03.945204 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/07e25d93-9442-40b1-9733-d1e9e03a3e34-secret-volume\") pod \"07e25d93-9442-40b1-9733-d1e9e03a3e34\" (UID: \"07e25d93-9442-40b1-9733-d1e9e03a3e34\") "
Mar 10 23:45:03 crc kubenswrapper[4919]: I0310 23:45:03.945288 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rhb5k\" (UniqueName: \"kubernetes.io/projected/07e25d93-9442-40b1-9733-d1e9e03a3e34-kube-api-access-rhb5k\") pod \"07e25d93-9442-40b1-9733-d1e9e03a3e34\" (UID: \"07e25d93-9442-40b1-9733-d1e9e03a3e34\") "
Mar 10 23:45:03 crc kubenswrapper[4919]: I0310 23:45:03.956758 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/07e25d93-9442-40b1-9733-d1e9e03a3e34-config-volume" (OuterVolumeSpecName: "config-volume") pod "07e25d93-9442-40b1-9733-d1e9e03a3e34" (UID: "07e25d93-9442-40b1-9733-d1e9e03a3e34"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 23:45:03 crc kubenswrapper[4919]: I0310 23:45:03.965705 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07e25d93-9442-40b1-9733-d1e9e03a3e34-kube-api-access-rhb5k" (OuterVolumeSpecName: "kube-api-access-rhb5k") pod "07e25d93-9442-40b1-9733-d1e9e03a3e34" (UID: "07e25d93-9442-40b1-9733-d1e9e03a3e34"). InnerVolumeSpecName "kube-api-access-rhb5k". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 23:45:03 crc kubenswrapper[4919]: I0310 23:45:03.966241 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07e25d93-9442-40b1-9733-d1e9e03a3e34-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "07e25d93-9442-40b1-9733-d1e9e03a3e34" (UID: "07e25d93-9442-40b1-9733-d1e9e03a3e34"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 23:45:04 crc kubenswrapper[4919]: I0310 23:45:04.048625 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rhb5k\" (UniqueName: \"kubernetes.io/projected/07e25d93-9442-40b1-9733-d1e9e03a3e34-kube-api-access-rhb5k\") on node \"crc\" DevicePath \"\""
Mar 10 23:45:04 crc kubenswrapper[4919]: I0310 23:45:04.048994 4919 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/07e25d93-9442-40b1-9733-d1e9e03a3e34-config-volume\") on node \"crc\" DevicePath \"\""
Mar 10 23:45:04 crc kubenswrapper[4919]: I0310 23:45:04.049007 4919 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/07e25d93-9442-40b1-9733-d1e9e03a3e34-secret-volume\") on node \"crc\" DevicePath \"\""
Mar 10 23:45:04 crc kubenswrapper[4919]: I0310 23:45:04.432092 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29553105-mhztm" event={"ID":"07e25d93-9442-40b1-9733-d1e9e03a3e34","Type":"ContainerDied","Data":"e52884eb7162561f161ed2daac92597bf1d7ab456b92362b6c0613e579a9d1c2"}
Mar 10 23:45:04 crc kubenswrapper[4919]: I0310 23:45:04.432127 4919 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e52884eb7162561f161ed2daac92597bf1d7ab456b92362b6c0613e579a9d1c2"
Mar 10 23:45:04 crc kubenswrapper[4919]: I0310 23:45:04.432180 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29553105-mhztm"
Mar 10 23:45:04 crc kubenswrapper[4919]: I0310 23:45:04.505263 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29553060-x2l67"]
Mar 10 23:45:04 crc kubenswrapper[4919]: I0310 23:45:04.509961 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29553060-x2l67"]
Mar 10 23:45:05 crc kubenswrapper[4919]: I0310 23:45:05.492656 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c31be40f-53ab-4748-b8df-3aa93593e3b5" path="/var/lib/kubelet/pods/c31be40f-53ab-4748-b8df-3aa93593e3b5/volumes"
Mar 10 23:45:42 crc kubenswrapper[4919]: I0310 23:45:42.829067 4919 generic.go:334] "Generic (PLEG): container finished" podID="1c06e98a-a64a-424e-b669-1d7815c55b6d" containerID="8f34da97e0ebc3975065df1fc1daff70723dcf6dfd96b765745b86a1a50bbd16" exitCode=0
Mar 10 23:45:42 crc kubenswrapper[4919]: I0310 23:45:42.829151 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-56pqp/must-gather-7hlmv" event={"ID":"1c06e98a-a64a-424e-b669-1d7815c55b6d","Type":"ContainerDied","Data":"8f34da97e0ebc3975065df1fc1daff70723dcf6dfd96b765745b86a1a50bbd16"}
Mar 10 23:45:42 crc kubenswrapper[4919]: I0310 23:45:42.830294 4919 scope.go:117] "RemoveContainer" containerID="8f34da97e0ebc3975065df1fc1daff70723dcf6dfd96b765745b86a1a50bbd16"
Mar 10 23:45:43 crc kubenswrapper[4919]: I0310 23:45:43.310738 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-56pqp_must-gather-7hlmv_1c06e98a-a64a-424e-b669-1d7815c55b6d/gather/0.log"
Mar 10 23:45:49 crc kubenswrapper[4919]: I0310 23:45:49.188680 4919 scope.go:117] "RemoveContainer" containerID="99a8804d05f7fc8e266f03dedff28752026dcc94407b9d1241858632c5581e1c"
Mar 10 23:45:50 crc kubenswrapper[4919]: I0310 23:45:50.620716 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-56pqp/must-gather-7hlmv"]
Mar 10 23:45:50 crc kubenswrapper[4919]: I0310 23:45:50.621685 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-56pqp/must-gather-7hlmv" podUID="1c06e98a-a64a-424e-b669-1d7815c55b6d" containerName="copy" containerID="cri-o://dadb8fa5b9ff0b602f79f6261e442a2f7a2055d6597718df51d88ec0983d81c9" gracePeriod=2
Mar 10 23:45:50 crc kubenswrapper[4919]: I0310 23:45:50.630428 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-56pqp/must-gather-7hlmv"]
Mar 10 23:45:50 crc kubenswrapper[4919]: I0310 23:45:50.896696 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-56pqp_must-gather-7hlmv_1c06e98a-a64a-424e-b669-1d7815c55b6d/copy/0.log"
Mar 10 23:45:50 crc kubenswrapper[4919]: I0310 23:45:50.897494 4919 generic.go:334] "Generic (PLEG): container finished" podID="1c06e98a-a64a-424e-b669-1d7815c55b6d" containerID="dadb8fa5b9ff0b602f79f6261e442a2f7a2055d6597718df51d88ec0983d81c9" exitCode=143
Mar 10 23:45:51 crc kubenswrapper[4919]: I0310 23:45:51.020769 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-56pqp_must-gather-7hlmv_1c06e98a-a64a-424e-b669-1d7815c55b6d/copy/0.log"
Mar 10 23:45:51 crc kubenswrapper[4919]: I0310 23:45:51.022194 4919 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-must-gather-56pqp/must-gather-7hlmv" Mar 10 23:45:51 crc kubenswrapper[4919]: I0310 23:45:51.175630 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/1c06e98a-a64a-424e-b669-1d7815c55b6d-must-gather-output\") pod \"1c06e98a-a64a-424e-b669-1d7815c55b6d\" (UID: \"1c06e98a-a64a-424e-b669-1d7815c55b6d\") " Mar 10 23:45:51 crc kubenswrapper[4919]: I0310 23:45:51.176200 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5kbc5\" (UniqueName: \"kubernetes.io/projected/1c06e98a-a64a-424e-b669-1d7815c55b6d-kube-api-access-5kbc5\") pod \"1c06e98a-a64a-424e-b669-1d7815c55b6d\" (UID: \"1c06e98a-a64a-424e-b669-1d7815c55b6d\") " Mar 10 23:45:51 crc kubenswrapper[4919]: I0310 23:45:51.181485 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c06e98a-a64a-424e-b669-1d7815c55b6d-kube-api-access-5kbc5" (OuterVolumeSpecName: "kube-api-access-5kbc5") pod "1c06e98a-a64a-424e-b669-1d7815c55b6d" (UID: "1c06e98a-a64a-424e-b669-1d7815c55b6d"). InnerVolumeSpecName "kube-api-access-5kbc5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 23:45:51 crc kubenswrapper[4919]: I0310 23:45:51.278663 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5kbc5\" (UniqueName: \"kubernetes.io/projected/1c06e98a-a64a-424e-b669-1d7815c55b6d-kube-api-access-5kbc5\") on node \"crc\" DevicePath \"\"" Mar 10 23:45:51 crc kubenswrapper[4919]: I0310 23:45:51.358224 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1c06e98a-a64a-424e-b669-1d7815c55b6d-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "1c06e98a-a64a-424e-b669-1d7815c55b6d" (UID: "1c06e98a-a64a-424e-b669-1d7815c55b6d"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 23:45:51 crc kubenswrapper[4919]: I0310 23:45:51.379936 4919 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/1c06e98a-a64a-424e-b669-1d7815c55b6d-must-gather-output\") on node \"crc\" DevicePath \"\"" Mar 10 23:45:51 crc kubenswrapper[4919]: I0310 23:45:51.490979 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c06e98a-a64a-424e-b669-1d7815c55b6d" path="/var/lib/kubelet/pods/1c06e98a-a64a-424e-b669-1d7815c55b6d/volumes" Mar 10 23:45:51 crc kubenswrapper[4919]: I0310 23:45:51.907867 4919 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-56pqp_must-gather-7hlmv_1c06e98a-a64a-424e-b669-1d7815c55b6d/copy/0.log" Mar 10 23:45:51 crc kubenswrapper[4919]: I0310 23:45:51.908945 4919 scope.go:117] "RemoveContainer" containerID="dadb8fa5b9ff0b602f79f6261e442a2f7a2055d6597718df51d88ec0983d81c9" Mar 10 23:45:51 crc kubenswrapper[4919]: I0310 23:45:51.909157 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-56pqp/must-gather-7hlmv" Mar 10 23:45:51 crc kubenswrapper[4919]: I0310 23:45:51.950729 4919 scope.go:117] "RemoveContainer" containerID="8f34da97e0ebc3975065df1fc1daff70723dcf6dfd96b765745b86a1a50bbd16" Mar 10 23:46:00 crc kubenswrapper[4919]: I0310 23:46:00.143046 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553106-vqhjk"] Mar 10 23:46:00 crc kubenswrapper[4919]: E0310 23:46:00.143996 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c06e98a-a64a-424e-b669-1d7815c55b6d" containerName="gather" Mar 10 23:46:00 crc kubenswrapper[4919]: I0310 23:46:00.144013 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c06e98a-a64a-424e-b669-1d7815c55b6d" containerName="gather" Mar 10 23:46:00 crc kubenswrapper[4919]: E0310 23:46:00.144034 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c06e98a-a64a-424e-b669-1d7815c55b6d" containerName="copy" Mar 10 23:46:00 crc kubenswrapper[4919]: I0310 23:46:00.144042 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c06e98a-a64a-424e-b669-1d7815c55b6d" containerName="copy" Mar 10 23:46:00 crc kubenswrapper[4919]: E0310 23:46:00.144066 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07e25d93-9442-40b1-9733-d1e9e03a3e34" containerName="collect-profiles" Mar 10 23:46:00 crc kubenswrapper[4919]: I0310 23:46:00.144074 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="07e25d93-9442-40b1-9733-d1e9e03a3e34" containerName="collect-profiles" Mar 10 23:46:00 crc kubenswrapper[4919]: I0310 23:46:00.144253 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c06e98a-a64a-424e-b669-1d7815c55b6d" containerName="gather" Mar 10 23:46:00 crc kubenswrapper[4919]: I0310 23:46:00.144277 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c06e98a-a64a-424e-b669-1d7815c55b6d" containerName="copy" Mar 10 23:46:00 crc 
kubenswrapper[4919]: I0310 23:46:00.144290 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="07e25d93-9442-40b1-9733-d1e9e03a3e34" containerName="collect-profiles" Mar 10 23:46:00 crc kubenswrapper[4919]: I0310 23:46:00.144889 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553106-vqhjk" Mar 10 23:46:00 crc kubenswrapper[4919]: I0310 23:46:00.147319 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 23:46:00 crc kubenswrapper[4919]: I0310 23:46:00.147460 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 23:46:00 crc kubenswrapper[4919]: I0310 23:46:00.147790 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wbvtv" Mar 10 23:46:00 crc kubenswrapper[4919]: I0310 23:46:00.150650 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553106-vqhjk"] Mar 10 23:46:00 crc kubenswrapper[4919]: I0310 23:46:00.247096 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dw2w4\" (UniqueName: \"kubernetes.io/projected/63befc21-6053-4a79-b72c-b5dc7b45cbd7-kube-api-access-dw2w4\") pod \"auto-csr-approver-29553106-vqhjk\" (UID: \"63befc21-6053-4a79-b72c-b5dc7b45cbd7\") " pod="openshift-infra/auto-csr-approver-29553106-vqhjk" Mar 10 23:46:00 crc kubenswrapper[4919]: I0310 23:46:00.348799 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dw2w4\" (UniqueName: \"kubernetes.io/projected/63befc21-6053-4a79-b72c-b5dc7b45cbd7-kube-api-access-dw2w4\") pod \"auto-csr-approver-29553106-vqhjk\" (UID: \"63befc21-6053-4a79-b72c-b5dc7b45cbd7\") " pod="openshift-infra/auto-csr-approver-29553106-vqhjk" Mar 10 23:46:00 crc kubenswrapper[4919]: I0310 23:46:00.368930 
4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dw2w4\" (UniqueName: \"kubernetes.io/projected/63befc21-6053-4a79-b72c-b5dc7b45cbd7-kube-api-access-dw2w4\") pod \"auto-csr-approver-29553106-vqhjk\" (UID: \"63befc21-6053-4a79-b72c-b5dc7b45cbd7\") " pod="openshift-infra/auto-csr-approver-29553106-vqhjk" Mar 10 23:46:00 crc kubenswrapper[4919]: I0310 23:46:00.465297 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553106-vqhjk" Mar 10 23:46:00 crc kubenswrapper[4919]: I0310 23:46:00.984363 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553106-vqhjk"] Mar 10 23:46:01 crc kubenswrapper[4919]: I0310 23:46:01.001957 4919 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 10 23:46:01 crc kubenswrapper[4919]: I0310 23:46:01.993836 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553106-vqhjk" event={"ID":"63befc21-6053-4a79-b72c-b5dc7b45cbd7","Type":"ContainerStarted","Data":"56e9046abd7c3a6808725a37097c8973427c2c7bd80d865ca0ac91edb680fe9e"} Mar 10 23:46:04 crc kubenswrapper[4919]: I0310 23:46:04.012420 4919 generic.go:334] "Generic (PLEG): container finished" podID="63befc21-6053-4a79-b72c-b5dc7b45cbd7" containerID="35fa445a004be892aef974a9375f0f9d308e083a5cafa852f8b56d5a70ebe6dd" exitCode=0 Mar 10 23:46:04 crc kubenswrapper[4919]: I0310 23:46:04.012505 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553106-vqhjk" event={"ID":"63befc21-6053-4a79-b72c-b5dc7b45cbd7","Type":"ContainerDied","Data":"35fa445a004be892aef974a9375f0f9d308e083a5cafa852f8b56d5a70ebe6dd"} Mar 10 23:46:05 crc kubenswrapper[4919]: I0310 23:46:05.329157 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553106-vqhjk" Mar 10 23:46:05 crc kubenswrapper[4919]: I0310 23:46:05.465767 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dw2w4\" (UniqueName: \"kubernetes.io/projected/63befc21-6053-4a79-b72c-b5dc7b45cbd7-kube-api-access-dw2w4\") pod \"63befc21-6053-4a79-b72c-b5dc7b45cbd7\" (UID: \"63befc21-6053-4a79-b72c-b5dc7b45cbd7\") " Mar 10 23:46:05 crc kubenswrapper[4919]: I0310 23:46:05.471029 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63befc21-6053-4a79-b72c-b5dc7b45cbd7-kube-api-access-dw2w4" (OuterVolumeSpecName: "kube-api-access-dw2w4") pod "63befc21-6053-4a79-b72c-b5dc7b45cbd7" (UID: "63befc21-6053-4a79-b72c-b5dc7b45cbd7"). InnerVolumeSpecName "kube-api-access-dw2w4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 23:46:05 crc kubenswrapper[4919]: I0310 23:46:05.568018 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dw2w4\" (UniqueName: \"kubernetes.io/projected/63befc21-6053-4a79-b72c-b5dc7b45cbd7-kube-api-access-dw2w4\") on node \"crc\" DevicePath \"\"" Mar 10 23:46:06 crc kubenswrapper[4919]: I0310 23:46:06.031283 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553106-vqhjk" event={"ID":"63befc21-6053-4a79-b72c-b5dc7b45cbd7","Type":"ContainerDied","Data":"56e9046abd7c3a6808725a37097c8973427c2c7bd80d865ca0ac91edb680fe9e"} Mar 10 23:46:06 crc kubenswrapper[4919]: I0310 23:46:06.031325 4919 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="56e9046abd7c3a6808725a37097c8973427c2c7bd80d865ca0ac91edb680fe9e" Mar 10 23:46:06 crc kubenswrapper[4919]: I0310 23:46:06.031380 4919 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553106-vqhjk" Mar 10 23:46:06 crc kubenswrapper[4919]: I0310 23:46:06.410503 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553100-gztqm"] Mar 10 23:46:06 crc kubenswrapper[4919]: I0310 23:46:06.419963 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553100-gztqm"] Mar 10 23:46:07 crc kubenswrapper[4919]: I0310 23:46:07.490781 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee6990c8-a58b-409e-82f8-f4b68997040d" path="/var/lib/kubelet/pods/ee6990c8-a58b-409e-82f8-f4b68997040d/volumes" Mar 10 23:46:29 crc kubenswrapper[4919]: I0310 23:46:29.175750 4919 patch_prober.go:28] interesting pod/machine-config-daemon-z7v4t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 23:46:29 crc kubenswrapper[4919]: I0310 23:46:29.176531 4919 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 23:46:49 crc kubenswrapper[4919]: I0310 23:46:49.252604 4919 scope.go:117] "RemoveContainer" containerID="e317c9bf1ce4c45a750933e3dc231c4a1cf4bb881b0792d93d64858f60aee5dc" Mar 10 23:46:59 crc kubenswrapper[4919]: I0310 23:46:59.175845 4919 patch_prober.go:28] interesting pod/machine-config-daemon-z7v4t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 23:46:59 crc kubenswrapper[4919]: 
I0310 23:46:59.176408 4919 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 23:47:20 crc kubenswrapper[4919]: I0310 23:47:20.877574 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-7z5pg"] Mar 10 23:47:20 crc kubenswrapper[4919]: E0310 23:47:20.878622 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63befc21-6053-4a79-b72c-b5dc7b45cbd7" containerName="oc" Mar 10 23:47:20 crc kubenswrapper[4919]: I0310 23:47:20.878644 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="63befc21-6053-4a79-b72c-b5dc7b45cbd7" containerName="oc" Mar 10 23:47:20 crc kubenswrapper[4919]: I0310 23:47:20.878967 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="63befc21-6053-4a79-b72c-b5dc7b45cbd7" containerName="oc" Mar 10 23:47:20 crc kubenswrapper[4919]: I0310 23:47:20.881148 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-7z5pg" Mar 10 23:47:20 crc kubenswrapper[4919]: I0310 23:47:20.898996 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7z5pg"] Mar 10 23:47:21 crc kubenswrapper[4919]: I0310 23:47:21.051141 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cmzzc\" (UniqueName: \"kubernetes.io/projected/3247f2cc-799f-46da-94c5-4851b6dcb4ff-kube-api-access-cmzzc\") pod \"community-operators-7z5pg\" (UID: \"3247f2cc-799f-46da-94c5-4851b6dcb4ff\") " pod="openshift-marketplace/community-operators-7z5pg" Mar 10 23:47:21 crc kubenswrapper[4919]: I0310 23:47:21.051201 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3247f2cc-799f-46da-94c5-4851b6dcb4ff-utilities\") pod \"community-operators-7z5pg\" (UID: \"3247f2cc-799f-46da-94c5-4851b6dcb4ff\") " pod="openshift-marketplace/community-operators-7z5pg" Mar 10 23:47:21 crc kubenswrapper[4919]: I0310 23:47:21.051319 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3247f2cc-799f-46da-94c5-4851b6dcb4ff-catalog-content\") pod \"community-operators-7z5pg\" (UID: \"3247f2cc-799f-46da-94c5-4851b6dcb4ff\") " pod="openshift-marketplace/community-operators-7z5pg" Mar 10 23:47:21 crc kubenswrapper[4919]: I0310 23:47:21.153708 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3247f2cc-799f-46da-94c5-4851b6dcb4ff-catalog-content\") pod \"community-operators-7z5pg\" (UID: \"3247f2cc-799f-46da-94c5-4851b6dcb4ff\") " pod="openshift-marketplace/community-operators-7z5pg" Mar 10 23:47:21 crc kubenswrapper[4919]: I0310 23:47:21.153855 4919 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-cmzzc\" (UniqueName: \"kubernetes.io/projected/3247f2cc-799f-46da-94c5-4851b6dcb4ff-kube-api-access-cmzzc\") pod \"community-operators-7z5pg\" (UID: \"3247f2cc-799f-46da-94c5-4851b6dcb4ff\") " pod="openshift-marketplace/community-operators-7z5pg" Mar 10 23:47:21 crc kubenswrapper[4919]: I0310 23:47:21.153881 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3247f2cc-799f-46da-94c5-4851b6dcb4ff-utilities\") pod \"community-operators-7z5pg\" (UID: \"3247f2cc-799f-46da-94c5-4851b6dcb4ff\") " pod="openshift-marketplace/community-operators-7z5pg" Mar 10 23:47:21 crc kubenswrapper[4919]: I0310 23:47:21.154455 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3247f2cc-799f-46da-94c5-4851b6dcb4ff-utilities\") pod \"community-operators-7z5pg\" (UID: \"3247f2cc-799f-46da-94c5-4851b6dcb4ff\") " pod="openshift-marketplace/community-operators-7z5pg" Mar 10 23:47:21 crc kubenswrapper[4919]: I0310 23:47:21.154768 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3247f2cc-799f-46da-94c5-4851b6dcb4ff-catalog-content\") pod \"community-operators-7z5pg\" (UID: \"3247f2cc-799f-46da-94c5-4851b6dcb4ff\") " pod="openshift-marketplace/community-operators-7z5pg" Mar 10 23:47:21 crc kubenswrapper[4919]: I0310 23:47:21.176162 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cmzzc\" (UniqueName: \"kubernetes.io/projected/3247f2cc-799f-46da-94c5-4851b6dcb4ff-kube-api-access-cmzzc\") pod \"community-operators-7z5pg\" (UID: \"3247f2cc-799f-46da-94c5-4851b6dcb4ff\") " pod="openshift-marketplace/community-operators-7z5pg" Mar 10 23:47:21 crc kubenswrapper[4919]: I0310 23:47:21.265000 4919 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-7z5pg" Mar 10 23:47:21 crc kubenswrapper[4919]: I0310 23:47:21.806848 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7z5pg"] Mar 10 23:47:22 crc kubenswrapper[4919]: I0310 23:47:22.767166 4919 generic.go:334] "Generic (PLEG): container finished" podID="3247f2cc-799f-46da-94c5-4851b6dcb4ff" containerID="7066ed7fae72139cf0a4bf0649177120a5ab72345ffc362c8058227c5b3b89f0" exitCode=0 Mar 10 23:47:22 crc kubenswrapper[4919]: I0310 23:47:22.767464 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7z5pg" event={"ID":"3247f2cc-799f-46da-94c5-4851b6dcb4ff","Type":"ContainerDied","Data":"7066ed7fae72139cf0a4bf0649177120a5ab72345ffc362c8058227c5b3b89f0"} Mar 10 23:47:22 crc kubenswrapper[4919]: I0310 23:47:22.767570 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7z5pg" event={"ID":"3247f2cc-799f-46da-94c5-4851b6dcb4ff","Type":"ContainerStarted","Data":"03e3b3ce63f2b738b818e7e28885b5d6b42df7c50ebbbc8a7717632cf25b6b39"} Mar 10 23:47:23 crc kubenswrapper[4919]: I0310 23:47:23.786348 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7z5pg" event={"ID":"3247f2cc-799f-46da-94c5-4851b6dcb4ff","Type":"ContainerStarted","Data":"214c81e01b17a6aa0b0a977d7bfe144f35e3ea0d96528403bde0b3494286c685"} Mar 10 23:47:24 crc kubenswrapper[4919]: I0310 23:47:24.798360 4919 generic.go:334] "Generic (PLEG): container finished" podID="3247f2cc-799f-46da-94c5-4851b6dcb4ff" containerID="214c81e01b17a6aa0b0a977d7bfe144f35e3ea0d96528403bde0b3494286c685" exitCode=0 Mar 10 23:47:24 crc kubenswrapper[4919]: I0310 23:47:24.798443 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7z5pg" 
event={"ID":"3247f2cc-799f-46da-94c5-4851b6dcb4ff","Type":"ContainerDied","Data":"214c81e01b17a6aa0b0a977d7bfe144f35e3ea0d96528403bde0b3494286c685"} Mar 10 23:47:25 crc kubenswrapper[4919]: I0310 23:47:25.806926 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7z5pg" event={"ID":"3247f2cc-799f-46da-94c5-4851b6dcb4ff","Type":"ContainerStarted","Data":"c86e0f92de9e1240da13612dea5985511ef4c5e2ae95641daf41e15c6d7d1627"} Mar 10 23:47:25 crc kubenswrapper[4919]: I0310 23:47:25.826746 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-7z5pg" podStartSLOduration=3.356588122 podStartE2EDuration="5.826727666s" podCreationTimestamp="2026-03-10 23:47:20 +0000 UTC" firstStartedPulling="2026-03-10 23:47:22.769639405 +0000 UTC m=+7030.011520013" lastFinishedPulling="2026-03-10 23:47:25.239778939 +0000 UTC m=+7032.481659557" observedRunningTime="2026-03-10 23:47:25.82428591 +0000 UTC m=+7033.066166538" watchObservedRunningTime="2026-03-10 23:47:25.826727666 +0000 UTC m=+7033.068608264" Mar 10 23:47:29 crc kubenswrapper[4919]: I0310 23:47:29.175597 4919 patch_prober.go:28] interesting pod/machine-config-daemon-z7v4t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 23:47:29 crc kubenswrapper[4919]: I0310 23:47:29.175685 4919 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 23:47:29 crc kubenswrapper[4919]: I0310 23:47:29.175771 4919 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" Mar 10 23:47:29 crc kubenswrapper[4919]: I0310 23:47:29.177005 4919 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"944b6d38f8d0a4bee7702dc5e2dfa1edc051685f37ea597c38990dabab74480d"} pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 10 23:47:29 crc kubenswrapper[4919]: I0310 23:47:29.177132 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" containerName="machine-config-daemon" containerID="cri-o://944b6d38f8d0a4bee7702dc5e2dfa1edc051685f37ea597c38990dabab74480d" gracePeriod=600 Mar 10 23:47:29 crc kubenswrapper[4919]: E0310 23:47:29.308634 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" Mar 10 23:47:29 crc kubenswrapper[4919]: I0310 23:47:29.842828 4919 generic.go:334] "Generic (PLEG): container finished" podID="566678d1-f416-4116-ab20-b30dceb86cdc" containerID="944b6d38f8d0a4bee7702dc5e2dfa1edc051685f37ea597c38990dabab74480d" exitCode=0 Mar 10 23:47:29 crc kubenswrapper[4919]: I0310 23:47:29.842893 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" event={"ID":"566678d1-f416-4116-ab20-b30dceb86cdc","Type":"ContainerDied","Data":"944b6d38f8d0a4bee7702dc5e2dfa1edc051685f37ea597c38990dabab74480d"} Mar 10 23:47:29 crc 
kubenswrapper[4919]: I0310 23:47:29.842939 4919 scope.go:117] "RemoveContainer" containerID="e18b7db8c70c24109d95f2371d5b652db18ddf5ac55b9da74904e0d7d46a12e3" Mar 10 23:47:29 crc kubenswrapper[4919]: I0310 23:47:29.844382 4919 scope.go:117] "RemoveContainer" containerID="944b6d38f8d0a4bee7702dc5e2dfa1edc051685f37ea597c38990dabab74480d" Mar 10 23:47:29 crc kubenswrapper[4919]: E0310 23:47:29.844860 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" Mar 10 23:47:31 crc kubenswrapper[4919]: I0310 23:47:31.266195 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-7z5pg" Mar 10 23:47:31 crc kubenswrapper[4919]: I0310 23:47:31.266275 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-7z5pg" Mar 10 23:47:31 crc kubenswrapper[4919]: I0310 23:47:31.308463 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-7z5pg" Mar 10 23:47:31 crc kubenswrapper[4919]: I0310 23:47:31.908842 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-7z5pg" Mar 10 23:47:31 crc kubenswrapper[4919]: I0310 23:47:31.951612 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7z5pg"] Mar 10 23:47:33 crc kubenswrapper[4919]: I0310 23:47:33.879972 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-7z5pg" 
podUID="3247f2cc-799f-46da-94c5-4851b6dcb4ff" containerName="registry-server" containerID="cri-o://c86e0f92de9e1240da13612dea5985511ef4c5e2ae95641daf41e15c6d7d1627" gracePeriod=2
Mar 10 23:47:34 crc kubenswrapper[4919]: I0310 23:47:34.891927 4919 generic.go:334] "Generic (PLEG): container finished" podID="3247f2cc-799f-46da-94c5-4851b6dcb4ff" containerID="c86e0f92de9e1240da13612dea5985511ef4c5e2ae95641daf41e15c6d7d1627" exitCode=0
Mar 10 23:47:34 crc kubenswrapper[4919]: I0310 23:47:34.891976 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7z5pg" event={"ID":"3247f2cc-799f-46da-94c5-4851b6dcb4ff","Type":"ContainerDied","Data":"c86e0f92de9e1240da13612dea5985511ef4c5e2ae95641daf41e15c6d7d1627"}
Mar 10 23:47:34 crc kubenswrapper[4919]: I0310 23:47:34.892006 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7z5pg" event={"ID":"3247f2cc-799f-46da-94c5-4851b6dcb4ff","Type":"ContainerDied","Data":"03e3b3ce63f2b738b818e7e28885b5d6b42df7c50ebbbc8a7717632cf25b6b39"}
Mar 10 23:47:34 crc kubenswrapper[4919]: I0310 23:47:34.892022 4919 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="03e3b3ce63f2b738b818e7e28885b5d6b42df7c50ebbbc8a7717632cf25b6b39"
Mar 10 23:47:34 crc kubenswrapper[4919]: I0310 23:47:34.903508 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7z5pg"
Mar 10 23:47:35 crc kubenswrapper[4919]: I0310 23:47:35.022507 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cmzzc\" (UniqueName: \"kubernetes.io/projected/3247f2cc-799f-46da-94c5-4851b6dcb4ff-kube-api-access-cmzzc\") pod \"3247f2cc-799f-46da-94c5-4851b6dcb4ff\" (UID: \"3247f2cc-799f-46da-94c5-4851b6dcb4ff\") "
Mar 10 23:47:35 crc kubenswrapper[4919]: I0310 23:47:35.022754 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3247f2cc-799f-46da-94c5-4851b6dcb4ff-catalog-content\") pod \"3247f2cc-799f-46da-94c5-4851b6dcb4ff\" (UID: \"3247f2cc-799f-46da-94c5-4851b6dcb4ff\") "
Mar 10 23:47:35 crc kubenswrapper[4919]: I0310 23:47:35.022808 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3247f2cc-799f-46da-94c5-4851b6dcb4ff-utilities\") pod \"3247f2cc-799f-46da-94c5-4851b6dcb4ff\" (UID: \"3247f2cc-799f-46da-94c5-4851b6dcb4ff\") "
Mar 10 23:47:35 crc kubenswrapper[4919]: I0310 23:47:35.025071 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3247f2cc-799f-46da-94c5-4851b6dcb4ff-utilities" (OuterVolumeSpecName: "utilities") pod "3247f2cc-799f-46da-94c5-4851b6dcb4ff" (UID: "3247f2cc-799f-46da-94c5-4851b6dcb4ff"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 23:47:35 crc kubenswrapper[4919]: I0310 23:47:35.030535 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3247f2cc-799f-46da-94c5-4851b6dcb4ff-kube-api-access-cmzzc" (OuterVolumeSpecName: "kube-api-access-cmzzc") pod "3247f2cc-799f-46da-94c5-4851b6dcb4ff" (UID: "3247f2cc-799f-46da-94c5-4851b6dcb4ff"). InnerVolumeSpecName "kube-api-access-cmzzc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 23:47:35 crc kubenswrapper[4919]: I0310 23:47:35.091724 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3247f2cc-799f-46da-94c5-4851b6dcb4ff-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3247f2cc-799f-46da-94c5-4851b6dcb4ff" (UID: "3247f2cc-799f-46da-94c5-4851b6dcb4ff"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 23:47:35 crc kubenswrapper[4919]: I0310 23:47:35.125532 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cmzzc\" (UniqueName: \"kubernetes.io/projected/3247f2cc-799f-46da-94c5-4851b6dcb4ff-kube-api-access-cmzzc\") on node \"crc\" DevicePath \"\""
Mar 10 23:47:35 crc kubenswrapper[4919]: I0310 23:47:35.125582 4919 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3247f2cc-799f-46da-94c5-4851b6dcb4ff-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 10 23:47:35 crc kubenswrapper[4919]: I0310 23:47:35.125602 4919 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3247f2cc-799f-46da-94c5-4851b6dcb4ff-utilities\") on node \"crc\" DevicePath \"\""
Mar 10 23:47:35 crc kubenswrapper[4919]: I0310 23:47:35.899375 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7z5pg"
Mar 10 23:47:35 crc kubenswrapper[4919]: I0310 23:47:35.928780 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7z5pg"]
Mar 10 23:47:35 crc kubenswrapper[4919]: I0310 23:47:35.940790 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-7z5pg"]
Mar 10 23:47:35 crc kubenswrapper[4919]: I0310 23:47:35.951112 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-vhgc2"]
Mar 10 23:47:35 crc kubenswrapper[4919]: E0310 23:47:35.951521 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3247f2cc-799f-46da-94c5-4851b6dcb4ff" containerName="registry-server"
Mar 10 23:47:35 crc kubenswrapper[4919]: I0310 23:47:35.951536 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="3247f2cc-799f-46da-94c5-4851b6dcb4ff" containerName="registry-server"
Mar 10 23:47:35 crc kubenswrapper[4919]: E0310 23:47:35.951563 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3247f2cc-799f-46da-94c5-4851b6dcb4ff" containerName="extract-utilities"
Mar 10 23:47:35 crc kubenswrapper[4919]: I0310 23:47:35.951571 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="3247f2cc-799f-46da-94c5-4851b6dcb4ff" containerName="extract-utilities"
Mar 10 23:47:35 crc kubenswrapper[4919]: E0310 23:47:35.951586 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3247f2cc-799f-46da-94c5-4851b6dcb4ff" containerName="extract-content"
Mar 10 23:47:35 crc kubenswrapper[4919]: I0310 23:47:35.951592 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="3247f2cc-799f-46da-94c5-4851b6dcb4ff" containerName="extract-content"
Mar 10 23:47:35 crc kubenswrapper[4919]: I0310 23:47:35.951749 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="3247f2cc-799f-46da-94c5-4851b6dcb4ff" containerName="registry-server"
Mar 10 23:47:35 crc kubenswrapper[4919]: I0310 23:47:35.952974 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vhgc2"
Mar 10 23:47:35 crc kubenswrapper[4919]: I0310 23:47:35.962589 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vhgc2"]
Mar 10 23:47:36 crc kubenswrapper[4919]: I0310 23:47:36.039841 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2wqbx\" (UniqueName: \"kubernetes.io/projected/208e7e04-08dd-4481-98b4-8311a17c9322-kube-api-access-2wqbx\") pod \"redhat-marketplace-vhgc2\" (UID: \"208e7e04-08dd-4481-98b4-8311a17c9322\") " pod="openshift-marketplace/redhat-marketplace-vhgc2"
Mar 10 23:47:36 crc kubenswrapper[4919]: I0310 23:47:36.039901 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/208e7e04-08dd-4481-98b4-8311a17c9322-utilities\") pod \"redhat-marketplace-vhgc2\" (UID: \"208e7e04-08dd-4481-98b4-8311a17c9322\") " pod="openshift-marketplace/redhat-marketplace-vhgc2"
Mar 10 23:47:36 crc kubenswrapper[4919]: I0310 23:47:36.039941 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/208e7e04-08dd-4481-98b4-8311a17c9322-catalog-content\") pod \"redhat-marketplace-vhgc2\" (UID: \"208e7e04-08dd-4481-98b4-8311a17c9322\") " pod="openshift-marketplace/redhat-marketplace-vhgc2"
Mar 10 23:47:36 crc kubenswrapper[4919]: I0310 23:47:36.141066 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2wqbx\" (UniqueName: \"kubernetes.io/projected/208e7e04-08dd-4481-98b4-8311a17c9322-kube-api-access-2wqbx\") pod \"redhat-marketplace-vhgc2\" (UID: \"208e7e04-08dd-4481-98b4-8311a17c9322\") " pod="openshift-marketplace/redhat-marketplace-vhgc2"
Mar 10 23:47:36 crc kubenswrapper[4919]: I0310 23:47:36.141130 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/208e7e04-08dd-4481-98b4-8311a17c9322-utilities\") pod \"redhat-marketplace-vhgc2\" (UID: \"208e7e04-08dd-4481-98b4-8311a17c9322\") " pod="openshift-marketplace/redhat-marketplace-vhgc2"
Mar 10 23:47:36 crc kubenswrapper[4919]: I0310 23:47:36.141230 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/208e7e04-08dd-4481-98b4-8311a17c9322-catalog-content\") pod \"redhat-marketplace-vhgc2\" (UID: \"208e7e04-08dd-4481-98b4-8311a17c9322\") " pod="openshift-marketplace/redhat-marketplace-vhgc2"
Mar 10 23:47:36 crc kubenswrapper[4919]: I0310 23:47:36.141690 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/208e7e04-08dd-4481-98b4-8311a17c9322-catalog-content\") pod \"redhat-marketplace-vhgc2\" (UID: \"208e7e04-08dd-4481-98b4-8311a17c9322\") " pod="openshift-marketplace/redhat-marketplace-vhgc2"
Mar 10 23:47:36 crc kubenswrapper[4919]: I0310 23:47:36.142122 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/208e7e04-08dd-4481-98b4-8311a17c9322-utilities\") pod \"redhat-marketplace-vhgc2\" (UID: \"208e7e04-08dd-4481-98b4-8311a17c9322\") " pod="openshift-marketplace/redhat-marketplace-vhgc2"
Mar 10 23:47:36 crc kubenswrapper[4919]: I0310 23:47:36.161520 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2wqbx\" (UniqueName: \"kubernetes.io/projected/208e7e04-08dd-4481-98b4-8311a17c9322-kube-api-access-2wqbx\") pod \"redhat-marketplace-vhgc2\" (UID: \"208e7e04-08dd-4481-98b4-8311a17c9322\") " pod="openshift-marketplace/redhat-marketplace-vhgc2"
Mar 10 23:47:36 crc kubenswrapper[4919]: I0310 23:47:36.333880 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vhgc2"
Mar 10 23:47:36 crc kubenswrapper[4919]: I0310 23:47:36.957642 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vhgc2"]
Mar 10 23:47:37 crc kubenswrapper[4919]: I0310 23:47:37.499164 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3247f2cc-799f-46da-94c5-4851b6dcb4ff" path="/var/lib/kubelet/pods/3247f2cc-799f-46da-94c5-4851b6dcb4ff/volumes"
Mar 10 23:47:37 crc kubenswrapper[4919]: I0310 23:47:37.915057 4919 generic.go:334] "Generic (PLEG): container finished" podID="208e7e04-08dd-4481-98b4-8311a17c9322" containerID="779d8ed74166f2c9ebac07ce82c2683db87fadb2cf224eaa7a29c395c35827c5" exitCode=0
Mar 10 23:47:37 crc kubenswrapper[4919]: I0310 23:47:37.915172 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vhgc2" event={"ID":"208e7e04-08dd-4481-98b4-8311a17c9322","Type":"ContainerDied","Data":"779d8ed74166f2c9ebac07ce82c2683db87fadb2cf224eaa7a29c395c35827c5"}
Mar 10 23:47:37 crc kubenswrapper[4919]: I0310 23:47:37.915213 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vhgc2" event={"ID":"208e7e04-08dd-4481-98b4-8311a17c9322","Type":"ContainerStarted","Data":"ae45d64ddff8616208a537c313940d6cef0e7660dff71ac051b3a9b0e4f8c7f0"}
Mar 10 23:47:38 crc kubenswrapper[4919]: I0310 23:47:38.923452 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vhgc2" event={"ID":"208e7e04-08dd-4481-98b4-8311a17c9322","Type":"ContainerStarted","Data":"7cb4878e265815e29beff9f426a2249479662109f0553e2ec80544334ca89547"}
Mar 10 23:47:39 crc kubenswrapper[4919]: I0310 23:47:39.935905 4919 generic.go:334] "Generic (PLEG): container finished" podID="208e7e04-08dd-4481-98b4-8311a17c9322" containerID="7cb4878e265815e29beff9f426a2249479662109f0553e2ec80544334ca89547" exitCode=0
Mar 10 23:47:39 crc kubenswrapper[4919]: I0310 23:47:39.935958 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vhgc2" event={"ID":"208e7e04-08dd-4481-98b4-8311a17c9322","Type":"ContainerDied","Data":"7cb4878e265815e29beff9f426a2249479662109f0553e2ec80544334ca89547"}
Mar 10 23:47:40 crc kubenswrapper[4919]: I0310 23:47:40.950308 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vhgc2" event={"ID":"208e7e04-08dd-4481-98b4-8311a17c9322","Type":"ContainerStarted","Data":"fbbd1246d2653f3b967af4e061d20dc14399882226a15b7ee8a0a24cfb08dbb9"}
Mar 10 23:47:40 crc kubenswrapper[4919]: I0310 23:47:40.984307 4919 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-vhgc2" podStartSLOduration=3.514967 podStartE2EDuration="5.984285482s" podCreationTimestamp="2026-03-10 23:47:35 +0000 UTC" firstStartedPulling="2026-03-10 23:47:37.917925609 +0000 UTC m=+7045.159806217" lastFinishedPulling="2026-03-10 23:47:40.387244081 +0000 UTC m=+7047.629124699" observedRunningTime="2026-03-10 23:47:40.973583051 +0000 UTC m=+7048.215463699" watchObservedRunningTime="2026-03-10 23:47:40.984285482 +0000 UTC m=+7048.226166100"
Mar 10 23:47:44 crc kubenswrapper[4919]: I0310 23:47:44.480108 4919 scope.go:117] "RemoveContainer" containerID="944b6d38f8d0a4bee7702dc5e2dfa1edc051685f37ea597c38990dabab74480d"
Mar 10 23:47:44 crc kubenswrapper[4919]: E0310 23:47:44.481026 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc"
Mar 10 23:47:46 crc kubenswrapper[4919]: I0310 23:47:46.334503 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-vhgc2"
Mar 10 23:47:46 crc kubenswrapper[4919]: I0310 23:47:46.334586 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-vhgc2"
Mar 10 23:47:46 crc kubenswrapper[4919]: I0310 23:47:46.412807 4919 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-vhgc2"
Mar 10 23:47:47 crc kubenswrapper[4919]: I0310 23:47:47.053938 4919 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-vhgc2"
Mar 10 23:47:47 crc kubenswrapper[4919]: I0310 23:47:47.108277 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vhgc2"]
Mar 10 23:47:49 crc kubenswrapper[4919]: I0310 23:47:49.031907 4919 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-vhgc2" podUID="208e7e04-08dd-4481-98b4-8311a17c9322" containerName="registry-server" containerID="cri-o://fbbd1246d2653f3b967af4e061d20dc14399882226a15b7ee8a0a24cfb08dbb9" gracePeriod=2
Mar 10 23:47:49 crc kubenswrapper[4919]: I0310 23:47:49.351690 4919 scope.go:117] "RemoveContainer" containerID="6759152db79246c8f366a301c0862779647c325434096936d8dc75c2382c30ec"
Mar 10 23:47:49 crc kubenswrapper[4919]: I0310 23:47:49.475076 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vhgc2"
Mar 10 23:47:49 crc kubenswrapper[4919]: I0310 23:47:49.580251 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/208e7e04-08dd-4481-98b4-8311a17c9322-catalog-content\") pod \"208e7e04-08dd-4481-98b4-8311a17c9322\" (UID: \"208e7e04-08dd-4481-98b4-8311a17c9322\") "
Mar 10 23:47:49 crc kubenswrapper[4919]: I0310 23:47:49.580479 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/208e7e04-08dd-4481-98b4-8311a17c9322-utilities\") pod \"208e7e04-08dd-4481-98b4-8311a17c9322\" (UID: \"208e7e04-08dd-4481-98b4-8311a17c9322\") "
Mar 10 23:47:49 crc kubenswrapper[4919]: I0310 23:47:49.580592 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2wqbx\" (UniqueName: \"kubernetes.io/projected/208e7e04-08dd-4481-98b4-8311a17c9322-kube-api-access-2wqbx\") pod \"208e7e04-08dd-4481-98b4-8311a17c9322\" (UID: \"208e7e04-08dd-4481-98b4-8311a17c9322\") "
Mar 10 23:47:49 crc kubenswrapper[4919]: I0310 23:47:49.581121 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/208e7e04-08dd-4481-98b4-8311a17c9322-utilities" (OuterVolumeSpecName: "utilities") pod "208e7e04-08dd-4481-98b4-8311a17c9322" (UID: "208e7e04-08dd-4481-98b4-8311a17c9322"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 23:47:49 crc kubenswrapper[4919]: I0310 23:47:49.586034 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/208e7e04-08dd-4481-98b4-8311a17c9322-kube-api-access-2wqbx" (OuterVolumeSpecName: "kube-api-access-2wqbx") pod "208e7e04-08dd-4481-98b4-8311a17c9322" (UID: "208e7e04-08dd-4481-98b4-8311a17c9322"). InnerVolumeSpecName "kube-api-access-2wqbx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 23:47:49 crc kubenswrapper[4919]: I0310 23:47:49.605676 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/208e7e04-08dd-4481-98b4-8311a17c9322-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "208e7e04-08dd-4481-98b4-8311a17c9322" (UID: "208e7e04-08dd-4481-98b4-8311a17c9322"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 23:47:49 crc kubenswrapper[4919]: I0310 23:47:49.682865 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2wqbx\" (UniqueName: \"kubernetes.io/projected/208e7e04-08dd-4481-98b4-8311a17c9322-kube-api-access-2wqbx\") on node \"crc\" DevicePath \"\""
Mar 10 23:47:49 crc kubenswrapper[4919]: I0310 23:47:49.682898 4919 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/208e7e04-08dd-4481-98b4-8311a17c9322-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 10 23:47:49 crc kubenswrapper[4919]: I0310 23:47:49.682908 4919 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/208e7e04-08dd-4481-98b4-8311a17c9322-utilities\") on node \"crc\" DevicePath \"\""
Mar 10 23:47:50 crc kubenswrapper[4919]: I0310 23:47:50.043747 4919 generic.go:334] "Generic (PLEG): container finished" podID="208e7e04-08dd-4481-98b4-8311a17c9322" containerID="fbbd1246d2653f3b967af4e061d20dc14399882226a15b7ee8a0a24cfb08dbb9" exitCode=0
Mar 10 23:47:50 crc kubenswrapper[4919]: I0310 23:47:50.043795 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vhgc2" event={"ID":"208e7e04-08dd-4481-98b4-8311a17c9322","Type":"ContainerDied","Data":"fbbd1246d2653f3b967af4e061d20dc14399882226a15b7ee8a0a24cfb08dbb9"}
Mar 10 23:47:50 crc kubenswrapper[4919]: I0310 23:47:50.043823 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vhgc2" event={"ID":"208e7e04-08dd-4481-98b4-8311a17c9322","Type":"ContainerDied","Data":"ae45d64ddff8616208a537c313940d6cef0e7660dff71ac051b3a9b0e4f8c7f0"}
Mar 10 23:47:50 crc kubenswrapper[4919]: I0310 23:47:50.043841 4919 scope.go:117] "RemoveContainer" containerID="fbbd1246d2653f3b967af4e061d20dc14399882226a15b7ee8a0a24cfb08dbb9"
Mar 10 23:47:50 crc kubenswrapper[4919]: I0310 23:47:50.043955 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vhgc2"
Mar 10 23:47:50 crc kubenswrapper[4919]: I0310 23:47:50.065248 4919 scope.go:117] "RemoveContainer" containerID="7cb4878e265815e29beff9f426a2249479662109f0553e2ec80544334ca89547"
Mar 10 23:47:50 crc kubenswrapper[4919]: I0310 23:47:50.099564 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vhgc2"]
Mar 10 23:47:50 crc kubenswrapper[4919]: I0310 23:47:50.109522 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-vhgc2"]
Mar 10 23:47:50 crc kubenswrapper[4919]: I0310 23:47:50.124715 4919 scope.go:117] "RemoveContainer" containerID="779d8ed74166f2c9ebac07ce82c2683db87fadb2cf224eaa7a29c395c35827c5"
Mar 10 23:47:50 crc kubenswrapper[4919]: I0310 23:47:50.144873 4919 scope.go:117] "RemoveContainer" containerID="fbbd1246d2653f3b967af4e061d20dc14399882226a15b7ee8a0a24cfb08dbb9"
Mar 10 23:47:50 crc kubenswrapper[4919]: E0310 23:47:50.145308 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fbbd1246d2653f3b967af4e061d20dc14399882226a15b7ee8a0a24cfb08dbb9\": container with ID starting with fbbd1246d2653f3b967af4e061d20dc14399882226a15b7ee8a0a24cfb08dbb9 not found: ID does not exist" containerID="fbbd1246d2653f3b967af4e061d20dc14399882226a15b7ee8a0a24cfb08dbb9"
Mar 10 23:47:50 crc kubenswrapper[4919]: I0310 23:47:50.145354 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fbbd1246d2653f3b967af4e061d20dc14399882226a15b7ee8a0a24cfb08dbb9"} err="failed to get container status \"fbbd1246d2653f3b967af4e061d20dc14399882226a15b7ee8a0a24cfb08dbb9\": rpc error: code = NotFound desc = could not find container \"fbbd1246d2653f3b967af4e061d20dc14399882226a15b7ee8a0a24cfb08dbb9\": container with ID starting with fbbd1246d2653f3b967af4e061d20dc14399882226a15b7ee8a0a24cfb08dbb9 not found: ID does not exist"
Mar 10 23:47:50 crc kubenswrapper[4919]: I0310 23:47:50.145381 4919 scope.go:117] "RemoveContainer" containerID="7cb4878e265815e29beff9f426a2249479662109f0553e2ec80544334ca89547"
Mar 10 23:47:50 crc kubenswrapper[4919]: E0310 23:47:50.146102 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7cb4878e265815e29beff9f426a2249479662109f0553e2ec80544334ca89547\": container with ID starting with 7cb4878e265815e29beff9f426a2249479662109f0553e2ec80544334ca89547 not found: ID does not exist" containerID="7cb4878e265815e29beff9f426a2249479662109f0553e2ec80544334ca89547"
Mar 10 23:47:50 crc kubenswrapper[4919]: I0310 23:47:50.146145 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7cb4878e265815e29beff9f426a2249479662109f0553e2ec80544334ca89547"} err="failed to get container status \"7cb4878e265815e29beff9f426a2249479662109f0553e2ec80544334ca89547\": rpc error: code = NotFound desc = could not find container \"7cb4878e265815e29beff9f426a2249479662109f0553e2ec80544334ca89547\": container with ID starting with 7cb4878e265815e29beff9f426a2249479662109f0553e2ec80544334ca89547 not found: ID does not exist"
Mar 10 23:47:50 crc kubenswrapper[4919]: I0310 23:47:50.146175 4919 scope.go:117] "RemoveContainer" containerID="779d8ed74166f2c9ebac07ce82c2683db87fadb2cf224eaa7a29c395c35827c5"
Mar 10 23:47:50 crc kubenswrapper[4919]: E0310 23:47:50.146620 4919 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"779d8ed74166f2c9ebac07ce82c2683db87fadb2cf224eaa7a29c395c35827c5\": container with ID starting with 779d8ed74166f2c9ebac07ce82c2683db87fadb2cf224eaa7a29c395c35827c5 not found: ID does not exist" containerID="779d8ed74166f2c9ebac07ce82c2683db87fadb2cf224eaa7a29c395c35827c5"
Mar 10 23:47:50 crc kubenswrapper[4919]: I0310 23:47:50.146664 4919 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"779d8ed74166f2c9ebac07ce82c2683db87fadb2cf224eaa7a29c395c35827c5"} err="failed to get container status \"779d8ed74166f2c9ebac07ce82c2683db87fadb2cf224eaa7a29c395c35827c5\": rpc error: code = NotFound desc = could not find container \"779d8ed74166f2c9ebac07ce82c2683db87fadb2cf224eaa7a29c395c35827c5\": container with ID starting with 779d8ed74166f2c9ebac07ce82c2683db87fadb2cf224eaa7a29c395c35827c5 not found: ID does not exist"
Mar 10 23:47:51 crc kubenswrapper[4919]: I0310 23:47:51.496513 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="208e7e04-08dd-4481-98b4-8311a17c9322" path="/var/lib/kubelet/pods/208e7e04-08dd-4481-98b4-8311a17c9322/volumes"
Mar 10 23:47:57 crc kubenswrapper[4919]: I0310 23:47:57.480770 4919 scope.go:117] "RemoveContainer" containerID="944b6d38f8d0a4bee7702dc5e2dfa1edc051685f37ea597c38990dabab74480d"
Mar 10 23:47:57 crc kubenswrapper[4919]: E0310 23:47:57.481706 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc"
Mar 10 23:48:00 crc kubenswrapper[4919]: I0310 23:48:00.149131 4919 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553108-rs67s"]
Mar 10 23:48:00 crc kubenswrapper[4919]: E0310 23:48:00.150124 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="208e7e04-08dd-4481-98b4-8311a17c9322" containerName="extract-utilities"
Mar 10 23:48:00 crc kubenswrapper[4919]: I0310 23:48:00.150148 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="208e7e04-08dd-4481-98b4-8311a17c9322" containerName="extract-utilities"
Mar 10 23:48:00 crc kubenswrapper[4919]: E0310 23:48:00.150177 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="208e7e04-08dd-4481-98b4-8311a17c9322" containerName="extract-content"
Mar 10 23:48:00 crc kubenswrapper[4919]: I0310 23:48:00.150189 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="208e7e04-08dd-4481-98b4-8311a17c9322" containerName="extract-content"
Mar 10 23:48:00 crc kubenswrapper[4919]: E0310 23:48:00.150211 4919 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="208e7e04-08dd-4481-98b4-8311a17c9322" containerName="registry-server"
Mar 10 23:48:00 crc kubenswrapper[4919]: I0310 23:48:00.150226 4919 state_mem.go:107] "Deleted CPUSet assignment" podUID="208e7e04-08dd-4481-98b4-8311a17c9322" containerName="registry-server"
Mar 10 23:48:00 crc kubenswrapper[4919]: I0310 23:48:00.150641 4919 memory_manager.go:354] "RemoveStaleState removing state" podUID="208e7e04-08dd-4481-98b4-8311a17c9322" containerName="registry-server"
Mar 10 23:48:00 crc kubenswrapper[4919]: I0310 23:48:00.151550 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553108-rs67s"
Mar 10 23:48:00 crc kubenswrapper[4919]: I0310 23:48:00.154823 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 10 23:48:00 crc kubenswrapper[4919]: I0310 23:48:00.155009 4919 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 10 23:48:00 crc kubenswrapper[4919]: I0310 23:48:00.154841 4919 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wbvtv"
Mar 10 23:48:00 crc kubenswrapper[4919]: I0310 23:48:00.157543 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553108-rs67s"]
Mar 10 23:48:00 crc kubenswrapper[4919]: I0310 23:48:00.291526 4919 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vdtbs\" (UniqueName: \"kubernetes.io/projected/267af9fe-2fb0-4b24-8af6-04b033099eee-kube-api-access-vdtbs\") pod \"auto-csr-approver-29553108-rs67s\" (UID: \"267af9fe-2fb0-4b24-8af6-04b033099eee\") " pod="openshift-infra/auto-csr-approver-29553108-rs67s"
Mar 10 23:48:00 crc kubenswrapper[4919]: I0310 23:48:00.393546 4919 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vdtbs\" (UniqueName: \"kubernetes.io/projected/267af9fe-2fb0-4b24-8af6-04b033099eee-kube-api-access-vdtbs\") pod \"auto-csr-approver-29553108-rs67s\" (UID: \"267af9fe-2fb0-4b24-8af6-04b033099eee\") " pod="openshift-infra/auto-csr-approver-29553108-rs67s"
Mar 10 23:48:00 crc kubenswrapper[4919]: I0310 23:48:00.424609 4919 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vdtbs\" (UniqueName: \"kubernetes.io/projected/267af9fe-2fb0-4b24-8af6-04b033099eee-kube-api-access-vdtbs\") pod \"auto-csr-approver-29553108-rs67s\" (UID: \"267af9fe-2fb0-4b24-8af6-04b033099eee\") " pod="openshift-infra/auto-csr-approver-29553108-rs67s"
Mar 10 23:48:00 crc kubenswrapper[4919]: I0310 23:48:00.480911 4919 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553108-rs67s"
Mar 10 23:48:00 crc kubenswrapper[4919]: I0310 23:48:00.959515 4919 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553108-rs67s"]
Mar 10 23:48:00 crc kubenswrapper[4919]: W0310 23:48:00.966978 4919 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod267af9fe_2fb0_4b24_8af6_04b033099eee.slice/crio-bcaa521bd4fb55f0b7ab7e387cb18f1f5e7219ee76e1b2f4af60b5f7589f062a WatchSource:0}: Error finding container bcaa521bd4fb55f0b7ab7e387cb18f1f5e7219ee76e1b2f4af60b5f7589f062a: Status 404 returned error can't find the container with id bcaa521bd4fb55f0b7ab7e387cb18f1f5e7219ee76e1b2f4af60b5f7589f062a
Mar 10 23:48:01 crc kubenswrapper[4919]: I0310 23:48:01.138674 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553108-rs67s" event={"ID":"267af9fe-2fb0-4b24-8af6-04b033099eee","Type":"ContainerStarted","Data":"bcaa521bd4fb55f0b7ab7e387cb18f1f5e7219ee76e1b2f4af60b5f7589f062a"}
Mar 10 23:48:03 crc kubenswrapper[4919]: I0310 23:48:03.166244 4919 generic.go:334] "Generic (PLEG): container finished" podID="267af9fe-2fb0-4b24-8af6-04b033099eee" containerID="1724810df020477776b56b58894dcea56c078f16cb2ba1fea3eea3b06c8745da" exitCode=0
Mar 10 23:48:03 crc kubenswrapper[4919]: I0310 23:48:03.166431 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553108-rs67s" event={"ID":"267af9fe-2fb0-4b24-8af6-04b033099eee","Type":"ContainerDied","Data":"1724810df020477776b56b58894dcea56c078f16cb2ba1fea3eea3b06c8745da"}
Mar 10 23:48:04 crc kubenswrapper[4919]: I0310 23:48:04.550963 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553108-rs67s"
Mar 10 23:48:04 crc kubenswrapper[4919]: I0310 23:48:04.668032 4919 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vdtbs\" (UniqueName: \"kubernetes.io/projected/267af9fe-2fb0-4b24-8af6-04b033099eee-kube-api-access-vdtbs\") pod \"267af9fe-2fb0-4b24-8af6-04b033099eee\" (UID: \"267af9fe-2fb0-4b24-8af6-04b033099eee\") "
Mar 10 23:48:04 crc kubenswrapper[4919]: I0310 23:48:04.686011 4919 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/267af9fe-2fb0-4b24-8af6-04b033099eee-kube-api-access-vdtbs" (OuterVolumeSpecName: "kube-api-access-vdtbs") pod "267af9fe-2fb0-4b24-8af6-04b033099eee" (UID: "267af9fe-2fb0-4b24-8af6-04b033099eee"). InnerVolumeSpecName "kube-api-access-vdtbs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 23:48:04 crc kubenswrapper[4919]: I0310 23:48:04.770062 4919 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vdtbs\" (UniqueName: \"kubernetes.io/projected/267af9fe-2fb0-4b24-8af6-04b033099eee-kube-api-access-vdtbs\") on node \"crc\" DevicePath \"\""
Mar 10 23:48:05 crc kubenswrapper[4919]: I0310 23:48:05.185771 4919 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553108-rs67s" event={"ID":"267af9fe-2fb0-4b24-8af6-04b033099eee","Type":"ContainerDied","Data":"bcaa521bd4fb55f0b7ab7e387cb18f1f5e7219ee76e1b2f4af60b5f7589f062a"}
Mar 10 23:48:05 crc kubenswrapper[4919]: I0310 23:48:05.185822 4919 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bcaa521bd4fb55f0b7ab7e387cb18f1f5e7219ee76e1b2f4af60b5f7589f062a"
Mar 10 23:48:05 crc kubenswrapper[4919]: I0310 23:48:05.185822 4919 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553108-rs67s"
Mar 10 23:48:05 crc kubenswrapper[4919]: I0310 23:48:05.651301 4919 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553102-cg25b"]
Mar 10 23:48:05 crc kubenswrapper[4919]: I0310 23:48:05.659223 4919 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553102-cg25b"]
Mar 10 23:48:07 crc kubenswrapper[4919]: I0310 23:48:07.491740 4919 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac5c6cdf-ce81-4b26-92f3-9131bcdcd530" path="/var/lib/kubelet/pods/ac5c6cdf-ce81-4b26-92f3-9131bcdcd530/volumes"
Mar 10 23:48:08 crc kubenswrapper[4919]: I0310 23:48:08.480217 4919 scope.go:117] "RemoveContainer" containerID="944b6d38f8d0a4bee7702dc5e2dfa1edc051685f37ea597c38990dabab74480d"
Mar 10 23:48:08 crc kubenswrapper[4919]: E0310 23:48:08.480881 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc"
Mar 10 23:48:20 crc kubenswrapper[4919]: I0310 23:48:20.480793 4919 scope.go:117] "RemoveContainer" containerID="944b6d38f8d0a4bee7702dc5e2dfa1edc051685f37ea597c38990dabab74480d"
Mar 10 23:48:20 crc kubenswrapper[4919]: E0310 23:48:20.481733 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc"
Mar 10 23:48:31 crc kubenswrapper[4919]: I0310 23:48:31.480441 4919 scope.go:117] "RemoveContainer" containerID="944b6d38f8d0a4bee7702dc5e2dfa1edc051685f37ea597c38990dabab74480d"
Mar 10 23:48:31 crc kubenswrapper[4919]: E0310 23:48:31.481597 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc"
Mar 10 23:48:43 crc kubenswrapper[4919]: I0310 23:48:43.488820 4919 scope.go:117] "RemoveContainer" containerID="944b6d38f8d0a4bee7702dc5e2dfa1edc051685f37ea597c38990dabab74480d"
Mar 10 23:48:43 crc kubenswrapper[4919]: E0310 23:48:43.490553 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc"
Mar 10 23:48:49 crc kubenswrapper[4919]: I0310 23:48:49.403499 4919 scope.go:117] "RemoveContainer" containerID="16d2207425b535a91738629ca4c5c321f46a4159bb611815693f68a0cdd17d10"
Mar 10 23:48:54 crc kubenswrapper[4919]: I0310 23:48:54.480129 4919 scope.go:117] "RemoveContainer" containerID="944b6d38f8d0a4bee7702dc5e2dfa1edc051685f37ea597c38990dabab74480d"
Mar 10 23:48:54 crc kubenswrapper[4919]: E0310 23:48:54.480790 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" Mar 10 23:49:05 crc kubenswrapper[4919]: I0310 23:49:05.480702 4919 scope.go:117] "RemoveContainer" containerID="944b6d38f8d0a4bee7702dc5e2dfa1edc051685f37ea597c38990dabab74480d" Mar 10 23:49:05 crc kubenswrapper[4919]: E0310 23:49:05.481382 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" Mar 10 23:49:20 crc kubenswrapper[4919]: I0310 23:49:20.479927 4919 scope.go:117] "RemoveContainer" containerID="944b6d38f8d0a4bee7702dc5e2dfa1edc051685f37ea597c38990dabab74480d" Mar 10 23:49:20 crc kubenswrapper[4919]: E0310 23:49:20.480832 4919 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z7v4t_openshift-machine-config-operator(566678d1-f416-4116-ab20-b30dceb86cdc)\"" pod="openshift-machine-config-operator/machine-config-daemon-z7v4t" podUID="566678d1-f416-4116-ab20-b30dceb86cdc" var/home/core/zuul-output/logs/crc-cloud-workdir-crc-all-logs.tar.gz0000644000175000000000000000005515154127020024442 0ustar coreroot  Om77'(var/home/core/zuul-output/logs/crc-cloud/0000755000175000000000000000000015154127021017360 5ustar corerootvar/home/core/zuul-output/artifacts/0000755000175000017500000000000015154110455016506 5ustar 